Parents want TikTok to pay for addicting their kids, demanding a jury trial to decide whether TikTok's design needs to change. Damages sought right now are unspecified but are expected to cover intangible losses suffered by the kids before they died, as well as the loss of each kid's "future earning capacity" and "normal activities, pursuits, and pleasures."
Some of these steps include adding a process for age and identity verification at sign-up so that TikTok could notify parents of their app use and block content like Blackout Challenge videos from ever reaching kids' feeds.
So if a child goes to a friend's place and their parent gives them a Bud Light and something happens to that child, is the responsibility on the parent, or Budweiser? It's a different world, and parents aren't adapting fast enough. I think part of it is that those of us (like me, although I don't have kids) who grew up just as technology was starting all did things with it that we shouldn't have. We aren't translating that to today, when we should know some tech is dangerous and we shouldn't give our kids, or our kids' friends, access to everything they want.
Another day, another scammer illegally aiming at the deep pockets instead of the guilty party.
TikTok is distributing this content and needs to be held accountable for the deaths it directly caused. Scammers are the ones pretending that corporations can't be held accountable, like yourself. All internet companies should be held 100% responsible for content. YouTube should never have had rampant piracy; that's illegal. There was a choice made, and it was the wrong choice. We don't need a free-for-all online where free content is exploited irresponsibly. Ars takes responsibility; TikTok can too.
I disagree, but it's funny how YouTube and TikTok can be 300% successful at removing infringing content like piracy (while accepting that quite a bit of news, commentary, remixing, and other fair use will be removed and punished as well). They've been forced to care about that! If we want them to stop promoting videos that incite violent revolution and hate crimes and self-harm, we could force them to care about those issues too.
Yeah, we also did that at school long before TikTok existed. Actually, long before Facebook existed, and even before most of us had access to the internet.

We used to do something like that. Not strangulation, but something similar that caused blackout. We were lucky, I guess.
The TikTok algorithm produces a kind of virtual group. If you watch cat videos, you'll get more cat videos. Same with other subjects. These groups may not be reified as a software data structure, but they are persistent in the data, to the point where they get noticed and talked about, for example as "Harry Potter TikTok". Some people use it in their marketing spiel: "If this is on your For You page, then you are interested in short tattooed Asian girls like me."

TikTok doesn't have any features that really enable any kind of "groups". In fact, it is designed specifically to prevent that.

Those assholes you see weaving around traffic at 30 over the speed limit? Often they're recording and posting on TikTok. There are entire groups on TikTok dedicated solely to making and posting videos of driving dangerously and illegally. They get people killed - gruesomely.
I'm getting so tired of hearing parents cry about their 8 year old being addicted to the internet. Take their phone away. It's truly that simple. Why did you give an 8 year old a phone to begin with? I'm betting the real answer is "to shut them up so I don't have to pay attention to them." I don't care if Purdue Pharma is making cartoons about how cool opioids are, you're the reason they keep watching it. Y'all cram a screen in their face every second they get and then wonder why they shoot up their schools or kill themselves. This kind of crap is why abortion should not only be legal but, in some cases, mandatory.
Why do you give a kid a phone?
Maybe because they leave the house occasionally and need a way to call home to get picked up? When was the last time you saw a pay phone?
You let your eight-year-old willy-nilly walk around the streets? Unless they're going to a friend's house no more than two or three blocks away, they need to be driven, or you walk with them. Once there, they can use the phone there to call for you to come get them, or the parent of the friend can drive them back. We're long past the time when suburban or rural children can walk about without care, and no way in hell should an urban child be left unattended and out of sight.
When I was 8, my friends and I would get dropped at the mall on a Saturday, watch a movie, get a burger, and call someone's parent to come get us afterwards. By the time we were 12, we were riding our bikes to the theatre and calling for a ride if it started raining. Somehow we survived.
You sound like the kind that keeps their kid on a leash and loses their shit if it dares to be in a different room from you.
A platform that is built almost solely to reduce the average intelligence level of society.
My daughter sure as hell didn't have access to social media, etc. at 8 or 9 years old.
That's not actually a very good way to protect her. As people have pointed out, she WILL see that stuff even if you don't know about it and even if it's not very often.
Better is to make sure that, by 8 or 9 years old, she knows that the Internet is full of fakes and idiots, and anything that seems dodgy probably is. And specifically that anything labelled as a "challenge" is trying to get you to do something moronic and self destructive.
It's not an either/or. You can do both. And social media access at 8 or 9 years old is idiotic.
It kind of IS an either/or. If the kid "does not have access to" TikTok or Youtube or whatever, that implies that you are not watching TikTok or Youtube or whatever with the kid, which means you are not in a position to point out how various videos are stupid or manipulative.
At 8 or 9, not very many people can learn to recognize what's going on behind the scenes just by being abstractly told "there are a bunch of liars and fools on the Internet". They need specific examples.
You can either provide those examples, or not.
You really want people to read all that?

So, here's the only real solution I can think of, which will never happen: break down all social media companies, period.
I'm not making this statement just because I hate social media, which I do. It's about the core problem that all social media platforms have: they are not only too big to fail, they are also too big to monitor themselves or take any relevant action to stop things like this from happening.
It's always obvious how dangerous and stupid cases like these are... after the fact. People who don't participate in the specific social network are completely unaware of what is happening until it hits the news. It's not because they are stupid, and it's not because parents don't care what happens to their kids. It's because that's a function of the platform itself.
They are all built with a relative degree of exclusivity or anonymity, sometimes painting themselves as a sort of refuge for a specific class of people, sometimes forcibly taken by storm by a class of consumers and users who are just bored with what is currently out there.
The scale problem is easy to see, but perhaps pretty difficult to understand the full consequences of.
Platforms that host user-submitted content, with not only billions of people using them but also millions of people submitting content, inevitably fall into the trap of permissiveness.
Why? Because no matter how many employees you have, and no matter how hard you drive them, it will never be enough to review the content on the platform at an appropriate or satisfactory level. You could have an entire country's army handling it, with specific training and proficiency (that doesn't exist and will never happen, but let's play hypotheticals here), and it still wouldn't be enough.
See, let's look at a few statistics here... I've done this with Facebook and Twitter in the past.
TikTok has somewhere around 1 billion monthly active users. That's active; the app itself has been downloaded some 3 billion times. It's an absurd number, but let's say that only a small percentage of that really uses the platform frequently. It'd still be too much. A hundred million. Ten million. Still too much.
167 million TikTok videos get watched in a single minute, on average, every day. In a single minute. See the problem with that? How could something like that ever be monitored?
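A rough back-of-envelope calculation makes the monitoring problem concrete. The inputs below (upload volume, clip length, review overhead) are my own illustrative assumptions, not TikTok's figures:

```python
# How many full-time humans would it take just to review every upload?
# All inputs are assumed round numbers for illustration only.
uploads_per_day = 34_000_000     # assumed daily video uploads
avg_video_seconds = 30           # assumed average clip length
review_overhead = 2.0            # assume review takes ~2x the clip length
shift_seconds = 8 * 3600         # one 8-hour shift per reviewer

review_seconds = uploads_per_day * avg_video_seconds * review_overhead
reviewers = review_seconds / shift_seconds
print(f"~{reviewers:,.0f} full-time reviewers needed every single day")
```

With these assumptions it comes out to roughly seventy thousand full-time reviewers, before even touching livestreams, comments, or appeals.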
Several of the TikTok challenge videos have over a million views. Some have over 100 million.
The challenge format is huge because it's engaging; it racks up huge profits not only for creators but also for the platforms themselves.
Much like several other damaging and damning social media trends, this one also shows up in all those market-analysis statistics, so advertisers and brands drool over the numbers and think of some way of exploiting it for profit.
You know what numbers like those do not allow for? Moderation, monitoring, effective policing, meaningful management and interference - control in general. It's out of control because it's too big. Seven kids dying is always awful to hear, but at sizes like those, a fact like that becomes a statistic for anyone involved who could act to stop or prevent it somehow.
Can these platforms do more? Yes, sure, but more what? With numbers like those, it's a rat race. They might not be as blind as the general population to what's happening on their platform, but let me tell you - they are likely not too far ahead. People were just not built to deal with numbers like that in any meaningful way.
A trend will pop up, more kids will die, and they'll have no way of knowing beforehand. They'll do something to prevent one thing while letting hundreds or thousands of others go by.
Because the only thing these platforms can do at that scale is employ some sort of automated algorithmic monitoring and management, and those systems are just not advanced enough to accurately pick and choose, judge, and apply fuzzy human policies. For that, we'd need a singularity, and despite recent news, we are nowhere close to one.
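The base-rate problem behind this can be sketched with assumed numbers: even an automated filter with a very low false-positive rate buries the real hits at this volume. The rates below are illustrative assumptions, not measured figures:

```python
# Base-rate arithmetic for automated moderation at feed scale.
# Both rates below are assumptions chosen for illustration.
items_per_minute = 167_000_000   # the per-minute figure quoted above
harmful_rate = 1 / 100_000       # assume 1 in 100k items is truly harmful
false_positive_rate = 0.001      # assume a generous 0.1% false-positive rate

true_flags = items_per_minute * harmful_rate
false_flags = items_per_minute * (1 - harmful_rate) * false_positive_rate
print(f"per minute: ~{true_flags:,.0f} real hits vs ~{false_flags:,.0f} false alarms")
```

Under these assumptions every real hit is buried under roughly a hundred false alarms, each of which still needs a human to look at it; tightening the filter to cut false alarms makes it miss more of the real ones.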
Worse yet, even if we had some sort of AI to handle it all, I doubt people would accept its decisions as final. It becomes a game of blaming the AI devs or complaining about the AI instead of going after the business behind it.
So, with all that laid out, what could we possibly do to avoid these worst-case scenarios? Break them up into parts small enough that regional administration, monitoring, moderation, and action are feasible at the hands of people.
Current social networks kind of evolved from that: older social networks, discussion forums, chat rooms, newsgroups, etc.
Oh, but that's old; no one wants that! Well, it's not about what people want, it's about what has a chance of working. Putting these platforms in the spotlight, vilifying CEOs, calling for depositions, and the regulations, fines, and penalties that have already been passed against social platforms are clearly ineffective. No matter what people think of doing to stop it, it doesn't get to the point: the entire concept of social networks is like a drug, or gambling - something societies in general got addicted to and can't let go of. People have become dependent on it, but it's exactly that dependence that creates all these problems.
The closest thing I've seen to something more reasonable is decentralized networks. But another form could just be limiting the number of entries seen in a day, the time spent on the platform, the amount of content that can be submitted at a time, the number of views an entry can have, the time an entry can be kept up on the platform, how curation works, how advertisement works... plus a myriad of other things that have to go back to the drawing board and restart from scratch with consideration for what the platform is propagating, versus pure for-profit motivations.
I used to think this wasn't necessary, before people started looking at social media as news sources. They should never have become that - just a form of harmless entertainment not to be taken too seriously, or socialization that you also treat as a gathering of strangers.
That time was too long ago, and now we are in this situation where every single method of forcing news sources to be somewhat responsible for what they propagate has fallen by the wayside, and nothing of substance is being done about it.
And this isn't only about kids; this is about everyone: the entire media landscape, the entire consumer base, adults, old people.
I also used to think that people would adapt, that cultures would prop up some framework to control the worst impulses these platforms give rise to... I was wrong. This is a self-sustaining cycle of hatred and horrible sh*t. It has become its own economy and its own industry, supported by a status quo of giant corporations that will never let go of the guaranteed profits, which only drives problems like the wage gap and late-stage capitalism further, and it'll never end unless it self-destructs, taking everyone else with it. There's your "destroyer of worlds"; a nuke looks like a cute pet in comparison, right? Only it isn't.
Social networks are the ultimate form of the tragedy of the commons in the virtual world. The resource being exploited is attention, dedication, focus, time, effort. The consequences of its depletion are far worse than the depletion of any other single resource... it's literally killing people, leading people down destructive paths, causing divides in society, elevating hatred, and potentially being behind wars, the rise of populism, the rise of denialism, fake news, and a whole ton of other crap. It's no surprise that it also fuels crap like challenges that kill kids.
We were in so much better a position as communities and societies just a couple of decades ago that it's self-evident. In what other period could we say clearly, without nostalgic interference or selective bias, that we were better off before as societies and communities as a whole?
I don't care what others say, and I have done my entire journey of checking whether this isn't only nostalgia - the Internet was better before social networks. It should've stopped there. Or, at the very least, taken a different route.
Spoiler tags are your friend. Use spoiler tags wisely.
Man ffs it's longer than the damn article!
Post it to a blog or ask Ars if they want to hire you.
I got bored this week and gave TikTok a shot while on an elliptical. Starting from scratch, it was 2 hours of marking "not interested" to thirsting over K-pop bois (turns out my bias is exasperation), Harry Styles, candid shots of hot guys on public transportation, and Stranger Things reactions.
It felt like being on the bus in middle school again, a demographic that is completely immune to peer pressure. /s
Incidentally, did anyone else have a fad in your middle school where you would crouch down, hyperventilate, and then stand against a wall while someone puts pressure on your chest for 10 seconds making you pass out? History seems to repeat itself.
Has every scintilla of common sense suddenly evaporated? Even at that age I knew better than to deliberately strangle myself (or others). Geezus, are these kids eating paint chips?
Years ago, I read a biography of Ted Turner, which contained a passage on how Mr. Turner saw the influence of television. I don't recall the exact quote, but he said something to the effect of "I could go on the air during the Saturday morning cartoon block, show kids how to light a book of matches, and by the afternoon Atlanta would be a smoking pile of ashes".
My god that is fantastic! I wish I'd thought of that.Your comment reminds me of the "Send Me To Heaven" app, which I did not realize until just now is apparently still available on the Google Play Store.
For those who don't know:
https://en.wikipedia.org/wiki/Send_Me_To_Heaven
(I like how some people tossed their phones in the air without even downloading the app. Sociologists should study that one closely.)
Before the "parents shouldn't have let them use TikTok" comments flood in:
We decided as a society a ways back that parental responsibility did not give companies free rein to market harmful items to minors.
Such as "Joe Cool" a cartoon camel advertising cigarettes: https://www.ftc.gov/news-events/news/pr ... w-ftc-says
There's certainly a need to prove (in court) that TikTok is addictive to children, and further that TikTok knowingly engages in pushing this addictiveness towards children.
But parental responsibility is not a shield for companies trying to entice children into harmful addiction generally.
A parent that gave their 8-year-old a cigarette would be ignoring Surgeon General's warnings plastered all over the product packaging, numerous signs at every cigarette retailer in the country stating the age limit to purchase and that "buying cigarettes for minors is a crime," as well as decades of publicly available research on the harmfulness and addictiveness of smoking.
Parental responsibility is not a shield, and TikTok should be prosecuted for the crime of marketing an addictive and dangerous product to children. But if TikTok is responsible for the deaths of children far too young to be using an addictive product (it is), the parents also share responsibility. A parent who gave an 8-year-old a cigarette would be negligent (at best). The same applies to TikTok, which is far more addictive, as anyone who has spent any time on it knows.
Both TikTok and the parents can be in the wrong here, and are.
The "Ice Bucket Challenge" is the only one that comes to mind, and that was pre-TikTok as far as I can tell.

Are there any socially or personally good TikTok challenges? Is it that I only ever hear of the horrible ones? Do they make any effort to moderate those kinds of destructive memes?
FWIW, I don't even consider the age factor to be all that relevant. A friend of mine died attempting something similar in a college dorm. Another friend found them. Everyone in the vicinity was gutted. It's a terrible practice to suggest to anyone, of any age, for any reason.
Those setups terrify me on a professional level. The people who made these designs have made pretty much every possible design choice to make this activity as dangerous as possible.

I saw this one the other day; very tragic. Ann Reardon covered a trend on YouTube that has killed over thirty adults verified to date so far: the wood-burning thing with electricity. Nobody cares, though, and her video was pulled because it was "dangerous"... These companies need to show the bare minimum of due diligence, imo.
https://youtu.be/GZrynWtBDTE
As an electrical engineer, I can tell you that I wouldn't touch that with a ten-foot pole (even one of known insulative quality) - I certainly have the knowledge and skills to make a "safe" wood-burning device for this, but the price of a mistake in design or implementation is quite high.
I happened on a video a while back that showed steps for making a tack welder out of a microwave transformer and was really disturbed by the lack of care about safe handling of very deadly electrical currents.
To put it nicely, children are ignorant. They do not actually know better. Hell, they had to put a full-screen, seconds-long warning on Beavis & Butthead because they were worried that teenagers would drink turpentine after seeing it done in a cartoon. That's why we need parents to actually lift a couple of fingers and teach them right from wrong and dangerous from safe, but we also need to demand some sort of responsibility from the corporations circulating misinformation.
You sound like someone who is completely detached from the realities faced by families with two parents who are likely working a combined (minimum) 80 hours a week. Edit: this on top of homemaking.

This is dead easy to solve. Don't give your young children a phone, and exercise constraints on when or where they can use other devices like games or computers.
It really is that easy. Our girls didn't get a phone until they were 13, and they didn't have a television or computer in their own bedrooms until they could buy their own. Sure, they whinged about it solidly from maybe age 10, but hey, parents have responsibilities, and we explained why. By 13 we'd had all the necessary conversations, and so had the school (which, by the way, is one of the few state schools that doesn't allow any use of phones in school for anyone under 18, and hasn't so far undergone societal collapse).
This is not rocket science and although there are always going to be tragedies like these in the article, and sometimes it will make sense to prosecute or legislate, eventually parents do actually have to be responsible. (Someone else has already made the point that just because other parents are too lazy or daft or don't care doesn't mean you have to follow the herd.)
The other thing to be terrified of is that the leads they're touching are having their voltage ratings exceeded by a couple of orders of magnitude. There is nothing preventing the parts they're touching from becoming live and electrocuting them at any point.

It IS your EE background talking.
https://youtu.be/GZrynWtBDTE
I agree with your point.
I believe the original video is back up for anyone who is interested.
https://www.youtube.com/watch?v=wzosDKcXQ0I&t=708s
When I heard of this, I thought there must be some sneakily dangerous aspect to it. But it's pretty obvious that this is super dangerous (similar to the people asphyxiating themselves in this story). So many "I almost tried this, I didn't think it was dangerous" comments really make me lose faith in people. Sticking a fork in an electrical outlet is safe compared to what is going on here. Maybe that's just my electrical engineering background talking.
The average person out there has very limited knowledge of electricity, how it flows, what voltage/current mean, and especially how dangerous it can be when mishandled.
Even residential electricians are often dangerously ignorant in areas outside the voltage and current ranges they see in their daily work.
Hell, even for myself: I ducked out of the power/circuits track of EE as early as course requirements let me, so I am well aware there are huge areas dealing with high voltages and high currents whose properties I'm unfamiliar with. Thus, I avoid messing with them.
I didn't even have the option to take classes on AC power outside lower division. I learned a decent bit on the job about DC high power, but was still uneasy testing a 160A load at 35VDC even with all the precautions in place.
The videos showing people holding live leads to set fire to wood literally made me nauseous. I always feel like people treat electricity like magic, but people still need to learn young that exposed electrical lines == DANGER.
Low voltage with high (and galvanically isolated) current is safER, but not completely safe and should still be treated with respect.

I saw this one the other day, very tragic.
As an electrical engineer I can tell you that I wouldn't touch that with a ten foot pole (even of known insulative quality) - I certainly have the knowledge and skills to make a "safe" wood burning device for this, but the price of a mistake in design or implementation is quite high.
I happened on a video a while back that showed steps for making a tack welder out of a microwave transformer and was really disturbed by the lack of care about safe handling of very deadly electrical currents.
Welders are fine, beyond the basic "don't lick the live wire" safety steps. You cut off the high-voltage secondary and put in a two-turn low-voltage secondary instead. You can burn yourself badly with it, and you'll have a _very_ bad day if you stab yourself with both contacts but you'd have to work pretty hard to kill yourself with it.
The main reason low voltage is safer is that dry skin resistance is quite high, ~100 kΩ. But there are two key points: skin is the main resistance, and internal resistance across the chest is only in the neighborhood of 500-1000 Ω. And skin resistance is greatly reduced when the skin isn't dry; sweat in particular is not just water but electrolytes, and it can bring skin resistance down by two orders of magnitude.
In that case even low voltages can be dangerous if the current is not limited. So if you're making one of those tack welders, take the time to put insulation on anything that sweaty skin might contact, and properly enclose the transformer - the video I saw had it mounted in the open on a board.
Of course, that's still many orders of magnitude safer than the 2-4 kV used in the wood burning; those videos are horrifying.
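For anyone who wants to sanity-check those figures, here's a minimal Ohm's-law sketch in Python. The resistances are the ballpark numbers from this thread, not medical reference values, and the simple series model ignores AC effects and current path:

```python
def body_current_ma(volts, skin_ohms=100_000, internal_ohms=750):
    """Rough series-resistance model: current (in mA) through the body.

    Ballpark figures from the thread, not medical data:
    ~100 kOhm dry skin, ~500-1000 Ohm internal (750 used here),
    and ~1 kOhm total for sweaty skin (two orders of magnitude lower).
    """
    return 1000.0 * volts / (skin_ohms + internal_ohms)

# ~2 V MOT tack-welder output on dry skin: ~0.02 mA, imperceptible
print(round(body_current_ma(2), 3))
# 2 kV wood-burning rig on sweaty skin (~1 kOhm): over an amp,
# far beyond the tens of mA commonly cited as enough to stop a heart
print(round(body_current_ma(2000, skin_ohms=1_000)))
```

The model is the same in both cases; the difference between "can't even feel it" and "lethal" is entirely the source voltage and the skin condition.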
I'm an EE too - I understand exactly how this works. Here's the trick: 1 kΩ skin resistance (your two-orders-of-magnitude reduction) means we get 1 mA per volt. Those MOT welders typically have a volt or two on the output. A nine-volt battery can push more current through you than that! So can AA cells, and 18650 cells can certainly put out a LOT more current (and are ~4 V to boot).
So that's about how dangerous a MOT spot welder is, voltage-wise. It's not that it can't kill you, but you're going to have to work for it: something like stabbing both hands to get past the skin resistance entirely and routing the current through your heart. Or, far more likely, accidentally zapping yourself on the primary winding when you're not paying attention and getting unlucky with an arrhythmia.
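The "1 mA per volt" figure and the nine-volt-battery comparison check out numerically. A quick sketch, assuming the thread's ~1 kΩ sweaty-skin figure and a ~2 V MOT secondary (both ballpark assumptions, not measurements):

```python
SKIN_OHMS = 1_000  # sweaty-skin ballpark from the thread

def skin_current_ma(volts, ohms=SKIN_OHMS):
    # Ohm's law: at 1 kOhm, every volt drives 1 mA
    return 1000.0 * volts / ohms

mot_welder = skin_current_ma(2)    # ~2 V MOT secondary -> ~2 mA
nine_volt  = skin_current_ma(9)    # 9 V battery -> ~9 mA
li_ion     = skin_current_ma(4.2)  # fully charged 18650 cell -> ~4.2 mA

# The household batteries really do drive more current through the
# same skin resistance than the welder's secondary does.
print(nine_volt > mot_welder, li_ion > mot_welder)
```

Which is the point: the MOT secondary is scary because of its burn/short-circuit current, not because its open-circuit voltage can push dangerous current through intact skin.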
That is the type of voltage a welder should be running at. I would not trust that what a YouTube video is showing will actually be at that level unless they demonstrated it. I would not be surprised if they failed to explain what they were doing, in addition to what the expected outcome was.
If it's getting down that low in voltage, then it's safer than I expected (of course, if the how-to videos bothered to show the winding ratios, or even just put a multimeter reading on screen, folks following along would know what to expect).
I had estimated more like 10-12 V, but again that was purely based on watching a video that was completely uninformative about what outcome to expect.
Thanks for the info.
Parents suing TikTok say it's obvious when kids post videos of themselves that they're too young for TikTok.
Even if you don't let your kid have access to TikTok, can you guarantee none of their friends have it?
uhhh, if the parents *realize* their kids are *too young* for TikTok, why are they letting their kids use it?
Or, if they are letting their kids use TikTok, why not also have the conversation first along the lines of "okay, you're going to see some pretty stupid shit here, so let's review if you think it's okay to follow along in these situations..." You know, like, "actually parent" in the situation where your kid needs it to, like, grow up successfully and avoid collecting any Darwin awards...?
This parent reading the article says "Parents who leave their young kids alone on TikTok have obviously not grasped who is ultimately responsible for the safety of their child."
Jeezuz, I mean it's not far off from giving your kid matches and a can of WD-40 with no guidance, and then wanting to sue those companies when your kid burns themself.
Are you unfamiliar with the concept of emergencies?

I'm getting so tired of hearing parents cry about their 8 year old being addicted to the internet. Take their phone away. It's truly that simple. Why did you give an 8 year old a phone to begin with? I'm betting the real answer is "to shut them up so I don't have to pay attention to them." I don't care if Purdue Pharma is making cartoons about how cool opioids are, you're the reason they keep watching it. Y'all cram a screen in their face every second they get and then wonder why they shoot up their schools or kill themselves. This kind of crap is why abortion should not only be legal but, in some cases, mandatory.
Why do you give a kid a phone?
Maybe because they leave the house occasionally and need a way to call home to get picked up? When was the last time you saw a pay phone?
You let your eight-year-old willy-nilly walk around the streets? Unless they're going to a friend's house no more than two or three blocks away, they need to be driven, or you walk with them. Once there, they can use the phone there to call for you to come get them, or the parent of the friend can drive them back. We're long past the time when suburban or rural children can walk about without care, and no way in hell should an urban child be left unattended and out of sight.
The parents of kids watching TikTok videos are complaining that TikTok is allowing kids who are clearly under 13 to post videos. Which is a fair complaint.
My brain isn't working so maybe it is there, but I couldn't find an explanation of what the Blackout Challenge is. Did I overlook it in the article?
When I was 8, my friends and I would get dropped at the mall on a Saturday, watch a movie, get a burger, and call someone's parent to come get us afterwards. By the time we were 12, we were riding our bikes to the theatre and calling for a ride if it started raining. Somehow we survived.
You sound like the kind that keeps their kid on a leash and loses their shit if they dare to be in a different room from you.
35 years ago, at age 6-7, I used to ride in the streets with my cousins alone and play football. The regulations and traffic are insane now. You just cannot let a kid outside alone nowadays. Add to that all the predators and irresponsible f**ks out there, and it's just not the same world.
This applies to giving your child unrestricted access to anything on a phone, AND to posting videos of themselves online. Don't you think some perv somewhere is getting off on those videos? Or that your kid is being influenced, his brain shaped by the images and videos he sees? Don't you think it's damaging his attention span, overstimulating his brain, and promoting the instant-gratification culture?
Kids have no business using those garbage social websites, and even less business posting pics and videos of themselves that can be used to locate and track them, or be misused one way or another....