Parents sue TikTok after 7 kids die from profitable Blackout Challenge videos

NoyaNoya

Smack-Fu Master, in training
19
Parents want TikTok to pay for addicting their kids, demanding a jury trial to decide whether TikTok's design needs to change. Damages sought right now are unspecified but are expected to cover intangible losses suffered by the kids before they died, as well as the loss of each kid's "future earning capacity" and "normal activities, pursuits, and pleasures."

How do you even calculate this? Working adults, sure, but kids?

Some of these steps include adding a process for age and identity verification at sign-up so that TikTok could notify parents of their app use and block content like Blackout Challenge videos from ever reaching kids' feeds.

Sure, hand out more private data to TikTok. That'll solve things.
 
Upvote
7 (7 / 0)

alexr

Ars Tribunus Militum
2,000
So if a child goes to a friend's place and their parent gives them a Bud Light and something happens to that child, is the responsibility on the parent, or Budweiser? It's a different world, and parents aren't adapting fast enough. I think part of it is that those of us (like me, although I don't have kids) who grew up just as technology was starting all did things that we shouldn't have with it. We aren't translating that to today, where we should know some tech is dangerous and we shouldn't give our kids, or our kids' friends, access to everything that they want.


A better analogy would be that a child goes to a friend’s place. Every time they open their friend’s fancy new smart fridge, it tells them to try the Bud Light, because the fridge manufacturer’s recommendation algorithms have determined that people often keep coming back for more when drinking them. The child pauses for a moment, considering this suggestion, and then gets the apple juice out. But next time they open the fridge, it suggests the Bud Light again, because the child paused a bit last time. This time, once it’s finished telling them to drink alcohol, it follows up by telling them a fun trick they can do to open the bottle with only a shoe. Then it tells them to try drinking 40 Bud Lights as a challenge. Then it tells them how to use the bottle to make a Molotov cocktail.

It isn’t a huge stretch of the imagination to call the manufacturer of this hypothetical fridge irresponsible, I think.
 
Upvote
15 (15 / 0)

bomx

Wise, Aged Ars Veteran
133
Another day, another scammer illegally aiming at the deep pockets instead of the guilty party.

TikTok is distributing this content and needs to be held accountable for the deaths it directly caused. Scammers are the ones pretending that corporations can’t be held accountable, like yourself. All internet companies should be held 100% responsible for content. YouTube should never have had rampant piracy, that’s illegal. There was a choice made and it was the wrong choice; we don’t need a free-for-all online, where free content is exploited irresponsibly. Ars takes responsibility, TikTok can too.

I disagree, but it’s funny how YouTube and TikTok can be 300% successful removing infringing content like piracy (while accepting that quite a bit of news, commentary, remixing, and other fair use will be removed and punished as well). They’ve been forced to care about that! If we want them to stop promoting videos that incite violent revolution and hate crimes and self-harm, we could force them to care about those issues too.

First of all, I agree with you that we should force them to care.

The copyright infringement flags are largely automated. There isn't much human effort involved compared to the amount of audio and video that needs to be checked for copyright infringement.
Nudity, and therefore porn, is also not terribly difficult to filter out with automated checks (the ratio of skin tones to other colors in a frame, etc.).
The things you mention, however, have traditionally not been easy to filter. It's fellow human beings who have to watch all that filth and evil content, and they get burnt out and depressed so fast, trying to keep up with the incredible mountains of content they need to screen. If they make a single mistake, or rather if someone doesn't report harmful content fast enough, it gets out of hand real quick.
With machine learning, it should eventually be possible not to rely on people to filter out all the crap, but right now it's just not there. So for now it's a matter of forcing companies like TikTok to hire more people who will inevitably burn out in record time. Imagine having to watch harmful content all day. I wish that part could be easily automated now; what those moderators have to watch every day as part of their job sounds like torture.
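For what it's worth, the kind of skin-tone heuristic mentioned above can be sketched in a few lines. The RGB rule and the threshold here are made-up illustration values; production systems use trained classifiers, not hand-written color ranges:

```python
# Toy illustration of the "skin-tone ratio" heuristic.
# The color rule and the 0.4 threshold are invented for demonstration only.

def skin_pixel_ratio(pixels):
    """pixels: iterable of (r, g, b) tuples, each channel 0-255."""
    def looks_like_skin(r, g, b):
        # Crude rule of thumb: reddish tones dominate, moderate brightness.
        return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

    total = skin = 0
    for r, g, b in pixels:
        total += 1
        if looks_like_skin(r, g, b):
            skin += 1
    return skin / total if total else 0.0

def flag_for_review(pixels, threshold=0.4):
    # Flag frames whose skin-tone ratio exceeds the (arbitrary) threshold.
    return skin_pixel_ratio(pixels) > threshold

# Example: a "frame" that is mostly skin-toned vs. one that is mostly blue.
mostly_skin = [(210, 150, 120)] * 8 + [(30, 30, 200)] * 2
mostly_blue = [(30, 30, 200)] * 9 + [(210, 150, 120)] * 1
print(flag_for_review(mostly_skin), flag_for_review(mostly_blue))  # True False
```

The point isn't that this is accurate (it has obvious false positives and biases), just that it's cheap and automatic, unlike screening "challenge" content.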
 
Upvote
9 (9 / 0)

BrangdonJ

Ars Praefectus
4,612
Subscriptor
Those assholes you see weaving around traffic at 30 over the speed limit? Often they're recording and posting on TikTok. There are entire groups on TikTok dedicated solely to making and posting videos of driving dangerously & illegally. They get people killed - gruesomely.
TikTok doesn't have any features that really enable any kind of "groups". In fact, it is designed specifically to prevent that.
The TikTok algorithm produces a kind of virtual group. If you watch cat videos, you'll get more cat videos. Same with other subjects. These groups may not be reified as a software data structure, but they are persistent in the data, to the point where they get noticed and talked about, for example as "Harry Potter TikTok". Some people use it in their marketing spiel: "If this is on your For You page, then you are interested in short tattooed Asian girls like me."

I'm like esqadinfinitum above: I use TikTok a lot, but I've never seen any of this bad stuff. I'm the same on Twitter and Reddit. There I can avoid the cess-pit areas by not subscribing to them. On TikTok it happens automatically. I don't use YouTube or Facebook much; my impression is that their algorithms push people more towards bad stuff than TikTok's does, but that may just be me and my lack of interest in such.
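That feedback loop ("watch cat videos, get more cat videos") can be sketched as a trivial topic-weighting scheme. This is a deliberate oversimplification with invented topic labels; real recommenders weigh watch time, rewatches, shares, and far more signals:

```python
from collections import Counter

# Minimal sketch of an engagement feedback loop: every watch bumps the
# weight of that video's topics, and the next pick favors the heaviest topics.

interests = Counter()

def record_watch(topics, watch_fraction):
    # watch_fraction: how much of the clip was watched (0.0 - 1.0).
    # Even a partial watch (a pause, like the smart-fridge example) adds weight.
    for topic in topics:
        interests[topic] += watch_fraction

def recommend(candidates):
    # Pick the candidate whose topics carry the most accumulated weight.
    return max(candidates, key=lambda c: sum(interests[t] for t in c["topics"]))

record_watch(["cats"], 1.0)
record_watch(["cats"], 0.9)
record_watch(["diy"], 0.2)

feed = [
    {"title": "kitten compilation", "topics": ["cats"]},
    {"title": "woodworking tips", "topics": ["diy"]},
]
print(recommend(feed)["title"])  # "kitten compilation"
```

Nothing in a loop like this ever stores a "group", yet everyone with a heavy "cats" counter ends up seeing the same slice of the platform, which is how "Harry Potter TikTok" can exist without any group feature.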
 
Upvote
6 (6 / 0)

The Lurker Beneath

Ars Tribunus Militum
6,636
Subscriptor
I'm getting so tired of hearing parents cry about their 8 year old being addicted to the internet. Take their phone away. It's truly that simple. Why did you give an 8 year old a phone to begin with? I'm betting the real answer is "to shut them up so I don't have to pay attention to them." I don't care if Purdue Pharma is making cartoons about how cool opioids are, you're the reason they keep watching it. Y'all cram a screen in their face every second they get and then wonder why they shoot up their schools or kill themselves. This kind of crap is why abortion should not only be legal but, in some cases, mandatory.

Why do you give a kid a phone?

Maybe because they leave the house occasionally and need a way to call home to get picked up? When was the last time you saw a pay phone?

You let your eight-year-old willy-nilly walk around the streets? Unless they're going to a friend's house no more than two or three blocks away, they need to be driven, or you walk with them. Once there, they can use the phone there to call for you to come get them, or the parent of the friend can drive them back. We're long past the time when suburban or rural children can walk about without care, and no way in hell should an urban child be left unattended and out of sight.

When I was 8, my friends and I would get dropped at the mall on a Saturday, watch a movie, get a burger, and call someone's parent to come and get us afterwards. By the time we were 12, we were riding our bikes to the theatre and calling for a ride if it started raining. Somehow we survived.

You sound like the kind that keeps their kid on a leash and loses their shit if it dares to be in a different room from you.


It's pretty ironic. Fifty years ago, nobody turned a hair when kids were out on the streets, or travelled miles on public buses. Our parents warned us about strange men in white vans but otherwise generally left it up to common sense. They would have been amazed and excited, I'm sure, to hear about modern phone tech (the likes of which at that time would have been seen only on Star Trek; not even sure they had videophones, and Dick Tracy was an outlier who wasn't really part of the culture), thinking it would surely make us safer...
 
Upvote
16 (16 / 0)

jm_leviathan

Ars Scholae Palatinae
940
A platform that is built almost solely to reduce the average intelligence level of society

A considerable and growing proportion of capitalist enterprises are actively harmful to society. Modern capitalism is mostly about capturing the minds of the masses so as to vampirically feed off the energies of an otherwise surplus population. Gambling, alcohol, cigarettes, drugs, junk food, social media, advertising, pornography, most of the beauty and wellness industries, some of the pharmaceutical and "mental health" industries, a large and growing proportion of video games, films and entertainment products devoid of intrinsic value, politics as cage-fight entertainment.

On the one hand, the masses need to be pacified if not outright stupefied because they are a potential threat. On the other hand, an increasing proportion of the population is increasingly unnecessary to the processes that generate wealth. The gold standard is that depicted by The Matrix: humans as literal batteries for capital and power, but otherwise inert.
 
Upvote
9 (10 / -1)

Ptahhotep

Ars Scholae Palatinae
789
Relevant: it doesn't help that it's 100% "normal" in movies and TV series that people who get shot, run over or beaten unconscious quickly (or immediately) recover without adverse effects.

In the real world these things more often than not lead to lasting disability, brain damage, death, or all of those (no joke).
And even if that does not happen (which in my experience is quite rare) recovery is never quick.

So people really are stupid but we largely have ourselves to blame for that.

That one time a "little" accident in my youth rendered me unconscious I woke up in hospital and after I was released I was unable to walk without crutches for weeks. I did not have open wounds or broken bones.
 
Upvote
-1 (2 / -3)

slipleft

Ars Scholae Palatinae
701
My daughter sure as hell didn't have access to social media, etc. at 8 or 9 years old.

That's not actually a very good way to protect her. As people have pointed out, she WILL see that stuff even if you don't know about it and even if it's not very often.

Better is to make sure that, by 8 or 9 years old, she knows that the Internet is full of fakes and idiots, and anything that seems dodgy probably is. And specifically that anything labelled as a "challenge" is trying to get you to do something moronic and self destructive.

It's not an either/or. You can do both. And social media access at 8 or 9 years old is idiotic.

It kind of IS an either/or. If the kid "does not have access to" TikTok or Youtube or whatever, that implies that you are not watching TikTok or Youtube or whatever with the kid, which means you are not in a position to point out how various videos are stupid or manipulative.

At 8 or 9, not very many people can learn to recognize what's going on behind the scenes just by being abstractly told "there are a bunch of liars and fools on the Internet". They need specific examples.

You can either provide those examples, or not.

Using TikTok as a teaching prop is not the same as giving access to TikTok for unsupervised leisure use. It's no different from heroin in this regard.

IMO a child at that age shouldn’t have unsupervised access to social media, though they may be exposed to it outside of the home.
 
Upvote
0 (1 / -1)

Ptahhotep

Ars Scholae Palatinae
789
I haven't read all the comments so bear with me and sorry if I got ninja'd...

In my opinion we shouldn't just go after TikTok but first and foremost after the turds who come up with these "challenges" and put them on social media in order to trick people into doing super stupid dangerous things.

Take their stuff and toss them in jail. Let them invent their own "challenges" there, like "insult the kingpin".

Okay I'll admit I'm angry otherwise I'd keep my medieval suggestions to myself.
 
Upvote
7 (8 / -1)

Fuzzypiggy

Ars Scholae Palatinae
1,108
TikTok is a cancer on humanity, turning people into simpering braindead morons who've been trained to have the mental focus of a goldfish. Let's not even get into how TikTok is basically owned by the Chinese government, and it's odd how they tightly control TikTok content to be wholesome within China while outside they let any old crud be on the platform, almost like they're trying to turn the rest of the world into morons.
 
Upvote
0 (4 / -4)

dlux

Ars Legatus Legionis
25,514
Your comment reminds me of the "Send Me To Heaven" app, which I did not realize until just now is apparently still available on the Google Play Store.

For those who don't know:
https://en.wikipedia.org/wiki/Send_Me_To_Heaven
My god that is fantastic! I wish I'd thought of that.

(I like how some people tossed their phones in the air without even downloading the app. Sociologists should study that one closely.)
 
Upvote
13 (13 / 0)

Ptahhotep

Ars Scholae Palatinae
789
So, here's the only real solution I can think of, which will never happen: Break down all social media companies, period.

I'm not making this statement just because I hate social media, which I do. It's about the core problem all social media platforms share: they are not only too big to fail, they are also too big to monitor themselves or take any relevant action to stop things like this from happening.

Cases like these are always obvious in how dangerous and stupid they are... after the fact. People who don't participate in the specific social network are completely unaware of what is happening until it hits the news. It's not because they are stupid, it's not because parents don't care what happens to their kids, etc. It's because that's a function of the platform itself.

They are all built with a relative degree of exclusivity or anonymity, sometimes painting themselves as sort of a refuge for a specific class of people, sometimes forcibly taken by storm by a class of consumers and users who are just bored with what is currently out there.

The scale problem is easy to see, but perhaps pretty difficult to understand the full consequences of.
Platforms that host user-submitted content, with not only billions of people using them but also millions of people submitting content, inevitably fall into the trap of permissiveness.

Why? Because no matter how many employees you have, and no matter how hard you work them, it will never be enough to review the content on the platform at an appropriate or satisfactory level. You could have an entire country's army trying to handle it with specific training and proficiency (doesn't exist and will never happen, but let's play hypotheticals here), and it still wouldn't be enough.

See, let's look at a few statistics here... I've done this with Facebook and Twitter in the past.
TikTok has somewhere around 1 billion monthly active users. That's active; the app itself has been downloaded some 3 billion times. It's an absurd number, but let's say that only a small percentage of that really uses the platform with frequency. It'd still be too much. A hundred million. Ten million. Still too much.

167 million TikTok videos get watched in a single minute, on average, every day. In a single minute. See the problem with that? How could something like that ever be monitored?
Several of the TikTok challenge videos have over a million views. Some have over 100 million.
The format is huge because it's engaging, and it racks up huge profits not only for creators but also for the platforms themselves.
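To put those numbers in perspective, here's a rough back-of-envelope calculation. The 167M views/minute figure comes from the post above; the uniqueness fraction and moderator throughput are pure assumptions for illustration:

```python
# Back-of-envelope sanity check on the moderation scale claim.
# Only views_per_minute comes from the post; everything else is assumed.

views_per_minute = 167_000_000
unique_fraction = 1 / 10_000     # assume only 1 in 10,000 views is a new upload
new_videos_per_minute = views_per_minute * unique_fraction   # 16,700

# Assume a moderator fully reviews one clip per minute, 8 hours a day:
clips_per_moderator_per_day = 8 * 60                         # 480
new_videos_per_day = new_videos_per_minute * 60 * 24         # 24,048,000

moderators_needed = new_videos_per_day / clips_per_moderator_per_day
print(round(moderators_needed))  # 50100, i.e. ~50,000 people for this one task
```

Even with those generous assumptions (most views are repeats, moderators never pause), you'd need a workforce the size of a small army, which is the poster's point about human review never keeping up.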

Much like several other damaging and damning social media trends, this one also shows up in all these market analysis statistics so that advertisers and brands drool over the numbers, and think of some way of exploiting it for profit.

You know what numbers like those do not allow for? Moderation, monitoring, effective policing, meaningful management and interference, control in general. It's out of control because it's too big. 7 kids dying is always awful to hear, but at sizes like those, a fact like that becomes a statistic for anyone involved who could act to stop or prevent it somehow.

Can these platforms do more? Yes, sure, but more what? With numbers like those, it's a rat race. They might not be as blind as the general population to what's happening on their platform, but let me tell you, they are likely not too far ahead. People were just not built to deal with numbers like that in any meaningful way.
A trend will pop up, more kids will die, and they'll have no way of knowing beforehand. They'll do something to prevent one thing while letting hundreds or thousands of others go by.

Because the only thing these platforms can do at that scale is employ some sort of algorithmic, automated monitoring and management, and those are just not advanced enough to accurately pick and choose, judge, and apply fuzzy human policies. For that, we'd need the singularity, and despite recent news, we are nowhere close to it.

Worse yet, even if we had some sort of AI to handle it all, I doubt people would accept its actions as final. It becomes a game of blaming the AI devs or complaining about the AI instead of going after the business behind it.

So, with all that laid down, what could we possibly do to avoid these worst-case scenarios? Break them up into parts small enough that regional administration, monitoring, moderation, and action are feasible at the hands of people.
Current social networks kind of evolved from that: older social networks, discussion forums, chat rooms, newsgroups, etc.

Oh, but that's old, no one wants that! Well, it's not about what people want, it's about what has a chance of working. Putting these platforms in the spotlight, vilifying CEOs, calling for depositions, and the regulations, fines, and penalties that have already been passed against social platforms are clearly ineffective. No matter what people think of doing to stop it, it doesn't get to the point: the point that the entire concept of social networks is like a drug, or gambling, something that societies in general got addicted to and can't let go of. People have become dependent on it, but it's exactly that dependence that creates all these problems.

The closest thing I've seen to something more reasonable is decentralized networks. But another form could just be limiting the number of entries seen in a day, the time spent on the platform, the amount of content that can be submitted at a time, the number of views an entry can have, the time an entry can be kept up on the platform, how curation works, how advertisement works... plus a myriad of other things that have to go back to the drawing board and restart from scratch with consideration for what the platform is propagating, versus pure for-profit motivations.

I used to think this wasn't necessary, before people started looking at social media as news sources. They should've never become that; they should have stayed a form of harmless entertainment not to be taken too seriously, or socialization that you treat as a gathering of strangers.

That time was too long ago, and now we are in a situation where every method of forcing news sources to be somewhat responsible for what they propagate has fallen by the wayside, and nothing of substance is being done about it.
And this isn't only about kids, this is about everyone, the entire media landscape, the entire consumer base, adults, old people.

I also used to think that people would adapt, that cultures would prop up some framework to control the worst impulses these platforms give rise to... I was wrong. This is a self-sustaining cycle of hatred and horrible sh*t. It has become its own economy and its own industry, it is supported by a status quo of giant corporations that will never let go of it because of guaranteed profits that only drive problems like the wage gap and late-stage capitalism further, and it'll never end unless it self-destructs, taking everyone else with it. There's your "destroyer of worlds"; it looks like a cute pet in comparison to a nuke, right? Only it isn't.

Social networks are the ultimate form of the tragedy of the commons in the virtual world. The resource being exploited is attention, dedication, focus, time, effort. The consequences of depletion are far worse than the depletion of any other single resource... it's literally killing people, leading to destructive pathways, causing divides in society, elevating hatred, and potentially being behind wars, the rise of populism, the rise of denialism, fake news, and a whole ton of other crap. It's no surprise that it also fuels crap like challenges that kill kids.

We were in so much better a position just a couple of decades ago, as communities and societies, that it's all self-evident. In what other period can we say, clearly and without nostalgic interference or selective bias, that we were better off as societies and communities as a whole?
I don't care what others say, and I have gone through the entire journey of checking whether this is only nostalgia: the Internet was better before social networks. It should've stopped there. Or at the very least, taken a different route.
You really want people to read all that?
Man ffs it's longer than the damn article!
Post it to a blog or ask Ars if they want to hire you.
 
Upvote
2 (7 / -5)

Edgar Allan Esquire

Ars Praefectus
3,093
Subscriptor
I got bored this week and gave Tik-Tok a shot while on an elliptical. Starting from scratch, it was 2 hours of marking "not interested" to thirsting over K-pop bois (turns out my bias is exasperation), Harry Styles, candid shots of hot guys on public transportation, and Stranger Things reactions.

It felt like being on the bus in middle school again, a demographic that is completely immune to peer pressure. /s

Incidentally, did anyone else have a fad in your middle school where you would crouch down, hyperventilate, and then stand against a wall while someone puts pressure on your chest for 10 seconds making you pass out? History seems to repeat itself.
 
Upvote
6 (7 / -1)

dlux

Ars Legatus Legionis
25,514
So, here's the only real solution I can think of, which will never happen: Break down all social media companies, period.

[snip]
Spoiler tags are your friend. Use spoiler tags wisely.
 
Upvote
11 (12 / -1)
Incidentally, did anyone else have a fad in your middle school where you would crouch down, hyperventilate, and then stand against a wall while someone puts pressure on your chest for 10 seconds making you pass out? History seems to repeat itself.

I don't get why kids do shit like that to themselves. Hell, I get dizzy watching a rotisserie chicken. I'm not going to put my life in the hands of some kid who doesn't even know how to properly wipe.
 
Upvote
5 (5 / 0)

garapito

Ars Scholae Palatinae
1,199
Subscriptor++
Has every scintilla of common sense suddenly evaporated? Even at that age I knew better than to deliberately strangle myself (or others). Geezus, are these kids eating paint chips?

Years ago, I read a biography of Ted Turner, which contained a passage on how Mr. Turner saw the influence of television. I don't recall the exact quote, but he said something to the effect of "I could go on the air during the Saturday morning cartoon block, show kids how to light a book of matches, and by the afternoon Atlanta would be a smoking pile of ashes".

William Tecumseh Sherman approves this message.
 
Upvote
8 (8 / 0)

The Lurker Beneath

Ars Tribunus Militum
6,636
Subscriptor
Your comment reminds me of the "Send Me To Heaven" app, which I did not realize until just now is apparently still available on the Google Play Store.

For those who don't know:
https://en.wikipedia.org/wiki/Send_Me_To_Heaven
My god that is fantastic! I wish I'd thought of that.

(I like how some people tossed their phones in the air without even downloading the app. Sociologists should study that one closely.)


"Send me to Heaven... before I send you!"
 
Upvote
0 (0 / 0)
Before the "parents shouldn't have let them use TikTok" comments flood in:

We decided as a society a ways back that parental responsibility did not give companies free rein to market harmful items to minors.

Such as "Joe Cool" a cartoon camel advertising cigarettes: https://www.ftc.gov/news-events/news/pr ... w-ftc-says

There's certainly a need to prove (in court) that TikTok is addictive to children, and further that TikTok knowingly engages in pushing this addictiveness towards children.

But parental responsibility is not a shield for companies trying to entice children into harmful addiction generally.

So, you're saying TikTok purposely marketed harmful materials via these videos?
 
Upvote
0 (0 / 0)
Before the "parents shouldn't have let them use TikTok" comments flood in:

We decided as a society a ways back that parental responsibility did not give companies free rein to market harmful items to minors.

Such as "Joe Cool" a cartoon camel advertising cigarettes: https://www.ftc.gov/news-events/news/pr ... w-ftc-says

There's certainly a need to prove (in court) that TikTok is addictive to children, and further that TikTok knowingly engages in pushing this addictiveness towards children.

But parental responsibility is not a shield for companies trying to entice children into harmful addiction generally.


Parental responsibility is not a shield, and TikTok should be prosecuted for the crime of marketing an addictive and dangerous product to children. But if TikTok is responsible for the deaths of children far too young to be using an addictive product (it is), the parents also share responsibility. A parent who gave an 8-year-old a cigarette would be negligent (at best). The same applies to TikTok, which is far more addictive, as anyone who has spent any time on it knows.

Both TikTok and the parents can be in the wrong here, and are.
A parent who gave their 8-year-old a cigarette would be ignoring Surgeon General's warnings plastered all over the product packaging, numerous signs at every cigarette retailer in the country stating the age limit to purchase and that "Buying cigarettes for minors is a crime," as well as decades of publicly available research on the harmfulness and addictiveness of smoking.

A parent who gave their 8-year-old access to TikTok has none of those warnings or information available to them, and from the outside looking in, it's "YouTube but for short videos," which isn't inherently harmful. The harm comes in when TikTok (and YT Shorts, and Instagram, and Facebook) continually serves users content that is just a little further down the rabbit hole than they were before, putting it on a "For You" or "Recommended" or similar page that is the first thing to appear when starting the app, and continually reinforcing this curated feed by autoplaying the next video in the list.

For an object lesson in how bad this truly is, create a new Google account. Look at some left-wing hit pieces on right-wing public personalities (Joe Rogan, Jordan Peterson, Steven Crowder, etc), then click through to see the right-wing content that the left-wing piece was criticizing. Almost immediately, your suggested shorts in YouTube become filled with right-wing content snippets, from dozens of channels clipping and reposting the content from the actual creators, such that no matter how many times you click "don't recommend this channel", you are still fed similar content over and over ad infinitum.
 
Upvote
1 (3 / -2)

garapito

Ars Scholae Palatinae
1,199
Subscriptor++
I got bored this week and gave TikTok a shot while on an elliptical. Starting from scratch, it was 2 hours of marking "not interested" to thirsting on K-pop bois (turns out my bias is exasperation), Harry Styles, candid shots of hot guys on public transportation, and Stranger Things reactions.

It felt like being on the bus in middle school again, a demographic that is completely immune to peer pressure. /s

Incidentally, did anyone else have a fad in your middle school where you would crouch down, hyperventilate, and then stand against a wall while someone puts pressure on your chest for 10 seconds making you pass out? History seems to repeat itself.

Kind of related, but there are several internet researchers who have started fresh TikTok accounts and tracked how long it takes to get exclusively radical CHUD content. 2 hours.

https://www.mediamatters.org/tiktok/tik ... bbit-holes

https://twitter.com/abbieasr/status/144 ... AV8_JBWMDQ
 
Upvote
0 (0 / 0)
Are there any socially or personally good TikTok challenges? Is it that I only ever hear of the horrible ones? Do they make any effort to moderate those kinds of destructive memes?

FWIW, I don't even consider the age factor to be all that relevant. A friend of mine died attempting something similar in a college dorm. Another friend found them. Everyone in the vicinity was gutted. It's a terrible practice to suggest to anyone, of any age, for any reason.
The "Ice Bucket Challenge" is the only one that comes to mind, and that was pre-TikTok as far as I can tell.

The original challenge was "dump a 5gal bucket of ice water over yourself, and/or donate to ALS-related charities". Donating money to medical charities is an objective good, and briefly dousing yourself with freezing water is uncomfortable but not dangerous barring some extreme edge cases.
 
Upvote
10 (10 / 0)

Moedius

Ars Scholae Palatinae
1,098
Subscriptor
We used to do something like that. Not strangulation, but something similar that caused blackout. We were lucky I guess.
Yeah, we also did that at school long before TikTok existed. Actually, long before Facebook existed, and even before most of us had access to the internet.

Yes.. far back corner of the schoolyard, behind a big tree where teachers couldn't see us. I think we called it 'the blackout game' and I only vaguely remember that it was some sequence of steps that included hyperventilating. It probably mutated as time went on, but it wasn't a constant thing; it just made the rounds in the schools and playgrounds and parks, kind of like it was trending on tiktok, but on a much slower, smaller, and localized scale. While I can't remember the exact steps, the rest is so clear, because some kids had been caught doing it and parents freaked out. We were told we could die, which I'm not sure if we believed or even understood then, but it lent an edge of danger and excitement to the stupid thing.
 
Upvote
3 (3 / 0)

ranthog

Ars Legatus Legionis
15,240
Ann Reardon covered a trend on YouTube that has killed over thirty verified adults to date: the wood burning thing with electricity. Nobody cares though, and her video was pulled because it was "dangerous"... These companies need to show the bare minimum of due diligence, imo.

https://youtu.be/GZrynWtBDTE
I saw this one the other day, very tragic.

As an electrical engineer I can tell you that I wouldn't touch that with a ten foot pole (even of known insulative quality) - I certainly have the knowledge and skills to make a "safe" wood burning device for this, but the price of a mistake in design or implementation is quite high.

I happened on a video a while back that showed steps for making a tack welder out of a microwave transformer and was really disturbed by the lack of care about safe handling of very deadly electrical currents.
Those setups terrify me on a professional level. The people behind these designs have made pretty much every possible choice in the direction of maximum danger.

From a professional perspective, the only "safe" way to design something like this is to assume the entire wood-burning setup is going to be at 2 kV and actively on fire. With high voltage I would also assume a failure could become violent, and the operator should probably have some physical shielding from that as well.

I figure it would probably require a dedicated space and at least $10k to do this properly, so it's not something the average hobbyist can afford.
 
Upvote
6 (6 / 0)

Xepherys

Ars Scholae Palatinae
942
Subscriptor
Has every scintilla of common sense suddenly evaporated? Even at that age I knew better than to deliberately strangle myself (or others). Geezus, are these kids eating paint chips?

Years ago, I read a biography of Ted Turner, which contained a passage on how Mr. Turner saw the influence of television. I don't recall the exact quote, but he said something to the effect of "I could go on the air during the Saturday morning cartoon block, show kids how to light a book of matches, and by the afternoon Atlanta would be a smoking pile of ashes".

To put it nicely, children are ignorant. They do not actually know better. Hell, they had to put a full-screen seconds-long warning on Beavis & Butthead because they were worried that teenagers would drink turpentine after seeing it done in a cartoon. That's why we need parents to actually lift a couple fingers and teach them about right from wrong and dangerous from safe, but we also need to demand some sort of responsibility from the corporations circulating misinformation.

I don't disagree - there should be some responsibility on society at large (including corporations). However, parents not being involved in teaching their children common sense means that these children, once they become adults, still don't really have common sense. We've just pushed the problem to a higher age bracket. Additionally, when the now-adult has no common sense, they don't have the tools or background to instill common sense in their own children, and it becomes a perpetual dumbness machine.
 
Upvote
5 (5 / 0)
This is dead easy to solve. Don't give your young children a phone and exercise constraints on when or where they can use other devices like games or computers.

It really is that easy. Our girls didn't get a phone until they were 13 and they didn't have a television or computer in their own bedrooms until they could buy their own. Sure they whinged about it solidly from maybe age 10 but hey, parents have responsibilities and we explained why. By 13 we'd had all the necessary conversations, and so had the school (which by the way, is one of the few state schools that doesn't allow any use of phones in school for anyone aged under 18 and hasn't so far undergone societal collapse).

This is not rocket science and although there are always going to be tragedies like these in the article, and sometimes it will make sense to prosecute or legislate, eventually parents do actually have to be responsible. (Someone else has already made the point that just because other parents are too lazy or daft or don't care doesn't mean you have to follow the herd.)
You sound like someone who is completely detached from the realities faced by families with two parents who are likely working a combined (minimum) 80 hours a week. Edit: this is on top of homemaking.

This statement sounds like it is from a parent who is overworked and tired - and has given in to the digital babysitter.

Don't. Period.

Actual parent here - working full time, laundry, dishes, making food etc.

My kids don't watch anything without me seeing it first.

Devices are heavily managed and monitored - discipline is actually enforced - no video games I don't play, no apps I don't review - anything.

Also, parents need to start teaching them early about peer pressure and that most things they see online are just individuals who are desperate to be popular and do stupid things.

Yes it is work.
Yes it is hard.

Yes it takes time - but my kids are more important than my netflix queue or raid night.

(I pick out movies ahead of time for them to watch on DND nights, or now that my son is older, he rolls dice for me sometimes).

Yes, there will always be things we cannot control - but that is no excuse to throw in the towel at home, or not help them learn critical thinking or self worth so they are less susceptible to influence.
 
Upvote
-10 (3 / -13)

ranthog

Ars Legatus Legionis
15,240
Ann Reardon covered a trend on YouTube that has killed over thirty verified adults to date: the wood burning thing with electricity. Nobody cares though, and her video was pulled because it was "dangerous"... These companies need to show the bare minimum of due diligence, imo.

https://youtu.be/GZrynWtBDTE

I agree with your point.

I believe the original video is back up for anyone who is interested.
https://www.youtube.com/watch?v=wzosDKcXQ0I&t=708s

When I heard of this I thought there was some sneakily dangerous aspect to it. But it's pretty obvious that this is super dangerous (similar to people asphyxiating themselves in this story). So many "I almost tried this, I didn't think it was dangerous" comments really make me lose faith in people. Sticking a fork in an electrical outlet is safe compared to what is going on here. Maybe that's just my electrical engineering background talking.
It IS your EE background talking.

The average person out there has very limited knowledge of electricity, how it flows, what voltage/current mean, and especially how dangerous it can be when mishandled.

Even people who are residential electricians are often dangerously ignorant in areas that are outside the standard voltage and current scales seen during their daily occupation.

Hell, even for myself, I ducked out of the power/circuits direction of EE as early as course requirements let me, so I am well aware there are huge areas dealing with high voltages and high currents whose properties I'm unfamiliar with. Thus, I avoid messing with them.

I didn't even have the option to take classes on AC power outside lower division. I learned a decent bit on the job about DC high power, but was still uneasy testing a 160A load at 35VDC even with all the precautions in place.

The videos showing people holding live leads to set fire to wood literally made me nauseous. I always feel like people treat electricity like magic, but still need to learn young that exposed electrical lines==DANGER.
The other thing to be terrified of is that the voltage ratings of the leads they're touching are being exceeded by a couple orders of magnitude. Nothing guarantees that the things they're touching won't become live at any point and electrocute them.
 
Upvote
1 (1 / 0)

bburdge

Ars Tribunus Militum
2,504
Subscriptor++
Ann Reardon covered a trend on YouTube that has killed over thirty verified adults to date: the wood burning thing with electricity. Nobody cares though, and her video was pulled because it was "dangerous"... These companies need to show the bare minimum of due diligence, imo.

https://youtu.be/GZrynWtBDTE
I saw this one the other day, very tragic.

As an electrical engineer I can tell you that I wouldn't touch that with a ten foot pole (even of known insulative quality) - I certainly have the knowledge and skills to make a "safe" wood burning device for this, but the price of a mistake in design or implementation is quite high.

I happened on a video a while back that showed steps for making a tack welder out of a microwave transformer and was really disturbed by the lack of care about safe handling of very deadly electrical currents.

Welders are fine, beyond the basic "don't lick the live wire" safety steps. You cut off the high-voltage secondary and put in a two-turn low-voltage secondary instead. You can burn yourself badly with it, and you'll have a _very_ bad day if you stab yourself with both contacts but you'd have to work pretty hard to kill yourself with it.
Low voltage with high (and galvanically isolated) current is safER, but not completely safe and should still be treated with respect.

The main reason low voltage is safer is that dry skin resistance is quite high, ~100 kOhm. But there are two key points: skin is the main source of resistance, and internal resistance across the chest is in the neighborhood of 500-1000 Ohm. Skin resistance is also greatly reduced when the skin isn't dry; sweat in particular is not just water but electrolytes, and it can bring skin resistance down by two orders of magnitude.

In that case even low voltages can be dangerous if the current is not limited. So if making one of those types of tack welders, take the time to put insulation on anything that sweaty skin might contact. And properly enclose the transformer - the video I saw had it mounted open on a board.

Of course, many orders of magnitude safer than the 2-4kV used in the wood burning, those videos are horrifying.

I'm an EE too - I understand exactly how this works. Here's the trick: 1kohm skin resistance (your two orders of magnitude reduction) means we get 1mA per volt. Those MOT welders typically have a volt or two on the output. A nine volt battery can exceed that output current! So can AA cells, and certainly 18650 cells can put a LOT more current out (and are ~4V to boot).

So that's about how dangerous a MOT spot welder is re: voltage. It's not that it can't kill you, but you're going to have to work for it. Something like stabbing your hands to get past the skin resistance entirely and getting the current through your heart. Or, far more likely, accidentally zapping yourself on the primary winding when you're not paying attention and getting unlucky with an arrhythmia.
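The back-of-the-envelope math above can be sketched in a few lines. The figures are the thread's rough illustrative numbers (750 Ohm internal resistance, ~1 kOhm sweaty skin, ~2 V welder secondary, ~2 kV burner rig), not measured values:

```python
# Ohm's-law estimate of body current, using the thread's rough figures.
def body_current_ma(source_volts, skin_ohms, internal_ohms=750):
    """Estimated current through the body in mA: I = V / (R_skin + R_internal)."""
    return source_volts / (skin_ohms + internal_ohms) * 1000  # A -> mA

# Sweaty skin: ~1 kOhm (two orders of magnitude below the ~100 kOhm dry figure)
print(f"MOT spot welder (~2 V):   {body_current_ma(2, 1_000):.1f} mA")
print(f"9 V battery:              {body_current_ma(9, 1_000):.1f} mA")
print(f"Wood-burning rig (~2 kV): {body_current_ma(2_000, 1_000):.0f} mA")
```

This lines up with the post's point: the welder's secondary pushes roughly a milliamp through sweaty skin, less than a 9 V battery would, while the ~2 kV wood-burning rig is three orders of magnitude past that and well into fibrillation territory.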

If it's getting down that low in voltage, then it's safer than I expected. (Of course, if the how-to videos bothered to show the winding ratios, or even just put up a multimeter reading so folks following along knew what to expect, that would be easier to know.)

I had estimated more like 10-12V, but again purely based on watching a video that was completely non-informative of what outcomes to expect.

Thanks for the info.
 
Upvote
1 (1 / 0)

ranthog

Ars Legatus Legionis
15,240
Ann Reardon covered a trend on YouTube that has killed over thirty verified adults to date: the wood burning thing with electricity. Nobody cares though, and her video was pulled because it was "dangerous"... These companies need to show the bare minimum of due diligence, imo.

https://youtu.be/GZrynWtBDTE
I saw this one the other day, very tragic.

As an electrical engineer I can tell you that I wouldn't touch that with a ten foot pole (even of known insulative quality) - I certainly have the knowledge and skills to make a "safe" wood burning device for this, but the price of a mistake in design or implementation is quite high.

I happened on a video a while back that showed steps for making a tack welder out of a microwave transformer and was really disturbed by the lack of care about safe handling of very deadly electrical currents.

Welders are fine, beyond the basic "don't lick the live wire" safety steps. You cut off the high-voltage secondary and put in a two-turn low-voltage secondary instead. You can burn yourself badly with it, and you'll have a _very_ bad day if you stab yourself with both contacts but you'd have to work pretty hard to kill yourself with it.
Low voltage with high (and galvanically isolated) current is safER, but not completely safe and should still be treated with respect.

The main reason low voltage is safer is that dry skin resistance is quite high, ~100 kOhm. But there are two key points: skin is the main source of resistance, and internal resistance across the chest is in the neighborhood of 500-1000 Ohm. Skin resistance is also greatly reduced when the skin isn't dry; sweat in particular is not just water but electrolytes, and it can bring skin resistance down by two orders of magnitude.

In that case even low voltages can be dangerous if the current is not limited. So if making one of those types of tack welders, take the time to put insulation on anything that sweaty skin might contact. And properly enclose the transformer - the video I saw had it mounted open on a board.

Of course, many orders of magnitude safer than the 2-4kV used in the wood burning, those videos are horrifying.

I'm an EE too - I understand exactly how this works. Here's the trick: 1kohm skin resistance (your two orders of magnitude reduction) means we get 1mA per volt. Those MOT welders typically have a volt or two on the output. A nine volt battery can exceed that output current! So can AA cells, and certainly 18650 cells can put a LOT more current out (and are ~4V to boot).

So that's about how dangerous a MOT spot welder is re: voltage. It's not that it can't kill you, but you're going to have to work for it. Something like stabbing your hands to get past the skin resistance entirely and getting the current through your heart. Or, far more likely, accidentally zapping yourself on the primary winding when you're not paying attention and getting unlucky with an arrhythmia.

If it's getting down that low in voltage, then it's safer than I expected. (Of course, if the how-to videos bothered to show the winding ratios, or even just put up a multimeter reading so folks following along knew what to expect, that would be easier to know.)

I had estimated more like 10-12V, but again purely based on watching a video that was completely non-informative of what outcomes to expect.

Thanks for the info.
That is the type of voltage a welder should be running at. I wouldn't trust that what the YouTube video shows is actually at that level unless they demonstrated it. I wouldn't be surprised if they failed to explain what they were doing, in addition to omitting the expected outcome.
 
Upvote
2 (2 / 0)

Bring the Irons

Smack-Fu Master, in training
82
Parents suing TikTok say it's obvious when kids post videos of themselves that they're too young for TikTok.

uhhh, if the parents *realize* their kids are *too young* for TikTok, why are they letting their kids use it?

Or, if they are letting their kids use TikTok, why not also have the conversation first along the lines of "okay, you're going to see some pretty stupid shit here, so let's review if you think it's okay to follow along in these situations..." You know, like, "actually parent" in the situation where your kid needs it to, like, grow up successfully and avoid collecting any Darwin awards...?

This parent reading the article says "Parents who leave their young kids alone on TikTok have obviously not grasped who is ultimately responsible for the safety of their child."

Jeezuz, I mean it's not far off from giving your kid matches and a can of WD-40, no guidance, and then wanting to sue those companies when your kid burns themself.
 
Upvote
0 (2 / -2)

ranthog

Ars Legatus Legionis
15,240
Parents suing TikTok say it's obvious when kids post videos of themselves that they're too young for TikTok.

uhhh, if the parents *realize* their kids are *too young* for TikTok, why are they letting their kids use it?

Or, if they are letting their kids use TikTok, why not also have the conversation first along the lines of "okay, you're going to see some pretty stupid shit here, so let's review if you think it's okay to follow along in these situations..." You know, like, "actually parent" in the situation where your kid needs it to, like, grow up successfully and avoid collecting any Darwin awards...?

This parent reading the article says "Parents who leave their young kids alone on TikTok have obviously not grasped who is ultimately responsible for the safety of their child."

Jeezuz, I mean it's not far off from giving your kid matches and a can of WD-40, no guidance, and then wanting to sue those companies when your kid burns themself.
Even if you don't let your kid have access to TikTok, can you guarantee none of their friends have it?
 
Upvote
8 (8 / 0)
I'm getting so tired of hearing parents cry about their 8 year old being addicted to the internet. Take their phone away. It's truly that simple. Why did you give an 8 year old a phone to begin with? I'm betting the real answer is "to shut them up so I don't have to pay attention to them." I don't care if Purdue Pharma is making cartoons about how cool opioids are, you're the reason they keep watching it. Y'all cram a screen in their face every second they get and then wonder why they shoot up their schools or kill themselves. This kind of crap is why abortion should not only be legal but, in some cases, mandatory.

Why do you give a kid a phone?

Maybe because they leave the house occasionally and need a way to call home to get picked up? When was the last time you saw a pay phone?

You let your eight-year-old willy-nilly walk around the streets? Unless they're going to a friend's house no more than two or three blocks away, they need to be driven, or you walk with them. Once there, they can use the phone there to call for you to come get them, or the parent of the friend can drive them back. We're long past the time when suburban or rural children can walk about without care, and no way in hell should an urban child be left unattended and out of sight.
Are you unfamiliar with the concept of emergencies?

I'll give you an all-too-common example from the US: school shootings. Your child is where they are supposed to be, with adults supervising them, yet sometimes shit goes sideways and they are placed in life-threatening danger. Do you want to bet on your child's ability to reach a landline and call you? Do you want to bet on their ability to remember your phone number and dial it properly? (Does it need an area code? 1 + area code? Is there a prefix to dial out on the school phone? Does every school phone even have the ability to dial out, or only certain phones?)

My kids don't have phones. I've made a calculated decision that the harms of a smartphone at their ages (7 and 5) outweigh any potential benefits, and they have an older cousin (10) in the same school who does have a phone and will contact my brother-in-law in case of emergency. I do regularly use the internet with my kids; they have Amazon Fire Kids tablets which are heavily age restricted, and we browse the internet together on my computer if they want to look at anything that isn't whitelisted. Sometimes I'll whitelist content for them on their tablets that isn't pre-approved. Sometimes I'll blacklist content on their tablets that I'm not personally okay with, while explaining why (freemium games have been the biggest culprit here).

Now consider that no matter how careful I am, no matter how well I teach them to be safe and kind, they could run into some other kid on the playground with complete dogshit parents who shows them deadly challenges or neonazi bullshit on their phone, and I have no recourse whatsoever but to unwind the damage after the fact.
 
Upvote
9 (9 / 0)
Parents suing TikTok say it's obvious when kids post videos of themselves that they're too young for TikTok.

uhhh, if the parents *realize* their kids are *too young* for TikTok, why are they letting their kids use it?

Or, if they are letting their kids use TikTok, why not also have the conversation first along the lines of "okay, you're going to see some pretty stupid shit here, so let's review if you think it's okay to follow along in these situations..." You know, like, "actually parent" in the situation where your kid needs it to, like, grow up successfully and avoid collecting any Darwin awards...?

This parent reading the article says "Parents who leave their young kids alone on TikTok have obviously not grasped who is ultimately responsible for the safety of their child."

Jeezuz, I mean it's not far off from giving your kid matches and a can of WD-40, no guidance, and then wanting to sue those companies when your kid burns themself.
The parents of kids watching TikTok videos are complaining that TikTok is allowing kids who are clearly under 13 to post videos. Which is a fair complaint.
 
Upvote
6 (6 / 0)
My brain isn't working so maybe it is there, but I couldn't find an explanation of what the Blackout Challenge is. Did I overlook it in the article?

Second paragraph states: (The Blackout Challenge encourages TikTok users to post videos where they choke themselves until they pass out.)
 
Upvote
1 (1 / 0)

Dawnrazor

Ars Tribunus Militum
1,941
I'm getting so tired of hearing parents cry about their 8 year old being addicted to the internet. Take their phone away. It's truly that simple. Why did you give an 8 year old a phone to begin with? I'm betting the real answer is "to shut them up so I don't have to pay attention to them." I don't care if Purdue Pharma is making cartoons about how cool opioids are, you're the reason they keep watching it. Y'all cram a screen in their face every second they get and then wonder why they shoot up their schools or kill themselves. This kind of crap is why abortion should not only be legal but, in some cases, mandatory.

Why do you give a kid a phone?

Maybe because they leave the house occasionally and need a way to call home to get picked up? When was the last time you saw a pay phone?

You let your eight-year-old willy-nilly walk around the streets? Unless they're going to a friend's house no more than two or three blocks away, they need to be driven, or you walk with them. Once there, they can use the phone there to call for you to come get them, or the parent of the friend can drive them back. We're long past the time when suburban or rural children can walk about without care, and no way in hell should an urban child be left unattended and out of sight.

When I was 8, my friends and I would get dropped at the mall on a Saturday, watch a movie, get a burger, and call someone's parent to come get us afterwards. By the time we were 12, we were riding our bikes to the theater and calling for a ride if it started raining. Somehow we survived.

You sound like the kind that keeps their kid on a leash and loses their shit if it dares to be in a different room from you.

35 years ago, at age 6-7, I used to ride on the streets with my cousins alone and play football. The regulations and traffic are insane now. You just cannot let a kid outside alone nowadays. Add to that all the predators and irresponsible f**ks out there, and it's just not the same world.
This applies to you giving your child unrestricted access to anything on a phone, AND posting videos of themselves online. Don't you think some perv somewhere is getting off on those videos? Or that your child is being influenced, his brain shaped by those images and videos he sees? Don't you think it's damaging his attention span, overstimulating his brain, and promoting the instant-gratification culture?
Kids have no business using those garbage social websites, and even less business posting pics and videos of themselves that can be used to locate and track them, or be misused some way or another....

I keep forgetting, that there were no pedophiles or predators before the internet was invented. The world was a paradise and everyone lived in harmony.
 
Upvote
0 (3 / -3)