As teens await sentencing for nudifying girls, parents aim to sue school

The show did a good job of taking the holodeck to its logical end with Barclay, but they deeply undersold the reality that a vast majority of every crew on every ship would need that counseling.
I'm not so sure about that. Barclay was a special case using it as escapism, but for the rest of the crew, all popular, well loved, and being their best future selves, the holodeck is just a different flavor of the adventure that is their every single day. Frankly, I'm not sure I COULD get bored of all the stuff they run into, and would probably instead want to go screaming back to Paradise where I'm not getting my blood turned to plastic or some messed-up thing.

Plus, frankly, the holodeck is exhausting. I'm not the most physically active person. I keep in shape due to my strict diet, but I'm not exactly doing well for stamina. I think after a few kayaking adventures I'd be down for just simulating an '80s wood-paneled basement full of my old video games where I can just lie back in a bean bag chair to play my stories. In fact, screw that, I'd just replicate all of that for my quarters, because the holodeck is one clipping glitch away from beheading me.

Edit: Oh right the sex thing... No thanks.
 
Upvote
9 (9 / 0)

Uragan

Ars Legatus Legionis
11,172
Articles about child pornography aren't the place for comedy attempts.
I don't even have the right word for this. Making a bunch of teens sift through a stack of porn to sort out the (fake but extremely realistic) pictures of themselves.

Just an FYI, it isn’t “porn” or “child pornography”. It’s “CSAM” or “child sex abuse material”.
 
Upvote
6 (15 / -9)

MagStone

Ars Centurion
205
Subscriptor
Legal loophole present at the time or not, didn't the school have some sort of anti-bullying measure in place they could have leaned on at the bare minimum? Or was their precious reputation so important that they were eager to look the other way?

Fuck this school and fuck those kids who did it. And as someone else suggested, go after whatever company made the AI that they used.
Pre-internet, this is the same story as “beloved sports hero rapes the homecoming queen, town sides with hero”
 
Upvote
29 (30 / -1)

MagStone

Ars Centurion
205
Subscriptor
Look, the school should have reported it, if only to the parents of the affected kids.

But seriously, the fact that these boys were acting like sociopathic pricks is not the school's fault. That would be a) the parents, and b) the kids themselves. But I doubt either group is eager to look in the mirror and assign blame where it belongs.

Also the school's pockets are deeper, so naturally that's where the lawsuits go.
I agree; this situation is the age-old problem of assholes being assholes with a technological force multiplier.
 
Upvote
11 (12 / -1)
Did the terminology change recently? I mean I'll go with whatever, I'm just a little confused as to the reasoning behind it.
"Porn" is not necessarily abusive/coercive (though it often has been); when the material involves children, it is always abusive/coercive (even material that was originally willingly produced by the subject for a peer, but then distributed broadly).

Pornography is produced by consenting adults.
Ethical pornography is produced by consenting adults; "consenting" has been historically tenuous regarding the broader definition of porn.
 
Upvote
20 (20 / 0)

JoHBE

Ars Praefectus
4,132
Subscriptor++
I'm not sure it's possible to form a well-considered opinion about this case, and what would be a proper reaction, without a lot more details about the images and the perpetrators.

As a society, we chose to let this fester loooong after a lot of well-informed people saw it coming. Certain billionaires actively encourage it with a wink. Young people with immature brains should not be the only ones to blame.

I would also say there's a wide spectrum available both in terms of the personalities, intentions and group dynamics involved, and also when it comes to the exact images. Were they ultra-realistic both in quality and context, or stylized, over-the-top, clearly identifiable as fake on superficial scrutiny?

I don't trust the media or the general public opinion in this, without a lot more specific information.
 
Upvote
-18 (5 / -23)

GodParticle

Smack-Fu Master, in training
7
They absolutely need to be punished and learn their lesson once and for all... but they do NOT deserve to go to prison. I've been there. It will turn them into even worse criminals and they'll do it again. Right now, it was a very bad idea that spun out of control. If they go to prison, it will become who they are. Lmao. Society is so gd naive when it comes to putting people away with other criminals.
 
Upvote
13 (15 / -2)

bushrat011899

Ars Scholae Palatinae
658
Subscriptor
In Star Trek: TNG, Barclay uses the holodeck for sex, including creating versions of crew mates like Troi. It wasn't her, but it looked like her.

I used to think the holodeck would be incredible to have. And it would be. But it'd turn into a debauched sex chamber that most crew members would become addicted to in about a week.
Obligatory debauchery:
 
Upvote
2 (2 / 0)
I'm very curious as to what that "loophole" looked like. We normally hear "loophole" and think it has to be something you're reading in bad faith to exploit, but sometimes a "loophole" is confusing or doesn't leave its weird logic up to the reader's discretion. Many people will follow weird and poorly written rules to the letter with a good faith assumption that the system has placed a certain duty on someone else's desk.
 
Upvote
-1 (1 / -2)

Ecnhoffer

Wise, Aged Ars Veteran
170
Subscriptor
I was wondering why criminal charges were not brought against school staff, and I guess this is the reason why. The perpetrators being under 18 is one thing, but the adults in the room literally did nothing. There's a difference between what's legally right, and morally right.
which adults in which room? the parents who know their children?
 
Upvote
-15 (1 / -16)

markgo

Ars Praefectus
3,776
Subscriptor++
Private juvie and prison owners are popping champagne. Now that weed is being phased out as their source of income, sexually frustrated teenage boys will provide a much needed boost for prison population.

Edit: correct autocorrect

 
Upvote
-9 (1 / -10)

Unclebugs

Ars Praefectus
3,036
Subscriptor++
"A concerned student saw it and reported the image through a state tip line." Sounds like the student pretty much knew the school would ignore it or retaliate against them for reporting.
My public high school newspaper staff did a deep dive on sexting at school 9 years ago. Students were pretty nonchalant about it during interviews with other students, even though we had to keep their names out of the interviews; they just didn't care. We also published all of the state and federal penalties they could be prosecuted under for sexting. It had next to no impact. This series of stories was done with the full cooperation of my principal and district superintendent. In this case it would seem that this private school administration had the same attitude as the students about it, or maybe the offending students had some sort of extra influence on administration.
 
Upvote
8 (8 / 0)

Unclebugs

Ars Praefectus
3,036
Subscriptor++
I was wondering why criminal charges were not brought against school staff, and I guess this is the reason why. The perpetrators being under 18 is one thing, but the adults in the room literally did nothing. There's a difference between what's legally right, and morally right.
Under the in loco parentis theory championed by Justice Antonin Scalia, the public school is the parent while the child is on campus. If this was going on in school, then the school should have handled it from that perspective. This being a private school, maybe there are different rules, so tell me again why private schools are better?
 
Upvote
5 (5 / 0)

Uragan

Ars Legatus Legionis
11,172
Did the terminology change recently? I mean I'll go with whatever, I'm just a little confused as to the reasoning behind it.
Here’s what RAINN has to say about it:
Child sexual abuse material (CSAM) is not “child pornography.” It’s evidence of child sexual abuse—and it’s a crime to create, distribute, or possess. CSAM includes both real and synthetic content, such as images created with artificial intelligence tools.

A child cannot legally consent to any sexual act, let alone to being recorded in one. That’s why RAINN and other child protection experts use the term “CSAM” instead of “child porn” or “deepfakes.” By calling it what it is—sexual abuse—you stop minimizing the harm and you call it out as the crime it is.

As for when the terminology changed, I’m not 100% certain.
 
Upvote
16 (18 / -2)

rain shadow

Ars Tribunus Angusticlavius
6,357
Subscriptor++
Similar to holding bars/pubs/restaurants accountable for serving too much alcohol to drivers who cause accidents that kill people?
It's not universal, but several states will hold the establishment liable when the driver was underage, so the analogy kind of works.
 
Upvote
6 (6 / 0)

Zhengyi

Ars Praetorian
454
Subscriptor
I am sorry, but I don't think the school has a leg to stand on in their case.

I work for a Texas school district and every year go through mandated training. I cannot see how the boys' actions don't fall under sexual harassment. That is the area where the school has a responsibility and didn't meet it.

Edit: I am assuming that state requirements would fall on a private school.
At the risk of starting a larger fight, not having to be subject to standardized requirements is exactly the point of the broader movement away from standardized, public education, however much it's cloaked in the language of "freedom" or "choice".
 
Upvote
11 (11 / 0)

Ravant

Ars Scholae Palatinae
1,355
LLMs aren’t “tools” like Adobe Photoshop is.

Given a prompt, LLMs lacking guardrails will generate CSAM and NCII all on its own.
LLMs still have training data that generated the model. In order for one to generate something, it has to be in the training data to pull from. Not sure how having the data in the training set isn't equivalent to possession.
 
Upvote
4 (5 / -1)

DarthSlack

Ars Legatus Legionis
23,059
Subscriptor++
I'm not sure it's possible to form a well-considered opinion about this case, and what would be a proper reaction, without a lot more details about the images and the perpetrators.

As a society, we chose to let this fester loooong after a lot of well-informed people saw it coming. Certain billionaires actively encourage it with a wink. Young people with immature brains should not be the only ones to blame.

I would also say there's a wide spectrum available both in terms of the personalities, intentions and group dynamics involved, and also when it comes to the exact images. Were they ultra-realistic both in quality and context, or stylized, over-the-top, clearly identifiable as fake on superficial scrutiny?

I don't trust the media or the general public opinion in this, without a lot more specific information.

Why? The perpetrators have been caught and admitted to using an app to nudify their classmates and pass around the images. Without consent. The school ignored a tip that this was an ongoing thing. How much more do you need to know?
 
Upvote
14 (14 / 0)

Doomlord_uk

Account Banned
25,977
Subscriptor++
There is a book called Code Dependent that goes into this (and other topics) in some detail. The book makes plain the harm this causes women and girls of all ages. I think the book is just old enough that everyone was still calling this 'deepfake' technology, though where children are involved CSAM is the appropriate term.

According to the book, there are many AI programs that can be downloaded, and at the time of publishing at least, many could be run on personal computers, so it doesn't require a major AI company to enable this. I've also seen it explained that with a modern GPU or two you can create your own AI to do deepfakes fairly easily (if you know what you are doing). Many deepfake/CSAM operations are run through programs like Telegram from third-world countries where it's impossible to identify or prosecute the people doing it.

Laws GLOBALLY remain in urgent need of improvement, as do resources to support victims.

As for the boys who used the software, I have at least a degree of sympathy, depending on whether or how much they really understood the harm they would cause. I'll be honest: if in the 1980s or early '90s someone gave me a magic doodad that let me see what a girl looked like naked, I would have wanted to try it, simply because I was a horny teenager. You would have had to explain to me the harm caused, because 'just looking at a picture' wasn't obviously harmful by itself. The idea of sharing is another matter of course, and the internet is what I think truly amplifies the original harm. And makes it almost impossible to remove CSAM and other deepfakes. So anyway, I wonder what education had been provided up to that date to warn young people of the dangers of the technology they will encounter. Dangers both to victims and to themselves. I think every kid for the last 40 years knew the dangers of narcotics, but the risks of digital imagery and generative AI?
 
Upvote
2 (2 / 0)

senatori

Seniorius Lurkius
38
Subscriptor
Even in the context of the show, this was considered crossing an inappropriate boundary. The MOST charitable I could be with that is Barclay wasn't PUBLISHING that material as a holonovel on subspace channels... so I suppose strictly speaking he wasn't violating space-law, but it was still a violation, and more so in space land because that computer knew EXACTLY what Troi looked like. Honestly, Geordi or whoever's in charge of IT on that ship needs to lock down access to those files better.
I think Geordi did something similar with the holodeck.
 
Upvote
1 (1 / 0)

Doomlord_uk

Account Banned
25,977
Subscriptor++
Child porn = consenting child, if that was legally possible. “Consent” from a child doesn’t count legally.
Children can't consent, and not just as a matter of law. They are quite unable to understand consequences or handle the pressures, manipulations, etc., of abusive adults. It's not JUST a matter of the law. This is why people are saying the phrase 'child porn' is deprecated. And not only because of the non-consensual nature, but because it glosses over the true harm, and crime, revealed in the imagery. CSAM calls it what it IS. NCII (Non-Consensual Intimate Imagery) is what you call it when it's not children who are being deepfaked without their consent (and who consents to it???).

People will know what you 'mean' by the phrase child porn but it's a bad phrase to use. The world has moved on quite some time ago.
 
Upvote
13 (14 / -1)
Private juvie and prison owners are popping champagne. Now that weed is being phased out as their source of income, sexually frustrated teenage boys will provide a much needed boost for prison population.

Edit: correct autocorrect
I was a sexually frustrated teenager and adult and yet, somehow, incredibly, I never creeped on my classmates. I certainly never took creep shots of them. This stuff wasn't available then, but I can't see myself using it, either.

Cut out the 'boys will be boys' bullshit (which this absolutely is, even if you don't say those specific words). Boys are that way because they are allowed to be. They are told it's normal, and they are often excused when it happens. As long as we pretend this is something natural that can't be avoided, it will be unavoidable.
 
Upvote
12 (13 / -1)
I'm not obligated to tell the police if I see someone shoot someone else.
However, any action that could be interpreted as concealing the crime is a Federal offense. For example, telling someone else "Don't look. Someone just had their brains blown all over your shoes."
 
Upvote
-11 (0 / -11)
Pre-internet, this is the same story as “beloved sports hero rapes the homecoming queen, town sides with hero”
I will NEVER understand how that happens. How does someone take that side without feeling like such a piece of human garbage that I can't finish this sentence about what they should do?
 
Upvote
7 (7 / 0)
I'm not sure it's possible to form a well-considered opinion about this case, and what would be a proper reaction, without a lot more details about the images and the perpetrators.

As a society, we chose to let this fester loooong after a lot of well-informed people saw it coming. Certain billionaires actively encourage it with a wink. Young people with immature brains should not be the only ones to blame.

I would also say there's a wide spectrum available both in terms of the personalities, intentions and group dynamics involved, and also when it comes to the exact images. Were they ultra-realistic both in quality and context, or stylized, over-the-top, clearly identifiable as fake on superficial scrutiny?

I don't trust the media or the general public opinion in this, without a lot more specific information.
Nothing in your third paragraph matters in the least. If a kid was drawing pictures of one of their classmates naked and passing them around, I would want very similar consequences.
 
Upvote
3 (4 / -1)

AliSard

Smack-Fu Master, in training
66
Subscriptor
Ask yourselves: how much difference is there between these boys' opportunism and Trump's?
Nothing more than a whole lot of lawyers.
I have to strongly disagree. This case involved kids. Obviously not excusing them, but they were still kids.

Trump just acts like one.
 
Upvote
1 (1 / 0)

MechR

Ars Praefectus
3,212
Subscriptor
Porn = consenting adult. Child porn = consenting child, if that was legally possible. “Consent” from a child doesn’t count legally. CSAM = child who is sexually abused, plus taking sexual pictures against their will or without their knowledge and proving the crime of sexual abuse.

So CSAM is a lot worse than child porn which is in turn a lot worse than porn.
This always strikes me as a weird and clunky amount of semantic effort to defend pornography's honor that only sex-worker advocates would care about strictly delineating. It's not like "child porn" didn't sound abhorrent enough to people.
 
Upvote
5 (9 / -4)