> The show did a good job of taking the holodeck to its logical end with Barclay, but they deeply undersold the reality that a vast majority of every crew on every ship would need that counseling.

I'm not so sure about that. Barclay was a special case using it as escapism, but for the rest of the crew, all popular, well loved, and being their best future selves, the holodeck is just a different flavor of the adventure that is their every single day. Frankly, I'm not sure I COULD get bored of all the stuff they run into, and would probably instead want to go screaming back to Paradise where I'm not getting my blood turned to plastic or some messed up thing.
Articles about child pornography aren't the place for comedy attempts.
I don't even have the right word for this. Making a bunch of teens sift through a stack of porn to sort out the (fake but extremely realistic) pictures of themselves.
> Just an FYI, it isn't "porn" or "child pornography". It's "CSAM" or "child sex abuse material".

Did the terminology change recently? I mean I'll go with whatever, I'm just a little confused as to the reasoning behind it.
> Legal loophole present at the time or not, didn't the school have some sort of anti-bullying measure in place they could have leaned on at the bare minimum? Or was their precious reputation so important that they were eager to look the other way?

Pre-internet, this is the same story as "beloved sports hero rapes the homecoming queen, town sides with hero".
Fuck this school and fuck those kids who did it. And as someone else suggested, go after whatever company made the AI that they used.
> The addiction aspect was done on The Orville.

And Red Dwarf's "Better Than Life".
> Look, the school should have reported it, if only to the parents of the affected kids.
> But seriously, the fact that these boys were acting like sociopathic pricks is not the school's fault. That would be a) the parents, and b) the kids themselves. But I doubt either group is eager to look in the mirror and assign blame where it belongs.
> Also the school's pockets are deeper, so naturally that's where the lawsuits go.

I agree, this situation is the age-old problem of assholes being assholes with a technological force multiplier.
> Did the terminology change recently? I mean I'll go with whatever, I'm just a little confused as to the reasoning behind it.

Years ago, yes. Pornography is produced by consenting adults.
> Did the terminology change recently? I mean I'll go with whatever, I'm just a little confused as to the reasoning behind it.

"Porn" is not necessarily abusive/coercive (though often has been); when the material involves children, it is always abusive/coercive (even that which was originally willingly produced by the subject to a peer, but then distributed broadly).
> Pornography is produced by consenting adults.

Ethical pornography is produced by consenting adults; "consenting" has been historically tenuous regarding the broader definition of porn.
> In Star Trek: TNG, Barclay uses the holodeck for sex, including creating versions of crew mates like Troi. It wasn't her, but it looked like her.

Obligatory debauchery:
I used to think the holodeck would be incredible to have. And it would be. But it'd turn into a debauched sex chamber that most crew members would become addicted to in about a week.
> Was any major corporation's AI involved in producing the images?
> I would love for them to be held accountable.

Similar to holding bars/pubs/restaurants accountable for serving too much alcohol to drivers who cause accidents that kill people?
> Just an FYI, it isn't "porn" or "child pornography". It's "CSAM" or "child sex abuse material".

Good catch, thanks!
> I was wondering why criminal charges were not brought against school staff, and I guess this is the reason why. The perpetrators being under 18 is one thing, but the adults in the room literally did nothing. There's a difference between what's legally right, and morally right.

Which adults in which room? The parents who know their children?
Private juvie and prison owners are popping champagne. Now that weed is being phased out as their source of income, sexually frustrated teenage boys will provide a much needed boost for prison population.
Edit: correct autocorrect
> "A concerned student saw it and reported the image through a state tip line." Sounds like the student pretty much knew the school would ignore or would retaliate against them for reporting.

My public high school newspaper staff did a deep dive on sexting at school 9 years ago. Students were pretty nonchalant about it during interviews with other students; because we had to keep their names out of the interviews, they didn't care. We also published all of the state and federal penalties they could face for sexting. It had next to no impact. This series of stories was done with the full cooperation of my principal and district superintendent. In this case it would seem that this private school administration had the same attitude as the students about it, or maybe the offending students had some sort of extra influence on administration.
> I was wondering why criminal charges were not brought against school staff, and I guess this is the reason why. The perpetrators being under 18 is one thing, but the adults in the room literally did nothing. There's a difference between what's legally right, and morally right.

Under the in loco parentis doctrine championed by Justice Antonin Scalia, the public school is the parent while the child is on campus. If this was going on in school, then the school should have handled it from that perspective. This being a private school, maybe there are different rules, so tell me again why private schools are better?
> Did the terminology change recently? I mean I'll go with whatever, I'm just a little confused as to the reasoning behind it.

Here's what RAINN has to say about it:
Child sexual abuse material (CSAM) is not “child pornography.” It’s evidence of child sexual abuse—and it’s a crime to create, distribute, or possess. CSAM includes both real and synthetic content, such as images created with artificial intelligence tools.
A child cannot legally consent to any sexual act, let alone to being recorded in one. That’s why RAINN and other child protection experts use the term “CSAM” instead of “child porn” or “deepfakes.” By calling it what it is—sexual abuse—you stop minimizing the harm and you call it out as the crime it is.
> Similar to holding bars/pubs/restaurants accountable for serving too much alcohol to drivers who cause accidents that kill people?

It's not universal, but several states will hold the establishment liable when the driver was underage, so the analogy kind of works.
> But officials—who at the time weren't legally required to act—failed to notify parents or police for six months.

Well, here we are.
> Each high school victim had to go through "binders of photos," marking their own faces to help cops track the total number of victims...
> Like they had to see all the AI images, not just theirs? There had to be a better solution than that, right? FFS!

I assumed the images had been cut down to faces only, so maybe still drudgework but nothing offensive.
> I am sorry, but I don't think the school has a leg to stand on in their case.
> I work for a Texas school district and every year go through mandated training. I cannot see how the boys' actions don't fall under sexual harassment. That is the area where the school has a responsibility and didn't meet it.
> Edit: I am assuming that state requirements would fall on a private school.

At the risk of starting a larger fight, not having to be subject to standardized requirements is exactly the point of the broader movement away from standardized, public education, however much it's cloaked in the language of "freedom" or "choice".
> LLMs aren't "tools" like Adobe Photoshop is.
> Given a prompt, LLMs lacking guardrails will generate CSAM and NCII all on their own.

LLMs still have the training data that generated the model. In order for a model to generate something, it has to be in the training data to pull from. Not sure how having the data in the training set is not equivalent to possession.
I'm not sure it's possible to form a well-considered opinion about this case, and what would be a proper reaction, without a lot more details about the images and the perpetrators.
As a society, we chose to let this fester loooong after a lot of well-informed people saw it coming. Certain billionaires actively encourage it with a wink. Young people with immature brains should not be the only ones to blame.
I would also say there's a wide spectrum available both in terms of the personalities, intentions and group dynamics involved, and also when it comes to the exact images. Were they ultra-realistic both in quality and context, or stylized, over-the-top, clearly identifiable as fake on superficial scrutiny?
I don't trust the media or the general public opinion in this without a lot more specific information.
> Even in the context of the show, this was considered crossing an inappropriate boundary. The MOST charitable I could be with that is Barclay wasn't PUBLISHING that material as a holonovel on subspace channels... so I suppose strictly speaking he wasn't violating space-law, but it was still a violation, and more so in space land because that computer knew EXACTLY what Troi looked like. Honestly, Geordi or whoever's in charge of IT on that ship needs to lock down access to those files better.

I think Geordi did something similar with the holodeck.
> Child porn = consenting child, if that was legally possible. "Consent" from a child doesn't count legally.

Children can't consent, and not just as a matter of law. They are quite unable to understand consequences or handle the pressures, manipulations, etc. of abusive adults. It's not JUST a matter of the law. This is why people are saying the phrase "child porn" is deprecated: not only because of the non-consensual nature, but because it glosses over the true harm - and crime - revealed in the imagery. CSAM calls it what it IS. NCII (Non-Consensual Intimate Imagery) is what you call it when it's not children who are being deepfaked without their consent (and who consents to it???).
> Private juvie and prison owners are popping champagne. Now that weed is being phased out as their source of income, sexually frustrated teenage boys will provide a much needed boost for prison population.
> Edit: correct autocorrect

I was a sexually frustrated teenager and adult and yet, somehow, incredibly, I never creeped on my classmates. I certainly never took creep shots of them. This stuff wasn't available then, but I can't see myself using it, either.
> I'm not obligated to tell the police if I see someone shoot someone else.

However, any action that could be interpreted as concealing the crime is a Federal offense. For example, telling someone else, "Don't look. Someone just had their brains blown all over your shoes."
> Pre-internet, this is the same story as "beloved sports hero rapes the homecoming queen, town sides with hero".

I will NEVER understand how that happens. How does someone take that side without feeling like such a piece of human garbage that I can't finish this sentence about what they should do?
> I'm not sure it's possible to form a well-considered opinion about this case, and what would be a proper reaction, without a lot more details about the images and the perpetrators.
> As a society, we chose to let this fester loooong after a lot of well-informed people saw it coming. Certain billionaires actively encourage it with a wink. Young people with immature brains should not be the only ones to blame.
> I would also say there's a wide spectrum available both in terms of the personalities, intentions and group dynamics involved, and also when it comes to the exact images. Were they ultra-realistic both in quality and context, or stylized, over-the-top, clearly identifiable as fake on superficial scrutiny?
> I don't trust the media or the general public opinion in this without a lot more specific information.

Nothing in your third paragraph matters in the least. If a kid was drawing pictures of one of their classmates naked and passing them around, I would want very similar consequences.
> Ask yourselves: how much difference is there between these boys' opportunism and Trump's?
> Nothing more than a whole lot of lawyers.

I have to strongly disagree. This case involved kids. Obviously not excusing them, but they were still kids.
> Porn = consenting adult. Child porn = consenting child, if that was legally possible. "Consent" from a child doesn't count legally. CSAM = child who is sexually abused, plus taking sexual pictures against their will or without their knowledge and proving the crime of sexual abuse.
> So CSAM is a lot worse than child porn, which is in turn a lot worse than porn.

This always strikes me as a weird and clunky amount of semantic effort to defend pornography's honor that only sex-worker advocates would care about strictly delineating. It's not like "child porn" didn't sound abhorrent enough to people.