As teens await sentencing for nudifying girls, parents aim to sue school


daveok

Ars Centurion
319
Subscriptor
But officials—who at the time weren’t legally required to act—failed to notify parents or police for six months

Soon after, both its head, Matt Micciche, and the school board’s president, Angela Ang-Alhadeff, resigned.

Rather than act on the images, Micciche, the school board president, did nothing, parents alleged, until more information emerged in May 2024. The next month, the school filed a ChildLine report, but law enforcement did not launch a criminal investigation until the tip eventually reached police.

Lancaster District Attorney Heather Adams concluded that Micciche was not obligated to report the images because of a “loophole” that lawmakers immediately sought to close, USA Today reported. That loophole excused schools from mandatory reporting requirements for child-on-child abuse. Parents and lawmakers are pushing to close the loophole to ensure schools must report any AI nudes after the first detection.

I was wondering why criminal charges were not brought against school staff, and I guess this is the reason why. The perpetrators being under 18 is one thing, but the adults in the room literally did nothing. There's a difference between what's legally right and what's morally right.
 
Upvote
87 (94 / -7)
To me, this is a Hill to die on in the fight against technology. If we can't control our technology from decimating the TRUST and LIVES of our Sons & Daughters growing up in society then the technology shouldn't be allowed. Figure it out. Make it right. The fact this stuff can happen is unacceptable AF and should not be tolerated. People are getting badly hurt.
 
Last edited:
Upvote
116 (129 / -13)

gothmog1114

Ars Praetorian
457
Subscriptor++
I'm not obligated to tell the police if I see someone shoot someone else.
But I'm going to.
Sure, but working in a school, you're a mandatory reporter, so that's a pretty different environment. I think it's good they are trying to change the loophole that allowed them to not report it.
 
Upvote
96 (99 / -3)

LeftCoastRusty

Ars Scholae Palatinae
1,355
Subscriptor
the school “updated its reenrollment contracts to discourage students and families from publicly speaking poorly of the school.”

Yet another public institution that has everything absolutely backwards. The school’s reputation comes first. I remember a time when students actually mattered.

I’m assuming this is a private school. A good way for parents to hold them accountable is to withdraw kids from school. No students, no school.
 
Upvote
141 (143 / -2)

jgee43

Ars Scholae Palatinae
702
Subscriptor++
Not defending the school district's actions completely here, but schools have pretty limited options when it comes to dealing with violations of law that happen online--and if the sharing doesn't occur on school grounds during the school day those options diminish even further. As soon as you, as a school or school district, try to regulate student behavior outside of that time window you end up getting sued by the other end of the spectrum. I don't know the full details of the case (and it's not clear to me here if the Discord activity happened at school or not at school), so I have to reserve some judgement.

With that big caveat out of the way, the district still screwed up big time by not reporting to the police. We had a sextortion ring that we became aware of at our school (we're in a relatively small district, just one small-to-medium sized high school), which was not operating at school. That being said, because we became aware of it, our principal held an assembly with every kid in the school at once, explained what was going on, explained exactly what our response would be to any reporting that was done, and encouraged any victims or people with knowledge of the situation to come forward and meet with either admin or counselors. Within two weeks we had two perpetrators identified and reported to police and the other participants so scared that it ended pretty quickly.
 
Upvote
158 (160 / -2)
Yet another public institution that has everything absolutely backwards. The school’s reputation comes first. I remember a time when students actually mattered.

I’m assuming this is a private school. A good way for parents to hold them accountable is to withdraw kids from school. No students, no school.
And no money, which is what most private schools cherish over everything else. Bankrupting the school would definitely get their attention, but it sounds like it's mainly the 12 victims' families who care and the other parents largely don't. Winning their lawsuit is likely the only way to hold the school accountable.
 
Upvote
51 (52 / -1)

citizencoyote

Ars Tribunus Militum
1,576
Subscriptor++
The incident could have been caught early, after the school learned of the images following an anonymous report to a state-run tipline. But officials—who at the time weren’t legally required to act—failed to notify parents or police for six months, as the number of victims continued to grow.
Legal loophole present at the time or not, didn't the school have some sort of anti-bullying measure in place they could have leaned on at the bare minimum? Or was their precious reputation so important that they were eager to look the other way?

Fuck this school and fuck those kids who did it. And as someone else suggested, go after whatever company made the AI that they used.
 
Upvote
62 (65 / -3)

jgee43

Ars Scholae Palatinae
702
Subscriptor++
"A concerned student saw it and reported the image through a state tip line." Sounds like the student pretty much knew the school would ignore them or would retaliate against them for reporting.
It may not have been an intimidation/apathy thing, so I wouldn't jump straight to that conclusion. Social pressure can be rough, and even just the fear of being perceived as being part of it could keep a kid from reporting it to an administrator. As an adult, it's easy for me to forget the social pressures involved with being a teenager--and I spend all my working hours with them.

Teenagers have always had it rough because of biology. Now it's even tougher because of the technology climate they have to grow up in. Your worst decisions can be well-documented in a way they never could be before, and the conflicts and struggles you have follow you back home every day and flash their notifications in your face.

Edit: As I said, I'm giving the school staff the benefit of the doubt. However, with the updates to the policy on not speaking negatively of the school, there's a very good chance that it's not warranted in this case. (I missed that on my first reading.)
 
Last edited:
Upvote
32 (33 / -1)
In Star Trek: TNG, Barclay uses the holodeck for sex, including creating versions of crew mates like Troi. It wasn't her, but it looked like her.

I used to think the holodeck would be incredible to have. And it would be. But it'd turn into a debauched sex chamber that most crew members would become addicted to in about a week.
 
Upvote
64 (66 / -2)

jonah

Ars Tribunus Angusticlavius
6,608
Look, the school should have reported it, if only to the parents of the affected kids.

But seriously, the fact that these boys were acting like sociopathic pricks is not the school's fault. That would be a) the parents, and b) the kids themselves. But I doubt either group is eager to look in the mirror and assign blame where it belongs.

Also the school's pockets are deeper, so naturally that's where the lawsuits go.
 
Upvote
-8 (25 / -33)
In Star Trek: TNG, Barclay uses the holodeck for sex, including creating versions of crew mates like Troi. It wasn't her, but it looked like her.

I used to think the holodeck would be incredible to have. And it would be. But it'd turn into a debauched sex chamber that most crew members would become addicted to in about a week.
Quark beat you to it with the holosuites on DS9. And his brother Rom occasionally complained about having to mop them...
 
Upvote
45 (45 / 0)

JohnCarter17

Ars Praefectus
5,734
Subscriptor++
I am sorry, but I don't think the school has a leg to stand on in their case.

I work for a Texas school district and every year go through mandated training. I cannot see how the boys' actions don't fall under sexual harassment. That is the area where the school has a responsibility and didn't meet it.

Edit: I am assuming that state requirements would also apply to a private school.
 
Upvote
54 (55 / -1)

SGJ

Ars Praetorian
519
Subscriptor++
Not defending the school district's actions completely here, but schools have pretty limited options when it comes to dealing with violations of law that happen online--and if the sharing doesn't occur on school grounds during the school day those options diminish even further. As soon as you, as a school or school district, try to regulate student behavior outside of that time window you end up getting sued by the other end of the spectrum. I don't know the full details of the case (and it's not clear to me here if the Discord activity happened at school or not at school), so I have to reserve some judgement.

With that big caveat out of the way, the district still screwed up big time by not reporting to the police. We had a sextortion ring that we became aware of at our school (we're in a relatively small district, just one small-to-medium sized high school), which was not operating at school. That being said, because we became aware of it, our principal held an assembly with every kid in the school at once, explained what was going on, explained exactly what our response would be to any reporting that was done, and encouraged any victims or people with knowledge of the situation to come forward and meet with either admin or counselors. Within two weeks we had two perpetrators identified and reported to police and the other participants so scared that it ended pretty quickly.
I think this is exactly the right way to deal with this issue within a school. Ignoring it and pretending it never happened is clearly wrong. However, there is only so much a school, or all schools, can do and the companies enabling this also need to be tackled.
 
Upvote
17 (18 / -1)

Super King

Ars Praetorian
468
Subscriptor
Private juvie and prison owners are popping champagne. Now that weed is being phased out as their source of income, sexually frustrated teenage boys will provide a much needed boost for prison population.

Edit: correct autocorrect
That's quite the false equivalency. Choosing to smoke weed and generating CSAM are not remotely similar. The old "boys will be boys" excuse is pathetic.
 
Upvote
56 (65 / -9)
I'm glad the school's being held responsible for not reporting this earlier. This abusive and humiliating material is something that's going to stick with the victims it was depicting for a long time.

I'm also glad the kids who solicited it are being held accountable, because of course they should be. It's a shame the only way to go about this was through child abuse laws. Not that this doesn't violate them, but it means the person over the age of 18 wouldn't have gotten justice otherwise. It's taking a lot of time for the law to catch up to this new level of abuse, and it does need its own category so everyone victimized by it is caught in this net.

And EVERYONE responsible is punished. This is why I specify that the alleged perpetrators, the students specifically I mean, SOLICITED the abusive material. They didn't "create" it. It's a distinction with one difference, there's an ADDITIONAL party to sue who actually DID create it, and that's the company hosting the software.
 
Upvote
29 (29 / 0)
Did the sexual harassment happen on school grounds? Were the victims even aware of it before the culprits got caught?
The right question is "when did school staff become aware of it?". The moment they become aware of it, it becomes their responsibility to act. That's what the law has said for some time.
 
Upvote
53 (54 / -1)

tgx

Ars Scholae Palatinae
1,403
In Star Trek: TNG, Barclay uses the holodeck for sex, including creating versions of crew mates like Troi. It wasn't her, but it looked like her.

I used to think the holodeck would be incredible to have. And it would be. But it'd turn into a debauched sex chamber that most crew members would become addicted to in about a week.
Don't forget Barclay was also treated for holo-addiction.
 
Upvote
15 (15 / 0)
Don't forget Barclay was also treated for holo-addiction.
 
Upvote
39 (40 / -1)
Don't forget Barclay was also treated for holo-addiction.
Even in the context of the show, this was considered crossing an inappropriate boundary. The MOST charitable I could be with that is Barclay wasn't PUBLISHING that material as a holonovel on subspace channels... so I suppose strictly speaking he wasn't violating space-law, but it was still a violation, and more so because that computer knew EXACTLY what Troi looked like. Honestly, Geordi or whoever's in charge of IT on that ship needs to lock down access to those files better.
 
Upvote
28 (29 / -1)
Even in the context of the show, this was considered crossing an inappropriate boundary.
The show did a good job of taking the holodeck to its logical end with Barclay, but they deeply undersold the reality that the vast majority of every crew on every ship would need that counseling.
 
Upvote
26 (26 / 0)

Uragan

Ars Legatus Legionis
11,172
And if it was an OSS tool? Should we sue Hugging Face out of existence?

If an AI company knowingly facilitated this, sure - sue away, but I don't think that holding tool makers liable for the use of those tools is a good idea in most cases.
LLMs aren’t “tools” like Adobe Photoshop is.

Given a prompt, LLMs lacking guardrails will generate CSAM and NCII on their own.
 
Upvote
29 (31 / -2)

Fatesrider

Ars Legatus Legionis
24,977
Subscriptor
With the assumption that children are, well, children, the notion that they can be trusted to generally act in any kind of adult or even legal manner is genuinely laughable. They're barely civilized savages. And I use "civilized" very conditionally.

What to do? I guess if it was up to me, I'd mandate that anyone under 18 can't have an Internet-connectable phone. No data, no wifi and no MMS in text. Not just disabled, but just not included as parts in the phones. This provides the child with communication via telephone and text, which back in my day would have been enormously popular even if it may be seen as completely lame today.

No Google Play Store for them. And no way to do their AI altering except at home, which vastly increases the risk of them getting caught. So no e-sharing of images. No e-bullying, with robust, built-in, easy-to-use blocking of text messages. Without the hardware to do it, it's a lot harder to side-load apps, too.

Granted, it won't solve the problem. Human nature is such that solutions to shit caused by human nature are, well, non-existent. There are ALWAYS going to be exceptions. But it would likely knock down the volume of shit behavior and influences a lot. For all that kids today are growing up with tech that we'd have killed for back in the day, they seem to be getting stupider and more malicious. It's not "kids". It's a flaw in humanity that we don't mature until our mid-20s. Everyone YOUNGER is highly prone to severe lapses in judgment. It's not even a generational thing. If we had that tech back then, we'd have done the same shit kids are doing today.

Humans will human, after all. And this is a human nature thing.

Kids today are just finding many other new and exciting ways to prove that Heinlein's theory about how to raise boys should be the law of the land for every child (Quote at the bottom of the list), just like we did back in the day. Only back then, it couldn't get out to the world and fuck up lives for decades with the same frequency it does today.

Hence why I think an interactive Internet was the worst weapon humanity ever invented.
 
Upvote
-19 (12 / -31)

Scifigod

Ars Tribunus Angusticlavius
8,677
Subscriptor++
Each high school victim had to go through “binders of photos,” marking their own faces to help cops track the total number of victims
I don't even have the right word for this. Making a bunch of teens sift through a stack of CSAM to pick out the (fake but extremely realistic) pictures of themselves.

Edit: terminology correction...
 
Last edited:
Upvote
53 (53 / 0)