It's a principal. It's their school. The idea that there is nothing they could do to investigate this or pass along info is simply absurd.

> Would you? Interesting... how would you check? Do you know how Discord private servers work? Threads?
> What if they shared the same images via snail mail? Would you expect federal post offices to open every letter and check if there are AI CSAM pics inside?
The kids were sharing the pictures on Discord and accidentally posted one or some on the wrong chat, so they were distributing the pictures.

> So now a bunch of parents want a lot of money from the shareholders of the school?
> That looks a lot more like greed than seeking justice. I wonder how many "affected" families would be willing to let some more AI pics be created just so they can have another excuse to sue for money. Certainly the money won't wash away embarrassing childhood memories (if any... since it's not clear who saw those pics).
> Ultimately, I'm having trouble understanding where we draw the line on "victim" of AI deepfakes. From the story, I gather those two pricks didn't share the pics with anyone in the school, intentionally. Only one other kid saw them, accidentally. So how are the girls "victims" in this case? They didn't know... until their parents chose to turn it into a public scandal. Else they would have said something earlier, right? Say some poor sod with a fetish for small feet picks random people's pics from a public school event and pictures all of them with small feet, barefoot. Are they all "victims"? I'd post a still from Fast Times at Ridgemont High, but I don't want some of you to become victims and go after my money... I'm too poor anyway.
Dude, I'm not here to be judge, jury, and executioner, but you're sure bending over backwards to defend his behavior. Speaks volumes about your moral compass.

> You're assuming there was actionable info in the first place. On what basis? Do you know what sort of data he got from the state tip line? Certainly not pics.
> "Allegedly" is not used randomly here. If I tell you now there's some more AI CSAM shared on Discord, what are you going to do, exactly? Go trawl all Discord channels, to make sure you keep your moral standing? According to you, there's no excuse for doing nothing. Go do something...
> You're also ignoring the fact that later he contacted law enforcement, when he received additional information.
For six months he knew the issue was sexualized deepfakes of minor students, he knew it was child-on-child abuse, and he sat on it because the law didn't force him to do more. He knew images were involved even if he did not see them with his own eyes. His internal moral compass amounted to doing only what the law forced him to do. No moral person sits on the knowledge that sexualized pictures of young girls are floating around in their own school and does nothing for so long, while more girls could become victims.

> Lancaster District Attorney Heather Adams concluded that Micciche was not obligated to report the images because of a “loophole” [...] the loophole excused schools from mandatory reporting requirements for child-on-child abuse
Except Discord can filter images to trigger warnings to the administrators/moderators.

> Would you? Interesting... how would you check? Do you know how Discord private servers work? Threads?
> What if they shared the same images via snail mail? Would you expect federal post offices to open every letter and check if there are AI CSAM pics inside?
That is an optional feature for admins and moderators of the server. Basically, the same people that create these shady image exchange groups.

> Except Discord can filter images to trigger warnings to the administrators/moderators.
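For what it's worth, the filter being argued about here boils down to a per-server opt-in plus a hook over message attachments. A minimal sketch of the kind of check a moderation bot could run (the helper name and MIME-type list are illustrative assumptions, not Discord's actual server-side implementation):

```python
# Hypothetical sketch: flagging image uploads for moderator review.
# The helper and the MIME-type list are assumptions for illustration,
# not Discord's own code.

IMAGE_TYPES = {"image/png", "image/jpeg", "image/gif", "image/webp"}

def needs_review(attachments):
    """Return True if any attachment reports an image MIME type.

    Works on any objects exposing a `content_type` attribute, which
    matches the shape of discord.Attachment in the discord.py library.
    """
    return any((a.content_type or "") in IMAGE_TYPES for a in attachments)

# In a real bot this check would be wired into the on_message event and
# flagged messages forwarded to a moderator channel -- which is exactly
# the step that only happens if the server's admins opt in and act on it.
```

Which is the point being made above: the hook only fires if the people running the server enable it and respond to it.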
But what if you never find out your family died? Then it would be fine, right?
And if you were a sociopath, your family dying wouldn't bother you anyway, so we should all just be sociopaths, right? I mean, that puts the onus on society and the victim not to be bothered by things, instead of on punishing the person violating others.
The main point we seem to disagree upon is your certainty that he had enough info, solely from the wording of this Ars article, which doesn't actually quote the DA. And you conveniently ignore the fact that even law enforcement didn't act when given more information than the principal had initially. Yet you are certain he was sitting on this actionable info? What makes you so certain? Mob mentality?

> Dude, I'm not here to be judge, jury, and executioner, but you're sure bending over backwards to defend his behavior. Speaks volumes about your moral compass.
He clearly had reasonable and relevant information and he chose to do nothing about it for 6+ months. The information is in the damn article. He got a tip from the state tip line, and the fine article tells us, in the words of a district attorney, what he received.
For six months he knew the issue was sexualized deepfakes of minor students, he knew it was child-on-child abuse, and he sat on it because the law didn't force him to do more. He knew images were involved even if he did not see them with his own eyes. His internal moral compass amounted to doing only what the law forced him to do. No moral person sits on the knowledge that sexualized pictures of young girls are floating around in their own school and does nothing for so long, while more girls could become victims.
You can play with words all you want; he might not have broken the law, but he's as terrible as anyone else who pretends this is fine, or anything less than immoral and abhorrent behavior, especially coming from a parent.
Didn't cross my mind, since that's not something I'd ever have to worry about...
In your case, I guess you would be OK with people distributing a picture of you having sex with an underage child, or a goat.
Something the girls thought they would never have to worry about as well. But it happened.

> Didn't cross my mind, since that's not something I'd ever have to worry about...
Unlike you, apparently...
If you don't have actionable information but do have a tip that something is going on, you investigate it; you don't sit on your hands for six months.

> The main point we seem to disagree upon is your certainty that he had enough info, solely from the wording of this Ars article, which doesn't actually quote the DA. And you conveniently ignore the fact that even law enforcement didn't act when given more information than the principal had initially. Yet you are certain he was sitting on this actionable info? What makes you so certain? Mob mentality?
> If you take the time to look through the links, you'll find only one including a statement from DA Adams. You'll also find that Micciche wasn't the only one who knew since Nov '23 about the safe line tip, though nowhere is it clearly written what info he got then. And that the DA didn't refer to him directly at any point in her statements.
Except your explicit argument was that it isn't a big deal. You said that in multiple ways in your original comment. You're just walking that back now. GTFO of here with this dishonest bullshit.

> I am going to have to spell this out even more.
> I never said things would be fine or that there was no harm done. I am talking about the effect of an act on one person. Just because an act does not affect one person does not prevent it from affecting another or others.
> In your family example, if person A never finds out their family died, then the effect on person A is probably minimal. Now the follow-up question to ask is why person A did not find out their family died.
> Your sociopath example is an interesting one, and one that I can actually use to further expand my point. You said a sociopath wouldn’t be bothered if their family died. But we actually cannot be certain of that. Yes, the sociopath’s family dying might not affect the person’s mental state, but it might affect them materially. If the family was providing financial support and as a result the financial support is removed or reduced, the sociopath would definitely be negatively affected.
> This is the difference that I am trying to point out and not doing a very good job of. A family dying has direct physical consequences for most individuals. The effect of a naked picture depends solely on the individual and the society a person lives in.
> Yes, we need to help the current victims, and we need to prevent further crimes of this sort.
> However, we also need to educate people to minimize the negative effects of these crimes on the victims. For example, we need to educate potential employers that a naked picture of a potential employee could be a fake and cannot be used on its own as grounds not to hire them.
Thanks to Lower Decks, it's official canon now that emptying the holodeck biofilters is one of the worst jobs onboard Federation ships.

> In Star Trek: TNG, Barclay uses the holodeck for sex, including creating versions of crewmates like Troi. It wasn't her, but it looked like her.
> I used to think the holodeck would be incredible to have. And it would be. But it'd turn into a debauched sex chamber that most crew members would become addicted to in about a week.
If you don't have actionable information, what action are you going to take in order to investigate?

> If you don't have actionable information but do have a tip that something is going on, you investigate it; you don't sit on your hands for six months.
This is some "don't bring moral trouble on yourself" bullshit.
AI CSAM happened. It's not the same thing. You're confused on so many levels by technology; I'm surprised you're reading Ars. Or maybe just this one article, due to its title?

> Something the girls thought they would never have to worry about as well. But it happened.
The law doesn't work that way. Once a school district employee has knowledge or suspicion of a crime, they have a duty to report it to the authorities (with very strict time limits depending upon the abuse). Sexual abuse, human trafficking, and a lot of other depressing things.

> Did the sexual harassment happen on school grounds? Were the victims even aware of it before the culprits got caught?
I don't know... maybe ask around? Have teachers be on the lookout? Warn parents that something might be happening? I'm not talking about doing anything dramatic, like phone searches. But you should try.

> If you don't have actionable information, what action are you going to take in order to investigate?
Have teachers be on the lookout for Discord channels?

> I don't know... maybe ask around? Have teachers be on the lookout? Warn parents that something might be happening? I'm not talking about doing anything dramatic, like phone searches. But you should try.
The people RUNNING Discord know that, because Discord doesn't actually have private servers. They host them all themselves, and they ROUTINELY check them for violations of the TOS, because there isn't actually an expectation of privacy. Granted, this is WHY a lot of people are leaving Discord in droves: they want personally hosted chat solutions where they actually DO have that expectation of privacy.

Had Discord instead set things up so that not even THEY could monitor the encrypted contents of their servers, then there wouldn't be an expectation to examine the servers routinely for offending content, because there IS an expectation of privacy. It's the difference between watching someone acting suspiciously at a park... vs. following them home and shoving a camera against their window to record what's happening in there.

Because of the nature of Discord, there is an expectation that they do some basic monitoring of behavior from time to time, especially after hearing reports. This ALREADY happens. Again, this is coming from someone who holds the right to privacy in high regard and is not willing to sacrifice it in the name of "protecting kids", but I absolutely WILL push for such investigations in a service like Discord, where you already agreed they can peruse your communications at any time they feel like for TOS violations. Investigating the distribution of NCII absolutely qualifies.

> Would you? Interesting... how would you check? Do you know how Discord private servers work? Threads?
> What if they shared the same images via snail mail? Would you expect federal post offices to open every letter and check if there are AI CSAM pics inside?
Alright, you can stop right there, because no one is suggesting going to any of those extremes. But teachers ARE regularly involved in their students' lives. It comes with the job. There are some bad ones that just clock in, read from a book, and clock out, but the majority actually care. I know this because I used to work with teachers, and I know that the end of their shift is not the end of the job.

There are several hours AFTER school's out where the teacher is not just grading work, but in conversation with other teachers when they notice issues with the assignments, and in conversations with administration and counseling if they've noticed behavioral patterns or other students they don't get along with well (or, conversely, other students that they seem to get along with great). There's a LOT of behind-the-scenes planning that goes into making sure their entire lesson plan lines up with graduating a fully educated student, and into making sure they do well in general. And, yes, a LOT of contact with the parents. Some send out e-mails multiple times a day to specific parents about specific matters regarding a student.

> Have teachers be on the lookout for Discord channels?
Teenage kids snap pics all the time; that's hardly suspicious. Remember, they didn't share these AI fakes inside the school. You can't search their phones; that would be a civil rights violation. And even if they gave you their credentials, you'd be more likely to find their own nude selfies than someone else's. Is that what you want to see?
Only in that you are presumably an adult. One can never tell these days. For all we know, you are one of the boys?

> AI CSAM happened. It's not the same thing. You're confused on so many levels by technology; I'm surprised you're reading Ars. Or maybe just this one article, due to its title?
So the defense you mount is that authorities and others also did nothing? Solid. Yeah, they all suck; Micciche is in good company there. Doesn't make him a man with morals, just a man surrounded by others like him.

> The main point we seem to disagree upon is your certainty that he had enough info, solely from the wording of this Ars article, which doesn't actually quote the DA. And you conveniently ignore the fact that even law enforcement didn't act when given more information than the principal had initially. Yet you are certain he was sitting on this actionable info? What makes you so certain? Mob mentality?
> If you take the time to look through the links, you'll find only one including a statement from DA Adams. You'll also find that Micciche wasn't the only one who knew since Nov '23 about the safe line tip, though nowhere is it clearly written what info he got then. And that the DA didn't refer to him directly at any point in her statements.
“I honestly feel that the immediate breakdown was in the adults in the room not immediately contacting the police,” [senator] Malone said Monday. “Regardless of you know, one day, six days, 10 days, every bit of time that you’re a victim of that kind of action is reprehensible… I don’t feel that this should have happened at all”
How do you know that they weren’t passing around the images via Discord while at school?

> Have teachers be on the lookout for Discord channels?
> Teenage kids snap pics all the time; that's hardly suspicious. Remember, they didn't share these AI fakes inside the school. You can't search their phones; that would be a civil rights violation. And even if they gave you their credentials, you'd be more likely to find their own nude selfies than someone else's. Is that what you want to see?
No, but PTA members do. There've been cases of teachers being removed after a member of the PTA notified the school that there was a nude of them online.

> What's your point? School principals watch all the porn on the internet?
You seem to be trying to justify making CSAM by arguing that if we just change the whole of Western civilization to not regard it as a big deal, it won't be one anymore.

> Please point out to me where I said this wasn’t a big deal.
> It seems people cannot understand differences of degree.
> The point I keep trying to make is that this act is only devastating if society and the individual make it so, which makes this sort of crime different from many others.
Related to this, we really are just a bunch of avatars saying stuff on the internet, and none of us are CERTAIN these people are to blame beyond any reasonable doubt. That's what trials are FOR: determining guilt. What the people complaining about this lawsuit seem to be upset at is the lawsuit itself. No, I don't know of any guilt, but I know there's certainly PLENTY of cause to bring this to court and then TRY them. Why is THAT a problem?

> No, but PTA members do. There've been cases of teachers being removed after a member of the PTA notified the school that there was a nude of them online.
I know. Please note I was specifically replying, in isolation, to someone who was using the "People shouldn't be punished for a Stupid Prank" defense, in order to point out that even if it were a stupid prank, it has gone horribly wrong and they should absolutely be punished.

> There are absolutely victims in this case, and there is also active malice involved. So the decision to prosecute is a no-brainer, unless you are one of the chuds here trying to justify CSAM. The real difficulty is: what is the appropriate punishment? And that is a very difficult question to answer. There has to be enough deterrence to have a chilling effect on future would-be offenders, because this is going to get worse if not nipped in the bud, but to ruin a life for the next 60 years over this crime is, in my opinion, ludicrous and counterproductive overkill.
> Most teenage boys grow out of the brain-free asshole stage spontaneously, so having them live under a bridge for the next 60 years merely adds to overall societal problems, as well as being a wildly disproportionate response. Shaming, forced apologies, and a couple of hundred hours of community service, followed by sealed records when majority is reached, so there is an assumption of rehabilitation once the adult world is reached, may be the right mix for the majority, most of whom will not reoffend. But enough of a sting that the non-sociopathic majority think twice, and societal disapproval is unambiguous.
Please point out to me where I said this wasn’t a big deal.
It seems people cannot understand differences of degree.
The point I keep trying to make is that this act is only devastating if society and the individual make it so, which makes this sort of crime different from many others.
Again, something can be a big deal and not be devastating. Losing your job is a big deal for most, but for many it will not be devastating unless they cannot find another job relatively soon. Somebody stealing your car is a big deal for most, but again it is not devastating for most. There are several degrees below devastation that are still a big deal.
If we as a society have the option to minimize the harm of a crime, why would we not take steps to minimize it?
There is nothing intelligent about advocating for CSAM.

> Get with the programme!
> This is an outrage echo chamber, not an intelligent discussion forum.
You seem to be trying to justify making CSAM by arguing that if we just change the whole of Western civilization to not regard it as a big deal, it won't be one anymore.
Do I even need to point out the flaw in this plan?
But don’t you see, if we ignore how non-psychopath brains work, repress our trauma, and spend our lives dead inside, then this won’t be a problem! And if that sounds stupid, that’s not what I meant, even though I said it! It just means I’m right; that’s why everyone is telling me I’m wrong!

> It’s very funny that we have someone here using two accounts to advocate against consent.
> Oh wait, the other thing. Tedious.
They show up on every article about any non-consensual intimate images to argue that it’s not actually a problem and everyone should just post nudes of themselves online. It’s not a stupid take they’re posting; more a red flag.

> I can practically hear the voice getting shriller. Everyone posts stupid takes now and then. I know I do. Just let it go with what shreds of dignity remain to you.
In such a society, the woman would be devastated whether she knew or not, because it’s likely that if anyone saw it, they would fucking kill her. Thanks for pointing out an example of why your hot take sucks.

> Actually, I just thought of another scenario. What if a woman in a society that requires head coverings and full-body clothing has a fake picture of her in shorts and a t-shirt without a head covering made of her? Should this woman be devastated? Should this woman be made to suffer because this fake picture is out there?
But don’t you see, if we ignore how non-psychopath brains work, repress our trauma, and spend our lives dead inside, then this won’t be a problem! And if that sounds stupid, that’s not what I meant, even though I said it! It just means I’m right; that’s why everyone is telling me I’m wrong!
/S
You saying it's so doesn't make it OK. Why are you so bent on removing people's privacy and bodily autonomy? Does easier access to images to jerk off to matter that much to you?

> I have proved beyond a reasonable doubt that there are no direct effects of these crimes, and the indirect effects are solely proportional to society and the beliefs of the individual.
Why does this matter?

> Here is my take on this... first, I don't see where it shows that the boys' creation of the illegal AI CSAM was done AT the school.
The school was notified that there was a severe and significant problem going on with its students and then sat on it for six months, during which additional students were potentially targeted and affected.

> If they did do most of the AI at the school, then I can understand how this is moving forward.
Who’s to say that they aren’t? There may be other legal cases going forward that seek to hold the parents accountable.

> But given that the crime (creating illegal AI CSAM) was possibly performed ANYWHERE, why aren't the parents ALSO responsible?
Nice strawman you’re building there.

> It just seems the parents of these offending boys are getting off scot-free and not accountable for their darling little devils!
"Sins of the father" is not a recognized legal concept in U.S. law, nor should it be. It takes a LOT of extenuating circumstances before parents can be considered partly responsible for the actions of their charges. If it can be demonstrated that the parents KNEW their kids were doing this and did nothing, yes, they can be tried as well. An argument could even be made that they SHOULD have known, but these kids are at the age where the parents are "letting go of the reigns" a bit and allowing some autonomy, so perhaps not. In any case, the reason the school is being charged is apparently because the prosecutors believe there is sufficient evidence to show THEY knew and did nothing.Here is my take on this... first, I don't see where it shows that the boys that created the illegal AI CSAM was done AT the school. If they did do most of the AI at the school, then I can understand how this is moving forward. But establishing the fact that the crime (creating illegal AI CSAM) was possibly performed ANYWHERE, then why aren't the parents ALSO responsible? It just seems the parents of these offending boys are getting off scott-free and not accoutable to there darling little devils!
A nasty ex-boyfriend takes nude pictures of you and publishes them. You know nothing about it. Then these pictures reach your workplace. There are quickly rumours that you are a pervert publishing nude photos of yourself, and that makes you obviously a pedophile who cannot be trusted and damages the reputation of the company. You are fired, and HR is full of prudes who cannot even tell you what you have supposedly done.

> I have proved beyond a reasonable doubt that there are no direct effects of these crimes, and the indirect effects are solely proportional to society and the beliefs of the individual.
> Why would we not strive to minimize the bad effects of such a crime?
> You are the one desiring women to be devastated by these crimes when there is no physical reason for the victims of these crimes to be devastated.