Sharing deepfake porn could lead to lengthy prison time under proposed law

Hold up a sec.

What is your actual point here?
When I was replying to that person, my point was that having a 1660Ti from 2019 with 6 GB memory isn't a shitty graphics card that low income people should be able to afford. It's better than my laptop GPU from 2021 with 4 GB memory that is my current computer, and I'm not low income.
 
Upvote
-10 (1 / -11)
If you literally applied the "someone under the age of consent" nudity rule you've specified here, every artist who ever painted a nude cherub or a naked baby Jesus would be in prison.
No one is confusing a painting for a real photo. AI photos CAN be confused for it. See the difference?
 
Upvote
2 (5 / -3)

cyberfunk

Ars Scholae Palatinae
1,400
Intent to harm the victim. Intent to portray the victim naked. Intent to make the character appear in their likeness. Intent is currently what separates fraud from incompetent bookkeeping.
So, to me, intent is an important but very difficult-to-prove component here. How do you prove a person's mental state when they compose a piece of “art”?
 
Upvote
-3 (2 / -5)

cyberfunk

Ars Scholae Palatinae
1,400
No one is confusing a painting for a real photo. AI photos CAN be confused for it. See the difference?
I think part of the issue here is that the bill text does not specify photos or photorealism at ALL. They merely say “digital depiction”, which is crazy broad without any narrowing language, which I cannot find in the text.

They reference the following definition from 18 USC § 2256(5)

(5) “visual depiction” includes undeveloped film and videotape, data stored on computer disk or by electronic means which is capable of conversion into a visual image, and data which is capable of conversion into a visual image that has been transmitted by any means, whether or not stored in a permanent format;

Again, there's nothing here that's photorealistic at all. The devil is in the details, and this is a poorly written and poorly scoped bill, which matters immensely when it comes to the scope of the law.

If they mean photorealistic, or photo-like, or anything of that nature, they need to say so, and be very clear. A rushed, poorly written law helps nobody and can do a great deal of harm.
 
Upvote
0 (5 / -5)

drewcoo

Wise, Aged Ars Veteran
134
I really can't see how making and distributing a deepfake porn of someone would be different than distributing revenge porn of them. The end goal is the same: to humiliate the target.
It's because we're a nation of prudes. Sex is taboo, so anything sex-related that's not about the sex (rape, revenge pr0n, etc.) is overshadowed by and confused with the sexual element.

As an aside, I wonder if today's porn works as a sort of A Clockwork Orange aversion therapy, making us more puritanical. And if that's not the case, what would happen if we ran some A Clockwork Orange-style aversion therapy sessions with today's porn on all the puritans?
 
Upvote
-10 (1 / -11)

r0twhylr

Ars Praefectus
3,358
Subscriptor++
When I was replying to that person, my point was that having a 1660Ti from 2019 with 6 GB memory isn't a shitty graphics card that low income people should be able to afford. It's better than my laptop GPU from 2021 with 4 GB memory that is my current computer, and I'm not low income.
I guess my question is why that's relevant here, and how it relates to the original point you were replying to: that this would disproportionately impact less-privileged kids. Not that they would make more of this stuff, but that it would impact them disproportionately, i.e., more likely to be prosecuted, convicted, and have their lives ruined than a more privileged kid.

No one said kids in poverty have high-end GPUs. If you have internet connectivity and a little bit of curiosity & savvy, it's not hard to get something like that going in Google Colab.
 
Upvote
-3 (1 / -4)

gafx

Ars Scholae Palatinae
906
That's not universally true. There are some jurisdictions where you need consent 100% of the time, excluding very narrow carve-outs.
Where? The law in the US is pretty consistent that you can take photos of anyone in public and anywhere there is not a reasonable expectation of privacy. The photographer/artist owns the file and does not need consent to use or edit it.

If a woman goes topless on the beach, anyone can legally take her photo and post it online without consent, as the photographer owns the rights to the image. It may be rude, but it's legal unless she's underage. There are groups of sleazeballs who walk down Haulover Beach videotaping and photographing nude women sunbathing, and the cops tell everyone there is nothing they can do to stop it.

If you needed people's consent to take photos in public, you would not be able to take any photos in public. The laws were created a long time ago to protect artists, photographers, and news cameramen.
 
Upvote
9 (10 / -1)

gafx

Ars Scholae Palatinae
906
If you literally applied the "someone under the age of consent" nudity rule you've specified here, every artist who ever painted a nude cherub or a naked baby Jesus would be in prison.
I like to think of Baby Jesus not as nude, but dressed in one of those little tuxedo t-shirts. The kind that says I'm Jesus, but I'm here to party!
 
Upvote
2 (3 / -1)

Uragan

Ars Legatus Legionis
11,175
It's because we're a nation of prudes. Sex is taboo, so anything sex-related that's not about the sex (rape, revenge pr0n, etc.) is overshadowed by and confused with the sexual element.

As an aside, I wonder if today's porn works as a sort of A Clockwork Orange aversion therapy, making us more puritanical. And if that's not the case, what would happen if we ran some A Clockwork Orange-style aversion therapy sessions with today's porn on all the puritans?
You're wondering what would happen if we exposed people to pornography against their will and under duress?
 
Upvote
3 (3 / 0)
I think part of the issue here is that the bill text does not specify photos or photorealism at ALL. They merely say “digital depiction”, which is crazy broad without any narrowing language, which I cannot find in the text.

They reference the following definition from 18 USC § 2256(5)

(5) “visual depiction” includes undeveloped film and videotape, data stored on computer disk or by electronic means which is capable of conversion into a visual image, and data which is capable of conversion into a visual image that has been transmitted by any means, whether or not stored in a permanent format;

Again, there's nothing here that's photorealistic at all. The devil is in the details, and this is a poorly written and poorly scoped bill, which matters immensely when it comes to the scope of the law.

If they mean photorealistic, or photo-like, or anything of that nature, they need to say so, and be very clear. A rushed, poorly written law helps nobody and can do a great deal of harm.
The meaning seems clear to everyone else here, but if you're seriously arguing that this bill as written would ban cartoon caricatures or scribbling devil horns on a photo of someone you hate, well, no harm in tightening up the language. That's clearly all you want, right? Right...
 
Upvote
1 (1 / 0)

jsully2549

Ars Scholae Palatinae
718
https://www.merriam-webster.com/dictionary/deepfake
A watermarked deepfake is a contradiction in terms. I can think of no use of deepfakes that would use watermarks except possibly journalism (which could probably get out of it by citing the 1st Amendment).

Edit: I'm surprised malicious deepfake distribution isn't covered by existing defamation statutes.
I think the intent was that all AI images would contain watermarks to identify the source.
 
Upvote
0 (0 / 0)

Plorkie

Ars Centurion
312
Subscriptor++
Forcing a public statement and apology for creating and sharing what amounts to libel goes a long way towards restoring the damage.

But, in the end, it's not really up to YOU to decide what best serves victims of this, is it? Victims themselves have been asking for this, and they're in the best position to decide. This is a very serious thing they're doing, not some light-hearted fun like drawing a mustache on a picture. This is something passing itself off as reality.

(I'm going to just mention that it's an odd coincidence that this proposed method of "worrying about making the victim whole" involves protecting the perpetrators from consequence. Very odd.)
I think I'm not quite communicating my meaning well here, but I also think you're putting words in my mouth. In no way do I want to protect the perpetrators, and I'm not in any way accusing these people pushing for these laws of being capricious. If this is what will help them, then I specifically said that I support it. I simply want to say that the laws SHOULD be about the victims and focusing on giving them what they need.

In my post, I was also trying to raise my own experience seeing kids (specifically) destroy their own lives and the lives around them. I don't want to defend their crimes, but I do lament that the consequences of kids being idiots keep getting more and more severe. And by consequences I DON'T mean punishment-- I mean the damage caused.
I'm not intending to defend the perpetrators, nor am I calling these people mentioned in the article capricious. I just think that when writing these laws, it should be about helping the victim. I actually think public statements and libel laws are exactly the kind of things I mean by that. And I'm not interested in defending the perpetrators-- I just think their punishment should be whatever best serves the victims, whether that's $150k or $1.5 million or a public statement or whatever. It's not about what I want, nor is it about what you want, to me it's about what the victims want.
I suck at communicating in these posts. I'm not trying to defend the perpetrators. I just think it's sad that dumb kids can cause so much more damage while they're being dumb. That, and I think laws shouldn't be about what you OR I want-- they should be about what the victim wants. Whatever can best help them heal.
Really, you're the one suggesting that public statements and libel are good punishments.
I'm only saying that if the victim wants those things, then that's what they should have.
 
Upvote
0 (1 / -1)

fixate

Wise, Aged Ars Veteran
129
While I agree with this in spirit, I don't expect a law to have much of a change on teen boys doing this sort of stupid thing on a lark.
Threat of punishment does work as a deterrent; it just doesn't work 100% of the time. Also, laws like this are put in place to help victims and to give them recourse to stop it from happening.

If there is no law, victims can do absolutely nothing.
 
Upvote
2 (2 / 0)
I guess my question is why that's relevant here, and how it relates to the original point you were replying to: that this would disproportionately impact less-privileged kids. Not that they would make more of this stuff, but that it would impact them disproportionately, i.e., more likely to be prosecuted, convicted, and have their lives ruined than a more privileged kid.
Behind the argument that more underprivileged kids than privileged kids would be convicted is the assumption that both rich and poor have equal access to technology. This premise is completely false and divorced from reality. The computers of people who can even afford to buy a new computer, but have a limited budget, are generally netbooks, not gaming desktops, especially if they rent tiny apartments and move a lot because they live in problematic buildings (desktops take up space and are less portable).

No one said kids in poverty have high-end GPUs. If you have internet connectivity and a little bit of curiosity & savvy, it's not hard to get something like that going in Google Colab.
Creating deepfakes is not allowed in Google Colab.
 
Upvote
-7 (0 / -7)

Deleted member 807857

Guest
If the perpetrators are adults, no problem at all. It's the teens and early teens "doing it for fun" and still too immature to appreciate the harm done to their victims that I am more concerned about. I would hate to see 10 years slapped onto a still-idiotic 13-year-old. Hopefully prosecutors and judges will proceed judiciously...
So, you openly admit that it does harm to the victims, but are ok with that as long as the person is "mature" enough to know that it does harm to the victims?
 
Upvote
0 (0 / 0)

Deleted member 807857

Guest
It's strange the law doesn't already account for whatever penalty there should be for posting explicit photos of someone without their consent. Maybe such a law doesn't exist, in which case I am now wondering what congresscritters are profiting from it...
If there is a law then muggie taylor-greene needs to be arrested and charged.
 
Upvote
1 (1 / 0)

Deleted member 807857

Guest
Ok, what if somebody were to draw an image of Donald Trump fellating Vladimir Putin? That's absolutely and clearly political speech, not anything intended for titillation. Wanting things to be clearly defined isn't always so that people know how close to the line they can go; it's also how you prevent the law from being abused by bad actors.
Oh my god that would be a horrible picture!
Where can I find it?
 
Upvote
0 (0 / 0)

theSeb

Ars Praefectus
4,500
Subscriptor
If the perpetrators are adults, no problem at all. It's the teens and early teens "doing it for fun" and still too immature to appreciate the harm done to their victims that I am more concerned about. I would hate to see 10 years slapped onto a still-idiotic 13-year-old. Hopefully prosecutors and judges will proceed judiciously...
But you are not worried about the victims of these ‘boys will be boys‘ malicious little wankers?
 
Upvote
-2 (1 / -3)
History suggests otherwise. Certainly punishment may be required, but a possible 10 years appears to be rather over the top.
There have been a plethora of studies showing that longer sentences, making a crime a felony over a misdemeanor, etc in and of themselves do not increase legal compliance with a law (e.g. https://www.vera.org/news/research-shows-that-long-prison-sentences-dont-actually-improve-safety).

When people break a law, assuming they're aware it's a law in the first place (which is becoming more and more difficult as laws have increased substantially in number; see this classic YT video: https://www.youtube.com/watch?v=d-7o9xYp7eE), they're not considering the exact sentence length, if they're even aware of the actual length. Have you SEEN the federal sentencing guidelines calculations?! Rather, there might be a vague understanding that a thing is wrong, and that some things obviously are considered more wrong than others and might have stronger consequences, but no one is counting years and making the decision to commit a crime immediately prior to committing it. "Welp, this sentence is only 5 years if I don't do XYZ, so I'd better commit my crime without XYZ!", said no one ever.

There is no doubt that there need to be consequences for crime. Sometimes that might indeed be incarceration. There ARE dangerous people who harm others in such ways that society needs physical protection from them, particularly if they repeat the same dangerous acts without remorse. Sometimes, though, the consequences need to be more aligned with real education on the wrongdoing, victim recompense, mental health assistance, community service, and so on and so forth. We're VERY quick to slap jail time on everything here in the US as if that's a magical fix for all the things we call "crime," even when we KNOW it doesn't work that way.

In this case, does a HS kid need consequences for doing this... YES! No question! Should it be 10 years in jail? Probably not.
 
Upvote
7 (7 / 0)

happyraul

Ars Praetorian
451
Subscriptor++
If a known compulsive liar tells you things meant to humiliate someone, are they doing harm? The sooner everyone views audio/visual media as a compulsive liar, the sooner deepfakes, gen AI, Photoshop, etc... lose their power to humiliate, mislead, and misinform us, at the cost of being unable to "prove" anything with audio/video/photo recordings any longer.
 
Upvote
-7 (0 / -7)

StuiWooi

Ars Scholae Palatinae
799
If the perpetrators are adults, no problem at all. It's the teens and early teens "doing it for fun" and still too immature to appreciate the harm done to their victims that I am more concerned about. I would hate to see 10 years slapped onto a still-idiotic 13-year-old. Hopefully prosecutors and judges will proceed judiciously...
Case in point, the occurrence pushing lawmakers here: wasn't it reported that none of the images were seen by the victims or school staff? The boys were getting their rocks off and deleted everything when word spread... Not arguing that it wasn't harmful, but it's materially different from an adult making them and posting them online!

In the same vein, back in my day cliques at school would keep "fit" (British slang, which I despise, for attractive) lists, and that goes for both boys and girls.

Along those lines, I also take issue with the quote in the article about imagining it was your wife/sister/daughter; it could surely also be used to humiliate males?
 
Upvote
-3 (0 / -3)
Still-idiotic 13-year-olds can get ten years for murder as a juvenile. Some girls committing suicide would not be a surprising result of deepfake porn.

At age 23 the former-idiot can put their life back together.
Murder is completely different from making pictures, though. A kid can be making them "for research purposes" and leak them unintentionally. He can also leak them anonymously; good luck investigating that. Also, girls can be really mean to each other.

Suicide is a choice. Some girls do it for the stupidest reasons.

Prohibiting kids' stupidity is futile; they are dumb, that's just how it works. A much more realistic approach would be to stop demonizing nudity. "Sticks and stones may break my bones, but pictures shall never hurt me." If faking pictures becomes trivial, there is no point in treating them as evidence of wickedness.

The current attitude to nudity is what makes bullying, blackmailing, and suicides possible.
 
Upvote
-4 (3 / -7)

r0twhylr

Ars Praefectus
3,358
Subscriptor++
Behind the argument that more underprivileged kids than privileged kids would be convicted is the assumption that both rich and poor have equal access to technology. This premise is completely false and divorced from reality. The computers of people who can even afford to buy a new computer, but have a limited budget, are generally netbooks, not gaming desktops, especially if they rent tiny apartments and move a lot because they live in problematic buildings (desktops take up space and are less portable).


Creating deepfakes is not allowed in Google Colab.
I disagree with your assumption. I read "disproportionately impact less-privileged kids" as "in the same legal situation, the low-income person faces a bigger struggle and worse outcome". There may be fewer cases, but that doesn't mean the outcome isn't disproportional.

As to your second point, don't assume that an apparent lack of resources will prevent this activity. Kids are smart, inventive, and resourceful. PCs that are powerful today will become freebie giveaways in a few years. As for what Google allows... let's just say I don't think it is foolproof or couldn't be abused. Plus, there are other resources out there as well, and Stable Diffusion will run on regular CPUs just fine, only slower. My point is that you can't just assume that low-income kids will be magically protected by their lack of resources.
 
Upvote
2 (2 / 0)

cyberfunk

Ars Scholae Palatinae
1,400
The meaning seems clear to everyone else here, but if you're seriously arguing that this bill as written would ban cartoon caricatures or scribbling devil horns on a photo of someone you hate, well, no harm in tightening up the language. That's clearly all you want, right? Right...

The problem isn't what everyone agrees should be the case, but how the law is written. The way it's written, it's very possible to read it overly broadly, and you can bet that someone will.
 
Upvote
4 (5 / -1)

Uragan

Ars Legatus Legionis
11,175
Case in point, the occurrence pushing lawmakers here: wasn't it reported that none of the images were seen by the victims or school staff? The boys were getting their rocks off and deleted everything when word spread... Not arguing that it wasn't harmful, but it's materially different from an adult making them and posting them online!
You're really splitting hairs here.

In the same vein, back in my day cliques at school would keep "fit" (British slang, which I despise, for attractive) lists, and that goes for both boys and girls.
Okay? And what's the point of this red herring? A list is totally different from making sexually suggestive or explicit images of a person who hasn't given their consent.

Along those lines, I also take issue with the quote in the article about imagining it was your wife/sister/daughter; it could surely also be used to humiliate males?
Sure it could. And it would be equally damaging. However, girls and women are far more sexualized by society than men.

Murder is completely different from making pictures, though. A kid can be making them "for research purposes" and leak them unintentionally. He can also leak them anonymously; good luck investigating that. Also, girls can be really mean to each other.
What "research" would this constitute, exactly?

Suicide is a choice. Some girls do it for the stupidest reasons.
So we're victim blaming people now? Yikes.

Prohibiting kids' stupidity is futile; they are dumb, that's just how it works. A much more realistic approach would be to stop demonizing nudity. "Sticks and stones may break my bones, but pictures shall never hurt me." If faking pictures becomes trivial, there is no point in treating them as evidence of wickedness.

The current attitude to nudity is what makes bullying, blackmailing, and suicides possible.
Do you realize what you're advocating for? And are you sure you want to be doing that?
 
Upvote
4 (5 / -1)
I disagree with your assumption. I read "disproportionately impact less-privileged kids" as "in the same legal situation, the low-income person faces a bigger struggle and worse outcome". There may be fewer cases, but that doesn't mean the outcome isn't disproportional.
That is like saying prosecuting offshore tax evasion will disproportionately impact low income people: in the same situation, the low income person faces a bigger struggle and worse outcome. Hypothetically true, but the implication that deciding to prosecute offshore tax evasion is unfortunate because the poor would be disproportionately affected is completely asinine.

As to your second point, don't assume that an apparent lack of resources will prevent this activity. Kids are smart, inventive, and resourceful.
Yes, kids are smart, inventive, and resourceful, but poverty is a real barrier, and poor kids can't just pull themselves up by their own bootstraps to have the same resources as rich kids.

PCs that are powerful today will become freebie giveaways in a few years.
Most poor people haven't been getting free outdated PCs for most of PC history. The pandemic seems to be the exception, but kids went back to the classroom afterward. Again, only social/political change will solve wealth inequality, not computers getting better over time.

As for what Google allows... let's just say I don't think it is foolproof or couldn't be abused.
It is unrealistic to assume that poor kids would be able to bypass Google's abuse detection algorithm and not get caught creating what would be classified as deepfake child porn.

Plus, there are other resources out there as well, and Stable Diffusion will run on regular CPUs just fine, only slower.
It would not run on regular CPUs "just fine." The slowness is not workable on a practical level, especially not on old computers that cannot even handle Zoom calls for remote learning and are time-shared with parents and siblings.

My point is that you can't just assume that low-income kids will be magically protected by their lack of resources.
This is like saying you can't assume low income people will be magically protected from prosecution of offshore tax evasion by their lack of resources. It may be theoretically possible for a low income person to end up in that situation by unusual circumstance, but such a law is not part of the system that disproportionately impacts low income people.
 
Upvote
-4 (0 / -4)

JBinFla

Wise, Aged Ars Veteran
143
Agreed.

What did the parents do?

The adult parents are responsible for their children, whether they were active participants or merely "sperm donors."

It may not take away the pain and humiliation this girl felt, but it will send a strong message to parents: you'd better keep your kids in check.

For the record, this is no different than holding a parent financially responsible for a window a kid breaks. This is nothing new, and I suspect current laws already provide an avenue in the civil courts for recompense from the kid's parents.
 
Upvote
-2 (0 / -2)
I think the intent was that all AI images would contain watermarks to identify the source.
So they're proposing a law to mandate that creative tools embed tracking features? I understand the desire to stop an awful practice, but this seems like something that would be quickly abused by the government against those using the tools for other purposes.

Keeping in mind what resulted from the Patriot Act, and given the magnitude of some of the threats generative AI presents to society, I need to think about whether that tradeoff makes sense.
 
Upvote
-1 (1 / -2)

icypioneer

Smack-Fu Master, in training
31
Financial compensation to the victim sounds great. I hope prison time isn't liberally given to children, and is strictly limited to acts that result in or call for violence. We have enough child prisoners.

At the rate misinformation travels, and with how deepfake quality is increasing, we're overdue for a culture change in how we react to new information. If we were less puritanical, children wouldn't so easily manipulate society into overwhelming and traumatizing their victims.

Troubling times ahead for the quality of fake news.
 
Upvote
-2 (0 / -2)

Uragan

Ars Legatus Legionis
11,175
So they're proposing a law to mandate that creative tools embed tracking features? I understand the desire to stop an awful practice, but this seems like something that would be quickly abused by the government against those using the tools for other purposes.
Watermarks don't "track" a person. They identify the provenance of an object. Watermarking something generated by AI tools would identify it as such, inform a consumer, and allow them to do whatever they want or need to do based on that knowledge.
 
Upvote
2 (2 / 0)

IncorrigibleTroll

Ars Tribunus Angusticlavius
9,228
Watermarks don't "track" a person. They identify the provenance of an object. Watermarking something generated by AI tools would identify it as such, inform a consumer, and allow them to do whatever they want or need to do based on that knowledge.

I assume you mean some sort of digital signature, not a visible watermark. An actual displayed watermark is simply unacceptable for a lot of perfectly legitimate uses of generative AI tools.
 
Upvote
-1 (0 / -1)

Uragan

Ars Legatus Legionis
11,175
I assume you mean some sort of digital signature, not a visible watermark. An actual displayed watermark is simply unacceptable for a lot of perfectly legitimate uses of generative AI tools.
It doesn't need to be visual to be a "watermark." As long as there is some readily available and apparent way to identify something as AI-generated, that's what I meant.
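For what it's worth, a non-visual "watermark" in that sense can be as low-tech as metadata embedded in the file. Here's a minimal sketch using Pillow's PNG text chunks; the ai_generated key is purely my own illustrative invention (no standard defines it), and plain metadata like this is trivially stripped by re-encoding or screenshots, which is why real provenance schemes pair signed metadata with robust watermarking:

```python
from PIL import Image, PngImagePlugin

def tag_as_ai_generated(src_path: str, dst_path: str) -> None:
    """Re-save a PNG with a provenance flag embedded as a text chunk."""
    img = Image.open(src_path)
    meta = PngImagePlugin.PngInfo()
    meta.add_text("ai_generated", "true")  # hypothetical key, not any real standard
    img.save(dst_path, pnginfo=meta)

def is_ai_generated(path: str) -> bool:
    """Read the flag back; an absent flag means 'unknown', not 'authentic'."""
    return Image.open(path).text.get("ai_generated") == "true"
```

A tag like this survives ordinary copying but not a screenshot or format conversion, which illustrates why "just watermark AI images" is easier to legislate than to enforce.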
 
Upvote
0 (0 / 0)