> If you literally applied the "someone under the age of consent" nudity rule you've specified here, every artist who ever painted a nude cherub or a naked baby Jesus would be in prison.

Cherubs and Jesus aren't real...
> Hold up a sec. What is your actual point here?

When I was replying to that person, my point was that a 1660Ti from 2019 with 6 GB of memory isn't some shitty graphics card that low-income people could be expected to afford. It's better than the 4 GB laptop GPU from 2021 in my current computer, and I'm not low income.
> You're missing the point. The way the idiots in Congress have worded the law, it's effectively any digital file, regardless of realism.

Not from what I read.
> If you literally applied the "someone under the age of consent" nudity rule you've specified here, every artist who ever painted a nude cherub or a naked baby Jesus would be in prison.

No one is confusing a painting for a real photo. AI photos CAN be confused for it. See the difference?
> Intent to harm the victim. Intent to portray the victim naked. Intent to make the character appear in their likeness. Intent is currently what separates fraud from incompetent bookkeeping.

So, to me, intent is an important but very difficult to prove component here. How do you prove a person's mental state when they compose a piece of "art"?
> Not from what I read.

Can you please cite the part of the bill where it narrows it beyond just "digital depictions"? I cannot find a place in the bill that materially narrows the definition.
> No one is confusing a painting for a real photo. AI photos CAN be confused for it. See the difference?

I think part of the issue here is the bill text does not specify photos or photorealism at ALL. They merely say "digital depiction", which is crazy broad without any narrowing language, which I cannot find in the text.
> I really can't see how making and distributing deepfake porn of someone would be different than distributing revenge porn of them. The end goal is the same: to humiliate the target.

It's because we're a nation of prudes. Sex is taboo, so anything sex-related that's not about the sex (rape, revenge pr0n, etc.) is overshadowed by and confused with the sexual element.
> When I was replying to that person, my point was that a 1660Ti from 2019 with 6 GB of memory isn't some shitty graphics card that low-income people could be expected to afford. It's better than the 4 GB laptop GPU from 2021 in my current computer, and I'm not low income.

I guess my question is why that's relevant here, and how it relates to the original point you were replying to: that this would disproportionately impact less-privileged kids. Not that they would make more of this stuff, but that it would impact them disproportionately, i.e., they'd be more likely to be prosecuted, convicted, and have their lives ruined than a more privileged kid.
> That's not universally true. There are some jurisdictions where you need consent 100% of the time, excluding very narrow carve-outs.

Where? The law in the US is pretty consistent that you can take photos of anyone in public, and anywhere there is not a reasonable expectation of privacy. The photographer/artist owns the file and does not need consent to use it or edit it.
> If you literally applied the "someone under the age of consent" nudity rule you've specified here, every artist who ever painted a nude cherub or a naked baby Jesus would be in prison.

I like to think of Baby Jesus not as nude, but dressed in one of those little tuxedo t-shirts. The kind that says "I'm Jesus, but I'm here to party!"
> It's because we're a nation of prudes. Sex is taboo, so anything sex-related that's not about the sex (rape, revenge pr0n, etc.) is overshadowed by and confused with the sexual element.
>
> As an aside, I wonder if today's porn works as a sort of A Clockwork Orange aversion therapy, making us more puritanical. And if that's not the case, what would happen if we ran some A Clockwork Orange-style aversion therapy sessions with today's porn on all the puritans?

You're wondering what would happen if we exposed people to pornography against their will and under duress?
> I think part of the issue here is the bill text does not specify photos or photorealism at ALL. They merely say "digital depiction", which is crazy broad without any narrowing language, which I cannot find in the text.

The meaning seems clear to everyone else here, but if you're seriously arguing that this bill as written would ban cartoon caricatures or scribbling devil horns on a photo of someone you hate, well, no harm in tightening up the language. That's clearly all you want, right? Right...
They reference the following definition from 18 USC § 2256(5):

> (5) "visual depiction" includes undeveloped film and videotape, data stored on computer disk or by electronic means which is capable of conversion into a visual image, and data which is capable of conversion into a visual image that has been transmitted by any means, whether or not stored in a permanent format;

Again, there's nothing here about photorealism at all. The devil is in the details, and this is a poorly written and poorly scoped bill, which matters immensely when it comes to the scope of a law.

If they mean photorealistic, or photo-like, or anything of that nature, they need to say so, and be very clear. A rushed, poorly written law helps nobody and can do a great deal of harm.
I think the intent was that all AI images would contain watermarks to identify the source.

https://www.merriam-webster.com/dictionary/deepfake
A watermarked deepfake is a contradiction in terms. I can think of no use of deepfakes that would use watermarks except possibly journalism (which could probably get out of it citing the First Amendment).
Edit: I'm surprised malicious deepfake distribution isn't covered by existing defamation statutes.
> That's cute, but you do NOT need a high-end GPU to run Stable Diffusion. My years-old 1660Ti says hello.

...while my 7900 XTX sits in a corner and cries.
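For anyone wondering how a 6 GB card manages it: libraries like Hugging Face's diffusers expose memory-saving options (half-precision weights, attention slicing) for exactly this situation. Below is a rough sketch of the trade-off logic; the VRAM thresholds are illustrative guesses, not benchmarks, and the commented-out diffusers calls only show where such settings would plug in.

```python
# Sketch: choosing memory-saving settings for an image-generation
# pipeline based on available VRAM. Thresholds are illustrative
# assumptions; the real documented knobs this mimics are diffusers'
# torch_dtype=float16 and enable_attention_slicing().

def pick_settings(vram_gb: float) -> dict:
    """Suggest pipeline options for a GPU with `vram_gb` of memory."""
    if vram_gb >= 12:
        # Plenty of headroom: full precision, no memory tricks needed.
        return {"dtype": "float32", "attention_slicing": False, "device": "cuda"}
    if vram_gb >= 4:
        # E.g. a 6 GB card: half-precision weights plus attention
        # slicing trade some speed for a much smaller memory footprint.
        return {"dtype": "float16", "attention_slicing": True, "device": "cuda"}
    # Too little VRAM: fall back to CPU, which works but is slow.
    return {"dtype": "float32", "attention_slicing": True, "device": "cpu"}

# Hypothetical usage with diffusers (not executed here):
#   pipe = StableDiffusionPipeline.from_pretrained(model_id, torch_dtype=...)
#   if settings["attention_slicing"]:
#       pipe.enable_attention_slicing()

print(pick_settings(6))  # -> {'dtype': 'float16', 'attention_slicing': True, 'device': 'cuda'}
```

The point of the sketch is just that the memory requirement is a knob, not a wall, which is why years-old mid-range cards remain usable.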
> Forcing a public statement and apology for creating and sharing what amounts to libel goes a long way towards restoring the damage.
>
> But, in the end, it's not really up to YOU to decide what best serves victims of this, is it? Victims themselves have been asking for this, and they're in the best position to decide. This is a very serious thing they're doing, not some light-hearted fun like drawing a mustache on a picture. This is something passing itself off as reality.
>
> (I'm going to just mention that it's an odd coincidence that this proposed method of "worrying about making the victim whole" involves protecting the perpetrators from consequence. Very odd.)

I think I'm not quite communicating my meaning well here, but I also think you're putting words in my mouth. In no way do I want to protect the perpetrators, and I'm not in any way accusing the people pushing for these laws of being capricious. If this is what will help them, then, as I specifically said, I support it. I simply want to say that the laws SHOULD be about the victims and focused on giving them what they need.
> Forcing a public statement and apology for creating and sharing what amounts to libel goes a long way towards restoring the damage.
>
> But, in the end, it's not really up to YOU to decide what best serves victims of this, is it? Victims themselves have been asking for this, and they're in the best position to decide. This is a very serious thing they're doing, not some light-hearted fun like drawing a mustache on a picture. This is something passing itself off as reality.
>
> (I'm going to just mention that it's an odd coincidence that this proposed method of "worrying about making the victim whole" involves protecting the perpetrators from consequence. Very odd.)

I'm not intending to defend the perpetrators, nor am I calling the people mentioned in the article capricious. I just think that when writing these laws, it should be about helping the victim. I actually think public statements and libel laws are exactly the kind of things I mean by that. And I'm not interested in defending the perpetrators; I just think their punishment should be whatever best serves the victims, whether that's $150k or $1.5 million or a public statement or whatever. It's not about what I want, nor is it about what you want; to me it's about what the victims want.
> Forcing a public statement and apology for creating and sharing what amounts to libel goes a long way towards restoring the damage.
>
> But, in the end, it's not really up to YOU to decide what best serves victims of this, is it? Victims themselves have been asking for this, and they're in the best position to decide. This is a very serious thing they're doing, not some light-hearted fun like drawing a mustache on a picture. This is something passing itself off as reality.
>
> (I'm going to just mention that it's an odd coincidence that this proposed method of "worrying about making the victim whole" involves protecting the perpetrators from consequence. Very odd.)

I suck at communicating in these posts. I'm not trying to defend the perpetrators. I just think it's sad that dumb kids can cause so much more damage while they're being dumb. That, and I think that laws shouldn't be about what you OR I want; they should be about what the victim wants. Whatever can best help them heal.
> While I agree with this in spirit, I don't expect a law to have much of a change on teen boys doing this sort of stupid thing on a lark.

Threat of punishment does work as a deterrent, it just doesn't work 100% of the time. Also, laws like this are put in place to help victims and to give them recourse to do something to stop it from happening.
> I guess my question is why that's relevant here, and how it relates to the original point you were replying to: that this would disproportionately impact less-privileged kids. Not that they would make more of this stuff, but that it would impact them disproportionately, i.e., they'd be more likely to be prosecuted, convicted, and have their lives ruined than a more privileged kid.

Behind the argument that more underprivileged kids than privileged kids would be convicted is the assumption that both rich and poor have equal access to technology. This premise is completely false and divorced from reality. The computers of people who can even afford to buy a new computer but have a limited budget are generally netbooks, not gaming desktops, especially if they rent tiny apartments and move a lot because they live in problematic buildings (desktops take up space and are less portable).
> No one said kids in poverty have high-end GPUs. If you have internet connectivity and a little bit of curiosity & savvy, it's not hard to get something like that going in Google Colab.

Creating deepfakes is not allowed in Google Colab.
> If perpetrators are adults, no problem at all. It's the teens and early teens "doing it for fun" and still too immature to appreciate the harm done to their victims that I am more concerned about. Would hate to see 10 years slapped onto a still-idiotic 13-year-old. Hopefully prosecutors and judges will proceed judiciously...

So, you openly admit that it does harm to the victims, but are OK with that as long as the person is "mature" enough to know that it does harm to the victims?
> It's strange the law doesn't already account for whatever penalty there should be for posting explicit photos of someone without their consent. Maybe such a law doesn't exist, in which case I am now wondering which congresscritters are profiting from it...

If there is a law, then muggie taylor-greene needs to be arrested and charged.
> While I agree with this in spirit, I don't expect a law to have much of a change on teen boys doing this sort of stupid thing on a lark.

Law has a normative function in a society.
> Ok, what if somebody were to draw an image of Donald Trump fellating Vladimir Putin? That's absolutely and clearly political speech, not anything intended for titillation. Wanting things to be clearly defined isn't always so that people know how close to the line they can go; it's also how you prevent the law from being abused by bad actors.

Oh my god, that would be a horrible picture!
> If perpetrators are adults, no problem at all. It's the teens and early teens "doing it for fun" and still too immature to appreciate the harm done to their victims that I am more concerned about. Would hate to see 10 years slapped onto a still-idiotic 13-year-old. Hopefully prosecutors and judges will proceed judiciously...

But you are not worried about the victims of these "boys will be boys" malicious little wankers?
> History suggests otherwise. Certainly punishment may be required, but a possible 10 years appears to be rather over the top.

There have been a plethora of studies showing that longer sentences, making a crime a felony rather than a misdemeanor, etc., do not in and of themselves increase compliance with a law (e.g. https://www.vera.org/news/research-shows-that-long-prison-sentences-dont-actually-improve-safety).
> If perpetrators are adults, no problem at all. It's the teens and early teens "doing it for fun" and still too immature to appreciate the harm done to their victims that I am more concerned about. Would hate to see 10 years slapped onto a still-idiotic 13-year-old. Hopefully prosecutors and judges will proceed judiciously...

Case in point, the occurrence pushing lawmakers here: wasn't it reported that none of the images were seen by victims or school staff? The boys were getting their rocks off and deleted everything when word spread... Not arguing that it wasn't harmful, but it's materially different from an adult making them and posting them up online!
> Still-idiotic 13-year-olds can get ten years for murder as a juvenile. Some girls committing suicide would not be a surprising result of deepfake porn.
>
> At age 23, the former idiot can put their life back together.

Murder is completely different from making pictures, though. A kid can be making them "for research purposes" and leak them unintentionally. He can also leak them anonymously; good luck investigating that. Also, girls can be really mean to each other.
> Behind the argument that more underprivileged kids than privileged kids would be convicted is the assumption that both rich and poor have equal access to technology. This premise is completely false and divorced from reality. The computers of people who can even afford to buy a new computer but have a limited budget are generally netbooks, not gaming desktops, especially if they rent tiny apartments and move a lot because they live in problematic buildings (desktops take up space and are less portable).

I disagree with your assumption. I read "disproportionately impact less-privileged kids" as "in the same legal situation, the low-income person faces a bigger struggle and worse outcome". There may be fewer cases, but that doesn't mean the outcome isn't disproportional.
> Case in point, the occurrence pushing lawmakers here: wasn't it reported that none of the images were seen by victims or school staff? The boys were getting their rocks off and deleted everything when word spread... Not arguing that it wasn't harmful, but it's materially different from an adult making them and posting them up online!

You're really splitting hairs here.
> In the same vein, back in my day cliques at school would keep "fit" (British slang, which I despise, for attractive) lists, and that goes for both boys and girls.

Okay? And what's the point of this red herring? A list is totally different from making sexually suggestive or explicit images of a person who hasn't given their consent.
> Along those lines, I also take issue with the quote in the article about imagining it was your wife/sister/daughter; it could surely also be used to humiliate males?

Sure it could. And it would be equally damaging. However, girls and women are far more sexualized by society than men.
> Murder is completely different from making pictures, though. A kid can be making them "for research purposes" and leak them unintentionally. He can also leak them anonymously; good luck investigating that. Also, girls can be really mean to each other.

What would this "research" constitute, exactly?
> Suicide is a choice. Some girls do it for the stupidest reasons.

So we're victim-blaming people now? Yikes.
> Prohibiting kids' stupidity is futile; they are dumb, that's just how it works. A much more realistic approach would be to stop demonizing nudity. "Sticks and stones may break my bones, but pictures shall never hurt me." If faking pictures becomes trivial, there is no point in treating them as evidence of wickedness.
>
> The current attitude to nudity is what makes bullying, blackmailing and suicides possible.

Do you realize what you're advocating for? And are you sure you want to be doing that?
> I disagree with your assumption. I read "disproportionately impact less-privileged kids" as "in the same legal situation, the low-income person faces a bigger struggle and worse outcome". There may be fewer cases, but that doesn't mean the outcome isn't disproportional.

That is like saying prosecuting offshore tax evasion will disproportionately impact low-income people: in the same situation, the low-income person faces a bigger struggle and worse outcome. Hypothetically true, but the implication that deciding to prosecute offshore tax evasion is unfortunate because the poor would be disproportionately affected is completely asinine.
> As to your second point, don't assume that apparent lack of resources will prevent this activity. Kids are smart, inventive, and resourceful.

Yes, kids are smart, inventive, and resourceful, but poverty is a real barrier, and poor kids can't just pull themselves up by their bootstraps to have the same resources as rich kids.
> PCs that are powerful today will become freebie giveaways in a few years.

Most poor people haven't been getting free outdated PCs for most of PC history. The pandemic seems to be the exception, but kids went back to the classroom afterwards. Again, only social/political change will solve wealth inequality, not computers getting better over time.
> As for what Google allows ... let's just say I don't think it is foolproof or couldn't be abused.

It is unrealistic to assume that poor kids would be able to bypass Google's abuse-detection systems and not get caught creating what would be classified as deepfake child porn.
> Plus, there are other resources out there as well, and Stable Diffusion will run on regular CPUs just fine, only slower.

It would not run on regular CPUs "just fine". The slowness is not workable on a practical level, especially not on old computers that cannot even handle Zoom calls for remote learning and are being time-shared with parents and siblings.
> My point is that you can't just assume that low income kids will be magically protected by their lack of resources.

This is like saying you can't assume low-income people will be magically protected from prosecution for offshore tax evasion by their lack of resources. It may be theoretically possible for a low-income person to end up in that situation through unusual circumstances, but such a law is not part of the system that disproportionately impacts low-income people.
Agreed.
What did the parents do?
> I think the intent was that all AI images would contain watermarks to identify the source.

So they're proposing a law to mandate that creative tools embed tracking features? I understand the desire to stop an awful practice, but this seems like something that would quickly be abused by governments for other purposes.
> So they're proposing a law to mandate that creative tools embed tracking features? I understand the desire to stop an awful practice, but this seems like something that would quickly be abused by governments for other purposes.

Watermarks don't "track" a person. They identify the provenance of an object. Watermarking something generated by AI tools would identify it as such, inform a consumer, and allow them to do whatever they want or need to do based on that knowledge.
> If we were less puritanical, children wouldn't so easily manipulate society into overwhelming and traumatizing their victims.

We don't need to reevaluate the "value" of CSAM and its place in society.
> I assume you mean some sort of digital signature, not a visible watermark. An actual displayed watermark is simply unacceptable for a lot of perfectly legitimate uses of generative AI tools.

It doesn't need to be visual to be a "watermark". As long as there is some readily available and apparent way to identify something as AI-generated, that's what I meant.
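As a toy illustration of what a non-visual "watermark" can mean: hide an identifying bit pattern in the least-significant bits of the pixel values, where it is invisible to a viewer but machine-readable. This is only a sketch of the idea; the tag string and image are made up for the example, and real provenance schemes (e.g. C2PA) rely on cryptographic signing rather than anything this fragile.

```python
import numpy as np

TAG = "AI-GEN"  # hypothetical provenance marker, just for illustration

def embed(img: np.ndarray, tag: str = TAG) -> np.ndarray:
    """Hide `tag`'s bits in the least-significant bits of the first pixels.

    Flipping the lowest bit changes each 8-bit channel value by at most 1,
    so the mark is imperceptible but trivially machine-readable.
    """
    bits = np.unpackbits(np.frombuffer(tag.encode(), dtype=np.uint8))
    flat = img.flatten()  # flatten() returns a copy, so `img` is untouched
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits
    return flat.reshape(img.shape)

def extract(img: np.ndarray, n_chars: int = len(TAG)) -> str:
    """Read the hidden tag back out of the low bits."""
    bits = img.flatten()[: n_chars * 8] & 1
    return np.packbits(bits).tobytes().decode()

# Round-trip on a fake 8-bit "image"
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
marked = embed(img)
print(extract(marked))  # -> AI-GEN
```

It also shows why this alone wouldn't satisfy the objection upthread: any re-encode or crop destroys low-bit marks, which is exactly why the "contradiction in terms" point about deepfakes carrying voluntary markers has force.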