Minnesota passes ban on fake AI nudes

The Department of Redundancy Department wonders if the headline implies "real AI nudes" are okay.
> Pandora's box has already been opened. I am not sure how you undo that, given most services like these will just move countries (servers).
It's definitely a problem for Elon. He can't just move X out of the US, given what a US-centric audience he has.
> Pandora's box has already been opened. I am not sure how you undo that, given most services like these will just move countries (servers).
Well, to begin with, you try to do something.
> Pandora's box has already been opened. I am not sure how you undo that, given most services like these will just move countries (servers).
I mean, you could make that exact same argument about things like CSAM. Yes, people will absolutely move jurisdictions and regions to skirt the law. People will come up with technological solutions (VPNs, etc.). But that does not mean it's not worth doing. A partial solution is still worthwhile.
> It's definitely a problem for Elon. He can't just move X out of the US, given what a US-centric audience he has.
He also has a business presence in Minnesota thanks to Tesla, and Tesla and xAI support each other financially.
Pandora's box has already been opened. I am not sure how you undo that, given most services like these will just move countries (servers).
> Thank you, benevolent government, for saving us from a guy jerking off to jpegs he created and kept to himself.
History has repeatedly shown that even those who genuinely make an effort to keep such movies/pictures to themselves are often unsuccessful. Just ask Kelsey Grammer.
> The Department of Redundancy Department wonders if the headline implies "real AI nudes" are okay.
From the Department of Technicalities: I think, looking at the text of the law, that "real AI nudes" would not be banned. It defines nudification as:
(1) an image or video is altered or generated to depict an intimate part not depicted in an original unaltered image or video of an identifiable individual; and
(2) the altered or generated image or video is so realistic that a reasonable person would believe that the intimate part belongs to the identifiable individual.
So if you use AI to enhance or modify an existing nude image, or use AI to generate a nude image of a non-identifiable individual, this law wouldn't apply. [IANAL]
> Good. Next up, ban X.
This administration would never allow that. Too much of their base is frothy-mouthing all over it with bad takes.
> The Department of Redundancy Department wonders if the headline implies "real AI nudes" are okay.
Probably. Also, if you wish to make nudes of those 80+, you can also do it the olde fashionede waye.
> So if you use AI to enhance or modify an existing nude image, or use AI to generate a nude image of a non-identifiable individual, this law wouldn't apply. [IANAL]
An interesting problem with that question is "identifiable by whom?"
Suppose you took steps to try to make the person "unidentifiable". Common tricks like blurring out the face, or just cropping it to only the parts you...erm...want to see. You obviously can "identify" the person, but I'd bet many of their friends would be able to name them as well. Even stock photos contain real people who could identify themselves, and whom people who know them could identify.
I think the real point of this policy is just to give some teeth to enforcement when these things inevitably go public. People might think they'll just generate it for themselves and keep it private. Some probably actually succeed at that, and we will likely never know about those people. This exists to give some real heat to people who "slip up" and "accidentally" leak the images (or who do so maliciously as harassment), or to companies that specifically advertise nudification as a feature of their model. But open models, finetunes, and such will mean anyone sufficiently determined (and with sufficient opsec) will continue to "get away" with it regardless.
That doesn't make the law pointless. It makes it a deterrent, which is exactly what laws are supposed to do. Will some people successfully "get around" the law, generate nudes anyway, and keep them to themselves? Sure. But people will also fantasize about people without generated AI images. If you're actually able to generate nudes on local offline models and you manage to keep those files private, then...I guess more power to you? But the point of these policies should be to protect the victims: people who are harmed when the images "leak" or when companies provide unfiltered, unscrupulous access to the tools that make this possible. If it succeeds in making it a really bad day for anyone who does this kind of thing carelessly, then that's still a win.

> An interesting problem with that question is "identifiable by whom?"
Very good points. I certainly agree that the law is valuable as a deterrent, even if some corner cases will get through. Just having it on the books will (hopefully) push companies to add filters and safeguards.
> It's definitely a problem for Elon. He can't just move X out of the US, given what a US-centric audience he has.
He is LITERALLY planning to move things to orbit to avoid inconvenient jurisdiction.
> The Department of Redundancy Department wonders if the headline implies "real AI nudes" are okay.
Well, preemptively limiting the ability of any future sentient sexbots to take their own sexbot rental app profile pictures does seem like a civil rights issue in the making.
> Pandora's box has already been opened. I am not sure how you undo that, given most services like these will just move countries (servers).
Do you believe that where the servers are located makes a difference in the lawfulness of an act? States are allowed to enforce laws when they relate to one of their citizens. In the case of images covered by this law, the statute grants a right of civil action against the websites. Where those sites are hosted makes no difference whatsoever. Damages can be collected from folks even when they're overseas, so long as there's any US-based nexus. That could well mean the civil action results in a judgment that the card-processing companies have to deal with, for example, even before the funds end up in a bank account.
> He is LITERALLY planning to move things to orbit to avoid inconvenient jurisdiction.
That's not how that works. There have literally been laws on the books for decades to deal with any such ridiculous attempts to evade the law.
> This is going to get downvoted straight to hell, but I don't see how this passes Constitutional muster. Will drawing mustaches on pictures of people come next?
Then you need to read up more on how First Amendment exceptions work. Moreover, it's a dick move to equate nudification with drawing fucking facial hair on someone's image. Grow the fuck up already.
> An interesting problem with that question is "identifiable by whom?"
The law takes no action against the creator. It is only taking action against the app/software maker that facilitated the work:
Subd. 2.
Nudification prohibited
(a) A person who owns or controls a website, application, software, program, or other service must not:
(1) allow a user to access, download, or use the website, application, software, program, or other service to nudify an image or video; or
(2) nudify an image or video on behalf of a user.
(b) No person may advertise or promote any website, application, software, program, or other service that performs the actions described in paragraph (a).
> The law takes no action against the creator. It is only taking action against the app/software maker that facilitated the work:
Because it's one thing to say "you may not do this" and a different thing to say "you may not provide this as a service."
I also don't understand the "Photoshop" exemption: "It's not cool that you're making nudes of your workmates, but we see that you took a class and you invested a lot of time in it, so it's cool."
There's also no case against making "racy" pictures without consent, as long as there is no actual nudity. So a person can depict a workmate in lingerie in the creator's bedroom, or make a video of two workmates kissing passionately, as long as they make sure "intimate parts" are not visible.
> From the Department of Technicalities: I think, looking at the text of the law, that "real AI nudes" would not be banned. It defines nudification as: [...] So if you use AI to enhance or modify an existing nude image, or use AI to generate a nude image of a non-identifiable individual, this law wouldn't apply. [IANAL]
I found this interesting as well. Anyone who willingly poses nude has forever given up their right to challenge fake nudes being made of them. So if one scans in a Playboy Playmate, they can create what they want all they want.
> I found this interesting as well. Anyone who willingly poses nude has forever given up their right to challenge fake nudes being made of them. So if one scans in a Playboy Playmate, they can create what they want all they want.
Nope, that's not how it works.
> From the Department of Technicalities: I think, looking at the text of the law, that "real AI nudes" would not be banned. It defines nudification as: [...] So if you use AI to enhance or modify an existing nude image, or use AI to generate a nude image of a non-identifiable individual, this law wouldn't apply. [IANAL]
The technicality I had in mind is that all manipulated images offer a false view of reality, and therefore are "fake." Completely setting aside whether or not the "I" in "AI" exists outside of imagination, of course.