Minnesota set to be first state to ban nudification apps


OpenThePodBayDoor

The Department of Redundancy Department wonders if the headline implies "real AI nudes" are okay.
From the Department of Technicalities: I think, looking at the text of the law, that 'real AI nudes' would not be banned. It defines nudification as:

(1) an image or video is altered or generated to depict an intimate part not depicted in
an original unaltered image or video of an identifiable individual; and
(2) the altered or generated image or video is so realistic that a reasonable person would
believe that the intimate part belongs to the identifiable individual.

So if you use AI to enhance or modify an existing nude image, or use AI to generate a nude image of a non-identifiable individual, this law wouldn't apply. [IANAL]

OpenThePodBayDoor

An interesting problem with that question is "identifiable by whom?"

Suppose you took steps to try to make the person "unidentifiable" - common tricks like blurring out the face, or cropping the image down to only the parts you...erm...want to see. The person could obviously still "identify" themselves, and I'd bet many of their friends would be able to name them as well. Even stock photos contain real people who could identify themselves, and whom the people who know them could identify too.

I think the real point of this policy is just to give some teeth to enforcement when these things inevitably go public. People might think they'll just generate it for themselves and keep it private. Some probably actually successfully do that - and we will likely never know about those people. This exists to give some real heat to people who "slip up" and "accidentally" leak the images (or who do so maliciously as harassment), or to companies that specifically advertise nudification as a feature of their model. But open models, finetunes, and such will mean anyone sufficiently determined (and with sufficient opsec) will continue to "get away" with it regardless.

That doesn't make the law pointless. It makes it a deterrent - which is exactly what laws are supposed to do. Will some people successfully "get around" the law, generate nudes anyway, and keep them to themselves? Sure. But people will also fantasize about others without any AI-generated images at all. I guess if you're actually able to generate nudes on local offline models and you manage to keep those files private, then...more power to you? But the point of these policies should be to protect the victims - people who are harmed when the images "leak" or when companies provide unfiltered, unscrupulous access to the tools that make this possible. If the law succeeds in making it a really bad day for anyone who does this kind of thing carelessly, that's still a win.

Very good points. I certainly agree that the law is valuable as a deterrent, even if some corner cases will get through. Just having it on the books will (hopefully) push companies to add filters and safeguards.