Minnesota set to be first state to ban nudification apps


Quasius

Ars Scholae Palatinae
1,141
Subscriptor
Pandora's box has already been opened. I am not sure how you undo that given most services like these will just move countries (servers).
It's definitely a problem for Elon, though. He can't just move X out of the US, given how US-centric his audience is.
 
Upvote
34 (38 / -4)

citizencoyote

Ars Tribunus Militum
1,600
Subscriptor++
It's a good start. Certainly there will be difficulties in enforcement when you're dealing with foreign apps and actors, but that's an issue with anything internet related. At least it's something rather than throwing up hands and saying, "Oops, well too late, the tech already exists, can't do anything about it!"
 
Upvote
25 (29 / -4)
Pandora's box has already been opened. I am not sure how you undo that given most services like these will just move countries (servers).
I mean, you could make that exact same argument about things like CSAM. Yes, people will absolutely move jurisdictions and regions to skirt the law. People will come up with technological workarounds (VPNs, etc.). But that does not mean it's not worth doing. A partial solution is still worthwhile.

Think of this as an analogy: even if only the citizens of Minnesota are blocked from accessing CSAM that would still be good. Shit, even if only a subset of the citizens of Minnesota were blocked that would still be good.

The only reason not to support bans, even if they are small/regional/porous, is if you don't think nudification is an issue. I personally do. I think nudification apps break vital consent laws and also give pathways towards CSAM.

So good for fucking Minnesota. If they can figure out a legal path forward that others can use as an example (even as an example of what not to do), I am glad someone is trying. Shrugging and saying, "Guess everything sucks now" is not how a functioning society reacts to new issues.
 
Upvote
37 (44 / -7)
Pandora's box has already been opened. I am not sure how you undo that given most services like these will just move countries (servers).

I don't think that is going to be quite as easy for AI services as it is for other services that move due to legal issues.

Sure, it can still happen, but they need compute. That means they can't just pick some random server host who is willing to look the other way in a country that doesn't care about piracy or other legal issues. They need to find server hosts that have enough compute, which limits their options more. It also costs them a lot more money to operate all that compute, which has to be paid for somehow. It's going to be harder for them to get payments from users to cover those costs.

When it requires paying in Bitcoin because the service operator can't get legitimate payment processors on board, that's going to deter plenty of people from even bothering to try.

None of this will 100% prevent such AI services from existing in some capacity, but it will all work to reduce it.
 
Upvote
31 (31 / 0)

LesMilpool____

Ars Scholae Palatinae
865
Subscriptor++
On the one hand, 100% for this. There's no legitimate reason apps like this should even exist.
On the other hand, not sure how much good it'll do. Companies that build this sort of software will continue doing so overseas, and about all one can do is make it harder to get one's hands on the software.
Still, this is more good than bad.
 
Upvote
-18 (1 / -19)

LesMilpool____

Ars Scholae Palatinae
865
Subscriptor++
Thank you benevolent government for saving us from a guy jerking off to jpegs he created and kept to himself.
History has repeatedly shown that even those who genuinely make an effort to keep such movies/pictures to themselves are often unsuccessful. Just ask Kelsey Grammer.
 
Upvote
18 (20 / -2)

OpenThePodBayDoor

Smack-Fu Master, in training
60
Subscriptor
The Department of Redundancy Department wonders if the headline implies "real AI nudes" are okay.
From the Department of Technicalities: I think, looking at the text of the law, that "real AI nudes" would not be banned. It defines nudification as:

(1) an image or video is altered or generated to depict an intimate part not depicted in an original unaltered image or video of an identifiable individual; and
(2) the altered or generated image or video is so realistic that a reasonable person would believe that the intimate part belongs to the identifiable individual.

So if you use AI to enhance or modify an existing nude image, or use AI to generate a nude image of a non-identifiable individual, this law wouldn't apply. [IANAL]
 
Upvote
21 (21 / 0)

Fred Duck

Ars Tribunus Angusticlavius
7,301
The Department of Redundancy Department wonders if the headline implies "real AI nudes" are okay.
Probably. Also, if you wish to make nudes of those 80+, you can also do it the olde fashionede waye.

VERY NUDE IMAGE.jpg
 
Upvote
2 (2 / 0)
So if you use AI to enhance or modify an existing nude image, or use AI to generate a nude image of a non-identifiable individual, this law wouldn't apply. [IANAL]
An interesting problem with that question is "identifiable by whom?"

Suppose you took steps to try to make the person "unidentifiable". Common tricks like blurring out the face or just cropping it to only the parts you...erm...want to see. You obviously can "identify" the person, but I'd bet many of their friends would be able to name them as well. Even stock photos contain real people who could identify themselves and who could be identified by people who know them.

I think the real point of this policy is just to give some teeth to enforcement when these things inevitably go public. People might think they'll just generate it for themselves and keep it private. Some probably actually successfully do that - and we will likely never know about those people. This exists to give some real heat to people who "slip up" and "accidentally" leak the images (or who do so maliciously as harassment), or to companies that specifically advertise nudification as a feature of their model. But open models, finetunes, and such will mean anyone sufficiently determined (and with sufficient opsec) will continue to "get away" with it regardless.

That doesn't make the law pointless. It makes it a deterrent - which is exactly what laws are supposed to do. Will some people successfully "get around" the law and generate nudes anyway, and keep them to themselves? Sure. But people will also fantasize about people without generated AI images. I guess if you're actually able to generate nudes on local offline models and you manage to actually keep those files private, then...I guess, more power to you? But the point of these policies should be to protect the victims - people who are harmed when the images "leak" or when companies provide unfiltered, unscrupulous access to the tools that make this possible. If it succeeds in making it a really bad day for anyone who does this kind of thing carelessly, then that's still a win.
 
Upvote
10 (11 / -1)

OpenThePodBayDoor

Smack-Fu Master, in training
60
Subscriptor
An interesting problem with that question is "identifiable by whom?"

Suppose you took steps to try to make the person "unidentifiable". Common tricks like blurring out the face or just cropping it to only the parts you...erm...want to see. You obviously can "identify" the person, but I'd bet many of their friends would be able to name them as well. Even stock photos contain real people who could identify themselves and who could be identified by people who know them.

I think the real point of this policy is just to give some teeth to enforcement when these things inevitably go public. People might think they'll just generate it for themselves and keep it private. Some probably actually successfully do that - and we will likely never know about those people. This exists to give some real heat to people who "slip up" and "accidentally" leak the images (or who do so maliciously as harassment), or to companies that specifically advertise nudification as a feature of their model. But open models, finetunes, and such will mean anyone sufficiently determined (and with sufficient opsec) will continue to "get away" with it regardless.

That doesn't make the law pointless. It makes it a deterrent - which is exactly what laws are supposed to do. Will some people successfully "get around" the law and generate nudes anyway, and keep them to themselves? Sure. But people will also fantasize about people without generated AI images. I guess if you're actually able to generate nudes on local offline models and you manage to actually keep those files private, then...I guess, more power to you? But the point of these policies should be to protect the victims - people who are harmed when the images "leak" or when companies provide unfiltered, unscrupulous access to the tools that make this possible. If it succeeds in making it a really bad day for anyone who does this kind of thing carelessly, then that's still a win.
Very good points. I certainly agree that the law is valuable as a deterrent, even if some corner cases will get through. Just having it on the books will (hopefully) push companies to add filters and safeguards.
 
Upvote
1 (2 / -1)
I agree with the "need" to address the problem, but how many other problems have we just given up on because it was perceived to be too hard to accomplish or will piss off too many people?

Our politicians have become spineless because we as a nation have given up our voices to a small group of conservatives who don't share the same values.
 
Upvote
-3 (1 / -4)
It's definitely a problem for Elon. He can't just move X out of the US, given what a US-centric audience he has.
He is LITERALLY planning to move things to orbit to avoid inconvenient jurisdiction.

Also the sort of people who want to use X or grok to make nudes aren't going to be bothered by it running outside the US.
 
Upvote
-9 (1 / -10)
The Department of Redundancy Department wonders if the headline implies "real AI nudes" are okay.
Well, preemptively limiting the ability of any future sentient sexbots to take their own sexbot rental app profile pictures does seem like a civil rights issue in the making.
 
Upvote
2 (2 / 0)

Nilt

Ars Legatus Legionis
21,824
Subscriptor++
Pandora's box has already been opened. I am not sure how you undo that given most services like these will just move countries (servers).
Do you believe that where the servers are located makes a difference in the lawfulness of an act? States are allowed to enforce laws when they relate to one of their citizens. In the case of images covered by this law, the statute grants a right of civil action against the websites. Where those sites are hosted makes no difference whatsoever. Damages can be collected from folks even when they're overseas, so long as there's any US-based nexus. That could well mean the civil action results in a judgment that the card processing companies have to deal with, for example, even before the funds end up in a bank account.

Frankly, this isn't even particularly unusual. It's happened many times before; it simply doesn't typically end up in the news.
 
Upvote
10 (10 / 0)

Nilt

Ars Legatus Legionis
21,824
Subscriptor++
This is going to get downvoted straight to hell, but I don't see how this passes Constitutional muster. Will drawing mustaches on pictures of people come next?
Then you need to read up more on how First Amendment exceptions work. Moreover, it's a dick move to equate nudification with drawing fucking facial hair on someone's image. Grow the fuck up already.
 
Upvote
25 (29 / -4)

enilc

Ars Praefectus
3,877
Subscriptor++
An interesting problem with that question is "identifiable by whom?"

Suppose you took steps to try to make the person "unidentifiable". Common tricks like blurring out the face or just cropping it to only the parts you...erm...want to see. You obviously can "identify" the person, but I'd bet many of their friends would be able to name them as well. Even stock photos contain real people who could identify themselves and who could be identified by people who know them.

I think the real point of this policy is just to give some teeth to enforcement when these things inevitably go public. People might think they'll just generate it for themselves and keep it private. Some probably actually successfully do that - and we will likely never know about those people. This exists to give some real heat to people who "slip up" and "accidentally" leak the images (or who do so maliciously as harassment), or to companies that specifically advertise nudification as a feature of their model. But open models, finetunes, and such will mean anyone sufficiently determined (and with sufficient opsec) will continue to "get away" with it regardless.

That doesn't make the law pointless. It makes it a deterrent - which is exactly what laws are supposed to do. Will some people successfully "get around" the law and generate nudes anyway, and keep them to themselves? Sure. But people will also fantasize about people without generated AI images. I guess if you're actually able to generate nudes on local offline models and you manage to actually keep those files private, then...I guess, more power to you? But the point of these policies should be to protect the victims - people who are harmed when the images "leak" or when companies provide unfiltered, unscrupulous access to the tools that make this possible. If it succeeds in making it a really bad day for anyone who does this kind of thing carelessly, then that's still a win.
The law takes no action against the creator. It is only taking action against the app/software maker that facilitated the work:

Subd. 2. Nudification prohibited.

(a) A person who owns or controls a website, application, software, program, or other service must not:
(1) allow a user to access, download, or use the website, application, software, program, or other service to nudify an image or video; or
(2) nudify an image or video on behalf of a user.
(b) No person may advertise or promote any website, application, software, program, or other service that performs the actions described in paragraph (a).

I also don't understand the "Photoshop" exemption? "It's not cool that you're making nudes of your workmates, but we see that you took a class and you invested a lot of time in it, so it's cool."

There's also no case against making "racy" pictures without consent, as long as there is no actual nudity. So a person can put a workmate in lingerie in the creator's bedroom, or make a video of two workmates kissing passionately, as long as they make sure "intimate parts" are not visible.
 
Upvote
-13 (1 / -14)

Tofystedeth

Ars Tribunus Angusticlavius
6,443
Subscriptor++
The law takes no action against the creator. It is only taking action against the app/software maker that facilitated the work:



I also don't understand the "Photoshop" exemption? "It's not cool that you're making nudes of your workmates, but we see that you took a class and you invested a lot of time in it, so it's cool."

There's also no case against making "racy" pictures without consent, as long as there is no actual nudity. So a person can put a workmate in lingerie in the creator's bedroom, or make a video of two workmates kissing passionately, as long as they make sure "intimate parts" are not visible.
Because it's one thing to say "you may not do this" and a different thing to say "you may not provide this as a service."
 
Upvote
10 (10 / 0)

enilc

Ars Praefectus
3,877
Subscriptor++
From the Department of Technicalities: I think, looking at the text of the law, that 'real AI nudes' would not be banned. It defines nudification as :



So if you use AI to enhance or modify an existing nude image, or use AI to generate a nude image of a non-identifiable individual, this law wouldn't apply. [IANAL]
I found this interesting as well. Anyone who willingly poses nude has forever given up their right to challenge fake nudes being made of them. So if one scans in a Playboy Playmate, they can create whatever they want, as much as they want.
 
Upvote
-19 (1 / -20)

ruet

Ars Praefectus
3,297
Subscriptor
Then you need to read up more on how First Amendment exceptions work. Moreover, it's a dick move to equate nudification with drawing fucking facial hair on someone's image. Grow the fuck up already.

What's the basis of the exception here? How is nudification different? Explain it as you would to someone who needs to grow TF up. In the meantime, here are some images to ponder.

[Four NSFW image attachments]

Should the creators of these images be sanctioned? The makers of the art supplies? Photoshop? Where's the line? Do you get to decide?
 
Upvote
-16 (4 / -20)

Tofystedeth

Ars Tribunus Angusticlavius
6,443
Subscriptor++
I found this interesting as well. Anyone who willingly poses nude has forever given up their right to challenge fake nudes being made of them. So if one scans in a Playboy Playmate, they can create whatever they want, as much as they want.
Nope, that's not how it works.
 
Upvote
10 (11 / -1)

graylshaped

Ars Legatus Legionis
68,079
Subscriptor++
From the Department of Technicalities: I think, looking at the text of the law, that 'real AI nudes' would not be banned. It defines nudification as :



So if you use AI to enhance or modify an existing nude image, or use AI to generate a nude image of a non-identifiable individual, this law wouldn't apply. [IANAL]
The technicality I had in mind is that all manipulated images offer a false view of reality, and therefore are "fake." Completely setting aside whether or not the "I" in "AI" exists outside of imagination, of course.
 
Upvote
0 (0 / 0)