Women sue men who used their Instagram feed to create AI porn influencers

Major Major

Ars Praetorian
478
Subscriptor
Well, we've got low-key victim blaming and #notAllMen; let's see if we can go for a shit-take bingo with a "it's not actually them so I don't see what the problem is" and "in fact there should be more nudity so no one cares, you prudes".
And an “ethics in journalism” all on page 1.
 
Upvote
16 (17 / -1)

enilc

Ars Praefectus
3,877
Subscriptor++
This may sound unserious, but an honest question: I wonder how these sorts of rules would apply to a "Rule 34" type of situation, esp in regards to animated characters? Would that just be some kind of civil copyright violation of some sort?

If someone created a Fanvue page of Minnie/Mickey/Mario/Peach, would Disney/Nintendo sue under "Take It Down Act" or ... ?
 
Upvote
-10 (0 / -10)

Major Major

Ars Praetorian
478
Subscriptor
This may sound unserious, but an honest question: I wonder how these sorts of rules would apply to a "Rule 34" type of situation, esp in regards to animated characters? Would that just be some kind of civil copyright violation of some sort?

If someone created a Fanvue page of Minnie/Mickey/Mario/Peach, would Disney/Nintendo sue under "Take It Down Act" or ... ?
A DMCA takedown is presumably a whole lot easier in that case.
 
Upvote
13 (14 / -1)

enilc

Ars Praefectus
3,877
Subscriptor++
This seems like a difficult case to prove? How do you show that the John Does actually used your specific photos to generate their content? Wouldn't you have to seize their computers and/or online activities?

You can get pretty detailed in a prompt to generate an image from scratch that looks very similar to a real person, without using an actual image.

As far as the named dirtbags with the Whop site, can they get by with a disclaimer "for entertainment purposes only?"

Or is the hope to be successful with jury/judicial nullification due to the obvious immoral purpose of the site?
 
Upvote
-19 (1 / -20)

MechR

Ars Praefectus
3,236
Subscriptor
This seems like a difficult case to prove? How do you show that the John Does actually used your specific photos to generate their content? Wouldn't you have to seize their computers and/or online activities?

You can get pretty detailed in a prompt to generate an image from scratch that looks very similar to a real person, without using an actual image.
1) You won't get a facial likeness with just text prompting.
2) It'd still be deepfake porn made without her consent.
3) They were using the pics to literally advertise their methods:
She was even more appalled when she discovered that not only were doctored nude or scantily clad photos of her being circulated on the Internet, as she outlined in a recently filed complaint—they were also being used to advertise AI ModelForge, a platform that teaches men how to generate their own AI influencers. In a series of online classes and tutorials, the men allegedly taught subscribers to use a software called CreatorCore to train AI models using photos of unsuspecting young women, posting the resulting content on Instagram and TikTok.
Somehow I doubt "Your Honor, we were actually committing false advertising and didn't use that method for those pics" would fly in court. And it's irrelevant anyway because (2).
 
Upvote
18 (18 / 0)
1) You won't get a facial likeness with just text prompting.
2) It'd still be deepfake porn made without her consent.
3) They were using the pics to literally advertise their methods:

Somehow I doubt "Your Honor, we were actually committing false advertising and didn't use that method for those pics" would fly in court. And it's irrelevant anyway because (2).
1) is wrong
2) depends on 1
3) You are correct here
 
Upvote
-11 (2 / -13)
I hope they get sued into oblivion. There’s not even a logical reason to do this unless your intent is to impersonate. Which it doesn’t appear is even the case here, so why?
Along with Instagram and any other social media company that passively permits this to occur, billion-dollar judgements. This is the only way it stops.
 
Upvote
7 (7 / 0)

GreyAreaUK

Ars Legatus Legionis
11,408
Subscriptor
Friends by definition are people you don't need technological help to cultivate, beyond maybe a phone to give them a call every now and then.
Can you cite this definition? No? Thought not.

Some of my best friends are people I’ve never met, will never meet, and have no idea what they look like. They are people I chat with on Bluesky (and prior to that, Twitter).

Stop assuming your definitions are objective facts.
 
Upvote
16 (16 / 0)
The use of "men" instead of "persons" in the article title is weird. They also sued 50 John Does, who, being anonymous, could well be women.
The named defendants are all men. Feel free to show us all of the times women abused men, with a comprehensive comparison of how often men abuse women. Not single instances, full studies with peer review and a large cohort of study examples. Or let me save you some time.
https://duckduckgo.com/?t=ffab&q=experience+online+abuse+women+versus+men&ia=web

Did you know that there is a separate classification of head trauma resulting in brain damage that specifically indicates that the injury is due to spousal abuse, and that nearly 100% of the abusers are men and the victims are women? The result is that doctors and hospitals now screen women with symptoms of head trauma for spousal-abuse-related concussions. Harassment, abuse, and physical harm are predominantly caused by males abusing females.

It is important to never discount a report from a male of abuse by a female. But to seek to anonymize which sex predominantly abuses the other is dishonest.
 
Last edited:
Upvote
17 (17 / 0)

Fatesrider

Ars Legatus Legionis
25,196
Subscriptor
It MAY be a bit over the top, but perhaps if we just dug a deep hole and shoved these people who do this kind of shit into the hole, and then covered it up, the human race would be disproportionately improved.

Do this for all the psychopaths. That we tolerate this in any way is kind of the problem.

PERHAPS being somewhat less extreme, we could just pass a law that any platform that allows this would be shut down regardless of any other content or its popularity. That'd force them to curate their content on upload, not allowing it to go live until it was inspected by a human. Filter it with an AI, if need be, but make really sure it's biased enough so that nothing problematic gets through. Reuploads would have to go through the same process.

This should not be an issue on social media. It shouldn't be an issue with humans, either, but that's a more intractable problem, since it's psychopaths who tend to be our most in-the-spotlight leaders.
 
Upvote
4 (5 / -1)
But the Take It Down Act does not go into effect until May 2026
I know it takes time to write an article, but this article is date stamped May 1 2026.

So that means the Take It Down Act is now in effect, right? Which means there are now consequences to doing this sort of thing? And those consequences apply to any previously generated fake images that are still being distributed online as of today?
 
Upvote
11 (11 / 0)
It MAY be a bit over the top, but perhaps if we just dug a deep hole and shoved these people who do this kind of shit into the hole, and then covered it up, the human race would be disproportionately improved.

Do this for all the psychopaths. That we tolerate this in any way is kind of the problem.

PERHAPS being somewhat less extreme, we could just pass a law that any platform that allows this would be shut down regardless of any other content or its popularity. That'd force them to curate their content on upload, not allowing it to go live until it was inspected by a human. Filter it with an AI, if need be, but make really sure it's biased enough so that nothing problematic gets through. Reuploads would have to go through the same process.

This should not be an issue on social media. It shouldn't be an issue with humans, either, but that's a more intractable problem, since it's psychopaths who tend to be our most in-the-spotlight leaders.
Careful, because this way goes another major problem. You use an AI to filter things, you get what Youtube's become. You get automatic filtering of any content that even uses the word "nazi", even when it's videos educating people on the horrors they committed and debunking pro-nazi propaganda.

This isn't going to be an easy solution, and saying "throw an algorithm at it" is how we got into this mess in the first place.
 
Upvote
10 (10 / 0)
The staggering number of accounts on Instagram that use AI-generated realistic female models is alarming.
What's more alarming, or... I don't even know how to classify it anymore, are the people who actually believe the "model" is legit and not a random asshole like any of us, using a prompt to generate money.
 
Upvote
6 (6 / 0)
This may sound unserious, but an honest question: I wonder how these sorts of rules would apply to a "Rule 34" type of situation, esp in regards to animated characters? Would that just be some kind of civil copyright violation of some sort?

If someone created a Fanvue page of Minnie/Mickey/Mario/Peach, would Disney/Nintendo sue under "Take It Down Act" or ... ?
Mickey, being a fictional character, can't sue for defamation if further obviously fictional images are made of Mickey smoking pot or slapping a baby or playing patty-cake with Jessica Rabbit or whatever. No one's worried about that. We're all worried about these photorealistic images that are very very hard to distinguish from the real thing, based on real people (mainly women and children, though this article focuses on the adult women victims of it in this case), to make NCII content. It's humiliating, it's an attack on their reputation (whether intentional or not), and it's just plain sleazy.
 
Upvote
7 (8 / -1)

MechR

Ars Praefectus
3,236
Subscriptor
1) is wrong
You can dial in facial likeness of a specific desired noncelebrity with pure text prompting? Doubt. And it'd be a lot of pointless effort and RNG when you can just use images for more consistent results.
2) depends on 1
How do you figure that? The result is an intentional deepfake regardless of whether you're using text or image inputs to achieve her likeness. This isn't a case of accidentally genning a real person who was memorized in the training data somewhere.
 
Upvote
8 (9 / -1)

Nilt

Ars Legatus Legionis
21,824
Subscriptor++
This seems like a difficult case to prove? How do you show that the John Does actually used your specific photos to generate their content? Wouldn't you have to seize their computers and/or online activities?
Leaving aside the obvious points already mentioned, what makes you think these fuckwits will bother to scrub the originals from their systems? You may see a few who do so but the overwhelming majority of the gooner dipshits who pull this crap obsessively save every picture they can from virtually any source possible.
 
Upvote
5 (5 / 0)

enilc

Ars Praefectus
3,877
Subscriptor++
1) You won't get a facial likeness with just text prompting.


You are woefully behind in your AI-prompting knowledge.

Can I introduce you to "Not Jason Statham" who was created without using any names and a three-sentence prompt:

[attached AI-generated image]
 
Last edited:
Upvote
-7 (4 / -11)

Rirere

Ars Centurion
317
Subscriptor++
You are woefully behind in your AI-prompting knowledge.

Can I introduce you to "Not Jason Statham" who was created without using any names and a three-sentence prompt:

View attachment 134202

You missed the part where they said "non-celebrity", as in someone who the AI has never been specifically trained on.

Go on, try again.
 
Upvote
14 (14 / 0)

clewis

Ars Tribunus Militum
1,794
Subscriptor++
Some people need to spend the rest of their lives in a cage watching the world move on without them. Others, who can't be shamed or rehabilitated, should face a firing squad for their crimes (after rotting in prison for some arbitrary amount of time).

I'll leave it to the reader to decide which punishment is more appropriate for these so-called men.
I don't believe in the death penalty for civilian cases. Obv. there are times during combat when it's too dangerous to leave somebody alive, but that's outside the scope of this discussion.

I'm of the opinion that those who commit homicide, accidents, and crimes of passion can be rehabilitated. Murderers, rapists, arsonists, etc. require a level of mental illness that (IMO) cannot be rehabilitated. But since, by my own definition, they are mentally ill, they should be kept away from the general population for the rest of their natural lives.

Although some of the war crimes currently going on are starting to test my tolerance.
 
Upvote
1 (2 / -1)
Fucking hell, this is industrialized sexual assault.
It's okay, it won't be long until this advances to industrialized vigilantism when rule of law proves ineffective at closing Pandora's Box (outside of the outcome of hyperscaled surveillance against signs of bad actors on any non-airgapped machine). Then shit will get REAL chaotic.
 
Upvote
4 (4 / 0)
It's okay, it won't be long until this advances to industrialized vigilantism when rule of law proves ineffective at closing Pandora's Box (outside of the outcome of hyperscaled surveillance against signs of bad actors on any non-airgapped machine). Then shit will get REAL chaotic.
Unless you have some way to monetize catharsis, the Vigilante Industrial Complex remains impossible.
 
Upvote
1 (1 / 0)
Social media was a bad idea from the start. Now it's just basically a "here, prey on me" open invitation.

Hear, hear! The upside is far outweighed by the downside.
Once upon a time, I used to say: "the best thing about the internet is that everyone gets to express their opinion; the worst thing about the internet is that everyone gets to express their opinion."
I don't say that anymore. The worst thing has taken over.
 
Upvote
2 (2 / 0)

clewis

Ars Tribunus Militum
1,794
Subscriptor++
You are woefully behind in your AI-prompting knowledge.

Can I introduce you to "Not Jason Statham" who was created without using any names and a three-sentence prompt:

<snipped ai image>
I normally downvote AI imagery, but this gets a pass for making a good point.
 
Upvote
-5 (2 / -7)
I'm confused by the article. It leads off with the claim that the AI-generated likenesses are identical, according to a friend of MG's. It continues with the lawyer's claim of abuse, due to the likeness being identical. MG claims it is her likeness to the point of being identical. The narrative changes near the end of the article, with:

"MG says the images generated by AI Model Forge are distinct enough from her own photos that she frustratingly has been unable to claim that the accounts are impersonating hers, which is also a violation of Instagram guidelines. “It’s my face, my tattoos, on a different outfit on a slightly different body,” she says."

The article's writer refers to deepfake porn legislation but does not say if this situation, of images that are in entirety very similar but not identical, falls within scope.

I hope MG wins her case, because these images/videos are clearly too close for her comfort.

Her lawyer too seems to believe that being recognisably very similar is sufficient for a case to be made, even though the lawyer claims they are identical, and MG sometimes does as well, though at least once she does not.

Instagram's caution seems prudent, assuming they are ok with the general idea of exploitation, and assuming that "not quite identical" means the content is not illegal. This grey area of legal and community standards would be why TikTok fell back on their own "community guidelines" to justify takedown.

Otherwise Wired could well write an article from another angle, media giant chills free speech by taking down material that a claimant thinks is her, but isn't her.

I'd assume the reasonable person test applies, but the article is vague, the writer sympathising with a person whose perceptions include that fewer than 10,000 friends is not the same as being in the public domain. Would a reasonable person conclude the AI stuff and MG were one and the same?

I'm only guessing about the reasonable person test. The actual test? It's a gap in the story.

And I agree with CatNamedHugs, but the problem is here to stay, the question is how to handle it.
 
Upvote
-10 (0 / -10)

Eldorito

Ars Tribunus Angusticlavius
7,964
Subscriptor
If AI lets everyone create their own "influencers," then who is left to influence?

And why is it necessary to share your life with 9000 of your closest friends? Friends by definition are people you don't need technological help to cultivate, beyond maybe a phone to give them a call every now and then. I'll never understand social media. Glad I stayed away now that it's devolved into a total shitshow and beyond.

Why is it necessary to share your opinion on a forum with thousands of people you don’t even know?

If someone were to take all your posts, your writing style, then started to use it to write something awful while being 100% identifiable as you, wouldn’t you be at least a bit creeped out?

Because that’s how insane this is. This woman did nothing unusual. It’s the perverted money seeking arseholes that are the problem.
 
Upvote
8 (8 / 0)
It's a weird extension of the historically successful concept of the people who got rich from the Gold Rush of the 19th Century West in the US: Your odds of being successful were much higher selling the shovels and supplies to the high-risk gamblers known as miners.
Yeah. I don't use an ad blocker, and a good 80-90% of the ads I get on YouTube are either obvious scams or sleazy hustles like these. There are multiple companies out there whose business model consists of hawking expensive online "courses" ostensibly coaching people on how to build a stream of passive income by flooding Amazon and/or Audible with books designed to match trending search keywords. These books are supposed to be written as quickly and cheaply as possible, so the trainee is expected to either use genAI (slop) or commission someone willing to ghostwrite for very low wages. Dan Olson of "Folding Ideas" YouTube fame did a nice little essay on the concept a little while back; rather than waste their time competing with a horde of other saps (miners) in a race to make the internet as bloated and useless as possible, they've set up a "training" program (shovel) that also functions as a sort of multi-level marketing scheme.

That this scam isn't as gross and immoral as the one claiming to teach you how to make money by harassing a woman you found online or who you photographed in public is about the only nice thing I'll say about it.
 
Upvote
3 (3 / 0)

Wheels Of Confusion

Ars Legatus Legionis
75,657
Subscriptor
It is a matter of reporting integrity.
Please keep telling us how it's about ethics in gaming AI journalism.


Even though MG and the other plaintiffs have continually lobbied Instagram to take their images down, many of them are still up, she claims, because they do not technically violate Instagram’s guidelines surrounding AI-generated content. When reached for comment, a spokesperson for Instagram said it had “extremely strict policies” around both AI- and non-AI-generated nonconsensual intimate imagery, removing accounts that post such content.
We're talking Instagram, owned by Meta which is actually Facebook, right? With Mark Zuckerberg, who okayed a plan to create stealth-AI profiles to try and keep people clicking?
Yeah, sure. I believe they have "strict policies."


Until just now I had never heard of “porn influencers”.

I was better off.
Frankly, the only problem I have with that general concept is the "influencers" part. I look down on people who willingly want to become "influencers," as I don't think there's a legitimate way to engage in that profession without admitting you're a dishonest huckster who just wants to be the face of something to sell products. And most of the "influencers" aren't that honest about it.
I think YouTube is what it is today largely because people glommed on to being "influencers" instead of posting engaging videos. Everyone from YT to the content producers to the advertisers has jumped in with both feet and really turned the site into a wasteland.

I don't have a problem with people doing porn. And if they need to read some ad copy to keep the lights on too, so be it. But doing it to be an "influencer" feels less legitimate IMO.
---
In this specific case, the idea of making "porn influencers" out of regular people without their consent is horrifying and loathsome. Selling the knowledge to do so is even more cynical. I think there's little else left to be done that better illustrates the corrosive effects of push-button content creation, capitalism, and a society which doesn't value its own members, without involving an actual body count.
 
Upvote
4 (4 / 0)
You are woefully behind in your AI-prompting knowledge.

Can I introduce you to "Not Jason Statham" who was created without using any names and a three-sentence prompt:

View attachment 134202
Prove it. Post the exact prompt used.

And prove that the training data for the model did not include pictures of Jason Statham.
 
Upvote
5 (5 / 0)
This is beyond disgusting. I am a woman who practices amateur photography, to the level of keeping my own home studio with strobes and backdrops. One of the consistent struggles with this hobby is the fact that the model who is most easily available for practicing different forms of lighting or ideas is myself. Yet posting even the most innocuous photos online puts me at high risk of exploitation, to the point where maintaining social media for my portrait portfolio frankly seems more risky than it’s worth.

The downside? I’m limited in how easily I can connect with other photographers and I did have to figure out another online solution for sharing my portfolio with others. I lose the potential publicity I would get on social platforms. But I’ll take that trade off.

I am an older millennial. I’ve seen the early days of MySpace and the first iteration of Facebook, and I was online when AIM was one of the main messaging platforms. Social media all feels like it’s stagnant and rotting today. Passive income streams producing slop to perpetuate a pipe dream that benefits only a few at the expense of many others. The sooner we can move away from this addicting poison as a society, the happier I’ll be.
You might be interested in the indieweb.

But other than that, you should Glaze the images you upload so it'll mess up any AI training that might be attempted with them. That way the dataset is poisoned. Alternatively, you can go the old route of a portfolio for more serious and trusted candidates who want a wider example set: mailing physical photos to them, or maybe email as well.
 
Upvote
0 (0 / 0)

Techlight

Smack-Fu Master, in training
28
Subscriptor
The older I get (not that I consider myself "old" any time soon), the more I realise how f'd up the world is for women. Sure, for both men and women in many cases, but overwhelmingly women. More than sad, especially with all the waves of outcry that seem to die down pretty quickly and leave little changed in the end.

We're talking Instagram, owned by Meta which is actually Facebook, right? With Mark Zuckerberg, who okayed a plan to create stealth-AI profiles to try and keep people clicking?
Yeah, sure. I believe they have "strict policies."
A strict policy can also be "we allow everything", and they can enforce it strictly to the point of going to court to keep content up for as long as possible. There is so much "marketing" speak in everything (big) companies say these days that quoting them becomes meaningless.

If there was one skill we should promote more in education it's critical thinking (and reading/listening). Which of course is also getting more difficult as technology changes and allows more people access to psychological tricks to play on feelings in their communication. "Confidently being wrong and persuasive" has never been easier to do, and more difficult to defend against.

(edit: I did it too. If there was one skill we should promote it should actually be respect for women)
 
Last edited:
Upvote
2 (2 / 0)

The Lurker Beneath

Ars Tribunus Militum
6,703
Subscriptor
This may sound unserious, but an honest question: I wonder how these sorts of rules would apply to a "Rule 34" type of situation, esp in regards to animated characters? Would that just be some kind of civil copyright violation of some sort?

If someone created a Fanvue page of Minnie/Mickey/Mario/Peach, would Disney/Nintendo sue under "Take It Down Act" or ... ?

"Take it Down!"

"NO - NOT LIKE THAT!!!"
 
Upvote
0 (0 / 0)