Ethical AI art generation? Adobe Firefly may be the answer

caramelpolice

Ars Tribunus Militum
1,677
Subscriptor
That is not to say that there are no ethical or economic concerns with regards to AI art. But the argument that 'people would stop making art if a computer can do it for you' is very much a strawman argument in this debate, and holds very little water.
Is anyone actually arguing that people would just altogether stop making art? I have seen my fair share of hobbyist artists get demoralized when, say, someone trains a Stable Diffusion model to deliberately and specifically rip off their style or characters. But of course the people who make art for the love of art will keep doing it anyway.

The problem is not and has never been that human artists will stop existing altogether or whatever. The problem is in how these tools are taking the work of those artists, often without consent, and using them to make commercial tools that will deprive them of further work for the sake of profit.
 
Upvote
-4 (5 / -9)
One consequence, looking at it from a very high level, is that this technology seems to empower a huge number of people who used to be either not quite talented or not passionate enough, to instantly SURPASS a bunch of other people who ARE all that, and who spent decades investing in it. Millions of people whose entire identity was put into this must feel like the rug has been pulled out from under them, and for what? In return we get tens of millions of moderately talented hobbyists (at least as long as ChatGPT hasn't taken over prompting completely) flooding the world with mediocrity. What is the aggregate effect on Total Human Wellbeing, here?
Ah yes, much the same way that music is now pointless because someone invented the synthesiser. Damn flood of mediocrity from those people who can't even play a piano, or a guitar, or drums, or...

Maybe, just maybe, art will be different. And that's fine - no, actually, that's exciting - to me.
 
Upvote
5 (10 / -5)

B'Trey

Ars Scholae Palatinae
869
Nor is there a carve out in copyright law to prevent using art as training material. I'm constantly amazed by how broad people mistakenly believe copyright protections to be.
Stinks that you are being downvoted. (This probably will be as well.)

https://www.copyright.gov/what-is-copyright/
U.S. copyright law provides copyright owners with the following exclusive rights:

  • Reproduce the work in copies or phonorecords.
  • Prepare derivative works based upon the work.
  • Distribute copies or phonorecords of the work to the public by sale or other transfer of ownership or by rental, lease, or lending.
  • Perform the work publicly if it is a literary, musical, dramatic, or choreographic work; a pantomime; or a motion picture or other audiovisual work.
  • Display the work publicly if it is a literary, musical, dramatic, or choreographic work; a pantomime; or a pictorial, graphic, or sculptural work. This right also applies to the individual images of a motion picture or other audiovisual work.
  • Perform the work publicly by means of a digital audio transmission if the work is a sound recording.
Copyright doesn't mean "I own it and you have to ask me if you want to do anything at all with it." The only thing copyright protects that would possibly apply would be a derivative work. And that would potentially apply only to something the AI creates, not to the act of training.

It's absolutely fine if someone thinks that copyright should be stronger or protect more things than it does. It's fine to say that this OUGHT to be illegal. But it's not wrong to point out that there is very little factual support for the claim that it actually IS illegal. Edit: grammar
 
Last edited:
Upvote
16 (17 / -1)

Aurich

Director of Many Things
41,066
Ars Staff
Oh. My. God. Ray Gun! I bought the first few issues of that mag, and yeah, the layout and design were really... interesting. It also had a lot of good and interesting articles covering alternative/indie bands, with no thought to commercialism.
The design was the brainchild of David Carson. He has a good book called The End of Print; you can find scans of it online pretty easily if you're interested. I have a hardcopy.
 
Upvote
1 (1 / 0)

Voldenuit

Ars Tribunus Angusticlavius
6,764
Is anyone actually arguing that people would just altogether stop making art? I have seen my fair share of hobbyist artists get demoralized when, say, someone trains a Stable Diffusion model to deliberately and specifically rip off their style or characters. But of course the people who make art for the love of art will keep doing it anyway.

The problem is not and has never been that human artists will stop existing altogether or whatever. The problem is in how these tools are taking the work of those artists, often without consent, and using them to make commercial tools that will deprive them of further work for the sake of profit.

The same problem has come up with the invention of the cotton gin, or the automobile, or the printing press, or the computer, or 3d milling, or robotic assembly.

Cotton pickers, horse-carriage drivers, scribes, human computers, machinists and factory workers didn't get to ban these inventions, and I don't think commercial artists will have much traction with outlawing AI art.

Note that all these inventions heralded both economic, social and technological progress as well as ills, but society itself adapted around these changes, some of which were drastic.

Now it would be great if society could come up with an equitable way to reimburse its participants, perhaps with some form of Universal Basic Income, but I don't think artists as a class will be the ones to put a brake on society's relentless drive for automation and automated tools (and profits. It's always about the profits).
 
Upvote
7 (11 / -4)

caramelpolice

Ars Tribunus Militum
1,677
Subscriptor
The same problem has come up with the invention of the cotton gin, or the automobile, or the printing press, or the computer, or 3d milling, or robotic assembly.

Cotton pickers, horse-carriage drivers, scribes, human computers, machinists and factory workers didn't get to ban these inventions, and I don't think commercial artists will have much traction with outlawing AI art.

Note that all these inventions heralded both economic, social and technological progress as well as ills, but society itself adapted around these changes, some of which were drastic.

Now it would be great if society could come up with an equitable way to reimburse its participants, perhaps with some form of Universal Basic Income, but I don't think artists as a class will be the ones to put a brake on society's relentless drive for automation and automated tools (and profits. It's always about the profits).
The key difference between AI art and the automobile is that the latter isn't the result of someone taking the work of thousands of carriage drivers into a blender without permission and using it to create a machine that does the drivers' work in their place. People are deliberately training AI models on specific artists to recreate their particular styles. Models are built by scraping thousands of images without permission or credit. It's not innovation; it's plagiarism and theft. It's practically a meme at this point to see an ersatz Patreon watermark in the corner of AI art, because many models are trained on so many stolen pieces.
 
Upvote
-13 (4 / -17)

ardent

Ars Legatus Legionis
12,466
Nor is there a carve out in copyright law to prevent using art as training material. I'm constantly amazed by how broad people mistakenly believe copyright protections to be.
There are three major camps holding this viewpoint: those who are financially incentivized not to understand this, those who are uninformed on copyright law (which is absolutely fine, generally speaking), and those who are virtue signaling.

Copyright protections are fairly good! Professional media creators probably do register their copyrights to enjoy the full set of rights in case of infringement. Using that media for training, even machine learning, is not infringement. If it were, you'd need to pay millions to graduate from art school.
 
Upvote
7 (10 / -3)

ardent

Ars Legatus Legionis
12,466
Stinks that you are being downvoted. (This probably will be as well.)

https://www.copyright.gov/what-is-copyright/
U.S. copyright law provides copyright owners with the following exclusive rights:

  • Reproduce the work in copies or phonorecords.
  • Prepare derivative works based upon the work.
  • Distribute copies or phonorecords of the work to the public by sale or other transfer of ownership or by rental, lease, or lending.
  • Perform the work publicly if it is a literary, musical, dramatic, or choreographic work; a pantomime; or a motion picture or other audiovisual work.
  • Display the work publicly if it is a literary, musical, dramatic, or choreographic work; a pantomime; or a pictorial, graphic, or sculptural work. This right also applies to the individual images of a motion picture or other audiovisual work.
  • Perform the work publicly by means of a digital audio transmission if the work is a sound recording.
Copyright doesn't mean "I own it and you have to ask me if you want to do anything at all with it." The only thing copyright protects that would possibly apply would be a derivative work. And that would potentially apply only to something the AI creates, not to the act of training.

It's absolutely fine if someone thinks that copyright should be stronger or protect more things than it does. It's fine to say that this OUGHT to be illegal. But it's not wrong to point out that there is very little factual support for the claim that it actually IS illegal. Edit: grammar
From a technical perspective, it's impossible for an AI to create a derivative work. The process works by taking a training set and breaking it down into a data set that is then used to create a set of instructions for the AI to work from in creating new works. When you see things like ghost watermarks, that's a training issue (the data scientist did not provide good enough instructions, i.e. to ignore watermarks). That's correctable, of course, without necessarily needing to start over.

There are some interesting instances where an AI will compose something that looks pretty similar to another artist's work. But that's all it is. The way it works is by generating an entirely new media composition from scratch. If it creates "portrait of a golden retriever" that looks materially similar to an artist's, chances are that a) there are a lot of similar portraits, b) there would be enough differences for a jury to find them to be non-derivative works (if that were possible, which it isn't, since there's no copyright to argue over), and/or c) the training set was somehow incredibly limited on dog portraiture.
 
Upvote
3 (5 / -2)

Cervus

Ars Scholae Palatinae
1,122
Subscriptor
The key difference between AI art and the automobile is that the latter isn't the result of someone taking the work of thousands of horse-drivers into a blender without permission and using it to create a machine that does the horse-driver's work in their place. People are literally deliberately training AI models on specific artists to deliberately recreate their particular styles. Models are scraping thousands of images without permission or credit. It's not innovation, it's plagiarism and theft. It's practically a meme at this point to see an ersatz Patreon watermark in the corner of AI art because many models are trained on so many stolen pieces.

When I write a novel in the style of Terry Pratchett, am I stealing? Do I need permission from the Pratchett estate to learn from his works? The answers are no, I'm not, and no, I don't. As far as I'm concerned, that also applies to AI learning. What happens with the output is a different matter.
 
Last edited:
Upvote
11 (11 / 0)
Agreed. The vast majority of art is made for art's sake. This is the case on Twitter, Instagram, Deviantart, Pixiv, or any of the large image sharing websites, where the vast majority of contributors create art that is not monetized.

The same is true for photography, fiction writing, 3d makefiles, music (just see how many people are actively making money through music vs the number of kids who go to piano lessons), knitting, sculpting, woodworking, crafts etc. Creative outlets will always have people engaging in them for rewards (personal, social) other than just the economic.

Humans are a creative species. Monetization of our creative skills is a fairly new phenomenon, and AI tools are just a new addendum, in addition and parallel to our creative endeavours as a species and as individuals.

We did not stop writing fiction when AI writing tools came out, we did not stop composing music when algorithmically created music came about, and we will not stop making art because AI art generation tools exist.

That is not to say that there are no ethical or economic concerns with regards to AI art. But the argument that 'people would stop making art if a computer can do it for you' is very much a strawman argument in this debate, and holds very little water.

Let's be honest here. AI fiction sucks. AI music is mediocre. Right now, generative tools for creative tasks just aren't that great. As for actually hand-crafting usable objects like knitting, you can't do custom knits at no cost using machines, and 3D printing still involves a lot of creative and design work in making the models. So yeah, the shitty generative tools we have now have not made an impact so far, but that's to be expected.

None of what has come is comparable to what is coming, and you know it, so let's look carefully at what we'll probably have in several years: prompt-driven custom art, in the form of images and music, that's pretty good quality for little to no cost. Personally, I think good written works are going to take quite a bit longer, but if you want to assume someone can create a good-quality novel from some prompts in a few years, then for the purposes of this discussion we can do that.

The argument isn't that people will stop making art. It's that this will be hugely devastating to artistic communities and that art is something special that humans do that needs special consideration compared to other things that have been automated.

The same problem has come up with the invention of the cotton gin, or the automobile, or the printing press, or the computer, or 3d milling, or robotic assembly.

But THIS is exactly my point. Widespread use of quality AI-generated art will decimate art communities. Professional artists will all but disappear. Art schools will shrink as far fewer people want to spend a lot of money studying something they know there's no career in. And this will have long-term impacts on the development of new art forms, because fewer people will be exploring art, those that do will have far less time to devote to it, and so the creative energy being applied to the arts will be far less than what we have today. That's a tragedy we should try to avoid, not shrug our shoulders at and compare professional artists to carriage drivers or scribes, because these aren't the same thing at all.

There's a common tendency among people who take art for granted to think the time and effort of the artist aren't really valuable, or that any "true" artist will make art even if they end up penniless. That isn't accurate and isn't how the real world works for the vast majority of people, nor is it just to expect artists to suffer for art that enriches society as a whole.

I don't even think that generative art AI is doing anything illegal by using works for training -- but that's irrelevant to the potential harm to this important human endeavor and the fact that we need to take this seriously. One solution, since people don't seem that interested in talking about solutions, is greatly increasing government grants for the arts, though that's going to hit political stumbling blocks in the US. It would be easier to achieve than UBI though and ensure that aspiring artists would know there's at least a chance they could spend their life making art and still provide for themselves -- and that's very important. And to be clear, by "art" here, I include novels and other such works, though I don't think novelists have much to fear at the moment with the quality of output we see from AI text generators.

And while I am distinguishing art from many other activities humans engage in, I don't think it's the only field like this. But, afaik, it's the only one likely to be threatened like this anytime soon.
 
Upvote
9 (11 / -2)
The key difference between AI art and the automobile is that the latter isn't the result of someone taking the work of thousands of carriage drivers into a blender without permission and using it to create a machine that does the drivers' work in their place. People are deliberately training AI models on specific artists to recreate their particular styles. Models are built by scraping thousands of images without permission or credit. It's not innovation; it's plagiarism and theft. It's practically a meme at this point to see an ersatz Patreon watermark in the corner of AI art, because many models are trained on so many stolen pieces.

I don't even think it's really stealing, any more than one person being inspired by another's work. And I'm not sure how much value there is in trying to compensate artists here, because that money will dry up quickly as soon as the models have a large enough base for training. I think we need to remember and focus on the fact that there's intrinsic value and worth in humans making art and exploring artistic endeavors. That, and enabling it, is what is most important.
 
Upvote
2 (4 / -2)

Grey Bird

Ars Scholae Palatinae
759
Subscriptor++
People have been training off other people's art since the invention of art. There isn't anything unethical about it.
An actual artist trains by learning how other artists do techniques, etc., but creates their own unique art (unless they are creating forgeries). AI is actually using the art created by others and glomming it together with algorithms to create what you see. Everything you see in a Midjourney "created" picture includes pieces of actual art created by human artists, not just their techniques. AI art is more comparable to the "sampling" used by musicians, and musicians have to get approval, possibly including payment, from the original artists they are sampling if the work is still under copyright. Doing it with a computer algorithm shouldn't somehow magically avoid paying the original artists for the copyrighted art it uses.
 
Upvote
-13 (2 / -15)

Cervus

Ars Scholae Palatinae
1,122
Subscriptor
An actual artist trains by learning how other artists do techniques, etc., but creates their own unique art (unless they are creating forgeries). AI is actually using the art created by others and glomming it together with algorithms to create what you see. Everything you see in a Midjourney "created" picture includes pieces of actual art created by human artists, not just their techniques. AI art is more comparable to the "sampling" used by musicians, and musicians have to get approval, possibly including payment, from the original artists they are sampling if the work is still under copyright. Doing it with a computer algorithm shouldn't somehow magically avoid paying the original artists for the copyrighted art it uses.

That's simply not how these image generators work at all. They're not sampling. The AI trains on tens of thousands of images and keeps exactly none of them, not even a portion. Unless they've somehow found a way to compress hundreds of terabytes of images down to a few GB.
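The compression point can be made concrete with a quick back-of-the-envelope calculation. The figures below are rough, commonly cited approximations for a Stable-Diffusion-scale system (a LAION-scale dataset and a ~4 GB checkpoint), used purely as illustrative assumptions:

```python
# Rough sanity check: how many bytes of model weights exist per training image?
# Both figures are approximate, illustrative assumptions, not exact published numbers.
training_images = 2_300_000_000   # ~2.3 billion images in a LAION-scale training set
checkpoint_bytes = 4 * 10**9      # ~4 GB model checkpoint

bytes_per_image = checkpoint_bytes / training_images
print(f"~{bytes_per_image:.1f} bytes of weights per training image")  # ~1.7
```

At under two bytes of weights per training image, there is simply no room for the checkpoint to store copies of the images themselves; what the training run retains is statistical structure, not a library of pictures.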

For that matter, style is not copyrightable, even if a particular style is attributable to a specific person. A specific work in a style is copyrightable. I can imitate Terry Pratchett's style of humor without owing anything to his estate.
 
Upvote
11 (11 / 0)

lucubratory

Ars Scholae Palatinae
1,430
Subscriptor++
Two options.

1. This splits the anti-AI art movement between Luddites (not as a slur, just the best word for it) and copyright devotees. Copyright devotees will be happy with a solution like this because it fits the magic rules as well as any stock image database does. If a large enough bloc of copyright enthusiasts split off from the anti-AI movement over this, I strongly suspect that there's no longer any political space for the Luddites to get anything done at all. The response will basically be "They made it legal and now everyone's happy with it, what's the holdup?". The resulting implosion of the anti-AI movement is likely to mean that even "non-ethical" AI art tools like Stable Diffusion are eventually widely used if what they're doing is found to be legal in the US. There won't be any organised campaign left to oppose them, and it'll end up just being: If you want to have "ethical AI art", you use the models of very large corporations like Adobe, Microsoft, Disney etc, and if you don't view current AI art as unethical you can use open source or research derived AI art programs.

2. Most anti-AI people aren't actually copyright devotees, but rather Luddites that are just grabbing the most useful argument at this current moment. If this is the case, there won't be a substantial bloc splitting off from the anti-AI movement regardless of how legal and ethical a training database is made, and it will either succeed as an activist movement in forcing the government to make concessions (or splitting industry with boycotts), or it will fail after enough time has passed.


I don't know which way this goes. It's also leaving aside the questions of "Is Firefly actually any good at all, considering the inherently limited training data?" and "Will the United States government let any 'ethics' movement prevent the US remaining the best place for AI researchers to do their research, given the ongoing effort by China to recruit those researchers with better pay and benefits as the two powers enter a new Cold War?"


I am also really interested in what ends up happening with the EU and UK if the anti-AI movement succeeds in getting legal bans etc in the US. The EU has been unequivocal about supporting AI researchers, and the UK has gone well beyond that and given copyright exemptions for basically anyone doing AI research in the country, with no requirement of separation between researchers and businesses. Even if the anti-AI movement succeeds in the US, I don't see how that translates to success in the EU or the UK (or Japan, or China, etc).
 
Upvote
2 (5 / -3)

graylshaped

Ars Legatus Legionis
67,938
Subscriptor++
Ah yes, much the same way that music is now pointless because someone invented the synthesiser. Damn flood of mediocrity from those people who can't even play a piano, or a guitar, or drums, or...

Maybe, just maybe, art will be different. And that's fine - no, actually, that's exciting - to me.
Synthesizers are instruments. They take active human input at all times.

Analogy fail.
 
Upvote
-4 (1 / -5)
Two options.

1. This splits the anti-AI art movement between Luddites (not as a slur, just the best word for it) and copyright devotees. Copyright devotees will be happy with a solution like this because it fits the magic rules as well as any stock image database does. If a large enough bloc of copyright enthusiasts split off from the anti-AI movement over this, I strongly suspect that there's no longer any political space for the Luddites to get anything done at all. The response will basically be "They made it legal and now everyone's happy with it, what's the holdup?". The resulting implosion of the anti-AI movement is likely to mean that even "non-ethical" AI art tools like Stable Diffusion are eventually widely used if what they're doing is found to be legal in the US. There won't be any organised campaign left to oppose them, and it'll end up just being: If you want to have "ethical AI art", you use the models of very large corporations like Adobe, Microsoft, Disney etc, and if you don't view current AI art as unethical you can use open source or research derived AI art programs.

2. Most anti-AI people aren't actually copyright devotees, but rather Luddites that are just grabbing the most useful argument at this current moment. If this is the case, there won't be a substantial bloc splitting off from the anti-AI movement regardless of how legal and ethical a training database is made, and it will either succeed as an activist movement in forcing the government to make concessions (or splitting industry with boycotts), or it will fail after enough time has passed.


I don't know which way this goes. It's also leaving aside the questions of "Is Firefly actually any good at all, considering the inherently limited training data?" and "Will the United States government let any 'ethics' movement prevent the US remaining the best place for AI researchers to do their research, given the ongoing effort by China to recruit those researchers with better pay and benefits as the two powers enter a new Cold War?"


I am also really interested in what ends up happening with the EU and UK if the anti-AI movement succeeds in getting legal bans etc in the US. The EU has been unequivocal about supporting AI researchers, and the UK has gone well beyond that and given copyright exemptions for basically anyone doing AI research in the country, with no requirement of separation between researchers and businesses. Even if the anti-AI movement succeeds in the US, I don't see how that translates to success in the EU or the UK (or Japan, or China, etc).

I consider myself in the "has concerns about some applications of AI" crowd, and would call the concern I addressed in this thread rooted in humanism. Art in all its forms is a very important means of human expression, and we need to make sure we, as a society, support it. That inherently means, imho, supporting people who want to make it the focus of their life (e.g. a career). I think AI art has good qualities and applications, and I am not advocating for it to be banned. I don't particularly like Adobe's 'copyright' solution either, because it ignores the core issue of making sure humans are enabled to create and explore art, including new forms.

There are essential things that humans need to be deeply involved in as part of realizing and expressing our humanity. I think art is one of them. (I would say that science is another). If we let those things become dominated by AI to the point where humans can't do them or almost none see the point, then we might as well plug ourselves into the Matrix and let AI do all of our thinking for us. And that's not saying that there's no place for AI either.
 
Upvote
2 (4 / -2)

Dreams of Grandeur

Smack-Fu Master, in training
67
For those of you concerned about AI art dominating the industry, this might give some comfort, if you haven't seen it. Bear in mind that a lot of questions remain and this is far from definitive, but I find it very interesting. If you can't copyright an image from an AI, that would dramatically reduce the appeal in today's IP-driven creative industry. The argument is interesting too, although I'm skeptical that it will hold up under closer scrutiny.

If I study all of Picasso's work, and can now make a new painting clearly in the same style (but not any subject he's painted before), am I legally infringing his rights if he is alive today?

I would've thought such questions are long settled and there is a clear-cut answer already. (But I don't know the answer -- yes or no?)

"new invention destroys some people's livelihood" -- that happens all the time in all different fields, people/society has to adapt (provide training for new job roles etc.) but IMHO that is not a consideration for why such new invention should be disallowed
I've said this before, but I'll reiterate: I fail to see any way that these "AIs" are actually doing the same thing as human minds when "learning." To be sure, I am far from an expert on the brain. But I don't think this is an assumption we can make. Just because we call it "learning" does not at all mean it's analogous to human learning. Based on what knowledge I have, the two appear very different.

Of course, this doesn't really sway the verdict either direction, I'm just tired of hearing this argument.

Will artists stop complaining now?

My guess is no, because they never cared about copyright - they just used it as a blunt object.
I've seen this sort of negativity toward artists a lot lately. Where the heck did it come from? Are we really going to cast the friggen artists as the bad guys here? We don't even know if there are any bad guys.
 
Upvote
-3 (2 / -5)

graylshaped

Ars Legatus Legionis
67,938
Subscriptor++
For those of you concerned about AI art dominating the industry, this might give some comfort, if you haven't seen it. Bear in mind that a lot of questions remain and this is far from definitive, but I find it very interesting. If you can't copyright an image from an AI, that would dramatically reduce the appeal in today's IP-driven creative industry. The argument is interesting too, although I'm skeptical that it will hold up under closer scrutiny.


I've said this before, but I'll reiterate: I fail to see any way that these "AIs" are actually doing the same thing as human minds when "learning." To be sure, I am far from an expert on the brain. But I don't think this is an assumption we can make. Just because we call it "learning" does not at all mean it's analogous to human learning. Based on what knowledge I have, the two appear very different.

Of course, this doesn't really sway the verdict either direction, I'm just tired of hearing this argument.


I've seen this sort of negativity toward artists a lot lately. Where the heck did it come from? Are we really going to cast the friggen artists as the bad guys here? We don't even know if there are any bad guys.
There aren't bad guys, at this point. There are free-loaders, who are teaching machines to ape other people's work on a massive scale, without the rights holders' permission or knowledge.

It would be nice if we had a functional Congress who could see and address this in a responsible manner that allows development of the tech while protecting those holding rights; having said that, it is really an international issue. The tech is moving very, very fast here.
 
Upvote
-9 (1 / -10)

poochyena

Ars Scholae Palatinae
4,997
Subscriptor++
An actual artist trains learning how other artists do techniques, etc. but creates their own unique art
Google search "fan art". Many artists create art based on other people's art.
glomming it together with algorithms to create what you see. Everything you see in a Midjourney "created" picture includes pieces of actual art created by human artists
That's not how they work. They train off the image; they aren't using the actual art images and mashing them together. What they create is completely original.
AI art is more comparable to "sampling" used by musicians, and musicians have to get approval, possibly including payment, from the original artists they are sampling if it's still under copyright.
No, because they aren't using the art. It's like listening to a song and deciding you want to make music that sounds like that.
 
Upvote
6 (7 / -1)

graylshaped

Ars Legatus Legionis
67,938
Subscriptor++
Google search "fan art". Many artists create art based on other people's art.

That's not how they work. They train off the image; they aren't using the actual art images and mashing them together. What they create is completely original.

No, because they aren't using the art. It's like listening to a song and deciding you want to make music that sounds like that.
They train off the art but they aren't using the art? They tell the machine "do it like this," but that isn't using the unlicensed art?

Reconcile that, please.
 
Upvote
-11 (0 / -11)

Cervus

Ars Scholae Palatinae
1,122
Subscriptor
They train off the art but they aren't using the art? They tell the machine "do it like this," but that isn't using the unlicensed art?

Reconcile that, please.
I can train off the art the same as the AI does. I can try to paint something in the style of Boris Vallejo, or Picasso, or a random person on DeviantArt. So I do my painting and create an original work based on their style. But I don't owe them anything; I don't have to ask permission. I don't need a license to learn from their style.

Why does the AI need a license to learn but I don't?
 
Upvote
9 (9 / 0)

poochyena

Ars Scholae Palatinae
4,997
Subscriptor++
They train off the art but they aren't using the art? They tell the machine "do it like this," but that isn't using the unlicensed art?

Reconcile that, please.
Correct. Much like how I can commission an artist to draw an image of the Pokémon Poochyena. They would use official art as reference, but not actually use the image in making the art.
 
Upvote
5 (6 / -1)

graylshaped

Ars Legatus Legionis
67,938
Subscriptor++
I can train off the art the same as the AI does. I can try to paint something in the style of Boris Vallejo, or Picasso, or a random person on DeviantArt. So I do my painting and create an original work based on their style. But I don't owe them anything; I don't have to ask permission. I don't need a license to learn from their style.

Why does the AI need a license to learn but I don't?
Because you aren't a fucking machine that can replicate this style ad infinitum. Plus, you probably do it poorly.

This isn't hard, dude.
 
Upvote
-8 (1 / -9)

Cervus

Ars Scholae Palatinae
1,122
Subscriptor
Because you aren't a fucking machine that can replicate this style ad infinitum. Plus, you probably do it poorly.

This isn't hard, dude.

So your main issue has to do with volume of production. I see. I could argue against the printing press with that kind of reasoning: a clerk or scribe can only copy a work so quickly, whereas a press can create thousands of copies in the same amount of time.

I personally think that AI art needs to be correctly sourced so the viewer knows what they're looking at. Perhaps an AI mark similar to a copyright mark.
 
Upvote
6 (6 / 0)

Senjeng

Smack-Fu Master, in training
7
The basic problem, from my perspective, is economic. I'm not convinced that, in the long run, professional artists and their employers will actually be willing to pay for ethics as a line item. As one part of a broader creative suite, for which you're already paying anyway? Maybe. But nobody is going to spend money on an ethical model when the "unethical" model is just as good or better (because it has access to more training data), and is also cheaper, unless they have actual legal risks. If those legal risks materialize, then maybe this will be a different conversation, but I don't think anyone has plausibly alleged, before a court of competent jurisdiction, substantial similarity between an AI model's training data and outputs. If that never happens, then eventually people will stop worrying about it happening.
I see your point, but…the name “Adobe” is golden for many content creators and, especially, art directors (for better or worse). If they're already in the Adobe ecosystem, I think they will definitely use this. And if this program is properly integrated with other Adobe products, it becomes a no-brainer for those people. Just my opinion.
 
Upvote
1 (1 / 0)

graylshaped

Ars Legatus Legionis
67,938
Subscriptor++
So your main issue has to do with volume of production. I see. I could argue against the printing press with that kind of reasoning: a clerk or scribe can only copy a work so quickly, whereas a press can create thousands of copies in the same amount of time.

I personally think that AI art needs to be correctly sourced so the viewer knows what they're looking at. Perhaps an AI mark similar to a copyright mark.
My main issue is with the role of the artist in creating the work. Mass reproduction thereafter cheapens it further.


Yes. Sourcing is where we have common ground, plus some sort of "we faked this" delineation.
 
Upvote
-5 (0 / -5)

Senjeng

Smack-Fu Master, in training
7
There’s no such thing as ethical AI image generation if it’s used to replace art made by a human. It’s bad for artists financially and for our culture.
As a former graphic designer at an advertising firm, I can tell you that speed and utility will win at almost any commercial advertising company. I do think it will mean fewer positions in commercial art, unhappily.
 
Upvote
2 (2 / 0)

graylshaped

Ars Legatus Legionis
67,938
Subscriptor++
Upvote
-10 (1 / -11)

Captionx

Smack-Fu Master, in training
9
It seems like the "ethical" part was just there to steamroll all of the users who uploaded stock content to Adobe Stock in good faith.

 
Upvote
4 (5 / -1)
If it creates "portrait of a golden retriever" that looks materially similar to an artist's, chances are that a) there are a lot of similar portraits, b) there would be enough differences for a jury to find them to be non-derivative works (if that were possible, which it isn't, since there's no copyright to argue over), and/or c) the training set was somehow incredibly limited on dog portraiture.
Nah, that's not true at all. For a lot of celebrities, it's training off a few very obvious movie promotional stills and will look nearly identical. Plus there are tools that let you search the database used to train Stable Diffusion, and sure enough, neatly tagged, are the exact images you'd expect.
 
Upvote
-5 (0 / -5)

zunipus

Ars Scholae Palatinae
925
Subscriptor
As per every dictionary ever written:
There is no such thing as AI "art", and there never will be.
Call it something else: graphics, imagery, creations. But AI doesn't ever make "art". If you want to reference, praise, or blame someone, go to the person using an AI tool, and/or the people who wrote its code. THEY can create actual art with the tool called AI.
[Need some convincing? When has a paintbrush ever created art? Never. The artist uses a tool, such as a paintbrush, to create art. It's that simple. Tools create nothing and never will.]
 
Upvote
-3 (3 / -6)
As per every dictionary ever written:
There is no such thing as AI "art", and there never will be.
Call it something else: graphics, imagery, creations. But AI doesn't ever make "art". If you want to reference, praise, or blame someone, go to the person using an AI tool, and/or the people who wrote its code. THEY can create actual art with the tool called AI.
[Need some convincing? When has a paintbrush ever created art? Never. The artist uses a tool, such as a paintbrush, to create art. It's that simple. Tools create nothing and never will.]

I'm not sure this philosophy-of-art position is going to work when it comes to these systems in the real world, but I appreciate the effort at least.
 
Upvote
2 (3 / -1)
As per every dictionary ever written:
There is no such thing as AI "art", and there never will be.
Call it something else: graphics, imagery, creations. But AI doesn't ever make "art". If you want to reference, praise, or blame someone, go to the person using an AI tool, and/or the people who wrote its code. THEY can create actual art with the tool called AI.
[Need some convincing? When has a paintbrush ever created art? Never. The artist uses a tool, such as a paintbrush, to create art. It's that simple. Tools create nothing and never will.]
There was no such thing, now there is. The dictionaries will be updated accordingly.
 
Upvote
6 (6 / 0)
Eh, if the art someone can produce can be simply replaced by a robot then I’m not sure they were much of an artist to begin with.

The good artists will still get paid and if they’re smart they will use AI for inspiration or to do more tedious or labor intensive parts of their job. AI might put the mediocre ones out of a job though.
This won't be decided by the audience or by artists, but by business executives who will see $40 per month vs. $3,000. A lot of people in a few years will see comparisons like that, in programming, legal assistance... every office job. More likely than needing two senior coders to do the work of ten people, you'll have two junior programmers doing the job of ten senior ones, with the seniors consulting from time to time. It's cheaper that way.
 
Upvote
2 (3 / -1)