I think some of you are not understanding what my point was.
It is not about what the AI pics look like, it is about what images were used to create the AI image.
For example, you may have produced a naked elf. Now the elf may not look like any human, but it will still be built from whatever content the AI machine was trained on.
The argument, I suspect, will be that if the content the AI machine uses is scraped from the web, such as from Facebook and so on, then you're using people's data without consent.
Now the problem is this: if your AI pic was produced with machine learning, and out of the millions of pics gathered even just one is of a 16 year old, then the DNA of the elf pic could, in theory, have been built in part from the pic of that 16 year old.
This then becomes the legal problem: if your content of naked adults or elves or whatever has been produced using images of people under 18, will that land you in trouble?
This is not about what the ai image is, but what content was used to create the ai image.
At the moment I believe we already have cases of firms objecting to content they shot being used for training.
Let's be honest, AI is going to be a dream for lawyers, who will see this as easy money.
So as I say, you may see AI content as cheap now, but it may not be once you are going back and forth to court defending the AI system you have used.