Quote:
Originally Posted by cordoba
To say that your image has been 'used on a pornsite' because it may have been one of millions of images online the AI trained itself on, seems a liberal use of language.
I can understand artists having legal grounds against AI image generators that allow you to type in xx in the style of xx.
And even if a generator did map an image of an underage person, it's a stretch to say it was 'used' to generate an image of a 30-year-old. It's not a very intelligent AI if it's using images of children to create adult images.
But no doubt you are right, and AI generated porn will be illegal or legally impossible within a couple of years, for the reasons you have mentioned. Equally no doubt the anti-porn, anti-trafficking groups that tried to cancel PornHub will push for these laws, even though AI porn could mean the end of 'trafficked' or underage sex workers.
It's not that someone's photo was used in training; that's not the issue. The issue is when the AI image generator produces an image or video that looks strikingly similar to a real person, so much so that the public believes it's the real person, or when the real person's name or trademark is used to market the content.
And it's not about using a child's image to map a 30-year-old. It's when the generator produces CSAM: a sexual or pornographic image depicting a minor.
I think NCOSE is shitting themselves about AI porn because you're right, it takes away their trafficking argument. They can't fundraise when no real people were harmed.