GoFuckYourself.com - Adult Webmaster Forum

GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   The problem with using AI for porn (legal problems) (https://gfy.com/showthread.php?t=1363802)

DVTimes 04-11-2023 03:47 AM

The problem with using AI for porn (legal problems)
 
One problem for those who are considering using AI to make porn is going to be that people may take legal action against you.

A lot of this AI seems to be based on online pics (possibly vid too).

I can see this may become a legal problem, as people are not going to want an image of themselves used on a pornsite, and in time I suspect we will see legal firms set up just to contact pornsites and question what system you used and what content was used to produce the AI content.

I suspect we may see new laws where AI producers will need to provide evidence that the content they use has been obtained by people giving consent.

While I can see a lot of people seeing AI as a cheap and easy way to get content for a porn site, it may end up turning into costly legal battles.

Not to mention that the content AI produces may be built using pics of people under 18, and this too is going to create legal problems. By this I mean you may have an image produced of someone aged 30, but to create this image it may be built from millions of pics, and even if just one pic used is of someone under 18, then perhaps this could be a legal problem. I could see laws also requiring that you can provide evidence that all AI porn produced has been built on people 18 or older, who all gave consent.

pornlaw 04-11-2023 07:26 AM

Quote:

Originally Posted by DVTimes (Post 23122464)
One problem for those who are considering using AI to make porn is going to be that people may take legal action against you.

A lot of this AI seems to be based on online pics (possibly vid too).

I can see this may become a legal problem, as people are not going to want an image of themselves used on a pornsite, and in time I suspect we will see legal firms set up just to contact pornsites and question what system you used and what content was used to produce the AI content.

I suspect we may see new laws where AI producers will need to provide evidence that the content they use has been obtained by people giving consent.

While I can see a lot of people seeing AI as a cheap and easy way to get content for a porn site, it may end up turning into costly legal battles.

Not to mention that the content AI produces may be built using pics of people under 18, and this too is going to create legal problems. By this I mean you may have an image produced of someone aged 30, but to create this image it may be built from millions of pics, and even if just one pic used is of someone under 18, then perhaps this could be a legal problem. I could see laws also requiring that you can provide evidence that all AI porn produced has been built on people 18 or older, who all gave consent.

Every data set for AI is built on photos, and in those data sets are photos of children and infants, so yes, it is possible for it to generate CSAM.

dUbster 04-11-2023 07:29 AM

Jesus Christ you guys are stupid, if you knew how it worked you would know each model is trained from a certain set of images, not all models use every single damn image on the internet, stupid fucks

pornmasta 04-11-2023 08:02 AM

Quote:

Originally Posted by dUbster (Post 23122538)
Jesus Christ you guys are stupid, if you knew how it worked you would know each model is trained from a certain set of images, not all models use every single damn image on the internet, stupid fucks

Who said that?

pornlaw 04-11-2023 09:24 AM

Quote:

Originally Posted by dUbster (Post 23122538)
Jesus Christ you guys are stupid, if you knew how it worked you would know each model is trained from a certain set of images, not all models use every single damn image on the internet, stupid fucks

If you are using one of the open source generators then you are using a data set containing photos of everyone - including children.

If you have it set up locally and you are using your own images that are hand selected, you can get around this issue. But you will have other issues.

cordoba 04-11-2023 10:47 AM

Quote:

Originally Posted by DVTimes (Post 23122464)
One problem for those who are considering using AI to make porn is going to be that people may take legal action against you.

A lot of this AI seems to be based on online pics (possibly vid too).

I can see this may become a legal problem, as people are not going to want an image of themselves used on a pornsite, and in time I suspect we will see legal firms set up just to contact pornsites and question what system you used and what content was used to produce the AI content.

I suspect we may see new laws where AI producers will need to provide evidence that the content they use has been obtained by people giving consent.

While I can see a lot of people seeing AI as a cheap and easy way to get content for a porn site, it may end up turning into costly legal battles.

Not to mention that the content AI produces may be built using pics of people under 18, and this too is going to create legal problems. By this I mean you may have an image produced of someone aged 30, but to create this image it may be built from millions of pics, and even if just one pic used is of someone under 18, then perhaps this could be a legal problem. I could see laws also requiring that you can provide evidence that all AI porn produced has been built on people 18 or older, who all gave consent.

To say that your image has been 'used on a pornsite' because it may have been one of millions of images online the AI trained itself on, seems a liberal use of language.

I can understand artists having legal grounds against AI image generators that allow you to type in xx in the style of xx.

And even if a generator did map an image of an underage person, it's a stretch to say it was 'used' to generate an image of a 30-year-old. It's not a very intelligent AI if it's using images of children to create adult images.

But no doubt you are right, and AI generated porn will be illegal or legally impossible within a couple of years, for the reasons you have mentioned. Equally no doubt the anti-porn anti-trafficking groups that tried to cancel PornHub will push for these laws, even though AI porn could mean the end of 'trafficked' or underage sex workers.

blackmonsters 04-11-2023 12:22 PM

Quote:

Originally Posted by cordoba (Post 23122642)
... and AI generated porn will be illegal or legally impossible within a couple of years ...

How can AI porn be illegal without making all porn illegal?

Especially AI porn made from models that sign up for AI porn modeling.

:2 cents:

cordoba 04-11-2023 12:22 PM

A Rolling Stone article that appeared yesterday covers some of this ground.

https://www.rollingstone.com/culture...it-1234708474/

cordoba 04-11-2023 12:25 PM

Quote:

Originally Posted by blackmonsters (Post 23122674)
How can AI porn be illegal without making all porn illegal?

Especially AI porn made from models that sign up for AI porn modeling.

:2 cents:

Well yeah, there shouldn't be any possible legal problem there, but doesn't the AI need more than a few consenting models to learn from, even if it is only going to generate pics of one (consenting) individual?

For example, if Model A wants to create AI pics of herself having anal sex (because she doesn't actually want to shoot that in real life). Surely the AI will still have to learn from images of real models having anal sex?

blackmonsters 04-11-2023 12:36 PM

Quote:

Originally Posted by cordoba (Post 23122678)
For example, if Model A wants to create AI pics of herself having anal sex (because she doesn't actually want to shoot that in real life). Surely the AI will still have to learn from images of real models having anal sex?

True; but now we are in the territory of "can an asshole be copyrighted".

:1orglaugh

Especially an asshole that's already been used so much that we could recognize that asshole in court like a fingerprint.

:1orglaugh

pornlaw 04-11-2023 01:19 PM

Quote:

Originally Posted by cordoba (Post 23122642)
To say that your image has been 'used on a pornsite' because it may have been one of millions of images online the AI trained itself on, seems a liberal use of language.

I can understand artists having legal grounds against AI image generators that allow you to type in xx in the style of xx.

And even if a generator did map an image of an underage person, it's a stretch to say it was 'used' to generate an image of a 30-year-old. It's not a very intelligent AI if it's using images of children to create adult images.

But no doubt you are right, and AI generated porn will be illegal or legally impossible within a couple of years, for the reasons you have mentioned. Equally no doubt the anti-porn anti-trafficking groups that tried to cancel PornHub will push for these laws, even though AI porn could mean the end of 'trafficked' or underage sex workers.

It's not that someone's photo was used. That's not the issue. The issue is when the AI image generator produces an image or video that looks strikingly similar to a real person - so much so that the public believes it's the real person, or the real person's name or trademark is used to market the content.

And it's not about using a child to map a 30-year-old. It's when it produces CSAM - a sexual or pornographic image with a minor in it.

I think NCOSE is shitting themselves about AI porn because you're right, it takes away their argument about trafficking. They can't fundraise when no people were harmed.

2CamXXX 04-11-2023 01:49 PM

Now I am curious, if we get into the technical side of things, how this could play out.

From what I gather, the AI for Stable Diffusion at least, when it gets a prompt to create such and such, starts by making an image of pure random pixels, or noise, and then uses the knowledge it got from looking at millions of pictures, not the pictures themselves (correct me if I'm wrong here), so the files in the AI are not those images but the knowledge gained from looking at them. The AI then proceeds to sort the random pixels into what it thinks is the most probable arrangement to produce what we asked of it.

So if it works the way I think it does, then I don't know if we can say that any image used during training is used directly in the end result of the Stable Diffusion process. Yes, the AI knows about those images and might use them as reference, but the images are not shipped along with the AI; only the knowledge gained from the training is shipped in the files that are then used to generate whatever is asked of the AI.

TLDR: The original images are not distributed or used directly; what the AI learned from them is what then creates the pictures we want.
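
Roughly, in code terms - and this is only a sketch, assuming the Hugging Face diffusers library and the public runwayml/stable-diffusion-v1-5 checkpoint, which are my own illustrative picks and not anything anyone in this thread said they use - the whole pipeline looks like this:

Code:

# Minimal sketch: the downloaded checkpoint is a few GB of learned weights,
# with no training images inside, and generation starts from random noise.
import torch
from diffusers import StableDiffusionPipeline

# Download/load the model weights (no source photos ship with these files).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

# Generation starts from pure random noise in the model's latent space; the
# learned weights then iteratively "denoise" it toward the prompt.
generator = torch.Generator("cuda").manual_seed(42)  # reproducible noise
image = pipe(
    "studio photo of an adult fashion model",
    num_inference_steps=30,
    generator=generator,
).images[0]
image.save("output.png")

The point being that what ships in those model files is learned statistics, not a searchable archive of the source photos.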

CaptainHowdy 04-11-2023 02:02 PM

Quote:

Originally Posted by pornlaw (Post 23122536)
Every data set for AI is built on photos, and in those data sets are photos of children and infants, so yes, it is possible for it to generate CSAM.

:helpme :helpme . . .

pornmasta 04-11-2023 03:37 PM

Quote:

Originally Posted by pornlaw (Post 23122536)
Every data set for AI is built on photos, and in those data sets are photos of children and infants, so yes, it is possible for it to generate CSAM.

Please source this claim.
You can put whatever you want.

dUbster 04-11-2023 04:23 PM

Quote:

Originally Posted by pornmasta (Post 23122755)
Please source this claim.
You can put whatever you want.

He is a dumb fuck and doesn't know how models work for stable diffusion, each model is trained on a certain set of images, he is just spitting out false bullshit
I can guarantee you that in fact civitai or huggingface would not accept any stable diffusion models with such intentions, the dumb fuck is just talking nonsense garbage

Waddymelon 04-11-2023 05:08 PM

"This is a work of fiction. Any similarities to persons living or dead, or actual events is purely coincidental."

You guys act like you've never seen that before. Hollywood covered this decades ago.

pornmasta 04-11-2023 06:27 PM

Quote:

Originally Posted by dUbster (Post 23122767)
He is a dumb fuck and doesn't know how models work for stable diffusion, each model is trained on a certain set of images,

I confirm this claim.
(Cf. tutorials on how to train a model.)
You need enough memory to train the model, and the training set can't be the whole internet.

pornmasta 04-11-2023 06:40 PM

Should we ban Gimp because it's possible to make CP with it?
I think not.

Another difficulty: attributing age.
We're coming back to the Australian censorship problem: is everything with small boobs CP???

Btw, if you are on hormone blockers at 21 and you are having sex, is it CP?
Because technically your body is as it was before your teenage years.

a1ka1ine 04-11-2023 08:23 PM

Should only be an issue if the generated image resembles someone's likeness. Most models are trained on so many different people that the resulting generated image isn't similar to any one person.

pornlaw 04-11-2023 11:47 PM

Quote:

Originally Posted by Waddymelon (Post 23122778)
"This is a work of fiction. Any similarities to persons living or dead, or actual events is purely coincidental."

You guys act like you've never seen that before. Hollywood covered this decades ago.

That could be an answer, but that requires responsible AI content creators to use that disclaimer. And we won't know if that will be enough until there's a body of case law after litigation has already occurred.

Someone didn't just invent that saying; they were sued, and it became a standard based on case law.

https://en.wikipedia.org/wiki/All_pe...ous_disclaimer

pornlaw 04-11-2023 11:56 PM

Quote:

Originally Posted by a1ka1ine (Post 23122836)
Should only be an issue if the generated image resembles someone's likeness. Most models are trained on so many different people that the resulting generated image isn't similar to any one person.

Is it similar enough? I don't know. A jury would decide that. What I do know is that if the creator of the first photo started marketing it as Tera Patrick with massive boobs... that's certainly enough for a lawsuit to be filed.


https://adultbizlaw.files.wordpress....howyaicfbj.jpg
https://adultbizlaw.files.wordpress....k-7c67f23d.jpg

bill_musk 04-12-2023 02:12 AM

As always, I believe (no law experience, please note) it really depends on the use case we see.
- You try to replicate a pornstar, you'll probably be in trouble, AI or not.
- CSAM content, trouble, AI or not.

- You generate random people, I don't see any problems EVEN IF trained on a set of pornstars.

2MuchMark 04-12-2023 04:48 AM

How would the law handle AI generated porn for personal use? If your AI porn model looks exactly like Scarlett Johansson but was not intended for sale or distribution then is that legal?

DVTimes 04-12-2023 05:41 AM

What I suspect will matter is not whether the image looks like someone, but what data the AI uses.

In the UK, for instance, businesses have a lot of legal obligations when it comes to data and things such as CCTV. For instance, they cannot just upload CCTV footage to the net.

We have seen businesses sued for millions because of how they used data they collected from their own websites.

So with AI, it is using people's data.

It will not matter whether your image was actually used to create an image of a porn model; the problem will be that it may have been used.

I presume we will see law firms set up who will sue people using AI over data issues.

As someone who has AI content, you may need to show what system you used and what it did to ensure it only used content from people who had given consent.

Or to put it another way: your site has pics of naked fairies flying around. Softcore. But then you get a legal letter saying a client suspects the images on your site may have used pics of her 5-year-old taken from her Instagram page. The images may look nothing like the 5-year-old; however, the AI system used may be known to scrape Instagram pages to collect pics. As such, it will not matter whether the pics of the girl were used or not, just that the AI system uses Instagram pics.

It will be down to data use.

One suspects that for a few years we will see test cases about AI pics.

One thing to note is that in the USA they have stated that pics made by AI cannot be copyrighted. They say that, at this moment in time, only content made by humans can be copyrighted. As such, if you use AI content on your site, you may not currently have any legal claim on it.

pornlaw 04-12-2023 05:53 AM

Quote:

Originally Posted by DVTimes (Post 23122979)
What I suspect will matter is not whether the image looks like someone, but what data the AI uses.

In the UK, for instance, businesses have a lot of legal obligations when it comes to data and things such as CCTV. For instance, they cannot just upload CCTV footage to the net.

We have seen businesses sued for millions because of how they used data they collected from their own websites.

So with AI, it is using people's data.

It will not matter whether your image was actually used to create an image of a porn model; the problem will be that it may have been used.

I presume we will see law firms set up who will sue people using AI over data issues.

As someone who has AI content, you may need to show what system you used and what it did to ensure it only used content from people who had given consent.

Or to put it another way: your site has pics of naked fairies flying around. Softcore. But then you get a legal letter saying a client suspects the images on your site may have used pics of her 5-year-old taken from her Instagram page. The images may look nothing like the 5-year-old; however, the AI system used may be known to scrape Instagram pages to collect pics. As such, it will not matter whether the pics of the girl were used or not, just that the AI system uses Instagram pics.

It will be down to data use.

One suspects that for a few years we will see test cases about AI pics.

One thing to note is that in the USA they have stated that pics made by AI cannot be copyrighted. They say that, at this moment in time, only content made by humans can be copyrighted. As such, if you use AI content on your site, you may not currently have any legal claim on it.

Are you a UK attorney ?

DVTimes 04-12-2023 05:53 AM

I think some of you are not understanding what my point was.

It is not about what the AI pics look like; it is about what images the AI used to create the AI image.

For example, you may have produced a naked elf. Now the elf may not look like any human, but the question will be what content was used by the AI machine.

The argument, I suspect, will be that if the content the AI machine uses is taken from the web, such as from Facebook and so on, then you are using people's data without consent.

Now the problem is that if your AI pic was produced with machine learning, and out of the millions of pics gathered even just one pic is of a 16-year-old, then the DNA of the pic of the elf could in theory have been built using the pic of the 16-year-old.

So this becomes the legal question: if your content of naked adults or elves or whatever has been produced using people under 18, will that be a problem?

This is not about what the AI image is, but what content was used to create the AI image.

At the moment I believe we have cases of firms claiming that they object to content they shot being used.

Let's be honest, AI is going to be a dream for lawyers, who will see this as easy money.

So as I say, you may see AI content as cheap now, but it may not be once you are having to go back and forth to court defending the AI system you have used.

cordoba 04-12-2023 06:40 AM

But this is an argument about data laws and whether they will cover AI porn; if they do, they will cover all AI-generated images, and probably any LLM collecting data.

So your original point about people 'not happy that their image is used on a porn site' is not relevant.

If a pornstar says she modelled her hairstyle on 17 year old Miley Cyrus, do you think this would create an issue of her nude images contravening cp laws?

pornlaw 04-12-2023 07:50 AM

Quote:

Originally Posted by DVTimes (Post 23122983)
I think some of you are not understanding what my point was.

It is not about what the AI pics look like; it is about what images the AI used to create the AI image.

For example, you may have produced a naked elf. Now the elf may not look like any human, but the question will be what content was used by the AI machine.

The argument, I suspect, will be that if the content the AI machine uses is taken from the web, such as from Facebook and so on, then you are using people's data without consent.

Now the problem is that if your AI pic was produced with machine learning, and out of the millions of pics gathered even just one pic is of a 16-year-old, then the DNA of the pic of the elf could in theory have been built using the pic of the 16-year-old.

So this becomes the legal question: if your content of naked adults or elves or whatever has been produced using people under 18, will that be a problem?

This is not about what the AI image is, but what content was used to create the AI image.

At the moment I believe we have cases of firms claiming that they object to content they shot being used.

Let's be honest, AI is going to be a dream for lawyers, who will see this as easy money.

So as I say, you may see AI content as cheap now, but it may not be once you are having to go back and forth to court defending the AI system you have used.


I'm not a UK attorney, and I don't know if DVTimes is or isn't since he won't answer...

But I don't agree with his legal analysis for US law. He could be correct for UK law though... IDK.

I do agree that AI will result in a lot of litigation. So if you're an AI content producer you should be creating a litigation fund from the beginning to prepare for when/if you are sued.

DVTimes 04-12-2023 08:05 AM

Quote:

Originally Posted by cordoba (Post 23122994)
But this is an argument about data laws, and whether they will cover AI porn, and if they cover AI porn, they will cover all AI generated images, and probably any LLM AI collecting data.

So your original point about people 'not happy that their image is used on a porn site' is not relevant.

If a pornstar says she modelled her hairstyle on 17 year old Miley Cyrus, do you think this would create an issue of her nude images contravening cp laws?

I think you are getting confused.

I will make this simple.

Think this way.

If I took a book, such as Harry Potter and the Deathly Hallows, and changed it a bit, would I get sued? Probably.

Now what if I mixed it with 20 books? I would still probably get sued.

Now take all the books ever written.

I produce a brand new book that has a new story and a different style from Harry Potter and the Deathly Hallows.

Now within the new AI book is still the DNA of Harry Potter and the Deathly Hallows.

Now along comes J. K. Rowling with her team of lawyers and says: while your AI book may have no resemblance to Harry Potter and the Deathly Hallows, we are going to state that your AI system used J. K. Rowling's books as a reference at some stage, and we are claiming that the use of her technique and skills is itself a copyright infringement.

And so we put it to the AI firm that they have infringed upon J. K. Rowling's skills, techniques and abilities, as well as the DNA of her work.

So when it comes to porn, my argument would not be that the pic on your site looks like a pic of a friend I photographed, but the fact that you have the photograph in your database as a reference. You have the DNA of my work in everything you produce.

DVTimes 04-12-2023 08:08 AM

Quote:

Originally Posted by pornlaw (Post 23123029)
I do agree that AI will result in a lot of litigation. So if you're an AI content producer you should be creating a litigation fund from the beginning to prepare for when/if you are sued.


Yes

I suspect that this will be tested out in court a lot.

You're going to, I suspect, find lots of legal firms developing legal arguments about copyright.

I suspect they will also target those with the least cash, as those will be easier cases to win.

DVTimes 04-12-2023 08:19 AM

Quote:

Originally Posted by pornlaw (Post 23122536)
Every data set for AI is built on photos, and in those data sets are photos of children and infants, so yes, it is possible for it to generate CSAM.

In truth I am not technically minded.

I looked up what CSAM is.

It says this is child sexual abuse material.

So I am not clear about your statement.

I presume that someone could use AI to produce illegal content with ease.

Unless your statement means that the content is not real and so no one is harmed; in the same way, I could tell AI to produce a pic of me with my head being cut off, when in reality I have not been harmed.

Although my thread is not about producing illegal content, this will also be a problem with AI: you could in fact produce thousands of pieces of illegal content within a few minutes and flood the internet.

We have already seen the Pope wearing a white coat and people believing it.

One problem will be that you could produce pics of people you do not like doing illegal activities (such as selling drugs) and post them on the net.

Indeed, the courts may soon have a problem, as people will simply claim that video and photo evidence is not real and is AI generated.

Indeed, how long before someone murders someone, the evidence is mobile phone footage, and the guilty person just says it is AI generated? In the UK you cannot convict someone if there is reasonable doubt about guilt.

2MuchMark 04-12-2023 08:19 AM

Quote:

Originally Posted by DVTimes (Post 23122983)
I think some of you are not understanding what my point was.

It is not about what the AI pics look like; it is about what images the AI used to create the AI image.

For example, you may have produced a naked elf. Now the elf may not look like any human, but the question will be what content was used by the AI machine.

Interesting question. In my opinion, the images utilized are an integral component of the training model. This raises the question of how an AI that learns to generate images based on other images differs from a human who does the same.

Both humans and AI can recognize various body parts such as eyes, hair, arms, and legs, and have the ability to understand their appearances. So in this case, what is the difference?

2MuchMark 04-12-2023 08:22 AM

Quote:

Originally Posted by DVTimes (Post 23122464)
One problem for those who are considering using AI to make porn is going to be that people may take legal action against you.

A lot of this AI seems to be based on online pics (possibly vid too).

I can see this may become a legal problem, as people are not going to want an image of themselves used on a pornsite, and in time I suspect we will see legal firms set up just to contact pornsites and question what system you used and what content was used to produce the AI content.

I suspect we may see new laws where AI producers will need to provide evidence that the content they use has been obtained by people giving consent.

While I can see a lot of people seeing AI as a cheap and easy way to get content for a porn site, it may end up turning into costly legal battles.

Not to mention that the content AI produces may be built using pics of people under 18, and this too is going to create legal problems. By this I mean you may have an image produced of someone aged 30, but to create this image it may be built from millions of pics, and even if just one pic used is of someone under 18, then perhaps this could be a legal problem. I could see laws also requiring that you can provide evidence that all AI porn produced has been built on people 18 or older, who all gave consent.


I copy-pasted your post into ChatGPT with the prompt "What do you think?" and it gave me the following:
As an AI language model, I don't have personal opinions. However, I can provide insights on the topic you presented.

The concern you raised regarding the use of AI to generate images and videos without consent is a valid one. It is possible that legal action could be taken against those who use AI to create content based on images or videos of individuals who have not given their consent. This could lead to costly legal battles, as well as damage to the reputation of the individuals or organizations involved.

Furthermore, the use of images of minors in the creation of AI-generated content could also pose legal challenges. If even a single image used in the creation of the content features a minor, this could be considered a violation of privacy laws.

It is possible that new laws and regulations may be put in place to address these concerns, requiring AI producers to provide evidence that the content they use has been obtained with consent and only from individuals 18 years or older. This would help to protect the privacy rights of individuals and prevent legal issues from arising.

In conclusion, while AI-generated content may offer a cheap and easy way to produce images and videos, it is important to consider the legal implications of using this technology without proper consent and safeguards in place.

DVTimes 04-12-2023 10:04 AM

Quote:

Originally Posted by 2MuchMark (Post 23123042)
I copy-pasted your post into ChatGPT with the prompt "What do you think?" and it gave me the following:
As an AI language model, I don't have personal opinions. However, I can provide insights on the topic you presented.

The concern you raised regarding the use of AI to generate images and videos without consent is a valid one. It is possible that legal action could be taken against those who use AI to create content based on images or videos of individuals who have not given their consent. This could lead to costly legal battles, as well as damage to the reputation of the individuals or organizations involved.

Furthermore, the use of images of minors in the creation of AI-generated content could also pose legal challenges. If even a single image used in the creation of the content features a minor, this could be considered a violation of privacy laws.

It is possible that new laws and regulations may be put in place to address these concerns, requiring AI producers to provide evidence that the content they use has been obtained with consent and only from individuals 18 years or older. This would help to protect the privacy rights of individuals and prevent legal issues from arising.

In conclusion, while AI-generated content may offer a cheap and easy way to produce images and videos, it is important to consider the legal implications of using this technology without proper consent and safeguards in place.



Basically what I have said.

cordoba 04-12-2023 11:04 PM

Quote:

Originally Posted by DVTimes (Post 23123038)
In truth I am not technically minded.

One problem will be that you could produce pics of people you do not like doing illegal activities (such as selling drugs) and post them on the net.

Indeed, the courts may soon have a problem, as people will simply claim that video and photo evidence is not real and is AI generated.

Indeed, how long before someone murders someone, the evidence is mobile phone footage, and the guilty person just says it is AI generated? In the UK you cannot convict someone if there is reasonable doubt about guilt.

Yes, in a few years time it will be impossible to tell whether any media is real or AI. Even whether the presenter reading the news to you is AI generated or not.

As it is, there are probably millions of people who genuinely believe that the war in the Ukraine is entirely staged.

Even if AI only improves relatively little over the next decade, the world is still going to be absolutely transformed by it.

The Porn Nerd 04-13-2023 05:34 AM

Quote:

Originally Posted by cordoba (Post 23123421)
Yes, in a few MONTHS time it will be impossible to tell whether any media is real or AI. Even whether the presenter reading the news to you is AI generated or not.

As it is, there are probably millions of people who genuinely believe that the war in the Ukraine is entirely staged.

Even if AI only improves relatively little over the next decade, the world is still going to be absolutely transformed by it.

Fixed it for ya. :thumbsup


All times are GMT -7. The time now is 06:30 PM.

Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2025, vBulletin Solutions, Inc.