It's one thing to have stolen videos of celebrities - it's another to create a video using a celebrity's face, or the face of someone you want to target. I've seen some deepfakes, and while most don't impress me, there are a few that simply blow my mind at how lifelike they appear. Now individuals have to fight just to stop their face/image being used improperly, which is a far more difficult task than we may think. Just think of Vegas and all those bait-and-switch escort ads that use images of celebrities or porn stars who never show up at the door.
--------------------------------------------------
A planned new law would make sharing pornographic deepfakes without consent a crime in England and Wales.
Tackling the rise in manipulated images, where a person's face is put on someone else's body, is part of a crackdown on the abuse of intimate pictures in the Online Safety Bill.
This law would also make it easier to charge people with sharing intimate photos without consent.
Prosecutors would no longer need to prove that the person sharing the images intended to cause distress.
In some cases under the existing law, men have admitted sharing women's intimate images without consent, but have not been prosecuted because they said they did not intend to cause any harm.
The government says around one in 14 adults in England and Wales report having been threatened with their intimate images being shared against their will.
It also says there are growing global concerns about technology being used to create fake pornographic images and video, with one website which creates nude images from clothed ones receiving 38 million visits last year.
In August, BBC Panorama exposed a network of men on the social media site Reddit who traded women's nudes online - including some which had been faked - as well as harassing and threatening the women.
The Law Commission said reporting such as this, along with campaigners' calls for stronger laws, helped to make a "compelling case" to government for reform.
It outlined recommendations earlier this year to ensure all examples of deliberately taking or sharing intimate images without consent are illegal.
One deepfake porn creator told the BBC earlier this year that the risk of prosecution could make him stop.
"If I could be traced online I would stop there and probably find another hobby," he said.
The Ministry of Justice also said it was looking at whether it could give victims of intimate image abuse the same anonymity granted to victims of sexual offences, in line with the Law Commission's recommendations.
Prof Clare McGlynn at Durham University, an expert in image-based sexual abuse, told the BBC the changes were "a testament to the courage of women who have been speaking up", but added that it was "absolutely vital that anonymity is granted immediately".
"Victims tell us that not being able to be anonymous means they are more reluctant to report to the police and it means cases are more often dropped," she said.
https://www.bbc.com/news/technology-63669711
