Welcome to the GoFuckYourself.com - Adult Webmaster Forum forums.

Old 11-28-2022, 06:50 AM   #1
TheLegacy
SEO Connoisseur
 
TheLegacy's Avatar
 
Join Date: Apr 2003
Location: Brantford, Ontario
Posts: 16,428
Sharing pornographic deepfakes to be illegal in England and Wales

It's one thing to have stolen videos of celebrities - it's another to create a video using a celebrity's face, or the face of someone you want to target. I've seen some deepfakes, and while most don't impress me, there are a few that simply blow my mind with how lifelike they appear. Now individuals have to fight just to keep their face and image from being used improperly, which is a far more difficult task than we might think. Just think of Vegas and all those bait-and-switch escort ads that use images of celebrities or pornstars who never show up at the door.

--------------------------------------------------



A planned new law would make sharing pornographic deepfakes without consent a crime in England and Wales.

Tackling the rise in manipulated images, where a person's face is put on someone else's body, is part of a crackdown on the abuse of intimate pictures in the Online Safety Bill.

This law would also make it easier to charge people with sharing intimate photos without consent.

Prosecutors would no longer need to prove they intended to cause distress.

In some cases under the existing law, men have admitted sharing women's intimate images without consent, but have not been prosecuted because they said they did not intend to cause any harm.

The government says around one in 14 adults in England and Wales say they have been threatened with their intimate images being shared against their will.

It also says there are growing global concerns about technology being used to create fake pornographic images and video, with one website which creates nude images from clothed ones receiving 38 million visits last year.

In August, BBC Panorama exposed a network of men on the social media site Reddit who traded women's nudes online - including some which had been faked - as well as harassing and threatening the women.

The Law Commission said reporting such as this, along with campaigners' calls for stronger laws, helped to make a "compelling case" to government for reform.

It outlined recommendations earlier this year to ensure all examples of deliberately taking or sharing intimate images without consent are illegal.

One deepfake porn creator told the BBC earlier this year that the risk of prosecution could make him stop.

"If I could be traced online I would stop there and probably find another hobby," he said.

The Ministry of Justice also said it was looking at whether it could give the victims of intimate image abuse the same anonymity as the victims of sexual offences are granted, in line with the Law Commission's recommendations.

Prof Clare McGlynn at Durham University, an expert in image-based sexual abuse, told the BBC the changes were "a testament to the courage of women who have been speaking up", but added that it was "absolutely vital that anonymity is granted immediately".

"Victims tell us that not being able to be anonymous means they are more reluctant to report to the police and it means cases are more often dropped," she said.


https://www.bbc.com/news/technology-63669711



__________________
SEO Connoisseur


Microsoft Teams: Robert Warren SEO
Telegram: @TheLegacy54
RobertWarrenSEO.com
Old 12-19-2022, 04:10 PM   #2
$5 submissions
I help you SUCCEED
 
$5 submissions's Avatar
 
Join Date: Nov 2003
Location: The Pearl of the Orient Seas
Posts: 32,195
Any progress on using AI to simulate completely made up actors?
Old 12-19-2022, 06:58 PM   #3
blackmonsters
Making PHP work
 
blackmonsters's Avatar
 
Join Date: Nov 2002
Location: 🌎🌅🌈🌇
Posts: 20,230
Quote:
Originally Posted by $5 submissions View Post
Any progress on using AI to simulate completely made up actors?
I don't see much value in doing that, because the known face is the main attraction.

__________________
Make Money with Porn
Old 12-20-2022, 04:07 AM   #4
eric_wahlberg
Confirmed User
 
eric_wahlberg's Avatar
 
Join Date: Dec 2015
Posts: 604
Quote:
Originally Posted by TheLegacy View Post
It's one thing to have stolen videos of celebrities - it's another to create a video using a celebrity face OR someone you want to target. [...]

I am in absolute support of this law. Many people are becoming victims of sextortion. At least this bill will be able to protect them.
Old 12-20-2022, 06:15 AM   #5
2MuchMark
Videochat Solutions
 
2MuchMark's Avatar
 
Industry Role:
Join Date: Aug 2004
Location: Canada
Posts: 48,573
Quote:
Originally Posted by blackmonsters View Post
I don't see much value in doing that, because the known face is the main attraction.

Faces are quickly climbing out of the uncanny valley. I expect to be unable to tell CG faces from real faces within a year.
__________________

VideoChat Solutions | Custom Software | IT Support
https://www.2much.net | https://www.lcntech.com