Quote:
Originally Posted by MrBaldBastard
Very scary bill they're trying to rapidly push through.
https://www.sasse.senate.gov/public/...ion-act-3-.pdf
Key components of the legislation:
(1) Require platforms hosting pornography to, within two weeks of enactment:
(a) Require any user uploading a video to the platform verify their
identity
(b) Require any user uploading a video to the platform also upload a
signed consent form from every individual appearing in the video.
(2) Creates a private right of action against an uploader who uploads a
pornographic image without the consent of an individual featured in the
image.
(3) Require platforms hosting pornography include a notice or banner on the
website instructing how an individual can request removal of a video if an
individual has not consented to it being uploaded on the platform.
(4) Prohibit video downloads from these platforms, to be in place within three
months of enactment of this legislation.
(5) Require platforms hosting pornography offer a 24-hour hotline staffed by
the platform. Individuals who contact the hotline can request removal of a
video that has been distributed without their consent.
(a) Require removal of flagged videos as quickly as possible, but not to
exceed 2 hours.
(6) Require platforms to use software to block a video from being reuploaded after its removal. The platforms must have this software in place
within six months of enactment of this legislation.
(7) Directs the Federal Trade Commission to enforce violations of these
requirements.
(8) Creates a database of individuals that have indicated they do not
consent. The database must be checked against before new content can be
uploaded to the platforms.
(a) Instructs the Department of Justice to promulgate rules on where this
database should be housed, and determine how to connect these
victims with services, to include counseling and casework.
(b) Failure to comply with this requirement will result in a civil penalty to
the platform, with proceeds going towards victims services.
I numbered the bullet points to make them easier to reference. This proposed bill is a clusterfuck.
(1) "Platforms hosting pornography" would likely include Twitter and Reddit, so if this bill were passed, they would likely ban porn outright rather than deal with the onerous compliance requirements.
Would "platforms hosting pornography" include porn-related forums like GFY, where the actual images and videos are not hosted but hotlinked?
(1a) "Require any user uploading a video" appears to exempt existing content, as well as all images, from ID verification.
Most people will not be willing to give up their privacy, especially pirates. This could prevent lots of pirated videos from being uploaded, both to tubes and filelockers.
(1b) "Require any user uploading a video" appears to exempt existing content, as well as all images, from consent forms.
Given the allegations made by the NYT against PH about UA victims, it's surprising the bill requires only consent, not consent plus proof of being 18+.
Note the word "appearing" is not the same as "performing". Consent forms would need to be provided for everyone appearing in Public Disgrace, Party Hardcore or similar videos regardless of whether they are clothed or performing sex acts.
(2) Good, let the victims sue for damages. But why does this right of action cover only images and not videos uploaded without consent?
(3) How would a platform verify the individual? Most victims want to remain anonymous. The plaintiffs in GDP were all Jane Does. Would a victim be expected to out themselves and provide personally identifiable info to potentially hundreds or thousands of porn sites that have unauthorized content? A UA victim certainly can't be expected to provide ID.
Why can't victims also request removal of images uploaded without their consent, not just videos?
(4) I doubt this can work. Screen capture software that users can use to circumvent download prohibitions is already widely available.
(5) This is very expensive to implement, and only the biggest players will be able to afford it. "Staffed by the platform" seems to imply outsourcing isn't permitted.
How are phone attendants supposed to verify the identity of a stranger calling on the phone, or confirm that the caller actually appears in a given video?
(5a) Is 2 hours enough time for platforms to even verify complaints before a takedown?
What if the flagged videos are legitimate videos with performers who have given consent and are 18+?
(6) Is this even possible? If the software fingerprints the video data, wouldn't videos of different lengths or resolutions have different "fingerprints"? Would adding an overlay like a watermark change the "fingerprint"?
Would multiple platforms under common ownership, such as MindGeek, have to implement software that blocks a video removed from one platform from being uploaded to a related platform?
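For what it's worth, re-upload blocking in this space is usually done with perceptual hashes rather than exact file checksums, which is why a simple resize doesn't automatically defeat it while overlays can shift some bits. Here's a minimal toy sketch in plain Python (a simple "average hash" over a fake grayscale frame, with a fake watermark; purely illustrative, not any platform's actual software):

```python
def ahash(img, hash_size=8):
    """Average hash: shrink the image to hash_size x hash_size by block
    averaging, then set each bit by comparing its cell to the overall mean."""
    bh, bw = len(img) // hash_size, len(img[0]) // hash_size
    cells = []
    for by in range(hash_size):
        for bx in range(hash_size):
            block = [img[by * bh + y][bx * bw + x]
                     for y in range(bh) for x in range(bw)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return ''.join('1' if c > mean else '0' for c in cells)

def hamming(a, b):
    """Number of differing bits between two hash strings."""
    return sum(x != y for x, y in zip(a, b))

# Toy 64x64 grayscale "frame": a horizontal brightness gradient.
img = [[x * 4 for x in range(64)] for _ in range(64)]

# The same picture at half resolution: the hash comes out identical,
# so a plain resize does NOT change this kind of fingerprint.
small = [[x * 8 for x in range(32)] for _ in range(32)]

# The same picture with a bright 16x16 "watermark" in one corner:
# a few bits flip, so matching has to tolerate small hash distances.
marked = [row[:] for row in img]
for y in range(16):
    for x in range(16):
        marked[y][x] = 255

print(hamming(ahash(img), ahash(small)))   # 0: resize survives
print(hamming(ahash(img), ahash(marked)))  # nonzero: overlay shifts bits
```

Real systems presumably hash many frames or segments per video and match on subsequences, which is how they'd cope with trimmed or re-encoded copies, but the tradeoff above is the core of the question: too strict a match and watermarked re-uploads slip through, too loose and legitimate, consented videos get falsely blocked.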
(7) How would the FTC enforce US law for sites hosted outside the US or by persons/entities who are not US citizens/residents/based/incorporated?
(8) See comments from (3) above.
Who has access to this sensitive info? Compare this to the WHOIS database.
Would this database conflict with EU GDPR?
How would those few actually willing to provide personally identifiable info be protected from hackers?
Would it block access to companies looking to mine data and sell it like those who sell background checks?
Would someone be able to type in the name of their coworker or neighbor to get info from the database?
Would the database draw the attention of stalkers and sexual predators?