12-19-2020, 12:31 AM
Living The Dream
Join Date: Jun 2009
Location: Inside a Monitor
Posts: 19,507
Quote:
Originally Posted by InfoGuy
I numbered the bullet points to make them easier to reference. This proposed bill is a clusterfuck.
(1) "Platforms hosting pornography" would likely include Twitter and Reddit, so if this bill was passed, they're likely to ban porn vs. dealing with the onerous compliance requirements.
Would "platforms hosting pornography" include porn related forums like GFY where actual images and videos are not hosted, but hotlinked?
(1a) "Require any user uploading a video" appears to exempt existing content and any images, from ID verification.
Most people will not be willing to give up their privacy, especially pirates. This could prevent lots of pirated videos from being uploaded to both tubes and file lockers.
(1b) "Require any user uploading a video" appears to exempt existing content and all images from consent forms.
Given the allegations the NYT made against PH about UA (underage) victims, it's surprising the bill requires only consent, not consent plus proof of age (18+).
Note that "appearing" is not the same as "performing". Consent forms would need to be provided for everyone appearing in Public Disgrace, Party Hardcore, or similar videos, regardless of whether they are clothed or performing sex acts.
(2) Good, let the victims sue for damages. Why aren't videos without consent also included?
(3) How would a platform verify the individual? Most victims want to remain anonymous. The plaintiffs in GDP were all Jane Does. Would a victim be expected to out themselves and provide personally identifiable info to potentially hundreds or thousands of porn sites that have unauthorized content? A UA victim certainly can't be expected to provide ID.
Why can't victims also request removal of nonconsensual images?
(4) I doubt this can work. Screen-capture software that circumvents download prohibitions is already widely available (see the first sketch after the quote).
(5) This would be very expensive to implement, and only the biggest players could afford it. "Staffed by the platform" seems to imply outsourcing isn't permitted.
How are phone attendants supposed to verify a stranger over the phone and match the caller to someone allegedly appearing in a video?
(5a) Is two hours even enough time for platforms to verify complaints before a takedown?
What if the flagged videos are legitimate, with 18+ performers who have given consent?
(6) Is this even possible? If the fingerprint is computed from the video data, wouldn't videos of different lengths or resolutions have different "fingerprints"? Would adding an overlay like a watermark change the "fingerprint"? (See the second sketch after the quote.)
Would platforms under common ownership, such as MindGeek's, have to implement software that blocks a video removed from one platform from being re-uploaded to a related platform?
(7) How would the FTC enforce US law against sites hosted outside the US, or run by persons or entities that are not US citizens, residents, or US-based/incorporated?
(8) See my comments on (3) above.
Who has access to this sensitive info? Compare this to the WHOIS database.
Would this database conflict with EU GDPR?
How would those few actually willing to provide personally identifiable info be protected from hackers?
Would it block access by companies looking to mine the data and resell it, like background-check vendors?
Would someone be able to type in the name of their coworker or neighbor to get info from the database?
Would the database draw the attention of stalkers and sexual predators?
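On (4): a minimal sketch of why download prohibitions are toothless. Anything that plays on screen can be captured client-side, and no server-side control can stop it. This assumes Pillow is installed (pip install Pillow) and ImageGrab.grab() is supported, which it is out of the box on Windows and macOS:

Code:
# Minimal sketch: client-side screen capture, which no server-side
# "download prohibition" can prevent.
import time
from PIL import ImageGrab

frames = []
for _ in range(30):                  # grab roughly 3 seconds of frames
    frames.append(ImageGrab.grab())  # full-screen screenshot
    time.sleep(0.1)                  # ~10 fps; crude but sufficient

# The "protected" stream is now a local file.
frames[0].save("capture.gif", save_all=True,
               append_images=frames[1:], duration=100)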
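On (6): fingerprinting is real technology (YouTube's Content ID and image-hash systems like PhotoDNA work on this principle), but the questions are fair. Perceptual hashes are computed from coarse visual structure, so a resolution change barely moves them, while a large opaque overlay moves them much more. Here's a minimal single-frame sketch, assuming the third-party imagehash library (pip install imagehash pillow); real video fingerprints hash many frames, and often the audio too:

Code:
# Single-frame perceptual hash ("fingerprint") demo.
from PIL import Image, ImageDraw
import imagehash

# Stand-in "frame": a vertical gradient with clear coarse structure.
frame = Image.linear_gradient("L").resize((640, 360)).convert("RGB")

# Same frame at a different resolution.
resized = frame.resize((1280, 720))

# Same frame with a large opaque overlay (worst-case watermark).
overlaid = frame.copy()
ImageDraw.Draw(overlaid).rectangle((0, 0, 640, 60), fill="white")

h = imagehash.average_hash(frame)
print(h - imagehash.average_hash(resized))   # Hamming distance ~0: resolution barely matters
print(h - imagehash.average_hash(overlaid))  # noticeably larger: big overlays do shift the hash

So the worry cuts both ways: matching across resolutions is feasible, but a determined re-uploader can still crop, flip, or overlay enough to push a video past the match threshold.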
Synopsis: good luck passing and enforcing this clusterfuck. 