GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   Best Way to Prevent Image Scraping? (https://gfy.com/showthread.php?t=1096834)

XSAXS 01-18-2013 08:43 PM

Best Way to Prevent Image Scraping?
 
Building a new picture post site (of sorts). Wondering if there are any solid strategies to diminish the effectiveness of image scraper bots?

It seems blocking non-browser user agents is a waste of time, since bots can spoof any browser user-agent string they want.

I'd rather not hash all the file paths, as that seems like it would be a huge added burden on the server.

So if we're still allowed to talk about business here... I'm fishing for ideas.

Supz 01-18-2013 08:44 PM

use videos.

pornmasta 01-18-2013 08:57 PM

swf format for pictures?
some javascript that cloaks the real source code?

acctman 01-19-2013 01:42 AM

can't stop screenshots... just watermark the image and be done with it

JamesM 01-19-2013 01:46 AM

idk if this would work or not, but the same technique is used to prevent video file hotlinking on many of the big tubes:

lighttpd with tokens for image serving.
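(Presumably that means lighttpd's mod_secdownload: the application builds a short-lived signed URL and the server refuses anything expired or mis-signed, so harvested image links go dead before a ripper gets around to downloading them. A minimal sketch of the URL generation in Python, assuming the classic MD5 token scheme and placeholder values for the secret and prefix:)

Code:

import hashlib
import time

# Hypothetical lighttpd config this would pair with:
#   secdownload.secret        = "change-me"
#   secdownload.uri-prefix    = "/protected/"
#   secdownload.document-root = "/srv/images"
#   secdownload.timeout       = 600        # links die after 10 minutes
SECRET = "change-me"
URI_PREFIX = "/protected/"

def tokenized_url(rel_path: str) -> str:
    """Build <uri-prefix><md5(secret + rel_path + hex_ts)>/<hex_ts><rel_path>,
    the URL form mod_secdownload's classic MD5 mode expects."""
    hex_ts = "%08x" % int(time.time())
    token = hashlib.md5((SECRET + rel_path + hex_ts).encode()).hexdigest()
    return f"{URI_PREFIX}{token}/{hex_ts}{rel_path}"

# Embed this in the page template instead of a static image path:
print(tokenized_url("/galleries/123/full/001.jpg"))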

DWB 01-19-2013 05:12 AM

While screen recording is a royal pain in the ass and time consuming for video, it literally takes one second to screenshot a photo.

As was said before, watermark it in a good spot (randomly placed, perhaps) and move on.
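(For anyone scripting the watermarking step, here's a rough sketch of random-position watermarking with Pillow -- the file names and opacity are placeholders:)

Code:

import random
from PIL import Image, ImageEnhance

def watermark(photo_path: str, mark_path: str, out_path: str, opacity: float = 0.5) -> None:
    """Paste a semi-transparent logo at a random spot so croppers can't
    count on it always sitting in the same corner."""
    photo = Image.open(photo_path).convert("RGBA")
    mark = Image.open(mark_path).convert("RGBA")

    # Fade the watermark to the requested opacity
    alpha = ImageEnhance.Brightness(mark.getchannel("A")).enhance(opacity)
    mark.putalpha(alpha)

    # Pick a random position that keeps the mark fully inside the photo
    x = random.randint(0, max(photo.width - mark.width, 0))
    y = random.randint(0, max(photo.height - mark.height, 0))

    photo.paste(mark, (x, y), mark)
    photo.convert("RGB").save(out_path, quality=90)

watermark("original.jpg", "logo.png", "watermarked.jpg")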

fris 01-19-2013 09:44 AM

if it was hotlinking i would say htaccess; i'm sure something can be done to stop curl or wget too
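(Something like this in .htaccess covers both angles -- refusing the default user agents of the common fetch tools and blocking hotlinked images. example.com is a placeholder, and as noted above a determined bot will just spoof a browser UA:)

Code:

RewriteEngine On

# Refuse the default user agents of common fetch tools (trivially spoofed)
RewriteCond %{HTTP_USER_AGENT} (curl|wget|libwww|python-requests|scrapy) [NC]
RewriteRule . - [F,L]

# Basic hotlink protection: only serve images to your own pages
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(jpe?g|png|gif)$ - [F,L]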

mafia_man 01-19-2013 11:04 AM

Quote:

Originally Posted by acctman (Post 19432690)
can't stop screenshots... just watermark the image and be done with it

This....

XSAXS 01-19-2013 12:48 PM

As stated in the original post -- I'm specifically focused on IMAGE SCRAPER BOTS. I'm not at all worried about the average user who wants to save individual images to his hard drive.

Of course I know there's no bullet-proof method to prevent someone saving images. That's not the point. (!)

I just want to prevent robo site-ripping. Or at least make it such a pain in the ass that the ripper will move on to some other site instead.

I'm looking into throttling page views -- allowing each visitor only 30 page views per minute. The average surfer would never reach that threshold, but it would frustrate a lot of scrapers.
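(For what it's worth, a bare-bones sketch of that kind of throttle as a Flask before_request hook with an in-memory sliding window -- the 30-per-minute figure is just the number above, and a shared store like Redis would be needed once there's more than one worker process:)

Code:

import time
from collections import defaultdict, deque

from flask import Flask, abort, request

app = Flask(__name__)

WINDOW_SECONDS = 60
MAX_PAGE_VIEWS = 30                 # per client IP, per window
recent_hits = defaultdict(deque)    # ip -> timestamps of recent page views

@app.before_request
def throttle():
    now = time.time()
    hits = recent_hits[request.remote_addr]
    # Forget page views that have fallen outside the window
    while hits and hits[0] < now - WINDOW_SECONDS:
        hits.popleft()
    if len(hits) >= MAX_PAGE_VIEWS:
        abort(429)                  # Too Many Requests
    hits.append(now)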

Also... serving the largest-sized images with javascript instead of the usual <img src=""> would discourage most scrapers.

I'd also love to build a killer bot trap... I'm reading up on http://projecthoneypot.org right now.
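(On the bot-trap idea: the classic DIY version is a URL no human ever visits -- link to it with a CSS-hidden anchor, disallow it in robots.txt, and ban any client that fetches it anyway. A rough sketch, with the route name and in-memory ban list made up for illustration:)

Code:

from flask import Flask, abort, request

app = Flask(__name__)
banned_ips = set()      # in production: Redis, the firewall, fail2ban, etc.

@app.before_request
def refuse_banned():
    if request.remote_addr in banned_ips:
        abort(403)

# The trap: reachable only through a CSS-hidden link and listed as
# "Disallow: /trap/" in robots.txt, so humans and polite crawlers never hit it.
@app.route("/trap/full-gallery.html")
def bot_trap():
    banned_ips.add(request.remote_addr)
    abort(403)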

fris 01-19-2013 12:51 PM

i would use tokens, mod_auth_token is good, so it generates a fresh url every x amount of seconds or mins.

XSAXS 01-19-2013 01:08 PM

Quote:

Originally Posted by fris (Post 19433350)
i would use tokens, mod_auth_token is good, so it generates a fresh url every x amount of seconds or mins.

Hmm. I've never heard of mod_auth_token, Fris. Will read up on it right now. Thank you for the pointer.

fris 01-19-2013 01:17 PM

Quote:

Originally Posted by XSAXS (Post 19433373)
Hmm. I've never heard of mod_auth_token, Fris. Will read up on it right now. Thank you for the pointer.

http://code.google.com/p/mod-auth-token/

here you go ;)

XSAXS 01-19-2013 01:20 PM

Yep, reading it now. Thanks again, Fris. Have you used it?

fris 01-19-2013 10:35 PM

Quote:

Originally Posted by XSAXS (Post 19433384)
Yep, reading it now. Thanks again, Fris. Have you used it?

yes i have used it. i also use my own expiring-link script, which does the same thing.

