GoFuckYourself.com - Adult Webmaster Forum (https://gfy.com/index.php)
-   Fucking Around & Business Discussion (https://gfy.com/forumdisplay.php?f=26)
-   -   Stopping Download Spiders and Saving (https://gfy.com/showthread.php?t=124744)

digihax 04-12-2003 04:34 AM

Stopping Download Spiders and Saving
 
Are there any scripts or anything out there that would stop someone from downloading your content? They could still view it, but when they try to actually save it, a different file would be returned instead.

I know this is possible as I have seen it firsthand. Anyone know how to do this? I would be interested in adding it to one of my websites.

High Quality 04-12-2003 04:36 AM

Quote:

Originally posted by digihax
Are there any scripts or anything out there that would stop someone from downloading your content? They could still view it, but when they try to actually save it, a different file would be returned instead.

I know this is possible as I have seen it firsthand. Anyone know how to do this? I would be interested in adding it to one of my websites.

Not sure about the file-switch thing, but there are certainly scripts out there that stop users from going over bandwidth limits set by the admin within a certain amount of time....

ServerGenius 04-12-2003 04:41 AM

A very easy trick that will reduce file saving a lot is to put your
pics in tables with the pic itself as the table background. Then put
a transparent 1-pixel gif in the table and stretch it to the same
size as the pic itself.

That way, when people right-click and save the pic, they download
the transparent pixel gif instead. At least it's a pain in the ass for
people to steal your pics; they have to spider your pages instead,
and you can limit the known spiders' access by using .htaccess.
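
Something like this is all it takes (the pic and gif names and sizes here are just placeholders, adjust to your own paths):

<!-- the real image is only the table background; the element people
     can right-click is the stretched 1x1 transparent gif -->
<table background="pics/pic01.jpg" width="600" height="800"
       cellpadding="0" cellspacing="0" border="0">
  <tr>
    <td><img src="images/blank.gif" width="600" height="800" alt=""></td>
  </tr>
</table>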

DynaMite :2 cents:

ServerGenius 04-12-2003 04:44 AM

Then add these to your .htaccess file

AuthUserFile /dev/null
AuthGroupFile /dev/null

# send known ripper / offline-browser user agents off to a popup page
RewriteEngine On
RewriteOptions inherit
RewriteCond %{HTTP_USER_AGENT} ^.*Iria.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Backweb.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*gotit.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Bandit.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Ants.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Buddy.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*WebZIP.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Crawler.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Wget.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Grabber.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*BlackWidow.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Sucker.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Downloader.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Siphon.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Collector.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Widow.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Snake.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Vacuum.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Pump.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Reaper.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Mag-Net.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Memo.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*pcBrowser.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*SuperBot.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*leech.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Offline.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Copier.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Mirror.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*HMView.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*HTTrack.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*JOC.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*likse.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Recorder.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*GrabNet.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Likse.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Navroad.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*attach.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Magnet.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Surfbot.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Whacker.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Stripper.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Snagger.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Teleport.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Go!Zilla.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Go-Ahead-Got-It.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*FileHound.*$
RewriteRule .* http://www.domain.com/popuphell.php [R,L]
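
If you prefer, you can also collapse the whole list into one case-insensitive condition, something like this (same placeholder redirect target, trim the agent list to taste):

RewriteEngine On
# [NC] makes the match case-insensitive, so likse/Likse etc. collapse into one entry
RewriteCond %{HTTP_USER_AGENT} (Wget|HTTrack|WebZIP|Teleport|Go!Zilla|BlackWidow|Offline|Crawler|Grabber|Downloader|Stripper|Vacuum|Reaper|Leech|Mirror|Copier|SuperBot|FileHound) [NC]
RewriteRule .* http://www.domain.com/popuphell.php [R,L]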


DynaMite :thumbsup

High Quality 04-12-2003 04:44 AM

Quote:

Originally posted by DynaSpain
A very easy trick that will reduce file saving a lot is to put your
pics in tables with the pic itself as the table background. Then put
a transparent 1-pixel gif in the table and stretch it to the same
size as the pic itself.

That way, when people right-click and save the pic, they download
the transparent pixel gif instead. At least it's a pain in the ass for
people to steal your pics; they have to spider your pages instead,
and you can limit the known spiders' access by using .htaccess.

DynaMite :2 cents:

Curious, have you got a list or know of a list of known spiders?

ServerGenius 04-12-2003 04:46 AM

Yeah, I just posted a bunch of them.... there are more, but these
are the ones that are most used..... as I said, it's not bulletproof,
but it will certainly block most of them and make it too much
hassle for most people to bother..... they will just pick
the next site that is easier to rip.

DynaMite :winkwink:

High Quality 04-12-2003 04:48 AM

Quote:

Originally posted by DynaSpain
Yeah, I just posted a bunch of them.... there are more, but these
are the ones that are most used..... as I said, it's not bulletproof,
but it will certainly block most of them and make it too much
hassle for most people to bother..... they will just pick
the next site that is easier to rip.

DynaMite :winkwink:

heh, yeah, I posted before your 2nd post. Thanks man.

DarkJedi 04-12-2003 04:52 AM

DynaSpain, if you put all that in your .htaccess, won't it overload the server? It would be checking every visitor against each of these lines.

ServerGenius 04-12-2003 05:04 AM

Quote:

Originally posted by DarkJedi
DynaSpain, if you put all that in your .htaccess, won't it overload the server? It would be checking every visitor against each of these lines.

No, it's not that much load, as the .htaccess checks run against the
request headers, which are very small HTTP packets, only bytes in
size. I've got this .htaccess setup in place and do quite a few million
hits (requests) per day.

I do use dedicated servers and have tweaked Apache quite a bit.
For example, I increased HARD_SERVER_LIMIT in httpd.h in order
to run 2048 Apache children instead of the default 256, and I have 1 GB of
RAM in each box, as Apache loves memory.

If anyone wants a copy of my httpd.conf, hit me up on
ICQ and I can help you out.
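
Roughly what that change looks like on an Apache 1.3 source tree (the 2048 figure is the one from above, the rest is just a sketch):

/* src/include/httpd.h -- raise the compile-time ceiling, then rebuild Apache */
#define HARD_SERVER_LIMIT 2048

# httpd.conf -- then raise the runtime cap up to that ceiling
MaxClients 2048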

DynaMite :thumbsup

High Quality 04-12-2003 05:07 AM

Quote:

Originally posted by DynaSpain


No, it's not that much load, as the .htaccess checks run against the
request headers, which are very small HTTP packets, only bytes in
size. I've got this .htaccess setup in place and do quite a few million
hits (requests) per day.

I do use dedicated servers and have tweaked Apache quite a bit.
For example, I increased HARD_SERVER_LIMIT in httpd.h in order
to run 2048 Apache children instead of the default 256, and I have 1 GB of
RAM in each box, as Apache loves memory.

If anyone wants a copy of my httpd.conf, hit me up on
ICQ and I can help you out.

DynaMite :thumbsup

Which is?

ServerGenius 04-12-2003 05:10 AM

UIN# 370820

I might be gone as I'm about to head out for a while.... but I will
be back later (3 hours or so). Just leave me a message and I'll
get back to you.

DynaMite :thumbsup

High Quality 04-12-2003 05:17 AM

Quote:

Originally posted by DarkJedi
DynaSpain, if you put all that in your .htaccess, won't it overload the server? It would be checking every visitor against each of these lines.

I just added this list to my .htaccess configs and it noticeably dropped my CPU idle %. However, I still have plenty of idle left on a P4 1.8 GHz w/ 1 GB RAM.

funkmaster 04-12-2003 05:23 AM

"DynaSpain, if you put all that in your htaccess wont it overload the server ? It would be auto checking every visitor to each of these lines."

... 2048 server limit and the 1 GB ram does the trick here, if you donīt have that I wouldnīt use the htaccess shown above, it will take your box down.

on the other hand a robots.txt file comes in handy too. but this way you will have to rely on the spiders actually checking your robots.txt file ... and most of those offline browsers allow users to ignor the robots exclusions ...
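
A bare-bones robots.txt along those lines looks something like this (the agent names are only examples):

# block specific offline browsers / spiders by name
User-agent: HTTrack
Disallow: /

User-agent: WebZIP
Disallow: /

# everything else stays allowed
User-agent: *
Disallow: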

