Stopping Download Spiders and Saving
Are there any scripts or anything out there that would stop someone from downloading your content? They could still view it, but when they try to actually save it, they would get a different file returned.
I know this is possible, as I have seen it firsthand. Does anyone know how to do this? I would be interested in adding it to one of my websites.
Quote:
A very easy trick that will cut down on file-saving a lot is to put your pics in tables, with the pic itself as the table background. Then put a transparent 1-pixel gif in the table and stretch it to the same size as the pic. That way, when people right-click and save the pic, they download the transparent pixel gif instead. At least it makes it a pain in the ass for people to steal your pics; they have to spider your pages instead, and you can limit the known spiders' access by using .htaccess. DynaMite :2 cents:
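In the HTML of that era, the trick looks roughly like this. It is a minimal sketch: photo.jpg and blank.gif are placeholder filenames, with blank.gif being the 1-pixel transparent gif stretched to the picture's size.

<!-- The real picture only appears as the table background -->
<table background="photo.jpg" width="400" height="300" cellpadding="0" cellspacing="0" border="0">
  <tr>
    <td>
      <!-- The stretched transparent gif sits on top, so a right-click save grabs this file -->
      <img src="blank.gif" width="400" height="300" alt="">
    </td>
  </tr>
</table>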
Then add these rules to your .htaccess file:

AuthUserFile /dev/null
AuthGroupFile /dev/null
RewriteEngine On
RewriteOptions inherit
RewriteCond %{HTTP_USER_AGENT} ^.*Iria.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Backweb.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*gotit.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Bandit.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Ants.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Buddy.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*WebZIP.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Crawler.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Wget.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Grabber.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*BlackWidow.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Sucker.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Downloader.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Siphon.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Collector.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Widow.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Snake.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Vacuum.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Pump.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Reaper.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Mag-Net.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Memo.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*pcBrowser.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*SuperBot.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*leech.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Offline.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Copier.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Mirror.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*HMView.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*HTTrack.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*JOC.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*likse.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Recorder.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*GrabNet.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Likse.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Navroad.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*attach.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Magnet.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Surfbot.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Whacker.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*FileHound.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Stripper.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Offline.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Crawler.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Snagger.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Teleport.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Go!Zilla.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*Go-Ahead-Got-It.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*FileHound.*$
RewriteRule /* http://www.domain.com/popuphell.php [R,L]

DynaMite :thumbsup
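As a rough alternative sketch, the same blocklist can be condensed into a single case-insensitive condition, which keeps the file shorter and is slightly cheaper to evaluate on every request. The agent list here is abbreviated, and www.domain.com/popuphell.php is just the placeholder target from the rules above.

RewriteEngine On
# One condition covering several of the agents listed above, matched case-insensitively
RewriteCond %{HTTP_USER_AGENT} (Wget|HTTrack|WebZIP|Teleport|BlackWidow|Go!Zilla|Offline|Downloader) [NC]
RewriteRule .* http://www.domain.com/popuphell.php [R,L]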
Quote:
Yeah, I just posted a bunch of them... there are more, but these are the ones that are most used. As I said, it's not bulletproof, but it will certainly block most of them and make it too much hassle for most people to bother; they will just pick the next site that is easier to rip. DynaMite :winkwink:
Quote:
DynaSpain, if you put all that in your .htaccess, won't it overload the server? It would be checking every visitor against each of those lines.
Quote:
...requests, which are very small HTTP packets, only bytes in size. I've got this .htaccess setup and do quite a few million hits (requests) per day. I use dedicated servers and have tweaked Apache quite a bit; for example, I increased HARD_SERVER_LIMIT in httpd.h in order to run 2048 Apache processes instead of the default 256, and I have 1 GB of RAM in each box, as Apache loves memory. If anyone wants a copy of my httpd.conf, hit me up on ICQ and I can help you out. DynaMite :thumbsup
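For context, the Apache 1.3-style tuning described there amounts to roughly the following. The 2048 figure comes from the post itself; the file location and the rebuild step are assumptions about a typical setup of that era.

# src/include/httpd.h (Apache 1.3): raise the compile-time ceiling, then rebuild Apache
#define HARD_SERVER_LIMIT 2048

# httpd.conf: allow Apache to actually spawn that many child processes at run time
MaxClients 2048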
Quote:
UIN# 370820
I might be gone, as I'm about to head out for a while... but I will be back later (3 hours or so). Just leave me a message and I'll get back to you. DynaMite :thumbsup
Quote:
"DynaSpain, if you put all that in your htaccess wont it overload the server ? It would be auto checking every visitor to each of these lines."
... 2048 server limit and the 1 GB ram does the trick here, if you donīt have that I wouldnīt use the htaccess shown above, it will take your box down. on the other hand a robots.txt file comes in handy too. but this way you will have to rely on the spiders actually checking your robots.txt file ... and most of those offline browsers allow users to ignor the robots exclusions ... |
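A robots.txt along those lines might look like this. The agent names are just examples of the rippers mentioned above, and as noted, it only helps against spiders that actually honour it.

# robots.txt - honoured only by well-behaved spiders
User-agent: HTTrack
Disallow: /

User-agent: WebZIP
Disallow: /

User-agent: Wget
Disallow: /

# Everything else may crawl normally
User-agent: *
Disallow: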