Thread: Robots.txt
10-27-2008, 09:05 AM
Lycanthrope
Confirmed User
Join Date: Jan 2004
Location: Wisconsin
Posts: 4,517
In a nutshell, robots.txt tells spiders/crawlers which parts of a site NOT to crawl.
Not every robot obeys it, however.
User-agent: *
Disallow: /
This would tell ALL robots not to crawl any page on the site.
I recommend NOT using the example above.
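A more typical setup blocks only specific directories and leaves everything else open to crawling. Something like this (the directory names here are just placeholders for whatever you actually want hidden):
User-agent: *
Disallow: /cgi-bin/
Disallow: /admin/
Anything not listed under a Disallow line stays crawlable.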
__________________
Lycanthrope