Quote:
Originally Posted by SAC
On sites where I don't need to exclude anything, I don't use them.
Same here... But now I got an email from Google Webmaster Tools:
"Over the last 24 hours, Googlebot encountered 4 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%."
Why did I get it now? There has been no robots.txt file on the site for months, yet this message arrived today...
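
From what I understand, a plain 404 for robots.txt is harmless (Googlebot treats a missing file as "crawl everything"), but a 5xx response, a timeout, or a connection failure counts as a robots.txt error and makes Googlebot postpone the crawl, which matches the wording of that email. So the question is what the server is actually returning. Here is a quick Python sketch to check that (www.example.com is a placeholder for the real domain):

import urllib.request
import urllib.error

# Placeholder URL; substitute your own site here.
url = "https://www.example.com/robots.txt"

try:
    with urllib.request.urlopen(url, timeout=10) as resp:
        # 200 means the file is being served normally.
        print("Status:", resp.status)
except urllib.error.HTTPError as e:
    # 404 is fine: Googlebot reads a missing robots.txt as "allow all".
    # 5xx means the server is erroring, and Googlebot will postpone the crawl.
    print("HTTP error:", e.code)
except urllib.error.URLError as e:
    # DNS or connection failures also register as robots.txt errors to Googlebot.
    print("Connection error:", e.reason)

If this prints a 5xx code or a connection error instead of a clean 404, that would explain the 100% error rate even with no robots.txt file present.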