Include a robots.txt file at the root of your site.
Open a text editor, type User-agent: * on the first line, then save the file as robots.txt and upload it to the root directory of your domain. This single rule, with no Disallow lines after it, tells every spider that visits your site that it is free to crawl every page.
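For example, a minimal allow-all robots.txt (assuming you have nothing you want to block yet) can be written like this; the empty Disallow line explicitly permits crawling of every path:

User-agent: *
Disallow: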
Because the search engine analyzes everything it indexes to determine what your website is about, it is a good idea to block folders and files that have nothing to do with the content you want analyzed. You can keep unrelated material from being crawled by adding a line such as "Disallow: /folder_name/" or "Disallow: /filename.html", as in the example below:
User-agent: *
Disallow: /cgi-bin/
Disallow: /img/

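In this example, every crawler is told to skip the /cgi-bin/ and /img/ folders, while the rest of the site remains open for indexing.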