Old 07-30-2013, 11:25 AM  
EN1GMA
Simple question about robots.txt

Hi,

Maybe someone can help me with a little question I have about robots.txt.

Doing a site:domain.com search on google.com, I found many results that I don't want to appear.
All of the links look like this:
http://www.domain.com/foldername/filename.php

I added this to my robots.txt; please let me know if it is correct:
Code:
User-agent: *
Sitemap: http://www.domain.com/sitemap.xml
Disallow: /foldername/
Disallow: /foldername/*.php$
Is it better to use both of them, or is one of these lines enough?
If one is enough, which line is better?
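
In case it helps, I tried to test the plain folder rule locally with Python's built-in robotparser (just a quick sketch; the domain and folder names are placeholders for my real ones, and as far as I know this parser does not understand Google's * and $ wildcards, so I could only check the simple Disallow line this way):
Code:
# Quick local check of which URLs a simple Disallow rule blocks.
# Note: urllib.robotparser follows the original robots.txt spec,
# so the wildcard rule (/foldername/*.php$) cannot be tested here.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /foldername/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch() returns False for URLs a crawler should skip
for url in [
    "http://www.domain.com/foldername/filename.php",
    "http://www.domain.com/foldername/other.html",
    "http://www.domain.com/index.html",
]:
    print(url, parser.can_fetch("*", url))
With the plain folder rule, the first two URLs come back as blocked and the last one as allowed, so it seems that line alone already covers the .php files too. Not sure if that is the recommended way, though.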

After doing this, is it better to use Webmaster Tools to ask Google to remove those links from the search results?
I think they have that option, right? If it is a good idea, can you please walk me through the steps I need to follow?

Thank you guys!

Best regards!