Welcome to the GoFuckYourself.com - Adult Webmaster Forum forums.
Old 10-27-2008, 08:57 AM   #1
BigPimpCash
Confirmed User
 
Join Date: Jul 2006
Posts: 808
Robots.txt

I'm looking into the SEO of my sites and someone mentioned a robots.txt file that I should have? I'm of course going to speak to my best friend Mr Google... but I wondered if anyone could take 5 minutes to explain what it is and what it should consist of?

__________________

BIG PIMP CASH - Upto 70% Revshare and Upto 30$ PPS
[email protected] - ICQ - 197119155

Need Content ? TRASHY CONTENT
Old 10-27-2008, 09:05 AM   #2
Lycanthrope
Confirmed User
 
Lycanthrope's Avatar
 
Industry Role:
Join Date: Jan 2004
Location: Wisconsin
Posts: 4,517
In a nutshell, robots.txt is used to tell spiders / crawlers what NOT to scan.

Not every robot obeys it, however.

User-agent: *
Disallow: /

Would tell ALL robots not to look at any pages.

I recommend NOT using the example above.
Old 10-27-2008, 09:11 AM   #3
The Duck
Adult Content Provider
 
The Duck's Avatar
 
Industry Role:
Join Date: May 2005
Location: Europe
Posts: 18,243
I tell the robots the address to my sitemap, done!
__________________
Skype Horusmaia
ICQ 41555245
Email [email protected]
Old 10-27-2008, 09:12 AM   #4
DutchTeenCash
I like Dutch Girls
 
DutchTeenCash's Avatar
 
Join Date: Feb 2003
Location: dutchteencash.com
Posts: 21,684
Google it and read some stuff; it can do a lot for your sites for sure
Old 10-27-2008, 09:23 AM   #5
BigPimpCash
Confirmed User
 
Join Date: Jul 2006
Posts: 808
Hmmm...

Quote:
Originally Posted by DutchTeenCash View Post
Google it and read some stuff; it can do a lot for your sites for sure
I am currently reading up on it... I'm a little confuzzled though... I want bots/crawlers to read my site, so does it matter if I don't have a robots.txt file? It says if I have an empty robots.txt file they will all take it that it's OK to enter... but if I don't have one, or have a blank one, does that mean they are less likely to crawl the site?

I understand, however, that some bots are undesirable... where is there a list of the bots that are not wanted?

Someone said some bots are bad bots that, for example, harvest emails, and others are site rippers... but it appears these are better blocked by rewriting your .htaccess file as opposed to editing the robots.txt file... so again I would ask: where do you find an up-to-date list of bad bots/site rippers to put into your .htaccess?
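On the .htaccess side, blocking is usually done by matching the User-Agent header with mod_rewrite. A minimal sketch, assuming Apache with mod_rewrite enabled; the bot names here are illustrative examples, not an authoritative or current list:

```apache
# Hypothetical .htaccess sketch: return 403 Forbidden to requests whose
# User-Agent matches a few commonly blocked bots (names are examples only).
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (EmailSiphon|EmailCollector|WebZip|HTTrack) [NC]
RewriteRule .* - [F,L]
```

Unlike robots.txt, this is enforced by the server, so a bot that ignores robots.txt still gets refused (though bots can also fake their User-Agent string).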
Old 10-27-2008, 09:29 AM   #6
TripleXPrint
Confirmed User
 
TripleXPrint's Avatar
 
Join Date: Apr 2007
Posts: 983
Robots.txt files are pretty much useless unless you're trying to get Google to not index a page. Google will crawl your site and index your content either way, so you can just skip the robots.txt file. You really don't need it.
__________________
Skype: Triplexprint
Old 10-27-2008, 09:31 AM   #7
Jdoughs
Confirmed User
 
Jdoughs's Avatar
 
Industry Role:
Join Date: Mar 2004
Location: Great White North
Posts: 5,794
Quote:
Originally Posted by BigPimpCash View Post
I am currently reading up on it... I'm a little confuzzled though... I want bots/crawlers to read my site, so does it matter if I don't have a robots.txt file? It says if I have an empty robots.txt file they will all take it that it's OK to enter... but if I don't have one, or have a blank one, does that mean they are less likely to crawl the site?

I understand, however, that some bots are undesirable... where is there a list of the bots that are not wanted?

Someone said some bots are bad bots that, for example, harvest emails, and others are site rippers... but it appears these are better blocked by rewriting your .htaccess file as opposed to editing the robots.txt file... so again I would ask: where do you find an up-to-date list of bad bots/site rippers to put into your .htaccess?
If you want them to spider everything, don't add one. If you feel you need one, add a general one allowing all bots; if you have a specific bot problem, ban that one bot. You can also use it to block certain directories.

To allow all robots complete access:

User-agent: *
Disallow:


To exclude all robots from the server:

User-agent: *
Disallow: /

To exclude all robots from parts of a server:

User-agent: *
Disallow: /private/
Disallow: /images-saved/
Disallow: /images-working/

To exclude a single robot from the server (replace BadBot with the bot's actual user-agent token):

User-agent: BadBot
Disallow: /
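If you want to double-check what a robots.txt actually permits, Python's standard library can parse it for you. A small sketch using the stdlib urllib.robotparser module against the "parts of a server" example above (the URLs are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Parse the "exclude parts of a server" example and check which
# URLs a well-behaved crawler is allowed to fetch.
rules = RobotFileParser()
rules.parse("""\
User-agent: *
Disallow: /private/
Disallow: /images-saved/
Disallow: /images-working/
""".splitlines())

print(rules.can_fetch("Googlebot", "http://example.com/private/a.html"))  # False
print(rules.can_fetch("Googlebot", "http://example.com/tour/"))           # True
```

This is handy for sanity-checking a long robots.txt before deploying it.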
__________________
LinkSpun - Premier Adult Link Trading Community - ICQ - 464/\281/\250
Be Seen By New Webmasters/Affiliates * Target out webmasters/affiliates based on niches your sites are for less than $20 a month.
AmeriNOC - Proudly hosted @ AmeriNOC!
Old 10-27-2008, 09:47 AM   #8
TheSenator
Too lazy to set a custom title
 
TheSenator's Avatar
 
Industry Role:
Join Date: Feb 2003
Location: NJ
Posts: 13,336
You should hook this up on your site.
http://www.google.com/webmasters/tools/
__________________
ISeekGirls.com since 2005
Old 10-27-2008, 10:18 AM   #9
baddog
So Fucking Banned
 
Industry Role:
Join Date: Apr 2001
Location: the beach, SoCal
Posts: 107,089
Use it to keep spiders out of certain parts of your site.
Old 10-27-2008, 10:19 AM   #10
alexchechs
Confirmed User
 
alexchechs's Avatar
 
Join Date: May 2008
Location: BROOKLYN!!!
Posts: 3,474
They are pretty simple files and can do a lot for your site. I also suggest using an .xml sitemap and having Google spider it often.
__________________
Alex Chechs
http://thefawnconspiracy.com
Old 10-27-2008, 11:25 AM   #11
BigPimpCash
Confirmed User
 
Join Date: Jul 2006
Posts: 808
Thanks guys...

Great advice, keep it coming... I was thinking of an XML sitemap; I'm just not sure how much will be on it, as we're a paysite without that many pages, but if it helps the SEO then I will speak to my guy about it...

How do I get Google to look at it often?
Old 10-27-2008, 02:27 PM   #12
Tempest
Too lazy to set a custom title
 
Industry Role:
Join Date: May 2004
Location: West Coast, Canada.
Posts: 10,217
This is what I put on all of my sites

User-agent: Fasterfox
Disallow: /
User-agent: *
Disallow:
Sitemap: http://...../.....txt

The sitemap is just a text listing of all the URLs in UTF-8. Sitemap is a recent addition to the robots.txt spec, and the big three all grab it now. If you have lots of sites, doing this is much easier than fucking around with the Google stuff.

http://www.sitemaps.org/protocol.php#informing

The Fasterfox entry is for a Firefox plugin that causes "false" hits to your site because it preloads pages.
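The plain-text sitemap Tempest describes is trivial to generate: one absolute URL per line, UTF-8 encoded, then referenced from robots.txt with a Sitemap: line. A sketch with placeholder URLs:

```python
# Write a plain-text sitemap: one absolute URL per line, UTF-8 encoded.
# The URLs here are placeholders for your real pages.
urls = [
    "http://example.com/",
    "http://example.com/tour/",
    "http://example.com/join/",
]

with open("sitemap.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(urls) + "\n")
```

You would then point robots.txt at it with a line like `Sitemap: http://example.com/sitemap.txt`.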
Old 10-27-2008, 08:03 PM   #13
BigPimpCash
Confirmed User
 
Join Date: Jul 2006
Posts: 808
Ok...

So the general consensus is we should create some form of sitemap, either XML or TXT, and then put a line in robots.txt to make sure the bots read/crawl the sitemap.

Does anyone add anything to keep the bad bots out, or is that done mainly via the .htaccess file? Either way, is there a good example, for either, that shows the ones that should be included?
Old 10-27-2008, 08:51 PM   #14
mona
Registered User
 
Join Date: Feb 2008
Location: Mona = "female monkey" in Spanish
Posts: 1,940
Quote:
Originally Posted by TripleXPrint View Post
Robots.txt files are pretty much useless unless you're trying to get Google to not index a page. Google will crawl your site and index your content either way, so you can just skip the robots.txt file. You really don't need it.


If you submit your sitemap to Google Webmaster Tools, they'll spider everything regularly... amongst lots of other cool stuff.
Old 10-27-2008, 08:53 PM   #15
mona
Registered User
 
Join Date: Feb 2008
Location: Mona = "female monkey" in Spanish
Posts: 1,940
Quote:
Originally Posted by BigPimpCash View Post
So the general consencus is we should create some form of site map... either xml or txt... and then put within the robots.txt a command to make sure that the bots read/crawl the sitemap

Anyone add anything to keep the bad ones out... or is that done mainly via the htpaccess file ? either way is there a good example ? For either... that shows ones that should be included ?
My fave --> www.xml-sitemaps.com
Old 10-27-2008, 09:08 PM   #16
d-null
. . .
 
d-null's Avatar
 
Industry Role:
Join Date: Apr 2007
Location: NY
Posts: 13,724
I add this to all of my sites' robots.txt:

User-agent: ia_archiver
Disallow: /
that keeps archive.org from spidering and keeping a copy of your site forever
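If you're unsure whether a one-bot rule like this leaves other crawlers alone, it can be checked with Python's stdlib parser (example.com is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Confirm the snippet above blocks only ia_archiver and does not
# affect crawlers that have no matching User-agent entry.
rules = RobotFileParser()
rules.parse("""\
User-agent: ia_archiver
Disallow: /
""".splitlines())

print(rules.can_fetch("ia_archiver", "http://example.com/"))  # False
print(rules.can_fetch("Googlebot", "http://example.com/"))    # True
```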
__________________

__________________

Looking for a custom TUBE SCRIPT that supports massive traffic, load balancing, billing support, and h264 encoding? Hit up Konrad!
Looking for designs for your websites or custom tubesite design? Hit up Zuzana Designs
Check out the #1 WordPress SEO Plugin: CyberSEO Suite
Old 10-28-2008, 02:10 AM   #17
BigPimpCash
Confirmed User
 
Join Date: Jul 2006
Posts: 808
Question

Quote:
Originally Posted by d-null View Post
I add this to all of my sites' robots.txt:

User-agent: ia_archiver
Disallow: /

that keeps archive.org from spidering and keeping a copy of your site forever
Can I ask why this is a bad thing? Isn't ia_archiver the Alexa one?
Old 10-28-2008, 08:20 AM   #18
NinjaSteve
Too lazy to set a custom title
 
Industry Role:
Join Date: Dec 2003
Posts: 11,089
Quote:
Originally Posted by d-null View Post
I add this to all of my sites' robots.txt:

User-agent: ia_archiver
Disallow: /

that keeps archive.org from spidering and keeping a copy of your site forever
Seems to be more than just archive.org; it's Alexa overall.
http://www.alexa.com/site/help/webmasters
__________________
...

Last edited by NinjaSteve; 10-28-2008 at 08:23 AM..
Old 10-28-2008, 10:02 AM   #19
BigPimpCash
Confirmed User
 
Join Date: Jul 2006
Posts: 808
Question

Can anyone show me an example of an XML sitemap for a paysite that they use to encourage the spiders to crawl the pages?
Old 10-28-2008, 11:50 AM   #20
wizzart
scriptmaster
 
wizzart's Avatar
 
Industry Role:
Join Date: May 2006
Location: Serbia
Posts: 5,237
Quote:
Originally Posted by BigPimpCash View Post
Can anyone show me an example of an XML sitemap for a paysite that they use to encourage the spiders to crawl the pages?

Quote:

User-agent: BecomeBot
Disallow: /

User-agent: Nutch
Disallow: /

User-agent: Jetbot/1.0
Disallow: /

User-agent: Jetbot
Disallow: /

User-agent: Teoma
Disallow: /

User-agent: WebVac
Disallow: /

User-agent: Stanford
Disallow: /

User-agent: Stanford CompSciClub
Disallow: /

User-agent: Stanford CompClub
Disallow: /

User-agent: Stanford Spiderboys
Disallow: /

User-agent: scooter
Disallow: /

User-agent: naver
Disallow: /

User-agent: dumbot
Disallow: /

User-agent: Hatena Antenna
Disallow: /

User-agent: grub-client
Disallow: /

User-agent: grub
Disallow: /

User-agent: looksmart
Disallow: /

User-agent: WebZip
Disallow: /

User-agent: larbin
Disallow: /

User-agent: b2w/0.1
Disallow: /

User-agent: Copernic
Disallow: /

User-agent: psbot
Disallow: /

User-agent: Python-urllib
Disallow: /

User-agent: NetMechanic
Disallow: /

User-agent: URL_Spider_Pro
Disallow: /

User-agent: CherryPicker
Disallow: /

User-agent: EmailCollector
Disallow: /

User-agent: EmailSiphon
Disallow: /

User-agent: WebBandit
Disallow: /

User-agent: EmailWolf
Disallow: /

User-agent: ExtractorPro
Disallow: /

User-agent: CopyRightCheck
Disallow: /

User-agent: Crescent
Disallow: /

User-agent: SiteSnagger
Disallow: /

User-agent: ProWebWalker
Disallow: /

User-agent: CheeseBot
Disallow: /

User-agent: LNSpiderguy
Disallow: /

User-agent: Curl
Disallow: /

User-agent: ia_archiver
Disallow: /

User-agent: ia_archiver/1.6
Disallow: /

User-agent: Alexibot
Disallow: /

User-agent: Teleport
Disallow: /

User-agent: TeleportPro
Disallow: /

User-agent: Stanford Comp Sci
Disallow: /

User-agent: MIIxpc
Disallow: /

User-agent: Telesoft
Disallow: /

User-agent: Website Quester
Disallow: /

User-agent: moget/2.1
Disallow: /

User-agent: WebZip/4.0
Disallow: /

User-agent: WebStripper
Disallow: /

User-agent: WebSauger
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: NetAnts
Disallow: /

User-agent: Mister PiX
Disallow: /

User-agent: WebAuto
Disallow: /

User-agent: TheNomad
Disallow: /

User-agent: WWW-Collector-E
Disallow: /

User-agent: RMA
Disallow: /

User-agent: libWeb/clsHTTP
Disallow: /

User-agent: asterias
Disallow: /

User-agent: httplib
Disallow: /

User-agent: turingos
Disallow: /

User-agent: spanner
Disallow: /

User-agent: InfoNaviRobot
Disallow: /

User-agent: Harvest/1.5
Disallow: /

User-agent: Bullseye/1.0
Disallow: /

User-agent: Mozilla/4.0 (compatible; BullsEye; Windows 95)
Disallow: /

User-agent: Crescent Internet ToolPak HTTP OLE Control v.1.0
Disallow: /

User-agent: CherryPickerSE/1.0
Disallow: /

User-agent: CherryPickerElite/1.0
Disallow: /

User-agent: WebBandit/3.50
Disallow: /

User-agent: NICErsPRO
Disallow: /

User-agent: Microsoft URL Control - 5.01.4511
Disallow: /

User-agent: DittoSpyder
Disallow: /

User-agent: Foobot
Disallow: /

User-agent: WebmasterWorldForumBot
Disallow: /

User-agent: SpankBot
Disallow: /

User-agent: BotALot
Disallow: /

User-agent: lwp-trivial/1.34
Disallow: /

User-agent: lwp-trivial
Disallow: /

User-agent: http://www.WebmasterWorld.com bot
Disallow: /

User-agent: BunnySlippers
Disallow: /

User-agent: Microsoft URL Control - 6.00.8169
Disallow: /

User-agent: URLy Warning
Disallow: /

User-agent: Wget/1.6
Disallow: /

User-agent: Wget/1.5.3
Disallow: /

User-agent: Wget
Disallow: /

User-agent: LinkWalker
Disallow: /

User-agent: cosmos
Disallow: /

User-agent: moget
Disallow: /

User-agent: hloader
Disallow: /

User-agent: humanlinks
Disallow: /

User-agent: LinkextractorPro
Disallow: /

User-agent: Offline Explorer
Disallow: /

User-agent: Mata Hari
Disallow: /

User-agent: LexiBot
Disallow: /

User-agent: Web Image Collector
Disallow: /

User-agent: The Intraformant
Disallow: /

User-agent: True_Robot/1.0
Disallow: /

User-agent: True_Robot
Disallow: /

User-agent: BlowFish/1.0
Disallow: /

User-agent: http://www.SearchEngineWorld.com bot
Disallow: /

User-agent: JennyBot
Disallow: /

User-agent: MIIxpc/4.2
Disallow: /

User-agent: BuiltBotTough
Disallow: /

User-agent: ProPowerBot/2.14
Disallow: /

User-agent: BackDoorBot/1.0
Disallow: /

User-agent: toCrawl/UrlDispatcher
Disallow: /

User-agent: WebEnhancer
Disallow: /

User-agent: suzuran
Disallow: /

User-agent: VCI WebViewer VCI WebViewer Win32
Disallow: /

User-agent: VCI
Disallow: /

User-agent: Szukacz/1.4
Disallow: /

User-agent: QueryN Metasearch
Disallow: /

User-agent: Openfind data gathere
Disallow: /

User-agent: Openfind
Disallow: /

User-agent: Xenu's Link Sleuth 1.1c
Disallow: /

User-agent: Xenu's
Disallow: /

User-agent: Zeus
Disallow: /

User-agent: RepoMonkey Bait & Tackle/v1.01
Disallow: /

User-agent: RepoMonkey
Disallow: /

User-agent: Microsoft URL Control
Disallow: /

User-agent: Openbot
Disallow: /

User-agent: URL Control
Disallow: /

User-agent: Zeus Link Scout
Disallow: /

User-agent: Zeus 32297 Webster Pro V2.9 Win32
Disallow: /

User-agent: Webster Pro
Disallow: /

User-agent: EroCrawler
Disallow: /

User-agent: LinkScan/8.1a Unix
Disallow: /

User-agent: Keyword Density/0.9
Disallow: /

User-agent: Kenjin Spider
Disallow: /

User-agent: Iron33/1.0.2
Disallow: /

User-agent: Bookmark search tool
Disallow: /

User-agent: GetRight/4.2
Disallow: /

User-agent: FairAd Client
Disallow: /

User-agent: Gaisbot
Disallow: /

User-agent: Aqua_Products
Disallow: /

User-agent: Radiation Retriever 1.1
Disallow: /

User-agent: WebmasterWorld Extractor
Disallow: /

User-agent: Flaming AttackBot
Disallow: /

User-agent: Oracle Ultra Search
Disallow: /

User-agent: MSIECrawler
Disallow: /

User-agent: PerMan
Disallow: /

User-agent: searchpreview
Disallow: /

User-agent: sootle
Disallow: /

User-agent: es
Disallow: /

User-agent: Enterprise_Search/1.0
Disallow: /

User-agent: Enterprise_Search
Disallow: /

User-agent: HTTrack
Disallow: /

User-agent: http://Anonymouse.org/_(Unix)
Disallow: /

User-agent: WebRipper
Disallow: /

User-agent: Download_Ninja_7.0
Disallow: /

User-agent: Download_Ninja_2.0
Disallow: /

User-agent: Download_Ninja_5.0
Disallow: /

User-agent: Download_Ninja_3.0
Disallow: /

User-agent: Download_Ninja/4.0
Disallow: /

User-agent: FDM Free Download Manager
Disallow: /

User-agent: *
Disallow: /cgi-bin
Old 10-28-2008, 11:53 AM   #21
ScottXXX
Confirmed User
 
Join Date: Oct 2007
Posts: 102
Quote:
Originally Posted by kandah View Post
I tell the robots the address to my sitemap, done!
lmao....
__________________
If you have sites with creampie gangbangs, gang bangs, or creampies please submit them to my new site:

http://creampiegangbang.org/add.php

http://creampiegangbang.org/
Old 10-28-2008, 12:02 PM   #22
BigPimpCash
Confirmed User
 
Join Date: Jul 2006
Posts: 808
Hey

Quote:
Originally Posted by wizzart View Post
[robots.txt list snipped]

Thanks for that, Wizzart... I take it that's what you use to stop bad bots and site rippers? Does it work OK in robots.txt? I was told a lot of them ignore the robots.txt file and that it's better to put the blocking in your .htaccess file instead?
Old 10-28-2008, 12:22 PM   #23
TripleXPrint
Confirmed User
 
TripleXPrint's Avatar
 
Join Date: Apr 2007
Posts: 983
Quote:
Originally Posted by mona_klixxx View Post


If you submit your sitemap to Google Webmaster Tools, then they'll spider everything regularly...Amongst lots of other cool stuff.
Yep, that's exactly what I use, and I have A LOT of sites ranked on top of the SERPs for high-ranking keywords. Give me a long-tail keyword and I guarantee you that I can have it on top of Google in less than a week. And that's without using blackhat methods. With blackhat I can get it there in less than 48 hours, but it wouldn't stay up there long. To me optimization is like sex; the longer it lasts the better it feels.
Old 10-28-2008, 12:22 PM   #24
Kudles
Confirmed User
 
Join Date: Feb 2003
Location: Here There and Everywhere
Posts: 5,477
Google would help
__________________
Free to Play MMOs and MMORPGs