#1
Confirmed User
Industry Role:
Join Date: Jun 2005
Location: concrete jungle
Posts: 3,488
sitemap and duplicate content question
I have a website that, due to its setup, has duplicate content pages that I cannot remove. I am going to rig something up to noindex them, but I have a question about sitemaps...
If I make and submit a sitemap to Google, will the goog disregard what is not in the sitemap?
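(For anyone wondering how to "rig something up" when the pages can't be edited one by one: one option is sending a noindex header from the server instead of touching the pages. A rough Apache .htaccess sketch, assuming mod_headers is enabled and that the duplicates happen to share a filename pattern; the `-print.html` pattern here is just an illustration.)

```apache
# .htaccess: attach an X-Robots-Tag header to the duplicate pages so
# search engines drop them from the index without editing the pages.
<FilesMatch "-print\.html$">
    Header set X-Robots-Tag "noindex, follow"
</FilesMatch>
```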
#2
Confirmed User
Industry Role:
Join Date: Jul 2006
Location: Somewhere between reality and total ape-shit bonkers.
Posts: 2,870
No.
Best to put noindex tags on those pages, add canonical links pointing to your preferred versions, or disallow the pages you don't want crawled in your robots.txt file.
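A rough sketch of the first two options, with the domain and page names as placeholders; both tags go in the `<head>` of the duplicate page.

```html
<!-- Option 1: keep the duplicate page out of the index entirely -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: leave it crawlable but point Google at the preferred
     version (example.com and the path are placeholders) -->
<link rel="canonical" href="https://example.com/preferred-page.html">
```

Generally it's best to pick one signal per page rather than stacking a noindex and a canonical on the same URL.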
#3
Confirmed User
Industry Role:
Join Date: Nov 2009
Location: Heaven
Posts: 4,306
Nope. A sitemap is just an addition to the crawler.
It's like: where the crawler can't reach, use the sitemap; for everything else, there's the crawler. :P
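For reference, a sitemap is only a list of URLs you'd like crawled; leaving a URL out doesn't block or de-index it. A minimal sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap: only the pages you want crawled are listed,
     but omitting a URL here does NOT stop Google from crawling it. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
  <url>
    <loc>https://example.com/unique-page.html</loc>
  </url>
</urlset>
```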
#4
Confirmed User
Industry Role:
Join Date: Sep 2004
Location: Montrealquebecanada
Posts: 5,500
If the pages in question haven't been indexed, I think you can use the robots.txt file to tell Google not to crawl them.
But if they have, forget it, robots.txt won't remove them; you need to "noindex" them (and leave them crawlable so Google can actually see the tag)... :D
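A minimal robots.txt sketch along those lines, assuming the duplicates sit under a made-up /print/ path. Keep in mind a page blocked here can't have its noindex tag read, so it's one approach or the other.

```txt
# robots.txt at the site root: blocks crawling of the duplicate section,
# but does not de-index pages Google has already picked up.
User-agent: *
Disallow: /print/
```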