Old 10-14-2013, 02:36 AM  
DannyA
Registered User
 
Join Date: Oct 2005
Posts: 85
Just looked at your Google situation.

Getting 900,000 pages crawled and indexed sounds like a good place to be, but you've got a major thin-content problem. Out of all those indexed pages, Google only sees about 600 as unique, so in practice you're spreading 600 pages' worth of link juice across 900,000 URLs.

Your biggest problem here is probably the tagging. There are 300,000 tag pages indexed, but only 232 make it into the main index. The lowest-hanging fruit is to keep the tag pages out of the index entirely. You can do that by putting this in the <head> (but only on the tag pages):

Code:
<META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW">
If you do this, you should see your indexed page count drop significantly, and the pages with good content will pick up the link juice that was being spread over hundreds of thousands of rejects.
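
If editing the template's <head> is a pain, you can get the same effect at the server level with an X-Robots-Tag header, which Google treats the same as the meta robots tag. This is just a rough sketch, assuming you're on Apache with mod_setenvif and mod_headers, and assuming your tag URLs all live under /tag/ (adjust the pattern to whatever your tag URLs actually look like):

Code:
# Rough sketch: flag requests for tag pages -- assumes tag URLs live under /tag/, adjust the regex to your URL scheme
SetEnvIf Request_URI "^/tag/" IS_TAG_PAGE
# Send the same "noindex, follow" signal as the meta tag, but as an HTTP header, only on flagged requests
Header set X-Robots-Tag "noindex, follow" env=IS_TAG_PAGE
Use whichever of the two is easier to roll out; doing both won't hurt, but you only need one.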

One other thing: you've got two versions of the domain in play, www and non-www, so pick one and 301 the other over to it (see the sketch below). That should clear out a lot of duplicate garbage.
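
In case it helps, here's a rough sketch of that redirect for an Apache box with mod_rewrite, assuming you keep the www version (example.com is just a placeholder for your own domain; flip the logic if you'd rather keep the non-www one):

Code:
# Rough sketch: force everything onto the www host with a 301
# example.com is a placeholder -- use your own domain
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
Drop that in the site's .htaccess and both versions will collapse into one, so the link juice stops getting split between them.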