Great post Lloyd
|
Quote:
Near as I can tell it was a combination of tracking IDs along with a very similar (but NOT identical) contact form and TOS template used across the sites. It has also occurred to me that I may have fallen under a blanket 'affiliate' penalty, whereby Google determined that the sites in question were adding 'no real value' and therefore didn't belong in their SERPs, and used the points of commonality listed above to find them all. Example:
- 8 sites with unique designs / original content which employ all discussed 'stealth factors' are in promotion of sponsor site A.
- 3 of these sites are on the first page of Google (with the other 5 in the top 100), along with sponsor A's site.
- Google notices these 8 sites have subtle points of commonality (ref codes, similar text, CMS signature, etc.) and seem to talk a great deal about site A.
- While these 8 sites break no Google Webmaster Guidelines, it's clear that their sole purpose is simply to get the visitor to sponsor site A, and as quickly as possible.
- Big G asks the question, "Why do we have 3 listings in our top 10 that are, for argument's sake, the exact same thing? This isn't helping our users."
- Big G also notices that these 8 sites have a lot in common with these 32 other sites, and that these 32 have a lot in common with these other 84 sites... and so on... and so on...
- You know the rest.
Of course, sites like these make up half the internet... lucky me getting caught. I'm taking what happened as a glimpse into the 'future' of how Google will list results. The Big G is getting smarter and is relying more and more on factors such as bounce rates, time on site, and other 'user experience' type data. In my mind, the search engine of the future (whatever or whoever that might be) will be a highly evolved machine that ranks sites heavily based on USER EXPERIENCE. This is likely the hardest thing for an SEOer to manipulate, and it makes sense that a smart SE would key in on it.
Now, don't get me wrong: content is STILL king, and on-site optimization, backlinks, and a sprinkle of SEO 'magic' here and there all play a clutch role. And, if done right, it can still get you to the top of the SEs... for now. I guess the moral of the story is that websites that offer no real value (affiliate or otherwise) have got their work cut out for them. |
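The "blanket penalty" idea described above could be sketched in code. This is a toy illustration only; the signal names and grouping logic are my own assumptions, not anything Google has confirmed. The point is just that once any two sites share a fingerprint (a tracking ID, a TOS template, a CMS signature), a simple union-find merge chains whole networks together:

```python
# Hypothetical sketch: cluster sites that share "fingerprint" signals
# (tracking IDs, template snippets, CMS signatures). Any shared signal
# links two sites into the same group, mimicking the chain effect the
# post describes (8 sites -> 32 sites -> 84 sites ...).
from collections import defaultdict

def cluster_by_signals(sites):
    """sites: dict of domain -> set of fingerprint signals."""
    signal_to_sites = defaultdict(set)
    for domain, signals in sites.items():
        for s in signals:
            signal_to_sites[s].add(domain)

    # Simple union-find: merge every pair of sites sharing a signal.
    parent = {d: d for d in sites}
    def find(d):
        while parent[d] != d:
            parent[d] = parent[parent[d]]  # path compression
            d = parent[d]
        return d
    def union(a, b):
        parent[find(a)] = find(b)

    for linked in signal_to_sites.values():
        linked = list(linked)
        for other in linked[1:]:
            union(linked[0], other)

    groups = defaultdict(set)
    for d in sites:
        groups[find(d)].add(d)
    return list(groups.values())

# Made-up example domains and signals:
sites = {
    "a.com": {"ref=123", "tos-v2"},
    "b.com": {"ref=123"},           # shares a ref code with a.com
    "c.com": {"tos-v2", "cms-x"},   # shares a TOS template with a.com
    "d.com": {"cms-y"},             # nothing in common -> stays alone
}
print(cluster_by_signals(sites))
```

Note how b.com and c.com never share a signal directly, yet both end up clustered with a.com; that transitivity is exactly why one flagged site can drag a whole network down.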
Quote:
Good to know. |
Quote:
They have an online marketing bachelors and master's program. :winkwink: |
:thumbsup :thumbsup
|
Hey baddog - GREAT post man! Quick question tho:
I run a network of 18 paysites, and I just switched to a dedicated hosting plan. I've left some of the lesser-performing sites on the old hosting (which is a shared account). All my sites are currently linked together, with no outside link trades. (See here: www.misterpeabodyworld.com) So just wondering: should I take every site in the network and get separate hosting for each one? Or would just having the two hosts do? The idea of getting eighteen different hosts, especially now that I've just switched to dedicated hosting ($500 a month, btw, as opposed to $20 a month for a shared account), seems daunting. Thanks!! |
Useful link for basics on Search Engine Optimization http://bit.ly/cDmwyp
|
Quote:
I was talking about a real school lol. |
Quote:
Anyway, thanks! |
that was very informative, thanks for taking the time to write it up.
|
Quote:
They are an accredited university, I do know that much. Here's the link to the Master's Program: http://www.fullsail.edu/online/degre...keting-masters |
Very cool, thanks for the read. :)
|
Quote:
syc·o·phant (sĭk'ə-fənt, sī'kə-) n. A servile self-seeker who attempts to win favor by flattering influential people. [Latin sȳcophanta, informer, slanderer, from Greek sūkophantēs, informer, from sūkon phainein, to show a fig (probably originally said of denouncers of theft or exportation of figs) : sūkon, fig + phainein, to show; see bhā-1 in Indo-European roots.] syc'o·phan'tic (-fān'tĭk), syc'o·phan'ti·cal (-tĭ-kəl) adj., syc'o·phan'ti·cal·ly adv. |
I do "host crowding". I have 20 sites on one IP, and more than half of them are very similar. G has not penalized me, and I have good placement on some decent keywords. I do get nervous sometimes about the sandbox, although I fear algorithm rewrites more.
As to losing key #1 & #2 placements, that has happened to me in the past, and it was one site on one IP on a shared host. G just decided my site and many like it just weren't relevant anymore. |
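For anyone wanting to see how "crowded" their own network looks, here is a minimal sketch that groups sites by their /24 block (the old "Class C"). The domains and IPs below are made-up examples; in practice you would resolve each domain yourself first:

```python
# Hypothetical sketch: group site IPs into /24 ("Class C") blocks to
# see which sites would appear to share a network neighborhood.
# The IPs are documentation-range examples, not real resolutions.
from collections import defaultdict

def group_by_class_c(ip_map):
    """ip_map: dict of domain -> IPv4 address string.
    Returns dict of /24 prefix -> list of domains in that block."""
    blocks = defaultdict(list)
    for domain, ip in ip_map.items():
        class_c = ".".join(ip.split(".")[:3])  # first three octets = /24
        blocks[class_c].append(domain)
    return dict(blocks)

ips = {
    "site1.com": "192.0.2.10",
    "site2.com": "192.0.2.11",     # same /24 as site1 -> "crowded"
    "site3.com": "198.51.100.5",   # different block
}
crowded = {b: d for b, d in group_by_class_c(ips).items() if len(d) > 1}
print(crowded)  # only the blocks holding more than one site
```

Whether a search engine actually uses this grouping against you is exactly what the thread is debating; the script only shows what the footprint looks like from the outside.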
Good post. I've got my sites spread out over Class Cs on dedicated IPs; I'm no SEO expert, but I'm a believer that it helps. I do take issue with the following quote (if I've missed any clarification in further posts, forgive me; I didn't read past page 1):
Quote:
Maybe Google treats the super competitive terms differently, or it won't return 2 of the same results from the same server in the top 10, or something; I don't know. I do believe host crowding plays a role, but it's not as simple as stated, especially with the less competitive terms. |
baddog can you hit me on icq, wanna bang heads for a few mins :thumbsup
|
Top 100 ??
|
Thanks Baddog, it answers a lot of questions I had about SEO. Always like to read your posts.
|
Very good post, Will.
|
Some other guys really helped here too. Thanks for a nice thought :)
|
If you want SEO hosting, stay away from Yellow Fiber.
|
How many of you "SEO Hosts" use the same DNS for each client / domain unless they ask how to make them private?
|
Quote:
On our dedicated servers, every customer provides their own DNS. On our shared servers with dedicated IPs, every Class C is given a different NS. If we use the same C on multiple servers, we give different nameservers on each subsequent server. As we are forced by design to limit the number of clients per server, there is no possibility of a large number of sites having the same DNS. |
great read. Thank you:)
|
Thanks I never thought of using this for adult seo.
|
I really like the info, thanks for the share.
|
Great post, thanks for sharing Baddog
|
Do NOT think that you can outrun or outsmart Google in any way, or achieve any higher results, by using ANY net architecture deployment such as different Class C IPs, registrars, name servers, IPs, virtual hosts, or any other spread factors. Matt Cutts has repeatedly shot those factors down (I don't know why people keep talking about them).
People are just always trying to get a leg up on Google, so they believe in the factors just mentioned. This takes time and resources away from what really matters: building a good site. But what does that mean? "Build a Good Site"? It sounds simple enough, but many people read far too much into it. This is not the proper post for me to go really deep into it. What I will say here is that building a "Good Site" means building a good site for the end user... not Google and not Googlebot (different IPs, C classes, name servers, etc.). The end user determines what my people call the Popular Vote. When Google determines the Popular Vote for your site on a search term, if the vote is low, all the different links, IPs, C classes, etc. won't help you hold a high position one bit. Look at what matters to the end user: when they find a site they like, they visit it again and again (increasing its Google Popular Vote). Do you think they care what IP or Class C it is on, or what DNS or registrar it is using? |
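The "Popular Vote" argument above can be made concrete with a toy score. Everything here is invented for illustration: the signal names, the weights, and the example numbers are my own assumptions, not how Google actually ranks anything. The only point is that under a user-experience score, a thin affiliate site loses to a site people actually return to, no matter how its IPs are arranged:

```python
# Toy sketch of the "Popular Vote" idea: rank sites by user-experience
# signals instead of network footprint. Weights are arbitrary.
def popular_vote(bounce_rate, return_rate, avg_minutes):
    """Higher is better. bounce_rate and return_rate are 0..1;
    avg_minutes is capped at 10 so time-on-site can't dominate."""
    return ((1 - bounce_rate) * 0.4
            + return_rate * 0.4
            + min(avg_minutes / 10, 1) * 0.2)

# Made-up example metrics for two hypothetical sites:
sites = {
    "thin-affiliate.com": popular_vote(0.90, 0.05, 0.5),  # quick bounce-through
    "useful-site.com":    popular_vote(0.30, 0.60, 6.0),  # people come back
}
ranked = sorted(sites, key=sites.get, reverse=True)
print(ranked)  # the site users actually like ranks first
```

Note that nothing about hosting, IPs, or nameservers appears in the inputs at all, which is precisely the poster's point.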
Thanks for clarifying this. I never understood before why a network of sites needed IP address ranges. Makes sense tho.
|
Thanks for bringing to our attention the scam that is "SEO hosting", I believe many webmasters would have fallen for the scam without first reading your warning on the subject. Now that we know what to look for -- that's great. Best of luck at KFC where your skills are best suited.
|
Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2025, vBulletin Solutions, Inc.
©2000-, AI Media Network Inc