Discuss what's fucking going on, and which programs are best and worst. One-time "program" announcements from "established" webmasters are allowed. |
#1
. . .
Industry Role:
Join Date: Apr 2007
Location: NY
Posts: 13,724
do you think google's algorithm looks at script clues and treats sites differently...
do you think google's algorithm looks at script clues and treats sites differently depending on the type of site/script that it is?
I've theorized that they have been doing that for a while, but I've never noticed anyone else talking about it. We used to assume that google treated all sites equally, on a page-by-page basis, but I'm thinking that somewhere in the algorithm google now recognizes sites and categorizes how they are treated based on clues that identify them (i.e. a small number of scripts make up the vast majority of content on the internet: WordPress blogs, various forum scripts, etc.).

Obviously there are many exceptions, static sites, custom scripts, but perhaps google uses some variables to make assumptions. So basically I'm thinking there might be slight differences in the way google ranks pages and/or treats indexing of an entire site, based on whether google considers it a "blog", a "forum", or something else altogether.

Any other theories?
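The platform-detection idea theorized above could be sketched as a simple fingerprint match over the page source. Everything here is hypothetical: the patterns and platform names are illustrative guesses, not anything google has confirmed using.

```python
import re

# Hypothetical tell-tale patterns per platform -- illustrative guesses only.
FINGERPRINTS = {
    "wordpress": [r"/wp-content/", r'name="generator" content="WordPress'],
    "vbulletin": [r"showthread\.php", r"vBulletin"],
    "phpbb":     [r"viewtopic\.php", r"phpBB"],
}

def classify_site(html: str) -> str:
    """Guess the back-end script from tell-tale strings in the page source."""
    for platform, patterns in FINGERPRINTS.items():
        if any(re.search(p, html) for p in patterns):
            return platform
    return "unknown"  # static sites, custom scripts, etc.
```

A crawler that ran something like this once per site could then apply different ranking or indexing rules per detected platform, which is exactly the kind of special treatment the post speculates about.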
__________________
__________________ Looking for a custom TUBE SCRIPT that supports massive traffic, load balancing, billing support, and h264 encoding? Hit up Konrad!
Looking for designs for your websites or custom tubesite design? Hit up Zuzana Designs Check out the #1 WordPress SEO Plugin: CyberSEO Suite |
#2
. . .
Industry Role:
Join Date: Apr 2007
Location: NY
Posts: 13,724
And therefore it would also follow that google would consider links from pages to have different authority or value: blog comment links vs. blog content links vs. forum post links vs. signature links, etc. Where one would have thought, and hoped, that any link on a page would be equal, it seems likely that google has discriminatory ways of ranking link value based on where the links are found and in what type of site they are found.

I started thinking about this when the xrumer stuff started getting out of hand. Forums have always been a great place to see viral buzz, and links from forums were generally a good indicator of value, but now with spammers abusing linkspam scripts, one would think google would be working on ways to combat that (possibly at the cost of the value of legit forum links).
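If google really does discount links by where they sit on a page, the effect could look something like this toy model. The contexts and weight numbers are completely made up for illustration; nobody outside Google knows the real values, or whether they exist at all.

```python
# Hypothetical per-context link weights -- made-up numbers for illustration.
LINK_WEIGHTS = {
    "blog_content":    1.0,   # editorial link inside a post
    "blog_comment":    0.2,   # easy to spam, so heavily discounted
    "forum_post":      0.5,
    "forum_signature": 0.1,   # appears on every post, worth the least
}

def link_value(context: str, base_authority: float = 1.0) -> float:
    """Discount a link's passed value by where it appears on the page.

    Unknown contexts fall back to a middle-of-the-road 0.5 weight.
    """
    return base_authority * LINK_WEIGHTS.get(context, 0.5)
```

Under a scheme like this, a single in-content blog link would pass more value than ten signature links, which matches the post's hunch about signature and comment links being devalued.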
#3
. . .
Industry Role:
Join Date: Apr 2007
Location: NY
Posts: 13,724
And further, one's choice of script when building a site would be an important decision if google is indeed treating different types of sites differently in the algorithm...

For example, say I want to spend an hour or two on a build-and-forget type of site. I can choose to do it in WordPress, with forum software, or as static HTML in Dreamweaver or something. If google sees that it is built with forum software, will google penalize that site in the algorithm if there are not daily updates in the forum (like an active forum would have)? Whereas if it was built as a static HTML site, would google give it more slack in demanding/expecting fresh content? Same goes for a blog: if google detects that your site is a WordPress blog and it gets frequent updates, like 10 times a day, will that site fall under "is it a splog" scrutiny (where a forum might not)?
#4
Too lazy to set a custom title
Join Date: Dec 2006
Posts: 23,400
Can't sleep?
__________________
i like waffles
#5
It's coming look busy
Join Date: Mar 2001
Location: "Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn".
Posts: 35,299
Of course not. Nobody at google is smart enough to program something that can even deal with tables, let alone scripts.
#6
. . .
Industry Role:
Join Date: Apr 2007
Location: NY
Posts: 13,724
Just had a nice, really old domain banned from google, probably because I splogged out on it a bit too much. Another domain with a forum on it isn't getting much google love, and a static site is getting lots more google love than it deserves, etc. So I'm just sitting here wondering aloud why google has to be the way google is.

And I forgot to theorize on how google must deem some sites as being completely script-generated, yet it shows a lot of those sites major google love when you look through the serps...
#7
. . .
Industry Role:
Join Date: Apr 2007
Location: NY
Posts: 13,724
I think I get what you are saying: perhaps it's the old "stick it all together with chewing gum and duct tape and it'll be what it ends up being".
#8
Confirmed User
Industry Role:
Join Date: Dec 2002
Location: Behind the scenes
Posts: 5,190
The success of any search engine is built on content (being able to index it), relevance (being able to produce the most relevant results), and references, which feed into relevance.

You cannot discriminate content based on what back-end platform it's being served from.
#9
Confirmed User
Industry Role:
Join Date: Dec 2002
Location: Behind the scenes
Posts: 5,190
If you worry that your theories may hold up to be somewhat true, you can spoof your back-end scripts so they don't look like their default copies, by customizing the URL structure and the outputted HTML code.

Content patterns and HTML source structure patterns may be used to find duplicate content, but it's another subject for theorizing how deep (by which I mean to what extent) a search to weed out duplicate content can go.
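As a rough sketch of that spoofing idea, here is how a site might strip a couple of default WordPress tells from its output before serving it. The specific substitutions are just examples of common fingerprints; a real setup would also customize the URL structure at the rewrite-rule level.

```python
import re

def spoof_fingerprints(html: str) -> str:
    """Strip a couple of common WordPress tells from page output.

    Illustrative only: removes the default generator meta tag and
    rewrites the stock asset path to a custom one.
    """
    # Drop the <meta name="generator" ...> tag WordPress emits by default.
    html = re.sub(r'<meta name="generator"[^>]*>\s*', "", html)
    # Rewrite the default wp-content asset path to a neutral one.
    html = html.replace("/wp-content/", "/assets/")
    return html
```

Run against default WordPress output, this leaves nothing for a naive fingerprint matcher to key on, though content and HTML structure patterns could still give the platform away.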
#10
Confirmed User
Join Date: Apr 2002
Location: /root/
Posts: 4,997
Chew on this if you want an in-depth look at search engines: http://airweb.cse.lehigh.edu/
#11
Confirmed User
Join Date: Mar 2008
Location: olde europa
Posts: 199
Interesting link, definitely bookmarked.
Thanks Darksoul, you gave me a lot of reading for the night ;)
__________________
I like sexy porn