GoFuckYourself.com - Adult Webmaster Forum


d-null 11-12-2009 02:30 AM

do you think google's algorithm looks at script clues and treats sites differently...
 
do you think google's algorithm looks at script clues and treats sites differently depending on the type of site/script that it is?

I've theorized that they have been doing that for a while, but I've never noticed anyone else talking about it... we used to assume that google treated all sites equally and on a page-by-page basis, but I am thinking that somewhere in the algorithm google now recognizes sites and categorizes how they are treated based on clues that identify them (i.e. there are only a small number of scripts that make up the vast majority of content on the internet... wordpress blogs, various forum scripts, etc.)... obviously there are many exceptions, static sites, custom scripts, but perhaps google uses some variables to make assumptions
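[The kind of "script clue" being described could be sketched like this — a hypothetical fingerprint check, with made-up patterns; a real search engine would use far more signals than a few regexes:]

```python
import re

# Hypothetical fingerprints only -- illustrative, not google's actual signals.
FINGERPRINTS = {
    "wordpress": [r"/wp-content/", r'content="WordPress'],
    "vbulletin": [r'content="vBulletin', r"showthread\.php\?t="],
    "phpbb": [r"viewtopic\.php", r"phpBB"],
}

def guess_platform(html: str) -> str:
    """Return the first platform whose fingerprint matches, else 'unknown'."""
    for platform, patterns in FINGERPRINTS.items():
        if any(re.search(p, html) for p in patterns):
            return platform
    return "unknown"
```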

so basically I am thinking that there might be slight differences in the way google ranks pages and/or indexes an entire site based on whether google considers it a "blog" or a "forum" or something else altogether


any other theories?

d-null 11-12-2009 02:35 AM

and therefore it would also follow that google would assign different authority or value to links depending on where they appear... like blog comment links vs. blog content links vs. forum post links vs. signature links etc. Where one would have thought and hoped that any link on a page would be equal, it seems likely that google has discriminatory ways of ranking link values based on where the links are found and in what type of site they are found
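[The tiered-link-value idea could be sketched like this — the weights are completely made up, since nobody outside google knows the real ones; the point is just the placement-based discount:]

```python
# Made-up discount factors per link placement -- purely illustrative.
LINK_WEIGHTS = {
    "blog_content": 1.0,
    "blog_comment": 0.2,
    "forum_post": 0.5,
    "forum_signature": 0.1,
}

def link_value(base_score: float, placement: str) -> float:
    """Discount the value a link passes based on where it sits on the page.

    Unknown placements get an arbitrary middling default of 0.3.
    """
    return base_score * LINK_WEIGHTS.get(placement, 0.3)
```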


I started thinking about this when the xrumer stuff got out of hand. Forums have always been a great place to see viral buzz, and links from forums were generally a good indicator of value, but now with the spammers abusing linkspam scripts, one would think google would be working on ways to combat that (possibly discounting the value of legitimate forum links in the process)

d-null 11-12-2009 02:39 AM

and further thinking, one's choice of script when building a site would be an important decision if google is indeed treating different types of sites differently in the algorithm...

for example, say I want to spend an hour or two on a build-and-forget type of site. I can choose to do it in wordpress, or with forum software, or static with html in dreamweaver or something... if google sees that it is built with forum software, will google penalize that site in the algorithm if there are not daily updates in the forum (like an active forum would have)? vs. if it was built as a static html site, would google cut it more slack in demanding/expecting fresh content?


same goes for a blog: if google detects that your site is a wordpress blog and it gets frequent updates, like 10 times a day, will that site fall under "is it a splog" scrutiny (where a forum might not)?

Iron Fist 11-12-2009 02:43 AM

Can't sleep?

After Shock Media 11-12-2009 02:49 AM

Of course not. Nobody at google is smart enough to program something that can even deal with tables, let alone scripts :)

d-null 11-12-2009 02:49 AM

Quote:

Originally Posted by sharphead (Post 16542106)
Can't sleep?

:1orglaugh

just had a nice real old domain banned from google, probably cuz I splogged out on it a bit too much, and another domain with a forum on it isn't getting much google love, and a static site is getting lots more google love than it deserves, etc... so I'm just sitting here wondering aloud why google has to be the way google is


and I forgot to theorize on how google must deem some sites to be completely script-generated, yet a lot of those sites get major google love when you look through the serps...

d-null 11-12-2009 02:52 AM

Quote:

Originally Posted by After Shock Media (Post 16542116)
Of course not. Nobody at google is smart enough to program something that can even deal with tables let alone scripts :)

I think I get what you are saying... perhaps it's the old "stick it all together with chewing gum and duct tape and it'll be what it ends up being"

Serge Litehead 11-12-2009 03:23 AM

the success of any SE is built on content (being able to index it), relevance (being able to produce the most relevant results), and references, which add up to relevance.

you cannot discriminate against content based on what back-end platform it's being served from.

Serge Litehead 11-12-2009 03:40 AM

if you worry that your theories may hold up to be somewhat true, you can spoof your back-end scripts so that they don't look like their default installs by customizing the url structure and the outputted html code.

content patterns and html source structure patterns may be used to find duplicate content, but it's another subject for theorizing how deep (by which I mean to what extent) a search to weed out duplicate content can go.
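[One piece of that spoofing is easy to sketch: stripping the telltale generator meta tag that many platforms emit. A hypothetical example — real spoofing would also mean changing url structure, default templates, and markup patterns:]

```python
import re

def strip_generator_meta(html: str) -> str:
    """Remove <meta name="generator" ...> tags that reveal the back-end platform."""
    return re.sub(r'<meta[^>]*name="generator"[^>]*>\s*', "",
                  html, flags=re.IGNORECASE)
```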

darksoul 11-12-2009 03:55 AM

chew on this: http://airweb.cse.lehigh.edu/
if you want an in-depth look at search engines

sexy-frenchie 11-12-2009 04:22 AM

Interesting link, definitely bookmarked.
Thanks Darksoul, you gave me a lot of reading for the night ;)



Powered by vBulletin® Version 3.8.8
Copyright ©2000 - 2025, vBulletin Solutions, Inc.