07-08-2019, 12:15 AM
thommy
Confirmed User
 
 
Join Date: Jun 2003
Location: Switzerland / Germany / Thailand
Posts: 5,469
Quote:
Originally Posted by brassmonkey
They want to reduce page removals, right? That is something they currently have to pay for. I think they are trimming the fat to focus on tech items. I use robots on everything.
I think this is just one reason; the other is that they don't get fined for what they show.

Actually, Google shows many documents and websites that do not have a robots.txt.

Now let's imagine a funny example:

A weapons company uploads the newest secret version of a killer machine to their website. Google crawls it and publishes it without being explicitly asked to do so; Google would also be in trouble.

A single "internet law" does not exist, and Google operates worldwide under the laws of 255 different countries.
I think that robots.txt would be the simplest way to allow or deny crawling and publishing of content from a site (see the sketch below).
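
For illustration, here is a minimal sketch of how that allow/deny check could look in practice. The robots.txt rules, the domain, and the paths are made up for the example; the check uses Python's standard-library urllib.robotparser, which is one way a crawler can honour a site's rules.

# Hypothetical robots.txt a site operator might publish:
# deny crawling of one directory, allow everything else.
from urllib.robotparser import RobotFileParser

robots_txt = """
User-agent: *
Disallow: /secret-prototypes/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A well-behaved crawler checks before fetching a URL:
print(rp.can_fetch("Googlebot", "https://example.com/secret-prototypes/plans.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/products.html"))                 # True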

We can see everywhere on the internet that rules and laws are being pushed to an excessive point. Users have to agree to cookies (even though this has been a common technique for the past 25 years).

In addition, an internet presence is not necessarily the preserve of companies; consumer protection can also apply to the site operator.
__________________
Open for handpicked publishers and advertisers:
www.trafficfabrik.com