Monday, December 26, 2005

Google Peeks on Purpose

Google must peek on purpose at pages they are expressly told not to index in robots.txt, just to see what in the heck is being hidden. There is no technical reason imaginable that they can't read the same robots.txt everyone else is reading: neither Yahoo, MSN, nor Teoma has ever crawled the pages marked off limits, yet Google just can't seem to control themselves and keep their damned bots off those pages.
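If you want to catch the bots in the act yourself, one rough check is to run your server's access log against the very robots.txt the crawlers are supposed to honor. The sketch below is only an illustration: it assumes a combined-format access log named access.log and a local copy of robots.txt, so swap in whatever file names and log format your host actually uses.

    # Rough sketch: flag Googlebot hits on paths that robots.txt disallows.
    # Assumes a combined-format access log ("access.log") and a local copy
    # of the site's robots.txt -- adjust both names for your own setup.
    import re
    from urllib.robotparser import RobotFileParser

    rules = RobotFileParser()
    with open("robots.txt") as f:
        rules.parse(f.read().splitlines())

    # Pull the request path and the User-Agent string out of each log line.
    line_re = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+" .* "([^"]*)"$')

    with open("access.log") as log:
        for line in log:
            m = line_re.search(line)
            if not m:
                continue
            path, agent = m.groups()
            if "Googlebot" in agent and not rules.can_fetch("Googlebot", path):
                print("Disallowed page fetched:", path)

Run something like that after Googlebot's next pass and you have your own list of off-limits URLs they touched, no help from Google required.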

So which is it, Google?

  • Everything at Google is still in BETA, so what do you expect?
  • Our engineering dept. just can't get all the bugs out; get over it
  • We peek regardless because we're Google and we can
I'd really like to know which it is, as the competition doesn't seem to break those rules.

1 comment:

Anonymous said...

You still catching the Googlebots where they shouldn't be?

Don't bother asking questions of Google as they don't have any answers, and my 15-month-old granddaughter can write better code than Google can.