Twenty years ago, the robots.txt file was created as a way to tell web crawlers what they should and should not index. The file disallows crawlers from visiting certain pages on a web site, keeping those pages out of search results. And, like any other anniversary in technology, Google is taking the opportunity to make a clever joke.
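For the unfamiliar, a typical robots.txt is just a few plain-text directives. A minimal sketch might look like this (the `/private/` path is a made-up example):

```
User-agent: *
Disallow: /private/
```

The `User-agent` line names which crawler the rule applies to (`*` means all of them), and each `Disallow` line lists a path the crawler is asked to stay away from.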
Google has uploaded a robots.txt file to its servers that can be found at Google.com/killer-robots.txt with instructions for the inevitable Terminator-style doomsday scenario. The file lists two user agents, the T-800 and T-1000, and instructs them to “disallow” the deaths of Sergey Brin and Larry Page. Fans of the Terminator movies will recognize both of those user agents as Terminator models.
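The joke file itself is only a few lines long. Its contents read roughly as follows (reproduced from memory, so treat the exact paths as approximate):

```
User-agent: T-1000
User-agent: T-800
Disallow: /+LarryPage
Disallow: /+SergeyBrin
```

In other words, the two Terminator models are politely asked not to crawl (read: terminate) the company's co-founders.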