We've rolled out a new tool in Webmaster Tools called the Robots.txt Generator. It's designed to give webmasters an easy and interactive way to build a robots.txt file. To access the tool, log in to your Google Webmaster Tools account, select one of your verified sites, and then click the Tools menu option on the left-hand side of the screen. You'll see a "Generate robots.txt" link among the tool options.
Using it can be as simple as entering the files and directories on your site that you don't want crawled by any search engine's robots. Or, if you need to, you can create fine-grained rules for specific robots and specific areas of your site.
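As a rough sketch of what the generator produces, a simple file that blocks all robots from a couple of directories might look like the following (the directory names here are just hypothetical examples):

```
# Block all robots from these directories
User-agent: *
Disallow: /private/
Disallow: /tmp/
```

A more fine-grained file can target a specific robot by name while giving everyone else a different rule:

```
# Googlebot may crawl everything except /archive/
User-agent: Googlebot
Disallow: /archive/

# All other robots are blocked from the whole site
User-agent: *
Disallow: /
```

Note that robots read the most specific matching User-agent group, so Googlebot would follow only the first group above.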
Once you're finished with the generator, feel free to test the effects of your new robots.txt file with our robots.txt analysis tool. When you're done, just save the generated file to the top level (root) directory of your site, and you're good to go. There are a couple of important things to keep in mind about robots.txt files:
- Not every search engine will support every extension to robots.txt files
The Robots.txt Generator creates files that Googlebot will understand, and most other major robots will understand them too. But it's possible that some robots won't understand all of the robots.txt features that the generator uses.
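For example, pattern-matching rules like the ones below are extensions that Googlebot supports, but some smaller robots may not recognize them and could fall back to ignoring the lines (the paths shown are hypothetical):

```
User-agent: Googlebot
# "*" matches any sequence of characters; "$" anchors the end of the URL
Disallow: /*.pdf$
Disallow: /search*
```

If you need a rule that every robot will understand, stick to plain Disallow lines with literal path prefixes.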
- Robots.txt is simply a request
Although it's highly unlikely that a major search engine would do so, some unscrupulous robots may ignore the contents of robots.txt and crawl blocked areas anyway. If you have sensitive content that you need to protect completely, put it behind password protection rather than relying on robots.txt.
Google, Search Engine, Indexing, Crawling, Webmaster Tools, Robots.txt