How can I use a robots.txt file to my advantage? How can I make one?

A robots.txt file is a plain text file that gives search engine robots/spiders instructions about how to crawl your website when they visit it.

Perhaps you have a web page that displays personal information, but not personal enough to password protect. You can easily put instructions in a robots.txt file telling robots/spiders to ignore that page and NOT put it in the search engine listings. This is also useful if you have duplicate content as a result of the script you're running: you can use robots.txt to have the search engine spiders ignore one of the duplicate pages while still letting them index and analyze the other. If you didn't already know, duplicate content on your site is very bad for search engine rankings. Here is a great link that touches on using robots.txt to disallow certain content from being listed:

http://www.robotstxt.org/robotstxt.html
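For example, a simple robots.txt covering both situations above might look like the following (the file names private.html and printable.html are only placeholders; substitute the actual pages on your site):

    User-agent: *
    Disallow: /private.html
    Disallow: /printable.html

The asterisk after User-agent means the rules apply to every robot, and each Disallow line names a page the robots are asked to skip, so it stays out of the search engine listings.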

The robots.txt file itself must be placed in your /public_html/ (document root) folder so robots can find it at http://yourdomain.com/robots.txt; the rules in it then apply to paths across your whole site. Robots only look for the file at the root of your domain, so rather than uploading separate robots.txt files to subdirectories, add Disallow rules for those folders to the single file in /public_html/.
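As a rough sketch, assuming you have folders named /temp/ and /cgi-bin/ that you don't want crawled (these folder names are just examples), the single robots.txt in /public_html/ could contain:

    User-agent: *
    Disallow: /temp/
    Disallow: /cgi-bin/

The trailing slash tells robots that the rule covers everything inside that folder, which gives you per-directory control without any extra files.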

