robots.txt: telling the search engines where to go
The robots.txt file instructs robots and spiders which parts of your site they may visit and which they should not index. For example, robots can be told to ignore login pages, shopping carts, or sensitive areas such as the cgi-bin directory.
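As a concrete illustration, a minimal robots.txt might look like the following (the paths here are examples, not paths from any particular site):

```text
# Rules applying to all crawlers
User-agent: *
# Keep crawlers out of sensitive or low-value areas
Disallow: /cgi-bin/
Disallow: /cart/
Disallow: /login/
```

The file lives at the root of the site (e.g. example.com/robots.txt); each `User-agent` line names which crawlers the following `Disallow` rules apply to, with `*` matching all of them.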
The main issue with robots.txt is that maliciously programmed robots, spiders, and crawlers will simply ignore the file, or even use it as a map to the areas you want to keep hidden. For that reason, robots.txt should never be treated as a security measure; it is a tool for shaping how well-behaved crawlers see your site and for bolstering your SEO. For actual access control you are much better served by server-level protection, such as an .htaccess file.
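To make the contrast concrete, here is a sketch of the .htaccess approach, using standard Apache 2.4 directives (the directory is illustrative):

```apacheconf
# Placed inside the directory you want to protect, e.g. /cgi-bin/
# Unlike robots.txt, this is enforced by the server for every request
Require all denied
```

Where robots.txt merely asks crawlers to stay out, a rule like this actually refuses the request, so even a crawler that ignores robots.txt cannot fetch the content.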
We include a robots.txt file in every site we create. The major search engines, such as Google and Bing, respect a well-developed robots.txt file and play nicely with it.