The robots.txt file is a plain text file placed in the root of your site. It tells search engine crawlers (like Google, Yahoo, and Bing) which pages they should not crawl and index. This is commonly used to keep crawlers out of areas such as feeds, trackbacks, and WordPress admin pages, and to avoid duplicate content being indexed, because those pages have no value in search results.
How to create a robots.txt file
i) Create a new text document (for example, in Notepad).
ii) Write your rules.
iii) Save the file as robots.txt and upload it to the root of your site, so it is reachable at example.com/robots.txt.
For example (WordPress CMS):
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /trackback/
Disallow: /comments/feed/
Disallow: /search?
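Before uploading, you can sanity-check rules like the ones above with Python's standard-library robots.txt parser. This is a minimal sketch: the rule text and the test paths are just the example values from this post.

```python
from urllib.robotparser import RobotFileParser

# The example rules from above, as a string (assumed sample content).
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /trackback/
Disallow: /comments/feed/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether a crawler matching "*" may fetch a given path.
print(parser.can_fetch("*", "/wp-admin/options.php"))  # False: blocked by rule
print(parser.can_fetch("*", "/2014/05/my-post/"))      # True: no rule matches
```

The same parser can also fetch a live file with `set_url(...)` and `read()`, which is handy for checking the robots.txt of a site that is already online.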