Definition:
A plain-text file placed at the root of a website (e.g., example.com/robots.txt) that tells search engine crawlers which URLs they may and may not crawl. Robots.txt files are used to manage crawler traffic and keep crawlers away from certain areas of a site, such as pages that are under construction or sections that are not yet ready to be published. Note that robots.txt controls crawling, not indexing: a URL blocked in robots.txt can still appear in search results if other pages link to it.
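
For illustration, a minimal robots.txt might look like the sketch below; the disallowed paths and the sitemap URL are placeholder examples, not required values.

# Applies to all crawlers
User-agent: *
# Keep crawlers out of unfinished sections (example paths)
Disallow: /under-construction/
Disallow: /drafts/

# Optionally point crawlers to the sitemap (example URL)
Sitemap: https://www.example.com/sitemap.xml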