The robots.txt file is a simple text file that tells search engine robots how to crawl and index the pages on your website. It must be placed in your site's root directory. With it, you can keep Web robots from accessing parts of your site that you want to keep private.
For example, if you do not want search engines to index certain scripts or files on your site that might contain email addresses, phone numbers, or other sensitive data, you can block them with a robots.txt file. Other directories that often contain sensitive data and that you may not want indexed include /cgi-bin/, /wp-admin/, /cart/, and /scripts/.
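As a sketch, a robots.txt file that blocks those directories for every crawler might look like this (the directory paths are the examples from above; adjust them to your own site):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /cart/
Disallow: /scripts/
```

`User-agent: *` applies the rules to all crawlers; each `Disallow` line blocks one path prefix.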
When a search engine crawler visits your site, it first looks for the robots.txt file, which tells the spider which pages of your site may be indexed and which should be ignored.
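You can check how a well-behaved crawler would interpret your rules using Python's standard-library robots.txt parser. This is a minimal sketch; the rules and the example.com URLs are placeholders:

```python
from urllib import robotparser

# Parse a robots.txt file given as a list of lines
# (RobotFileParser can also fetch one from a live site via set_url + read).
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /cgi-bin/",
    "Disallow: /wp-admin/",
])

# A blocked path is not fetchable; everything else is allowed.
print(rp.can_fetch("*", "https://example.com/wp-admin/settings.php"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))              # True
```

Note that robots.txt is advisory only: compliant crawlers honor it, but it is not an access control mechanism, so truly sensitive files should also be protected server-side.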
Use our Robots.txt generator to create a robots.txt file. Note: Google does not support the Crawl-delay directive.
Now create a file named 'robots.txt' in your root directory, then copy the generated text and paste it into that file.