The robots.txt file plays an important part in SEO because it instructs search engine robots on which parts of your website they may crawl. Through this file, you can prevent bots from accessing certain pages and directories within your site. Some of the WordPress directories you typically don't want robots to access are /wp-admin/, /wp-includes/, /feed/ and /trackback/. You can also tell search engines about your sitemap by adding a Sitemap line to your robots.txt file:
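A common form of that line, assuming your sitemap sits at the default /sitemap.xml path (substitute your own domain and sitemap URL), is:

```
Sitemap: http://domain.com/sitemap.xml
```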
To get you started, follow these steps:
1. In a text editor such as Notepad, create a file named robots.txt with rules that keep all robots from accessing the above-mentioned directories:
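A minimal version of those rules, applying to all user agents and blocking the directories listed earlier (adjust the list to suit your site), might look like:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /feed/
Disallow: /trackback/
```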
2. Log in to your site via FTP, navigate to the /public_html folder, and upload the robots.txt file there (for example, by dragging and dropping it into that folder).
If you are using GoDaddy hosting and can't find the public_html folder, save the robots.txt file to the root directory, which serves as the public area for the account.
3. To make sure everything worked, visit http://domain.com/robots.txt in your browser (using your own domain). You should see the exact lines of code you saved in the text editor.
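If you want to double-check that the rules behave as intended, Python's standard urllib.robotparser module can evaluate them locally. Here is a quick sketch, using domain.com as a placeholder domain:

```python
from urllib.robotparser import RobotFileParser

# The same rules we saved in robots.txt
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /feed/
Disallow: /trackback/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A blocked directory: crawlers obeying the rules will not fetch it
print(rp.can_fetch("*", "http://domain.com/wp-admin/"))  # False
# An ordinary page is still crawlable
print(rp.can_fetch("*", "http://domain.com/about/"))     # True
```

This only checks how a well-behaved crawler would interpret the rules; it does not contact your server.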
To review Google's robots.txt specification, go to: