With the help of a robots.txt file, you can tell robots and crawling bots which content on your site is important to display in search engines. It is a simple method for blocking crawlers from the parts of your website that do not need to be crawled.
Doing so helps increase your website's visibility in search results and builds a better reputation with search engines. You can generate a robots.txt file manually with the help of Webmaster Tools quite easily. Follow the step-by-step instructions below.
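At its simplest, a robots.txt file pairs a User-agent line (naming which crawler a rule applies to) with Allow or Disallow lines (naming which path prefixes it may or may not crawl). As a minimal sketch, where /private/ is only a placeholder directory name:

User-agent: *
Disallow: /private/

This asks every crawler to skip anything under /private/ while leaving the rest of the site open.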
Generate a Robots.txt File Manually in Webmaster Tools:
- Sign in to Google Webmaster Tools and select the website for which you want to generate a robots.txt file.
- In the left-hand menu of Webmaster Tools, select Site configuration, then Crawler access.
- Click the Generate robots.txt option. For the first option, Choose default crawler access, select the Allow all radio button.
- For the second option, Specify additional rules (optional), select Block from the Action drop-down menu; this is where you block specific directories or files from being crawled by search engines. Select All robots from the User agents drop-down menu.
- Type the directory or file path in the Directories and files box and click Add rule.
- You can add rules one by one to block each directory or file you want kept out of crawling.
When you finish, simply download your new robots.txt file and upload it to your website's root directory on the server. A downloaded file might look something like the sketch below.
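As a rough sketch, assuming you blocked a single directory (/example-secret/ is a hypothetical name for illustration, not something the tool produces on its own):

User-agent: *
Allow: /
Disallow: /example-secret/

Once uploaded, crawlers fetch this file from your domain root, e.g. http://www.yoursite.com/robots.txt, where yoursite.com stands in for your own domain.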
You can also look at Google's own robots.txt file at this link: www.google.com/robots.txt. As another example, on my WordPress blog's file at bloggerbonus.com/robots.txt, I used robots.txt to block a few directories from robots so they do not appear in search engines.
This Video Will Help You Generate a Robots.txt File:
Here is an example of a robots.txt file:
User-agent: *
Allow: /
Disallow: /wp/wp-/
Disallow: /go/
Disallow: /wp-content/cache
Disallow: /wp-content/plugins/
In the first line, User-agent: *, the asterisk (*) addresses all robots. In the next line, Allow: /, the forward slash (/) means all directories and paths are allowed.
Together, these first lines allow robots to crawl all bloggerbonus.com directories, files, the sitemap, and posts.
Now here is the twist: the next lines block the directories that you do not want to appear in search results. For example, Disallow: /wp/wp-/ tells robots not to crawl the /wp/wp- directory path, because Disallow excludes that path from crawling. In the same way, Disallow: /go/, Disallow: /wp-content/cache, and Disallow: /wp-content/plugins/ all instruct robots not to crawl those directories, so they will be excluded the next time the site is crawled.
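To make the matching concrete, here is how a few hypothetical URLs (invented for illustration) would fare under the rules above:

http://bloggerbonus.com/go/some-link - blocked by Disallow: /go/
http://bloggerbonus.com/wp-content/cache/page.html - blocked by Disallow: /wp-content/cache
http://bloggerbonus.com/wp-content/uploads/photo.jpg - crawlable, since no rule matches it
http://bloggerbonus.com/some-blog-post/ - crawlable under Allow: /

Disallow works on path prefixes, so each rule blocks the named directory and everything beneath it.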