Robots.txt made simple

For beginner WordPress bloggers, creating a search-bot-friendly robots.txt can be a headache. It certainly was for me: I blindly copied a robots.txt from another site, which ended up blocking all my posts from the search engines. I later found out that the file I copied deliberately blocked every URL containing a ?, which matches every post under WordPress's default permalink format.

Having learned that lesson the hard way, I found that a simple robots.txt does the job just fine. Don't copy a robots.txt from sites that claim theirs works SEO magic.

So how do you create a simple robots.txt for your new WordPress blog?

  • You want to allow all search bots to crawl your site, so start your robots.txt with this line (the asterisk matches every bot):
User-agent: *
  • Create a sitemap and submit it to Google Webmaster Tools. How to create one? Just install the google-sitemap-generator plugin, which builds and updates the sitemap for you.
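If you're curious what the plugin actually produces, a sitemap is just a small XML file listing your URLs. A minimal sketch (the URL and date here are placeholders, not real entries):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/my-first-post/</loc>
    <lastmod>2010-01-15</lastmod>
  </url>
</urlset>
```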
  • Block search bots from crawling certain directories – the ones that hold no content and exist only for administrative or maintenance tasks. For WordPress, you certainly want to block these three. Remember, don't disallow wp-content, because that directory holds your site's content (uploads, themes, and so on).
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
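Before deploying, you can sanity-check rules like these with Python's standard-library robots.txt parser. A small sketch, using the three Disallow lines above and a placeholder blog URL:

```python
from urllib.robotparser import RobotFileParser

# The same rules as above, held in a string for testing.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Admin and include directories are blocked for every bot...
print(parser.can_fetch("*", "https://example.com/wp-admin/"))        # False
print(parser.can_fetch("*", "https://example.com/wp-includes/js/"))  # False
# ...while regular posts stay crawlable.
print(parser.can_fetch("*", "https://example.com/my-first-post/"))   # True
```

This is handy whenever you tweak the file: one wrong Disallow line can hide your whole blog, as I learned the hard way.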
  • If you feel like generating a robots.txt automatically, here is a robots.txt generator you can try; playing with it can also help you better understand how robots.txt works.
  • Still want to know more? You can actually learn a lot from your favorite sites: just append robots.txt to the end of their URL to see how they configured theirs. For example, you can see my robots.txt by this url.
  • Feeling lazy? You should be just fine if your robots.txt looks similar to this (remember to supply your own site's URL in the Sitemap line):
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Sitemap: https://example.com/sitemap.xml
