How to Optimize WordPress Robots.txt File for SEO

You are lucky that WordPress automatically creates a robots.txt file for you. But having the file is only half the battle; you also have to make sure the robots.txt file is optimized to get the full SEO benefit.

What is a robots.txt file?

A robots.txt file is a plain text file that instructs search engine bots how to crawl and index a site. Whenever a search engine bot visits your site, it reads the robots.txt file first and follows its instructions.

The basic format for a robots.txt file looks like this:

User-agent: [user-agent name]
Disallow: [URL string not to be crawled]
User-agent: [user-agent name]
Allow: [URL string to be crawled]
Sitemap: [URL of your XML Sitemap]

What does an ideal robots.txt file look like?

The first line usually names a user agent. The user agent is the name of the search bot you’re trying to communicate with. For example, Googlebot or Bingbot.
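For instance, a rule block aimed only at Googlebot might look like the sketch below (the /example-page/ path is just a placeholder, not a recommendation for your site):

User-agent: Googlebot
Disallow: /example-page/

Any bot other than Googlebot would ignore this block and fall back to the rules under User-agent: *, if present.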

To view or edit the file, you can use the Yoast SEO plugin. Once you’ve installed and activated the Yoast SEO plugin, go to WordPress Admin Panel > SEO > Tools.

Then click on “File editor”.

There are three main commands:

  • User-agent — Specifies which search engine bot the rules apply to, such as Googlebot or Bingbot. You can use an asterisk (*) to refer to all search engine bots.
  • Disallow — Tells search engines not to crawl and index certain parts of your site.
  • Allow — Tells search engines which parts of your site they may crawl and index.

Here’s a sample robots.txt file. It lets every bot crawl the whole site except the /wp-admin/ directory:

User-agent: *
Disallow: /wp-admin/
Allow: /
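Building on that sample, an optimized robots.txt for a typical WordPress site often looks something like the sketch below. Treat it as a starting point with two assumptions: the admin-ajax.php line assumes your theme or plugins make front-end AJAX calls through it, and the Sitemap URL is a placeholder you should replace with your site's actual XML sitemap address.

User-agent: *
Disallow: /wp-admin/
# admin-ajax.php handles front-end AJAX requests, so it is re-allowed here
Allow: /wp-admin/admin-ajax.php

# Placeholder: point this at your real XML sitemap URL
Sitemap: https://example.com/sitemap_index.xml

Listing the sitemap in robots.txt gives bots a direct pointer to every URL you want crawled, which complements the Disallow rules above.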

If you found this post helpful, please share it on Facebook, Twitter, or Google+.
