Robots.txt WordPress: Explanation, Importance & How To Optimize 2022

If you operate a WordPress website, then you need to know about robots.txt files.

In this article, we’ll explain what robots.txt files are and why you should use one on your WordPress website.

We’ll also show you how to optimize your robots.txt file for better performance.

What Is a Robots.txt File?

A robots.txt file is a text file that provides search engine robots with instructions.

Here’s an example of what a robots.txt file might look like.

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

In the example above, robots may crawl everything on the website except the wp-admin folder, with one carve-out: the admin-ajax.php file inside it stays crawlable.

What Does a Robots.txt File Do?

The robots.txt file helps search engines understand which pages they should crawl and index.

Best Robots.txt for WordPress

For most WordPress websites, the best robots.txt file is a simple one that lets robots crawl every page on the site.
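
A minimal "allow everything" file looks like this; an empty Disallow value permits all crawling, and the sitemap URL is a placeholder for your own:

User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml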

Do You Need a Robots.txt File for Your WordPress Site?

There are pros and cons to using a robots.txt file for a website.

On the plus side, a robots.txt file can help ensure that only relevant pages show up in the search engines.

By keeping crawlers focused on your important content, it can also indirectly improve how your site performs in search results.

However, there’s the potential for negative consequences if your robots.txt file is not set up correctly.

For example, if you accidentally tell Googlebot to ignore your entire website, your pages may stop being crawled and can eventually drop out of search results.

Carefully consider whether or not you need a robots.txt file for your WordPress site.

If you’re not sure, it’s generally best to err on the side of caution and include a robots.txt file.

How To Add Robots.txt File for WordPress

When creating your robots.txt file, keep the following items in mind (see the example after this list).

  • Write directives in lowercase, and remember that the paths they reference are case-sensitive
  • Include a trailing slash when targeting a directory (e.g. /path/to/page/)
  • Use wildcards sparingly, since not every crawler supports them
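
Because paths are case-sensitive, the rule below blocks /Private/ but not /private/ (the directory name is just a placeholder):

User-agent: *
Disallow: /Private/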

Use the correct file extension.

Save your robots.txt file with a .txt extension, not .doc, .rtf, or any other extension.

Where Is the WordPress robots.txt File Located?

The robots.txt file goes in the root directory of your WordPress site.

For example, if your WordPress site is example.com, place the robots.txt file at example.com/robots.txt.

Some SEO and robots.txt plugins also let you view and edit the file from the WordPress admin dashboard, as covered below.

How To Use a Robots.txt File in WordPress?

Generally speaking, there are two ways to use a robots.txt file in WordPress.

  1. Using a WordPress plugin
  2. Manually adding the robots.txt file to your WordPress site

The easiest way to manage your robots.txt file is by using a plugin.

A plugin allows you to easily add, edit, and delete robots.txt rules from your WordPress admin dashboard.

To use a plugin, first install and activate a dedicated robots.txt editor plugin.

Once activated, open the plugin’s settings screen (some robots.txt plugins add their options under Settings > Reading) and scroll down to the robots.txt section.

In the robots.txt section, you’ll see a list of rules that the plugin has automatically generated.

You can edit these rules or add new ones by clicking on the Edit button.

Alternatively, you can add the robots.txt file to your WordPress site manually.

To do this, you’ll need to connect to your WordPress site using an FTP client or File Manager in cPanel.
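
On many hosts the web root is the public_html folder, so the file typically ends up at a path like the one below (exact layouts vary by host):

/public_html/robots.txt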

What Does an Ideal Robots.txt File Look Like?

You will find the following terms in a robots.txt file.

User-agent: This directive specifies which robot you want to target.

For example, Googlebot is the user agent for Google’s crawler.

Allow: This directive explicitly permits robots to crawl a page, and it’s most often used to open an exception inside an otherwise disallowed path.

Disallow: This directive tells search bots which pages or directories not to crawl.

Sitemap: This directive specifies the location of your website’s XML sitemap file.

Ideally, your robots.txt file should look something like this.

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml

The above robots.txt file tells all web crawlers that they can access all of the content on your website except for the /cgi-bin/, /tmp/, and /admin/ directories.

Additionally, it includes a Sitemap directive, which must use a full absolute URL, telling web crawlers where to find your XML sitemap.

How To Optimize WordPress Robots.txt File for Better SEO

Use the following steps to optimize your robots.txt file for search engine visibility and better reporting in Google Search Console or Bing Webmaster Tools.

First, make sure your robots.txt file sits in the root directory of your WordPress site.

If it isn’t there, search engines may not find it at all.

Second, you should use the robots.txt file to block any pages on your website that don’t add value.

For example, you may want to block the /admin/ page or any duplicate content pages.
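
A sketch of what that might look like; both paths are placeholders for whatever low-value URLs exist on your site:

User-agent: *
Disallow: /admin/
Disallow: /print/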

Third, you should use robots.txt to specify the location of your XML sitemap.

It will help search engines index your website more efficiently.
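
The directive takes a full absolute URL, for example:

Sitemap: https://example.com/sitemap.xml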

Finally, make sure your robots.txt file is well-organized and easy to understand.

If it’s not, search engines may have difficulty parsing it.

How To Create and Edit a Robots.txt File With Yoast SEO

Using the Yoast SEO plugin, you can easily create and edit a robots.txt file from within your WordPress admin area.

To do this, head to SEO > Tools > File editor and select the robots.txt section.

You can add any directives you want to include in your robots.txt file.

It’s important to note that Yoast SEO will automatically generate a robots.txt file for you if one doesn’t already exist.

How To Create and Edit a Robots.txt File With All in One SEO

You can also create and edit a robots.txt file with the All in One SEO plugin.

You’ll do this from the WordPress dashboard.

Go to the All in One SEO menu and look for the robots.txt editor (in recent versions it lives under the plugin’s Tools section).

If you don’t see it, click on the cog icon to open the settings.

Once you’re in the robots.txt settings, simply add any directives you want to include in your file.

Like Yoast SEO, All in One SEO will automatically generate a robots.txt file.

How To Create and Edit a Robots.txt File via FTP

If you don’t want to use a plugin, you can also create and edit your robots.txt file via FTP.

You will need to connect to your WordPress site using an FTP client like FileZilla.

Once connected, navigate to the root directory of your WordPress site.

If you don’t see a robots.txt file, you can create one using a text editor like Notepad++.

After opening the robots.txt file, add any directives you want to include.

When finished, save your changes and upload the robots.txt file to your WordPress site.

It’s important to note that the robots.txt filename is case-sensitive on most servers.

So, make sure you don’t accidentally name the file Robots.txt or ROBOTS.TXT.

Doing so could prevent your robots.txt file from working properly.
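
Crawlers request the lowercase path exactly, so on a case-sensitive server only the first of these would ever be fetched (example.com is a placeholder):

https://example.com/robots.txt (correct)
https://example.com/Robots.txt (never requested by crawlers)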

What You Should Not Use Robots.txt For

It’s just as important to understand what you should not use robots.txt for.

Don’t use the robots.txt file as a way to hide sensitive or confidential information on your website.

If you want to block pages containing customer information or credit card numbers, you should use password protection instead.
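
As an illustration, on an Apache server you could protect a directory with HTTP basic authentication instead. This is a minimal sketch, and the AuthUserFile path is a placeholder for wherever your credentials file actually lives:

# .htaccess in the directory you want to protect
AuthType Basic
AuthName "Restricted area"
AuthUserFile /home/youruser/.htpasswd
Require valid-user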

To keep duplicate content out of search results, a rel="canonical" tag or a 301 redirect is usually the better tool.
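
For example, a canonical tag placed in the duplicate page’s <head> points search engines at the preferred URL (the URL here is a placeholder):

<link rel="canonical" href="https://example.com/original-page/" />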

Many websites do use the robots.txt file for duplicate content, so this isn’t a hard rule.

How To Create Different Rules for Different Robots

Let’s review four specific rules you may want to use with your website’s robots.txt file.

How To Use Robots.txt to Block Access to Your Entire Site

If you want to block all robots from accessing your WordPress site, you can do so by adding the following lines to your robots.txt file.

User-agent: *
Disallow: /

The “User-agent” directive specifies which robots the rules that follow apply to.

The asterisk (*) is a wildcard that matches all robots.

The “Disallow” directive tells robots not to crawl whatever path it lists.

In this case, the path is /, which matches every page on the site.

How To Use Robots.txt to Block a Single Bot From Accessing Your Site

If you want to block a specific bot from accessing your WordPress site, you can do so by adding the following lines to your robots.txt file:

User-agent: Googlebot
Disallow: /

In this example, we’re blocking the Googlebot from crawling our WordPress site.

You can replace “Googlebot” with the name of any other bot you want to block.

How To Use Robots.txt to Block Access to a Specific Folder or File

Use the following code to block robots from accessing a specific folder or file on your WordPress site.

User-agent: *
Disallow: /wp-admin/

The above code blocks robots from accessing the wp-admin folder on a WordPress site.
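
Blocking a single file works the same way; the file path below is just a placeholder:

User-agent: *
Disallow: /private-report.pdf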

How To Create Different Rules for Different Bots in Robots.txt

You can create different rules for different bots by adding the following code to your robots.txt file.

User-agent: Googlebot
Disallow: /wp-admin/

User-agent: Bingbot
Disallow: /wp-includes/

The above code blocks Googlebot from accessing the wp-admin folder and Bingbot from accessing the wp-includes folder.

You can replace “Googlebot” and “Bingbot” with the names of other bots you want to block.

How To Test Your Robots.txt File

Go to Google Search Console to test your robots.txt file.

Inside the Console, select your website, and click on the “Robots.txt Tester” tool.

The tool loads your site’s live robots.txt file, and any syntax warnings or logic errors are highlighted directly in the editor.

You can also enter individual URLs from your site to test whether your rules block them or allow them to be crawled.
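
For a quick sanity check outside Search Console, you can also fetch the live file directly and eyeball it:

curl https://example.com/robots.txt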

Wrapping Up

The robots.txt file is an important part of any WordPress blog.

It’s used to tell search engines which pages they should and shouldn’t crawl.

It can also specify the location of your XML sitemap.

If you don’t already have a robots.txt file on your WordPress site, you should create one immediately.
