How to Configure Robots.txt and Meta Robots for Magento 2

If you are reading this post, chances are you either own a Magento 2 store or are responsible for its SEO. Either way, you must be aware of the importance of the robots.txt file. It is a text file that instructs web robots, i.e., search engine crawlers, which pages to crawl and which to skip!

In the words of Google,
“A robots.txt file tells search engine crawlers which pages or files the crawler can or can’t request from your site.”

This tiny file belongs on EVERY website, and Magento 2 stores are no exception. Fortunately, default Magento 2 lets you create a robots.txt file, and I’ll show you how to do so here.

Meta robots tags are a way for webmasters to give search engines details about their stores. A meta robots tag is a piece of code in the <head> section of a webpage that tells search engines whether to index the page and which of its links to follow.

Create a Magento 2 robots.txt file and support your SEO efforts, because the file:

  • Controls how search engine spiders see and interact with the pages of your Magento 2 store.
  • Forms the foundation of how search engines crawl your site.
  • Keeps duplicate-content pages from appearing in SERPs.
  • Helps avoid overloading your store with requests from Google’s crawler.
  • Can hurt your rankings if used improperly.

Now that we’ve talked enough about the robots.txt and its importance for Magento 2 store, let’s create one!

Steps to create Magento 2 Robots.txt:

  1. Log in to Admin Panel.
  2. Navigate to Content > Design > Configuration

  3. Expand the Search Engine Robots section.
    1. Default Robots: Choose one of the options from the drop-down (e.g., INDEX, FOLLOW).
    2. Edit Custom Instruction of Robots.txt File: Enter any custom directives in this field if needed.
    3. Reset to Default: Use this button to restore the Magento 2 robots.txt file to its default configuration.
  4. Save the configuration.

The robots exclusion protocol (or standard), better known as robots.txt, can be configured in default Magento 2 with the above steps.

Why Should I Set NoIndex NoFollow Tags to Links in Magento 2?

For example, say you are launching a new product in your Magento 2 store, but your team is still working on the page. For the time being, you can set that product page to NoIndex to tell search engines not to index it.

This way, you can test the changes in the live store while keeping the page out of search results.

Also, NoFollow tags on links are useful when you want to point visitors to additional information at a particular web address without passing link equity to it.

Meta robots tags: NOINDEX, NOFOLLOW

Now that you have created the robots.txt file successfully, pay attention to the meta robots tags. Use these tags to hide unnecessary pages from crawlers.

  • Noindex hides the page from indexation, keeping it out of search results entirely.
  • Nofollow stops the transfer of page weight to uncertified sources; it is especially useful on pages with a large number of external links.

Apply noindex or nofollow using the <meta name="robots"> tag. Note that Google no longer honors a noindex directive inside robots.txt, so use the meta tag (or an X-Robots-Tag HTTP header) when you need to control indexation.

Possible Combinations:

  1. INDEX, FOLLOW: Instructs web crawlers to index the page and follow the links on it.
  2. NOINDEX, FOLLOW: Instructs web crawlers not to index the page but still follow its links.
  3. INDEX, NOFOLLOW: Instructs web crawlers to index the page but not follow its links.
  4. NOINDEX, NOFOLLOW: Instructs web crawlers neither to index the page nor follow its links.
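In HTML, these combinations map onto the meta robots tag placed in the <head> section of a page, like so:

```html
<!-- Index the page and follow its links (the default behavior) -->
<meta name="robots" content="index, follow">

<!-- Keep the page out of search results but still crawl its links -->
<meta name="robots" content="noindex, follow">

<!-- Index the page but do not follow or pass equity through its links -->
<meta name="robots" content="index, nofollow">

<!-- Neither index the page nor follow its links -->
<meta name="robots" content="noindex, nofollow">
```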

Add the following code to the robots.txt file in order to hide specific pages:
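A typical example follows; the paths here are placeholders that you should replace with the pages of your own store you want hidden:

```text
User-agent: *
Disallow: /some-private-page.html
Disallow: /landing/test-page/
```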

Alternatively, you can restrict indexation with the following code:
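For instance, placing this tag in the <head> of a page keeps it out of the index and stops crawlers from following its links:

```html
<meta name="robots" content="noindex, nofollow">
```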

The recommended default setting for a live Magento 2 store is INDEX, FOLLOW; use NOINDEX, NOFOLLOW on staging or development sites so they stay out of search results.

Check these frequently used Magento 2 robots.txt file examples:

  • Allow full access to all directories and pages:
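An empty Disallow directive permits every crawler to access everything:

```text
User-agent: *
Disallow:
```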

  • Don’t allow access for any user-agent to any directory and page:
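A single slash blocks the entire store for all user-agents:

```text
User-agent: *
Disallow: /
```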

  • Default Instructions:
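A commonly used baseline for Magento 2 looks like the following; the sitemap URL is a placeholder for your own store’s domain:

```text
User-agent: *
Disallow: /lib/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /var/
Sitemap: https://www.example.com/sitemap.xml
```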

  • Restrict User Accounts and Checkout Pages
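A typical set of rules for keeping account and checkout pages away from crawlers (adjust the paths if your store uses a custom checkout):

```text
User-agent: *
Disallow: /checkout/
Disallow: /onestepcheckout/
Disallow: /customer/
Disallow: /customer/account/
Disallow: /customer/account/login/
```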

  • To disallow duplicate content
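Rules along these lines are often used to keep duplicate-content URLs, such as session-ID variants, out of the index:

```text
User-agent: *
Disallow: /tag/
Disallow: /review/
Disallow: /*?SID=
```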

  • To disallow CMS directories
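For example, to keep crawlers out of Magento’s internal directories:

```text
User-agent: *
Disallow: /app/
Disallow: /bin/
Disallow: /dev/
Disallow: /lib/
Disallow: /phpserver/
Disallow: /pkginfo/
```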

  • To disallow Catalog and Search Pages
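A common example for catalog search and product comparison pages:

```text
User-agent: *
Disallow: /catalogsearch/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
```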

  • To disallow URL Filter Searches
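Layered-navigation filter parameters generate near-duplicate URLs; rules like these keep them out of the index (the parameter names below are the Magento defaults and may differ on your store):

```text
User-agent: *
Disallow: /*?dir=
Disallow: /*?mode=
Disallow: /*?limit=all
Disallow: /*?price=
```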

Common Web Crawlers:

Here are some common bots on the internet: Googlebot (Google), Bingbot (Bing), Slurp (Yahoo), DuckDuckBot (DuckDuckGo), Baiduspider (Baidu), and YandexBot (Yandex).

Communicate with search engines the correct way! After creating the robots.txt file successfully, you can validate it using Google’s robots.txt Tester.

Create Magento 2 Robots.txt file and improve your SEO today!

Feel free to ask for any help with the topic in the Comments section below.

Do rate the post with 5 stars if it was helpful.

Thanks 😊


Sanjay Jethva

Sanjay is a co-founder at Meetanshi. He is a Certified Magento Developer who loves creating Magento e-commerce solutions. Owing to his contributions and solutions posted on the Magento Forums, he was among the top 50 contributors of the Magento community in 2019. When he is not engrossed in anything related to Magento, he loves to play cricket.
