
How to Configure Robots.txt and Meta Robots for Magento 2

By Sanjay Jethva · Updated on May 22, 2025 · 6 min read

If you are reading this post, you are probably either a Magento 2 store owner or the person responsible for the store's SEO, so you know how important the Magento robots.txt file is. It is a plain text file that tells web robots, i.e., search engine crawlers, which pages to crawl and which to skip.

In the words of Google,
“A robots.txt file tells search engine crawlers which pages or files the crawler can or can’t request from your site.”

This tiny file is a part of almost every website, and Magento 2 stores are no exception. Fortunately, default Magento 2 supports generating a robots.txt file, and I'll show you how to do so here.

Meta robots tags are a way for webmasters to give search engines details about their stores. A meta robots tag is a piece of code in the <head> section of a webpage that tells search engines whether to index that page and which of its links to follow.

Create a Magento 2 robots.txt file to assist your SEO efforts, because it:

  • Controls how search engine spiders see and interact with the pages of your Magento 2 store.
  • Can negatively affect your store's rankings if used improperly.
  • Forms the foundation of how search engines crawl your site.
  • Keeps duplicate content pages from appearing in SERPs.
  • Helps avoid overloading your store with requests from Google's crawler.

Now that we’ve talked enough about robots.txt and its importance for Magento 2 store, let’s create one!

Steps to create Magento 2 Robots.txt:

1. Log in to Admin Panel.

2. Navigate to Content > Design > Configuration


3. Expand the Search Engine Robots section.

  • Default Robots: Select one of the options (INDEX, FOLLOW / NOINDEX, FOLLOW / INDEX, NOFOLLOW / NOINDEX, NOFOLLOW) from the drop-down.
  • Edit Custom Instruction of Robots.txt File: Enter any custom directives in this field if needed.
  • Reset to Default: Use this button to restore the Magento 2 robots.txt file to its default configuration.
  • Save the configuration.

The steps above configure the robots exclusion protocol (also known as the robots exclusion standard, or simply robots.txt) in default Magento 2.
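If you want to sanity-check the rules your configuration produces, Python's standard-library robots.txt parser can simulate a crawler's view. This is an illustrative sketch with hypothetical example rules and URLs; note that urllib.robotparser only handles plain prefix rules, not the * and $ wildcards.

```python
from urllib.robotparser import RobotFileParser

# A couple of plain prefix rules, of the kind Magento writes out
# (example rules only, not a complete file).
rules = """\
User-agent: *
Disallow: /checkout/
Disallow: /customer/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Paths under a disallowed prefix are reported as blocked.
print(parser.can_fetch("*", "https://example.com/checkout/cart/"))    # False
print(parser.can_fetch("*", "https://example.com/customer/account/")) # False
# Anything else stays crawlable.
print(parser.can_fetch("*", "https://example.com/shoes.html"))        # True
```

Running this against your live file (via `parser.set_url(...)` and `parser.read()`) is a quick way to confirm a deploy did not accidentally block the whole store.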

Why Should I Set NoIndex and NoFollow Tags in Magento 2?

For example, suppose you are launching a new product in your Magento 2 store, but your team is still working on it. For the time being, you can set that product page to NoIndex to tell search engines not to index it.

This way, you can test the changes on the live store while preventing search engines from indexing the page.

Also, NoFollow tags on links are useful when you want to point to additional information at a particular web address without passing link equity to it.

Meta robots tags: NOINDEX, NOFOLLOW

Now that you have created robots.txt successfully, pay attention to the meta robots tags. Use these tags to keep the unnecessary parts of your store away from crawlers.

  • NoIndex hides the page from indexation, so it never appears in search results.
  • NoFollow restricts the transfer of page weight to the linked sources; it is useful for pages with a large number of external links.

Apply these directives by either disallowing the URL in the robots.txt file (which blocks crawling) or adding a <meta name="robots"> tag to the page (which controls indexing and link following).

Possible Combinations:

  1. INDEX, FOLLOW: Instructs crawlers to index the page and follow the links on it.
  2. NOINDEX, FOLLOW: Instructs crawlers not to index the page, but still follow its links.
  3. INDEX, NOFOLLOW: Instructs crawlers to index the page, but not follow its links.
  4. NOINDEX, NOFOLLOW: Instructs crawlers neither to index the page nor to follow its links.
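To make the four combinations concrete, here is a small illustrative Python helper (not part of Magento, purely a sketch) that interprets a meta robots content value as its two flags, defaulting to index, follow when a directive is absent:

```python
def parse_meta_robots(content: str):
    """Return (index, follow) booleans for a meta robots content value."""
    directives = {d.strip().lower() for d in content.split(",")}
    index = "noindex" not in directives
    follow = "nofollow" not in directives
    return index, follow

print(parse_meta_robots("NOINDEX, FOLLOW"))  # (False, True)
print(parse_meta_robots("index, nofollow"))  # (True, False)
```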

Add the following to the robots.txt file to block crawlers from a specific page:

User-agent: *
Disallow: /myfile.html

Alternatively, you can restrict indexation with the following code:

<html>
  <head>
    <meta name="robots" content="noindex, follow"/>
    <title>Page title</title>
  </head>
</html>

The recommended default settings for Magento 2:

User-agent: *

# Directories
Disallow: /app/
Disallow: /bin/
Disallow: /dev/
Disallow: /lib/
Disallow: /phpserver/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /setup/
Disallow: /update/
Disallow: /var/
Disallow: /vendor/

# Paths (clean URLs)
Disallow: /index.php/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /control/
Disallow: /contacts/
Disallow: /customer/
Disallow: /customize/
Disallow: /newsletter/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /wishlist/

# Files
Disallow: /composer.json
Disallow: /composer.lock
Disallow: /CONTRIBUTING.md
Disallow: /CONTRIBUTOR_LICENSE_AGREEMENT.html
Disallow: /COPYING.txt
Disallow: /Gruntfile.js
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /nginx.conf.sample
Disallow: /package.json
Disallow: /php.ini.sample
Disallow: /RELEASE_NOTES.txt

# Do not index pages that are sorted or filtered.
Disallow: /*?*product_list_mode=
Disallow: /*?*product_list_order=
Disallow: /*?*product_list_limit=
Disallow: /*?*product_list_dir=

# Do not index session ID
Disallow: /*?SID=
Disallow: /*?
Disallow: /*.php$

# CVS, SVN directories and dump files
# (robots.txt matching is case-sensitive, so keep extensions lowercase)
Disallow: /*.CVS
Disallow: /*.zip$
Disallow: /*.svn$
Disallow: /*.idea$
Disallow: /*.sql$
Disallow: /*.tgz$
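Several of the lines above use the * and $ wildcards, which Google supports but Python's standard robots parser does not. A minimal sketch of Google-style pattern matching (a regex translation; simplified, ignoring longest-match precedence between Allow and Disallow rules) looks like this:

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Google-style robots.txt matching: '*' matches any run of
    characters and a trailing '$' anchors the end of the URL.
    Simplified sketch: rule precedence is not handled."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

print(robots_pattern_matches("/*.php$", "/index.php"))       # True
print(robots_pattern_matches("/*.php$", "/index.php/home"))  # False
print(robots_pattern_matches("/*?*product_list_order=",
                             "/shoes.html?product_list_order=price"))  # True
```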

Check these frequently used Magento 2 robots.txt file examples:

  • Allow full access to all directories and pages:
User-agent: *
Disallow:
  • Don’t allow access for any user-agent to any directory and page:
User-agent: *
Disallow: /
  • Default Instructions:
Disallow: /lib/
Disallow: /*.php$
Disallow: /pkginfo/
Disallow: /report/
Disallow: /var/
Disallow: /catalog/
Disallow: /customer/
Disallow: /sendfriend/
Disallow: /review/
Disallow: /*SID=
  • Restrict User Accounts and Checkout Pages
Disallow: /checkout/
Disallow: /onestepcheckout/
Disallow: /customer/
Disallow: /customer/account/
Disallow: /customer/account/login/
  • To disallow duplicate content
Disallow: /tag/
Disallow: /review
  • To disallow CMS directories
Disallow: /app/
Disallow: /bin/
Disallow: /dev/
Disallow: /lib/
Disallow: /phpserver/
Disallow: /pub/
  • To disallow Catalog and Search Pages
Disallow: /catalogsearch/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
  • To disallow URL Filter Searches
Disallow: /*?dir*
Disallow: /*?dir=desc
Disallow: /*?dir=asc
Disallow: /*?limit=all
Disallow: /*?mode*

Common Web Crawlers:

Here are some common bots on the internet:

User-agent: Googlebot
User-agent: Googlebot-Image/1.0
User-agent: Googlebot-Video/1.0
User-agent: Bingbot
User-agent: Slurp		# Yahoo
User-agent: DuckDuckBot
User-agent: Baiduspider
User-agent: YandexBot
User-agent: facebot		# Facebook
User-agent: ia_archiver		# Alexa
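Because robots.txt rules are grouped per User-agent, different crawlers can receive different instructions, and a bot uses the most specific group that names it, falling back to the * group otherwise. A quick illustration with Python's standard parser (hypothetical rules):

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Googlebot
Disallow: /checkout/

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot uses its own group: only /checkout/ is off limits.
print(parser.can_fetch("Googlebot", "https://example.com/shoes.html"))  # True
# Every other bot falls back to the catch-all group and is blocked.
print(parser.can_fetch("SomeOtherBot", "https://example.com/shoes.html"))  # False
```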

Communicate with search engines the correct way! After successfully creating the Magento robots.txt file, you can validate it using Google's robots.txt Tester.

Create a Magento 2 robots.txt file and improve your SEO today!

Do rate the post with 5 stars if it was helpful.

Thanks

Article by Sanjay Jethva

Sanjay is the co-founder and CTO of Meetanshi, with hands-on expertise in Magento since 2011. He specializes in complex development, integrations, extensions, and customizations. Sanjay is one of the top 50 contributors to the Magento community and is recognized by Adobe. His passion for Magento 2 and Shopify solutions has made him a trusted source for businesses seeking to optimize their online stores. He loves sharing technical solutions related to Magento 2 and Shopify.