Are you struggling to get a fair share of traffic to your Shopify store?
If yes, then chances are that many of your website’s important pages have yet to be discovered by the search engines. By customizing your robots.txt file in Shopify, you can set the rules to optimize your web pages’ visibility and improve your ranking.
In this blog, you will learn what the robots.txt file is and how to make changes to it in Shopify.
What is Shopify Robots.txt?
Robots.txt is a file that tells search engine crawlers, or bots, which pages of a website they may request and crawl, which in turn shapes what gets listed on Search Engine Result Pages (SERPs).
If you want to restrict access to certain web pages, you can list them under the "Disallow" directive in this text file. Likewise, if you want bots to crawl the important pages of your site, you can list them under the "Allow" directive.
You can check the robots.txt file for any website by visiting yourwebsite.com/robots.txt.
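For instance, a minimal robots.txt combining both directives might look like this (the paths here are placeholders for illustration):

User-agent: *        # these rules apply to all crawlers
Disallow: /checkout  # keep bots out of the checkout flow
Allow: /collections  # explicitly allow collection pages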
Shopify provides every store with a default robots.txt file that is already well optimized for SEO.
Unlike in the old days, you can now edit the robots.txt file in Shopify to customize it to your requirements. However, Shopify recommends editing it only if you know how to edit code and understand SEO requirements, as a mistake can hurt traffic to your store or block crawlers from the domain altogether.
How to Configure Robots.Txt in Shopify?
Since Shopify store themes are built with the Liquid templating language, the robots.txt file template for Shopify is referred to as robots.txt.liquid.
Here’s how you can add Shopify robots.txt to your online store and make changes in the file.
Editing the robots.txt.liquid file
Follow the below steps to add the Shopify robots.txt file.
- Open your Shopify admin portal and click Settings > Apps and sales channels.
- From the Apps and sales channels page, click Online Store.
- Click Open sales channel, then go to Themes.
- In the Live theme section, click Actions > Edit code.
- Select Add a new template. You will find multiple template options here; select robots from the list.
- Click Create template.
Once you create the template, a default Shopify robots.txt.liquid template is ready. It’s good to go for SEO purposes.

However, you can update the code here if you wish to make any amendments.
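For reference, the default template is not a static list of rules; it renders Shopify’s built-in rule groups through Liquid, roughly like this:

{% for group in robots.default.groups %}
  {{- group.user_agent }}
  {%- comment -%} Render each Allow/Disallow rule in the group {%- endcomment -%}
  {%- for rule in group.rules %}
  {{ rule }}
  {%- endfor %}
  {%- if group.sitemap != blank %}
  {{ group.sitemap }}
  {%- endif %}
{% endfor %}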
Using third-party apps
Though Shopify doesn’t recommend using third-party apps to customize the robots.txt.liquid file, there is an app in the Shopify App Store for this – Robots.txt.

Using this app, you can pay a one-time charge of $19 to create a robots.txt file for your online store without getting into the complexity of the code. It can help you edit/update robots.txt files without impacting your store’s theme.
How Do You Modify Robots.Txt File in Shopify?
In the previous section, you learned how to add the robots.txt file to your Shopify store. Here, we will guide you through the customizations you can make in this file.
Customize Robots.txt Manually
You will often make one of the three changes mentioned below –
- Add a new rule to existing rules: This means adding new rules on top of the default set. You specify the conditions under which the new rules apply, and you can also set different rules for different search engines’ bots here (see the Liquid sketch after this list). For example, use disallow expressions such as:
- /blogs/*/tagged – to stop blog tag pages from being crawled
- /collections/vendors*?*q= – to stop vendor collection pages from being crawled
Here, * is a wildcard that matches any sequence of characters, such as search text entered by a user.
- Remove a rule from existing rules: Although Shopify doesn’t recommend this, if you want to remove any of the default rules from robots.txt.liquid, you need to write the code for it accurately. The snippet below renders every rule except the Disallow rules for /search and /blogs:
{% for rule in group.rules -%}
  {%- unless rule.directive == 'Disallow' and (rule.value == '/search' or rule.value == '/blogs') -%}
    {{ rule }}
  {%- endunless -%}
{% endfor -%}
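Note that this loop replaces only the inner rule loop of the default template shown earlier; on its own, the group variable is undefined, so the snippet must sit inside the outer {% for group in robots.default.groups %} loop.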
- Add custom rules: If you want to add new rules that aren’t in the default group, you can append them at the bottom of the file template (see the placement sketch after the example below).
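To sketch the first case – adding rules to an existing group – the snippet below extends the default template so that the two disallow expressions from above are emitted only for the catch-all (*) user agent. The structure follows Shopify’s robots.default objects; the paths are the examples given earlier:

{% for group in robots.default.groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules %}
  {{ rule }}
  {%- endfor %}
  {%- comment -%} Append extra Disallow rules to the * group only {%- endcomment -%}
  {%- if group.user_agent.value == '*' %}
  {{ 'Disallow: /blogs/*/tagged' }}
  {{ 'Disallow: /collections/vendors*?*q=' }}
  {%- endif %}
  {%- if group.sitemap != blank %}
  {{ group.sitemap }}
  {%- endif %}
{% endfor %}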
Here is a Shopify robots.txt example for your reference –
User-agent: *

# Allow important pages
Allow: /collections
Allow: /products
Allow: /blogs

# Block irrelevant pages
Disallow: /cart
Disallow: /checkout
Disallow: /admin

# Block duplicate content
Disallow: /collections/*?sort_by=
Disallow: /collections/*?filter=

# Sitemap location
Sitemap: https://www.yourshopifystore.com/sitemap.xml
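To emit custom lines like these for rules that don’t belong to any default group (the third case above), you can append them as plain text below the default Liquid loop in robots.txt.liquid. A minimal sketch, with an illustrative bot name and path:

{%- comment -%} The default groups render above; custom rules go below {%- endcomment -%}
User-agent: Googlebot-Image
Disallow: /internal-search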
Customize Robots.txt Using a Third-Party App
When you don’t want to deal with code-level editing of the robots.txt file for your Shopify store, you can head to the Shopify App Store.
As mentioned earlier, you can search for the Robots.txt app there. Install it per the instructions, then update the links to specific pages and sitemap URLs. Once you save the settings, the robots.txt file will be ready per your customizations.
Test & Verify Your Shopify Robots.txt File
It’s risky to simply glance at the file after making alterations and consider it good to go. Even a small syntax error can affect how your site is crawled and block pages unnecessarily.
Shopify doesn’t offer a built-in way to test the file, but you can verify it in Google Search Console. Navigate to the dashboard and go to Settings > Crawling > robots.txt, then click Open Report.
From here, you can verify whether the robots.txt file is correctly implemented for your store.

You can check the issues in this report and fix any errors.
Best Practices for Configuring Robots.Txt File in Shopify
- Understand which pages of your website should be accessible to search engines so you make the best use of your store’s crawl budget.
- Use ‘Disallow’ directives to block access to the irrelevant pages of your store, like admin pages, checkout, cart, or any duplicate content. Disallowing filtered URLs such as /collections/*?filter= (as in the example above) is a common way to keep filter-generated duplicate pages out of the crawl.
- In the robots.txt file, include a link to the sitemap to help search engines find all the pages of your online store.
- When you need different behavior for different crawlers, target each one explicitly with its own User-agent group rather than relying on the wildcard rules alone (see the snippet after this list).
- Verify the updates you’ve made to robots.txt before relying on them. Using Google Search Console’s robots.txt report, you can check whether the file’s syntax is correct and working as expected.
- Avoid blocking too many URLs, and use comments (# lines) where necessary to explain the purpose of directives. This helps your team and serves as a future reference.
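As an illustration of bot-specific targeting, here is a robots.txt fragment that gives one crawler its own rules while every other bot follows the general group (the bot name and paths are placeholders, not recommendations):

# General rules for all crawlers
User-agent: *
Disallow: /cart

# A specific crawler gets its own group and follows these rules instead
User-agent: Bingbot
Disallow: /cart
Disallow: /collections/vendors*?*q=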
Implementing these best practices in your robots.txt file can meaningfully improve how your Shopify store performs in search. For further assistance, you can hire our Shopify SEO expert to make the most of Shopify’s robots.txt.