Robots.txt

The "robots.txt" file is a text file that is placed in the root directory of a website to provide instructions to web robots (also known as crawlers, spiders, or bots) about which pages or sections of the site should be crawled and indexed by search engines. This allows the merchant to control what the search engine crawlers will and won't index. For example, merchants do not want search engine crawlers from crawling and indexing any of the Back Office pages. By default, when a website is set up Storefront will add URLs that we don't want crawled: User-agent: * Disallow: /error/ Disallow: /platformhealthcheck/ Disallow: /publish/ Disallow: /checkout/ Disallow: /cart/ Disallow: /account/ Disallow: /profile/ Disallow: /scripts/ Disallow: /styles/ Disallow: /images/ Disallow: /fonts/ Disallow: /ckeditor/ Disallow: /backoffice/ Allow: /account/login Allow: /account/register

How to Update the Robots.txt File

  1. Select the store whose robots.txt file you want to edit

  2. Click on Manage Store Settings under Settings

  3. Navigate to SEO

  4. Click on Robots.txt

  5. Enter the URL paths you want to Allow or Disallow
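
After saving your changes, a quick way to confirm what crawlers will see is to fetch the published file directly from the storefront root. This is a minimal sketch; www.example-store.com is a placeholder for your storefront's domain.

    from urllib.request import urlopen

    # Fetch the live robots.txt and print it (placeholder domain).
    with urlopen("https://www.example-store.com/robots.txt") as response:
        print(response.read().decode("utf-8"))

The output should contain the default rules plus any Allow or Disallow entries you added in step 5.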
