When creating an online store, you need to keep duplicate pages out of search engine indexes (for example, the site may have several products with the same description). You may also want the content of certain pages to be shown to users without being indexed by search bots. To avoid ranking penalties from Google or Yandex, you should create a special file - robots.txt. This file is located in the root folder of the site and lists the pages that should not be indexed.

If you don't want to turn to programmers every time you need to add another link to it, you will need the "NeoSeo Robots.txt Generator" module. It lets you generate the file automatically and then add "forbidden" links yourself through your site's administrative panel. If you want to know about the turnkey service - you are here.
- Edit the robots.txt file directly from the administrative area.
- Generate the file's contents automatically with a single click.
- Multi-store support.
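For illustration, here is a minimal robots.txt of the kind such a file typically contains. The specific paths below (a faceted-search URL and a sort parameter) are hypothetical examples of duplicate-content pages, not actual output of the module:

```
# Rules apply to all search bots
User-agent: *
# Block the admin area from indexing
Disallow: /admin/
# Block duplicate pages created by sorting parameters (hypothetical example)
Disallow: /*?sort=
# Block internal search result pages (hypothetical example)
Disallow: /index.php?route=product/search
# Point bots to the sitemap
Sitemap: https://example.com/sitemap.xml
```

Each `Disallow` line tells compliant bots not to index URLs matching that path, which is how duplicate product pages are hidden from search engines.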