When creating an online store, it is worth taking care to hide duplicate pages from search engines (for example, when the site contains several products with the same description). Or you may want a page's content to be shown to users while search robots do not index it. To keep your site out of Google's or Yandex's filters, you need to create a special file, robots.txt. This file is located in the root folder of the site and lists the pages that should not be indexed.
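For reference, a robots.txt that hides duplicate and private pages might look like the sketch below (the paths are hypothetical examples, not paths your store necessarily has):

```text
# Rules for all search robots
User-agent: *
# Hide the administrative area
Disallow: /admin/
# Hide duplicate product listings created by sorting/filter parameters
Disallow: /*?sort=
Disallow: /*?filter=
# Hide checkout pages that should be visible only to users
Disallow: /index.php?route=checkout/
# Point robots to the sitemap (replace with your real domain)
Sitemap: https://example.com/sitemap.xml
```

Each `Disallow` line tells robots not to index URLs matching that path; the "NeoSeo Generator robots.txt" module lets you maintain such rules without editing the file by hand.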
If you do not want to keep turning to programmers every time another link has to be added to it, you need the "NeoSeo Generator robots.txt" module. It generates the file automatically, and you can add "forbidden" links yourself through the administrative panel of your site.
How to install the module
Installation is described in the readme.txt file included in the module's archive.
Module features:
- Edit the robots.txt file directly in the administrative panel.
- Generate the file's contents automatically with a single click.
- Multi-store support.
System requirements: PHP 5.3 - 7.0, ocmod for OpenCart 2.x
ionCube Loader: minimum required version 6.0
OpenCart: 1.5, 2.0, 2.1, 2.2, 2.3
License type: One domain
Activation method: automatically upon purchase or on request by mail email@example.com