Creating a robots.txt file

There are various reasons you may wish to create a robots.txt file: to block search engines from indexing certain pages, to block a search engine altogether, or many others. But did you know you can do this with the Pages application?

Creating a robots.txt file with the Pages application can be helpful if you have no FTP access to your server, such as on the Cloud platform, and it also gives you a quick, central place to edit the file without ever needing to access it directly.

Creating the page

To create your robots.txt file for the site, create a new page in the following location in your ACP:

Pages -> Page Management -> Pages

When creating the new page, choose the 'Manual HTML' option, which allows you to create pages directly in code. Use 'robots.txt' as both the page title and the file name.

[Screenshot: Robots.txt example]

It is important to note in the screenshot above that we have deselected the "Use suite HTML wrapper" option. This ensures the page outputs plain text for search engines to read, rather than being wrapped in the HTML from the rest of the site. Also, ensure the page is not placed within a folder; it must sit at the root of your Pages application.

Adding content

Adding content to your robots.txt file is then as simple as entering whatever rules you need on the Content tab. Here I have added a simple entry telling Googlebot not to index my site.

[Screenshot: robots.txt entry]
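Assuming the intent described above (keeping Googlebot away from the whole site), the entry on the Content tab would look something like this standard robots.txt group:

```
User-agent: Googlebot
Disallow: /
```

The User-agent line names the crawler the group applies to, and "Disallow: /" excludes every path on the site; crawlers not named in the file are unaffected.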

When saving, you must ensure you give guests permission to view the page, otherwise search engine bots will not be able to read the contents/rules you have in place. Upon saving, you will see that the robots.txt file is available from yourSiteURLHere/index.php
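Once the page is live, you can sanity-check that the rules you entered say what you intend using Python's standard-library urllib.robotparser. This is just an illustrative sketch; the user agents and URL below are example values, not anything specific to your site:

```python
from urllib.robotparser import RobotFileParser

# The same rules entered on the page's Content tab.
rules = """User-agent: Googlebot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot is blocked from every path on the site...
print(parser.can_fetch("Googlebot", "https://example.com/any-page"))  # False

# ...while crawlers not named in the file remain allowed.
print(parser.can_fetch("Bingbot", "https://example.com/any-page"))  # True
```

You can also point RobotFileParser at the live URL with set_url() and read() to confirm what crawlers actually receive from your site.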
