
1 hour ago, Jheroen said:

Is it the best option to use the robots.txt which is shown here?

We have a built-in robots.txt file now. If we ever believe there is a need to update it, we will. You can set it to be used by going to ACP -> System -> Search Engine Optimization -> Crawl Management.


Okay, I created a robots file based on your website. Is that one okay, or should I modify it? If so, what should the contents be?

1 minute ago, Jheroen said:

Okay, I created a robots file based on your website. Is that one okay, or should I modify it? If so, what should the contents be?

It's recommended you use the one that is generated within your ACP instead of using the one on this website.  

6 minutes ago, Jim M said:

We have a built-in robots.txt file now. If we ever believe there is a need to update it, we will. You can set it to be used by going to ACP -> System -> Search Engine Optimization -> Crawl Management.

If you do what Jim says, IPB will manage updates to the robots.txt as new improvements are made, etc.  

Okay, but if I have changed the current one, how do I set the default one again?


Go to the ACP, type robots into the search and click on the robots.txt results that appears.


Delete the existing one and perform this action?

Should I see a new one appear on the server after this action? I just clicked Save, nothing more, but I don't see a new robots.txt file.


Just now, Jheroen said:

Should I see a new one appear on the server after this action?

No.

But go to the expected URL in your browser (www.domain.com/robots.txt) and you will see the content.
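As a quick check from the command line, you can fetch the generated file directly. This is only a sketch; example.com is a placeholder for your own domain:

```shell
# example.com is a placeholder for your own domain.
# robots.txt is generated on request, so no new file appears on disk,
# yet the URL still serves the current content.
curl -s https://www.example.com/robots.txt | head -n 5
```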

Thanks, all! It works.

Is there something like this for a (best) htaccess file?

5 minutes ago, Jheroen said:

Thanks, all! It works.

Is there something like this for a (best) htaccess file?

Not other than the default .htaccess for rewrites, which you can find in System -> Search Engine Optimisation and download from there.

Thanks again. One last question for now: this board has been running for years; could there be several folders/files that were once in use but are no longer? And if so, how could we clean up the root?

Edited by Jheroen

15 hours ago, Jheroen said:

Thanks again. One last question for now: this board has been running for years; could there be several folders/files that were once in use but are no longer? And if so, how could we clean up the root?

There is no built-in way to do this. However, after making sure you are on the latest release and things are working properly, you could do the following:

  1. Make a full backup in case of issues, and turn your site offline while you do this
  2. Delete all files and folders apart from:
    /applications/
    /api/.htaccess (if you have one)
    /datastore/
    /plugins/
    /uploads/
    constants.php (if you have one)
    conf_global.php
    .htaccess
  3. Upload a fresh set of files from your client area

 

Note: this is done at your own risk, and does not account for any changes you may have made from the default, such as upload location changes, third-party items, etc.

 

15 hours ago, Nathan Explosion said:

And /api/.htaccess, if it exists.

Added to the above, in case anyone else comes across this. Thank you.

Can it do harm if old files are in the root and I leave them there?

7 minutes ago, Jheroen said:

Can it do harm if old files are in the root and I leave them there?

There is no harm in leaving them present, no. Many do.

If you have htpasswd in place, that may actually be what is causing the issue itself. We would need access in order to take a look at this, and would need to be able to switch it in and out of maintenance mode if needed.

  • Website A is under construction; it is closed to the public via the firewall, and maintenance mode is deactivated.
  • Website B is in maintenance mode, but is accessible to the public (beta testers).

In both cases there is the same problem. The web link we provided points to a sub-domain in which there is no website; the data there is protected with an htpasswd. Both websites must remain inaccessible to the unauthorized public.

Please provide full access to the site which has issues but is not protected via htaccess.

  • 1 month later...

I pressed Rebuild Sitemap in the ACP, but my sitemap.php didn't change. It hasn't changed since 2019.

2 hours ago, Aleksandr Timashov said:

I pressed Rebuild Sitemap in the ACP, but my sitemap.php didn't change. It hasn't changed since 2019.

Looking at the sitemap our software generates on your site, I am seeing lastmod dates of today. Do you have an example?

-rw-r--r--   1 root root   723 Mar  3  2019 sitemap.php

Does it really not need any change in 4 years?

The one generated by IPS is not an actual static file that exists on your server; it's dynamically generated. So if you are looking at it in your file manager, I can already tell you that's not the correct one.
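One way to confirm this is to fetch the sitemap over HTTP and pull out its lastmod dates, which should be current even though the sitemap.php file on disk never changes. A sketch, with example.com standing in for your own domain:

```shell
# example.com is a placeholder for your own domain.
# The sitemap is built dynamically by sitemap.php on each request.
curl -s https://www.example.com/sitemap.php |
  grep -o '<lastmod>[^<]*</lastmod>' | head -n 3
```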

Edited by Randy Calvert
