
Posted (edited)

Hello,

Many of the search engines crawling my site do not respect the rules in robots.txt.

Google suggests placing the sitemap file(s) in a subfolder and protecting it with an .htpasswd file, which is a very good idea: with this method, only authorized search engines can access the sitemap.

However, I did not manage to move the sitemap.php file to a subfolder.

 

Edited by MEVi
Posted

You do not need to move the file to require HTTP authentication. Simply add a Files rule within your IPS directory's configuration stating that the file requires authentication.

Assuming Apache and that you're not a CIC customer, something like the following should work:

<Directory /var/www/html>
  <Files sitemap.php>
    AuthType Basic
    AuthName "Protected Super Secret SiteMap"
    AuthUserFile /path/to/.htpasswd
    Require valid-user
  </Files>

  # Include everything else you had in here
</Directory>
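
For completeness, here is one way to create the password file referenced above. This is a minimal sketch: it assumes the htpasswd utility is available (from the apache2-utils or httpd-tools package), and the username sitemapuser and the /path/to/.htpasswd path are placeholders you would adjust:

# -c creates the file; omit it when adding further users to an existing file
htpasswd -c /path/to/.htpasswd sitemapuser

Make sure the .htpasswd file sits outside the web root, or is itself blocked from being served.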

 

Posted

It's a very good idea. An equivalent rule using FilesMatch:
 

<FilesMatch "^sitemap\.php$">
    AuthType Basic
    AuthName "Unauthorised For Legal Reasons"
    AuthUserFile /path/to/.htpasswd
    Require valid-user
</FilesMatch>
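
As a quick sketch of how to verify the rule is in effect (example.com and the sitemapuser credentials are placeholders, not values from the thread):

# Should return 401 Unauthorized without credentials
curl -I https://example.com/sitemap.php

# Should return 200 OK with valid credentials
curl -I -u sitemapuser:yourpassword https://example.com/sitemap.php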

 
