MEVi Posted April 19, 2021

Hello, many search engines do not respect the rules in robots.txt. Google suggests placing the sitemap file(s) in a subfolder and protecting it with htpasswd; with this method, only authorized search engines can read the sitemap. However, I have not managed to move the sitemap.php file to a subfolder.
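For reference, the .htpasswd file itself can be created with the htpasswd utility that ships with Apache (the apache2-utils package on Debian/Ubuntu). A minimal sketch; the path and the username "crawler" here are placeholders, not anything IPS-specific:

    # Create the password file (-c only on first use; it overwrites an existing file)
    htpasswd -c /path/to/.htpasswd crawler
    # Add further users without -c
    htpasswd /path/to/.htpasswd another-crawler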
CoffeeCake Posted April 19, 2021

You do not need to move the file to require HTTP authentication. Simply create a Files rule within your IPS directory stating that the file requires authentication. Assuming Apache and that you're not a CIC customer, something like the following should work:

    <Directory /var/www/html>
        <Files sitemap.php>
            AuthType Basic
            AuthName "Protected Super Secret SiteMap"
            AuthUserFile /path/to/.htpasswd
            Require valid-user
        </Files>
        # Include everything else you had in here
    </Directory>
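Once the rule is in place, it's easy to verify from the command line. A quick check, assuming hypothetical credentials crawler:secret and example.com standing in for your site:

    # Should return 401 Unauthorized
    curl -I https://example.com/sitemap.php
    # Should return 200 OK
    curl -I -u crawler:secret https://example.com/sitemap.php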
MEVi (Author) Posted April 19, 2021

It's a very good idea.

    <FilesMatch "^(sitemap\.php)$">
        AuthType Basic
        AuthName "Unauthorised For Legal Reasons"
        AuthUserFile /path/to/.htpasswd
        Require valid-user
    </FilesMatch>
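One note on context, in case it saves someone a restart: <Files> and <FilesMatch> sections are also allowed inside a per-directory .htaccess file, whereas <Directory> is only valid in the main server or virtual host configuration. So on shared hosting you can drop the <Directory> wrapper and place the block above directly in the .htaccess at the IPS root.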