Posted May 24 Community Expert
Under ACP > System > Search Engine Optimization > Crawl Management, I have set Robots.txt to Custom and entered my custom rules. However, I've noticed that the live robots.txt file (mysite.com/robots.txt) does not reflect those custom rules. In fact, it doesn't even include the basic IPS rules (disallowing certain folders). It's simply:
User-agent: *
Disallow: /cdn-cgi/
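For reference, the two-line file above disallows only /cdn-cgi/ for every crawler and leaves everything else allowed. That reading can be confirmed with Python's standard-library robots.txt parser (the URLs below are illustrative, not the actual site):

```python
from urllib.robotparser import RobotFileParser

# Parse the two-line robots.txt exactly as served
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /cdn-cgi/"])

# Paths under /cdn-cgi/ are blocked for all user agents...
print(rp.can_fetch("*", "https://mysite.com/cdn-cgi/trace"))  # False
# ...while everything else, including admin folders, is crawlable
print(rp.can_fetch("*", "https://mysite.com/admin/"))         # True
```

So with this file in place, none of the custom rules (or the default IPS disallows) are in effect for crawlers hitting that hostname.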
May 24 Community Expert
Unfortunately, I am seeing exactly what is in your robots.txt in the ACP. The last item there, /cdn-cgi/, is appended by our cloud processes. Keep in mind that there is caching on our Cloud, so in certain cases it may take some time for preference changes to take effect.
May 24 Author Community Expert
I haven't made any changes to my robots.txt file in months, so it's not a cache issue. When I go to mydomainname.com/robots.txt in Chrome, I see this:
May 24 Community Expert
You may wish to clear your browser's cache. Unfortunately, I am unable to reproduce that.
May 24 Author Community Expert
I just deleted the cache in Chrome on my Mac; same thing. I tried with Safari for iOS on my iPhone, and it offered to download the robots.txt, which I did: same content, only the two lines. ChatGPT sees the same thing I do:
May 24 Community Expert
This is your community, correct, and not someone else's? Unfortunately, both on and off our VPN, and using ChatGPT as well, I am getting the robots.txt I showed above.
May 24 Author Community Expert
Just noticed something: I'm getting two different robots.txt files depending on whether I use https://mydomain.com or https://www.mydomain.com. Could that be the issue? Normally my site redirects https:// to https://www, but it looks like that redirect isn't being applied to the robots.txt file.
Edited May 24 by David N.
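The hostname mismatch described above can be checked from the command line. A minimal sketch, where mydomain.com is a placeholder for the real site: the actual curl fetches are shown as comments, and the two files are simulated with hypothetical contents here, since the real responses depend on the site.

```shell
# The real check would fetch robots.txt from each hostname, e.g.:
#   curl -s https://mydomain.com/robots.txt     -o bare.txt
#   curl -s https://www.mydomain.com/robots.txt -o www.txt
# (curl -sI https://mydomain.com/robots.txt would also show whether
#  the bare host issues a redirect at all.)
# Simulated here with hypothetical contents for illustration:
printf 'User-agent: *\nDisallow: /cdn-cgi/\n' > bare.txt
printf 'User-agent: *\nDisallow: /admin/\nDisallow: /cdn-cgi/\n' > www.txt

# diff -q exits non-zero when the files differ
if diff -q bare.txt www.txt >/dev/null; then
  echo "hostnames serve identical robots.txt"
else
  echo "hostnames serve DIFFERENT robots.txt"
fi
```

If the two fetches differ, crawlers that request the bare hostname never see the custom rules, which would match the symptom in this thread.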
May 26 Community Expert Solution
I've created a ticket for you. Please provide full information there and we can advise.
May 26
For what it is worth, I see the same condition on my site: two different robots.txt pages, depending on the URL. I don't know if it matters, but the site in question runs v4.
Edited May 26 by My Sharona