Adlago Posted March 22 This test is from your site, but the same result appears on many other sites as well. Please check - this degrades the SEO performance of sites.
Matt (Management) Posted March 22 It looks OK here: https://invisioncommunity.com/robots.txt? I also ran Lighthouse on this site and it didn't show an error about robots.txt.
Adlago (Author) Posted March 22 1 minute ago, Matt said: I also ran Lighthouse on this site and it didn't show an error about robots.txt. https://pagespeed.web.dev/analysis/https-invisioncommunity-com-forums/cm12blrd8a?form_factor=mobile Yes, and my robots.txt is OK, as it is on all the sites I test, but after applying the latest patch about half an hour ago, I noticed this issue...
Matt (Management) Posted March 22 I'm not sure what we can do about that; it must be a page speed tool issue.
Adlago (Author) Posted March 22 6 minutes ago, Matt said: I'm not sure what we can do about that; it must be a page speed tool issue. Probably so - some problem in PSI. I have now tested sites on other platforms, including popular ones, and all of them show this issue. Apparently it is just a coincidence between the timing of your patch and the appearance of the issue in PSI. You can close this topic. Sorry for the trouble.
Adlago (Author) Posted March 22 I found a solution in an article about this. Add Disallow: /cdn-cgi/ to robots.txt and the issue goes away. It worked for me.
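For reference, a minimal sketch of what the suggested robots.txt addition might look like (the surrounding rules are illustrative, not from this thread - only the /cdn-cgi/ line is the actual fix being discussed):

```
User-agent: *
# Exclude Cloudflare's internal endpoints from crawling
Disallow: /cdn-cgi/
```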
Randy Calvert Posted March 22 But why do it though? These are supposed to be recommendations to help you improve your user experience. I don't see how doing that improves anything. The tool clearly has a bug. If it's not actually helping improve the UX, the fix is a placebo.
Adlago (Author) Posted March 23 10 hours ago, Randy Calvert said: But why do it though? These are supposed to be recommendations to help you improve your user experience. I don't see how doing that improves anything. The tool clearly has a bug. If it's not actually helping improve the UX, the fix is a placebo. The error message that occurs when testing with Lighthouse, which causes a drop of 8 points in the SEO metric, "robots.txt is not valid - Lighthouse could not download robots.txt file", is a common problem when using a content delivery network (CDN) such as Cloudflare. The robots.txt addition I mentioned eliminates this issue. Probably Lighthouse released an updated version and introduced a bug. Another issue related to the SEO metric is that two zones in the guest cookie message are flagged for poorly sized tap and click targets. On inspection, these areas are not tappable links but part of the text, which is also a consequence of this bug. The robots.txt addition removes this issue as well.
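If you want to sanity-check locally that the added rule blocks /cdn-cgi/ without affecting normal pages, a quick sketch using Python's standard urllib.robotparser (the example.com domain is hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Parse the robots.txt content directly, without fetching it over the network.
robots_lines = [
    "User-agent: *",
    "Disallow: /cdn-cgi/",
]

rp = RobotFileParser()
rp.parse(robots_lines)

# Cloudflare's internal /cdn-cgi/ endpoints are excluded for all crawlers...
print(rp.can_fetch("Googlebot", "https://example.com/cdn-cgi/trace"))  # False
# ...while normal pages remain crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/forums/"))        # True
```

Note this only checks how the rules parse; whether Lighthouse can download the file at all is a separate (network-level) matter.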