Posted April 23 Hi guys. Does anyone have any standard rules to apply in Cloudflare which are known not to have a negative impact on SEO, and which can prevent the mass of "crawled but not indexed" pages like we see here? Our number of crawled / not indexed pages was already huge, and it has doubled in the last few months, so our concern is the impact on crawl budget. As you can see, blocking in robots.txt doesn't prevent the millions of pages being indexed, so we're wondering: is it safe to block in Cloudflare without impacting other aspects of SEO? Thanks
April 23 Community Expert There are indeed a lot of pages that will be blocked by robots.txt. However, this is done to improve SEO for your content rather than hinder it, so it would be worth having a read through the topic here, where we discuss why this is the case, before you go ahead and block anything :)
April 23 Author Stuff like:
?do=markRead&fromForum=
?app=core&module=system&controller=content&do=login
/?ref=aHR0cHM6Ly9h
/?do=findComment&comment=11891042
We've switched to the default indexing settings now - not sure why we didn't have that enabled in the first place. We'll see now if it has any impact.
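For what it's worth, if you did want to experiment with blocking these in Cloudflare, the usual approach is a WAF custom rule with a "Block" action. This is only a sketch, not a recommendation: the query-string fragments below are taken from the URLs above, and the `cf.client.bot` field (true for Cloudflare-verified good bots such as Googlebot) is used so the rule only affects crawler traffic, not real users. Whether blocking verified crawlers here is safe for SEO is exactly the open question in this thread, so test on a small pattern first.

```
(cf.client.bot and (
  http.request.uri.query contains "do=markRead" or
  http.request.uri.query contains "do=findComment" or
  http.request.uri contains "?ref="
))
```

Returning a block (403) to the crawler is different from a robots.txt disallow: the bot still spends a request discovering the block, but the URL is much less likely to linger in the index than one that is merely disallowed.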
May 8 Community Expert Some of these links will actually be valid things that are blocked, as mentioned above. Is there a reason you would be looking for things like the markRead link to be indexed?