
Posted

Hi guys

Wondering if anyone has any standard rules to apply in Cloudflare that are known not to have a negative impact on SEO and that can cut down the mass of crawled but not indexed pages, like we see here? Our number of crawled / not indexed pages was already huge, it's doubled in the last few months, and our concern is the impact on crawl budget. As you can see, blocking in robots.txt doesn't stop millions of these pages being picked up, so we're wondering: is it safe to block them in Cloudflare without impacting other aspects of SEO?

Thanks

[screenshot: image.png]
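
To give an idea of what we mean, the sort of thing we were picturing is a Cloudflare WAF custom rule that blocks known crawlers on the query-string URLs, roughly like this (just a sketch using Cloudflare's cf.client.bot and http.request.uri.query fields, not something we've actually deployed; the actual parameter would come from whatever shows up in the report):

Rule: block crawler hits on parameter URLs
Expression: cf.client.bot and http.request.uri.query contains "do="
Action: Block

But that's exactly the sort of rule we're not sure is safe to run, hence the question.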

  • Community Expert

There are indeed a lot of pages that will be blocked by robots.txt. However, this is done to improve SEO for your content rather than hinder it, so it would be worth reading through the topic here, where we discuss why this is the case, before you go ahead and block anything :)

  • Author

Stuff like:

?do=markRead&fromForum=
?app=core&module=system&controller=content&do
login/?ref=aHR0cHM6Ly9h
/?do=findComment&comment=11891042

We've switched to the default indexing settings now; not sure why we didn't have them enabled in the first place. We'll see if that has any impact.
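
For anyone else reading, the entries that cover parameters like the ones above look roughly like this in robots.txt (a rough sketch; the exact file the default settings generate may differ):

User-agent: *
# keep crawlers out of action/duplicate-view URLs that shouldn't be indexed
Disallow: /*?do=markRead
Disallow: /*?do=findComment
Disallow: /*?app=core&module=system
Disallow: /login/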

  • 2 weeks later...
  • Author

No change in 2 weeks!

  • Community Expert

Some of these links are things that are deliberately blocked, as mentioned above. Is there a reason you would want something like the markRead link to be indexed?
