
Problems with Google Search Console



Hello everyone, how can I solve this problem? A new reason for pages not being indexed: Search Console has identified that certain pages on your site are not being indexed for the following reason: “Indexed, though blocked by robots.txt”. If the reason is unintentional, fix it to get the pages indexed and appearing on Google.


Click on the individual status types like “Blocked by robots.txt” and check the URLs. Do you want them indexed or not? That’s the first question. There is NOT necessarily a problem there that needs fixing. 

Only if you want those pages indexed would you need to check your robots.txt, located at yourdomain.com/robots.txt, and possibly remove certain entries there, as in the example below. 
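For reference, the blocking rules in a robots.txt usually look something like this (the paths here are purely illustrative, your own file will contain different entries):

User-agent: *
Disallow: /private/
Disallow: /tmp/

If a URL you actually want indexed matches one of the Disallow lines, removing or narrowing that line is what lets Google crawl it again. If the blocked URLs are things you never wanted in search results, leave the file alone.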


16 minutes ago, Caputo said:

I would like to be able to solve this problem:

Maybe it’s a language issue, but we are going around in circles. You are not explaining what issue you are trying to solve. Those URLs blocked by robots.txt all look fine to me. I WOULD DO ABSOLUTELY NOTHING if I were you. 

 


I feel you are missing part of the text there.

Quote

If the reason is unintentional

That is very much an important part of the notification you are getting. As mentioned above by opentype, there are items blocked there which are fine, and therefore intentional. It may well be that you are trying to solve an issue that isn’t there to be solved.
