April 12, 2018 in Technical Problems
Urgent assistance required. I need to block some URLs on my site. I have deleted the pages and requested that Google remove them, but according to Google I need to add them to a robots.txt file or use meta tags. I'm not sure I can do the second option, since I removed the pages?
So I can basically make a text file, name it robots.txt, and put the URLs inside it?
Where should it be placed, in public_html?
Yes, the public root. More information: https://en.wikipedia.org/wiki/Robots_exclusion_standard
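As a sketch, a robots.txt in the public root might look like the following (the paths here are hypothetical placeholders; substitute the URLs of the pages you deleted):

```
# robots.txt, placed in the public web root (e.g. public_html/robots.txt)
# Applies to all crawlers
User-agent: *
# Hypothetical example paths; replace with the actual deleted URLs
Disallow: /old-page.html
Disallow: /removed-directory/
```

Note that Disallow matches path prefixes, so a trailing slash on a directory entry blocks everything under it.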
Thanks, it seems to be working.
I was told to do this in the Google Webmasters support forum.
If it's a matter of wanting to keep the URL live on the site but not see it in search results, you need to add a robots noindex meta tag to the page and NOT block it in the robots.txt file (crawlers must be able to fetch the page in order to see the noindex directive).
You can then request an outright URL removal.
Specifically, what would I have to put into this section, please?
For the meta tag, select 'robots' and then enter noindex in the value field.
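In the page's HTML, that setting comes out as a meta tag in the head section; a minimal sketch (the surrounding markup is illustrative):

```
<head>
  <!-- Tells crawlers not to include this page in search results -->
  <meta name="robots" content="noindex">
</head>
```

Once the crawler has re-fetched the page and seen the tag, the page is dropped from the index.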