Steve Bullman Posted April 12, 2018
Urgent assistance required. I need to block some URLs on my site. I have deleted the pages and requested that Google remove them, but according to Google I need to add them to a robots.txt file or use meta tags. Not sure I can do the second option since I removed the pages?
opentype Posted April 12, 2018
Robots.txt:

User-Agent: *
Disallow: /some-folder/some-file.htm
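For context, a fuller robots.txt blocking several removed pages might look like this (the paths below are placeholders, not Steve's actual URLs; substitute your own):

```
# Applies to all crawlers
User-agent: *
# One Disallow line per path to block
Disallow: /old-page.htm
Disallow: /another-removed-page/
```

Note that a Disallow rule only stops compliant crawlers from fetching the path; it does not by itself remove an already-indexed URL from search results.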
Steve Bullman Posted April 12, 2018 Author
So I can basically make a txt file, name it robots.txt and put that inside? Where should it be placed, public html?
opentype Posted April 12, 2018
Yes, the public root. More information: https://en.wikipedia.org/wiki/Robots_exclusion_standard
Steve Bullman Posted April 12, 2018 Author
Thanks, seems to be working.
Steve Bullman Posted April 21, 2018 Author
I was told to do this in the Google Webmasters support forum:
Quote
If it's a matter of you wanting to keep the URL live on the site but not see it in search results, you need to add a robots noindex meta tag to the page and NOT block it in the robots.txt file. You can then request an outright URL removal.
Specifically, what would I have to put into this section, please?
bfarber Posted April 23, 2018
For the meta tags, select 'robots' and then in the value field enter noindex.
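For reference, the tag that ends up in the page's head should look something like this (a generic HTML example, not specific to any particular CMS's meta-tag settings screen):

```
<head>
  <!-- Tells compliant search engine crawlers not to include this page in their index -->
  <meta name="robots" content="noindex">
</head>
```

For the noindex directive to take effect, crawlers must be able to fetch the page, which is why the quoted advice says not to also block it in robots.txt.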
Archived: This topic is now archived and is closed to further replies.