
blocking URLs without a robots.txt


Posted

Urgent assistance required.  I need to block URLs on my site.  I have deleted them and requested that Google remove them, but according to Google I need to add them to a robots.txt file or use meta tags.  Not sure I can do the second option, as I removed the pages?

  • Community Expert

Robots.txt

User-agent: *
Disallow: /some-folder/some-file.htm
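If you want to double-check what a rule like this actually blocks before relying on it, Python's standard library can parse robots.txt rules the same way well-behaved crawlers do. A minimal sketch (the domain and paths are illustrative, matching the example above):

```python
from urllib.robotparser import RobotFileParser

# The same rules as in the example above (paths are illustrative)
rules = """\
User-agent: *
Disallow: /some-folder/some-file.htm
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The disallowed URL is blocked for any crawler...
print(rp.can_fetch("*", "https://example.com/some-folder/some-file.htm"))  # False
# ...while other pages remain crawlable.
print(rp.can_fetch("*", "https://example.com/other-page.htm"))  # True
```

This only tells you how crawlers will interpret the rules; the file still has to be reachable at the root of your domain for it to take effect.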

 

  • Author

So I can basically make a text file, name it robots.txt, and put that inside?

 

Where should it be placed? In public_html?

  • Author

Thanks, seems to be working.

 

  • 2 weeks later...
  • Author

I was told to do this in the Google Webmasters support forum:

Quote
If it's a matter of you wanting to keep the URL live on the site but not see it in search results, you need to add a robots noindex meta tag to the page and NOT block it in the robots.txt file.
You can then request an outright URL removal.

Specifically, what would I have to put into this section, please?

[Screenshot: Screen Shot 2018-04-21 at 21.57.43.png]

For the meta tags, select 'robots' and then in the value field enter 'noindex'.
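For reference, that setting produces a standard robots meta tag in the page's `<head>`. If your platform lets you inject code directly instead, the equivalent tag looks like this (a minimal sketch of the standard tag, not specific to any one platform):

```html
<head>
  <!-- Tells search engines to crawl the page but keep it out of results -->
  <meta name="robots" content="noindex">
</head>
```

Note the quoted advice above: the page must remain crawlable (not blocked in robots.txt) so that search engines can actually see this tag.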

