President Evil
Posted March 3, 2016

Hi, I noticed that after upgrading to IPB4 and switching web hosts, Google search no longer shows any crawled links for my site. Initially I thought this had something to do with Cloudflare, but it seems it's caused by my robots.txt file:

User-agent: *
Disallow: /
Disallow: /cgi-bin/

I have no idea how it was set up before. Can anyone tell me how to enable crawlers safely (only giving them access to the appropriate content)?
tnn
Posted March 4, 2016

Yes, you need:

User-agent: *
Disallow:
Disallow: /cgi-bin/

instead of:

User-agent: *
Disallow: /
Disallow: /cgi-bin/

The one you have blocks the entire website. You might want to find out how it was configured that way. Also, consider updating your title tag and description tag.
ASTRAPI
Posted March 6, 2016

Many bots don't respect robots.txt, so you should find other ways to block them. On Nginx, for example, you can map their user agents to a value (0 = allow, 1 = block) and deny requests based on it.
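A minimal sketch of the Nginx approach described above; the user-agent patterns and server name are illustrative placeholders, not a vetted blocklist:

```nginx
# In the http block: map the User-Agent header to a flag
# (0 = allow, 1 = block), as suggested in the post above.
map $http_user_agent $block_bot {
    default          0;
    ~*AhrefsBot      1;   # example pattern only
    ~*SemrushBot     1;   # example pattern only
}

server {
    listen 80;
    server_name example.com;  # placeholder

    location / {
        # Refuse requests from mapped user agents.
        if ($block_bot) {
            return 403;
        }
        # ... normal request handling ...
    }
}
```

Unlike robots.txt, which is purely advisory, this rejects matching requests at the server before they reach the application.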
Archived
This topic is now archived and is closed to further replies.