President Evil
Posted March 3, 2016

Hi, I noticed after upgrading to IPB4 and switching web hosts that Google search no longer shows any crawled links for my site. Initially I thought this had something to do with Cloudflare, but it seems it's caused by my robots.txt file:

User-agent: *
Disallow: /
Disallow: /cgi-bin/

I have no idea how it was set up before. Can anyone tell me how to enable crawlers safely (only giving them access to the appropriate stuff)?
tnn
Posted March 4, 2016

Yes, you need:

User-agent: *
Disallow:
Disallow: /cgi-bin/

instead of:

User-agent: *
Disallow: /
Disallow: /cgi-bin/

The one you have blocks the entire website. You might want to find out how it was configured that way. Also, consider changing your title tag and description tag.
ASTRAPI
Posted March 6, 2016

Most bad bots don't respect robots.txt, so you should find other ways to block them. On Nginx, for example, you can map their user agents to a value, like 0 for allow and 1 for block, and reject the flagged requests; a sketch follows below.
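For illustration, here is a minimal sketch of the Nginx map approach described above. The bot names (SemrushBot, AhrefsBot) are only example patterns I've chosen, not a recommendation of what to block:

# In the http{} context: map the User-Agent header to a flag (0 = allow, 1 = block).
map $http_user_agent $block_bot {
    default          0;  # allow everything by default
    ~*semrushbot     1;  # example pattern, case-insensitive regex match
    ~*ahrefsbot      1;  # example pattern
}

# In the server{} or location{} context: reject requests flagged above.
if ($block_bot) {
    return 403;
}

Unlike robots.txt, this is enforced by the server itself, so it also stops crawlers that ignore robots.txt.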