SeNioR- Posted February 6, 2018 (edited) @sadams101 download the robots.txt file from the attachment and edit the sitemap URL. After editing, upload it to the main directory (public_html). It works really well for indexing. robots.txt Edited February 6, 2018 by SeNioR-
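For reference, a robots.txt along those lines might look like the sketch below. The Disallow paths and the sitemap location are illustrative placeholders, not the exact contents of the attachment; adjust them to your own installation:

User-agent: *
Disallow: /profile/
Disallow: /search/
Sitemap: https://example.com/sitemap.php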
nodle Posted February 6, 2018 I don't want to be the bearer of bad news either, but I have had HTTPS set in my Google Search Console for quite some time, since I moved to HTTPS early on. Still, whatever has changed has dropped my forum's name from #1 in Google's search results to about page 7 or 8. It has almost completely removed me now. I haven't changed anything either. This last week I tried replacing my robots.txt with something new, hoping anything would make a difference. Just grasping at straws here now. Hoping 4.3 comes out soon and that it fixes things. I am at a loss here. prupdated 1
opentype Posted February 6, 2018 6 minutes ago, nodle said: Hoping 4.3 comes out soon and that it fixes things. Probably not, since no causal connection has been shown between sitemap creation speed (the actual topic here) and the decrease in indexing/ranking. So there is probably nothing to fix. 17 minutes ago, nodle said: I haven't changed anything either. That may be, but IPS didn't change anything either. Google changed things, and you can check whether their algorithm updates (e.g. Fred in 2017) correlate with your problems. That would actually tell you more about what you could improve. Thomas P and Numbered 2
AlexWebsites Posted February 6, 2018 I agree with @opentype. I don't think IPS changed much, but I do think there is room for improvement. There is probably an opportunity for someone to come out with a paid app/plugin or two that creates a more robust sitemap and adds some additional SEO settings: more dynamic and adjustable meta tags that include tags, adjustable description lengths, dynamic keywords drawn from tags, default keywords per forum, page titles, etc. Keywords may not be Google's thing, but they are still picked up by other sources. Interesting reads regarding description lengths in 2018: https://moz.com/blog/how-long-should-your-meta-description-be-2018 https://searchengineland.com/google-officially-increases-length-snippets-search-results-287596 https://searchengineland.com/google-fundamentals-writing-meta-descriptions-not-change-longer-search-snippets-288414 I haven't looked into the IPS code that controls the current cutoff, but I was interested in testing this out by increasing the current dynamic description tag to see if more info is picked up and weighted better. Cyboman 1
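As context for the description-length discussion, the tag in question is the description meta tag in each page's head. A hypothetical dynamically generated one looks like this (placeholder text, cut at whatever limit the software enforces):

<meta name="description" content="An excerpt pulled dynamically from the opening post of the topic, truncated at the configured character limit.">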
sadams101 Posted February 6, 2018 Why, again, would a proper robots.txt file NOT be included with this app? After running a site continuously for over 20 years, and using multiple applications over those years, a simple robots.txt file would have prevented many issues I've had with this app, for example Google's bot clicking and indexing the report button and other URLs that should probably never have been in the index. Why should it be up to each user to search this out or try to create one, when this is clearly a job for the software's designers? Also, the sitemap is very important to the Google index, just as site speed is. Blaming indexing issues on Google's Fred or any other update is not productive when your sitemap clearly has issues that need to be fixed, or are we still arguing about that? Not to go off topic, but this is related too: since the Fred update I suspect that some of you here are also being hit hard with site speed issues, and Google's penalties for those appear to be increasing. IPB's position on this so far has been that their software doesn't need to address this (in my case they were happy to blame my site's ads, as they've somewhat done in this thread too; the ads were not the root cause of my speed issues, and were in fact totally unrelated, because I serve them asynchronously). My site had dropped greatly in rank until I found workarounds that should just be built into the software. After I implemented some fairly easy changes, about a year ago, you can see my rank improve: https://www.alexa.com/siteinfo/celiac.com Site speed test: https://developers.google.com/speed/pagespeed/insights/?url=https%3A%2F%2Fwww.celiac.com%2Fgluten-free%2Fforum%2F13-celiac-disease-pre-diagnosis-testing-amp-symptoms%2F Compared to an "out of the box" version of IPB (I noticed that IPB has blocked the speed test tool from working on their forum): https://developers.google.com/speed/pagespeed/insights/?url=https%3A%2F%2Fwww.fiat-lancia.org.rs%2Fforum%2Findex.php%3F%2Ftopic%2F75871-kazna-za-izlaženje-iz-auta%2F My point here is that the search engine health of your forum, and of your users' forums, is far more important than all the new bells and whistles you keep adding (no complaint about new features, just that you should rethink your priorities). I am already seeing some improvement in the number of pages in Google's index after making the changes Upgradeovec recommended here: so far I've gone from the low point of 114,000 pages up to 130,000+ in a short time. I hope this improvement continues. For the rest of you: look into those site speed issues! I am no expert and used the help of Adlago to get my site scoring well on Google's speed test. SeNioR-, Adlago and Cyboman 3
opentype Posted February 6, 2018 (edited) Quote Blaming indexing issues on Google's Fred or any other update is not productive when your sitemap clearly has issues that need to be fixed, or are we still arguing about that? As long as you cannot demonstrate a connection, or at least make it plausible, you bet we are. Edited February 6, 2018 by opentype
Morgin Posted February 6, 2018 27 minutes ago, sadams101 said: Why, again, would a proper robots.txt file NOT be included with this app? After running a site continuously for over 20 years, and using multiple applications over those years, a simple robots.txt file would have prevented many issues I've had with this app, for example Google's bot clicking and indexing the report button and other URLs that should probably never have been in the index. Why should it be up to each user to search this out or try to create one, when this is clearly a job for the software's designers? Aren't these all noindex'd in the software? They appear to be on my forum. You don't need a robots.txt to duplicate noindex tags.
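For anyone unfamiliar, a noindex directive is a meta tag in the page's head, like the sketch below. Note that robots.txt and noindex don't stack well anyway: if a URL is blocked in robots.txt, Google never crawls it and therefore never sees the noindex tag.

<meta name="robots" content="noindex">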
sadams101 Posted February 6, 2018 Morgin, you may have just hit the nail on the head! I actually wasn't going to implement much of this sitemap, because I do want individual posts to be in the index. Why would I not want them to be? Many users over the years may link directly to a post, and a noindex would be telling Google not to index it (at least, that seems to me like what you would be telling Google). So if certain links are noindex, like the links to individual posts, I'd like to remove that noindex and see what happens.
sadams101 Posted February 6, 2018 So what links in the board are set to noindex?
AlexWebsites Posted February 6, 2018 21 minutes ago, sadams101 said: Morgin, you may have just hit the nail on the head! I actually wasn't going to implement much of this sitemap, because I do want individual posts to be in the index. Why would I not want them to be? Many users over the years may link directly to a post, and a noindex would be telling Google not to index it (at least, that seems to me like what you would be telling Google). So if certain links are noindex, like the links to individual posts, I'd like to remove that noindex and see what happens. If you include URLs for individual posts, you are going to get a bunch of URLs with duplicate content, specifically title tags and description tags. It would be interesting to index all posts, and I believe vBSEO did that as part of a vBulletin install, but you would need to append something unique to the page title, like " - Post #1234", and also add something unique to the page description meta tag to avoid duplication. It can be done, but it requires some modification and isn't built into IPS from what I see.
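As a sketch of that suggestion (hypothetical markup, not something IPS generates), each post URL would need head tags unique to that post, something like:

<title>Example Topic Title - Post #1234 - Example Community</title>
<meta name="description" content="A unique excerpt taken from post #1234 rather than from the topic's opening post.">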
Caputo Posted February 7, 2018 Hello, please help me. How can I solve this problem? Google says: "URL address blocked by robots.txt. Your sitemap contains URLs blocked by robots.txt." This is what's in the robots.txt file:
User-agent: *
Allow: /
Disallow: /profile/
opentype Posted February 7, 2018 (edited) 1 hour ago, MATIS FLORIN PASTOREL said: Hello, please help me. How can I solve this problem? Google says: "URL address blocked by robots.txt." That question doesn't really belong in this topic. And it's actually not a big problem. You are allowed to block pages through the robots.txt. To avoid this Google message in the future, open the sitemap settings in the ACP and turn off "profiles" by unchecking "unlimited" and entering "0" instead. Edited February 7, 2018 by opentype Numbered and Caputo 1 1
Matt (Management) Posted February 8, 2018 On 06/02/2018 at 8:09 PM, AlexWebsites said: If you include URLs for individual posts, you are going to get a bunch of URLs with duplicate content, specifically title tags and description tags. It would be interesting to index all posts, and I believe vBSEO did that as part of a vBulletin install, but you would need to append something unique to the page title, like " - Post #1234", and also add something unique to the page description meta tag to avoid duplication. It can be done, but it requires some modification and isn't built into IPS from what I see. We don't have links to a specific post. We have a 301 link which finds the post and presents it with an anchor, so the post URL is: foo.com/forums/topic/123-topic/#post1019 So there's no duplicate-content penalty. On 06/02/2018 at 5:55 PM, AlexWebsites said: I agree with @opentype. I don't think IPS changed much, but I do think there is room for improvement. There is probably an opportunity for someone to come out with a paid app/plugin or two that creates a more robust sitemap and adds some additional SEO settings: more dynamic and adjustable meta tags that include tags, adjustable description lengths, dynamic keywords drawn from tags, default keywords per forum, page titles, etc. Invision Community already has a dynamic meta tag editor, which allows you to tweak the meta tags for virtually any page.
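To illustrate the redirect Matt describes, using the example URLs that come up later in this thread: the findComment link is never a distinct indexable page; it simply 301s to the topic page with an anchor (illustrative sketch of the exchange, not captured output):

GET /forums/topic/442742-large-community-you-have-a-problems-with-sitemap/?do=findComment&comment=2729176
301 Moved Permanently
Location: /forums/topic/442742-large-community-you-have-a-problems-with-sitemap/?page=5&tab=comments#comment-2729176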
Matt (Management) Posted February 8, 2018 4.3 also uses 300 characters for the meta description. SeNioR-, crmarks, AlexWebsites and 1 other 2 2
AlexWebsites Posted February 8, 2018 (edited) 33 minutes ago, Matt said: We don't have links to a specific post. We have a 301 link which finds the post and presents it with an anchor, so the post URL is: foo.com/forums/topic/123-topic/#post1019 So there's no duplicate-content penalty. Invision Community already has a dynamic meta tag editor, which allows you to tweak the meta tags for virtually any page. I'll have to look at my sites, because my URLs don't redirect that prettily; they are mostly foo.com/forums/topic/topic-title/?page=2&tab=comments#comment-1019. What is the actual reason for the redirect, rather than direct post links within topics? The redirects don't get picked up. So in this topic, the following post: https://invisioncommunity.com/forums/topic/442742-large-community-you-have-a-problems-with-sitemap/?do=findComment&comment=2729176 redirects to: https://invisioncommunity.com/forums/topic/442742-large-community-you-have-a-problems-with-sitemap/?page=5&tab=comments#comment-2729176 The page title is: <title>Large community? You have a problems with sitemap! - Page 5 - Peer to Peer Technical Support - Invision Community</title> So every post URL on page 5 will have the same page title. You could avoid this by appending the post number to the title and meta description and having direct links that get picked up rather than redirecting, but maybe there is a good reason not to, I'm not sure. I think the other forums, like XenForo and vBulletin, do it similarly to IPS, with redirects as well. 27 minutes ago, Matt said: 4.3 also uses 300 characters for the meta description. That's great news! Edited February 8, 2018 by AlexWebsites
Matt (Management) Posted February 8, 2018 14 minutes ago, AlexWebsites said: https://invisioncommunity.com/forums/topic/442742-large-community-you-have-a-problems-with-sitemap/?page=5&tab=comments#comment-2729176 The page title is: <title>Large community? You have a problems with sitemap! - Page 5 - Peer to Peer Technical Support - Invision Community</title> This is correct, but Google will not see that as a unique link for that post. It'll see it as a unique link for that page, which is why it says "Page 5 -". We do not provide direct permalinks to posts because there is little value in it, and you would create duplicate content. vBSEO did (for a while) load a single post in a new window with a permalink, but that was back in 2008 and I'm sure things have moved on since then. AlexWebsites and crmarks 2
Matt (Management) Posted February 8, 2018 OK, so it's worth rounding up what Invision Community DOES do in terms of SEO:
- Auto-generates meta tags for the description
- Custom meta tag editor for finer control in a single area
- Uses appropriate HTTP status codes: 200 for OK, 301 for redirects, 303 for "the page is actually here", 404 for not found, 403 for permission errors, etc.
- Uses appropriate HTML markup to highlight important content (h1, h2, etc.)
- Uses rewritten URLs for a cleaner structure packed with keywords
- Creates and submits a sitemap to show Google which URLs are important to your community
- Uses nofollow where appropriate to stop pages like "Contact Us" from being crawled
- Uses JSON-LD microdata markup to tell Google what data it is seeing and how it should be used (example below)
- Allows easy integration with Google Search Console for tracking
- Uses HTTPS
- Has a responsive theme which gets the "Mobile Friendly" badge
Here's what is coming in 4.3:
- Meta description expanded to 300 characters
- Ability to rebuild your entire sitemap quickly
- Lastmod tag added to sitemap files
Not to mention other retention tools:
- Bulk emailing tool
- Emailed notifications
- Promote to social media
- Share to social media
There seems to be a level of worry in this topic, and while I'm happy to field any questions you have, Google is a bit mysterious and prone to changing things overnight. We adhere to good standards and do all the right things, as you can see from this list. We are not averse to change or to adding new features, but we never do it in a panic or as a knee-jerk reaction; we wait until we get some hard evidence which supports the reason for the change. We have been monitoring our own Google Search Console: clicks/impressions are up, and indexes are down slightly, but Google has seen those pages and flagged them as "discovered". They tend to be profiles of people who have never posted (and we have about 200k of those alone). I do not believe we are facing any crisis, or that anything is substantially wrong. We can always do better, and we're listening. We just need a little more than a few charts to go on before we make drastic changes. Jim M, Sheffielder, Numbered and 4 others 4 3
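As an illustration of the JSON-LD point above: structured data is embedded as a script block in the page's head. A minimal hypothetical example for a forum topic (not IPS's exact output) might be:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "DiscussionForumPosting",
  "headline": "Example topic title",
  "url": "https://example.com/forums/topic/123-example-topic/",
  "datePublished": "2018-02-08",
  "author": { "@type": "Person", "name": "ExampleUser" },
  "interactionStatistic": {
    "@type": "InteractionCounter",
    "interactionType": "https://schema.org/CommentAction",
    "userInteractionCount": 42
  }
}
</script>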
Matt (Management) Posted February 8, 2018 On 06/02/2018 at 4:52 PM, nodle said: I don't want to be the bearer of bad news either, but I have had HTTPS set in my Google Search Console for quite some time, since I moved to HTTPS early on. Still, whatever has changed has dropped my forum's name from #1 in Google's search results to about page 7 or 8. It has almost completely removed me now. I haven't changed anything either. This last week I tried replacing my robots.txt with something new, hoping anything would make a difference. Just grasping at straws here now. Hoping 4.3 comes out soon and that it fixes things. I am at a loss here. I've had a quick peek at your site, and I think you're probably being penalised based on your keywords. Given the prevalence of "fake news" and the pressure on Google/Facebook to do something about those sorts of sites, your keywords "community forum for members to discuss the paranormal, conspiracies, ufos, games, reviews, technology, politics" are probably a bit of a red flag.
AlexWebsites Posted February 8, 2018 16 minutes ago, Matt said: This is correct, but Google will not see that as a unique link for that post. It'll see it as a unique link for that page, which is why it says "Page 5 -". We do not provide direct permalinks to posts because there is little value in it, and you would create duplicate content. vBSEO did (for a while) load a single post in a new window with a permalink, but that was back in 2008 and I'm sure things have moved on since then. That's a good point. I do recall a permalinks section in vBSEO when I had it, but like you said, that was a while ago. I remember a ton of URLs in my forum sitemap under vBSEO, not just the topic URLs, because of all the posts. But duplicate content was an issue then, and it isn't anymore. Thanks for the explanation.
AlexWebsites Posted February 8, 2018 10 minutes ago, Matt said: OK, so it's worth rounding up what Invision Community DOES do in terms of SEO ... We can always do better, and we're listening. We just need a little more than a few charts to go on before we make drastic changes. Thanks for posting @Matt, really good stuff. Can you elaborate on "Allows easy integration with Google Search Console for tracking"? Is there anything more than the validation tag(s)? I just want to make sure I'm not missing something. As for the sitemap, which this topic is about: it's great that you are adding the <lastmod> tag and rebuild functionality, but what about <changefreq>? Is there a standard, built-in submission frequency? Is it daily, or upon new topic creation? Have you thought about an image sitemap, maybe specifically for images and videos from the Gallery app?
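For reference, here is what those tags look like inside a sitemap entry. The image entry uses Google's image sitemap extension and is shown as a hypothetical example, not something IPS is confirmed to generate; all URLs and values are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/forums/topic/123-example-topic/</loc>
    <lastmod>2018-02-08</lastmod>
    <changefreq>daily</changefreq>
    <image:image>
      <image:loc>https://example.com/uploads/example.jpg</image:loc>
    </image:image>
  </url>
</urlset>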
Silnei L Andrade Posted February 8, 2018 It's been about 7 months since I switched from phpBB to IP.Board, and to date the images on my site have not been indexed by Google. Before, we had many images indexed in Google Images, and since we are a travel site, that brought many visits. Our visits have fallen by half since then. Is there anything that can be done to get the images indexed again?
opentype Posted February 8, 2018 3 minutes ago, Silnei L Andrade said: Is there anything that can be done to get the images indexed again? Finding out why they aren't indexed would be the first step. This has nothing to do with IPS. The software just stores the images publicly, and they are indexed by default, unless you do something to prevent that, e.g. hiding posts from Google or disallowing indexing through the robots.txt.
Silnei L Andrade Posted February 8, 2018 @opentype I just noticed that I forgot to create the robots.txt. But could that be the problem?
opentype Posted February 8, 2018 8 minutes ago, Silnei L Andrade said: @opentype I just noticed that I forgot to create the robots.txt. But could that be the problem? No, the other way around: if you did have a robots.txt, something faulty in it could be preventing the image indexing. Silnei L Andrade 1
Matt (Management) Posted February 8, 2018 37 minutes ago, Silnei L Andrade said: It's been about 7 months since I switched from phpBB to IP.Board, and to date the images on my site have not been indexed by Google. Before, we had many images indexed in Google Images, and since we are a travel site, that brought many visits. Our visits have fallen by half since then. Is there anything that can be done to get the images indexed again? PM me with your site link and I'll take a look. 1 hour ago, AlexWebsites said: Allows easy integration with Google Search Console for tracking We have an "Analytics" section in the ACP where you paste in your tracking code, making it really easy to get Google Search Console tracking your site. You don't need to worry about editing templates, etc. Silnei L Andrade 1
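For anyone looking for it, the tracking code in question is the standard snippet Google hands out when you set up a property. As of early 2018 the gtag.js version looks like this, where UA-XXXXXXXX-1 is a placeholder for your own property ID:

<!-- Global Site Tag (gtag.js) - Google Analytics -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXXX-1"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'UA-XXXXXXXX-1');
</script>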