sadams101 Posted September 25, 2019
Since the end of June my site has had a quickly growing number of forum pages that Google cannot properly crawl (see Screen 1 below). When I open any of these links in Google Search Console (see Screen 2) and run the "Test Live URL" tool, there are zero issues. All of the issues are reported for "Crawled as Googlebot smartphone." I checked my change log for any changes I made just before this issue began and found the following:
6/5/2019 - Upgraded to IPB 4.4.4
6/5/2019 - Upgraded from PHP 7.1 w/ memcache to PHP 7.3 w/ memcache
6/11/2019 - Applied patch for Invision Community v4.4.4
I have switched to PHP 7.2 to see if this changes anything. Since the issue could be related to the IPB upgrades I ran during that time, I am checking to see whether anyone else is having this issue.
Screen 1
Screen 2
sadams101 (Author) Posted September 27, 2019
I may go this route, but in all honesty, tickets on any topic related to Google indexing issues do not seem to be well received by your team. Also, I am testing this as a possible fix and will report back if it works. We reported this error:
bfarber Posted September 28, 2019
Preloading (or not) a CSS file is not going to cause a "crawl anomaly". You're welcome to try his feedback, but I would be shocked if it had any impact on the issue you're reporting here.
Sonya* Posted September 28, 2019
Hi, we have had the same issue since the end of June this year: all of these pages are reported as having a crawl issue, but none of the pages I have checked manually has any problem. They can be opened and viewed normally, and a live test on the pages in Google Webmaster performs well. I can also resubmit a URL for indexing afterwards. The "Validate fix" button fails after a few days; I have started it more than 10 times since then, and each time it runs and then fails. I have no explanation for this. The "funny" thing: you can see when a page was last crawled in the list of examples below the graph. I have checked my logs, and on the date a page was supposedly last crawled there were no requests from the Google bot for these pages.
sadams101 (Author) Posted October 1, 2019
Interesting. This is why I posted, because I suspected that others are having this issue. Your time period matches mine, but it looks like your issues may be getting resolved, as I see a nice drop in issues around 9/15. So far I have not seen a drop.
Sonya* Posted October 2, 2019
@sadams101, this is my assumption about what is going on: Google changed its algorithm for calculating the crawl budget in June, and it now reports these errors when it runs out of crawl budget. That would also explain why there are no requests for the pages in question on the last-crawled date. To reduce these errors:
I have reviewed my robots.txt and excluded everything that includes parameters like sort, sortby, sortdirection, etc. I have also excluded profiles, calendar weeks, blogs, everything that has "noindex" due to lack of content, and any technical URL starting with /application/ and so on.
I have also reviewed the URL parameters section in Google Webmaster and excluded those parameters there as well.
I have excluded some "unimportant" apps from sitemap generation, and I have limited the number of URLs in the generated sitemaps; there are now 500 to 1000 URLs instead of ALL of them.
It seems that this helps to reduce the number of errors. You can indeed see that the issues tend to be resolved. I cannot confirm yet that it is solved, but I can see the reduction. If you have a large project, try to reduce the number of crawled pages and see if it helps.
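For anyone who wants to check how many URLs their sitemaps currently expose before and after such a change, here is a minimal Python sketch. The sitemap URL is only an example (Invision Community generates a sitemap index, typically at /sitemap.php), and it assumes standard sitemap XML:

# Count URLs per sub-sitemap in a sitemap index (minimal sketch).
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_INDEX = "https://www.example.com/sitemap.php"   # example URL; use your own
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def fetch_xml(url):
    with urllib.request.urlopen(url, timeout=30) as resp:
        return ET.fromstring(resp.read())

index = fetch_xml(SITEMAP_INDEX)
sub_sitemaps = [loc.text.strip() for loc in index.findall("sm:sitemap/sm:loc", NS)]
if not sub_sitemaps:
    # Not a sitemap index; it is a plain urlset, so count it directly.
    print(len(index.findall("sm:url", NS)), "URLs in", SITEMAP_INDEX)
else:
    total = 0
    for url in sub_sitemaps:
        count = len(fetch_xml(url).findall("sm:url", NS))
        total += count
        print(f"{count:6d}  {url}")
    print(f"{total:6d}  total URLs across all sub-sitemaps")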
sadams101 (Author) Posted October 2, 2019
Any chance you can share your robots.txt file with us?
Sonya* Posted October 2, 2019
36 minutes ago, sadams101 said: Any chance you can share your robots.txt file with us?
It heavily depends on the apps you have installed, how extensively you use them, and how important they are for your SEO. This file cannot be used as-is by everyone; here is just an example.
User-agent: *
Disallow: /profile/
Disallow: /notifications/
Disallow: /applications/
Disallow: /calendar/*/week/
Disallow: /*?*sortby=
Disallow: /*?*sort=
Disallow: /*?*sortdirection=
Disallow: /*?*desc=
Disallow: /*?*d=
Disallow: /*?*tab=reviews
Sitemap: https://www.example.com/sitemap.php
You have to check whether it suits YOUR project, and triple-check the file before uploading it. A mistake here can ruin your SEO.
sadams101 (Author) Posted October 3, 2019
Thank you for this. I've seen similar robots.txt files with entries like the one below. Any idea why some do these differently?
Disallow: /*sortby=*
Would it be possible for you to share a screenshot of your Google URL parameters?
sadams101 (Author) Posted October 3, 2019
One ongoing issue that may cause this is something I've noticed with caching. I occasionally get errors saying that my site is not mobile friendly. When this happens I run a live test and see my mobile skin cached incorrectly, so that the right column appears as it does in the desktop version; I then need to clear my cache using the support tool and re-test, and then all is OK. I believe this might be the real culprit behind these errors. I just discovered it was happening and fixed it. The question is, why does it keep happening? Certainly it could be a bug in my custom skin, but does anyone else have this issue? Does it happen in the default skin?
Sonya* Posted October 3, 2019
15 hours ago, sadams101 said: I've seen similar robots.txt files with entries like the one below. Any idea why some do these differently? Disallow: /*sortby=*
It is the same. My version makes sure that sortby is a query parameter and not part of the URL path. On the other hand, a URL path cannot contain "=", so the shorter pattern would also only match the parameter 😉
15 hours ago, sadams101 said: Would it be possible for you to share a screenshot of your Google URL parameters?
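For anyone who wants to sanity-check which URLs such wildcard rules would block before uploading a robots.txt, here is a minimal Python sketch of Google-style pattern matching ('*' as a wildcard, '$' as an end anchor). The rules and the test URL are only examples, and this is not a full robots.txt parser:

# Minimal sketch of Google-style robots.txt wildcard matching.
import re

def rule_matches(rule, path_and_query):
    # '*' matches any run of characters; a trailing '$' anchors the end of the URL.
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, path_and_query) is not None

url = "/forums/forum/1-example/?sortby=title"
for rule in ("/*?*sortby=", "/*sortby=*"):
    print(rule, "->", rule_matches(rule, url))
# Both print True: either form blocks URLs with a sortby parameter,
# because the path itself can never contain "=".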
sadams101 (Author) Posted October 16, 2019
I am still having issues with the crawl anomaly, and I'm not sure this is a resource issue. I've noticed some problems with the pages after running the Live Test on a page with this issue and then choosing "View Tested Page." For some reason, Google cannot load most of the images on my page, and there are JavaScript errors. Here is a screenshot of what Google sees on one of these pages... look to the right and you will see a broken link where my logo should be:
When I select the "More Info" tab at the right I see "Page resources: 26/40 couldn't be loaded"... examples below (each of these loads fine, yet they do not load for Googlebot smartphone):
Other error - Image - https://sfd.celiac.com/uploads/monthly_2019_06/New-Forum-Logo-5.png.65a8d036ad415ff45b764ff7f7042bcb.png
Other error - Image - https://www.celiac.com/images/email-alerts.jpg
Other error - Stylesheet - https://sfd.celiac.com/uploads/css_built_92/258adbb6e4f3e83cd3b355f84e3fa002_custom.css.b83b4b5cfcb918bc1bccdfd229502414.css?v=71afe62b2f
Other error - Stylesheet - https://sfd.celiac.com/uploads/css_built_92/341e4a57816af3ba440d891ca87450ff_framework.css.f0261c2153b1309bc85f80c48f8c796a.css?v=71afe62b2f
Other error - Stylesheet - https://sfd.celiac.com/uploads/css_built_92/5a0da001ccc2200dc5625c3f3934497d_core_responsive.css.9742300d8adbaf3f5ba3d82897e41e0c.css?v=71afe62b2f
Other error - Script - https://sfd.celiac.com/uploads/javascript_core/front_front_core.js.49bfd1bb4632e56bb3f123c0434eccc6.js
JavaScript console messages (I am not sure what these errors mean): 3 messages
Error 00:12.000 Uncaught ReferenceError: ips is not defined at https://www.celiac.com/forums/topic/44830-helpful-tips/:11787:4 (https://www.celiac.com/forums/topic/44830-helpful-tips/:11786)
Error 00:12.000 Uncaught ReferenceError: $ is not defined at https://www.celiac.com/forums/topic/44830-helpful-tips/:12300:11 (https://www.celiac.com/forums/topic/44830-helpful-tips/:12299)
Error 00:15.000 Uncaught ReferenceError: $ is not defined at init (https://www.celiac.com/forums/topic/44830-helpful-tips/:12283:1) (https://www.celiac.com/forums/topic/44830-helpful-tips/:12282)
But the errors are increasing:
Any help would be appreciated.
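To confirm that the resources flagged with "Other error" really are reachable from outside, one option is to request a few of them directly and print the HTTP status codes. A minimal Python sketch using two of the URLs listed above; a 200 here only shows the server serves them, not that Googlebot chose to fetch them:

# Minimal sketch: request each "failed" page resource and print its HTTP status.
import urllib.error
import urllib.request

RESOURCES = [
    "https://www.celiac.com/images/email-alerts.jpg",
    "https://sfd.celiac.com/uploads/javascript_core/front_front_core.js.49bfd1bb4632e56bb3f123c0434eccc6.js",
]

for url in RESOURCES:
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "resource-check/1.0"})
    try:
        with urllib.request.urlopen(req, timeout=15) as resp:
            print(resp.status, url)
    except urllib.error.HTTPError as err:
        print(err.code, url)
    except urllib.error.URLError as err:
        print("ERR", err.reason, url)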
sadams101 (Author) Posted October 16, 2019
PS - I just tried uploading a new logo in my custom skin, then refreshed the view and fetched the page again as Googlebot smartphone, and look at the screenshot to the right... the page is the desktop version! I suspect what is happening is a caching issue where the page served is not mobile friendly, hence the error. Clearing the cache in the support tool fixes this, but it keeps coming back:
Sonya* Posted October 16, 2019
2 hours ago, sadams101 said: For some reason, Google cannot load most of the images on my page, and there are JavaScript errors.
There is a quota for the testing tools similar to the crawl budget:
Quote: Google has a quota per site for the number of requests it is WILLING to make to the particular server. Partly to avoid an accidental DOS (if Google made requests without restraint it could quickly overwhelm most servers; Google has more servers than most sites!), but also as a 'resource sharing' system, to avoid devoting too much bandwidth to any particular site and then not being able to index other sites. .... So some requests 'fail' because Google proactively aborted the request, without contacting the server, because it WOULD potentially push it over quota.
Source: https://support.google.com/webmasters/thread/2293148?hl=en
We have checked the "issue" by starting the testing tool AND viewing our live server logs at the same time. The resources that are supposed to have crawl issues have not even been requested by the Google bot. Google fails, but not because your site fails; it fails because it "decided" not to crawl. We are now consistently reducing the number of pages in the sitemap. We also block a lot of "unimportant" URLs in robots.txt, and we work with parameters. While the number of indexed pages goes down, the number of errors goes down as well. There is no change in organic traffic: we "lose" pages but not traffic. The pages we have delisted were just wasting our crawl budget, so we are concentrating on the most important and newest pages at the moment.
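The log check described above is easy to script. A minimal Python sketch, assuming a combined-format access log at an example path and one of the URLs flagged earlier; note that a match on the "Googlebot" user-agent string does not prove the client was really Google, which would need a reverse DNS check:

# Minimal sketch: list access-log lines where Googlebot requested a given path.
LOG_FILE = "/var/log/apache2/access.log"      # example path; adjust to your server
PATH = "/forums/topic/44830-helpful-tips/"    # the URL flagged by Search Console

with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
    hits = [line.rstrip() for line in log if "Googlebot" in line and PATH in line]

print(f"{len(hits)} Googlebot request(s) logged for {PATH}")
for line in hits:
    print(line)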
sadams101 (Author) Posted October 16, 2019
I have a really hard time believing that my requesting one single web page on my site would exceed any quota that Google might have for my site. Honestly, I'm using their tool, which is supposed to help me debug any issues... why would it throw those errors if there were NO issues? It makes no sense whatsoever, and I've seen as many different explanations of this crawl anomaly as can be found with a Google search for the issue--dozens, and all are different. I can pull up 20 different threads on Google's forum that all say very different things from different "Gold Product Experts" like the one you quote here. I know you believe it is a crawl budget issue that is causing two very different problems, but why wouldn't Google just list it as "Crawl Budget" directly?
sadams101 (Author) Posted October 17, 2019
@Sonya* Regarding your robots.txt, is there a reason you did not add these?
Disallow: /submit/
Disallow: /*?*do=add
Sonya* Posted October 17, 2019
@sadams101, the robots.txt above was just an example. We have even more included there, but it differs from project to project. Both of those URLs are not reachable by guests/bots in our projects, as we do not use the "Post before register" feature, so there is no need for us to exclude them.
Sonya* Posted October 17, 2019
7 hours ago, sadams101 said: It makes no sense whatsoever
Yep, that's why I do not pay much attention to "issues" in Google Webmaster that we cannot reproduce. As I have said: if there is NO request from the Google bot for a URL that is supposed to have a crawling issue, there is not much I can do about it, is there?
7 hours ago, sadams101 said: but why wouldn't Google just list it as "Crawl Budget" directly?
I also wish Google would not return a generic "crawl issue" error when there is nothing wrong with the URL itself.
bfarber Posted October 17, 2019
19 hours ago, sadams101 said: I suspect what is happening is a caching issue where the page served is not mobile friendly, hence the error.
Just a note - this is not actually possible. We do not serve a different version of the page based on whether the request is mobile or not; we serve the exact same HTML to every single client. It is the CSS that determines when to move things around, hide things, etc. based on the device's display. This technique is known as "responsive design". Just wanted to be clear... it's not possible that a "non-mobile-friendly" page could ever be served.
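One way to test this claim on a live site is to fetch the same topic page with a desktop and a smartphone user agent and compare which CSS/JS bundles each response references. A minimal Python sketch, assuming the topic URL from the earlier post and illustrative user-agent strings; dynamic values such as CSRF keys will always differ, so raw HTML equality is not a useful check:

# Minimal sketch: fetch one page with a desktop and a smartphone user agent
# and compare the CSS/JS resources each response references.
import re
import urllib.request

URL = "https://www.celiac.com/forums/topic/44830-helpful-tips/"   # example page
AGENTS = {
    "desktop": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "smartphone": ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
                   "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
                   "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
                   "+http://www.google.com/bot.html)"),
}

def fetch_assets(url, user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=30) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    # Collect every stylesheet/script URL referenced by the page.
    return set(re.findall(r'(?:href|src)="([^"]+\.(?:css|js)[^"]*)"', html))

desktop = fetch_assets(URL, AGENTS["desktop"])
mobile = fetch_assets(URL, AGENTS["smartphone"])
print("Only in desktop response:", desktop - mobile or "none")
print("Only in smartphone response:", mobile - desktop or "none")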
sadams101 (Author) Posted October 17, 2019
2 hours ago, bfarber said: Just a note - this is not actually possible. We do not serve a different version of the page based on whether the request is mobile or not; we serve the exact same HTML to every single client. It is the CSS that determines when to move things around, hide things, etc. based on the device's display. This technique is known as "responsive design". Just wanted to be clear... it's not possible that a "non-mobile-friendly" page could ever be served.
I took a screenshot yesterday when it happened. It was a desktop page served to Googlebot smartphone, hence my theory about a caching issue--notice the right column showing up in the screenshot:
sadams101 (Author) Posted October 24, 2019
I opened a thread on this topic in Google's forums (https://support.google.com/webmasters/thread/16988763?hl=en) where, at least in my case, they noticed that I had a large number of 301 redirects in my site's links. I have fixed many of these, with the exception of the one below, which shows up for guest posters. Does anyone know how to set up a FURL so that this becomes https://www.celiac.com/login/ ?
https://www.celiac.com/index.php?app=core&module=system&controller=login
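A quick way to audit a page for internal links that still answer with a redirect, as the Google forum thread suggested, is to collect its hrefs and issue HEAD requests without following redirects. A minimal Python sketch, using an example start URL and checking only the links found on that single page:

# Minimal sketch: list same-site links on one page that respond with a redirect.
import re
import urllib.error
import urllib.request
from urllib.parse import urljoin, urlparse

START = "https://www.celiac.com/"   # example start page

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None makes urllib raise HTTPError instead of silently following.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

with urllib.request.urlopen(START, timeout=30) as resp:
    html = resp.read().decode("utf-8", errors="replace")

host = urlparse(START).netloc
links = {urljoin(START, href.split("#")[0]) for href in re.findall(r'href="([^"]+)"', html)}

for link in sorted(links):
    if urlparse(link).netloc != host:
        continue  # only audit internal links
    try:
        opener.open(urllib.request.Request(link, method="HEAD"), timeout=15).close()
    except urllib.error.HTTPError as err:
        if err.code in (301, 302, 303, 307, 308):
            print(err.code, link, "->", err.headers.get("Location"))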
bfarber Posted October 25, 2019
{url="app=core&module=system&controller=login" seoTemplate="login"}
sadams101 (Author) Posted October 25, 2019
I do see a FURL already set up for this type of link, but for some reason it does not seem to be applied in the post-as-guest field... the original URL shows up there. Should I put the template bit you shared somewhere to fix that?
bfarber Posted October 28, 2019
It sounds like you're saying the non-FURL version is being used, and if a guest clicks it they're redirected to the FURL, is that right? Where is the non-FURL version showing up exactly?
This topic is now archived and is closed to further replies.