Darek_Hugo

Reputation Activity

  1. Thanks
    Darek_Hugo reacted to Ehren for a blog entry, Invision Community 5: A more performant, polished UI   
    As showcased in our past blogs, Invision Community 5 introduces a brand new, modern interface which brings improvements to performance, aesthetics and mobile usability.
    An optional side navigation panel, new view modes, light/dark modes, customizable header layouts, a search modal and a mobile navigation bar are some of the things we've showcased previously. Today, let's take a closer look at some of the other miscellaneous changes we've been working on while developing Version 5, including the code reductions and performance improvements we've been able to achieve in the process.
    For those of you who are developers, we'll also give some simple explanations of how (and why) we've implemented these changes.
     
    Widgets
    Sidebar widgets are perfect for displaying content feeds, featured members, announcements, advertisements and more on your page. In version 4, however, the widget column would often become an empty space once the widgets had been scrolled past:
      widgets-v4.mp4  
    In version 5, widgets now stick to the screen once the last widget has been reached, ensuring your readers have more convenient access to your widgets rather than an empty space:
      sticky-widgets-v5.mp4  
     
    Messenger
    The Messenger is a great way to reach out to members when a private chat is more appropriate than a topic. Inspired by modern email clients, the messenger in Version 5 has been revamped with a full-height, sticky inbox, a longer message snippet, mini profiles and a more polished UI - all with a 25% reduction in CSS and a 100% reduction in Javascript.
    messenger-v5.mp4
     
    Sticky elements
    We've mentioned sticky elements a couple of times now, so let's take a look behind the scenes at how they're created and at some of the performance improvements in Version 5. Traditionally, sticky elements were created using Javascript, which would calculate the position of the element on the page and adjust its stickiness every time the page was scrolled. Scroll events can be quite taxing for browsers, and when it comes to Javascript, the less, the better (especially when aiming for great page speed scores)!
    With that in mind, all sticky elements are now handled using sticky positioning via CSS, which is a native and much more performant way of controlling these elements. We've been able to replace an entire 400-line Javascript component with just 3 lines of CSS.
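    As a rough sketch of the idea (the selector and offset below are illustrative, not our actual stylesheet), sticky positioning only needs an offset to pin against:

      .sidebar-widgets {
          /* Pins the widget column 16px from the top of the viewport once it would otherwise scroll out of view */
          position: sticky;
          top: 16px;
      }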
     
    Grids and Masonry
    Grids have previously been handled in a similar fashion. Javascript would scan all elements within a grid to determine how many could fit on a single line, and would then shuffle these elements into position after the page was loaded or resized. CSS has since introduced its own grid properties, which have allowed us to replace more than 350 lines of Javascript with just a few lines of CSS, resulting in more performant page rendering and nicer looking grids (especially on small to medium displays such as mobiles and tablets).
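    To give a rough idea of the approach (the class name and sizing values here are purely illustrative), a responsive grid like this needs no Javascript at all:

      .entry-grid {
          display: grid;
          /* Fit as many 280px-minimum columns as the container allows, sharing any leftover space equally */
          grid-template-columns: repeat(auto-fill, minmax(280px, 1fr));
          gap: 16px;
      }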
     

     
    Fun fact: We first introduced a similar performance improvement to "masonry grids" in our Gallery update from January this year, by replacing more than 400 lines of Javascript with, you guessed it, just a few lines of CSS.
     
    Click targets
    We wanted to make Version 5 as simple as possible to navigate, and one way we've done that is by implementing larger click targets. Clicking anywhere inside an entry in a table or grid will now take you to that entry (you can still click other links within the click target as normal, such as subforum or profile links). Click targets are optional and can be disabled via your theme settings if necessary.
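    One common CSS-only way to build this kind of click target (shown here as an illustrative sketch rather than our exact implementation) is to stretch the entry's primary link over the whole row while lifting the other links above it:

      .entry-row {
          position: relative;
      }
      .entry-row .entry-link::after {
          /* Invisible overlay that stretches the primary link's hit area across the entire row */
          content: "";
          position: absolute;
          inset: 0;
      }
      .entry-row a:not(.entry-link) {
          /* Keep secondary links (subforums, profiles) clickable above the overlay */
          position: relative;
          z-index: 1;
      }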
     
    click-targets.mp4
     
    Data Lists (tables)
    Speaking of tables, they too have been revamped. Tables automatically adapt to the space they've been assigned (for those curious, this is done using CSS container queries), so they're always neat regardless of the screen size, with no overflow or squashed layouts. Behind the scenes, the two columns below are created with identical code, yet they look quite different due to the size they've been allocated. Even with these improvements, tables have received a 25% reduction in CSS.
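    As a simple sketch of how container queries work (the names and breakpoint below are assumptions for illustration), the table responds to the width of its wrapper rather than the viewport:

      .table-wrapper {
          /* Make the wrapper a queryable container */
          container-type: inline-size;
      }
      @container (max-width: 500px) {
          /* When the wrapper itself is narrow, stack each row's cells vertically instead of overflowing */
          .table-wrapper td {
              display: block;
              width: 100%;
          }
      }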
     

     
    Profiles
    Profiles have been polished for Version 5 and include some nice improvements such as sticky widgets and tabs. 
    profile-desktop.mp4
     
    On mobiles, the side column collapses into a carousel, and the sticky tabs allow you to easily flick between content types without scrolling to the top of the page.
    profile-mobile.mp4
     

    Tabs
    You may have noticed in the above clip that tabs on mobiles are now scrollable, compared to the dropdown menu used in version 4. We made this change to ensure that tabs are given more equal exposure on small devices, and we've managed to reduce the CSS by a whopping 80%.
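    The scrollable tab bar itself comes down to very little CSS; a minimal sketch with an illustrative class name:

      .tab-bar {
          display: flex;
          /* Let the tabs scroll sideways on narrow screens instead of collapsing into a dropdown */
          overflow-x: auto;
      }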
     
    Carousels
    Last, but certainly not least, are carousels. Carousels are great for displaying large amounts of data in a confined space, and they've been rewritten from scratch for version 5. Previously, a Javascript library was used to create the "scroll effect"; however, this was never the smoothest experience on laptop trackpads and touch devices.
    In version 5, carousels are powered by native smooth-scrolling and scroll-snapping, which results in a much nicer user experience, especially on touchscreens. We've been able to remove a staggering 95% of the Javascript, substituting it with just a few lines of CSS.
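    A minimal sketch of the technique (class names and slide widths are illustrative only):

      .carousel {
          display: flex;
          gap: 16px;
          overflow-x: auto;
          scroll-behavior: smooth;        /* native smooth scrolling when slides are moved programmatically */
          scroll-snap-type: x mandatory;  /* snap horizontally so a slide always settles neatly into view */
      }
      .carousel > * {
          flex: 0 0 80%;
          scroll-snap-align: start;
      }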
     
    carousel.mp4
     
    To be honest, we've only just scratched the surface here! In addition to these changes, we've modernized (and reduced the code of) almost every component throughout the suite, including avatars, cover photos, dropdown menus, forms, inputs, buttons, lists, off-canvas menus, side menus, columns and more!
    Combined, these changes result in not only a significant reduction in code, but also a polished UI that performs smoothly on desktop and touch devices. We're excited to continue modernizing Invision Community well into the future as new technologies and techniques become available to us, and are looking forward to getting it in your hands in 2024.
  2. Like
    Darek_Hugo reacted to Matt for a blog entry, SEO: Improving crawling efficiency   
    No matter how good your content is, how accurate your keywords are or how precise your microdata is, inefficient crawling reduces the number of pages Google will read and store from your site.
    Search engines need to look at and store as many of the pages that exist on the internet as possible. There are an estimated 4.5 billion web pages active today. That's a lot of work for Google.
    It cannot look at and store every page, so it needs to decide what to keep and how long it will spend on your site indexing pages.
    Right now, Invision Community is not very good at helping Google understand what is important and how to get there quickly. This blog article runs through the changes we've made to improve crawling efficiency dramatically, starting with Invision Community 4.6.8, our November release.

    The short version
    This entry will get a little technical. The short version is that we remove a lot of pages from Google's view, including user profiles and the filters that create faceted pages, and we remove a lot of redirect links to reduce the crawl depth and the volume of thin, low-value content. Instead, we want Google to focus wholly on topics, posts and other key user-generated content.
    Let's now take a deep dive into what crawl budget is, the current problem and the solution, and finally look at a before and after analysis. Note, I use the terms "Google" and "search engines" interchangeably. I know that there are many wonderful search engines available, but most people understand what Google is and does.
    Crawl depth and budget
    In terms of crawl efficiency, there are two metrics to think about: crawl depth and crawl budget. The crawl budget is the number of links Google (and other search engines) will spider per day. The time spent on your site and the number of links examined depend on multiple factors, including site age, site freshness and more. For example, Google may choose to look at fewer than 100 links per day from your site, whereas Twitter may see hundreds of thousands of links indexed per day.
    Crawl depth is essentially how many links Google has to follow to index a page. The fewer links it takes to reach a page, the better. Generally speaking, Google will reduce the indexing of links more than 5 to 6 clicks deep.
    The current problem #1: Crawl depth
    A community generates a lot of linked content. Many of these links, such as permalinks to specific posts and redirects to scroll to new posts in a topic, are very useful for logged in members but less so to spiders. These links are easy to spot; just look for "&do=getNewComment" or "&do=getLastComment" in the URL. Indeed, even guests would struggle to use these convenience links given the lack of unread tracking until logged in. Although they offer no clear advantage to guests or search engines, they are prolific, and following them results in a redirect, which increases the crawl depth for content such as topics.
    The current problem #2: Crawl budget and faceted content
    A single user profile page can have around 150 redirect links to existing content. User profiles are linked from many pages. A single page of a topic will have around 25 links to user profiles. That's potentially 3,750 links Google has to crawl before deciding if any of them should be stored. Even sites with a healthy crawl budget will see a lot of that budget eaten up by links that add nothing new to the search index. These links are also very deep into the site, adding to the overall average crawl depth, which can signal search engines to reduce your crawl budget.
    Filters are a valuable tool to sort lists of data in particular ways. For example, when viewing a list of topics, you can filter by the number of replies or when the topic was created. Unfortunately, these filters are a problem for search engines as they create faceted navigation, which creates duplicate pages.

    The solution
    There is a straightforward solution to all of the problems outlined above. We can ask Google to avoid indexing certain pages. We can help by using a mix of hints and directives to ensure pages without valuable content are ignored, and by reducing the number of links needed to reach the content. We have used "noindex" in the past, but this still eats up the crawl budget, as Google has to crawl the page to learn that we do not want it stored in the index.
    Fortunately, Google has a hint directive called "nofollow", which you can apply in the <a href> code that wraps a link. This sends a strong hint that this link should not be read at all. However, Google may wish to follow it anyway, which means that we need to use a special file that contains firm instructions for Google on what to follow and index.
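    For example (the URL and link text below are illustrative, not taken from a real community), the hint is added as a rel attribute on the link itself:

      <!-- Hints to search engines that this profile link is not worth following -->
      <a href="/profile/123-example-member/" rel="nofollow">Example Member</a>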
    That special file is called robots.txt. We can use it to write rules that ensure search engines don't waste their valuable time looking at links that do not have valuable content, links that create faceted navigation issues and links that lead to a redirect.
    Invision Community will now create a dynamic robots.txt file with rules optimised for your community, or you can create custom rules if you prefer.
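    As a rough illustration only (the paths below are assumptions, not the exact rules the generator produces, since the real file is tailored to each community), the rules take a form like this:

      User-agent: *
      # Skip thin or duplicate content such as profiles and faceted filter pages
      Disallow: /profile/
      Disallow: /*sortby=
      # Skip convenience links that only redirect to a post within a topic
      Disallow: /*do=getNewComment
      Disallow: /*do=getLastComment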

    The new robots.txt generator in Invision Community
    Analysis: Before and after
    Using a popular SEO site audit tool, I took a benchmark crawl of my test community, which has 50 members and around 20,000 posts, most of which were populated from RSS feeds, so they have actual content, including links. There are approximately 5,000 topics visible to guests.
    Once I had implemented the "nofollow" changes, removed a lot of the redirect links for guests and added an optimised robots.txt file, I completed another crawl.
    Let's compare the data from before and after.
    First up, the raw numbers show a stark difference.

    Before our changes, the audit tool crawled 176,175 links, of which nearly 23% were redirect links. After, just 6,389 links were crawled, with only 0.4% being redirect links. This is a dramatic reduction in both crawl budget usage and crawl depth. Simply by guiding Google away from thin content like profiles, leaderboards, online lists and redirect links, we can ask it to focus on content such as topics and posts.

    Note: You may notice a large drop in "Blocked by Robots.txt" in the 'after' crawl despite using a robots.txt file for the first time. The figure here also includes sharer images and other external links which are blocked by those sites' robots.txt files. I added nofollow to the external links for the 'after' crawl so they were not fetched and then blocked externally.

    As we can see in the 'before' crawl, the crawl depth has a low peak between 5 and 7 levels deep, with a strong peak at 10+.

    After, the peak crawl depth is just 3. This will send a strong signal to Google that your site is optimised and worth crawling more often.
    Let's look at a crawl visualisation before we made these changes. It's easy to see how most content was found via table filters, which led to a redirect (the red dots), dramatically increasing crawl depth and reducing crawl efficiency.

    Compare that with the 'after' crawl, which shows a much more ordered crawl, with all content discoverable as expected and no red dots indicating redirects.

    Conclusion
    SEO is a multi-faceted discipline. In the past, we have focused on ensuring we send the correct headers, use the correct microdata such as JSON-LD and optimise meta tags. These are all vital parts of ensuring your site is optimised for crawling. However, as we can see in this blog, without focusing on crawl budget and crawl efficiency, even the most accurately presented content is wasted if it is not discovered and added to the search index.
    These simple changes will offer considerable advantages to how Google and other search engines spider your site.
    The features and changes outlined in this blog will be available in our November release, which will be Invision Community 4.6.8.