
Everade

Members
  • Posts: 139
  • Joined

  • Last visited

Reputation Activity

  1. Like
    Everade got a reaction from Julia Osipova in SEO: Improving crawling efficiency   
    @Matt
    Please don't get me wrong, I'm happy that this has been addressed. Thanks for that.
    But I think you missed something very important here.
    Canonical Tag

    It would make sense to apply canonical tags wherever possible before un-indexing thousands of pages.
    Also, a robots.txt disallow doesn't mean search engines won't index a page. If other pages link to those URLs, search engines can still index them despite your instructions. That's why robots.txt is usually not the recommended tool for keeping pages out of the index.
    Here's how to consolidate duplicate URLs:
    https://developers.google.com/search/docs/advanced/crawling/consolidate-duplicate-urls
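    For reference, a canonical tag is a single line in each page's head, pointing every URL variant at the preferred version (the URL below is only a placeholder):

    ```html
    <!-- Placeholder URL: the preferred version of the page -->
    <link rel="canonical" href="https://example.com/topic/123-example-topic/" />
    ```

    With that in place, a crawler that lands on a variant such as /topic/123-example-topic/?sortby=date knows which URL should receive the ranking signals.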
    All major search engines highly recommend canonical tags instead of:

        # Block faceted pages and 301 redirect pages
        Disallow: /*?sortby=
        Disallow: /*?filter=
        Disallow: /*?tab=comments
        Disallow: /*?do=findComment
        Disallow: /*?do=getLastComment
        Disallow: /*?do=getNewComment

    You're basically telling the search engines to "get off my website", rather than telling them which pages are important.
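    To see what those Disallow rules actually match, here's a small standalone sketch (my own matcher for Google-style wildcard rules, not code from the forum software) that converts each pattern to a regex and tests typical forum URLs:

    ```python
    import re

    # Disallow patterns from the robots.txt excerpt above (Google-style wildcards)
    DISALLOW = [
        "/*?sortby=",
        "/*?filter=",
        "/*?tab=comments",
        "/*?do=findComment",
        "/*?do=getLastComment",
        "/*?do=getNewComment",
    ]

    def to_regex(pattern: str) -> re.Pattern:
        """Translate a robots.txt path pattern to a regex:
        '*' matches any run of characters, everything else is literal."""
        parts = []
        for ch in pattern:
            if ch == "*":
                parts.append(".*")
            else:
                parts.append(re.escape(ch))
        return re.compile("^" + "".join(parts))

    RULES = [to_regex(p) for p in DISALLOW]

    def is_disallowed(path: str) -> bool:
        """True if any Disallow rule matches the start of the path."""
        return any(rule.match(path) for rule in RULES)

    print(is_disallowed("/topic/123-example/?do=getNewComment"))  # True
    print(is_disallowed("/topic/123-example/"))                   # False
    ```

    The sketch shows why these rules are so broad: any URL containing one of those query strings is cut off, regardless of whether a duplicate exists elsewhere.
    
    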
     
     
  2. Thanks
    Everade reacted to Matt in SEO: Improving crawling efficiency   
    Thanks for the feedback 🙂

    We do use canonical tags heavily. However, the "do=getNewComment" style links cannot be canonicalised because they are just 301 redirects to another page.

    Canonical links have a place, but they still eat up crawl budget because Google has to crawl the page to see the tag and then decide what to do with it.

    We certainly are not telling Google "get off my site" - we are just strongly hinting to Google what we see as valuable parts of the site, and what we do not. Things like profile pages, redirect links and so on just eat up the budget for almost no return.
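    To illustrate the crawl-budget point above: a crawler only discovers a canonical tag after downloading and parsing the HTML, so the request is already spent by then. A minimal standalone sketch (the example.com URL and sample markup are placeholders):

    ```python
    from html.parser import HTMLParser

    class CanonicalFinder(HTMLParser):
        """Collect the href of a <link rel="canonical"> tag, if present."""
        def __init__(self):
            super().__init__()
            self.canonical = None

        def handle_starttag(self, tag, attrs):
            if tag == "link":
                a = dict(attrs)
                if (a.get("rel") or "").lower() == "canonical":
                    self.canonical = a.get("href")

    # The crawler learns the canonical URL only *after* fetching this HTML --
    # which is why the tag cannot save the fetch itself, only the indexing.
    SAMPLE = """
    <html><head>
      <link rel="canonical" href="https://example.com/topic/123-example-topic/" />
    </head><body>...</body></html>
    """

    finder = CanonicalFinder()
    finder.feed(SAMPLE)
    print(finder.canonical)  # https://example.com/topic/123-example-topic/
    ```

    By contrast, a robots.txt disallow is evaluated before the request is made, which is what saves crawl budget in the first place.
    
    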
  3. Like
    Everade got a reaction from SEO Guru in SEO: Improving crawling efficiency   
  4. Like
    Everade got a reaction from sobrenome in SEO: Improving crawling efficiency   
  5. Like
    Everade got a reaction from Maxxius in SEO: Improving crawling efficiency   