Reputation Activity
-
Julia Osipova got a reaction from sobrenome in SEO: Improving crawling efficiency
A question about canonical tags.
I have set up '/community/store/' as the main community section, i.e. as the default page for '/community/' in the ACP.
After that, I received a notification from the Yandex search server that the '/community/store/' pages were excluded from Yandex search because they are duplicates of '/community/' and no canonical link is specified for them.
As a result, all my '/community/store' links disappeared from the search results, which led to very tangible losses for me.
Please tell me in which section of the admin panel I can configure the canonical link for the default community URL, i.e. where I can configure the addition of such a tag to the <head> of the '/community/' page:
<link rel="canonical" href="http://www.example.com/community/store"/>
For more information, see the Yandex search engine's instructions:
https://yandex.ru/support/webmaster/robot-workings/canonical.html
-
Julia Osipova got a reaction from sobrenome in SEO: Improving crawling efficiency
No.
I need to add it only for the root '/community/', not for all pages, and globalTemplate is a bad idea.
The Meta Tag editor can only add meta tags, can't it?
Can I add a <link> tag via the Meta Tag editor?
-
Julia Osipova got a reaction from sobrenome in SEO: Improving crawling efficiency
Well, I want '/community/store' to be the default target for all users who type '/community' into the browser address bar.
Not '/community/forum', and not '/community/gallery' or '/community/blog'.
That part works. But why can't I set a "canonical" for the default root page in IPS?
It's very, very, very painful...
-
Julia Osipova reacted to Everade in SEO: Improving crawling efficiency
@Matt
Please don't get me wrong, I'm happy that this has been addressed. Thanks for that.
But I think you missed something very important here.
Canonical Tag
It would make sense to apply canonical tags wherever possible before de-indexing thousands of pages.
Also, blocking a URL in robots.txt doesn't guarantee search engines won't index it. If other pages link to those specific URLs, the search engine can still index them despite your instructions, because it never fetches the page to see otherwise. That's why robots.txt is usually not the recommended tool for this.
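As a sketch of the alternative (not IPS-specific, page names are illustrative): a page that should be dropped from the index can say so directly in its own <head>, which search engines honor even when the URL is linked from elsewhere, precisely because the crawler must fetch the page to see the directive:

```html
<!-- Hypothetical example head: tells crawlers not to index this URL
     while still following its links. Unlike a robots.txt Disallow,
     this works even when other pages link here, because the crawler
     has to fetch the page to read it. -->
<head>
  <title>Faceted listing (example)</title>
  <meta name="robots" content="noindex, follow">
</head>
```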
Here's how to consolidate duplicate URLs:
https://developers.google.com/search/docs/advanced/crawling/consolidate-duplicate-urls
All major search engines highly recommend canonical tags instead of rules like:
# Block faceted pages and 301 redirect pages
Disallow: /*?sortby=
Disallow: /*?filter=
Disallow: /*?tab=comments
Disallow: /*?do=findComment
Disallow: /*?do=getLastComment
Disallow: /*?do=getNewComment
You're basically telling the search engines to "get off my website" rather than telling them which pages are the important ones.
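For example (URLs are illustrative), each faceted variant could instead point at its clean counterpart, letting search engines consolidate ranking signals onto one URL rather than being blocked from crawling entirely:

```html
<!-- Hypothetical: served on faceted URLs such as
     /community/store/?sortby=date or /community/store/?filter=new,
     pointing crawlers at the preferred clean URL. -->
<head>
  <title>Store (example)</title>
  <link rel="canonical" href="https://www.example.com/community/store/">
</head>
```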