
Meta tags and indexing user profile pages

Gabriel Torres

Recommended Posts

Today I bring a very interesting suggestion: the ability for the platform to automatically create the meta description tags for the user profile pages. Taking my own profile in our community as an example: https://www.clubedohardware.com.br/profile/281750-gabriel-torres/

See how Google lists it ("There is no information available for this page").


It would be great if the platform automatically created the meta description based on a language phrase defaulting to "%s's profile at %s", where the first replacement would be the username and the second the website's name.
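As a sketch of the idea (the function name and defaults below are illustrative, not actual platform settings), the substitution would amount to something like:

```python
# Illustrative sketch of the suggested default template.
# The function name and signature are hypothetical, not part of the platform.
def profile_meta_description(username: str, site_name: str) -> str:
    # "%s's profile at %s" with the username and site name substituted in
    return "%s's profile at %s" % (username, site_name)

print(profile_meta_description("Gabriel Torres", "Clube do Hardware"))
# Gabriel Torres's profile at Clube do Hardware
```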



@Stuart Silvester brainfart here. I was aware of that; I even exchanged a couple of messages with you regarding robots.txt. We do in fact have this directive in our robots.txt. I edited the title of this topic to reflect our current (and correct) discussion.

# Block profile pages as these have little unique value, consume a lot of crawl time and contain hundreds of 301 links
Disallow: /profile/

I was intrigued as to why Google was still indexing my own profile and found this:



If other pages point to your page with descriptive text, Google could still index the URL without visiting the page. If you want to block your page from search results, use another method such as password protection or noindex.

If your web page is blocked with a robots.txt file, its URL can still appear in search results, but the search result will not have a description. Image files, video files, PDFs, and other non-HTML files will be excluded. If you see this search result for your page and want to fix it, remove the robots.txt entry blocking the page. If you want to hide the page completely from Search, use another method.

That is exactly what is happening here.

Therefore, based on what you wrote, besides the nofollow, I believe the noindex directive should probably be added to the user profile pages.
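For reference, on the profile pages themselves the directive would be a robots meta tag in the page's `<head>`, roughly:

```html
<meta name="robots" content="noindex">
```

(An equivalent alternative Google documents is sending an `X-Robots-Tag: noindex` HTTP response header.)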

However, in order for the noindex to work, the pages must not be blocked in robots.txt:



Important: For the noindex directive to be effective, the page or resource must not be blocked by a robots.txt file, and it has to be otherwise accessible to the crawler. If the page is blocked by a robots.txt file or the crawler can't access the page, the crawler will never see the noindex directive, and the page can still appear in search results, for example if other pages link to it.

Could you please look into this?


Edited by Gabriel Torres

@SeNioR- @Stuart Silvester

I decided to solve this in our install as follows:

1. Removed from robots.txt: Disallow: /profile/

2. Added a new SEO rule in ACP > Search Engine Optimization > Meta tags as:

Page address: /profile/*

Meta tag name: robots

Content: noindex,nofollow
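To confirm the rule took effect, one way (a sketch, assuming the tag ends up rendered in the page's `<head>`) is to check the served HTML for a robots meta tag containing noindex:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content attribute of every <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr_map = dict(attrs)
            if (attr_map.get("name") or "").lower() == "robots":
                self.directives.append(attr_map.get("content") or "")

def has_noindex(html: str) -> bool:
    """Return True if the HTML carries a robots meta tag with noindex."""
    finder = RobotsMetaFinder()
    finder.feed(html)
    return any("noindex" in content.lower() for content in finder.directives)

# Sample snippet mimicking what the ACP rule should emit on /profile/* pages
sample = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'
print(has_noindex(sample))  # True
```

In practice you would feed `has_noindex()` the body of an HTTP GET of a profile URL; the sample string here just stands in for that response.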

I found out that the platform already has all the tools we need to fine-tune SEO! It just takes some digging! 🙂


Edited by Gabriel Torres
