Recommended Posts

Posted

Hi everyone,

In a past version of IP.Board (3.x, I don't remember which), the IPs from search engines were identified, and in the list of guests online you would sometimes see "Google" or "Bing".

Is there some way or plugin to add this to 4.5 as well? It was very useful for understanding whether all these guests are actually people or just robots.

Posted

We identify bots where possible in the online user list, but group them so only one is displayed (e.g. if 10 Googlebot sessions are on your site, it will just be reflected once).

Note that guest page caching and things of that nature can also reduce the number of guest visitors reflected in the interface.
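For readers wondering how this kind of identification and grouping is typically done: it usually comes down to matching the request's User-Agent header against a list of known crawler signatures. The sketch below is purely illustrative; the patterns, names and grouping logic are assumptions for the example, not IPS's actual code.

```python
import re

# Illustrative crawler signatures -> display labels (real lists are much longer
# and updated regularly; these entries are assumptions for the example).
KNOWN_BOTS = {
    r"Googlebot": "Google",
    r"bingbot": "Bing",
    r"DuckDuckBot": "DuckDuckGo",
    r"YandexBot": "Yandex",
}

def identify_bot(user_agent: str):
    """Return a search-engine label if the User-Agent matches a known crawler, else None."""
    for pattern, label in KNOWN_BOTS.items():
        if re.search(pattern, user_agent, re.IGNORECASE):
            return label
    return None

def group_online_guests(user_agents):
    """Collapse guest sessions into a plain guest count plus one entry per search engine."""
    summary = {"Guest": 0}
    for ua in user_agents:
        label = identify_bot(ua)
        if label is None:
            summary["Guest"] += 1
        else:
            summary[label] = 1  # ten Googlebot sessions still show up as one "Google"
    return summary
```

Fed a mix of browser and crawler user agents, group_online_guests would return something like {"Guest": 30, "Google": 1, "Bing": 1}, which is the kind of grouped display described above.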

  • 2 months later...
Posted

I hope we will get some help from someone here ... as it stands, it's hard to tell how many human guests are really active or whether it's all bots and spiders.

@bfarber mentioned earlier that bots were identified ... but where? :(

Posted

Sorry, I was mistaken and the behavior has changed.

The online list simply lists every active session, while the "Who's Online" widget lists every logged in member and a raw count of guests.
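In other words, as of 4.5 the widget only distinguishes members from guests; whatever flag may exist on the session is not surfaced. A rough illustration of that behaviour, with made-up field names:

```python
def whos_online_summary(sessions):
    """Widget output as described above: logged-in members listed by name,
    everything else (human guests and crawlers alike) folded into one raw count."""
    members = sorted({s["member_name"] for s in sessions if s.get("member_name")})
    guest_count = sum(1 for s in sessions if not s.get("member_name"))
    return {"members": members, "guests": guest_count}
```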

  • 2 weeks later...
Posted

OK, so that should be an easy fix then to make it show up in the Who's Online interface, no? Like in the past.

Can you supply a solution for this? Seeing how many of the guests were actually bots, or even more precisely Google, Bing, etc., was extremely helpful.

Posted

Well, that only goes to show how much this feature is requested.

Is there any particular reason why the feature was removed or not considered in the current version?

And is there a way to make this work?

Posted

Thanks.

When you write that it "will make most bots never initiate a session" - would that not also prevent bots from search engines like Bing and Google from crawling the site and making sure all the content is properly registered? A lot of traffic on my page comes from people having searched for some topic on Google, which then led them to my site.

Those search engine bots definitely need to be able to crawl the site. For this they will initiate a session. And I'd like to see this in my "Who's online".

As a user explained above: "Behind the scenes we flag if it's a search engine, but we do not expose this in the interface." I don't quite understand what's so difficult about this then and where exactly the reluctance is coming from.

Posted

@AndreasW2000, if you don't mind my asking, what's motivating the desire to have search engines (or other web crawlers) identified as such in Who's Online?

If the purpose is to determine the level of human traffic on your community as a community administrator, I'd suggest you explore some of the better options that exist for measuring this. If the purpose is for your membership to know this information, it might be helpful to talk a bit more about your use case here.

As an administrator, I'd recommend using a third-party analytics package such as Google Analytics. It's free, can be enabled natively in IPS (ACP > System > Community Enhancements), and will show you, in both real-time and historical snapshots, human visitors versus automated traffic. The data presented there will be far more accurate and insightful than any effort to visually monitor the Who's Online page.

1 hour ago, AndreasW2000 said:

When you write that it "will make most bots never initiate a session" - would that not also prevent bots from search engines like Bing and Google from crawling the site and making sure all the content is properly registered?

No, this simply means that there won't be any session data stored, nor the overhead of creating/updating that session data in your database or Redis. Doing this would improve your site's speed and performance for anything accessing your site that isn't logged in. Presumably (though I have no idea) these changes would apply to all non-authenticated traffic.

It would not block or prevent them from crawling your site, nor from viewing the content. It's just an under-the-hood optimization.
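The optimization being discussed amounts to not writing a session record for requests that identify themselves as crawlers, while still serving them the page. The following is a minimal sketch of that idea; the function, store and signature patterns are hypothetical, not IPS's actual implementation.

```python
import re

# Assumed crawler signatures for the example; real lists are far longer.
CRAWLERS = re.compile(r"Googlebot|bingbot|DuckDuckBot|YandexBot", re.IGNORECASE)

def handle_request(user_agent, session_id, session_store):
    """Serve the page, but only create/update a session for non-crawler visitors.

    Skipping the session write avoids one database/Redis write per crawler request;
    the crawler still receives the full page and can index the content normally.
    """
    if not CRAWLERS.search(user_agent or ""):
        # session_store stands in for the sessions table or Redis hash
        session_store[session_id] = {"last_activity": "now"}
    return "<html>...page content...</html>"
```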

Posted
On 2/11/2021 at 2:50 PM, Paul E. said:

No, this simply means that there won't be any session data stored, nor the overhead of creating/updating that session data in your database or Redis. Doing this would improve your site's speed and performance for anything accessing your site that isn't logged in. Presumably (though I have no idea) these changes would apply to all non-authenticated traffic.

It would not block or prevent them from crawling your site, nor from viewing the content. It's just an under-the-hood optimization.

Ah, okay, that's good to hear.

To your other questions: I am not using this information for detailed traffic analysis. It's simply helpful, honest and correct if, in the Who's Online view, I don't see 50 different "Guest" entries listed, but maybe only 30 "Guest" plus some "Google", "Bing", etc. Not only does it tell me as admin, and my moderators, that there are not actually 50 guests currently browsing, without requiring the backend or other stats; it also reminds the members that whatever they write is being observed, tracked and stored by Google and the likes.

Given that this feature was available before (and later as a marketplace plugin, which seems to have been removed now), and the fact that this information is being tracked internally anyway, I would like to understand why it cannot be made available in the frontend as before. It was just really helpful.
