Neil Mackenzie, March 9, 2014, in Feature Suggestions
For a future upgrade, would it be possible to expand the forum statistics we can access, such as:
- number of visitors
- how they accessed Forum
- geographical location of users
- bandwidth usage
I know that Google Analytics (or similar) can be installed, but I would have thought this was fairly basic information that most websites (let alone forums) would find useful.
Bandwidth usage isn't really something the software can reliably track - your web server should show you that information. The forum software doesn't always know which files are being served on a given page, and even when it does, it doesn't always know the file sizes. It's simply not the front-end software's "job" to track this sort of information.
What do you mean by "how they accessed the forum"? I assume you basically want to know the referrer (whether they clicked a link from a search engine, another site, or neither)?
It would be useful to know the referrer, plus the name and version of the browser and the OS.
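Worth noting: the referrer, browser, and OS are already recorded by most web servers. Apache's standard "combined" log format captures the referrer and the full user-agent string (which includes the browser name, version, and OS). A minimal sketch of the relevant Apache configuration (the log path is an example; adjust for your server):

```apache
# "combined" log format: %{Referer}i logs the referrer,
# %{User-Agent}i logs the browser/OS string
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
CustomLog /var/log/apache2/access.log combined
```

Tools like AWStats then parse these logs into per-browser, per-OS, and per-referrer reports.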
I feel that there are already excellent free tools that do this job, so it doesn't make a lot of sense to try to better them at the application level.
Google Analytics and Webmaster Tools are your friends. :)
There are some analytics that I think would fit better in IPB than trying to set up GA to track them.
These are some MySQL queries I set up for my own analytics, but I haven't made anything graphical out of them (yet).
New forum posts by date:
SELECT SUM(`posts`) AS posts, `date` FROM (
  ( SELECT COUNT(*) AS posts, DATE(FROM_UNIXTIME(`post_date`)) AS 'date' FROM `posts` GROUP BY `date` )
  UNION ALL -- combine live and archived posts so the outer SUM adds both subtotals
  ( SELECT COUNT(*) AS posts, DATE(FROM_UNIXTIME(`archive_content_date`)) AS 'date' FROM `forums_archive_posts` GROUP BY `date` )
) c GROUP BY `date`
New forum posts by hour of the day:
SELECT SUM(`posts`) AS posts, `hour` FROM (
  ( SELECT COUNT(*) AS posts, HOUR(FROM_UNIXTIME(`post_date`)) AS 'hour' FROM `posts` GROUP BY `hour` )
  UNION ALL -- combine live and archived posts so the outer SUM adds both subtotals
  ( SELECT COUNT(*) AS posts, HOUR(FROM_UNIXTIME(`archive_content_date`)) AS 'hour' FROM `forums_archive_posts` GROUP BY `hour` )
) c GROUP BY `hour`
Registrations by date (Note: deleting a user completely can throw these off). I know the dashboard shows this, but it only includes the previous 7 days.
SELECT DATE(FROM_UNIXTIME(`joined`)) AS 'date', COUNT(*) AS registrations
FROM `members` -- the IPB member table; adjust for your table prefix
GROUP BY `date`
ORDER BY `date` DESC
Users' last visit by date. Also includes a running total of users who have visited since each date (e.g. 3 days ago = today + yesterday + the day before):
SELECT a.*, @current := (@current + a.users) AS users_total
FROM (
  SELECT DATE(FROM_UNIXTIME(`last_visit`)) AS 'date', COUNT(*) AS users,
         DATEDIFF(NOW(), DATE(FROM_UNIXTIME(`last_visit`))) AS days_ago
  FROM `members` -- the IPB member table; adjust for your table prefix
  GROUP BY `date`
  ORDER BY `date` DESC
) a, (SELECT @current := 0) vars -- initialise the running-total variable
One thing that I would really like to see is how many inactive users (e.g. not logged in for 30+ days) have returned, and on which dates. This would require a new table to store last_visit values, compared by a daily cron job. It would no doubt be useful for seeing whether things like email campaigns are working to pull users back to the forums.
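A sketch of what that could look like, assuming a hypothetical snapshot table (the table name `member_visit_snapshot` and the 30-day threshold are illustrative, not part of the IPB schema):

```sql
-- Hypothetical table, filled once per day by a cron job with a copy of
-- each member's last_visit timestamp (not part of IPB)
CREATE TABLE member_visit_snapshot (
  member_id     INT NOT NULL,
  snapshot_date DATE NOT NULL,
  last_visit    INT NOT NULL,  -- unix timestamp copied from the members table
  PRIMARY KEY (member_id, snapshot_date)
);

-- Members who were inactive for 30+ days as of yesterday's snapshot
-- but show a newer last_visit in today's snapshot (i.e. they returned today)
SELECT CURDATE() AS return_date, COUNT(*) AS returned_users
FROM member_visit_snapshot y
JOIN member_visit_snapshot t ON t.member_id = y.member_id
WHERE y.snapshot_date = CURDATE() - INTERVAL 1 DAY
  AND t.snapshot_date = CURDATE()
  AND y.last_visit < UNIX_TIMESTAMP(CURDATE() - INTERVAL 30 DAY)
  AND t.last_visit > y.last_visit;
```

Keeping one row per member per day also lets you re-run the comparison for any past date, not just today.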
I get all this from "AWStats" on my server.
If you want to track bandwidth usage at the server level, I highly recommend vnstat. As previously stated, accurately measuring bandwidth consumption at the application level is very difficult.
For general-purpose, self-hosted analytics, I highly recommend Piwik; just be sure your server is powerful enough to run it. Otherwise, Google Analytics is a good option.
Which free tools do you recommend, Matt?
I am on IPS Hosting, which charges by number of users over a 48-hour period, so I am very concerned with keeping out bad bots, spiders, etc. Can I install something on the IPS server to help me do this?
The first question is whether these bots/spiders count as a "user" in IPS Hosting. I don't use it and don't know their rules, but this is always your first step.
Second step (actually blocking them, if you have FTP access): your robots.txt will keep out bots/spiders that obey a robots.txt file. You can also write rules in your .htaccess to block the IP addresses of bots you want to keep out. Make sure you don't block helpful bots/spiders such as Google, Bing, etc.
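For example, a robots.txt that allows only the major search engines and turns away everything else that honours the file might look like this (Googlebot and Bingbot are the real Google/Bing user agents; an empty Disallow means "allow everything", and compliant crawlers pick the most specific group that matches them):

```
User-agent: Googlebot
Disallow:

User-agent: Bingbot
Disallow:

User-agent: *
Disallow: /
```

Remember this only works for bots that choose to obey robots.txt; misbehaving crawlers need .htaccess or firewall rules.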
Yes, bots, etc. count as users. robots.txt is already in place.
I have a lot of Guests not identified as bots, from Amazon AWS, for example, that are not humans.
I am looking for streamlined ways to identify them -- as many Webmasters are!
Other than the main search engines, I don't need any non-humans as Guests on my site.
If you are using Apache, you can use a .htaccess file to block visitors from specific IP addresses or with specific user agents.
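A minimal .htaccess sketch for Apache 2.4, assuming the IP range and bot names are placeholders you would replace with ones seen in your own access logs (203.0.113.0/24 is a reserved documentation range):

```apache
# Block one example IP range with Apache 2.4's Require directives
<RequireAll>
    Require all granted
    Require not ip 203.0.113.0/24
</RequireAll>

# Return 403 Forbidden to example crawler user agents via mod_rewrite
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (AhrefsBot|SemrushBot) [NC]
RewriteRule .* - [F,L]
```

User-agent blocking is easy to evade (many scrapers fake a browser user agent), so IP-based rules tend to be more reliable for cloud-hosted bots like the AWS traffic mentioned above.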
This topic is now archived and is closed to further replies.