
Forum Statistics in next upgrade?


Neil Mackenzie

Recommended Posts

For a future upgrade, would it be possible to expand the Forum statistics we can access, such as:

- number of visitors
- how they accessed Forum
- geographical location of users
- bandwidth usage

I know that Google Analytics (or similar) can be installed, but I would have thought that this was fairly basic information that most websites (let alone Forums) would find useful.

Many thanks

Neil Mackenzie


Bandwidth usage isn't really something the software can reliably track; your web server should show you that information. The forum software doesn't always know which files are being served on any given page, and even when it does know some files, it doesn't always know their sizes. It's just not the front-end software's "job" to track this sort of information.

What do you mean by "how they accessed Forum"? I assume you want to know the referrer (whether they clicked a link from a search engine or another site, or arrived directly)?


  • 1 month later...

There are some analytics that I think would fit better in IPB than trying to set up GA to track them.

These are some MySQL queries I set up for my own analytics, but I haven't made anything graphical out of them (yet).

New forum posts by date:

SELECT SUM(`posts`) AS posts, `date` FROM (
    ( SELECT COUNT(*) AS posts, DATE(FROM_UNIXTIME(`post_date`)) AS 'date' FROM `posts` GROUP BY date )
    UNION ALL -- ALL, so a matching count/date pair from the archive isn't deduplicated away
    ( SELECT COUNT(*) AS posts, DATE(FROM_UNIXTIME(`archive_content_date`)) AS 'date' FROM `forums_archive_posts` GROUP BY date )
) c GROUP BY `date`

New forum posts by hour of the day:

SELECT SUM(`posts`) AS posts, `hour` FROM (
    ( SELECT COUNT(*) AS posts, HOUR(FROM_UNIXTIME(`post_date`)) AS 'hour' FROM `posts` GROUP BY hour )
    UNION ALL -- again UNION ALL, since identical rows from both tables must both be kept
    ( SELECT COUNT(*) AS posts, HOUR(FROM_UNIXTIME(`archive_content_date`)) AS 'hour' FROM `forums_archive_posts` GROUP BY hour )
) c GROUP BY `hour`

Registrations by date (note: completely deleting a user can throw these off). I know the dashboard shows this, but it only covers the previous 7 days.

SELECT DATE(FROM_UNIXTIME(`joined`)) AS 'date', COUNT(*) AS registrations
FROM `members`
GROUP BY `date`
ORDER BY `date` DESC

Users' last visit by date. The running total (users_total) also tells you how many users have visited since a given date (ex: 3 days ago = today + yesterday + the day before).

-- running total kept in a user variable as the rows stream out, newest date first
SET @current:=0;
SELECT *, @current:=(@current+a.users) AS users_total
FROM (
    SELECT DATE(FROM_UNIXTIME(`last_visit`)) AS 'date', COUNT(*) AS users, DATEDIFF(NOW(), DATE(FROM_UNIXTIME(`last_visit`)) ) AS days_ago
    FROM `members`
    GROUP BY `date`
    ORDER BY `date` DESC
) a

One thing that I would really like to see is how many inactive users (ex: not logged in for 30+ days) have returned, and on which dates. This would require a new table to store last_visit values, compared by a daily cronjob; a sketch is below. It would no doubt be useful to see whether things like email campaigns are working to pull users back to the forums.
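A minimal sketch of that idea, assuming a hypothetical snapshot table (the `member_visit_snapshots` name and layout are mine, not part of IPB) that the daily cronjob fills from `members`:

-- Hypothetical snapshot table: one row per member per day.
CREATE TABLE `member_visit_snapshots` (
    `member_id`     INT NOT NULL,
    `last_visit`    INT NOT NULL,      -- unix timestamp copied from `members`
    `snapshot_date` DATE NOT NULL,
    PRIMARY KEY (`member_id`, `snapshot_date`)
);

-- Daily cronjob: record everyone's current last_visit.
INSERT INTO `member_visit_snapshots` (`member_id`, `last_visit`, `snapshot_date`)
SELECT `member_id`, `last_visit`, CURDATE() FROM `members`;

-- Returned inactive users per day: inactive for 30+ days as of yesterday's
-- snapshot, but with a newer last_visit in today's snapshot.
SELECT t.`snapshot_date` AS 'date', COUNT(*) AS returned
FROM `member_visit_snapshots` t
JOIN `member_visit_snapshots` y
  ON y.`member_id` = t.`member_id`
 AND y.`snapshot_date` = t.`snapshot_date` - INTERVAL 1 DAY
WHERE y.`last_visit` < UNIX_TIMESTAMP(y.`snapshot_date` - INTERVAL 30 DAY)
  AND t.`last_visit` > y.`last_visit`
GROUP BY t.`snapshot_date`
ORDER BY t.`snapshot_date` DESC;

Once a few weeks of snapshots have accumulated, the returned counts can be lined up against email campaign dates.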


If you want to track bandwidth usage at the server level, I highly recommend vnstat. As previously stated, accurately measuring bandwidth consumption at the application level is very difficult.
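For what it's worth, once vnstat is installed and has been left to collect data on the right interface, the summaries are a single command away (these are standard vnstat flags):

vnstat -d   # daily traffic totals
vnstat -m   # monthly traffic totals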

For general-purpose, self-hosted analytics, I highly recommend Piwik; just be sure your server is powerful enough to run it. Otherwise, Google Analytics is a good option.


  • 6 months later...

I feel that there are already excellent free tools that do this job, so it doesn't make a lot of sense to try to better them at the application level.

Which free tools do you recommend, Matt?

I am on IPS Hosting, which charges by number of users over a 48-hour period, so I am very concerned with keeping out bad bots, spiders, etc. Can I install something on the IPS server to help me do this?


The first question is whether these bots/spiders count as a "user" under IPS Hosting. I don't use it and don't know their rules, but this is always your first step.

Second step (actually blocking them, if you have FTP access): your robots.txt will keep out bots/spiders that obey a robots.txt file. You can also write rules in your .htaccess to block bot IP addresses. Make sure you don't block helpful bots/spiders such as Google, Bing, etc. A rough sketch of both follows.
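In outline (the bot name and IP range below are placeholders, not a recommended blocklist, and the .htaccess rule uses Apache 2.4 syntax):

# robots.txt (only honoured by bots that choose to obey it)
User-agent: SomeBadBot
Disallow: /

# .htaccess (Apache 2.4): enforced even for bots that ignore robots.txt
<RequireAll>
    Require all granted
    Require not ip 192.0.2.0/24
</RequireAll>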


Yes, bots, etc. count as users. robots.txt is already in place.

I have a lot of Guests that are not identified as bots (from Amazon AWS, for example) and are clearly not human.

I am looking for streamlined ways to identify them -- as many Webmasters are!

Other than the main search engines, I don't need any non-humans as Guests on my site.
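If you have database access, one low-tech starting point is to rank guest sessions by IP address and check the heavy hitters against Amazon's published AWS IP ranges. This assumes the IPB 3.x `sessions` table with `member_id` and `ip_address` columns; verify it against your own schema first.

-- Guest sessions (no member attached) grouped by IP; heavily repeated
-- IPs tend to be datacenter traffic rather than humans.
SELECT `ip_address`, COUNT(*) AS sessions
FROM `sessions`
WHERE `member_id` = 0 OR `member_id` IS NULL
GROUP BY `ip_address`
ORDER BY sessions DESC
LIMIT 50;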

