Recommended Posts

Posted

I have contributors downloading the same file repeatedly to fake their rank in the top downloads list. (Or asking other members to download repeatedly)

Can't we throw a "distinct" into that SQL count select to make the list fair? (Count each user only once, no matter how many times they download.)
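
For illustration, a minimal sketch of that change, assuming a hypothetical download log table downloads_log(file_id, member_id, downloaded_at) rather than the actual IPS schema:

```sql
-- Counting every log row lets one member inflate a file's rank:
SELECT file_id, COUNT(*) AS downloads
FROM downloads_log
GROUP BY file_id
ORDER BY downloads DESC
LIMIT 10;

-- Counting each member only once per file makes repeat downloads irrelevant:
SELECT file_id, COUNT(DISTINCT member_id) AS unique_downloaders
FROM downloads_log
GROUP BY file_id
ORDER BY unique_downloaders DESC
LIMIT 10;
```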

Posted
13 minutes ago, David.. said:

Can't it just be bypassed by creating multiple accounts then?

No, because you have to pay to download. If they go to that extent, then they aren't really cheating; they're purchasing multiple times, which I'm okay with.

Posted

I actually have the same problem on my site. Or did at one point. I had a member download his file over and over 77 times running up my bandwidth. He did it numerous times. I finally put an end to it by posting on my guidelines that anyone caught downloading files over and over will be banned. 

I can see uploading a file then testing it to make sure it works. Or even checking your file later on and making sure it works. 

I wish IPS or a developer would create something that would stop it. That way I don't have to continually monitor the issue. 

Posted
33 minutes ago, CP said:

I actually have the same problem on my site. Or did at one point. I had a member download his file over and over 77 times running up my bandwidth. He did it numerous times. I finally put an end to it by posting on my guidelines that anyone caught downloading files over and over will be banned. 

I can see uploading a file then testing it to make sure it works. Or even checking your file later on and making sure it works. 

I wish IPS or a developer would create something that would stop it. That way I don't have to continually monitor the issue. 

Right... it's really just a matter of adding a DISTINCT to the query. Not sure why this doesn't happen.

"count distinct user" instead of "count user"  *sigh* hopefully someone will catch this and make the edit

Posted (edited)

My personal solutions:

  • There should be a limit (or no log entry) when a user tries to download the same file more than x times in y days, both to save bandwidth and to stop this behaviour (cheating to win the downloads of the week)
  • A new block (if one doesn't already exist) that counts buyers rather than downloads
  • Rework the queries and add a GROUP BY on member_id, or on ip if member_id is null (see the sketch after this list)...
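
A rough sketch of that last idea, using the same hypothetical downloads_log table as above and assuming member_id is NULL for guest downloads:

```sql
-- Collapse the log to one row per file per "identity": members are identified
-- by member_id, guests fall back to their IP address.
SELECT file_id, COUNT(*) AS unique_downloaders
FROM (
    SELECT DISTINCT file_id,
           COALESCE(CAST(member_id AS CHAR), ip_address) AS identity
    FROM downloads_log
) AS per_identity
GROUP BY file_id
ORDER BY unique_downloaders DESC;
```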
Edited by BomAle
Posted

I think a happier medium would be to count how many people downloaded each specific version, because a new-version download should count as a new download (it indicates an active license, especially for purchases with renewals). I think that would be the fairest way to prevent that sort of thing.
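
A sketch of that per-version idea, assuming the hypothetical log table also records a file_version column:

```sql
-- Count each member at most once per version of a file: re-downloading the same
-- version adds nothing, but grabbing a new release still counts.
SELECT file_id, COUNT(*) AS version_downloads
FROM (
    SELECT DISTINCT file_id, file_version, member_id
    FROM downloads_log
) AS per_version
GROUP BY file_id
ORDER BY version_downloads DESC;
```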

  • 6 months later...
  • 4 months later...
  • 4 weeks later...
Posted
On 4/23/2018 at 3:43 PM, SJ77 said:

Anyone know if this is still an issue in 4.3?

Yes, it is.


I don't think I can hook into download() to exclude the file submitter. What I can do is make a plugin with a task that deletes the file submitter's downloads every hour, for example.

But that won't stop another user from downloading it repeatedly.
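
A rough sketch of the cleanup such a task might run, assuming hypothetical tables downloads_log(file_id, member_id, downloaded_at) and files(file_id, submitter_id, download_count), not the actual IPS schema:

```sql
-- Drop log rows where the downloader is the file's own submitter.
DELETE dl
FROM downloads_log AS dl
JOIN files AS f ON f.file_id = dl.file_id
WHERE dl.member_id = f.submitter_id;

-- Re-sync the cached per-file counter from whatever is left in the log.
UPDATE files AS f
SET f.download_count = (
    SELECT COUNT(*)
    FROM downloads_log AS dl
    WHERE dl.file_id = f.file_id
);
```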

Posted (edited)
On 5/6/2017 at 2:07 PM, BomAle said:

My personal solutions:

  • There should be a limit (or no log entry) when a user tries to download the same file more than x times in y days, both to save bandwidth and to stop this behaviour (cheating to win the downloads of the week)
  • A new block (if one doesn't already exist) that counts buyers rather than downloads
  • Rework the queries and add a GROUP BY on member_id, or on ip if member_id is null...

They can download all they want; it's the count in the rank that's causing the issue. I don't mind seeing the same user loads of times in the list of who downloaded; I just don't think it should add up to help rank the file.

Just put a DISTINCT in the user rank query. This is the whole reason SQL has DISTINCT.

Edited by SJ77
Posted
2 minutes ago, SJ77 said:

I don't mind seeing the same user loads of times in the list of who downloaded; I just don't think it should add up to help rank the file.

The rank is based on the number of downloads... so you need either (rough sketch after the list):

  • delete consecutive downloads by the same user within a short amount of time AND update the file's download count
  • simply update the file's download count (excluding the consecutive downloads)
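
A rough sketch of the first option, again with the hypothetical downloads_log and files tables (and an auto-increment id column on the log):

```sql
-- Drop repeat downloads of the same file by the same member within 24 hours,
-- keeping the earliest row.
DELETE dl
FROM downloads_log AS dl
JOIN downloads_log AS earlier
  ON earlier.file_id   = dl.file_id
 AND earlier.member_id = dl.member_id
 AND earlier.id        < dl.id
 AND dl.downloaded_at  < earlier.downloaded_at + INTERVAL 1 DAY
WHERE dl.member_id IS NOT NULL;

-- Then refresh the cached per-file counter, as in the earlier sketch.
UPDATE files AS f
SET f.download_count = (
    SELECT COUNT(*)
    FROM downloads_log AS dl
    WHERE dl.file_id = f.file_id
);
```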
Posted (edited)
On 5/17/2018 at 11:42 AM, Adriano Faria said:

The rank is based on the number of downloads... so you need either:

  • delete consecutive downloads by the same user within a short amount of time AND update the file's download count
  • simply update the file's download count (excluding the consecutive downloads)

OR only count distinct users in the rank query so each user only gets counted once. I can't be the only person who has heard of DISTINCT when writing a SQL query.

Edited by SJ77
Posted
1 minute ago, SJ77 said:

OR only count distinct users in the rank query so each user only gets counted once.

That will require IPS changes because we can't hook in the 'middle' of a function/method... you know it can take loooooooong to happen; if it does.

Posted
21 minutes ago, Adriano Faria said:

That will require IPS changes because we can't hook in the 'middle' of a function/method... you know it can take loooooooong to happen; if it does.

I see what you mean. With my original question I was kinda thinking this is something that IPS should fix. Seems like their original rank logic wasn't thought through completely. They really need to drop a distinct in that query.

  • 1 year later...
Posted

I was directed to this thread after posting my own regarding this topic.

It would be great if we had far more options regarding how Downloads are counted and how the whole ranking system works.

On my site, I would prefer to rank files by downloads rather than by the rankings people give them, as most are too lazy to use the ranking system, and forcing them to rank before downloading... leads to false rankings!!! 🙂

Anyways... BUMP... let's see if the IPS team will make the ranking/download count system more usable 🙂
