
I have some general feedback. At first glance the system looks fair and unique, but it has a major positive-feedback-loop flaw (somebody here is watching Mark Brown!): once a game is highly rated, it is more likely to be rated again and again, because raters can choose from a list of 5 games and will probably just pick the most popular ones on the list (that's what I did). On the other hand, this snowball effect is important for finding the needle in the haystack as the rating progresses.
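To illustrate the loop, here is a minimal simulation (a sketch, not the jam's actual queue logic; all names and parameters are made up) in which every rater always picks the most-rated game from a random queue of 5:

```python
import random

def simulate(num_games=100, num_ratings=2000, queue_size=5, seed=0):
    """Simulate raters who always pick the most-rated game from a random queue."""
    rng = random.Random(seed)
    counts = [0] * num_games
    for _ in range(num_ratings):
        queue = rng.sample(range(num_games), queue_size)
        # The rater sees the visible rating counts and picks the leader.
        pick = max(queue, key=lambda g: counts[g])
        counts[pick] += 1
    return counts

counts = simulate()
top_share = max(counts) / sum(counts)
```

With a uniform choice, the most-rated game would hold roughly 1% of all ratings here; under this rich-get-richer rule the leader's share ends up several times larger, which is exactly the snowball described above.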

I think it would make MUCH more sense to hide the rating information and only publish it in stages, maybe something like this:

  • First 72 hours: don't show any data on the rating page, just give random games to raters.
  • Next 24 hours: show a boolean indicating whether the game has been rated at all (>=1) or not.
  • Next 24 hours: only show the number of ratings on a coarse scale (>5, >50, etc.)
  • Next 48 hours: reveal all the information.
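The schedule above could be sketched as a simple lookup; the function name, phase boundaries, and return labels below are all hypothetical, just to make the phases concrete:

```python
def visibility(hours_elapsed, rating_count):
    """What the rating page shows at a given point in the rating period."""
    if hours_elapsed < 72:        # first 72 hours: hide everything
        return "hidden"
    if hours_elapsed < 96:        # next 24 hours: rated yes/no only
        return "rated" if rating_count >= 1 else "unrated"
    if hours_elapsed < 120:       # next 24 hours: coarse buckets
        for threshold in (50, 5):
            if rating_count > threshold:
                return f">{threshold}"
        return "<=5"
    return str(rating_count)      # final 48 hours: full counts
```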

Another thing to consider: if you ask people to rate 25 random games first, you might as well block them from choosing arbitrary games to rate at all (not even after the 25 ratings). The requirement might otherwise incentivize people to rate games randomly just to get to the 26th rating (there's no way to force raters to actually play!). I wonder how many raters just rate everything but one game with 1 star.

On a personal note: we made what I think is an amazing game, and we get great feedback from our friends and colleagues - but we get so little feedback from jammers and raters, and it's really a pity for us. We learned a lot and had a hell of a lot of fun, that's for sure - I just wish the rating distribution would make more game makers notice our game and give us the honest feedback we need so much.

Admin

I think this is great feedback. I think the issue of people always choosing the best game is unavoidable though, since even if the counts aren’t there, the viewer could go and look at the pages and select the best looking ones. There’s always going to be some bias since we let people rate from a queue size of 5. I’ll think about updating the queue page though for future jams.

> I wonder how many of the raters just rate everything but one game with 1 star.

There’s no way I can think of to truly block people from cheating or working around the system, but at least something like this can make it more obvious. In the end we can look over the results to identify raters to disqualify.
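One hedged sketch of such a post-hoc check, assuming ratings are available as (rater, game, stars) tuples (a made-up storage format, not the site's actual data model): flag raters who submitted many ratings that are all 1 star with at most one exception.

```python
from collections import defaultdict

def flag_suspicious(ratings, min_ratings=10):
    """Flag raters whose scores are all 1 star with at most one exception."""
    by_rater = defaultdict(list)
    for rater, _game, stars in ratings:
        by_rater[rater].append(stars)
    flagged = []
    for rater, stars in by_rater.items():
        if len(stars) >= min_ratings:
            non_ones = [s for s in stars if s != 1]
            if len(non_ones) <= 1:  # everything 1 star except (maybe) one game
                flagged.append(rater)
    return flagged
```

A check like this only surfaces candidates for a human to review; an honest rater in a weak pool could trip it, so it shouldn't disqualify anyone automatically.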

It’s the same idea for both issues: we can’t 100% remove these types of problems, but we can have systems in place to mitigate them and get closer to something that is fair.

Personally, 3 stars is 'average' to me; if you get 1 star, then you did really badly.