Quick question: when you say "of every game they'll evaluate" - does this mean they won't look at all the games? Or did you just mean that no one judge will play every game?
Hi, yeah, they'll play a shortlist of the games that we've selected - unfortunately we're victims of our own success and couldn't get them to guarantee playing all 72. Most will play many of them anyway, they just can't make a firm commitment that they will. The other option would be to split the games so that different judges play different games to cover them all - it'd mean less consistency in the judging, but maybe that's a better option for the future?
The latter is closer to how most professional game dev competitions I've judged for are run (PAX10, IndieCade, Game Dev Choice Awards, etc.). The judges all split up the games and then come back to discuss what they liked and to recommend games for various awards. The judges will often then play the suggested games in a second round, but at least every game is guaranteed a fair shot.
Not having any judge play some of the games perpetuates unfair disadvantages that already exist in the ratings. For example, one of the games in this jam is made to be played by people who are blind and intentionally features a blank screen in hard mode. That will likely hurt its rating for "visuals" and could place it further down in the overall ratings, but that shouldn't disqualify it from recognition in other areas.
(The best audio in the world could be featured in a crap game, their audio person should still get an award for best audio.)
Additionally, for games with fewer ratings, one or two people can tank the ratings for what should be a five-star game. If someone made a game featuring a transgender character, a couple of transphobic people could intentionally tank the game's ratings out of spite. The number of ratings a game receives is already influenced by things like popularity on YT/Twitch, whether you worked on a team, and whether there is a cute/attractive face on your title card art.
I think if you reach out to your judges about this now, you'll find some of them are likely to agree to split up and play the games that weren't already "selected".
I figured that judges sponsoring specific categories would judge the top scoring games in those categories and the rest of the judges would split up and judge the remaining games.
Since this jam is labeled as being connected with "Game Jobs", I expected that every submitted game would at least be played by a professional who might be looking to hire in the future. To me, that was the point of this jam. I took time away from my work, away from my family - away from my child - because I thought this was an opportunity to showcase something to professional devs and possibly set myself up for a job in the future. I know you mean well, but honestly, this is really upsetting.
Apologies if we've upset anyone - we're trying to do something good here and giving up a ton of our own spare time as well. I can guarantee that every single game will get played by folks in the industry who may be hiring - it's partly why we do the playthrough livestreams, to ensure that a wider group of industry folk get to see them too. The shortlist is manually curated, though, not just whatever games are rated most by the community, as I agree that would have all sorts of inherent unfairness.
I think it'd cause more problems if we tried to change the system midway, when many of the judges have already started, but we can speak to the companies about the implications of doing it differently in future - though it might mean not being able to get as many companies involved if we're asking them to spend time on games they're less likely to be interested in.
But apologies again if anyone feels misled - the thought of that is heartbreaking - and we'll definitely take these concerns into consideration for the future when weighing up all the different factors of making things as appealing and viable as possible to everyone concerned.