Why are ratings and reviews the way they are?
Very likely to save the staff workload on a feature that will not get paid for.
The way it is now, I see few to no cases where someone tried to fake the ratings. And your example of the fps player trying a puzzle game is not valid. For one, every star rating has to have the same weight, or the rating system has no right to exist at all. If you weight people's ratings, how would you do that? Have people whose opinion counts more?! Why not let the developer rate their own game while we are at it? Should we take qualifying tests to prove our expertise in a certain genre?
But the bigger issue is that this is not what I see happening. Most people do not bother to rate. At all. Someone who rates each and everything they come across is the exception. And should they do so, well, why should that rating not be counted? Sure, there are dishonest ratings that rate things down (or up) by criteria other than the ones you would apply. Like not liking the genre. Or the developer. Or the tools used to create the game. AI comes to mind.
What is a rating worth if you only let players rate who like that kind of content in general? Hmm. Let's think about it. Oh, maybe lots of seemingly non-mainstream things would get sorted high in the highest-rated section. That is actually happening, because, as I said, most people do not bother to rate things. So content that is not mainstream will attract its target audience and get high ratings. Just look at the browse section and the highest rated games. Those are not rated by an unbiased general player base.
So in other words, what you want is already happening. Puzzle enthusiasts will rate puzzles. Fps enthusiasts will rate fps games. The occasional misplaced player who does not like the topic of a game will not distort the result much. One could even argue that if you mis-advertise a game to attract a player who will be disappointed by it, the negative rating is well earned.
So what about blocking ratings from people who did not download a thing? Tricky. If people want to give dishonest ratings, they will find a way. Maybe it could be implemented for paid games. But you also have to consider cheating in the other direction: developers giving themselves fake 5-star ratings. That is easily done for paid games, as the creator can give the game away to fake accounts to collect their ratings. But this is likely to get detected and the game banned. And if a developer sees lots of 1-star ratings from seemingly fake accounts, that can be reported, should it not get detected automatically.
What I would improve in the current rating system are two things. First, show the distribution of ratings. It is interesting to see whether your 3-star average comes from two 3s, or from one 1 and one 5. Or whether that 4-star average consists of many 4s, or mostly 5s and the odd 1.
Second, make it clearer while rating that the review text is not shown on the game page.
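To illustrate the distribution point above: identical averages can hide very different receptions. A quick sketch in plain Python, with made-up rating lists:

```python
from statistics import mean

# Two hypothetical games with the same 3-star average but opposite receptions
consistent = [3, 3]   # two middling ratings
polarizing = [1, 5]   # one hater, one fan
assert mean(consistent) == mean(polarizing) == 3

# A 4-star average can likewise hide a split audience
steady = [4, 4, 4, 4]  # uniformly "good"
split  = [5, 5, 5, 1]  # loved by most, with one outlier 1
assert mean(steady) == mean(split) == 4
```

Only the full histogram distinguishes these cases, which is exactly why showing the distribution adds information the average alone cannot carry.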
To find games, you can try to find collections that contain games you like, either by trawling the feeds or by picking one of the collections you see while browsing. You could even follow one of the accounts that regularly rates games to your taste, so their ratings show up in your feed.
The general problem for indie games is the relative scale you rate on. 5 stars for Dark Souls is not the same as 5 stars for a hobby game clicked together in a game engine. To me, the biggest piece of information is not the rating score but how many ratings there are. Because people tend not to rate much, and mostly within their bubble of favorite topics. So a score of 4.9 for a puzzle game means nothing to an fps player. So puzzle players love the game. Meh. But 700 ratings in an environment where most games have 0 to 9 ratings means a lot, even if the score is 3.4. And once you have those 700 ratings, the few dishonest ratings will not skew the result much.
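The skew claim is easy to check with the numbers used above (700 ratings, 3.4 average, both hypothetical). Even a coordinated batch of ten fake 1-star ratings barely dents the average:

```python
# Hypothetical figures from the argument: 700 honest ratings averaging 3.4
honest_count, honest_avg = 700, 3.4
fake_count, fake_value = 10, 1  # ten dishonest 1-star ratings

# Weighted average after the fake ratings are mixed in
new_avg = (honest_count * honest_avg + fake_count * fake_value) / (honest_count + fake_count)
print(round(new_avg, 2))  # 3.37 -- a drop of only about 0.03 stars
```

With 0 to 9 ratings, by contrast, a single dishonest vote can move the score by a full star or more, which is why the rating count matters more than the score itself.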