
I used `all-MiniLM-L6-v2` embeddings for the top 10k most frequent words, then reduced the dimensionality to 2d hyperbolic space, using this doc as a reference: https://umap-learn.readthedocs.io/en/latest/embedding_space.html

Then, as shown in the doc, I mapped the result into a Poincaré disk.
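The mapping itself is short. Per the UMAP doc linked above, you fit with `umap.UMAP(output_metric="hyperboloid")`, lift the resulting 2d coordinates onto the hyperboloid's `z` axis, and divide through to land on the disk. Here is a minimal sketch of the projection, with random points standing in for the fitted `mapper.embedding_`:

```python
import numpy as np

# Stand-in for mapper.embedding_ from
# umap.UMAP(output_metric="hyperboloid").fit(embeddings);
# random points are used here so the snippet is self-contained.
xy = np.random.default_rng(0).normal(size=(200, 2))
x, y = xy[:, 0], xy[:, 1]

# Lift to the hyperboloid's z coordinate.
z = np.sqrt(1 + x**2 + y**2)

# Project onto the Poincaré disk.
disk_x = x / (1 + z)
disk_y = y / (1 + z)

# Every point lands strictly inside the unit disk.
print(np.all(disk_x**2 + disk_y**2 < 1))  # True
```

`disk_x` and `disk_y` are what get plotted, typically with a unit circle drawn as the disk boundary.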

In other words, I don't have an intuitive explanation of how the logic works, lol.

I put a lot of effort into manually tuning the UMAP hyperparameters and qualitatively assessing whether the result made sense to me. This version is the best I got after ~8 hours of fiddling with it.