Thanks for the support, Kyowa! The open beta is kicking off with our current setup, but we may explore more LLM options in the future. We look forward to you snagging the game so the catgirl can snag you back! 💖🔪

If you ever plan on supporting local models, you might want to use the OpenAI API with a configurable proxy/base URL, so the game can talk to many different open-source LLM backends such as Oobabooga's text-generation-webui. (KoboldCpp has its own API, but Ollama has partial OpenAI API support.)
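
For what it's worth, here's a minimal sketch of what that could look like, assuming the official openai Python client (v1+) and a user-configurable base URL; the local endpoints and model name shown are typical defaults, not guaranteed for any particular install:

```python
# Sketch: point the OpenAI client at a user-configured OpenAI-compatible
# backend instead of api.openai.com. Endpoints below are common defaults.
from openai import OpenAI

# Example OpenAI-compatible base URLs (would be read from a settings file):
#   text-generation-webui: "http://127.0.0.1:5000/v1"
#   Ollama:                "http://127.0.0.1:11434/v1"
#   OpenAI itself:         leave base_url unset
BASE_URL = "http://127.0.0.1:11434/v1"
API_KEY = "not-needed-for-local"  # local servers usually ignore the key

client = OpenAI(base_url=BASE_URL, api_key=API_KEY)

response = client.chat.completions.create(
    model="llama3",  # whatever model the local backend actually serves
    messages=[
        {"role": "system", "content": "You are the catgirl."},
        {"role": "user", "content": "Hello?"},
    ],
)
print(response.choices[0].message.content)
```

Since most of these backends expose the same /v1/chat/completions shape, exposing just the base URL (and optionally the model name) in the game's settings would cover them without any backend-specific code.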