Does the OpenAI API support mean you can use custom endpoints/proxies? If so, that would allow local LLMs to be used.
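(For anyone wondering what that would look like in practice: the OpenAI chat-completions format is just HTTP + JSON, so all a client needs is a configurable base URL. A rough sketch in Python, assuming a local backend at localhost:5000, which is text-generation-webui's usual OpenAI-compatible port; adjust for your own setup.)

    import requests

    # Hypothetical local endpoint; swap in whatever backend you actually run.
    BASE_URL = "http://localhost:5000/v1"

    resp = requests.post(
        f"{BASE_URL}/chat/completions",
        json={
            "model": "local-model",  # many local backends ignore or loosely match this
            "messages": [{"role": "user", "content": "Hello!"}],
            "max_tokens": 128,
        },
        timeout=60,
    )
    resp.raise_for_status()
    print(resp.json()["choices"][0]["message"]["content"])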
Sure, if you aren't tech savvy you can always just pay for the ease of use, but if you don't mind figuring things out to do it for free, then you should still be allowed to do that.
(Btw, you don't need a particularly beefy computer to run LLMs, especially with llama.cpp; it can run on your phone if you have 6-8 GB of memory.)
You do realize they don't need to force you to use OpenAI? They could just open up the API so that you could use locally run LLMs, which could keep the game alive long after the devs stop supporting it.
OpenAI's API is pretty common nowadays, and many popular local LLM UIs support it, like text-generation-webui. KoboldCpp, which is probably the easiest to use, has its own API, but that should still be fairly easy to implement. (The devs could easily look at something like SillyTavern and its code for how it implements these things.)
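(To illustrate why bridging it should be easy, here's a rough sketch of KoboldCpp's KoboldAI-style generate endpoint from memory; the port, field names, and response shape are assumptions, so check its docs before relying on them.)

    import requests

    # KoboldCpp's native endpoint; 5001 is its usual default port (assumption).
    resp = requests.post(
        "http://localhost:5001/api/v1/generate",
        json={
            "prompt": "You are a helpful NPC. Player: Hello!\nNPC:",
            "max_length": 128,   # number of tokens to generate
            "temperature": 0.7,
        },
        timeout=60,
    )
    resp.raise_for_status()
    print(resp.json()["results"][0]["text"])

Wrapping that in an OpenAI-shaped endpoint is mostly just renaming fields, which is what SillyTavern and similar frontends already do.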
If you ever plan on supporting local models, you might want to use the OpenAI API with the ability to change the proxy server, so it can work with many different open-source LLM backends like Oobabooga's text-generation-webui etc. (KoboldCpp has its own API, but Ollama has partial OpenAI API support.)
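(Something like this sketch is all it would take on the game's side: the endpoint becomes a setting instead of a hardcoded URL. The environment variable names here are made up for illustration, and http://localhost:11434/v1 is Ollama's OpenAI-compatible endpoint.)

    import os
    from openai import OpenAI

    # Read the endpoint from a user-editable setting (hypothetical variable names)
    # instead of hardcoding api.openai.com; e.g. LLM_BASE_URL=http://localhost:11434/v1
    # points the exact same client code at a local Ollama server.
    client = OpenAI(
        base_url=os.environ.get("LLM_BASE_URL", "https://api.openai.com/v1"),
        api_key=os.environ.get("LLM_API_KEY", "not-needed-for-local"),
    )

    reply = client.chat.completions.create(
        model=os.environ.get("LLM_MODEL", "gpt-4o-mini"),
        messages=[{"role": "user", "content": "Say hi in five words."}],
    )
    print(reply.choices[0].message.content)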