Hi! Love the concept. Do you have any plans to support local or other AI models, à la ChatGPT/Claude keys, alongside the demo's OpenAI support? Keeping an eye on this, and will probably snag it soonish!
If you ever plan on supporting local models, you might want to use the OpenAI API with a configurable server/proxy URL, so it can be used with many different open-source LLM backends like Oobabooga's text-generation-webui. (KoboldCpp had its own API, but Ollama has partial OpenAI API support.)
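To illustrate what I mean: a rough sketch of how "OpenAI API with a swappable base URL" works in practice. The helper name is hypothetical, and the localhost port assumes Ollama's default (11434); the point is just that the same request shape works against any OpenAI-compatible backend, with only the URL changing.

```python
import json

# Hypothetical helper: builds an OpenAI-style chat completion request.
# Any backend speaking the OpenAI API (text-generation-webui, Ollama's
# compatibility layer, etc.) accepts this same shape -- only the base
# URL needs to change.
def build_chat_request(base_url, model, prompt):
    return {
        "url": f"{base_url.rstrip('/')}/v1/chat/completions",
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

# Same client code, two different backends:
cloud_req = build_chat_request("https://api.openai.com", "gpt-4o-mini", "Hi")
local_req = build_chat_request("http://localhost:11434", "llama3", "Hi")
print(local_req["url"])  # http://localhost:11434/v1/chat/completions
```

So exposing a single "API base URL" setting next to the API key field would cover most of the open-source ecosystem at once.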