Configurable OpenAI Endpoint

Love what you can do. I'm happy to pay for the prompt engineering and flow behind this app based on two test notes. For a variety of reasons, though, I can't use it for work: I can't cycle client-confidential thoughts off of my laptop. I can, however, run local LLMs through tools like Ollama, which expose an OpenAI-conformant endpoint. (That's why I mark this as Important, not Nice To Have.)

What I'd LOVE to see, to use this every day, is an "Advanced Settings" type of feature where I can not only bring my own OpenAI key but also save the endpoint URL, so I can change it from the OpenAI URL to a localhost URL and use my own LLM.

Why hidden as an "Advanced" setting? I shouldn't expect a local 7B or 13B parameter model to perform the synthesis nearly as well as OpenAI's models. Keeping it tucked away helps maintain expectations: if you understand that point, you're the kind of person who would change the URL. If not, then changing the URL is likely to confuse.