Feature Request: Local LLM connection #2033
KG-santa
started this conversation in Feature Requests
Replies: 0 comments
Describe the feature
With the introduction of the ability to connect OpenAI, it would be nice to also support connecting to locally run LLMs, e.g. an Ollama instance.
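One possible approach (an assumption, not a confirmed design): Ollama exposes an OpenAI-compatible API on its default local port 11434 under the `/v1` path, so the existing OpenAI integration could potentially be reused by making the base URL configurable. A minimal sketch of what such a request would look like, with the helper name and model name chosen for illustration:

```python
import json

# Ollama's default local endpoint; the "/v1" path serves an
# OpenAI-compatible API, so an OpenAI client could point here instead.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> tuple[str, str]:
    """Build the URL and JSON body for an OpenAI-style chat completion
    against a local Ollama server (hypothetical helper for illustration)."""
    url = f"{OLLAMA_BASE_URL}/chat/completions"
    body = json.dumps({
        "model": model,  # e.g. a locally pulled model such as "llama3"
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

url, body = build_chat_request("llama3", "Hello")
print(url)
```

Sending this request (e.g. with `requests.post(url, data=body)`) would only work when an Ollama server is running locally; the sketch just shows that the request shape matches the existing OpenAI path.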
Why is this feature important?
More privacy over user data for those who are already hosting a local LLM.
Additional context, screenshots, and relevant links
I have seen it mentioned in an API feature request thread, but didn't find a dedicated one.