Fix: Correct provider name in config from openai to lm_studio #1570
Conversation
PR Description

This PR fixes a typo in the config.lm_studio.yaml file where the provider was incorrectly set to openai instead of lm_studio. This caused local LLM services (such as LM Studio) to be unrecognized and unusable.

Changes

- Corrected the provider field from openai to lm_studio in the config file
- Ensured compatibility with local LLM services
Walkthrough

The configuration file for the language model service was updated to change the model provider prefix from "openai/" to "lm_studio/" for both the language model and embedding model entries. The corresponding comments were also revised to reflect the new prefix. No other configuration parameters, endpoints, or settings were changed, and no code or exported entities were altered.
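The edit described in the walkthrough can be sketched as a before/after on a hypothetical model entry. The surrounding keys and the model name below are illustrative assumptions, not the actual contents of config.lm_studio.yaml:

```yaml
# Illustrative sketch of the proposed change; keys and model names
# are assumptions, not the real file contents.
models:
  # before: provider prefix "openai/"
  # - model: openai/local-model
  # after (this PR's proposal): provider prefix "lm_studio/"
  - model: lm_studio/local-model
```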
@XavierMoon this is not a typo. The openai prefix means we use the OpenAI-compatible API for LM Studio. This is the convention required by litellm, which is used in wren-ai-service. What issues are you actually running into?
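For readers unfamiliar with the convention the maintainer describes: litellm interprets the openai/ model prefix as "route this request through an OpenAI-compatible API", so a local LM Studio server is addressed by keeping that prefix and pointing the base URL at the local endpoint. A minimal sketch, where the model name and endpoint are assumptions (LM Studio's bundled server typically listens on port 1234):

```yaml
# Illustrative only; keys and names are assumptions, not the real
# config.lm_studio.yaml contents.
models:
  - model: openai/local-model            # openai/ = OpenAI-compatible route in litellm
    api_base: http://localhost:1234/v1   # LM Studio's local OpenAI-compatible endpoint
    api_key: "not-needed"                # local servers usually ignore the key
```

Under this convention, switching the prefix to lm_studio/ would change which litellm provider handles the request, which is why the maintainer asks what concrete issue motivated the change.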