[Feature] how to set the LLM model's supported parameters like top-p etc. #8046

Closed
wangweiguo-123 opened this issue Apr 4, 2025 · 3 comments
Labels
enhancement New feature or request

Comments

@wangweiguo-123

What feature would you like to see?

I see that dspy.LM can set temperature, but not top_p.

Would you like to contribute?

  • Yes, I'd like to help implement this.
  • No, I just want to request it.

Additional Context

No response

@wangweiguo-123 wangweiguo-123 added the enhancement New feature or request label Apr 4, 2025
@okhat
Collaborator

okhat commented Apr 4, 2025

Just pass it in the constructor.
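
For context, this works because dspy.LM collects extra keyword arguments and forwards them to the underlying provider call alongside temperature and max_tokens. A minimal illustrative sketch of that constructor pattern (LMSketch is a hypothetical stand-in, not dspy's actual class):

```python
# Illustrative sketch only -- shows the **kwargs-forwarding pattern that
# lets parameters like top_p pass through a constructor unchanged.
class LMSketch:
    def __init__(self, model, temperature=0.0, max_tokens=1000, **kwargs):
        self.model = model
        # Named parameters and any extra kwargs (e.g. top_p) are merged
        # into one dict that would be sent with every request.
        self.kwargs = dict(temperature=temperature, max_tokens=max_tokens, **kwargs)

lm = LMSketch("openai/gpt-4o-mini", max_tokens=250, top_p=0.1)
print(lm.kwargs["top_p"])  # → 0.1
```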

@TomeHirata
Collaborator

Can you try it like this?

import dspy
lm = dspy.LM(model='openai/gpt-4o-mini', max_tokens=250, top_p=0.1)
dspy.configure(lm=lm)

@TomeHirata
Collaborator

I'll close this issue as this should already be supported. Please reopen this if you still see problems.
