[Bug] UnsupportedParamsError with COPRO Optimizer and Amazon Bedrock #8058

Open
Nasreddine opened this issue Apr 11, 2025 · 3 comments
Labels: bug (Something isn't working)

Nasreddine commented Apr 11, 2025

What happened?

I'm encountering an UnsupportedParamsError when using the COPRO optimizer with Amazon Bedrock models:

  • mistral.mistral-small-2402-v1:0
  • us.meta.llama3-2-3b-instruct-v1:0

The error is caused by the unsupported parameter {'n': 9}.

At the beginning, I get this warning:

2025/04/11 12:02:47 WARNING dspy.adapters.json_adapter: Failed to use structured output format. Falling back to JSON mode. Error: litellm.UnsupportedParamsError: bedrock does not support parameters: {'n': 9}, for model=mistral.mistral-small-2402-v1:0. To drop these, set litellm.drop_params=True or for proxy:
litellm_settings: drop_params: true

Then I get the following exception:

File ~/dev/work/playground/.venv/lib/python3.10/site-packages/dspy/adapters/json_adapter.py:69, in JSONAdapter.__call__(self, lm, lm_kwargs, signature, demos, inputs)
     67         return super().__call__(lm, lm_kwargs, signature, demos, inputs)
     68     except Exception as e:
---> 69         raise RuntimeError(
     70             "Both structured output format and JSON mode failed. Please choose a model that supports "
     71             f"response_format argument. Original error: {e}"
     72         ) from e

RuntimeError: Both structured output format and JSON mode failed. Please choose a model that supports response_format argument. Original error: litellm.UnsupportedParamsError: bedrock does not support parameters: {'n': 9}, for model=mistral.mistral-small-2402-v1:0. To drop these, set litellm.drop_params=True or for proxy:

litellm_settings: drop_params: true

Steps to reproduce

Configure and compile the COPRO optimizer with a Bedrock model.

import dspy
from dspy.teleprompt import COPRO

llm = dspy.LM(model="mistral.mistral-small-2402-v1:0")
dspy.settings.configure(lm=llm)

teleprompter = COPRO(
    metric=my_metric,  # custom evaluation metric
    verbose=True,
)

kwargs = dict(display_progress=True, display_table=1)

compiled_prompt_opt = teleprompter.compile(my_pipeline, trainset=my_trainset, eval_kwargs=kwargs)

DSPy version

2.6.17

Nasreddine added the bug (Something isn't working) label on Apr 11, 2025
arnavsinghvi11 (Collaborator) commented:

Hey @Nasreddine, the error trace states you need to drop the n parameter to work with Bedrock models: "To drop these, set litellm.drop_params=True".
Feel free to reference the LiteLLM AWS Bedrock guide and the issues in the LiteLLM repo for anything Bedrock-related.
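
For what it's worth, a minimal sketch of that workaround, assuming LiteLLM's global drop_params setting (which applies to every LiteLLM call, including the ones DSPy issues internally):

import dspy
import litellm

# Ask LiteLLM to silently drop parameters the provider rejects,
# e.g. the n parameter that Bedrock does not support.
litellm.drop_params = True

llm = dspy.LM(model="mistral.mistral-small-2402-v1:0")
dspy.settings.configure(lm=llm)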

okhat (Collaborator) commented Apr 11, 2025

@Nasreddine Why use COPRO btw? Consider more modern optimizers like dspy.SIMBA or MIPRO.
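
For illustration, a minimal sketch of swapping one of those suggestions (MIPROv2) in for COPRO, reusing the placeholders (my_metric, my_pipeline, my_trainset) from the repro above; exact constructor arguments may vary across DSPy versions:

import dspy
from dspy.teleprompt import MIPROv2

# auto="light" selects a low-cost preset for the optimizer run.
teleprompter = MIPROv2(metric=my_metric, auto="light")

compiled_prompt_opt = teleprompter.compile(my_pipeline, trainset=my_trainset)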

Nasreddine (Author) commented:

@arnavsinghvi11 I believe LiteLLM is encapsulated within DSPy. Is there a way to pass parameters to it?

@okhat I’m currently testing all the optimizers to determine which one performs best for our task. I’ve already completed testing for MIPRO.
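
On the question of passing parameters through: dspy.LM forwards extra keyword arguments to litellm.completion, and LiteLLM also documents a per-call drop_params flag, so a sketch like the following may work (unverified against DSPy 2.6.17):

import dspy

# Extra kwargs on dspy.LM are forwarded to litellm.completion;
# drop_params=True asks LiteLLM to strip provider-unsupported params (here, n).
llm = dspy.LM(model="mistral.mistral-small-2402-v1:0", drop_params=True)
dspy.settings.configure(lm=llm)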
