Description
What happened?
I'm encountering an UnsupportedParamsError when using the COPRO optimizer with Amazon Bedrock models:
- mistral.mistral-small-2402-v1:0
- us.meta.llama3-2-3b-instruct-v1:0
The error is raised because of the unsupported parameter {'n': 9}.
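For context, I believe the n=9 comes from COPRO itself: if I'm reading the optimizer right, it proposes candidate instructions with a single LM call that asks for breadth - 1 completions, and the default breadth is 10. A rough sketch of the kind of call involved (the signature string below is illustrative, not the actual COPRO internals):

```python
import dspy

# Illustrative only: with COPRO's default breadth=10, the optimizer asks for
# breadth - 1 = 9 completions in one call, which litellm forwards to Bedrock
# as the unsupported {'n': 9} parameter.
propose = dspy.Predict("basic_instruction -> proposed_instruction", n=9)
```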
At the beginning, I get this warning:
```
2025/04/11 12:02:47 WARNING dspy.adapters.json_adapter: Failed to use structured output format. Falling back to JSON mode. Error: litellm.UnsupportedParamsError: bedrock does not support parameters: {'n': 9}, for model=mistral.mistral-small-2402-v1:0. To drop these, set `litellm.drop_params=True` or for proxy: `litellm_settings: drop_params: true`
```
Then I get the following exception:
```
File ~/dev/work/playground/.venv/lib/python3.10/site-packages/dspy/adapters/json_adapter.py:69, in JSONAdapter.__call__(self, lm, lm_kwargs, signature, demos, inputs)
     67     return super().__call__(lm, lm_kwargs, signature, demos, inputs)
     68 except Exception as e:
---> 69     raise RuntimeError(
     70         "Both structured output format and JSON mode failed. Please choose a model that supports "
     71         f"`response_format` argument. Original error: {e}"
     72     ) from e

RuntimeError: Both structured output format and JSON mode failed. Please choose a model that supports `response_format` argument. Original error: litellm.UnsupportedParamsError: bedrock does not support parameters: {'n': 9}, for model=mistral.mistral-small-2402-v1:0. To drop these, set `litellm.drop_params=True` or for proxy: `litellm_settings: drop_params: true`
```
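As a stopgap, the flag the error message points at can be set globally before compiling. Note the likely side effect (my assumption, not verified): with `n` dropped, Bedrock returns a single completion where COPRO expected breadth - 1, which may weaken the optimization.

```python
import litellm

# Tell litellm to drop request parameters the provider does not support
# (here, `n`) instead of raising UnsupportedParamsError. Caveat (my
# assumption): Bedrock then returns one completion per call rather than
# the breadth - 1 completions COPRO asked for.
litellm.drop_params = True
```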
Steps to reproduce
Configure and compile the COPRO optimizer with a Bedrock model:
```python
import dspy
from dspy.teleprompt import COPRO

# Configure the Bedrock model as the default LM.
llm = dspy.LM(model="mistral.mistral-small-2402-v1:0")
dspy.settings.configure(lm=llm)

teleprompter = COPRO(
    metric=my_metric,
    verbose=True,
)
kwargs = dict(display_progress=True, display_table=1)
compiled_prompt_opt = teleprompter.compile(my_pipeline, trainset=my_trainset, eval_kwargs=kwargs)
```
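A possible mitigation I have not verified: shrink breadth so COPRO requests fewer completions per call. Whether litellm still forwards n=1 to Bedrock (and whether Bedrock rejects it) is unclear to me.

```python
# Unverified mitigation: reduce breadth (default 10) so COPRO requests
# n = breadth - 1 = 1 completion per call instead of 9. It is unclear to
# me whether litellm still sends n=1 to Bedrock, so this may not help.
teleprompter = COPRO(
    metric=my_metric,
    breadth=2,
    verbose=True,
)
```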
DSPy version
2.6.17