
Calling DeepSeek's R1 model returns an empty message #16

@Mark531

Description

Calling DeepSeek's R1 model with this prompt returns an empty message. Since the same prompt works perfectly when I use DeepSeek's own chat, it makes me think the bug is on OpenRouter's side.

from langchain_openai import ChatOpenAI

free_key = "<KEY FOR FREE R1 MODEL>"

prompt = """
Travelers in different European cities are looking for a destination city in which to meet, knowing that their choice of city will minimize the carbon footprint of all their travel.

They can travel by train (up to 4 hours) or by plane.

The destination city can be anywhere in Europe, and must have a population of at least 10,000.

Here are the departure cities:
Lyon
Zurich
Venice

Which 10 European destinations would you recommend for their reunion, with a description of the journey and the carbon footprint (in kg of CO2) of each trip?
"""

model = ChatOpenAI(
  base_url="https://openrouter.ai/api/v1",
  model="deepseek/deepseek-r1:free",
  api_key=free_key,
  max_tokens=32000,
  temperature=0
)

result = model.invoke(prompt)
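
A quick way to see where the tokens went is to inspect the returned message (a minimal sketch; these are standard langchain_core AIMessage fields, not anything OpenRouter-specific):

print(repr(result.content))        # the visible completion text (empty in this report)
print(result.additional_kwargs)    # R1's reasoning may land here, depending on how the provider returns it
print(result.response_metadata)    # finish_reason, token usage, model name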

Is this a consequence of the free model's limitations, or is it a bug?
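
For comparison, the same request can be sent directly with the openai SDK, bypassing LangChain (a sketch that reuses the key, prompt, and model id from the snippet above), to check whether the empty content comes from OpenRouter itself or from the wrapper:

from openai import OpenAI

client = OpenAI(base_url="https://openrouter.ai/api/v1", api_key=free_key)
resp = client.chat.completions.create(
    model="deepseek/deepseek-r1:free",
    messages=[{"role": "user", "content": prompt}],
    max_tokens=32000,
    temperature=0,
)
print(resp.choices[0].finish_reason)          # e.g. "length" would point to a token-limit issue
print(repr(resp.choices[0].message.content))  # empty here would point at the provider, not LangChain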
