deepseek-reasoner : APICallError with no tools params in request #925

cht527 opened this issue Apr 11, 2025 · 3 comments
cht527 commented Apr 11, 2025

DEPENDENCIES

"dependencies": {
    "@ai-sdk/deepseek": "^0.2.8",
    "@ai-sdk/openai": "^1.3.9",
    "@ai-sdk/react": "^1.2.8",
    "ai": "4.2.10"
}

MODEL CONFIGURATION

import { extractReasoningMiddleware, wrapLanguageModel } from 'ai';
import type { LanguageModelV1, LanguageModelV1Middleware } from 'ai';
import { createDeepSeek } from '@ai-sdk/deepseek';
import { createOpenAI } from '@ai-sdk/openai';

export enum LLM_TYPE {
  QWEN_PLUS = 'qwen-plus',
  DEEPSEEK_R1 = 'deepseek-reasoner',
}

export const customMiddleware: Record<LLM_TYPE, LanguageModelV1Middleware> = {
  [LLM_TYPE.DEEPSEEK_R1]: extractReasoningMiddleware({ tagName: 'think' }),
  [LLM_TYPE.QWEN_PLUS]: {},
};

export const customModel: Record<LLM_TYPE, LanguageModelV1> = {
  [LLM_TYPE.DEEPSEEK_R1]: createDeepSeek({
    apiKey: DEFINE_ENV.DEEPSEEK_API_KEY,
    baseURL: DEFINE_ENV.DEEPSEEK_BASE_URL,
  })(LLM_TYPE.DEEPSEEK_R1),
  [LLM_TYPE.QWEN_PLUS]: createOpenAI({
    apiKey: DEFINE_ENV.QIANWEN_DASHSCOPE_API_KEY,
    baseURL: DEFINE_ENV.QIANWEN_DASHSCOPE_API_URL,
  })(LLM_TYPE.QWEN_PLUS),
};

export const getLLM = (modelId = LLM_TYPE.QWEN_PLUS) =>
  wrapLanguageModel({
    model: customModel[modelId],
    middleware: customMiddleware[modelId],
  });
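To keep tool-related options away from models that reject them, the conditional in the streamText call can be factored into a small helper keyed off the model id. This is a minimal sketch; `toolOptionsFor` and `MODELS_WITHOUT_TOOLS` are hypothetical names, not part of the AI SDK:

```typescript
// Hypothetical helper (not part of the AI SDK): build the tool-related
// streamText options only for models that support function calling.
const MODELS_WITHOUT_TOOLS = new Set<string>(['deepseek-reasoner']);

interface ToolOptions {
  tools?: Record<string, unknown>;
  maxSteps: number;
}

export function toolOptionsFor(
  modelId: string,
  tools: Record<string, unknown>,
): ToolOptions {
  // Models without function calling get no `tools` key at all and one step.
  return MODELS_WITHOUT_TOOLS.has(modelId)
    ? { maxSteps: 1 }
    : { tools, maxSteps: 2 };
}
```

It would then be spread into the call, e.g. `streamText({ model: getLLM(modelId), ...toolOptionsFor(modelId, myTools), ... })`, so the reasoning model never sees a `tools` key.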

LOGIC in streamText function

maxSteps: modelId !== LLM_TYPE.DEEPSEEK_R1 ? 2 : 1,
tools: modelId !== LLM_TYPE.DEEPSEEK_R1
  ? {
      getInformation: tool({
        description: 'get information from your knowledge base to answer questions.',
        parameters: z.object({
          question: z.string().describe("the user's question"),
        }),
        execute: getInformationFromKnowledgeBase,
      }),
    }
  : undefined,
onFinish: async ({ response }) => {
  if (session.user && session.user.id) {
    try {
      const assistantId = getTrailingMessageId({
        messages: response.messages.filter(
          (message) => message.role === 'assistant',
        ),
      });

      if (!assistantId) {
        throw new Error('No assistant message found!');
      }

      const [, assistantMessage] = appendResponseMessages({
        messages: [userMessage],
        responseMessages: response.messages,
      });

      await saveMessages({
        messages: [
          {
            id: assistantId,
            chatId: id,
            role: assistantMessage.role,
            // content has the same data structure as parts
            content: assistantMessage.parts,
            createdAt: new Date(),
          },
        ],
      });
    } catch (error) {
      console.error('Failed to save chat', error);
    }
  }
},

experimental_telemetry: {
  isEnabled: true,
  functionId: 'stream-text',
},
});

result.consumeStream();

result.mergeIntoDataStream(dataStream, {
  sendReasoning: true,
});
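For context, `getTrailingMessageId` is a helper from my codebase; its body is roughly the following sketch (the name matches the snippet above, the implementation shown here is an assumption):

```typescript
// Assumed shape of the helper used above: return the id of the last
// message in the list, or null when the list is empty.
interface MessageWithId {
  id: string;
  role: string;
}

export function getTrailingMessageId({
  messages,
}: {
  messages: MessageWithId[];
}): string | null {
  return messages.at(-1)?.id ?? null;
}
```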

STEPS

  1. pnpm dev, run the project at localhost:3000
  2. Select the chat model (qwen-plus in my project), which uses tool calls during the conversation; the result is OK.
  3. Select the reasoning model (deepseek-reasoner in my project), which causes the error; details are shown below.

ERROR LOG

ERROR: createDataStreamResponse APICallError [AI_APICallError]: deepseek-reasoner does not support Function Calling.
    at eval (webpack-internal:///(rsc)/./node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/provider-utils/dist/index.mjs:708:14)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async postToApi (webpack-internal:///(rsc)/./node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/provider-utils/dist/index.mjs:608:28)
    at async OpenAICompatibleChatLanguageModel.doStream (webpack-internal:///(rsc)/./node_modules/.pnpm/@[email protected][email protected]/node_modules/@ai-sdk/openai-compatible/dist/index.mjs:491:50)
    at async wrapStream (webpack-internal:///(rsc)/./node_modules/.pnpm/[email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:6337:35)
    at async fn (webpack-internal:///(rsc)/./node_modules/.pnpm/[email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:5523:25)
    at async eval (webpack-internal:///(rsc)/./node_modules/.pnpm/[email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:544:22)
    at async _retryWithExponentialBackoff (webpack-internal:///(rsc)/./node_modules/.pnpm/[email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:345:12)
    at async streamStep (webpack-internal:///(rsc)/./node_modules/.pnpm/[email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:5478:15)
    at async fn (webpack-internal:///(rsc)/./node_modules/.pnpm/[email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:5848:9)
    at async eval (webpack-internal:///(rsc)/./node_modules/.pnpm/[email protected][email protected][email protected]/node_modules/ai/dist/index.mjs:544:22) {
  cause: undefined,
  url: 'https://api.deepseek.com/chat/completions',
  requestBodyValues: {
    model: 'deepseek-reasoner',
    user: undefined,
    max_tokens: undefined,
    temperature: 0,
    top_p: undefined,
    frequency_penalty: undefined,
    presence_penalty: undefined,
    response_format: undefined,
    stop: undefined,
    seed: undefined,
    messages: [
      [Object], [Object],
      [Object], [Object],
      [Object], [Object],
      [Object], [Object],
      [Object], [Object],
      [Object], [Object],
      [Object], [Object],
      [Object], [Object],
      [Object]
    ],
    tools: undefined,
    tool_choice: undefined,
    stream: true
  },
  statusCode: 400,
  responseHeaders: {
    'access-control-allow-credentials': 'true',
    'cf-cache-status': 'DYNAMIC',
    'cf-ray': '92e8167699c5f080-DFW',
    connection: 'keep-alive',
    'content-length': '151',
    'content-type': 'application/json',
    date: 'Fri, 11 Apr 2025 05:29:42 GMT',
    server: 'cloudflare',
    'set-cookie': 'COOKIE VALUE; path=/; expires=Fri, 11-Apr-25 05:59:42 GMT; domain=.deepseek.com; HttpOnly; Secure; SameSite=None',
    'strict-transport-security': 'max-age=31536000; includeSubDomains; preload',
    vary: 'origin, access-control-request-method, access-control-request-headers',
    'x-content-type-options': 'nosniff',
    'x-ds-trace-id': 'dfcbe5aadf4e8be6c55c590e50b20c60'
  },
  responseBody: '{"error":{"message":"deepseek-reasoner does not support Function Calling.","type":"invalid_request_error","param":null,"code":"invalid_request_error"}}',
  isRetryable: false,
  data: {
    error: {
      message: 'deepseek-reasoner does not support Function Calling.',
      type: 'invalid_request_error',
      param: null,
      code: 'invalid_request_error'
    }
  },
  [Symbol(vercel.ai.error)]: true,
  [Symbol(vercel.ai.error.AI_APICallError)]: true
}

message params in ERROR LOG

[screenshot of the request messages, with the suspicious message marked]

Since the tools param is undefined when the model is deepseek-reasoner (LLM_TYPE.DEEPSEEK_R1), I suspect the messages cause this error; I marked the suspicious message in the picture. I compared the same operation on https://chat.vercel.ai/, where no message contains type: "step-start".
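If the stray step-start parts in the stored history are indeed what trips the request, one workaround is to drop them before re-sending the history to a model without tool support. A sketch, under the assumption that messages carry a parts array as in the screenshot; `stripStepStartParts` is a hypothetical name:

```typescript
// Hypothetical sanitizer: remove 'step-start' parts from each message
// before sending the history to a model that rejects function calling.
interface MessagePart {
  type: string;
  [key: string]: unknown;
}

interface HistoryMessage {
  role: string;
  content?: string;
  parts?: MessagePart[];
}

export function stripStepStartParts(
  messages: HistoryMessage[],
): HistoryMessage[] {
  return messages.map((message) => ({
    ...message,
    parts: message.parts?.filter((part) => part.type !== 'step-start'),
  }));
}
```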


ailunc commented Apr 14, 2025

Hi! @cht527
Here are several approaches you can try:

1. Disable Function Calling for DeepSeek completely:
   Since deepseek-reasoner does not support function calling, ensure that no function-calling-related parameters are sent when it is selected. This may involve:
   • Bypassing middleware: either conditionally disable extractReasoningMiddleware when using DEEPSEEK_R1, or adapt it so that it does not insert any extra instructions (such as a "step-start" marker) that imply function calls.
   • Modifying the data stream merge: avoid merging the data stream with sendReasoning: true for DeepSeek. For example, pass the parameter conditionally:

result.mergeIntoDataStream(dataStream, {
  sendReasoning: modelId !== LLM_TYPE.DEEPSEEK_R1,
});

This way, no reasoning-related function call data is appended when using deepseek.


cht527 commented Apr 15, 2025

> Here are several approaches you can try: 1. Disable Function Calling for Deepseek Completely ... Avoid merging the data stream with sendReasoning: true for deepseek.

@ailunc Hi, this is not a valid way to solve the problem; it simply stops receiving the reasoning content. The 'step-start' already happens before I switch the model to DEEPSEEK_R1.
