
OpenCode sends tools (and a Jinja tool template) to llama.cpp, which results in a 500 error unless --jinja is set; with --jinja, the template crashes (reject filter) #1890


Description

@brianluby

Environment

  • Client: OpenCode via Homebrew on macOS — v0.4.41 (also seen with 0.4.40)
  • Server: llama.cpp OpenAI‑compatible server (Windows)
  • Model: Qwen3‑30B A3B Coder (GGUF), served via llama.cpp; alias qwen3-30b
  • Endpoint: http://{server-ip}:8080/v1
  • Auth header: Authorization: Bearer test-api-key

Summary
OpenCode includes a tools parameter and injects a Jinja <tools> prompt block even in “plain chat”.

  • If llama.cpp runs without --jinja, it rejects the request with: tools param requires --jinja flag.
  • If llama.cpp runs with --jinja, it attempts to render OpenCode’s Jinja tools block and fails with a 500 due to missing filters (e.g., reject), producing a long “Value is not callable” stack trace.

Reproduction (no --jinja)

  1. Start llama.cpp:

    llama-server.exe -m C:\models\Qwen3-30B-A3B-Coder-480B-Distill-v2-Q8_0.gguf --alias qwen3-30b --host 0.0.0.0 --port 8080 --api-key test-api-key
  2. Point the OpenCode provider at http://{server-ip}:8080/v1 and set the agent to “plain chat” (no tools in config).

  3. Send “hi” in OpenCode.

Actual: Server 500 + log: tools param requires --jinja flag.
Expected: If no tools are configured, OpenCode should omit the tools key entirely.

Reproduction (with --jinja)

  1. Start llama.cpp:

    llama-server.exe -m C:\models\Qwen3-30B-A3B-Coder-480B-Distill-v2-Q8_0.gguf --alias qwen3-30b --host 0.0.0.0 --port 8080 --api-key test-api-key --jinja
  2. Same OpenCode config; send “hi”.

Actual: llama.cpp 500 with Jinja crash while rendering tool template. Excerpt:

Value is not callable: null at row 58, column 110:
{%- for json_key in param_fields.keys() | reject("in", handled_keys) %}
...
{{- "<tools>" }} {%- for tool in tools %} ...

Expected: either don’t send Jinja in the system prompt, or send an OpenAI‑native tool schema without relying on server‑side Jinja filters.

Sanity checks (curl)

  • With --jinja, server accepts tools: []:

    curl http://{server-ip}:8080/v1/chat/completions \
      -H "Content-Type: application/json" -H "Authorization: Bearer test-api-key" \
      -d '{"model":"qwen3-30b","messages":[{"role":"user","content":"Say hi"}],"tools":[]}'

Returns completion.

  • Without --jinja, server accepts when no tools key is present:

    curl http://{server-ip}:8080/v1/chat/completions \
      -H "Content-Type: application/json" -H "Authorization: Bearer test-api-key" \
      -d '{"model":"qwen3-30b","messages":[{"role":"user","content":"Say hi"}]}'

Returns completion.
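
  • For comparison, the failing combination (no --jinja, non‑empty tools) can be reproduced outside OpenCode with any HTTP client. Below is a hypothetical Python sketch — the tool definition is a placeholder, not OpenCode’s actual payload; host and API key follow the setup above:

    import json
    import urllib.error
    import urllib.request

    SERVER = "http://SERVER_IP:8080"  # substitute the llama.cpp host from the setup above
    payload = {
        "model": "qwen3-30b",
        "messages": [{"role": "user", "content": "Say hi"}],
        # Placeholder tool, only present to trigger llama.cpp's server-side check
        "tools": [{
            "type": "function",
            "function": {"name": "example_tool",
                         "parameters": {"type": "object", "properties": {}}},
        }],
    }
    req = urllib.request.Request(
        SERVER + "/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer test-api-key"},
    )
    try:
        print(urllib.request.urlopen(req).read().decode())
    except urllib.error.HTTPError as e:
        print(e.code, e.read().decode())

Expected result: 500 with tools param requires --jinja flag, matching the first reproduction above.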

Why this seems client‑side
Even with a “plain chat” agent and no tools in my opencode.json, OpenCode still sends either:

  • a tools field in the JSON payload, or
  • a Jinja <tools> block in the system prompt.

This causes failures on llama.cpp depending on --jinja. Changelog 0.4.40 mentioned “disable todo tools for qwen models”, but the failures persist (they’re triggered by the prompt template/tool scaffolding, not the model).

Requested improvements

  1. Conditional tools emission: if an agent has no tools configured or the provider is a generic OpenAI‑compatible server (e.g., llama.cpp), omit tools and tool_choice entirely. Consider automatically retrying without tools on 4xx/5xx responses that indicate unsupported tools (a rough sketch of this fallback follows the list).
  2. Config switch: A per‑agent flag like "sendTools": false that guarantees no tools are sent.
  3. Jinja‑free prompting mode: Provide a “raw OpenAI” prompt path (no {% ... %} in system prompt).
  4. Qwen defaults: Extend 0.4.40’s Qwen compatibility so all tool scaffolding is disabled by default for Qwen on OpenAI‑compatible providers, unless explicitly enabled.
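
A rough, hypothetical sketch of the fallback from item 1 (this is not OpenCode’s actual code, just the requested behaviour): send the request as configured, and if the server’s error indicates unsupported tool handling, retry once with tools/tool_choice stripped.

    import json
    import urllib.error
    import urllib.request

    def chat_completion(url: str, payload: dict, headers: dict) -> dict:
        # Hypothetical helper illustrating the requested retry-without-tools fallback.
        def post(body: dict) -> dict:
            req = urllib.request.Request(url, data=json.dumps(body).encode(), headers=headers)
            with urllib.request.urlopen(req) as resp:
                return json.load(resp)

        try:
            return post(payload)
        except urllib.error.HTTPError as err:
            detail = err.read().decode(errors="replace")
            # e.g. llama.cpp without --jinja answers "tools param requires --jinja flag"
            if ("tools" in payload or "tool_choice" in payload) and "tools" in detail:
                stripped = {k: v for k, v in payload.items()
                            if k not in ("tools", "tool_choice")}
                return post(stripped)
            raise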

Workarounds that fixed it for me

  • Start llama.cpp with --jinja and supply a minimal chat template that ignores tools, or
  • Run OpenCode through a tiny local proxy that strips tools/tool_choice from requests (works flawlessly with llama.cpp without --jinja).
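
For reference, a minimal sketch of the second workaround (assumptions: Python 3 standard library only, llama-server reachable at UPSTREAM, OpenCode pointed at http://127.0.0.1:9090/v1; streaming/SSE passthrough and robust error handling are omitted). This shows the idea, not the exact script listed under Artifacts:

    import json
    import urllib.error
    import urllib.request
    from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

    UPSTREAM = "http://SERVER_IP:8080"  # substitute the llama.cpp host

    class StripToolsProxy(BaseHTTPRequestHandler):
        def do_POST(self):
            length = int(self.headers.get("Content-Length", 0))
            body = self.rfile.read(length)
            try:
                payload = json.loads(body)
                payload.pop("tools", None)        # drop the fields llama.cpp rejects
                payload.pop("tool_choice", None)  # when started without --jinja
                body = json.dumps(payload).encode()
            except json.JSONDecodeError:
                pass  # forward non-JSON bodies untouched
            req = urllib.request.Request(
                UPSTREAM + self.path,
                data=body,
                headers={"Content-Type": "application/json",
                         "Authorization": self.headers.get("Authorization", "")},
            )
            try:
                with urllib.request.urlopen(req) as resp:
                    status, data = resp.status, resp.read()
            except urllib.error.HTTPError as err:
                status, data = err.code, err.read()
            self.send_response(status)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(data)))
            self.end_headers()
            self.wfile.write(data)

    if __name__ == "__main__":
        ThreadingHTTPServer(("127.0.0.1", 9090), StripToolsProxy).serve_forever()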

Artifacts (can provide on request)

  • Server logs with 500 traces (redacted)
  • opencode.json (redacted)
  • Minimal proxy script that strips tools and demonstrates successful operation

Thanks! Happy to test a dev build or provide more logs.

