Conversation

GeLi2001
Contributor

@GeLi2001 GeLi2001 commented Oct 2, 2025

resolves #2231


Note

Adds a new JS package that instruments @anthropic-ai/sdk messages.create() with OpenTelemetry/OpenInference, including streaming, tool use, token usage, examples, and tests.

  • New package: js/packages/openinference-instrumentation-anthropic
    • Core instrumentation: src/instrumentation.ts patches @anthropic-ai/sdk messages.create() to emit OpenInference-compliant LLM spans, capturing model, inputs/outputs, tools, and usage; supports both promise and streaming responses.
    • Exports: src/index.ts re-exports instrumentation class.
    • Config/Build: package.json, tsconfig*.json, vitest.config.ts set up build, types, and tests.
    • Examples: examples/basic-usage.ts, examples/streaming.ts, examples/tool-use.ts showing setup, streaming, and tool-calling flows.
    • Tests: test/instrumentation.test.ts covers enable/disable, custom tracer provider, and trace config.
    • Docs: README.md with install/quickstart/config and CHANGELOG.md for initial release.

Written by Cursor Bugbot for commit da65058.
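
For orientation, a minimal setup of the new package might look like the following sketch. The AnthropicInstrumentation class name and the OpenTelemetry provider wiring are assumptions based on the package description above; manuallyInstrument mirrors the example code quoted later in this thread.

import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { AnthropicInstrumentation } from "@arizeai/openinference-instrumentation-anthropic";
import * as Anthropic from "@anthropic-ai/sdk";

// Register a tracer provider so the instrumentation has somewhere to record spans.
const provider = new NodeTracerProvider();
provider.register();

// Enable the Anthropic instrumentation.
const instrumentation = new AnthropicInstrumentation();
registerInstrumentations({ instrumentations: [instrumentation] });

// If automatic module patching does not apply (e.g. under ESM), patch the
// imported module directly, as the PR examples do.
instrumentation.manuallyInstrument(Anthropic.default || Anthropic);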

@GeLi2001 GeLi2001 requested a review from a team as a code owner October 2, 2025 05:04
@dosubot dosubot bot added the size:XL This PR changes 500-999 lines, ignoring generated files. label Oct 2, 2025

pkg-pr-new bot commented Oct 2, 2025

@arizeai/openinference-core

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-core@2261

@arizeai/openinference-instrumentation-anthropic

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-anthropic@2261

@arizeai/openinference-instrumentation-bedrock

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-bedrock@2261

@arizeai/openinference-instrumentation-bedrock-agent-runtime

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-bedrock-agent-runtime@2261

@arizeai/openinference-instrumentation-beeai

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-beeai@2261

@arizeai/openinference-instrumentation-langchain

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-langchain@2261

@arizeai/openinference-instrumentation-mcp

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-mcp@2261

@arizeai/openinference-instrumentation-openai

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-instrumentation-openai@2261

@arizeai/openinference-mastra

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-mastra@2261

@arizeai/openinference-semantic-conventions

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-semantic-conventions@2261

@arizeai/openinference-vercel

npm i https://pkg.pr.new/Arize-ai/openinference/@arizeai/openinference-vercel@2261

commit: da65058

@dosubot dosubot bot added size:XXL This PR changes 1000+ lines, ignoring generated files. and removed size:XL This PR changes 500-999 lines, ignoring generated files. labels Oct 2, 2025
@GeLi2001 GeLi2001 marked this pull request as draft October 2, 2025 05:07
@GeLi2001 GeLi2001 marked this pull request as ready for review October 7, 2025 00:04
Contributor

@cephalization cephalization left a comment

As discussed, we will follow up with a fix for token counts and prices on streaming responses, and then do a changeset.
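
For reference, one way such a follow-up could accumulate usage from an Anthropic stream is sketched below. This is not the merged implementation: the helper name and signature are made up, and it assumes the SDK's message_start and message_delta events carry the input and output token counts, which is how the Anthropic streaming API reports usage.

import { SemanticConventions } from "@arizeai/openinference-semantic-conventions";
import type { Span } from "@opentelemetry/api";

// Consume a message stream and record token counts on the LLM span.
async function recordStreamingUsage(
  span: Span,
  events: AsyncIterable<{ type: string } & Record<string, unknown>>,
): Promise<void> {
  let inputTokens = 0;
  let outputTokens = 0;
  for await (const event of events) {
    if (event.type === "message_start") {
      // message_start carries the prompt-side usage.
      const message = event.message as { usage?: { input_tokens?: number } };
      inputTokens = message?.usage?.input_tokens ?? 0;
    } else if (event.type === "message_delta") {
      // message_delta carries the cumulative completion-side usage.
      const usage = event.usage as { output_tokens?: number } | undefined;
      outputTokens = usage?.output_tokens ?? outputTokens;
    }
  }
  span.setAttributes({
    [SemanticConventions.LLM_TOKEN_COUNT_PROMPT]: inputTokens,
    [SemanticConventions.LLM_TOKEN_COUNT_COMPLETION]: outputTokens,
    [SemanticConventions.LLM_TOKEN_COUNT_TOTAL]: inputTokens + outputTokens,
  });
}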

anthropicInstrumentation.manuallyInstrument(Anthropic.default || Anthropic);

async function main() {
  const anthropic = new Anthropic.default({

Bug: Client Instantiation Fails Without Fallback

The Anthropic client is instantiated directly with Anthropic.default. If Anthropic.default is undefined, this throws at runtime, because the client instantiation omits the Anthropic.default || Anthropic fallback that the instrumentation call above uses.
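
A possible fix, mirroring the fallback already used on the instrumentation line above (the apiKey wiring is illustrative):

// Use the same ESM/CJS interop fallback for the client constructor.
const AnthropicClient = (Anthropic as any).default ?? Anthropic;
const anthropic = new AnthropicClient({
  apiKey: process.env.ANTHROPIC_API_KEY, // assumed for the example
});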

attributes[
  `${toolCallIndexPrefix}${SemanticConventions.TOOL_CALL_FUNCTION_ARGUMENTS_JSON}`
] = JSON.stringify(part.input);
} else if (part.type === "tool_result") {

Bug: Tool Call Indexing Error

The getAnthropicInputMessageAttributes function indexes tool-call attributes incorrectly: it reuses the message.content array index for MESSAGE_TOOL_CALLS instead of a dedicated, sequential tool-call index, producing non-contiguous, malformed OpenInference attributes.
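
A sketch of the suggested fix: track tool_use parts with their own sequential counter so the tool-call attributes stay contiguous (tool_calls.0, tool_calls.1, ...). The helper name and signature are illustrative, not the package's actual function; the attribute constants come from @arizeai/openinference-semantic-conventions.

import { SemanticConventions } from "@arizeai/openinference-semantic-conventions";
import type { Attributes } from "@opentelemetry/api";

function addToolCallAttributes(
  attributes: Attributes,
  messagePrefix: string,
  content: Array<{ type: string; name?: string; input?: unknown }>,
): void {
  let toolCallIndex = 0; // dedicated counter, independent of the content index
  for (const part of content) {
    if (part.type !== "tool_use") continue;
    const toolCallIndexPrefix = `${messagePrefix}${SemanticConventions.MESSAGE_TOOL_CALLS}.${toolCallIndex}.`;
    attributes[
      `${toolCallIndexPrefix}${SemanticConventions.TOOL_CALL_FUNCTION_NAME}`
    ] = part.name;
    attributes[
      `${toolCallIndexPrefix}${SemanticConventions.TOOL_CALL_FUNCTION_ARGUMENTS_JSON}`
    ] = JSON.stringify(part.input);
    toolCallIndex += 1;
  }
}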

@GeLi2001 GeLi2001 merged commit 6887913 into main Oct 8, 2025
11 of 12 checks passed
@GeLi2001 GeLi2001 deleted the donny/anthropic-js branch October 8, 2025 00:21

Successfully merging this pull request may close these issues.

🗺️ [instrumentation][js] @arizeai/openinference-instrumentation-anthropic