Anthropic Tracing


Anthropic is a leading provider of state-of-the-art LLMs. The Anthropic SDK can be instrumented using the openinference-instrumentation-anthropic package.

Install

pip install openinference-instrumentation-anthropic anthropic

Setup

Use the register function to connect your application to Phoenix:

from phoenix.otel import register

# configure the Phoenix tracer
tracer_provider = register(
  project_name="my-llm-app", # Default is 'default'
  auto_instrument=True # Auto-instrument your app based on installed OI dependencies
)
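
If you would rather not rely on auto_instrument, the instrumentor can also be applied explicitly. A minimal sketch using AnthropicInstrumentor from the openinference-instrumentation-anthropic package, equivalent in effect to the auto-instrumented setup above:

from openinference.instrumentation.anthropic import AnthropicInstrumentor
from phoenix.otel import register

tracer_provider = register(project_name="my-llm-app")

# Explicitly instrument the Anthropic SDK with the Phoenix tracer provider
AnthropicInstrumentor().instrument(tracer_provider=tracer_provider)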

Run Anthropic

A simple Anthropic application that is now instrumented:

import anthropic

client = anthropic.Anthropic()

message = client.messages.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=1000,
    temperature=0,
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "Why is the ocean salty?"
                }
            ]
        }
    ]
)
print(message.content)
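
Streaming requests go through the same instrumented client. A short sketch using the Anthropic SDK's streaming helper; it is assumed here that streamed responses are captured by the instrumentation in the same way:

with client.messages.stream(
    model="claude-3-5-sonnet-20240620",
    max_tokens=1000,
    messages=[{"role": "user", "content": "Why is the ocean salty?"}],
) as stream:
    # Print text deltas as they arrive
    for text in stream.text_stream:
        print(text, end="")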

Observe

Now that tracing is set up, all invocations of your Anthropic client will be streamed to your running Phoenix instance for observability and evaluation.
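
As a quick check, you can pull the captured spans back into a dataframe. A minimal sketch assuming a running instance at the default endpoint and the arize-phoenix client:

import phoenix as px

# Fetch captured spans as a pandas DataFrame for inspection or evals
spans = px.Client().get_spans_dataframe()
print(spans.head())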

Resources:

Example Messages
Example Tool Calling (see the sketch below)
OpenInference package: openinference-instrumentation-anthropic
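
For orientation, a minimal tool-calling sketch in the same style as the example above; the get_weather tool name and schema are illustrative only, not taken from the linked example:

message = client.messages.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=1000,
    tools=[
        {
            "name": "get_weather",  # hypothetical tool for illustration
            "description": "Get the current weather for a city.",
            "input_schema": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        }
    ],
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
)
print(message.content)  # may include a tool_use block with the model's arguments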

Connect to Phoenix

Phoenix can run as a managed cloud instance, a local server launched from the terminal, a Docker container, or directly inside a notebook. Use one of the options below.

Phoenix Cloud

Sign up for an Arize Phoenix account at https://app.phoenix.arize.com/login

Install packages:

pip install arize-phoenix-otel

Set your Phoenix endpoint and API Key:

import os

# Add Phoenix API Key for tracing
PHOENIX_API_KEY = "ADD YOUR API KEY"
os.environ["PHOENIX_CLIENT_HEADERS"] = f"api_key={PHOENIX_API_KEY}"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://app.phoenix.arize.com"

Your Phoenix API key can be found on the Keys section of your dashboard.

Terminal

Launch your local Phoenix instance:

pip install arize-phoenix
phoenix serve

For details on customizing a local terminal deployment, see Terminal Setup.

Install packages:

pip install arize-phoenix-otel

Set your Phoenix endpoint:

import os

os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "http://localhost:6006"

See Terminal for more details.

Docker

Pull the latest Phoenix image from Docker Hub:

docker pull arizephoenix/phoenix:latest

Run your containerized instance:

docker run -p 6006:6006 arizephoenix/phoenix:latest

This will expose Phoenix on localhost:6006.

Install packages:

pip install arize-phoenix-otel

Set your Phoenix endpoint:

import os

os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "http://localhost:6006"

For more info on using Phoenix with Docker, see Docker.

Notebook

Install packages:

pip install arize-phoenix

Launch Phoenix:

import phoenix as px
px.launch_app()

By default, notebook instances do not have persistent storage, so your traces will disappear after the notebook is closed. See self-hosting or use one of the other deployment options to retain traces.
