Google GenAI Tracing

Instrument LLM calls made using the Google Gen AI Python SDK

Launch Phoenix

Sign up for Phoenix:

Sign up for an Arize Phoenix account at https://app.phoenix.arize.com/login

Install packages:

pip install arize-phoenix-otel

Set your Phoenix endpoint and API Key:

import os

# Add Phoenix API Key for tracing
PHOENIX_API_KEY = "ADD YOUR API KEY"
os.environ["PHOENIX_CLIENT_HEADERS"] = f"api_key={PHOENIX_API_KEY}"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://app.phoenix.arize.com"

Your Phoenix API key can be found on the Keys section of your dashboard.

Install

pip install openinference-instrumentation-google-genai google-genai

Setup

Set the GEMINI_API_KEY environment variable. To use the GenAI SDK with Vertex AI instead of the Developer API, refer to Google's guide on setting the required environment variables.

export GEMINI_API_KEY=[your_key_here]
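If your process cannot rely on shell exports (for example, in a notebook), the key can equally be set from Python before the client is created. This is a minimal sketch; "your_key_here" is a placeholder, not a real key:

```python
import os

# Set the key in-process; setdefault keeps an existing shell export intact.
os.environ.setdefault("GEMINI_API_KEY", "your_key_here")
```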

Use the register function to connect your application to Phoenix.

from phoenix.otel import register

# Configure the Phoenix tracer
tracer_provider = register(
    project_name="my-llm-app",  # Default is 'default'
    auto_instrument=True,  # Auto-instrument your app based on installed OI dependencies
)
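If you prefer explicit control over which instrumentors run instead of relying on `auto_instrument=True`, the OpenInference instrumentor from the `openinference-instrumentation-google-genai` package can be attached manually. A sketch of that pattern:

```python
from phoenix.otel import register
from openinference.instrumentation.google_genai import GoogleGenAIInstrumentor

# Register the tracer provider without auto-instrumentation,
# then attach the GenAI instrumentor explicitly.
tracer_provider = register(project_name="my-llm-app")
GoogleGenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```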

Observe

Now that you have tracing set up, all GenAI SDK requests will be streamed to Phoenix for observability and evaluation.

import os
from google import genai

def send_message_multi_turn() -> tuple[str, str]:
    client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])
    chat = client.chats.create(model="gemini-2.0-flash-001")
    response1 = chat.send_message("What is the capital of France?")
    response2 = chat.send_message("Why is the sky blue?")

    return response1.text or "", response2.text or ""

This instrumentation will support tool calling soon; check the OpenInference repository for the current status.