Vertex AI (Gemini)

Arize has first-class support for instrumenting the Vertex AI SDK and Google Cloud AI Platform. Traces are fully OpenTelemetry compatible and can be sent to any OpenTelemetry collector for viewing.

In this example we will instrument generate_content calls.

pip install openinference-instrumentation-vertexai arize-otel opentelemetry-sdk opentelemetry-exporter-otlp "opentelemetry-proto>=1.12.0"

Set up VertexAIInstrumentor to trace your application and send the traces to Arize at the endpoint configured below.

from openinference.instrumentation.vertexai import VertexAIInstrumentor
from arize.otel import register

# Set up OTel via our convenience function
tracer_provider = register(
    space_id = "your-space-id", # found in the Arize app's space settings page
    api_key = "your-api-key", # found in the Arize app's space settings page
    project_name = "your-project-name", # name this whatever you would like
)

VertexAIInstrumentor().instrument(tracer_provider=tracer_provider)
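
As noted above, the traces are standard OpenTelemetry spans, so you are not tied to the register convenience function. Here is a minimal sketch of wiring the instrumentor to any OTLP-compatible collector instead (the localhost endpoint is an assumption; point it at your own collector):

from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from openinference.instrumentation.vertexai import VertexAIInstrumentor

# Assumed local collector address; replace with your collector's OTLP endpoint.
exporter = OTLPSpanExporter(endpoint="http://localhost:4317")
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(BatchSpanProcessor(exporter))

VertexAIInstrumentor().instrument(tracer_provider=tracer_provider)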

Now, all calls made through vertexai.generative_models are instrumented!

import vertexai
from vertexai.generative_models import GenerativeModel

# Initialize the Vertex AI SDK; the Google Cloud project is read from your
# environment/credentials unless you pass project="your-project" explicitly.
vertexai.init(location="us-central1")
model = GenerativeModel("gemini-1.5-flash")

# This call is now traced by the instrumentor.
print(model.generate_content("Why is the sky blue?"))
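
Streaming requests go through the same generate_content entry point, so they should be captured by the instrumentor as well (worth verifying in your own traces); a minimal sketch:

# Streaming variant; each call is expected to produce a trace like the example above.
for chunk in model.generate_content("Tell me a short story.", stream=True):
    # Each streamed chunk is a GenerationResponse; print the text as it arrives.
    print(chunk.text, end="")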
