Vertex AI (Gemini)

Arize has first-class support for instrumenting the Vertex AI SDK and the Google Cloud AI Platform. Traces are fully OpenTelemetry-compatible and can be sent to any OpenTelemetry collector for viewing.
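For example, if you prefer to export traces to your own OpenTelemetry collector rather than use the Arize convenience helper shown below, a minimal sketch using the standard OpenTelemetry SDK (and the packages installed in the next step) might look like this. The http://localhost:4317 endpoint is an assumption; substitute your collector's OTLP address.

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from openinference.instrumentation.vertexai import VertexAIInstrumentor

# Export spans over OTLP/gRPC to a collector (endpoint is an assumption)
exporter = OTLPSpanExporter(endpoint="http://localhost:4317")
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Instrument Vertex AI calls against this tracer provider
VertexAIInstrumentor().instrument(tracer_provider=provider)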

In this example we will instrument generate_content calls.

pip install openinference-instrumentation-vertexai arize-otel opentelemetry-sdk opentelemetry-exporter-otlp "opentelemetry-proto>=1.12.0"

Set up VertexAIInstrumentor to trace your application and send the traces to Arize at the endpoint defined below.

from openinference.instrumentation.vertexai import VertexAIInstrumentor
# Import open-telemetry dependencies
from arize_otel import register_otel, Endpoints

# Set up OTel via the register_otel convenience function
register_otel(
    endpoints = Endpoints.ARIZE,
    space_id = "your-space-id", # in app space settings page
    api_key = "your-api-key", # in app space settings page
    model_id = "your-model-id", # name this to whatever you would like
)

VertexAIInstrumentor().instrument()

Now, all calls made via vertexai.generative_models are instrumented!

import vertexai
from vertexai.generative_models import GenerativeModel

# Assumes your Google Cloud project can be resolved from the environment;
# otherwise pass project="your-project-id" to init as well.
vertexai.init(location="us-central1")
model = GenerativeModel("gemini-1.5-flash")

print(model.generate_content("Why is the sky blue?"))
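If you run this as a short-lived script, spans batched for export may not be delivered before the process exits. A minimal sketch, assuming the registered tracer provider is the standard OpenTelemetry SDK TracerProvider, is to flush it explicitly before exiting:

from opentelemetry import trace

# Flush any buffered spans before the script exits
# (assumes the active tracer provider exposes force_flush, as the SDK TracerProvider does)
trace.get_tracer_provider().force_flush()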
