Vercel AI SDK

This package provides a set of utilities to ingest Vercel AI SDK (>= 3.3) spans into platforms like Arize and Phoenix.

Note: This package requires Vercel AI SDK version 3.3 or higher.

Installation

npm install --save @arizeai/openinference-vercel

You will also need to install the OpenTelemetry and Vercel packages in your project.

npm install --save @opentelemetry/api @vercel/otel @opentelemetry/exporter-trace-otlp-grpc @grpc/grpc-js

Usage

@arizeai/openinference-vercel provides a set of utilities to help you ingest Vercel AI SDK spans into platforms like Arize and Phoenix. It works in conjunction with Vercel's OpenTelemetry support, so to get started you will need to add OpenTelemetry support to your Vercel project according to their guide.

To process your Vercel AI SDK spans, add an OpenInferenceSimpleSpanProcessor or an OpenInferenceBatchSpanProcessor to your OpenTelemetry configuration, typically in the instrumentation file created when following Vercel's guide.

Note: The OpenInference span processors do not export spans themselves, so you will need to pass an exporter as a parameter.

import { registerOTel } from "@vercel/otel";
import { diag, DiagConsoleLogger, DiagLogLevel } from "@opentelemetry/api";
import {
  isOpenInferenceSpan,
  OpenInferenceSimpleSpanProcessor,
} from "@arizeai/openinference-vercel";
import { Metadata } from "@grpc/grpc-js";
import { OTLPTraceExporter as GrpcOTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-grpc";

// For troubleshooting, set the log level to DiagLogLevel.DEBUG
diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.DEBUG);

export function register() {
  // Arize specific - Create metadata and add your headers
  const metadata = new Metadata();

  // Your Arize Space and API Keys, which can be found in the UI
  metadata.set("space_id", "my_space_id");
  metadata.set("api_key", "my_api_key");
  registerOTel({
    serviceName: "next-app",
    attributes: {
      model_id: "vercel-model",
      model_version: "1.0.0",
    },
    spanProcessors: [
      new OpenInferenceSimpleSpanProcessor({
        exporter: new GrpcOTLPTraceExporter({
          url: "https://otlp.arize.com",
          metadata,
        }),
        spanFilter: (span) => {
          // Only export OpenInference spans to filter out non-generative spans
          return isOpenInferenceSpan(span);
        },
      }),
    ],
  });
}
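
The example above uses the OpenInferenceSimpleSpanProcessor, which exports each span as soon as it ends; for production workloads the OpenInferenceBatchSpanProcessor is usually preferable because it buffers spans and exports them in batches. The sketch below is a hypothetical variant that sends spans to a local Phoenix instance over OTLP/HTTP; the @opentelemetry/exporter-trace-otlp-proto package and the http://localhost:6006/v1/traces collector URL are assumptions, so adjust them for your deployment.

import { registerOTel } from "@vercel/otel";
import {
  isOpenInferenceSpan,
  OpenInferenceBatchSpanProcessor,
} from "@arizeai/openinference-vercel";
import { OTLPTraceExporter as HttpOTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";

export function register() {
  registerOTel({
    serviceName: "next-app",
    spanProcessors: [
      new OpenInferenceBatchSpanProcessor({
        exporter: new HttpOTLPTraceExporter({
          // Assumed local Phoenix collector endpoint; change for your deployment
          url: "http://localhost:6006/v1/traces",
        }),
        spanFilter: (span) => {
          // Only export OpenInference spans to filter out non-generative spans
          return isOpenInferenceSpan(span);
        },
      }),
    ],
  });
}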

Since Arize exposes a gRPC endpoint, the Arize example above uses the OpenTelemetry gRPC exporter. That exporter is designed to run in Node.js environments, so we also need to make some additional changes to the Next.js configuration in the next.config.js/ts file. Note that your file may look slightly different if you are using a JavaScript config file, but the underlying changes should be the same.

import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  webpack: (config) => {
    config.resolve.fallback = {
      stream: false,
      fs: false,
      tls: false,
      net: false,
      zlib: false,
      http: false,
      url: false,
      http2: false,
      dns: false,
      os: false,
      path: false,
    };
    return config;
  },
};

export default nextConfig;
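
Finally, the Vercel AI SDK only records telemetry for calls where it is explicitly enabled, so make sure your AI SDK calls opt in. The snippet below is a minimal, hypothetical example of enabling telemetry on a generateText call; the @ai-sdk/openai provider, the gpt-4o-mini model, and the answer helper are illustrative assumptions.

import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

// Hypothetical helper showing how to opt a call into OpenTelemetry tracing
export async function answer(prompt: string) {
  const { text } = await generateText({
    model: openai("gpt-4o-mini"),
    prompt,
    experimental_telemetry: {
      // Enables span creation for this call so the processors above receive it
      isEnabled: true,
      // Optional identifier that is attached to the emitted spans
      functionId: "answer",
    },
  });
  return text;
}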
