Vercel AI SDK
This package provides a set of utilities to ingest Vercel AI SDK (>= 3.3) spans into platforms like Arize and Phoenix.
Note: This package requires Vercel AI SDK version 3.3 or higher.
You will also need to install the OpenTelemetry and Vercel packages in your project.
@arizeai/openinference-vercel
works in conjunction with Vercel's OpenTelemetry support and provides a set of utilities to help you ingest Vercel AI SDK spans into these platforms. To get started, add OpenTelemetry support to your Vercel project by following their guide.
To process your Vercel AI SDK spans, add an OpenInferenceSimpleSpanProcessor or OpenInferenceBatchSpanProcessor to your OpenTelemetry configuration.
Note: The OpenInferenceSpanProcessor does not handle the exporting of spans itself, so you will need to pass it an exporter as a parameter.
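The configuration described above can be sketched as follows. This is a minimal example, not a definitive setup: the service name and the Phoenix collector URL (`http://localhost:6006/v1/traces`) are assumptions you should replace with your own values.

```typescript
// instrumentation.ts — registers OpenTelemetry for a Vercel project and
// attaches an OpenInference span processor with an OTLP exporter.
import { registerOTel } from "@vercel/otel";
import { OpenInferenceSimpleSpanProcessor } from "@arizeai/openinference-vercel";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-proto";

export function register() {
  registerOTel({
    serviceName: "my-ai-app", // assumption: pick a name for your service
    spanProcessors: [
      // The processor does not export spans itself, so an exporter is passed in.
      new OpenInferenceSimpleSpanProcessor({
        exporter: new OTLPTraceExporter({
          // Assumption: a local Phoenix collector; point this at your platform's endpoint.
          url: "http://localhost:6006/v1/traces",
        }),
      }),
    ],
  });
}
```

In a Next.js project this file lives at the project root (or in `src/`) and Vercel calls `register()` automatically on startup. Use the batch processor variant in production to reduce export overhead.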
Now enable telemetry in your AI SDK calls by setting the experimental_telemetry parameter to { isEnabled: true }.
For details on Vercel AI SDK telemetry, see the Vercel AI SDK Telemetry documentation.
For a working example, see the Next.js OpenAI Telemetry Example in the OpenInference repo.