When the register_otel function does not offer enough customization for your needs, you can use the opentelemetry_sdk to control how you send traces.
This snippet contains a few OTel concepts:
The headers, set via an environment variable, authenticate your application when it sends data.
A resource represents an origin (e.g., a particular service, or in this case, a project) from which your spans are emitted.
Span processors filter, batch, and perform operations on your spans prior to export. You can set multiple locations for exporting data, such as the console.
Your tracer provides a handle for you to create spans and add attributes in your application code.
The collector (e.g., Phoenix) receives the spans exported by your application.
The SimpleSpanProcessor is synchronous and blocking. Use the BatchSpanProcessor for non-blocking production application instrumentation.
Here is sample code to set up instrumentation using OpenTelemetry libraries before starting the OpenAI auto-instrumentor from OpenInference.
```python
# Import OpenTelemetry dependencies
import os

from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, BatchSpanProcessor
from opentelemetry.sdk.resources import Resource

# Import the automatic instrumentor from OpenInference
from openinference.instrumentation.openai import OpenAIInstrumentor

# Set the Space and API keys as headers for authentication
headers = f"space_id={ARIZE_SPACE_ID},api_key={ARIZE_API_KEY}"
os.environ["OTEL_EXPORTER_OTLP_TRACES_HEADERS"] = headers

# Set resource attributes for the name and version of your application
trace_attributes = {
    "model_id": "your model name",  # This is how your model will show up in Arize
    "model_version": "v1",  # You can filter your spans by model version in Arize
}

# Define the desired endpoint URL to send traces
endpoint = "https://otlp.arize.com/v1"

# Set the tracer provider
tracer_provider = trace_sdk.TracerProvider(
    resource=Resource(attributes=trace_attributes)
)
tracer_provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter(endpoint)))
tracer_provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace_api.set_tracer_provider(tracer_provider=tracer_provider)

# To get your tracer
tracer = trace_api.get_tracer(__name__)

# Finish automatic instrumentation
OpenAIInstrumentor().instrument()
```
Now start asking questions to your LLM app and watch the traces being collected by Arize. For more examples of editing your OTel tracer, check our openinference-instrumentation-openai examples.
instrumentation.ts should be implemented as below.
```typescript
/* instrumentation.ts */
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OpenAIInstrumentation } from "@arizeai/openinference-instrumentation-openai";
import { ConsoleSpanExporter } from "@opentelemetry/sdk-trace-base";
import {
  NodeTracerProvider,
  BatchSpanProcessor,
} from "@opentelemetry/sdk-trace-node";
import { Resource } from "@opentelemetry/resources";
import { OTLPTraceExporter as GrpcOTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-grpc"; // Arize specific
import { diag, DiagConsoleLogger, DiagLogLevel } from "@opentelemetry/api";
import { Metadata } from "@grpc/grpc-js";

// For troubleshooting, set the log level to DiagLogLevel.DEBUG
diag.setLogger(new DiagConsoleLogger(), DiagLogLevel.DEBUG);

// Arize specific - Create metadata and add your headers
const metadata = new Metadata();
// Your Arize Space and API Keys, which can be found in the UI
metadata.set("space_id", "your-space-id");
metadata.set("api_key", "your-api-key");

const provider = new NodeTracerProvider({
  resource: new Resource({
    // Arize specific - The name of a new or preexisting model you
    // want to export spans to
    "model_id": "your-model-id",
    "model_version": "your-model-version",
  }),
});

provider.addSpanProcessor(new BatchSpanProcessor(new ConsoleSpanExporter()));
provider.addSpanProcessor(
  new BatchSpanProcessor(
    new GrpcOTLPTraceExporter({
      url: "https://otlp.arize.com/v1",
      metadata,
    }),
  ),
);

registerInstrumentations({
  instrumentations: [new OpenAIInstrumentation({})],
});

provider.register();
```