AWS Bedrock

Instrument LLM calls to AWS Bedrock via the boto3 client using OpenInference.

boto3 provides Python bindings for AWS services, including Bedrock, which offers access to a number of foundation models. Calls to these models can be instrumented with OpenInference, enabling OpenTelemetry-compliant observability of applications built on them. Traces collected using OpenInference can be viewed in Arize.

OpenInference traces capture telemetry data about the execution of your LLM application. Consider using this instrumentation to understand how Bedrock-managed models are being called inside a complex system and to troubleshoot issues such as extraction and response synthesis.

To get started instrumenting Bedrock calls via boto3, we need to install two components: the OpenInference instrumentation for AWS Bedrock, and an OpenTelemetry exporter used to send these traces to Arize.

pip install openinference-instrumentation-bedrock
pip install opentelemetry-exporter-otlp
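
If boto3 is not already available in your environment, install it as well; the instrumentation package instruments boto3 but, as is typical for instrumentation libraries, may not pull it in as a dependency (verify against the package metadata for your version):

pip install boto3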

Instrument boto3 prior to initializing a bedrock-runtime client. All clients created after instrumentation is applied will send traces on every call to invoke_model.

import json
import os
import boto3
from openinference.instrumentation.bedrock import BedrockInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Set your Arize Space and API keys as headers for authentication
ARIZE_SPACE_KEY = "YOUR_SPACE_KEY"  # Replace with your Arize Space key
ARIZE_API_KEY = "YOUR_API_KEY"  # Replace with your Arize API key
headers = f"space_key={ARIZE_SPACE_KEY},api_key={ARIZE_API_KEY}"
os.environ["OTEL_EXPORTER_OTLP_TRACES_HEADERS"] = headers

# Set resource attributes for the name and version of your application
resource = Resource(
    attributes={
        "model_id":"bedrock-llm-tracing", # Set this to any name you'd like for your app
        "model_version":"1.0", # Set this to a version number string
    }
)

# Set up your tracer provider with the endpoint that traces are sent to
# (the authentication headers were set via the environment variable above)
endpoint = "https://otlp.arize.com/v1"
tracer_provider = trace_sdk.TracerProvider(resource=resource)
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace_api.set_tracer_provider(tracer_provider=tracer_provider)
BedrockInstrumentor().instrument()

session = boto3.session.Session()
client = session.client("bedrock-runtime")
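
If you later need to turn tracing off (for example, in unit tests), the instrumentor can be detached. A minimal sketch, assuming the standard instrument/uninstrument interface that OpenInference instrumentors inherit from OpenTelemetry's BaseInstrumentor:

# Detach the Bedrock instrumentation; clients will stop emitting spans
# for subsequent invoke_model calls.
BedrockInstrumentor().uninstrument()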

You can use the following code to test whether your tracing is working.

# All calls to invoke_model are instrumented
prompt = json.dumps(
    {
        "prompt": "\n\nHuman: Hello there, how are you?\n\nAssistant:",
        "max_tokens_to_sample": 1024,
    }
)
response = client.invoke_model(modelId="anthropic.claude-v2", body=prompt)
response_body = json.loads(response.get("body").read())
print(response_body["completion"])
