MistralAI

Instrument LLM calls made using MistralAI's SDK

MistralAI is a leading provider of state-of-the-art LLMs. The MistralAI SDK can be instrumented using the openinference-instrumentation-mistralai package.

In this example we will instrument a small program that uses the MistralAI chat completions API and observe the traces in Arize.

pip install openinference-instrumentation-mistralai mistralai opentelemetry-sdk opentelemetry-exporter-otlp-proto-grpc

Set the MISTRAL_API_KEY environment variable to authenticate calls made using the SDK.

export MISTRAL_API_KEY=[your_key_here]
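Because a missing key surfaces as an authentication error only at call time, it can help to fail fast at startup. A minimal sketch, using a hypothetical `require_env` helper (not part of any SDK):

```python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, raising a clear error if unset."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; export it before running this script")
    return value

# Example usage at the top of your script:
# api_key = require_env("MISTRAL_API_KEY")
```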

In a Python file, set up the MistralAIInstrumentor and configure the tracer provider to send traces to Arize.

import os

from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage
from openinference.instrumentation.mistralai import MistralAIInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Set the Space and API keys as headers for authentication
# (set ARIZE_SPACE_KEY and ARIZE_API_KEY to your own Arize credentials)
headers = f"space_key={ARIZE_SPACE_KEY},api_key={ARIZE_API_KEY}"
os.environ["OTEL_EXPORTER_OTLP_TRACES_HEADERS"] = headers

# Set resource attributes for the name and version for your application
resource = Resource(
    attributes={
        "model_id":"mistral-llm-tracing", # Set this to any name you'd like for your app
        "model_version":"1.0", # Set this to a version number string
    }
)

# Define the span processor as an exporter to the desired endpoint
endpoint = "https://otlp.arize.com/v1"
tracer_provider = trace_sdk.TracerProvider(resource=resource)
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))

# Finish automatic instrumentation
trace_api.set_tracer_provider(tracer_provider)
MistralAIInstrumentor().instrument()

To test, run the following code and observe your traces in Arize.

client = MistralClient()
response = client.chat(
    model="mistral-large-latest",
    messages=[
        ChatMessage(
            content="Who won the World Cup in 2018?",
            role="user",
        )
    ],
)
print(response.choices[0].message.content)
