MistralAI

Instrument LLM calls made using MistralAI's SDK

MistralAI is a leading provider of state-of-the-art LLMs. The MistralAI SDK can be instrumented using the openinference-instrumentation-mistralai package.

In this example we will instrument a small program that uses the MistralAI chat completions API and observe the traces in Arize.

pip install openinference-instrumentation-mistralai mistralai arize-otel opentelemetry-sdk opentelemetry-exporter-otlp

Set the MISTRAL_API_KEY environment variable to authenticate calls made using the SDK.

export MISTRAL_API_KEY=[your_key_here]
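Since a missing key only surfaces later as an authentication error, it can help to fail fast at startup. The helper below is a hypothetical convenience for illustration, not part of the MistralAI SDK or arize-otel:

```python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, or raise if it is unset."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Environment variable {name} is not set")
    return value

# Example usage (assumes the variable was exported as shown above):
# api_key = require_env("MISTRAL_API_KEY")
```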

In a Python file, set up the MistralAIInstrumentor and configure the tracer to send traces to Arize.

# Import open-telemetry dependencies
from arize_otel import register_otel, Endpoints

# Setup OTEL via our convenience function
register_otel(
    endpoints=Endpoints.ARIZE,
    space_id="your-space-id",  # in app space settings page
    api_key="your-api-key",  # in app space settings page
    model_id="your-model-id",  # name this whatever you would like
)

# Import openinference instrumentor to map Mistral traces to a standard format
from openinference.instrumentation.mistralai import MistralAIInstrumentor

# Turn on the instrumentor
MistralAIInstrumentor().instrument()

To test, run the following code and observe your traces in Arize.

# These imports target the pre-1.0 mistralai SDK; the 1.0+ SDK
# renamed the client to mistralai.Mistral and changed the chat API.
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

# MistralClient reads MISTRAL_API_KEY from the environment by default
client = MistralClient()
response = client.chat(
    model="mistral-large-latest",
    messages=[
        ChatMessage(
            content="Who won the World Cup in 2018?",
            role="user",
        )
    ],
)
print(response.choices[0].message.content)

Copyright © 2023 Arize AI, Inc