MistralAI is a leading provider of state-of-the-art LLMs. The MistralAI SDK can be instrumented using the openinference-instrumentation-mistralai package.
Launch Phoenix
Sign up for Phoenix:
Sign up for an Arize Phoenix account at https://app.phoenix.arize.com/login.
Install packages:
pip install arize-phoenix-otel
Set your Phoenix endpoint and API Key:
import os
# Add Phoenix API Key for tracing
PHOENIX_API_KEY = "ADD YOUR API KEY"
os.environ["PHOENIX_CLIENT_HEADERS"] = f"api_key={PHOENIX_API_KEY}"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://app.phoenix.arize.com"
Your Phoenix API key can be found in the Keys section of your Phoenix dashboard.
Launch your local Phoenix instance:
pip install arize-phoenix
phoenix serve
For details on customizing a local terminal deployment, see the terminal setup section of the Phoenix self-hosting documentation.
Install packages:
pip install arize-phoenix-otel
Set your Phoenix endpoint:
import os
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "http://localhost:6006"
See the Phoenix self-hosting documentation for more details.
Pull the latest Phoenix image from Docker Hub:
docker pull arizephoenix/phoenix:latest
Run your containerized instance:
docker run -p 6006:6006 arizephoenix/phoenix:latest
This will expose Phoenix on localhost:6006.
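Data inside the container is not persisted across restarts by default. A minimal sketch for keeping traces on a named volume, assuming Phoenix's PHOENIX_WORKING_DIR environment variable controls where data is stored (check the Phoenix Docker documentation for the exact configuration):
docker run -p 6006:6006 -e PHOENIX_WORKING_DIR=/mnt/data -v phoenix_data:/mnt/data arizephoenix/phoenix:latest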
Install packages:
pip install arize-phoenix-otel
Set your Phoenix endpoint:
import os
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "http://localhost:6006"
Set the MISTRAL_API_KEY environment variable to authenticate calls made using the SDK.
export MISTRAL_API_KEY=[your_key_here]
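Also install the MistralAI SDK and the OpenInference instrumentation package for MistralAI so the register call below can pick it up (package names assumed to follow the usual OpenInference naming convention):
pip install openinference-instrumentation-mistralai mistralai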
Connect to your Phoenix instance using the register function.
from phoenix.otel import register
# configure the Phoenix tracer
tracer_provider = register(
    project_name="my-llm-app",  # Default is 'default'
    auto_instrument=True,  # Auto-instrument your app based on installed OI dependencies
)
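If you prefer to instrument explicitly rather than rely on auto_instrument, the OpenInference instrumentor can be called directly; a minimal sketch, assuming the MistralAIInstrumentor class from the openinference-instrumentation-mistralai package and reusing the tracer_provider from above:
from openinference.instrumentation.mistralai import MistralAIInstrumentor
MistralAIInstrumentor().instrument(tracer_provider=tracer_provider)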
Run Mistral
import os
from mistralai import Mistral
from mistralai.models import UserMessage
api_key = os.environ["MISTRAL_API_KEY"]
model = "mistral-tiny"
client = Mistral(api_key=api_key)
chat_response = client.chat.complete(
    model=model,
    messages=[UserMessage(content="What is the best French cheese?")],
)
print(chat_response.choices[0].message.content)
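Embedding calls made with the same client are traced as well. A minimal sketch, reusing the client from above and assuming the v1 SDK's embeddings.create method with the mistral-embed model:
embeddings_response = client.embeddings.create(
    model="mistral-embed",  # MistralAI's embedding model
    inputs=["What is the best French cheese?"],
)
print(len(embeddings_response.data[0].embedding))  # dimensionality of the returned vector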
Observe
Now that you have tracing set up, all invocations of Mistral (completions, chat completions, embeddings) will be streamed to your running Phoenix instance for observability and evaluation.
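To work with the captured traces programmatically, for example to run evaluations, you can pull spans into a dataframe. A minimal sketch, assuming the arize-phoenix package is installed in the same environment and that get_spans_dataframe accepts the project name used above:
import phoenix as px
spans_df = px.Client().get_spans_dataframe(project_name="my-llm-app")  # project_name matches the register() call
print(spans_df.head())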
Resources
For more information on using Phoenix with Docker, including how to pull the latest Phoenix image from Docker Hub, see the Phoenix Docker deployment documentation.
By default, notebook instances do not have persistent storage, so your traces will disappear after the notebook is closed. See the Phoenix self-hosting documentation or use one of the other deployment options to retain traces.