MistralAI
Instrument LLM calls made using MistralAI's SDK
MistralAI is a leading provider of state-of-the-art LLMs. The MistralAI SDK can be instrumented using the openinference-instrumentation-mistralai package.

In this example we will instrument a small program that uses the MistralAI chat completions API and observe the traces in Arize.
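Installation might look roughly like the following; the exact package set is an assumption based on the OpenInference instrumentor and Arize's OTel helper, so check your environment's requirements:

```
pip install mistralai openinference-instrumentation-mistralai arize-otel
```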
Set the MISTRAL_API_KEY environment variable to authenticate calls made using the SDK.
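For example, you can export the key in your shell before running the program, or set it from Python; the value below is a placeholder:

```python
import os

# Placeholder value; equivalent to `export MISTRAL_API_KEY=...` in a shell.
os.environ["MISTRAL_API_KEY"] = "your-mistral-api-key"
```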
In a Python file, set up the MistralAIInstrumentor and configure the tracer to send traces to Arize.
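A minimal setup sketch, assuming Arize's arize-otel register helper and the OpenInference MistralAIInstrumentor; the space ID, API key, and project name are placeholders:

```python
from arize.otel import register
from openinference.instrumentation.mistralai import MistralAIInstrumentor

# Configure an OpenTelemetry tracer provider that exports spans to Arize.
tracer_provider = register(
    space_id="YOUR-ARIZE-SPACE-ID",    # placeholder
    api_key="YOUR-ARIZE-API-KEY",      # placeholder
    project_name="mistralai-tracing",  # placeholder project name
)

# Instrument the MistralAI SDK so chat completion calls emit spans.
MistralAIInstrumentor().instrument(tracer_provider=tracer_provider)
```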
To test, run the following code and observe your traces in Arize.
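A small test sketch, assuming the current mistralai Python SDK (v1-style Mistral client) and that MISTRAL_API_KEY is set; the model name and prompt are examples:

```python
import os
from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# Each chat completion call is captured as a span and exported to Arize.
response = client.chat.complete(
    model="mistral-small-latest",
    messages=[{"role": "user", "content": "Write a haiku about tracing LLM calls."}],
)
print(response.choices[0].message.content)
```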