LlamaIndex
Copyright © 2023 Arize AI, Inc
Arize has first-class support for LlamaIndex applications. After instrumentation, you will have a full trace of every part of your LLM application, including input, embeddings, retrieval, functions, and output messages.
We follow a standardized format for how trace data should be structured using OpenInference, our open-source package built on OpenTelemetry.
Use the code block below to get started with our LlamaIndexInstrumentor.
Phoenix supports LlamaIndex's latest instrumentation paradigm.
To get started, pip install the following.
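A minimal install sketch, assuming the OpenInference LlamaIndex instrumentor and the Arize OTel helper packages (adjust package versions to your environment):

```shell
pip install openinference-instrumentation-llama-index arize-otel llama-index
```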
The following code snippet showcases how to automatically instrument your LLM application.
Now start asking questions to your LLM app and watch the traces being collected by Arize. For a more detailed demonstration, check out our Colab tutorial: