Last updated
Sign up for Phoenix:
Install packages:
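A typical install pulls in Phoenix's OTel helper, the OpenInference CrewAI instrumentation, and CrewAI itself. The exact package set depends on your setup; this is a sketch assuming the standard PyPI package names:

```shell
# Phoenix OTel helper + OpenInference CrewAI instrumentation + CrewAI
pip install arize-phoenix-otel openinference-instrumentation-crewai crewai crewai-tools
```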
Connect your application to your cloud instance:
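A minimal connection sketch, assuming a Phoenix Cloud instance and using `phoenix.otel.register`; the API key and project name below are placeholders:

```python
import os

# Placeholder credentials: substitute your own Phoenix API key and endpoint.
os.environ["PHOENIX_CLIENT_HEADERS"] = "api_key=YOUR_PHOENIX_API_KEY"
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://app.phoenix.arize.com"

from phoenix.otel import register

# Register an OpenTelemetry tracer provider pointed at your Phoenix instance.
tracer_provider = register(project_name="my-crewai-app")
```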
Initialize the CrewAIInstrumentor before your application code.
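The initialization looks roughly like the following; `tracer_provider` is assumed to be the object returned by `phoenix.otel.register` in the previous step:

```python
from openinference.instrumentation.crewai import CrewAIInstrumentor

# Instrument CrewAI before any Crew code runs so all calls are traced.
CrewAIInstrumentor().instrument(tracer_provider=tracer_provider)
```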
CrewAI uses either LangChain or LiteLLM under the hood to call models, depending on the version.
If you're using CrewAI < 0.63.0, we recommend adding our LangChainInstrumentor to get visibility into LLM calls.
If you're using CrewAI >= 0.63.0, we recommend adding our LiteLLMInstrumentor to get visibility into LLM calls.
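The version cutoff can be expressed as a small helper that reports which instrumentor to add (the function name is hypothetical, for illustration only):

```python
def pick_llm_instrumentor(crewai_version: str) -> str:
    """Return which OpenInference instrumentor gives LLM-call visibility:
    LangChainInstrumentor for CrewAI < 0.63.0, LiteLLMInstrumentor otherwise.
    """
    # Compare only the (major, minor) components against the 0.63 cutoff.
    major, minor = (int(part) for part in crewai_version.split(".")[:2])
    if (major, minor) < (0, 63):
        return "LangChainInstrumentor"
    return "LiteLLMInstrumentor"
```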
From here, you can run CrewAI as normal.
Now that you have tracing set up, all calls to your Crew will be streamed to your running Phoenix instance for observability and evaluation.
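For example, a minimal one-agent crew (the roles and prompts below are illustrative; running it requires model credentials for your configured LLM):

```python
from crewai import Agent, Task, Crew

# A single-agent crew; each kickoff() is traced once the instrumentor is active.
researcher = Agent(
    role="Researcher",
    goal="Summarize a topic in two sentences",
    backstory="A concise technical writer.",
)
task = Task(
    description="Summarize what observability means for LLM applications.",
    expected_output="A two-sentence summary.",
    agent=researcher,
)
crew = Crew(agents=[researcher], tasks=[task])

result = crew.kickoff()
print(result)
```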
Sign up for an Arize Phoenix account.
Your Phoenix API key can be found in the Keys section.
For details on customizing a local terminal deployment, see the Phoenix deployment documentation.
Pull the latest Phoenix image:
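A sketch of the pull-and-run commands, assuming the image is published as `arizephoenix/phoenix` and that the UI listens on the default port 6006:

```shell
# Pull and run the Phoenix image; 6006 serves the UI/OTLP-HTTP collector.
docker pull arizephoenix/phoenix:latest
docker run -p 6006:6006 arizephoenix/phoenix:latest
```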
For more information on using Phoenix with Docker, see the Phoenix Docker documentation.
By default, notebook instances do not have persistent storage, so your traces will disappear after the notebook is closed. Use one of the other deployment options if you need to retain traces.
Instrument multi-agent applications using CrewAI