LiteLLM
LiteLLM lets developers call all LLM APIs using the OpenAI format, and LiteLLM Proxy is a proxy server for calling 100+ LLMs in the OpenAI format. Both are supported by this auto-instrumentation.
Any calls made to the following functions will be automatically captured by this integration:
completion()
acompletion()
completion_with_retries()
embedding()
aembedding()
image_generation()
aimage_generation()
Sign up for Phoenix:
Sign up for an Arize Phoenix account at https://app.phoenix.arize.com/login
Install packages:
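A minimal install sketch, assuming you are using the OpenInference LiteLLM instrumentor together with the Phoenix OTel helper package:

```shell
pip install litellm openinference-instrumentation-litellm arize-phoenix-otel
```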
Set your Phoenix endpoint and API Key:
Your Phoenix API key can be found on the Keys section of your dashboard.
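For example, for the hosted Phoenix instance at app.phoenix.arize.com, the endpoint and key can be supplied as environment variables (the key value below is a placeholder):

```shell
export PHOENIX_COLLECTOR_ENDPOINT="https://app.phoenix.arize.com"
export PHOENIX_API_KEY="your-api-key"
```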
Use the register function to connect your application to Phoenix:
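A sketch of the connection step, assuming the `arize-phoenix-otel` package is installed; the project name is an illustrative value:

```python
"""Connect the application to Phoenix via register() from arize-phoenix-otel."""
try:
    from phoenix.otel import register

    tracer_provider = register(
        project_name="litellm-demo",  # hypothetical project name
        auto_instrument=True,  # applies installed OpenInference instrumentors (e.g. LiteLLM)
    )
    status = "registered"
except Exception:
    # Package not installed in this environment; the call shape is the point.
    status = "phoenix not installed"
print(status)
```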
Add any API keys needed by the models you are using with LiteLLM.
You can now use LiteLLM as normal, and calls will be traced in Phoenix.
Traces should now be visible in Phoenix!
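As a final sketch, a traced call through one of the captured functions, assuming `litellm` is installed and a provider key (e.g. OPENAI_API_KEY) is set; the model name and prompt are illustrative:

```python
"""A LiteLLM completion() call; with the instrumentation above, it is traced in Phoenix."""
request = {
    "model": "gpt-4o-mini",  # hypothetical model choice
    "messages": [{"role": "user", "content": "Hello!"}],
}
try:
    import litellm

    response = litellm.completion(**request)
    print(response.choices[0].message.content)
except Exception:
    # No package or API key in this environment; the request shape is the point.
    print("litellm call skipped")
```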