Configure AI Providers
Phoenix natively integrates with OpenAI, Azure OpenAI, Anthropic, and Google AI Studio (Gemini) to make it easy to test changes to your prompts. In addition, since many AI providers (e.g., DeepSeek, Ollama) can be used directly with the OpenAI client, you can talk to any OpenAI-compatible LLM provider.
To provide your API keys securely, you have two options: store them in your browser's local storage, or set them as environment variables on the server side. If both are set at the same time, the credential stored in the browser takes precedence.
API keys can be entered in the playground application via the API Keys dropdown menu, which stores them in the browser. Simply navigate to settings and set your API keys.
Available on self-hosted Phoenix
If the following variables are set in the server environment, they'll be used at API invocation time.
OpenAI
OPENAI_API_KEY
Azure OpenAI
AZURE_OPENAI_API_KEY
AZURE_OPENAI_ENDPOINT
OPENAI_API_VERSION
Anthropic
ANTHROPIC_API_KEY
Gemini
GEMINI_API_KEY or GOOGLE_API_KEY
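For self-hosted deployments, these variables can be exported before starting the server. A minimal sketch, with placeholder key values and an illustrative Azure endpoint and API version (substitute your own):

```shell
# Placeholder keys -- replace with your real credentials.
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export GEMINI_API_KEY="..."

# Azure OpenAI needs the endpoint and API version alongside the key.
export AZURE_OPENAI_API_KEY="..."
export AZURE_OPENAI_ENDPOINT="https://my-resource.openai.azure.com"
export OPENAI_API_VERSION="2024-02-01"

# Then start Phoenix so the keys are available at API invocation time,
# e.g.: phoenix serve
```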
Since you can configure the base URL for the OpenAI client, you can use the prompt playground with a variety of OpenAI-compatible LLMs such as Ollama, DeepSeek, and more.
If you are using an OpenAI-compatible provider, you will have to set the OpenAI API key to that provider's API key for it to work.
OpenAI-compatible providers include:
DeepSeek
Ollama
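These providers work because they accept the same chat-completions request shape the OpenAI client sends; only the base URL and API key change. A minimal stdlib-only sketch that builds (but does not send) such a request — the base URLs shown are the providers' commonly documented defaults, and the model names are illustrative:

```python
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build (but do not send) an OpenAI-compatible chat completions request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # The provider's key goes here, not an OpenAI key.
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )


# The same request shape targets any compatible backend:
deepseek = build_chat_request("https://api.deepseek.com/v1", "my-key", "deepseek-chat", "Hello")
ollama = build_chat_request("http://localhost:11434/v1", "ollama", "llama3", "Hello")
```

Swapping providers is therefore just a matter of changing the base URL and key, which is exactly what the playground's OpenAI client configuration exposes.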
Optionally, the server can be configured with the OPENAI_BASE_URL environment variable to target any OpenAI-compatible REST API.
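For example, to point the server at a local Ollama instance — a sketch assuming Ollama's default OpenAI-compatible endpoint on port 11434:

```shell
# Ollama's OpenAI-compatible API is served under /v1 by default.
export OPENAI_BASE_URL="http://localhost:11434/v1"
# Ollama ignores the key's value, but the OpenAI client requires one to be set.
export OPENAI_API_KEY="ollama"
```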
For app.phoenix.arize.com, this may fail for security reasons, in which case you will see a Connection Error appear.
If there is an LLM endpoint you would like to use, reach out to phoenix-support@arize.com.