What is my Phoenix Endpoint?

There are two endpoints that matter in Phoenix:

  1. Application Endpoint: The endpoint your Phoenix instance is running on

  2. OTEL Tracing Endpoint: The endpoint through which your Phoenix instance receives OpenTelemetry traces

Application Endpoint

If you're accessing a Phoenix Cloud instance through our website, then your endpoint is https://app.phoenix.arize.com

If you're self-hosting Phoenix, then you choose the endpoint when you set up the app. The default value is http://localhost:6006

To set this endpoint, use the PHOENIX_COLLECTOR_ENDPOINT environment variable. This is used by the Phoenix client package to query traces, log annotations, and retrieve prompts.
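
For example, a minimal sketch, assuming the arize-phoenix Python package and a self-hosted instance on the default port (the query helper shown is illustrative):

```python
import os

# Point the Phoenix client at the application endpoint before using it.
# Here we use the self-hosted default; for Phoenix Cloud, use your cloud URL.
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "http://localhost:6006"

import phoenix as px

client = px.Client()                     # reads PHOENIX_COLLECTOR_ENDPOINT
spans_df = client.get_spans_dataframe()  # e.g. query traces as a DataFrame
```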

OTEL Tracing Endpoint

If you're accessing a Phoenix Cloud instance through our website, then your OTEL tracing endpoint is https://app.phoenix.arize.com/v1/traces

If you're self-hosting Phoenix, then you choose the endpoint when you set up the app. The default values are:

  • Using the GRPC protocol: http://localhost:4317

  • Using the HTTP protocol: http://localhost:6006/v1/traces

As of May 2025, Phoenix Cloud only supports trace collection via HTTP
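
For a Phoenix Cloud instance, the equivalent configuration can be expressed as environment variables; a hedged sketch (the API key value is a placeholder, and the header format assumes key-based authentication):

```python
import os

# Phoenix Cloud sketch: the collector endpoint is the app URL, and requests
# are authenticated with an API key passed as a client header.
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "https://app.phoenix.arize.com"
os.environ["PHOENIX_CLIENT_HEADERS"] = "api_key=YOUR_API_KEY"
```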

To set this endpoint, use the register(endpoint=YOUR_ENDPOINT) function. This endpoint can also be set using environment variables. For more on the register function and other configuration options, see here.
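
For instance, a minimal sketch using phoenix.otel, assuming a self-hosted instance with the default HTTP collector endpoint ("my-app" is a hypothetical project name):

```python
from phoenix.otel import register

# Register a tracer provider that exports spans to the Phoenix OTEL endpoint.
# Swap in the gRPC endpoint (http://localhost:4317) or your cloud URL as needed.
tracer_provider = register(
    project_name="my-app",
    endpoint="http://localhost:6006/v1/traces",
)
```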