Phoenix


© 2025 Arize AI

How can I configure the backend to send the data to the phoenix UI in another container?


Last updated 8 days ago


If you are working on an API whose endpoints perform RAG and you do not want the Phoenix server launched as another thread in the same process, you can run it in a separate process or container instead.

To do this, set the environment variable PHOENIX_COLLECTOR_ENDPOINT to point to the Phoenix server running in that other process or container.
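As a minimal sketch, you might set the variable in your API process before configuring tracing. The host name `phoenix` and port `6006` below are assumptions: `phoenix` stands in for whatever hostname your container is reachable at (e.g. a Docker Compose service name), and `6006` is Phoenix's default port.

```python
import os

# Point instrumentation at a Phoenix server running in another
# container. "phoenix" is a hypothetical service hostname and
# 6006 is the default Phoenix port -- adjust both as needed.
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "http://phoenix:6006"

# Libraries that read this variable (such as Phoenix's OTel
# helpers) will now export traces to that endpoint rather than
# launching a local Phoenix server thread in this process.
print(os.environ["PHOENIX_COLLECTOR_ENDPOINT"])
```

You could equally set the variable in your container orchestration config (e.g. the `environment:` section of a Docker Compose service) so the API process inherits it at startup.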

📚 See the environment page for more on this variable.