Frequently Asked Questions
Can I use Azure OpenAI?
Yes. In fact, this is probably the preferred way to interact with OpenAI if your enterprise requires data privacy. Getting the parameters right for Azure can be a bit tricky, so check out the models section for details.
Can I use Phoenix locally from a remote Jupyter instance?
Yes, you can use either of the two methods below.
1. Via ngrok (Preferred)
Install pyngrok on the remote machine using the command
pip install pyngrok
Create a free account on ngrok and verify your email. Find 'Your Authtoken' on the dashboard.
In the Jupyter notebook, after launching Phoenix, pass its port number as the
port
parameter when creating the ngrok tunnel. Preferably use a default port for Phoenix so that you won't have to set up a new ngrok tunnel for each port; simply restarting Phoenix will reuse the same ngrok URL. Then visit the site at the newly printed
public_url
and ignore warnings, if any.
NOTE:
A free ngrok account does not allow more than 3 tunnels over a single ngrok agent session. If you hit this limit, list the active tunnels using ngrok.get_tunnels()
and close the tunnel you no longer need using ngrok.disconnect(public_url)
.
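The tunnel setup described above might look like the following sketch using pyngrok; the authtoken placeholder and port 6006 are assumptions, so substitute your own token and the port Phoenix actually launched on:

```python
from pyngrok import ngrok

# Paste 'Your Authtoken' from the ngrok dashboard (placeholder below).
ngrok.set_auth_token("<your-ngrok-authtoken>")

# Open an HTTP tunnel to the port Phoenix is running on (6006 assumed here).
http_tunnel = ngrok.connect(6006, "http")
print(http_tunnel.public_url)  # visit this URL to reach Phoenix

# If you hit the free-tier tunnel limit, inspect and close tunnels:
# ngrok.get_tunnels()
# ngrok.disconnect(http_tunnel.public_url)
```

Keeping Phoenix on one fixed port means this cell only needs to run once per ngrok agent session.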
2. Via SSH
This assumes you have already set up ssh on both the local machine and the remote server.
If you are accessing a remote Jupyter notebook from a local machine, you can also access the Phoenix app by forwarding a local port to the remote server via SSH. When using Phoenix on a remote server, it is recommended that you launch it on a default port, say DEFAULT_PHOENIX_PORT
.
Launch the Phoenix app from the Jupyter notebook.
In a new terminal or command prompt, forward a local port of your choice from 49152 to 65535 (say
52362
) using the command below. The remote user on the remote host must have sufficient port-forwarding/admin privileges.
If successful, visit localhost:52362 to access Phoenix locally.
If you abruptly lose access to Phoenix, check whether the SSH connection is still alive by inspecting the terminal. You can also try increasing the SSH timeout settings.
Closing the SSH tunnel:
Simply run exit
in the terminal/command prompt where you ran the port-forwarding command.
How can I configure the backend to send the data to the Phoenix UI in another container?
If you are working on an API whose endpoints perform RAG but would prefer that the Phoenix server not be launched as another thread, you can configure the environment variable PHOENIX_COLLECTOR_ENDPOINT to point to a Phoenix server running in a different process or container. See https://docs.arize.com/phoenix/environments for details.
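For example, a minimal sketch, assuming the Phoenix server is reachable at http://localhost:6006 (set the variable before any instrumentation code runs):

```python
import os

# Point instrumentation at an already-running Phoenix collector
# instead of launching the app in-process.
os.environ["PHOENIX_COLLECTOR_ENDPOINT"] = "http://localhost:6006"
```

The same variable can of course be set in the shell or container environment instead of in Python.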
Can I use an older version of LlamaIndex?
Yes, you can! You will need to be on arize-phoenix>3.0.0
and downgrade to openinference-instrumentation-llama-index<1.0.0
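The two version constraints above can be applied in a single pip command; the quotes keep the shell from interpreting the comparison operators:

```shell
pip install "arize-phoenix>3.0.0" "openinference-instrumentation-llama-index<1.0.0"
```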
Running on SageMaker
With SageMaker notebooks, Phoenix leverages jupyter-server-proxy to host the server under proxy/6006.
Note that Phoenix will automatically try to detect that you are running in SageMaker, but you can declare the notebook runtime explicitly via a parameter to launch_app
or an environment variable
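A sketch of declaring the runtime explicitly; the notebook_environment parameter name and PHOENIX_NOTEBOOK_ENV variable name are assumptions based on the description above, so verify them against the Phoenix API reference:

```python
import phoenix as px

# Declare the runtime explicitly if auto-detection fails
# (notebook_environment="sagemaker" is an assumed parameter name).
session = px.launch_app(notebook_environment="sagemaker")

# Or via an environment variable set before launching (assumed name):
# os.environ["PHOENIX_NOTEBOOK_ENV"] = "sagemaker"
```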
Can I persist data in the notebook?
You can persist data in the notebook by setting the use_temp_dir
flag to false in px.launch_app
, which will persist your data in SQLite on your disk at the PHOENIX_WORKING_DIR. Alternatively, you can deploy a Phoenix instance and point to it via PHOENIX_COLLECTOR_ENDPOINT.
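A minimal sketch of the first option; data lands in SQLite under PHOENIX_WORKING_DIR, which can be set as an environment variable before launching:

```python
import phoenix as px

# use_temp_dir=False persists traces in SQLite at PHOENIX_WORKING_DIR
# instead of a throwaway temporary directory.
session = px.launch_app(use_temp_dir=False)
```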
Can I use gRPC for trace collection?
Phoenix natively supports gRPC for trace collection as of the 4.0 release. See Self-hosting for details.