Arize
Easily share data when you discover interesting insights so your data science team can perform further investigation or kick off retraining workflows.
Oftentimes, the team that notices an issue in a model, for example a prompt/response LLM model, is not the same team that continues the investigation or kicks off retraining workflows.
To help connect teams and workflows, Phoenix enables continued analysis of production data from Arize in a notebook environment for fine-tuning workflows.
For example, a user may have noticed in Arize that a particular prompt template is not performing well.
With a few lines of Python code, users can export this data into Phoenix for further analysis. This gives team members, such as data scientists, who may not have access to production data today, an easy way to access relevant production data in an environment they are familiar with.
They can then easily augment and fine-tune the data and verify improved performance before deploying back to production.
There are two ways to export data out of Arize for further investigation:
The easiest way is to click the export button on the Embeddings and Datasets pages. This produces a code snippet that you can copy into a Python environment where Phoenix is installed. The snippet includes the date range and the datasets you have selected in the platform.
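For illustration, a minimal sketch of what such a snippet can look like once the export file is downloaded: it loads the exported dataframe into Phoenix and launches the app in the notebook. The file name and column names below are placeholders, and depending on your Phoenix version, px.Dataset may instead be named px.Inferences.

```python
import pandas as pd
import phoenix as px

# Load the dataframe exported from Arize (file name is illustrative).
df = pd.read_parquet("exported_production_data.parquet")

# Tell Phoenix which columns hold IDs, labels, and embeddings (column names are assumptions).
schema = px.Schema(
    prediction_id_column_name="prediction_id",
    timestamp_column_name="prediction_ts",
    prediction_label_column_name="prediction_label",
    actual_label_column_name="actual_label",
    embedding_feature_column_names={
        "prompt_embedding": px.EmbeddingColumnNames(
            vector_column_name="prompt_vector",
            raw_data_column_name="prompt",
        ),
    },
)

# Wrap the dataframe and launch Phoenix inside the notebook for further analysis.
ds = px.Dataset(dataframe=df, schema=schema, name="production")
px.launch_app(primary=ds)
```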
Users can also query for data directly using the Arize Python export client. We recommend doing this once you're more comfortable with the in-platform export functionality, as you will need to manually enter the date ranges and datasets you want to export.
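A minimal sketch of a direct export with the Arize Python export client, assuming placeholder space and model IDs and that your Arize developer API key is already configured for the client (for example via an environment variable):

```python
from datetime import datetime

from arize.exporter import ArizeExportClient
from arize.utils.types import Environments

# Authentication setup (API key) is assumed to be configured in your environment.
client = ArizeExportClient()

# Space ID, model ID, and the date range are entered manually; values here are placeholders.
df = client.export_model_to_df(
    space_id="YOUR_SPACE_ID",
    model_id="YOUR_MODEL_ID",
    environment=Environments.PRODUCTION,
    start_time=datetime(2023, 6, 1),
    end_time=datetime(2023, 6, 15),
)

# The resulting dataframe can then be loaded into Phoenix as in the snippet above.
print(df.head())
```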