Export Data to Notebook
Easily share data when you discover interesting insights so your data science team can perform further investigation or kick off retraining workflows.
Copyright © 2023 Arize AI, Inc
Oftentimes, the team that notices an issue in a model (for example, a prompt/response LLM model) may not be the same team that continues the investigation or kicks off retraining workflows.
To help connect teams and workflows, Arize enables continued analysis of production data in a notebook environment for fine-tuning workflows.
For example, a user may notice in Arize that a particular prompt template is not performing well. They can then easily augment and fine-tune the data and verify improved performance before deploying back to production.
There are two ways to export data for further investigation:
The easiest way is to click the export button on the Embeddings and Datasets pages. This produces a code snippet that you can copy into a Python environment after installing Phoenix. The snippet includes the date range and the datasets you have selected in the Arize platform.
Users can also query Arize for data directly using the Arize Python export client. We recommend doing this once you're comfortable with the in-platform export functionality, as you will need to manually enter the date ranges and datasets you want to export.
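As a minimal sketch of the manual route, the snippet below constructs the date range you would pass to the export client. The export call itself is shown in comments only, since it requires the `arize` package plus valid credentials, and the specific identifiers (space ID, model ID) are placeholders:

```python
from datetime import datetime, timedelta, timezone

# When querying the export client directly, you supply the date range yourself.
# A common pattern: pull the last 7 days of production data.
end_time = datetime.now(timezone.utc)
start_time = end_time - timedelta(days=7)

# Illustrative export call (requires the `arize` package and real credentials;
# the IDs below are placeholders, not working values):
# from arize.exporter import ArizeExportClient
# client = ArizeExportClient(api_key="YOUR_API_KEY")
# df = client.export_model_to_df(
#     space_id="YOUR_SPACE_ID",
#     model_id="YOUR_MODEL_ID",
#     start_time=start_time,
#     end_time=end_time,
# )
```

Because the in-platform export button fills in these ranges and dataset selections for you, starting there is an easy way to see the exact arguments the client expects before writing queries by hand.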
With a few lines of Python code, users can export this data into Phoenix or a Jupyter notebook for further analysis. This gives team members, such as data scientists who may not have access to production data today, an easy way to work with relevant production data in an environment they are familiar with.
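To make the "further analysis" step concrete, here is a small sketch of the kind of slicing a data scientist might do once exported rows are in the notebook. The records and column names (`prompt_template`, `user_feedback`) are hypothetical stand-ins for whatever your exported data actually contains; in practice the export would typically land in a pandas DataFrame:

```python
from collections import defaultdict

# Hypothetical exported rows; real exports would carry your model's
# actual columns (predictions, embeddings, feedback, etc.).
records = [
    {"prompt_template": "v1", "user_feedback": 0},
    {"prompt_template": "v1", "user_feedback": 1},
    {"prompt_template": "v2", "user_feedback": 1},
]

# Aggregate average feedback per prompt template to spot underperformers.
totals = defaultdict(lambda: [0, 0])  # template -> [feedback_sum, count]
for row in records:
    totals[row["prompt_template"]][0] += row["user_feedback"]
    totals[row["prompt_template"]][1] += 1

averages = {template: s / n for template, (s, n) in totals.items()}
# averages -> {"v1": 0.5, "v2": 1.0}
```

A result like this is what might motivate augmenting the data behind the weaker template and kicking off a fine-tuning run, as described above.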
Phoenix is Arize's open-source ML observability library designed for the notebook, helping you visualize, troubleshoot, and monitor your LLM, CV, NLP, and tabular models.