Example tutorials for using and troubleshooting Arize.
Explore what's possible with Arize in the tutorials below:
Your model type determines which performance metrics are available to you. Learn more about model types here.
Python Single Record
Binary Classification (Only Classification Metrics)
Binary Classification (Classification, AUC/Log Loss Metrics)
Binary Classification (Classification, AUC/Log Loss, Regression)
Multiclass Classification (Only Classification Metrics)
Multiclass Classification (Classification, AUC/Log Loss Metrics)
Ranking with Relevance Score
Ranking with Single Label
Ranking with Multiple Labels
NLP Named Entity Recognition (NER)
Tabular Classification w/ Embeddings
Large Language Models (LLMs)
Examples for logging explainability metrics. Click here for more information on how to log feature importance and use explainability.
SHAP: Guide to Getting Started
SHAP: Neural Network on Tabular Data
Surrogate Model Explainability
One Hot Encoding Decomposition
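As a plain-Python illustration of the one-hot encoding decomposition idea (all feature and column names below are hypothetical, not taken from the Arize SDK): when a categorical feature is one-hot encoded before modeling, the per-dummy-column importance values (e.g. SHAP values) can be summed back into a single attribution for the original feature.

```python
from collections import defaultdict

def decompose_one_hot(importances, separator="_"):
    """Sum per-dummy-column importance values back into one
    attribution per original (pre-encoding) feature.

    Assumes dummy columns are named '<feature>_<category>'
    (e.g. 'color_red', 'color_blue' both map back to 'color');
    columns without the separator are passed through unchanged.
    """
    totals = defaultdict(float)
    for column, value in importances.items():
        base = column.rsplit(separator, 1)[0] if separator in column else column
        totals[base] += value
    return dict(totals)

# SHAP values for a single prediction (hypothetical numbers).
shap_values = {
    "color_red": 0.10,
    "color_blue": -0.04,
    "color_green": 0.02,
    "age": 0.30,
}
decomposed = decompose_one_hot(shap_values)
print({k: round(v, 2) for k, v in decomposed.items()})
```

The suffix-splitting convention is an assumption for the sketch; in practice you would carry an explicit mapping from dummy columns to their source feature.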
Tutorials on how to log predictions, actuals, and feature importance.
Logging Predictions Only
Logging Predictions First, Then Logging Delayed Actuals
Logging Predictions First, Then Logging SHAPs After
Logging Predictions and Actuals Together
Logging Predictions and SHAP Together
Logging Predictions, Actuals, and SHAP Together
Logging PySpark DataFrames
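The delayed-actuals pattern above can be sketched without any SDK: predictions are recorded first under a stable prediction ID, and ground-truth labels that arrive later are matched back on that same ID. The function and field names here are illustrative only, not the Arize API.

```python
# Minimal sketch of logging predictions first, then delayed actuals.
# (Illustrative only -- not the Arize SDK interface.)

predictions = {}  # prediction_id -> predicted label

def log_prediction(prediction_id, predicted_label):
    """Record a prediction under a stable, unique ID."""
    predictions[prediction_id] = predicted_label

def log_actual(prediction_id, actual_label):
    """Pair a late-arriving actual with its earlier prediction."""
    if prediction_id not in predictions:
        raise KeyError(f"no prediction logged for id {prediction_id!r}")
    return {
        "prediction_id": prediction_id,
        "predicted": predictions[prediction_id],
        "actual": actual_label,
        "correct": predictions[prediction_id] == actual_label,
    }

# Day 1: the model serves a prediction.
log_prediction("order-123", "fraud")
# Day 30: the true outcome becomes known and is joined back.
record = log_actual("order-123", "not_fraud")
print(record["correct"])  # False
```

The same join-on-ID idea underlies the delayed-SHAP workflow: any later payload (actuals or explainability values) is keyed by the original prediction ID.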
Arize integrates with platforms across the MLOps toolchain. Don't see a platform you use? Reach out to add yours or ask our team to help!
MLOps platform with APIs to serve, host, and manage models
Azure ML & Databricks
Using Arize in an Azure ML Databricks workflow
Use Bento’s ML service platform to turn ML models into production-worthy prediction services
Integrate Arize into the CI/CD workflow - Run checks on every new model version
Deepnote is a Data Science Collaboration Platform
Monitor and troubleshoot data inconsistency issues between feature stores and Arize.
Google Cloud ML (Vertex AI)
Integrate Arize with Vertex AI
Available on Request
Use Arize to monitor embeddings generated from Hugging Face NLP or Transformer models
Use the Arize Pandas SDK to consume micro-batches of predictions
Effectively monitor the performance of your LLM agents
Integrate Arize and MLflow to track models across experimentation and deployment
Integrate Arize on models built using Neptune
Build models on unstructured data with OpenAI
Integrate Arize on models built using Paperspace
Log Spark DataFrames with Arize
Ray Serve (Anyscale)
Arize can be integrated with Ray Serve at a single entry point
Combine Spell model servers with Arize model observability
The Arize platform integrates with UbiOps to enable model observability, explainability, and monitoring.
Weights & Biases
Integrate Arize and W&B to track models across experimentation and deployment