Customer Lifetime Value

Overview of how to use Arize for customer lifetime value models

Check out our Customer Lifetime Value Colab to see how you can leverage ML Observability for your models!
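Before configuring anything in the platform, your LTV model's predictions and actuals need to be logged to Arize. Below is a minimal sketch (not taken from the Colab) using the Arize pandas SDK with a hypothetical dataframe and placeholder credentials; exact parameter names and enum values can differ between SDK versions, so refer to the Colab for the exact calls it uses.

```python
# Minimal sketch of logging LTV predictions and actuals to Arize with the
# pandas SDK. The dataframe, column names, and credentials are hypothetical,
# and parameter names may differ by SDK version.
import pandas as pd
from arize.pandas.logger import Client
from arize.utils.types import Environments, ModelTypes, Schema

# Hypothetical batch of scored customers with observed lifetime values.
df = pd.DataFrame({
    "prediction_id": ["cust_001", "cust_002"],
    "predicted_ltv": [512.0, 87.5],
    "actual_ltv": [480.0, 120.0],
    "internet_service": ["Fiber Optic", "DSL"],
    "streaming_tv": ["Yes", "No"],
    "city": ["San Diego", "Austin"],
})

schema = Schema(
    prediction_id_column_name="prediction_id",
    prediction_label_column_name="predicted_ltv",
    actual_label_column_name="actual_ltv",
    feature_column_names=["internet_service", "streaming_tv", "city"],
)

client = Client(space_key="YOUR_SPACE_KEY", api_key="YOUR_API_KEY")
response = client.log(
    dataframe=df,
    model_id="customer-lifetime-value",
    model_version="1.0",
    model_type=ModelTypes.NUMERIC,       # LTV is a regression (numeric) model
    environment=Environments.PRODUCTION,
    schema=schema,
)
```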

Set up a Baseline and Monitors

In just a few clicks, Arize automatically configures monitors that are best suited to your data to proactively detect drift, data quality, and performance issues.

  1. Datasets: Training Version 1.0

  2. Default Metric: RMSE, Trigger Alert When: RMSE is above 80

  3. Turn On Monitoring: Drift ✅, Data Quality ✅, Performance ✅
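For intuition, the performance monitor configured above behaves like the following check, sketched here with NumPy on hypothetical prediction and actual arrays; Arize computes this for you over the data you log.

```python
# Sketch of the alert condition behind the performance monitor above:
# trigger when RMSE on recent production data rises above 80.
import numpy as np

def rmse(predicted: np.ndarray, actual: np.ndarray) -> float:
    """Root mean squared error between predicted and actual LTV."""
    return float(np.sqrt(np.mean((predicted - actual) ** 2)))

# Hypothetical recent production window.
predicted_ltv = np.array([512.0, 87.5, 230.0, 1040.0])
actual_ltv = np.array([480.0, 120.0, 310.0, 900.0])

RMSE_THRESHOLD = 80.0
current_rmse = rmse(predicted_ltv, actual_ltv)
if current_rmse > RMSE_THRESHOLD:
    print(f"ALERT: RMSE {current_rmse:.1f} is above {RMSE_THRESHOLD}")
```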

Exploring model and feature drift

Visualize feature and model drift between various model environments and versions to identify LTV patterns and anomalous distribution behavior. Arize provides drift-over-time widgets overlaid with your metric of choice (in our case, RMSE) so you can clearly determine whether drift is contributing to performance degradation.
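Under the hood, a drift widget compares a feature's production distribution against its baseline. As a rough illustration, here is a sketch of one common drift statistic, population stability index (PSI), computed with NumPy over hypothetical baseline and production samples; Arize supports several drift metrics, and the one used in the Colab may differ.

```python
# Sketch of population stability index (PSI), one common drift statistic,
# comparing a production feature distribution against a training baseline.
import numpy as np

def psi(baseline: np.ndarray, production: np.ndarray, bins: int = 10) -> float:
    """PSI between two numeric samples using baseline-derived quantile bins."""
    edges = np.quantile(baseline, np.linspace(0, 1, bins + 1))
    # Assign each value to a bin using the interior edges, so out-of-range
    # production values fall into the first or last bin.
    base_idx = np.digitize(baseline, edges[1:-1])
    prod_idx = np.digitize(production, edges[1:-1])
    eps = 1e-6  # avoid log(0) / division by zero
    base_pct = np.bincount(base_idx, minlength=bins) / len(baseline) + eps
    prod_pct = np.bincount(prod_idx, minlength=bins) / len(production) + eps
    return float(np.sum((prod_pct - base_pct) * np.log(prod_pct / base_pct)))

# Hypothetical numeric feature whose production distribution has shifted upward.
rng = np.random.default_rng(0)
baseline = rng.normal(70, 20, 5000)
production = rng.normal(85, 25, 5000)
print(f"PSI: {psi(baseline, production):.3f}")  # larger values indicate more drift
```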

With the insights Arize provides, you can dive deep into root causes and quickly build intuition, allowing ML teams to iterate, experiment, and ship new models to production faster.

Analyzing root cause for low-performing cohorts

Arize enables automatic drill-down into low-performing slices (feature/value combinations) through the Feature Performance Heatmap.

Pay close attention to feature/value combinations that may indicate where the model is underperforming. In the LTV example from the Colab, we used filters on the Feature Performance Heatmap to narrow in on Fiber Optic for Internet Service and Yes for Streaming TV across Cities.
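Conceptually, the heatmap computes a performance metric for every feature/value slice. The following pandas sketch, on a hypothetical dataframe of predictions and actuals, reproduces that drill-down for the slice highlighted above.

```python
# Sketch of the drill-down behind the Feature Performance Heatmap:
# compute RMSE per feature/value slice on a hypothetical dataframe.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "internet_service": ["Fiber Optic", "Fiber Optic", "DSL", "Fiber Optic", "DSL"],
    "streaming_tv": ["Yes", "Yes", "No", "No", "Yes"],
    "city": ["San Diego", "Austin", "Austin", "San Diego", "Dallas"],
    "predicted_ltv": [512.0, 640.0, 150.0, 410.0, 220.0],
    "actual_ltv": [380.0, 510.0, 160.0, 405.0, 260.0],
})

def rmse(group: pd.DataFrame) -> float:
    return float(np.sqrt(np.mean((group["predicted_ltv"] - group["actual_ltv"]) ** 2)))

# RMSE for every Internet Service / Streaming TV combination,
# worst-performing slices first.
slice_rmse = (
    df.groupby(["internet_service", "streaming_tv"])
      .apply(rmse)
      .sort_values(ascending=False)
)
print(slice_rmse)

# Narrow in on the low-performing slice, broken out by city.
low_slice = df[(df["internet_service"] == "Fiber Optic") & (df["streaming_tv"] == "Yes")]
print(low_slice.groupby("city").apply(rmse))
```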

Model Performance Dashboard

As we continue to check in on and improve our model's performance, we want to view all of our important model metrics quickly and efficiently in a single pane. Use our Regression Model Performance Dashboard to set up a customizable dashboard that gives a single-glance view of your model's most important metrics.

In the case of LTV, we look at the model's RMSE, MAPE, and MAE values.

In only a few clicks, we can visualize these metrics over time by creating a time series widget and overlaying three plots, one per metric, to showcase how the metrics fluctuate.
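As a rough equivalent outside the platform, the sketch below computes RMSE, MAPE, and MAE per day with pandas and scikit-learn on a hypothetical dataframe of timestamped predictions, which is essentially what the overlaid time series plots display.

```python
# Sketch of the dashboard's time series widget: RMSE, MAPE, and MAE per day
# on a hypothetical dataframe of timestamped predictions and actuals.
import numpy as np
import pandas as pd
from sklearn.metrics import (
    mean_absolute_error,
    mean_absolute_percentage_error,
    mean_squared_error,
)

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "timestamp": pd.date_range("2023-01-01", periods=n, freq="h"),
    "actual_ltv": rng.uniform(50, 1000, n),
})
df["predicted_ltv"] = df["actual_ltv"] + rng.normal(0, 75, n)  # noisy predictions

def daily_metrics(group: pd.DataFrame) -> pd.Series:
    y_true, y_pred = group["actual_ltv"], group["predicted_ltv"]
    return pd.Series({
        "rmse": np.sqrt(mean_squared_error(y_true, y_pred)),
        "mape": mean_absolute_percentage_error(y_true, y_pred),
        "mae": mean_absolute_error(y_true, y_pred),
    })

metrics_over_time = df.groupby(df["timestamp"].dt.date).apply(daily_metrics)
print(metrics_over_time)
# metrics_over_time.plot() would overlay the three metrics, similar to the widget.
```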

By visualizing feature drift and model performance, and understanding the features responsible, ML engineers gain additional context when troubleshooting model performance issues. Possible action items that could improve the model's performance:

  1. Examine possible concept drift relating to the features in question

  2. Retrain the model to fit the new distributions specific to this drift

Resources

Check out our Customer Lifetime Value Colab to see how you can leverage ML Observability for your models!
