utils.types.Metrics

Arize enum for grouping metrics when validating a schema column in a log() call.

| Use Case | SDK Metric | Metric |
| --- | --- | --- |
| Regression | Metrics.REGRESSION | MAPE, MAE, RMSE, MSE, R-Squared, Mean Error |
| Classification | Metrics.CLASSIFICATION | Accuracy, Recall, Precision, FPR, FNR, F1, Sensitivity, Specificity |
| Ranking | Metrics.RANKING | NDCG |
| AUC & LogLoss | Metrics.AUC_LOG_LOSS | AUC, PR-AUC, Log Loss |
| Ranking Label | Metrics.RANKING_LABEL | GroupAUC, MAP, MRR |
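
The SDK Metric column lists the members of the Metrics enum. As a minimal sketch (assuming Metrics behaves like a standard Python Enum, which is iterable), you can list the available groups like this:

from arize.utils.types import Metrics

# Print every metric group available for validation,
# e.g. REGRESSION, CLASSIFICATION, RANKING, AUC_LOG_LOSS, RANKING_LABEL.
for group in Metrics:
    print(group.name)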

Method

repr()

To view the metrics covered by a group, pass your desired SDK Metric from the table above to repr().

repr(Metrics.[SDK Metric])
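
For example, a minimal sketch (the exact repr output may vary by SDK version):

from arize.utils.types import Metrics

# Show the metrics covered by the CLASSIFICATION group,
# e.g. Accuracy, Recall, Precision, FPR, FNR, F1, Sensitivity, Specificity.
print(repr(Metrics.CLASSIFICATION))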

Code Example

from arize.utils.types import Metrics

# arize_client is an initialized Arize logging client; other required
# log() arguments are elided here.
response = arize_client.log(
    model_id='sample-binary-classification-model',
    ...
    metrics_validation=[Metrics.CLASSIFICATION]
)

response = arize_client.log(
    model_id='sample-regression-model',
    ...
    metrics_validation=[Metrics.REGRESSION]
)

response = arize_client.log(
    model_id='sample-ranking-model',
    ...
    metrics_validation=[Metrics.RANKING, Metrics.RANKING_LABEL]
)
