Ray Serve (Anyscale)
Arize helps you visualize your model performance, understand drift & data quality issues, and share insights learned from your models. Ray Serve is a framework-agnostic, scalable model serving library built on Ray.
$ pip install arize
$ pip install 'ray[serve]'

Arize Integration in 3 Steps

Arize integrates with Ray Serve at a single entry point: the class decorated with @serve.deployment. The integration takes three simple steps: (1) import the Arize Client and save it as an instance attribute of the model, (2) save the model metadata needed for logging as instance attributes, and (3) log production data with the Arize client inside the HTTP request handler, where the model makes its predictions in production.
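Step 3 relies on one unique prediction ID per logged row. A minimal sketch of generating such IDs with the standard library (the three-row count is illustrative):

```python
import uuid

# One collision-resistant ID per prediction row (row count is illustrative)
prediction_ids = [str(uuid.uuid4()) for _ in range(3)]

print(len(prediction_ids))  # → 3
assert len(set(prediction_ids)) == 3  # IDs are unique
```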
See below for a quick start example.

Quick Start Example

from ray import serve

import numpy as np
import pandas as pd
from sklearn import datasets
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

import uuid
import requests
import concurrent.futures as cf
from arize.api import Client
from arize.types import ModelTypes

data = datasets.load_breast_cancer()
X, y = datasets.load_breast_cancer(return_X_y=True)
X, y = X.astype(np.float32), y.astype(int)
X, y = pd.DataFrame(X, columns=data['feature_names']), pd.Series(y)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)
model = RandomForestClassifier().fit(X_train, y_train)

# Integration starts here
@serve.deployment(name='ArizeModel')
class ArizeModel:
    """
    Ray Serve and Arize quick-start integration model
    """
    def __init__(self):
        self.model = model  # change to reading a pkl file, or otherwise
        # Step 1: Save the Arize client
        self.arize = Client(space_key='YOUR_SPACE_KEY',
                            api_key='YOUR_API_KEY')
        # Step 2: Save model metadata for passing in later
        self.model_id = 'rayserve-model'
        self.model_version = '1.0'
        self.model_type = ModelTypes.BINARY

    async def __call__(self, starlette_request):
        payload = await starlette_request.json()
        # Reload the request payload into a DataFrame
        X_test = pd.read_json(payload)
        y_pred = self.model.predict(X_test)

        # Step 3: Log production data to Arize
        ids_df = pd.DataFrame([str(uuid.uuid4()) for _ in range(len(X_test))])
        log_responses = self.arize.bulk_log(
            model_id=self.model_id,
            prediction_ids=ids_df,
            model_version=self.model_version,
            prediction_labels=pd.Series(y_pred),
            features=X_test,
            model_type=self.model_type,
        )

        # Record the HTTP responses of logging to Arize
        arize_success = True
        for response in cf.as_completed(log_responses):
            status_code = response.result().status_code
            arize_success = arize_success and status_code == 200

        # Return production inferences and Arize logging results
        return {'result': y_pred.tolist(),
                'arize-successful': arize_success}
If using version < 4.0.0, replace space_key=YOUR_SPACE_KEY with organization_key=YOUR_SPACE_KEY
With the model defined as above, we can deploy it with Ray Serve and monitor its production traffic in Arize by running the following code.
serve.start()
# Model deployment
ArizeModel.deploy()

# Simulate production setting
data_json = X_test.to_json()
response = requests.get(
    "http://localhost:8000/ArizeModel", json=data_json
)
# Display results
print(response.text)
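The response body can be checked to confirm that every Arize logging call succeeded. A minimal sketch of parsing it with the standard library (the sample string stands in for `response.text`; the values, and the key names matching the dictionary returned by `__call__`, are illustrative):

```python
import json

# Sample body in the shape the deployment returns (values are illustrative)
body = '{"result": [1, 0, 1], "arize-successful": true}'
payload = json.loads(body)

print(payload["arize-successful"])  # → True
print(payload["result"])            # → [1, 0, 1]
```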