Hugging Face
Arize helps you visualize your model performance, understand drift & data quality issues, and share insights learned from your models. Hugging Face is a library that offers on-demand models in its Model Hub as well as APIs for fine-tuning NLP models and serving them directly from Hugging Face.

Inference API Integration

The Hugging Face Inference API gives you access to public models as well as models you have uploaded to the Model Hub. Depending on the task, Arize can be integrated in production directly inside the query function or within your model's pipeline.
Integration can be done in 4 simple steps: (1) set up your Arize and Hugging Face API/SPACE keys, (2) process outputs (and/or features), (3) log to Arize, and (4) reformat and return outputs.
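For reference, here is a minimal sketch of the second integration point, a local transformers pipeline, whose output can be processed and logged with the same Steps 2-4. The model name and candidate labels simply mirror the worked Inference API example below; this is illustrative, not the example itself.

from transformers import pipeline

# Local pipeline alternative to the Inference API query function.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
result = classifier(
    "I want my money back",
    candidate_labels=["refund", "legal", "faq"],
)
# result["labels"] and result["scores"] can then be processed and
# logged to Arize exactly as in Steps 2 and 3 of the example below.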

Fine-Tune a Text Classification Model Example

Please follow this notebook tutorial, where we walk you through working with the Hugging Face ecosystem to fine-tune a pre-trained language model for a text classification task. In addition, we extract text embedding vectors and send them to Arize, where we leverage our embedding tools to learn about and troubleshoot the dataset.
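As a rough illustration of the embedding-extraction step, one common approach is to take the first ([CLS]) token's last hidden state from a transformers model. This is a minimal sketch assuming the transformers and torch packages and an arbitrary checkpoint; the notebook is the authoritative walkthrough.

import torch
from transformers import AutoTokenizer, AutoModel

# Any pre-trained checkpoint works here; distilbert-base-uncased is just an example.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModel.from_pretrained("distilbert-base-uncased")

def extract_embedding(text):
    # Tokenize and run a forward pass without tracking gradients.
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = model(**inputs)
    # Use the first token's ([CLS]) last hidden state as the text embedding vector.
    return outputs.last_hidden_state[0, 0, :].numpy()

embedding = extract_embedding("I recently bought a device from your company")
print(embedding.shape)  # e.g. (768,)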

Zero-shot Text Classification Example

For a zero-shot text classification problem, we log to Arize under ModelTypes.SCORE_CATEGORICAL, since we want to record both the class label and the probability score.
Depending on the specific NLP task and model pipeline, your response will be formatted differently. You may need to update Step 2 (Processing Outputs) to match your model's output and Step 3 (Logging to Arize) to match the specific NLP task's goals; a sketch of such an adaptation appears at the end of this section.
import json
import uuid

import numpy as np
import requests

from arize.api import Client
from arize.types import ModelTypes

# Step 1: Set up Arize and Hugging Face API/SPACE keys and tokens
ARIZE_SPACE_KEY = 'YOUR_ARIZE_SPACE_KEY'
ARIZE_API_KEY = 'YOUR_ARIZE_API_KEY'
arize = Client(space_key=ARIZE_SPACE_KEY, api_key=ARIZE_API_KEY)

API_URL = "https://api-inference.huggingface.co/models/facebook/bart-large-mnli"
YOUR_HUGGINGFACE_API_KEY = 'YOUR_HUGGINGFACE_API_KEY'

headers = {"Authorization": "Bearer {}".format(YOUR_HUGGINGFACE_API_KEY)}

def query(payload):
    # Step 1: Standard request to the Hugging Face Inference API
    data = json.dumps(payload)
    response = requests.request("POST", API_URL, headers=headers, data=data)
    output = json.loads(response.content.decode("utf-8"))

    # Step 2: Process output (and/or features) for logging to Arize
    idx = np.argmax(output['scores'])
    prediction, score = output['labels'][idx], output['scores'][idx]

    # Optional: derive model features to log to Arize with your own helper
    # (a hypothetical feature_pipeline sketch follows this code block).
    # features = feature_pipeline(data)

    # Step 3: Log to Arize
    arize_response = arize.log(
        model_id='facebook/bart-large-mnli',
        model_version='1.0',
        model_type=ModelTypes.SCORE_CATEGORICAL,
        prediction_id=str(uuid.uuid4()),
        prediction_label=(prediction, score),
    )
    arize_success = arize_response.result().status_code == 200

    # Step 4: Return the formatted output
    return {'prediction': prediction,
            'score': score,
            'arize-success': arize_success}
If using version < 4.0.0, replace space_key=ARIZE_SPACE_KEY with organization_key=ARIZE_SPACE_KEY
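The feature_pipeline helper referenced (and commented out) in the snippet above is a user-defined function, not part of the Arize or Hugging Face SDKs. A hypothetical version might derive simple features from the serialized request payload, for example:

import json

# Hypothetical feature_pipeline: purely illustrative, define it however suits
# your model. It receives the JSON-serialized payload from the query function.
def feature_pipeline(data):
    payload = json.loads(data)
    text = payload["inputs"]
    # Derive simple, loggable features from the raw input text.
    return {
        "text_length": len(text),
        "word_count": len(text.split()),
        "num_candidate_labels": len(payload["parameters"]["candidate_labels"]),
    }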
On the client side in production, results can be queried by calling the function directly:
data = query(
    {
        "inputs": """
            I recently bought a device from your company
            but it is not working as advertised and
            I would like to get reimbursed!
        """,
        "parameters": {"candidate_labels": ["refund", "legal", "faq"]},
    }
)

>>> {'prediction': 'refund', 'score': 0.8680453300476074, 'arize-success': True}
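As an example of adapting Step 2 to a different NLP task: a standard text classification model served through the Inference API typically returns a list of {label, score} dictionaries per input rather than the parallel labels/scores arrays of the zero-shot task, so the processing lines change accordingly. This is a sketch; the response shape shown is an assumption, so check your model's actual output. Step 3 stays the same.

# Step 2 (variant) for a standard text classification model.
# Sample response, hard-coded here for illustration:
output = [[{"label": "POSITIVE", "score": 0.93},
           {"label": "NEGATIVE", "score": 0.07}]]

predictions = output[0]  # results for the first (and only) input
best = max(predictions, key=lambda p: p["score"])
prediction, score = best["label"], best["score"]
print(prediction, score)  # POSITIVE 0.93
# Step 3 is unchanged: log (prediction, score) under ModelTypes.SCORE_CATEGORICAL.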