Groq

Instrument LLM applications built with Groq

In this example, we will instrument an LLM application built using Groq.

pip install openinference-instrumentation-groq groq arize-otel

Set up the GroqInstrumentor to trace calls to the Groq LLM in your application and send the traces to an Arize model endpoint, as configured below.

from openinference.instrumentation.groq import GroqInstrumentor
# Import open-telemetry dependencies
from arize_otel import register_otel, Endpoints

# Setup OTEL via our convenience function
register_otel(
    endpoints = Endpoints.ARIZE,
    space_id = "your-space-id", # in app space settings page
    api_key = "your-api-key", # in app space settings page
    model_id = "your-model-id", # name this to whatever you would like
    model_version = "your-model-version" # the version of the model e.g. 'v1'
)

GroqInstrumentor().instrument()

Run a simple Chat Completion via Groq and see it instrumented

import os
from groq import Groq

# Get your Groq API key by visiting https://groq.com/
os.environ["GROQ_API_KEY"] = "your-groq-api-key"

client = Groq()

# send a request to the groq client
chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Explain the importance of low latency LLMs",
        }
    ],
    model="mixtral-8x7b-32768",
)
print(chat_completion.choices[0].message.content)
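The chat_completion.choices[0].message.content access above follows the OpenAI-compatible chat completion schema that the Groq SDK returns. As a rough sketch of that shape, here plain dicts stand in for the SDK's typed response objects (an assumption for illustration, not the SDK's actual classes):

```python
# Illustrative shape of an OpenAI-compatible chat completion response.
# Plain dicts are used here in place of the SDK's typed objects.
chat_completion = {
    "model": "mixtral-8x7b-32768",
    "choices": [
        {
            "index": 0,
            "message": {
                "role": "assistant",
                "content": "Low latency matters because...",
            },
        }
    ],
}

# Same navigation as the SDK example, via dict keys instead of attributes.
content = chat_completion["choices"][0]["message"]["content"]
print(content)
```

Each entry in choices carries one candidate completion, so requesting a single completion means reading index 0, exactly as the SDK example does.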


Copyright © 2023 Arize AI, Inc