
Integration: Azure OpenAI

Integrating an Azure OpenAI application is straightforward with the openllmtelemetry package.

First, you need to initialize the instrumentation:

import openllmtelemetry

openllmtelemetry.instrument()

Once this is done, all of your Azure OpenAI interactions will be automatically traced. If you have rulesets enabled for blocking in your WhyLabs Secure policy, the library will block requests accordingly.

import os

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_version="2023-12-01-preview",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
)

response = client.chat.completions.create(
    model="custom-openai-deployment",
    messages=[
        {
            "role": "system",
            "content": "You will be provided with a tweet, and your task is to classify its sentiment as positive, neutral, or negative."
        },
        {
            "role": "user",
            "content": "I love lasagna!"
        }
    ],
    temperature=0.7,
    max_tokens=64,
    top_p=1
)