LLM Monitoring not working for Async OpenAI requests #3494

@AltafHussain4748

Description

Problem Statement

I just experimented with LLM monitoring and could not make it work with AsyncOpenAI.

Below is my code.

My Sentry init is:

sentry_sdk.init(
    dsn=os.environ.get("SENTRY_DSN"),
    integrations=[sentry_logging],
    environment=os.environ.get("ENVIRONMENT", "prod"),
    send_default_pii=True,
)
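For comparison, a minimal init sketch (assuming the OpenAI integration shipped with sentry-sdk 2.x): the init above never sets `traces_sample_rate`, and without a sample rate no transactions, and therefore no AI spans, are ever sent to Sentry:

```python
import os

import sentry_sdk
from sentry_sdk.integrations.openai import OpenAIIntegration

sentry_sdk.init(
    dsn=os.environ.get("SENTRY_DSN"),
    environment=os.environ.get("ENVIRONMENT", "prod"),
    send_default_pii=True,
    # Without this (or traces_sampler), no transactions are recorded,
    # so nothing appears under LLM monitoring.
    traces_sample_rate=1.0,
    integrations=[
        # Auto-enabled when the openai package is installed; listed
        # explicitly here. include_prompts controls whether message
        # contents are attached to the spans.
        OpenAIIntegration(include_prompts=True),
    ],
)
```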

The function I am using:

@ai_track("Tracking Name")
@async_retry(retries=4)
async def func():
    client = AsyncOpenAI()
    with sentry_sdk.start_transaction(op="ai-inference", name="Structured Data Prompt"):
        response = await client.chat.completions.create(
            model=model,
            messages=messages,
            functions=functions,
            temperature=0.0,
            timeout=120,
        )
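For reference, the Sentry docs start the transaction around the `@ai_track`-decorated call rather than inside it, so the `ai.pipeline` span the decorator creates has a transaction to attach to. A sketch of that shape (the model name and `messages` are placeholders, and the retry decorator is omitted):

```python
import sentry_sdk
from sentry_sdk.ai.monitoring import ai_track
from openai import AsyncOpenAI


@ai_track("Tracking Name")
async def structured_data_prompt(messages):
    # The OpenAI integration instruments this call; with
    # send_default_pii enabled, prompts/responses land on the span.
    client = AsyncOpenAI()
    return await client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=messages,
        temperature=0.0,
        timeout=120,
    )


async def main(messages):
    # Transaction is opened *before* the decorated coroutine runs,
    # so the ai.pipeline span nests under it instead of floating free.
    with sentry_sdk.start_transaction(
        op="ai-inference", name="Structured Data Prompt"
    ):
        return await structured_data_prompt(messages)
```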

Versions

sentry-sdk==2.13.0
openai==1.37.1

Solution Brainstorm

No response

Product Area

Insights
