Python: [Bug]: Duplicate LLM Telemetry Emission #4675

@sphenry

Description

The same gen_ai.response.id is emitted across multiple spans/services, inflating token and cost accounting in Application Insights. Usage dashboards and cost assertions require manual de-dup logic; raw queries overcount by roughly 2x.
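The manual de-dup logic mentioned above can be sketched in Python. This is a minimal sketch, assuming each telemetry record is a flat dict keyed by `gen_ai.*` attribute names; the token-count key names are illustrative, not the exact Application Insights schema:

```python
def dedup_usage(records):
    """Total token usage while counting each gen_ai.response.id once.

    `records` is a list of flat dicts; the usage key names below are
    assumptions for illustration, not a confirmed schema.
    """
    seen = set()
    totals = {"input_tokens": 0, "output_tokens": 0}
    for rec in records:
        rid = rec.get("gen_ai.response.id")
        if rid in seen:
            continue  # duplicate emission of the same response
        seen.add(rid)
        totals["input_tokens"] += rec.get("gen_ai.usage.input_tokens", 0)
        totals["output_tokens"] += rec.get("gen_ai.usage.output_tokens", 0)
    return totals
```

With two spans carrying the same response id, the totals count that response's tokens once instead of twice.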

The current workaround is to designate `responsesapi` as the canonical source for usage telemetry.
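One way to apply that workaround on the query/ingest side, sketched under the assumption that each record carries a `source` field naming the emitting component (a hypothetical key, not a confirmed schema):

```python
def keep_canonical(records, canonical="responsesapi"):
    """For each gen_ai.response.id, prefer the record from the canonical
    source; fall back to the first record seen when none exists."""
    chosen = {}
    for rec in records:
        rid = rec.get("gen_ai.response.id")
        prev = chosen.get(rid)
        if prev is None or (
            rec.get("source") == canonical and prev.get("source") != canonical
        ):
            chosen[rid] = rec
    return list(chosen.values())
```

The fallback matters: dropping every non-`responsesapi` record outright would lose responses that only ever surfaced through another span.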

Code Sample

Error Messages / Stack Traces

Package Versions

python-1.0.0rc4

Python Version

No response

Additional Context

No response

Metadata

Labels

bug (Something isn't working), python

Projects

Status

In Review

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests