
Overview

respan-instrumentation-openai-agents is an instrumentation plugin that captures traces from the OpenAI Agents SDK and sends them to Respan via the unified OTEL pipeline. It registers a TracingProcessor with the OpenAI Agents SDK that converts native SDK trace/span events into OTEL ReadableSpan objects with proper traceloop.* and gen_ai.* attributes.
pip install respan-instrumentation-openai-agents
Version: 1.0.0 | Python: >=3.11, <3.14

Dependencies

Package          Version
respan-tracing   >=2.3.0
respan-sdk       >=2.5.0
openai-agents    >=0.3.1

Quick start

from respan import Respan
from respan_instrumentation_openai_agents import OpenAIAgentsInstrumentor

respan = Respan(
    api_key="your-api-key",
    instrumentations=[OpenAIAgentsInstrumentor()],
)

# Now all OpenAI Agents SDK traces are automatically captured
from agents import Agent, Runner

agent = Agent(name="my-agent", instructions="You are a helpful assistant.")

# Runner.run() is a coroutine; outside an event loop, use Runner.run_sync().
result = Runner.run_sync(agent, "Hello!")
print(result.final_output)

Public API

OpenAIAgentsInstrumentor

The main instrumentor class. Implements the Instrumentation protocol.
from respan_instrumentation_openai_agents import OpenAIAgentsInstrumentor
Attribute/Method   Type         Description
name               str          "openai-agents" — unique plugin identifier.
activate()         () -> None   Registers a TracingProcessor with the OpenAI Agents SDK via set_trace_processors().
deactivate()       () -> None   Clears the processor reference.
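The Instrumentation protocol itself is not shown in these docs; as a rough sketch, the table above implies a shape like the following (the protocol name and exact members are assumptions for illustration):

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class Instrumentation(Protocol):
    """Hypothetical sketch of the protocol an instrumentor implements."""

    def activate(self) -> None: ...
    def deactivate(self) -> None: ...


class NoOpInstrumentor:
    """Minimal class satisfying the sketched protocol; does nothing useful."""

    name = "no-op"  # unique plugin identifier, mirroring the table above

    def __init__(self) -> None:
        self.active = False

    def activate(self) -> None:
        self.active = True

    def deactivate(self) -> None:
        self.active = False
```

Any object with these methods can be passed in the `instrumentations=[...]` list, which is what makes plugins like OpenAIAgentsInstrumentor pluggable.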

activate()

When called, activate():
  1. Creates an internal _RespanTracingProcessor instance.
  2. Calls agents.tracing.set_trace_processors([processor]) to replace the default OpenAI tracing backend.
  3. All subsequent traces and spans from the OpenAI Agents SDK are routed through Respan instead of OpenAI’s tracing endpoint.
set_trace_processors() replaces all existing processors. If you need OpenAI’s native tracing alongside Respan, you will need a custom setup.
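One possible custom setup is to register processors additively with the SDK's add_trace_processor() instead of letting activate() replace the list. Since _RespanTracingProcessor is internal, the sketch below uses a stand-in class whose method names follow the OpenAI Agents SDK TracingProcessor interface; everything else is illustrative:

```python
class CollectingProcessor:
    """Stand-in trace processor that collects finished items in memory
    instead of exporting them (illustrative, not the plugin's class)."""

    def __init__(self) -> None:
        self.finished = []

    def on_trace_start(self, trace) -> None:
        pass  # called when a trace begins

    def on_trace_end(self, trace) -> None:
        self.finished.append(trace)  # called when a trace completes

    def on_span_start(self, span) -> None:
        pass

    def on_span_end(self, span) -> None:
        self.finished.append(span)

    def shutdown(self) -> None:
        pass

    def force_flush(self) -> None:
        pass


# To keep OpenAI's native tracing alongside a custom processor, register it
# additively, e.g. agents.tracing.add_trace_processor(CollectingProcessor()),
# rather than calling set_trace_processors(), which replaces the whole list.
```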

deactivate()

Clears the internal processor reference. Does not restore the original OpenAI processors.

Captured span types

The plugin captures all span types from the OpenAI Agents SDK:
SDK Span Type        OTEL Span Kind   Log Type     Description
Trace                workflow         workflow     Root trace span (the overall agent run).
AgentSpanData        agent            agent        Agent execution span. Includes agent name, tools, and handoffs.
ResponseSpanData     task             response     LLM response (Responses API call). Includes model, tokens, input/output.
GenerationSpanData   task             generation   LLM generation (Chat Completions API call). Includes model, tokens, input/output.
FunctionSpanData     tool             tool         Tool/function call execution. Includes function name, input, output.
HandoffSpanData      task             handoff      Agent-to-agent handoff. Includes from/to agent names.
GuardrailSpanData    task             guardrail    Guardrail check execution. Includes guardrail name and triggered status.
CustomSpanData       task             custom       Custom user-defined span data.
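The mapping above amounts to a small dispatch table from SDK span-data type to (OTEL span kind, log type). A sketch of how such a dispatcher could look (the table data is from these docs; the function and stand-in class are illustrative, not the plugin's internals):

```python
# Map SDK span-data type names to (traceloop span kind, respan log type),
# mirroring the "Captured span types" table.
SPAN_TYPE_MAP = {
    "AgentSpanData":      ("agent", "agent"),
    "ResponseSpanData":   ("task",  "response"),
    "GenerationSpanData": ("task",  "generation"),
    "FunctionSpanData":   ("tool",  "tool"),
    "HandoffSpanData":    ("task",  "handoff"),
    "GuardrailSpanData":  ("task",  "guardrail"),
    "CustomSpanData":     ("task",  "custom"),
}


def classify(span_data) -> tuple[str, str]:
    """Return (span_kind, log_type) for an SDK span-data object,
    falling back to a custom task span for unknown types."""
    return SPAN_TYPE_MAP.get(type(span_data).__name__, ("task", "custom"))


class FunctionSpanData:
    """Stand-in for the SDK class of the same name, for illustration."""
```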

Span attributes

All spans include these common attributes:
Attribute                Description
traceloop.span.kind      Span kind: "workflow", "agent", "task", "tool".
traceloop.entity.name    Human-readable span name.
traceloop.entity.path    Entity path for parent/child relationship tracking.
respan.entity.log_type   Respan log type classification (from respan-sdk constants).
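A minimal sketch of assembling these common attributes for one span (the attribute keys come from the table above; the helper name and the dot separator for the entity path are assumptions, since the plugin's real path logic is internal):

```python
def common_attributes(kind: str, name: str, parents: list[str], log_type: str) -> dict:
    """Build the common traceloop.*/respan.* attribute dict for a span.

    `parents` is the chain of ancestor entity names, root first; the dot
    separator used for traceloop.entity.path is an illustrative assumption.
    """
    return {
        "traceloop.span.kind": kind,
        "traceloop.entity.name": name,
        "traceloop.entity.path": ".".join([*parents, name]),
        "respan.entity.log_type": log_type,
    }
```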
LLM spans (ResponseSpanData, GenerationSpanData) additionally include:
Attribute                        Description
llm.request.type                 Always "chat" — triggers backend prompt/completion parsing.
gen_ai.system                    "openai" (for ResponseSpanData).
gen_ai.request.model             Model name (e.g. "gpt-4o").
gen_ai.usage.prompt_tokens       Input token count.
gen_ai.usage.completion_tokens   Output token count.
traceloop.entity.input           JSON-serialized input messages.
traceloop.entity.output          JSON-serialized output.
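Putting the LLM-span attributes together, a sketch of the resulting attribute dict (keys are from the table above; the helper name and exact serialization are illustrative, not the plugin's internals):

```python
import json


def llm_attributes(model: str, prompt_tokens: int, completion_tokens: int,
                   messages: list[dict], output: str) -> dict:
    """Assemble the LLM-span attributes listed in the docs table.

    JSON serialization of input/output here is a simplification of
    whatever the plugin's private helpers actually do.
    """
    return {
        "llm.request.type": "chat",
        "gen_ai.system": "openai",
        "gen_ai.request.model": model,
        "gen_ai.usage.prompt_tokens": prompt_tokens,
        "gen_ai.usage.completion_tokens": completion_tokens,
        "traceloop.entity.input": json.dumps(messages),
        "traceloop.entity.output": json.dumps(output),
    }
```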

With propagated attributes

Use propagate_attributes() to attach per-request metadata to all spans:
from respan import Respan, propagate_attributes
from respan_instrumentation_openai_agents import OpenAIAgentsInstrumentor
from agents import Agent, Runner

respan = Respan(
    api_key="your-api-key",
    instrumentations=[OpenAIAgentsInstrumentor()],
)

agent = Agent(name="assistant", instructions="You are helpful.")

async def handle_user_request(user_id: str, message: str):
    with propagate_attributes(
        customer_identifier=user_id,
        thread_identifier="conv_123",
        metadata={"source": "web"},
    ):
        result = await Runner.run(agent, message)
        return result.final_output

With decorators

Combine plugin instrumentation with decorators for additional context:
from respan import Respan, workflow, task
from respan_instrumentation_openai_agents import OpenAIAgentsInstrumentor
from agents import Agent, Runner

respan = Respan(
    api_key="your-api-key",
    instrumentations=[OpenAIAgentsInstrumentor()],
)

agent = Agent(name="researcher", instructions="Research the given topic.")

@task(name="research")
async def research(topic: str):
    result = await Runner.run(agent, f"Research: {topic}")
    return result.final_output

@workflow(name="research_pipeline")
async def pipeline(topics: list[str]):
    results = []
    for topic in topics:
        results.append(await research(topic))
    return results

Architecture

OpenAI Agents SDK

  ├─ Trace (on_trace_end)    ──→  emit_trace()     ──→  build_readable_span() ──→  inject_span()
  ├─ AgentSpan (on_span_end) ──→  emit_agent()     ──→  build_readable_span() ──→  inject_span()
  ├─ ResponseSpan            ──→  emit_response()  ──→  build_readable_span() ──→  inject_span()
  ├─ FunctionSpan            ──→  emit_function()  ──→  build_readable_span() ──→  inject_span()
  └─ ...                     ──→  ...              ──→  ...                   ──→  ...
                                                                                       │
                                                                                       ▼
                                                                             OTEL Processor Chain
                                                                                       │
                                                                                       ▼
                                                                             RespanSpanExporter
                                                                                       │
                                                                                       ▼
                                                                               POST /v2/traces

Internal modules

These are implementation details, not part of the public API:
Module                Description
_instrumentation.py   OpenAIAgentsInstrumentor class and _RespanTracingProcessor (the TracingProcessor implementation).
_otel_emitter.py      Per-type emitter functions (emit_trace, emit_agent, emit_response, etc.) and the emit_sdk_item() dispatcher.
_utils.py             Serialization helpers: _serialize(), _format_input_messages(), _format_output(), _parse_ts(), _responses_api_item_to_message().
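For a feel of what a helper like _serialize() has to handle, here is a sketch of a JSON-with-fallback serializer (an illustrative stand-in; the actual private implementation may differ):

```python
import json
from typing import Any


def serialize(value: Any) -> str:
    """JSON-serialize a value, falling back to str() for objects that
    json cannot encode (illustrative stand-in for _serialize())."""
    try:
        return json.dumps(value, default=str)
    except (TypeError, ValueError):
        return str(value)
```

Span inputs and outputs often contain SDK objects, datetimes, and other non-JSON types, which is why a plain json.dumps() call is not enough.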