OpenTelemetry

Send traces from any OpenTelemetry-instrumented application to Respan.

OpenTelemetry (OTel) is the CNCF standard for distributed tracing. If your application already uses OTel, you can send traces directly to Respan without installing any Respan SDK — just point your OTLP exporter at Respan’s endpoint.

Respan accepts traces via the standard OTLP/HTTP protocol in both JSON and Protobuf formats at /api/v2/traces.
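Because the endpoint speaks standard OTLP/JSON, the wire format can be sketched with nothing but the standard library. The payload below is a minimal illustration (the span IDs, timestamps, and service name are made up; field names follow the OTLP JSON encoding of the trace protobuf):

```python
import json

# Minimal OTLP/JSON trace payload: one resource, one scope, one span.
# Trace/span IDs are lowercase hex strings; timestamps are Unix
# nanoseconds encoded as strings.
payload = {
    "resourceSpans": [{
        "resource": {"attributes": [
            {"key": "service.name", "value": {"stringValue": "my-app"}},
        ]},
        "scopeSpans": [{
            "scope": {"name": "manual"},
            "spans": [{
                "traceId": "5b8aa5a2d2c872e8321cf37308d69df2",
                "spanId": "051581bf3cb55c13",
                "name": "my-llm-call",
                "kind": 1,
                "startTimeUnixNano": "1700000000000000000",
                "endTimeUnixNano": "1700000001000000000",
                "attributes": [
                    {"key": "gen_ai.request.model",
                     "value": {"stringValue": "gpt-4o"}},
                ],
            }],
        }],
    }],
}

body = json.dumps(payload)
# POST `body` to https://api.respan.ai/api/v2/traces with
# Content-Type: application/json and your Authorization header.
```

In practice you rarely build this by hand; the exporters below produce it for you.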

Why use OpenTelemetry with Respan?

  • Zero vendor lock-in — Use standard OTel instrumentation, switch backends anytime
  • Works with existing instrumentation — Any OTel-compatible library (Traceloop, OpenInference, OpenLIT, custom) sends to Respan
  • Full semantic convention support — GenAI, Traceloop, and OpenInference attributes are automatically mapped to Respan fields
  • Combine sources — Mix OTel traces with Respan SDK traces in the same dashboard

Quickstart

Option 1: Environment variables

The simplest setup — works with any OTel SDK or instrumentation library.

$ export OTEL_EXPORTER_OTLP_ENDPOINT="https://api.respan.ai/api"
$ export OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer YOUR_RESPAN_API_KEY"
$ export OTEL_EXPORTER_OTLP_PROTOCOL="http/json"

Then run your application as usual. Any OTel-instrumented code exports traces to Respan automatically.
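The OTEL_EXPORTER_OTLP_HEADERS value is a comma-separated list of key=value pairs that the SDK splits into individual HTTP headers. A quick stdlib sketch of how that string decodes (real SDKs also URL-decode values, which leaves this plain value unchanged):

```python
import os

# The same value set by the export command above.
os.environ["OTEL_EXPORTER_OTLP_HEADERS"] = (
    "Authorization=Bearer YOUR_RESPAN_API_KEY"
)

# Split on commas, then on the first '=' of each pair,
# yielding one HTTP header per entry.
raw = os.environ["OTEL_EXPORTER_OTLP_HEADERS"]
headers = dict(pair.split("=", 1) for pair in raw.split(","))
```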

Option 2: Direct SDK configuration

$ pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

exporter = OTLPSpanExporter(
    endpoint="https://api.respan.ai/api/v2/traces",
    headers={"Authorization": "Bearer YOUR_RESPAN_API_KEY"},
)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)

with tracer.start_as_current_span("my-llm-call") as span:
    span.set_attribute("gen_ai.request.model", "gpt-4o")
    span.set_attribute("gen_ai.usage.prompt_tokens", 150)
    span.set_attribute("gen_ai.usage.completion_tokens", 50)
    # your LLM call here

Option 3: OpenTelemetry Collector

Route traces through an OTel Collector to Respan. Useful when you want to fan out to multiple backends or apply processing.

# otel-collector-config.yaml
exporters:
  otlphttp/respan:
    endpoint: "https://api.respan.ai/api"
    headers:
      Authorization: "Bearer YOUR_RESPAN_API_KEY"

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlphttp/respan]
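Fanning out to multiple backends is a matter of listing a second exporter in the same pipeline. A sketch, where otlphttp/other is a hypothetical second backend (name and endpoint are placeholders):

```yaml
exporters:
  otlphttp/respan:
    endpoint: "https://api.respan.ai/api"
    headers:
      Authorization: "Bearer YOUR_RESPAN_API_KEY"
  otlphttp/other:                      # hypothetical second backend
    endpoint: "https://collector.example.com"

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlphttp/respan, otlphttp/other]
```

Each exporter in the list receives a copy of every span that passes through the pipeline.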

Supported semantic conventions

Respan automatically maps attributes from three major ecosystems into queryable fields. You can mix and match — all are supported simultaneously.

GenAI semantic conventions

The OTel GenAI semantic conventions are the emerging standard for LLM observability.

Attribute                              Respan field
gen_ai.request.model                   Model
gen_ai.response.model                  Model (response)
gen_ai.usage.prompt_tokens             Prompt tokens
gen_ai.usage.completion_tokens         Completion tokens
gen_ai.request.temperature             Temperature
gen_ai.request.max_tokens              Max tokens
gen_ai.system                          Provider
llm.request.type                       Log type (chat, completion, embedding)
llm.is_streaming                       Streaming flag
llm.usage.total_tokens                 Total tokens
llm.usage.reasoning_tokens             Reasoning tokens
gen_ai.usage.cache_read_input_tokens   Cached tokens

Traceloop (OpenLLMetry) conventions

Used by Traceloop and the Respan tracing SDK. These provide workflow/task structure.

Attribute                 Respan field
traceloop.span.kind       Span type (workflow, task, tool, agent)
traceloop.entity.path     Hierarchical span path
traceloop.entity.input    Span input
traceloop.entity.output   Span output
traceloop.workflow.name   Workflow name

OpenInference conventions

Used by OpenInference instrumentors (e.g., Arize Phoenix). Respan automatically enriches these into their Traceloop/GenAI equivalents.

OpenInference attribute     Mapped to
openinference.span.kind     traceloop.span.kind
llm.model_name              gen_ai.request.model
llm.token_count.prompt      gen_ai.usage.prompt_tokens
llm.token_count.completion  gen_ai.usage.completion_tokens
input.value                 traceloop.entity.input
output.value                traceloop.entity.output

This means any OpenInference instrumentor works with Respan out of the box — install the instrumentor, point OTLP at Respan, and traces appear with full model/token/input/output data.
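Conceptually, the enrichment amounts to an attribute rename. The sketch below is an illustrative model of that behavior using the table's mappings, not Respan's actual implementation:

```python
# Rename map taken from the mapping table above. Illustrative only.
OPENINFERENCE_TO_CANONICAL = {
    "openinference.span.kind": "traceloop.span.kind",
    "llm.model_name": "gen_ai.request.model",
    "llm.token_count.prompt": "gen_ai.usage.prompt_tokens",
    "llm.token_count.completion": "gen_ai.usage.completion_tokens",
    "input.value": "traceloop.entity.input",
    "output.value": "traceloop.entity.output",
}

def enrich(attributes: dict) -> dict:
    """Return a copy with OpenInference keys mapped to canonical names."""
    return {
        OPENINFERENCE_TO_CANONICAL.get(key, key): value
        for key, value in attributes.items()
    }

span_attrs = enrich({
    "llm.model_name": "gpt-4o",
    "llm.token_count.prompt": 150,
    "custom.tag": "checkout",   # unmapped keys pass through untouched
})
```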

Respan-specific attributes

For user tracking, grouping, and metadata:

Attribute                                    Description
respan.customer_params.customer_identifier   Customer/user identifier
respan.threads.thread_identifier             Thread/conversation ID
respan.trace.trace_group_identifier          Group related traces
respan.metadata                              JSON merged into span metadata
respan.environment                           Environment tag (prod, staging)
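For example, you might attach these attributes to any OTel span via span.set_attribute. The identifier values below are made up, and respan.metadata is serialized to a JSON string because OTel attribute values must be scalars or flat arrays:

```python
import json

# Hypothetical values for illustration. With an OTel span in hand,
# call span.set_attribute(key, value) for each entry.
respan_attributes = {
    "respan.customer_params.customer_identifier": "user-123",
    "respan.threads.thread_identifier": "thread-42",
    "respan.trace.trace_group_identifier": "onboarding-flow",
    "respan.environment": "prod",
    # Arbitrary JSON, passed as a string.
    "respan.metadata": json.dumps({"plan": "pro", "region": "eu"}),
}
```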

All other attributes are preserved in the span’s metadata and are queryable in the Respan dashboard.

Using with third-party instrumentors

Any OTel-compatible instrumentation library works with Respan. Just set the OTLP exporter environment variables and install the instrumentor.

Traceloop (OpenLLMetry)

$ pip install opentelemetry-instrumentation-openai

from opentelemetry.instrumentation.openai import OpenAIInstrumentor

OpenAIInstrumentor().instrument()
# All OpenAI calls are now traced and sent to Respan

OpenInference

$ pip install openinference-instrumentation-openai

from openinference.instrumentation.openai import OpenAIInstrumentor

OpenAIInstrumentor().instrument()
# OpenInference spans are auto-enriched by Respan

OpenInference span processors

Convert traces from other formats into OpenInference-compatible spans:

$ pip install openinference-instrumentation-openllmetry   # convert Traceloop traces
$ pip install openinference-instrumentation-openlit       # convert OpenLIT traces

Configuration reference

Variable                      Value
OTEL_EXPORTER_OTLP_ENDPOINT   https://api.respan.ai/api
OTEL_EXPORTER_OTLP_HEADERS    Authorization=Bearer YOUR_RESPAN_API_KEY
OTEL_EXPORTER_OTLP_PROTOCOL   http/json or http/protobuf

Direct endpoint: https://api.respan.ai/api/v2/traces (use this when configuring an exporter with an explicit URL, bypassing the base endpoint's path auto-discovery).