Langfuse

Langfuse is an open-source LLM observability platform. The Respan instrumentor patches Langfuse’s OTLP exporter to redirect traces to Respan, so your existing @observe decorators and langfuse.trace() calls continue to work unchanged. You also get gateway routing through the OpenAI-compatible Respan endpoint.

Create an account at platform.respan.ai and grab an API key. To use the gateway, also add credits or a provider key.

Run `npx @respan/cli setup` to set up Respan with your coding agent.

Setup

1. Install packages

```shell
pip install respan-ai respan-instrumentation-langfuse langfuse openai
```
2. Set environment variables

```shell
export OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
export RESPAN_API_KEY="YOUR_RESPAN_API_KEY"
export LANGFUSE_PUBLIC_KEY="YOUR_LANGFUSE_PUBLIC_KEY"
export LANGFUSE_SECRET_KEY="YOUR_LANGFUSE_SECRET_KEY"
```

OPENAI_API_KEY is used for LLM requests. RESPAN_API_KEY is used to export traces to Respan. Langfuse keys are still required for the SDK to initialize.
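As a quick sanity check before initializing, you can verify that all four variables are set. This is a minimal stdlib sketch; only the variable names come from the step above:

```python
import os

# Required by the steps above: one key for LLM requests, one for trace
# export to Respan, and the Langfuse pair for SDK initialization.
REQUIRED = [
    "OPENAI_API_KEY",
    "RESPAN_API_KEY",
    "LANGFUSE_PUBLIC_KEY",
    "LANGFUSE_SECRET_KEY",
]

def missing_env_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]
```

Calling `missing_env_vars()` at startup and failing fast on a non-empty result is cheaper than debugging a silent trace-export failure later.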

3. Initialize and run

LangfuseInstrumentor must be activated before importing langfuse. The instrumentor patches the OTLP exporter at import time.

```python
from respan import Respan
from respan_instrumentation_langfuse import LangfuseInstrumentor

# Activate BEFORE importing langfuse
respan = Respan(instrumentations=[LangfuseInstrumentor()])

from langfuse.decorators import observe
from openai import OpenAI

client = OpenAI()

@observe()
def generate_joke():
    response = client.chat.completions.create(
        model="gpt-4.1-nano",
        messages=[{"role": "user", "content": "Tell me a joke about AI"}],
    )
    return response.choices[0].message.content

@observe()
def joke_pipeline():
    return generate_joke()

print(joke_pipeline())
respan.flush()
```
4. View your trace

Open the Traces page to see your Langfuse traces in Respan.

Configuration

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `api_key` | `str \| None` | `None` | Falls back to the `RESPAN_API_KEY` env var. |
| `base_url` | `str \| None` | `None` | Falls back to the `RESPAN_BASE_URL` env var. |
| `instrumentations` | `list` | `[]` | Plugin instrumentations to activate (e.g. `LangfuseInstrumentor()`). |
| `customer_identifier` | `str \| None` | `None` | Default customer identifier for all spans. |
| `metadata` | `dict \| None` | `None` | Default metadata attached to all spans. |
| `environment` | `str \| None` | `None` | Environment tag (e.g. `"production"`). |
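Taken together, a constructor call using these parameters might look like the following configuration sketch (the values are placeholders; the parameter names come from the table above):

```python
from respan import Respan
from respan_instrumentation_langfuse import LangfuseInstrumentor

respan = Respan(
    api_key="YOUR_RESPAN_API_KEY",  # or omit and set RESPAN_API_KEY
    instrumentations=[LangfuseInstrumentor()],
    environment="production",
    metadata={"service": "chat-api"},
)
```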

Attributes

In Respan()

Set defaults at initialization; these apply to all spans.

```python
from respan import Respan
from respan_instrumentation_langfuse import LangfuseInstrumentor

respan = Respan(
    instrumentations=[LangfuseInstrumentor()],
    customer_identifier="user_123",
    metadata={"service": "chat-api", "version": "1.0.0"},
)
```

With propagate_attributes

Override per-request using a context scope.

```python
from respan import Respan, propagate_attributes
from respan_instrumentation_langfuse import LangfuseInstrumentor

respan = Respan(instrumentations=[LangfuseInstrumentor()])

from langfuse.decorators import observe

@observe()
def handle_request(user_id: str, question: str):
    with propagate_attributes(
        customer_identifier=user_id,
        thread_identifier="conv_abc_123",
        metadata={"plan": "pro"},
    ):
        # your Langfuse-traced code here
        ...
```
| Attribute | Type | Description |
| --- | --- | --- |
| `customer_identifier` | `str` | Identifies the end user in Respan analytics. |
| `thread_identifier` | `str` | Groups related messages into a conversation. |
| `metadata` | `dict` | Custom key-value pairs. Merged with default metadata. |
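The merge in the last row can be read as a shallow dictionary merge in which per-request keys win. This is a sketch of the assumed semantics, not Respan's actual implementation:

```python
def merged_metadata(defaults, overrides):
    # Assumed semantics: shallow merge; per-request keys override defaults.
    return {**(defaults or {}), **(overrides or {})}

# Defaults from Respan(...) combined with propagate_attributes(metadata=...):
combined = merged_metadata(
    {"service": "chat-api", "version": "1.0.0"},
    {"plan": "pro"},
)
# combined == {"service": "chat-api", "version": "1.0.0", "plan": "pro"}
```

Under this reading, setting the same key in both places keeps the per-request value for spans inside the `propagate_attributes` scope.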