Langflow

Trace Langflow visual AI workflows with Respan.
  1. Sign up — Create an account at platform.respan.ai
  2. Create an API key — Generate one on the API keys page
  3. Add credits or a provider key — Add credits on the Credits page or connect your own provider key on the Integrations page

Add the Docs MCP to your AI coding tool to get help building with Respan. No API key needed.

{
  "mcpServers": {
    "respan-docs": {
      "url": "https://docs.respan.ai/mcp"
    }
  }
}

What is Langflow?

Langflow is a visual framework by DataStax for building multi-agent and RAG applications. It provides a drag-and-drop interface for composing LLM pipelines with components for models, prompts, tools, and data sources.
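Flows built in the drag-and-drop editor can be exported as JSON and run headlessly. The sketch below shows only the rough graph shape — a set of typed nodes connected by edges — and is heavily simplified; real `flow.json` exports carry many more fields (canvas positions, component templates, parameter values), so treat the exact keys here as illustrative:

```python
import json

# Heavily simplified sketch of a Langflow export: a graph of typed
# nodes joined by edges. The node IDs and type names are placeholders.
flow_sketch = {
    "data": {
        "nodes": [
            {"id": "prompt-1", "data": {"type": "Prompt"}},
            {"id": "model-1", "data": {"type": "OpenAIModel"}},
        ],
        "edges": [
            {"source": "prompt-1", "target": "model-1"},
        ],
    },
}

# Exports round-trip through JSON, which is what run_flow_from_json consumes.
print(json.dumps(flow_sketch["data"]["edges"]))
```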

Setup

1. Install packages

Langflow is built on LangChain, so tracing uses the LangChain instrumentation.

pip install respan-ai opentelemetry-instrumentation-langchain langflow
2. Set environment variables

export RESPAN_API_KEY="YOUR_RESPAN_API_KEY"
export OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
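Before initializing tracing, it can help to fail fast if either key is missing — a minimal standard-library sketch (the variable names match the exports above; the helper function is our own, not part of the Respan SDK):

```python
import os

REQUIRED_VARS = ["RESPAN_API_KEY", "OPENAI_API_KEY"]

def missing_env_vars(required=REQUIRED_VARS):
    """Return the names of any required variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]

if __name__ == "__main__":
    missing = missing_env_vars()
    if missing:
        raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
```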
3. Initialize and run

from dotenv import load_dotenv

load_dotenv()

from respan import Respan
from respan_instrumentation_openinference import OpenInferenceInstrumentor
from opentelemetry.instrumentation.langchain import LangchainInstrumentor

# Initialize Respan with LangChain instrumentation for Langflow
respan = Respan(
    instrumentations=[
        OpenInferenceInstrumentor(instrumentor=LangchainInstrumentor())
    ]
)

# Run Langflow with tracing enabled
# Option 1: Use the Langflow CLI with tracing active
# langflow run

# Option 2: Use the Langflow API programmatically
from langflow.load import run_flow_from_json

result = run_flow_from_json(
    flow="path/to/your/flow.json",
    input_value="What is the meaning of life?",
)
print(result)

respan.flush()
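If your flow runs inside a `langflow run` server rather than in-process, you can also trigger it over Langflow's HTTP run endpoint. A hedged sketch follows — the flow ID, port, and payload fields are placeholders based on common Langflow defaults, so check your own Langflow instance for the exact values:

```python
import json
import urllib.request

# Hypothetical values — replace with the flow ID and host from your Langflow UI.
FLOW_ID = "my-flow-id"
LANGFLOW_URL = f"http://127.0.0.1:7860/api/v1/run/{FLOW_ID}"

def build_payload(message: str) -> dict:
    """Build a request body for Langflow's run endpoint (assumed shape)."""
    return {
        "input_value": message,
        "input_type": "chat",
        "output_type": "chat",
    }

def run_flow(message: str) -> dict:
    """POST the message to the running Langflow server and return its JSON reply."""
    req = urllib.request.Request(
        LANGFLOW_URL,
        data=json.dumps(build_payload(message)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Because the server process holds the tracing instrumentation in this setup, initialize Respan in the same process that serves the flow, not in the client that calls it.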
4. View your trace

Open the Traces page to see your Langflow workflow with component-level operations, LLM calls, and pipeline execution.

What gets traced

All Langflow operations are auto-instrumented via the LangChain layer:

  • Visual workflow execution
  • Component-level operations
  • LLM calls with model, tokens, and input/output
  • Vector store queries
  • Tool and API calls

Traces appear in the Traces dashboard.

Learn more