AutoGen

AutoGen is Microsoft’s framework for building multi-agent conversational AI systems. It enables the creation of agents that can converse with each other to solve tasks, with support for human-in-the-loop patterns. Respan gives you full observability over every conversation, agent message, and LLM call — and gateway routing through the OpenAI-compatible Respan endpoint.

Create an account at platform.respan.ai and grab an API key. To use the gateway, also add credits or a provider key.

Run `npx @respan/cli setup` to configure Respan with your coding agent.

Setup

1. Install packages

```shell
pip install respan-ai openinference-instrumentation-autogen-agentchat pyautogen
```
2. Set environment variables

```shell
export OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
export RESPAN_API_KEY="YOUR_RESPAN_API_KEY"
```

`OPENAI_API_KEY` is used for LLM requests. `RESPAN_API_KEY` is used to export traces to Respan.

3. Initialize and run

```python
import os

import autogen
from openinference.instrumentation.autogen_agentchat import AutoGenInstrumentor
from respan import Respan

# Initialize Respan with the AutoGen instrumentation before creating agents,
# so every agent message and LLM call is captured.
respan = Respan(instrumentations=[AutoGenInstrumentor()])

config_list = [{"model": "gpt-4.1-nano", "api_key": os.environ["OPENAI_API_KEY"]}]

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
    system_message="You are a helpful AI assistant.",
)

user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=3,
    code_execution_config={"work_dir": "coding", "use_docker": False},
)

user_proxy.initiate_chat(
    assistant,
    message="Write a Python function to calculate the Fibonacci sequence.",
)

# Ensure buffered spans are exported before the process exits.
respan.flush()
```
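The same instrumentation also covers conversations with more than two agents. As a sketch using pyautogen's classic `GroupChat` API (the agent names, prompts, and round limit here are illustrative, not part of Respan's documented setup), a traced multi-agent chat could look like:

```python
import os

import autogen

config_list = [{"model": "gpt-4.1-nano", "api_key": os.environ["OPENAI_API_KEY"]}]
llm_config = {"config_list": config_list}

# Two specialist agents plus a coordinator; with the Respan instrumentation
# active, every inter-agent message lands in the same trace.
writer = autogen.AssistantAgent(
    name="writer",
    llm_config=llm_config,
    system_message="You draft Python code.",
)
reviewer = autogen.AssistantAgent(
    name="reviewer",
    llm_config=llm_config,
    system_message="You review code for bugs.",
)
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config=False,
)

group_chat = autogen.GroupChat(
    agents=[user_proxy, writer, reviewer], messages=[], max_round=6
)
manager = autogen.GroupChatManager(groupchat=group_chat, llm_config=llm_config)

user_proxy.initiate_chat(manager, message="Write and review a fizzbuzz function.")
```

Remember to call `respan.flush()` afterwards, as in the main example, so the full conversation is exported.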
4. View your trace

Open the Traces page to see your AutoGen conversation with agent messages, LLM calls, and code execution.

Configuration

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `api_key` | `str \| None` | `None` | Falls back to the `RESPAN_API_KEY` env var. |
| `base_url` | `str \| None` | `None` | Falls back to the `RESPAN_BASE_URL` env var. |
| `instrumentations` | `list` | `[]` | Plugin instrumentations to activate (e.g. `AutoGenInstrumentor()`). |
| `customer_identifier` | `str \| None` | `None` | Default customer identifier for all spans. |
| `metadata` | `dict \| None` | `None` | Default metadata attached to all spans. |
| `environment` | `str \| None` | `None` | Environment tag (e.g. `"production"`). |
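Putting several of these parameters together, a fully explicit initialization might look like the sketch below. The `base_url` value is a placeholder (the real endpoint is not stated here); omit both `api_key` and `base_url` to fall back to the environment variables.

```python
from openinference.instrumentation.autogen_agentchat import AutoGenInstrumentor
from respan import Respan

respan = Respan(
    api_key="YOUR_RESPAN_API_KEY",            # or rely on RESPAN_API_KEY
    base_url="https://your-respan-endpoint",  # placeholder; or RESPAN_BASE_URL
    instrumentations=[AutoGenInstrumentor()],
    environment="production",
)
```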

Attributes

In Respan()

Set defaults at initialization — these apply to all spans.

```python
from openinference.instrumentation.autogen_agentchat import AutoGenInstrumentor
from respan import Respan

respan = Respan(
    instrumentations=[AutoGenInstrumentor()],
    customer_identifier="user_123",
    metadata={"service": "autogen-api", "version": "1.0.0"},
)
```

With propagate_attributes

Override per-request using a context scope.

```python
from openinference.instrumentation.autogen_agentchat import AutoGenInstrumentor
from respan import Respan, propagate_attributes

respan = Respan(instrumentations=[AutoGenInstrumentor()])

def handle_request(user_id: str, message: str):
    with propagate_attributes(
        customer_identifier=user_id,
        thread_identifier="conv_abc_123",
        metadata={"plan": "pro"},
    ):
        user_proxy.initiate_chat(assistant, message=message)
```

| Attribute | Type | Description |
| --- | --- | --- |
| `customer_identifier` | `str` | Identifies the end user in Respan analytics. |
| `thread_identifier` | `str` | Groups related messages into a conversation. |
| `metadata` | `dict` | Custom key-value pairs. Merged with default metadata. |
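The merge behavior in the last row can be illustrated with plain dicts. This is a sketch of the documented semantics only, not Respan internals, and it assumes per-request keys win on conflict (the table above does not specify precedence):

```python
# Defaults set in Respan(...) combined with per-request metadata
# from propagate_attributes(...).
default_metadata = {"service": "autogen-api", "version": "1.0.0"}
request_metadata = {"plan": "pro", "version": "2.0.0"}

# Assumption: per-request values override defaults on key collision.
merged = {**default_metadata, **request_metadata}
print(merged)
# {'service': 'autogen-api', 'version': '2.0.0', 'plan': 'pro'}
```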