
See what your AI agent
actually did.

GlassPipe is the flight recorder for your AI agents. Every LLM call, every tool, every wasted token — captured and turned into a beautiful, shareable timeline.

$ pip install glasspipe
Try the live demo → GitHub

Free · Open source · No account ever

Your agent is a black box.
Until now.

Before GlassPipe, debugging a broken agent meant reading logs that look like this.

× Without GlassPipe
[2026-04-27 14:32:08] Starting agent...
Calling OpenAI API...
Got response. Processing...
Calling tool: search
Calling OpenAI API...
Got response. Processing...
Calling OpenAI API...
Got response. Processing...
Calling OpenAI API...
Response took 12.4s.
Calling OpenAI API...
Calling OpenAI API...
[ERROR] Rate limit exceeded
Retrying...
[2026-04-27 14:32:55] Done. Cost: $3.21

...why did this cost $3.21?
VS
With GlassPipe
A full visual timeline instantly reveals:
DETECTED summarize_article called 3× with identical input. Wasted: $0.42
DETECTED verify_report looping 6× with no exit condition. Wasted: $1.48
RESULT Fixed in 15 minutes. New cost per run: $0.31 (−90%)

Your agent went from black box to glass pipe.
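Fixes like the two above are straightforward once the waste is visible. Here's a minimal sketch of both patterns — caching so identical inputs hit the LLM only once, and a hard iteration cap so a verify loop can't run away. The function names mirror the example above; the bodies are stand-ins, not GlassPipe APIs:

```python
from functools import lru_cache

CALLS = {"summarize": 0, "verify": 0}  # call counters, for illustration

@lru_cache(maxsize=None)               # identical input -> one real LLM call
def summarize_article(text: str) -> str:
    CALLS["summarize"] += 1
    return f"summary of {text[:20]}"   # stand-in for the real LLM call

def verify_report(report: str, max_rounds: int = 3) -> str:
    # hard exit condition: never loop more than max_rounds times
    for _ in range(max_rounds):
        CALLS["verify"] += 1
        if "ok" in report:             # stand-in verification check
            break
        report += " ok"
    return report

for _ in range(3):                     # agent retries with identical input
    summarize_article("same article text")
print(CALLS["summarize"])  # 1 — two duplicate LLM calls avoided
```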

Try it now —
no install required.

Click any span bar below to inspect it. This is exactly what you see in your own dashboard.


Three steps.
Sixty seconds.

No accounts. No configuration. No environment variables. Just one line.

1
Install
One command. No accounts, no API keys, no configuration.
$ pip install glasspipe
2
Decorate
Add one line above your agent function; the function body itself stays untouched.
from glasspipe import trace

@trace
def my_agent(q):
  # your code, untouched
  return answer
3
See & Share
Open the dashboard. Explore the timeline. Share a link in one click.
$ glasspipe dashboard
localhost:3000

# share: glasspipe.dev/t/a1f9c2
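Curious how a decorator can capture all that without touching your function? Here's a simplified stand-in for step 2 — illustrative only, not GlassPipe's actual implementation. It records each call's input, output, and duration into a list that plays the role of the trace store:

```python
import functools
import time

SPANS = []  # stand-in for a trace store

def trace(fn):
    """Record input, output, and duration of each call.
    The wrapped function body itself is untouched."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        SPANS.append({
            "name": fn.__name__,
            "input": {"args": args, "kwargs": kwargs},
            "output": result,
            "duration_s": time.perf_counter() - start,
        })
        return result
    return wrapper

@trace
def my_agent(q):
    return f"answer to {q}"

my_agent("why is this slow?")
print(SPANS[0]["name"])  # my_agent
```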

See a real trace.

Click either card to explore a live, shared agent run.

Right tool,
right context.

GlassPipe isn't for everyone. Here's the honest picture — use what fits you.

What you need → Best choice
Enterprise observability with team workspaces, SSO, SOC2 → Langfuse or LangSmith
Production monitoring with alerting and full async support → Langfuse or Helicone
Open-source self-hosted observability platform → Arize Phoenix or Langfuse
Install in 60s, see a trace, share a debugging link (no account ever) → GlassPipe
Student learning how AI agents work internally → GlassPipe
Share a debugging session like a CodeSandbox link → GlassPipe

Use GlassPipe if...

You want the fastest path from a broken agent to a shareable trace link. You're an indie dev, a student, or anyone who bounces at “create an account.”

Use Langfuse or LangSmith if...

You need enterprise features, production monitoring, team workspaces, alerting, async support, or SOC2 compliance. They're built for that.

We're not trying to replace them. We're built for a different moment.

Start tracing in
sixty seconds.

$ pip install glasspipe
View on GitHub
60-second quickstart
from glasspipe import trace, span

@trace
def research_agent(topic):
  with span("plan", kind="custom") as s:
    s.record(input={"topic": topic},
             output={"plan": "..."})

  # ... your agent code, untouched
  return result

research_agent("AI agent observability")
# then: glasspipe dashboard

v1 supports synchronous Python + OpenAI + Anthropic · async & streaming in v1.5 · MIT license