What it does
The trace package provides opt-in observability for Phero. A tracer receives typed Event values
at each significant lifecycle point: agent start and end, loop iterations, LLM requests and responses, tool calls and
results, and memory retrieval and persistence.
Built-in tracers
- trace.Noop: default zero-cost tracer that discards all events (top-level trace package)
- text.New(w) / text.NewDefault(): writes human-readable, colorized trace lines to any io.Writer; NewDefault() targets os.Stderr
- jsonfile.New(filePath): appends one JSON record per line (NDJSON) to a file; call Close() when done
Import paths: github.com/henomis/phero/trace/text and github.com/henomis/phero/trace/jsonfile.
Attach tracing to an agent
Agents expose SetTracer. Once attached, every Run emits lifecycle events automatically.
package main

import (
	"context"
	"fmt"
	"os"

	"github.com/henomis/phero/agent"
	"github.com/henomis/phero/llm/openai"
	"github.com/henomis/phero/trace/text"
)

func main() {
	client := openai.New(os.Getenv("OPENAI_API_KEY"))
	a, err := agent.New(
		client,
		"math-agent",
		"You are a helpful math assistant.",
	)
	if err != nil {
		panic(err)
	}

	a.SetTracer(text.New(os.Stderr))

	result, err := a.Run(context.Background(), "What is 12 * 9?")
	if err != nil {
		panic(err)
	}
	fmt.Println(result.Content)
}
Trace raw LLM calls
If you want observability around direct llm.LLM.Execute calls, wrap any backend with trace.NewLLM.
tracedClient := trace.NewLLM(client, text.New(os.Stderr))
result, err := tracedClient.Execute(ctx, messages, tools)
When the wrapper runs inside an agent, request and response events are automatically annotated with the agent name and iteration number via context propagation.
JSON file tracer
For machine-readable output, use jsonfile.New to write NDJSON to a file.
Each line is a self-contained JSON record with a type, timestamp, and data field.
import "github.com/henomis/phero/trace/jsonfile"
t, err := jsonfile.New("trace.ndjson")
if err != nil {
panic(err)
}
defer t.Close()
a.SetTracer(t)
The tracer is goroutine-safe and appends to the file if it already exists.
Run summary
After every agent run, an AgentRunSummaryEvent is emitted with a RunSummary
containing aggregated metrics. The same summary is also available on the Result returned by Run().
- RunSummary.Usage — UsageSummary{InputTokens, OutputTokens}
- RunSummary.Latency — LatencySummary{Total, LLM, Tool, Memory}
- RunSummary.Tools — []ToolCallSummary{ToolName, Calls, Errors}
- RunSummary.Iterations, LLMCalls, ToolCalls, ToolErrors
- RunSummary.HandoffAgent — non-empty when the run ended via a handoff
result, err := a.Run(ctx, input)
if err != nil {
	panic(err)
}
if s := result.Summary; s != nil {
	fmt.Printf("iterations=%d llmCalls=%d tokens=%d+%d\n",
		s.Iterations, s.LLMCalls,
		s.Usage.InputTokens, s.Usage.OutputTokens)
	fmt.Printf("total=%s llm=%s tool=%s\n",
		s.Latency.Total, s.Latency.LLM, s.Latency.Tool)
}
Event types
- AgentStartEvent and AgentEndEvent
- AgentIterationEvent
- AgentRunSummaryEvent — emitted once per run with a full RunSummary
- LLMRequestEvent and LLMResponseEvent
- ToolCallEvent and ToolResultEvent
- MemoryRetrieveEvent and MemorySaveEvent
Implement the trace.Tracer interface if you want to send these events somewhere other than the terminal.
Context propagation
The tracer is carried through the request context with trace.WithTracer and read back with trace.FromContext.
This lets tool handlers and lower-level helpers emit their own trace events without depending directly on the agent type.
Try it
The tracing example shows a calculator tool plus a traced agent run from the terminal.
# from repo root
go run ./examples/tracing