Agent

Turn LLMs into reliable teammates: memory in, tools out.

What it does

The agent package provides a minimal chat-loop orchestration layer. An Agent is configured with a name and a system prompt (description). When you call Run, it:

  1. Loads prior messages from the attached memory, if any
  2. Sends the conversation to the LLM
  3. Executes any requested tool calls and feeds the results back to the model
  4. Repeats until the LLM produces a final answer (bounded by the max-iterations guardrail)
  5. Saves the conversation to memory at the end of the call
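A minimal sketch of such a chat loop, in plain Go. Everything here (the `message` and `toolCall` types, the `scriptedLLM` stub, the `run` function) is made up for illustration; it mirrors the shape of the loop, not the package's actual implementation:

```go
package main

import "fmt"

// Illustrative stand-in types; the real llm package has its own
// message and tool-call representations.
type message struct {
	Role    string
	Content string
}

type toolCall struct {
	Name string
	Args string
}

// scriptedLLM emits a tool call on the first turn and a final answer next.
type scriptedLLM struct{ turn int }

func (s *scriptedLLM) chat(msgs []message) (answer string, call *toolCall) {
	s.turn++
	if s.turn == 1 {
		return "", &toolCall{Name: "calculator", Args: `{"operation":"add","a":15,"b":16}`}
	}
	return "You have 31 apples.", nil
}

// run sketches the loop: ask the model, execute any requested tool,
// append the result as a tool-role message, and repeat until a final
// answer arrives or maxIterations is hit.
func run(l *scriptedLLM, tools map[string]func(args string) string, prompt string, maxIterations int) (string, error) {
	msgs := []message{{Role: "user", Content: prompt}}
	for i := 0; i < maxIterations; i++ {
		answer, call := l.chat(msgs)
		if call == nil {
			return answer, nil
		}
		handler, ok := tools[call.Name]
		if !ok {
			return "", fmt.Errorf("unknown tool %q", call.Name)
		}
		msgs = append(msgs, message{Role: "tool", Content: handler(call.Args)})
	}
	return "", fmt.Errorf("exceeded %d iterations", maxIterations)
}

func main() {
	tools := map[string]func(string) string{
		"calculator": func(args string) string { return `{"result":31}` },
	}
	answer, err := run(&scriptedLLM{}, tools, "If I have 15 apples, give away 7, then buy 23 more?", 10)
	if err != nil {
		panic(err)
	}
	fmt.Println(answer)
}
```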

Typical workflow

Most examples follow the same flow:

  1. Create an llm.LLM client for your provider
  2. Wrap plain Go functions as tools with llm.NewTool
  3. Create the agent with agent.New (name + system prompt)
  4. Register tools with AddTool and, optionally, attach memory
  5. Call Run with the user's prompt

Example: minimal agent + tool

This is the essence of the Simple Agent example: define a Go function, wrap it as an LLM tool, add it to the agent, then run.

package main

import (
    "context"
    "fmt"

    "github.com/henomis/phero/agent"
    "github.com/henomis/phero/llm"
)

type CalculatorInput struct {
    Operation string  `json:"operation"`
    A         float64 `json:"a"`
    B         float64 `json:"b"`
}

type CalculatorOutput struct {
    Result float64 `json:"result"`
    Error  string  `json:"error,omitempty"`
}

func calculate(_ context.Context, input *CalculatorInput) (*CalculatorOutput, error) {
    switch input.Operation {
    case "add":
        return &CalculatorOutput{Result: input.A + input.B}, nil
    default:
        return &CalculatorOutput{Error: "unknown operation"}, nil
    }
}

func main() {
    ctx := context.Background()

    // Any llm.LLM works here (OpenAI-compatible, local, etc.)
    var llmClient llm.LLM // ... provider setup elided

    calcTool, err := llm.NewTool(
        "calculator",
        "Performs basic arithmetic operations",
        calculate,
    )
    if err != nil {
        panic(err)
    }

    a, err := agent.New(
        llmClient,
        "Math Assistant",
        "You are a helpful math assistant. Use the calculator tool to perform calculations accurately.",
    )
    if err != nil {
        panic(err)
    }

    if err := a.AddTool(calcTool); err != nil {
        panic(err)
    }

    answer, err := a.Run(ctx, "If I have 15 apples and give away 7, then buy 23 more, how many do I have?")
    if err != nil {
        panic(err)
    }

    fmt.Println(answer)
}

Memory

If you attach a memory.Memory, the agent retrieves prior messages before each run and saves the conversation at the end of the call.

// From examples/conversational-agent (edited for brevity)

conversationMemory := memory.New(20)

a.SetMemory(conversationMemory)

// Guardrail against tool-call loops
a.SetMaxIterations(10)
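memory.New(20) suggests a fixed-size conversation window. A minimal sketch of that idea in plain Go (an illustrative buffer that drops the oldest message once full; the `windowMemory` type is invented here and is not the actual memory package implementation):

```go
package main

import "fmt"

// windowMemory is an illustrative fixed-capacity message store: once
// full, the oldest message is dropped. It mirrors the idea behind a
// bounded conversation memory, not the real memory package.
type windowMemory struct {
	capacity int
	messages []string
}

func newWindowMemory(capacity int) *windowMemory {
	return &windowMemory{capacity: capacity}
}

// Add appends a message and evicts the oldest one when over capacity.
func (m *windowMemory) Add(msg string) {
	m.messages = append(m.messages, msg)
	if len(m.messages) > m.capacity {
		m.messages = m.messages[1:]
	}
}

func main() {
	mem := newWindowMemory(3)
	for _, msg := range []string{"a", "b", "c", "d"} {
		mem.Add(msg)
	}
	fmt.Println(mem.messages) // the oldest message "a" has been evicted
}
```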

Tools

A tool is a Go function wrapped as an *llm.Tool. When the LLM requests a tool call, the agent:

  1. Looks up the tool by name
  2. Calls its handler with the JSON arguments
  3. Appends the tool result as a tool role message

Tool handlers can return any Go value; non-string results are JSON-marshaled before being added to the chat.

Common errors

Run an example

Try the conversational agent example (REPL with memory). Follow the example’s README for provider setup.

# from repo root
go run ./examples/simple-agent

go run ./examples/conversational-agent

Related packages