# Build a Stateful AI Agent

Build a `SupportAgent` from scratch with tools, sessions, streaming, and tests — then deploy it to Cloudflare Workers.
## What you'll learn

- Scaffolding a Roost app and registering the AI service provider
- Writing an `Agent` subclass with `instructions()`
- Adding typed tools via the `HasTools` contract
- Streaming responses to the browser with SSE
- Upgrading to `StatefulAgent` + `@Stateful` for durable conversation history
- Testing with `fake()`, `assertPrompted`, and `preventStrayPrompts`
- Deploying to Cloudflare Workers

Estimated time: ~40 minutes
Prerequisites: Complete the Quick Start guide before starting.
Packages used: `@roostjs/ai`, `@roostjs/schema`, `@roostjs/testing`
## Step 1: Scaffold the project

Create a new Roost application and install its dependencies, then add
`@roostjs/ai` and `@roostjs/schema`:

```sh
roost new support-agent
cd support-agent
bun install
bun add @roostjs/ai @roostjs/schema
```

Register `AiServiceProvider` and declare the AI binding:
```ts
import { AiServiceProvider, Lab } from '@roostjs/ai';

export default {
  providers: [AiServiceProvider],
  ai: {
    binding: 'AI',
    default: [Lab.WorkersAI],
  },
};
```

```jsonc
// wrangler.jsonc
{
  "name": "support-agent",
  "compatibility_date": "2026-04-01",
  "main": "src/worker.ts",
  "ai": { "binding": "AI" }
}
```

Start the dev server (`bun run dev`) once to confirm the
baseline works. You should see Wrangler output confirming the
AI binding is registered.
## Step 2: Write the first prompt

Create the agent class. One method, `instructions()`, defines the system
prompt. The base `Agent` class supplies `prompt()` and the full agentic loop.

```ts
import { Agent } from '@roostjs/ai';

export class SupportAgent extends Agent {
  instructions(): string {
    return 'You are a helpful customer support agent for Acme Inc. Be concise.';
  }
}
```

Wire a simple route that forwards the incoming message to the agent:
```ts
import { SupportAgent } from '../../agents/support-agent';

export async function POST({ request }: { request: Request }) {
  const { message } = await request.json<{ message: string }>();
  const response = await new SupportAgent().prompt(message);
  return Response.json({ text: response.text });
}
```

Try it:

```sh
curl -s -X POST http://localhost:3000/api/chat \
  -H 'Content-Type: application/json' \
  -d '{"message":"How do I reset my password?"}' | jq .text
```

You should see a full response string — no streaming yet, just the final model output.
## Step 3: Add a tool

Tools are classes implementing the `Tool` interface. Return them from
`tools()` on an agent that implements `HasTools`; the runtime passes them to
the model and executes the handler when the model asks for them.

```ts
import type { Tool, ToolRequest } from '@roostjs/ai';
import type { schema } from '@roostjs/schema';

export class LookupTool implements Tool {
  constructor(private customers: Map<string, { tier: string }>) {}

  name() { return 'customer_lookup'; }

  description() { return 'Look up a customer tier by ID.'; }

  schema(s: typeof schema) {
    return { customerId: s.string().description('The customer ID') };
  }

  async handle(req: ToolRequest): Promise<string> {
    const id = req.get<string>('customerId');
    const record = this.customers.get(id);
    return record ? `tier=${record.tier}` : 'not found';
  }
}
```

Add `HasTools` to the agent and return the tool instance:
```ts
import { Agent, type HasTools } from '@roostjs/ai';
import { LookupTool } from '../tools/lookup-tool';

const customers = new Map([
  ['42', { tier: 'pro' }],
  ['101', { tier: 'enterprise' }],
]);

export class SupportAgent extends Agent implements HasTools {
  instructions(): string {
    return [
      'You are a helpful support agent for Acme Inc.',
      'When a user mentions a customer ID, look it up via the customer_lookup tool.',
    ].join(' ');
  }

  tools() {
    return [new LookupTool(customers)];
  }
}
```

Send a prompt that mentions a customer ID:

```sh
curl -s -X POST http://localhost:3000/api/chat \
  -H 'Content-Type: application/json' \
  -d '{"message":"What tier is customer 42?"}' | jq .text
```

The agent invokes `customer_lookup` behind the scenes, gets
back `tier=pro`, and folds the result into the final reply.
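The runtime's tool loop (pass tool schemas to the model, run a handler when the model asks for it, feed the result back) can be pictured as a small dispatch cycle. The sketch below is a conceptual illustration with a stubbed model, not Roost internals; `ModelTurn`, `ToolFn`, and `runLoop` are invented names for this example.

```typescript
// Conceptual sketch of an agentic tool loop. ModelTurn, ToolFn, and
// runLoop are illustrative, not part of @roostjs/ai.
type ModelTurn =
  | { type: 'tool-call'; name: string; args: Record<string, string> }
  | { type: 'text'; text: string };

type ToolFn = (args: Record<string, string>) => Promise<string>;

async function runLoop(
  model: (history: string[]) => ModelTurn,
  tools: Map<string, ToolFn>,
): Promise<string> {
  const history: string[] = [];
  while (true) {
    const turn = model(history);
    // A text turn is the final answer; a tool-call turn executes the
    // handler and feeds the result back for the next model turn.
    if (turn.type === 'text') return turn.text;
    const tool = tools.get(turn.name);
    if (!tool) throw new Error(`unknown tool: ${turn.name}`);
    history.push(`${turn.name} -> ${await tool(turn.args)}`);
  }
}

// Stubbed "model": asks for the lookup once, then answers from the result.
const tools = new Map<string, ToolFn>([
  ['customer_lookup', async ({ customerId }) =>
    customerId === '42' ? 'tier=pro' : 'not found'],
]);

const model = (history: string[]): ModelTurn =>
  history.length === 0
    ? { type: 'tool-call', name: 'customer_lookup', args: { customerId: '42' } }
    : { type: 'text', text: `Customer 42: ${history[0]}` };

runLoop(model, tools).then(console.log); // Customer 42: customer_lookup -> tier=pro
```

A real model decides when to call a tool from the schema and description you return; the stub here hard-codes that decision so the loop itself is visible.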
## Step 4: Stream the response to the browser

`agent.stream(input)` returns a `StreamableAgentResponse` that is both an
async iterable of `StreamEvent` and a `Response`-compatible body. Swap the
JSON route handler for a streaming one:

```ts
import { SupportAgent } from '../../agents/support-agent';

export async function POST({ request }: { request: Request }) {
  const { message } = await request.json<{ message: string }>();
  const stream = new SupportAgent().stream(message);
  return new Response(stream, {
    headers: { 'content-type': 'text/event-stream' },
  });
}
```

On the client, read the SSE stream and append deltas as they arrive:
```ts
async function sendMessage(message: string, append: (text: string) => void) {
  const res = await fetch('/api/chat-stream', {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify({ message }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    // Buffer across reads: a network chunk can end mid-event, so only
    // parse events terminated by the SSE blank-line delimiter and keep
    // the trailing partial event for the next read.
    buffer += decoder.decode(value, { stream: true });
    const events = buffer.split('\n\n');
    buffer = events.pop() ?? '';
    for (const line of events) {
      if (!line.startsWith('data: ')) continue;
      const event = JSON.parse(line.slice(6));
      if (event.type === 'text-delta') append(event.text);
    }
  }
}
```

If you're using React, swap the manual reader for the
`useAgentStream` hook from `@roostjs/ai/client` —
same events, no boilerplate. See the Streaming reference.
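Because SSE frames can be split across network chunks, the parsing step is easier to get right (and to unit-test) as a pure helper that carries leftover bytes between reads. A minimal sketch, assuming the `data:`-prefixed, blank-line-delimited frames the route above emits; `parseSSE` and its return shape are illustrative, not part of `@roostjs/ai`:

```typescript
// Minimal incremental SSE parser; parseSSE is an illustrative helper,
// not part of @roostjs/ai.
function parseSSE(buffer: string, chunk: string): { events: string[]; rest: string } {
  const combined = buffer + chunk;
  const parts = combined.split('\n\n'); // SSE events end with a blank line
  const rest = parts.pop() ?? '';       // trailing part may be incomplete
  const events = parts
    .filter((p) => p.startsWith('data: '))
    .map((p) => p.slice(6));
  return { events, rest };
}

// A frame split across two chunks is held back until it completes:
const first = parseSSE('', 'data: {"type":"text-delta","text":"Hel');
console.log(first.events.length); // 0 (incomplete frame stays in `rest`)

const second = parseSSE(first.rest, 'lo"}\n\ndata: {"type":"done"}\n\n');
console.log(second.events);
// [ '{"type":"text-delta","text":"Hello"}', '{"type":"done"}' ]
```

The same carry-the-remainder pattern is what the client reader relies on; extracting it keeps the network loop free of parsing concerns.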
## Step 5: Persist conversation history with StatefulAgent

Upgrade the stateless agent to a Durable Object-backed one. `StatefulAgent`
(from `@roostjs/ai/stateful`) implements `DurableObject` directly; apply the
`RemembersConversations` mixin to auto-persist message history.

```ts
import { StatefulAgent, RemembersConversations } from '@roostjs/ai/stateful';
import { Stateful } from '@roostjs/ai';
import type { HasTools } from '@roostjs/ai';
import { LookupTool } from '../tools/lookup-tool';

const customers = new Map([
  ['42', { tier: 'pro' }],
  ['101', { tier: 'enterprise' }],
]);

@Stateful({ binding: 'SUPPORT_AGENT' })
export class SupportAgent extends RemembersConversations(StatefulAgent) implements HasTools {
  instructions(): string {
    return [
      'You are a helpful support agent for Acme Inc.',
      'When a user mentions a customer ID, look it up via the customer_lookup tool.',
    ].join(' ');
  }

  tools() {
    return [new LookupTool(customers)];
  }
}
```

Declare the DO binding and export the class from your Worker entry point:
```jsonc
// wrangler.jsonc
{
  "name": "support-agent",
  "compatibility_date": "2026-04-01",
  "main": "src/worker.ts",
  "ai": { "binding": "AI" },
  "durable_objects": {
    "bindings": [
      { "name": "SUPPORT_AGENT", "class_name": "SupportAgent" }
    ]
  },
  "migrations": [
    { "tag": "v1", "new_sqlite_classes": ["SupportAgent"] }
  ]
}
```

```ts
// src/worker.ts
export { SupportAgent } from './agents/support-agent';
// ...existing fetch export
```

Route calls through the DO stub so each conversation has an identity:
```ts
// env (with the Durable Object namespace) is assumed to be provided by
// the route context alongside request.
export async function POST({ request, env }: {
  request: Request;
  env: { SUPPORT_AGENT: DurableObjectNamespace };
}) {
  const { conversationId, message } = await request.json<{
    conversationId: string;
    message: string;
  }>();
  const id = env.SUPPORT_AGENT.idFromName(conversationId);
  const stub = env.SUPPORT_AGENT.get(id);
  const stream = await stub.stream(message);
  return new Response(stream, {
    headers: { 'content-type': 'text/event-stream' },
  });
}
```

Two requests that share a `conversationId` now share memory:
the agent remembers that customer 42 is on the pro tier without having
to re-invoke the lookup tool.
## Step 6: Test the agent

Fakes replace provider mocking. `fake()` attaches canned responses;
`preventStrayPrompts()` throws on any call not matched. `assertPrompted`
verifies the inputs the agent received.
```ts
import { describe, it, expect, afterEach } from 'bun:test';
import { SupportAgent } from '../src/agents/support-agent';

afterEach(() => SupportAgent.restore());

describe('SupportAgent', () => {
  it('answers password-reset questions', async () => {
    SupportAgent
      .fake(['Visit /account/reset to reset your password.'])
      .preventStrayPrompts();

    const response = await new SupportAgent().prompt('How do I reset my password?');

    expect(response.text).toContain('reset');
    SupportAgent.assertPrompted('reset');
  });

  it('routes customer IDs to the lookup flow', async () => {
    SupportAgent.fake([(p) => `Checking customer ${p.input.match(/\d+/)?.[0]}.`]);

    const response = await new SupportAgent().prompt('What tier is customer 42?');

    expect(response.text).toBe('Checking customer 42.');
    SupportAgent.assertPrompted('42');
  });
});
```

Run the tests:

```sh
bun test
```

For DO-backed assertions (scheduled methods, sub-agent spawns, memory tiers), use
`TestStatefulAgentHarness` — same fake semantics, plus a mock DO state and clock.
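The fake semantics can be pictured as a small response queue: each prompt consumes the next canned entry (a string, or a function of the prompt), and an exhausted queue throws when stray-prompt prevention is on. The sketch below is a conceptual model of those semantics, not Roost's implementation; `FakeQueue` and `Canned` are invented names.

```typescript
// Conceptual model of fake()/preventStrayPrompts() semantics.
// FakeQueue and Canned are illustrative, not part of @roostjs/testing.
type Canned = string | ((prompt: { input: string }) => string);

class FakeQueue {
  private strict = false;

  constructor(private responses: Canned[]) {}

  preventStrayPrompts(): this {
    this.strict = true;
    return this;
  }

  prompt(input: string): string {
    const next = this.responses.shift();
    if (next === undefined) {
      // Strict mode surfaces unplanned model calls as test failures.
      if (this.strict) throw new Error(`stray prompt: ${input}`);
      return '';
    }
    return typeof next === 'function' ? next({ input }) : next;
  }
}

const fake = new FakeQueue([
  'Visit /account/reset to reset your password.',
  (p) => `Checking customer ${p.input.match(/\d+/)?.[0]}.`,
]).preventStrayPrompts();

console.log(fake.prompt('How do I reset my password?'));
// Visit /account/reset to reset your password.
console.log(fake.prompt('What tier is customer 42?'));
// Checking customer 42.
// A third prompt would throw: stray prompt: ...
```

Thinking of fakes this way explains why ordering matters in the test file above: each `it` block queues exactly the responses its prompts will consume.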
## Step 7: Deploy

Deploy to Cloudflare Workers. The DO binding and AI binding are both
provisioned from `wrangler.jsonc`; no additional dashboard work is required.

```sh
bunx wrangler deploy
```

See the Deploy to Cloudflare tutorial for CI/CD patterns, secrets management, and preview environments.
## What you built

You now have a stateful AI agent with:

- A `SupportAgent` class extending `StatefulAgent` with durable conversation history via `RemembersConversations`
- A typed `LookupTool` the model invokes automatically when it sees a customer ID in user input
- A streaming SSE endpoint that fans model deltas to the browser
- A passing test suite that uses `fake()` and `preventStrayPrompts()` to run without the AI binding
- A `wrangler.jsonc` wired for Workers AI and a Durable Object binding
## Next steps

- `@roostjs/ai` reference — every primitive with signatures and canonical examples, including sessions, workflows, sub-agents, MCP, HITL, memory tiers, and payments.
- `@roostjs/ai` guides — task-focused how-tos for queueing, scheduling, RAG, and failover.
- `@roostjs/ai` concepts — the opt-in contract pattern, multi-provider strategy, and the Roost-native integration philosophy.