# v1.26.0 — Agentic Workflows and LLM Core Microservice
Released 2026-04-08. GitHub release.
## Agentic Workflows
v1.26.0 introduces a first-class workflow engine for orchestrating multi-step agentic processes across microservices.
### `workflow` Package
The new workflow package provides the primitives for defining and executing workflows:
- `Graph` — A directed graph of tasks and transitions. Tasks are added by URL, and transitions connect them sequentially or conditionally. Conditional transitions use state expressions (e.g. `toolsRequested==true`). Fan-out over arrays is supported via `AddTransitionForEach`. Subgraph invocation enables workflow composition. Graphs self-validate and can render as Mermaid diagrams.
- `Flow` — The state carrier passed to each task. Provides typed getters/setters (`GetString`, `SetInt`, etc.) for reading and writing shared state. Tasks can issue control signals: `Goto` (jump to a task), `Retry` (with exponential backoff), `Sleep` (delay execution), `Interrupt` (pause for external input), and `Subgraph` (delegate to another workflow).
- Reducers — Define how concurrent state modifications from parallel fan-out tasks are merged during fan-in: `replace` (last write wins, the default), `append` (concatenate arrays), `add` (sum numeric values), and `union` (merge arrays with deduplication).
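The semantics of the four reducer policies can be illustrated with a self-contained sketch; `reduce` below is an illustrative helper, not the actual microbus API:

```go
package main

import "fmt"

// reduce merges a parallel branch's value into the accumulated fan-in value
// according to the named reducer policy. The semantics mirror the four
// built-in reducers; the function itself is illustrative.
func reduce(policy string, acc, branch any) any {
	switch policy {
	case "replace": // last write wins (the default)
		return branch
	case "append": // concatenate arrays
		return append(acc.([]any), branch.([]any)...)
	case "add": // sum numeric values
		return acc.(float64) + branch.(float64)
	case "union": // merge arrays, dropping duplicates
		seen := map[any]bool{}
		out := []any{}
		for _, v := range append(acc.([]any), branch.([]any)...) {
			if !seen[v] {
				seen[v] = true
				out = append(out, v)
			}
		}
		return out
	}
	return acc
}

func main() {
	fmt.Println(reduce("add", 1.0, 2.5))                           // 3.5
	fmt.Println(reduce("union", []any{"a", "b"}, []any{"b", "c"})) // [a b c]
}
```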
## Foreman Core Microservice
The `foreman.core` microservice orchestrates workflow execution. It manages a sharded SQL-backed persistence layer and a configurable worker pool for concurrent step processing.
Key endpoints include: `Create`, `Start`/`StartNotify`, `Run`, `Snapshot`, `History`, `Resume`, `Fork`, `Continue`, `Retry`, `Cancel`, `List`, `Await`, `BreakBefore`, and `CreateTask`.
The Foreman emits an `OnFlowStopped` event when a flow reaches a terminal status (completed, failed, cancelled, or interrupted). It tracks metrics for flows started, flows terminated, steps executed, queue depth, and steps recovered after lease expiry. A configurable retention policy purges old flows automatically.
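The set of terminal statuses that trigger `OnFlowStopped` can be captured in a small helper; the helper is illustrative, not the Foreman's actual code, though the status names come from this release:

```go
package main

import "fmt"

// isTerminal reports whether a flow status is terminal, i.e. one that
// triggers OnFlowStopped. Illustrative helper, not the real implementation.
func isTerminal(status string) bool {
	switch status {
	case "completed", "failed", "cancelled", "interrupted":
		return true
	}
	return false
}

func main() {
	fmt.Println(isTerminal("completed"), isTerminal("running")) // true false
}
```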
## Task and Workflow Features for Microservices
Any microservice can now define task endpoints and workflow graphs as first-class features alongside functions, web handlers, and events.
- Tasks are functional endpoints on port `:428` that receive a `*workflow.Flow` and operate on shared state. Typed input arguments are automatically bound from flow state, and output arguments are written back.
- Workflows are graph-definition endpoints on port `:428` that return a `*workflow.Graph` describing the task topology, transitions, and reducers.
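The task contract can be sketched with a simplified, map-backed stand-in for the flow type; the `GetString`/`SetInt` getter names come from this release, while everything else below is illustrative:

```go
package main

import "fmt"

// flowState is a simplified stand-in for *workflow.Flow: a shared state bag
// with typed getters and setters (the real Flow also carries control signals).
type flowState map[string]any

func (f flowState) GetString(key string) string {
	s, _ := f[key].(string)
	return s
}
func (f flowState) SetString(key, val string) { f[key] = val }
func (f flowState) GetInt(key string) int {
	n, _ := f[key].(int)
	return n
}
func (f flowState) SetInt(key string, val int) { f[key] = val }

// greetTask is a hypothetical task: it reads its input from flow state and
// writes its output back, analogous to how typed task arguments are bound.
func greetTask(flow flowState) {
	name := flow.GetString("name")
	flow.SetString("greeting", "Hello, "+name)
	flow.SetInt("steps", flow.GetInt("steps")+1)
}

func main() {
	flow := flowState{"name": "world"}
	greetTask(flow)
	fmt.Println(flow.GetString("greeting"), flow.GetInt("steps")) // Hello, world 1
}
```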
The new agent skills `microbus/add-task` and `microbus/add-workflow` scaffold these features.
## LLM Core Microservice
The `llm.core` microservice bridges LLM tool-calling protocols with Microbus endpoints. It provides a Chat endpoint that manages multi-turn conversations with automatic tool resolution and execution.
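The shape of such a multi-turn loop with automatic tool resolution can be sketched roughly as follows; the `turn` type and the callback signatures are illustrative stand-ins, not the service's actual API:

```go
package main

import "fmt"

// turn models one round-trip with an LLM provider; toolCalls is non-empty
// when the model asks for tools to be executed. Names are illustrative.
type turn struct {
	text      string
	toolCalls []string
}

// chat runs the automatic tool-resolution loop: call the model, execute any
// requested tools, feed the results back, and repeat until the model answers.
func chat(model func(history []string) turn, execTool func(name string) string, prompt string) string {
	history := []string{prompt}
	for {
		t := model(history)
		if len(t.toolCalls) == 0 {
			return t.text // final answer, no more tools requested
		}
		for _, call := range t.toolCalls {
			history = append(history, execTool(call))
		}
	}
}

func main() {
	// A toy model: asks for one tool on the first call, then answers.
	model := func(history []string) turn {
		if len(history) == 1 {
			return turn{toolCalls: []string{"clock.Now"}}
		}
		return turn{text: "done after " + history[len(history)-1]}
	}
	execTool := func(name string) string { return name + " result" }
	fmt.Println(chat(model, execTool, "what time is it?"))
}
```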
The LLM service uses a pluggable provider architecture. The active provider is selected via the `ProviderHostname` configuration property, which defaults to `claude.llm.core`. Three provider microservices ship out of the box:
| Provider | Hostname | Default Model |
|---|---|---|
| Claude (Anthropic) | claude.llm.core | claude-haiku-4-5 |
| Gemini (Google) | gemini.llm.core | gemini-2.0-flash |
| OpenAI | openai.llm.core | gpt-4 |
Each provider translates the common Turn interface to its vendor-specific API via the HTTP egress proxy. Tools are resolved from OpenAPI specs of Microbus endpoints, allowing LLMs to call any microservice function as a tool.
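Conceptually, resolving a tool from an OpenAPI operation maps the operation's identifier, summary, and parameters onto the name/description/schema triple that tool-calling protocols expect. A minimal sketch with illustrative types (not the service's actual data structures):

```go
package main

import "fmt"

// toolDef is the shape that LLM tool-calling protocols commonly expect:
// a name, a description, and a JSON schema for the arguments.
type toolDef struct {
	Name        string
	Description string
	Schema      map[string]any
}

// openAPIOp is a pared-down OpenAPI operation as it might appear in a
// Microbus endpoint's spec. Field names here are illustrative.
type openAPIOp struct {
	OperationID string
	Summary     string
	Params      map[string]string // parameter name -> JSON type
}

// toTool converts an OpenAPI operation into a tool definition: the general
// idea behind exposing microservice functions as LLM tools.
func toTool(op openAPIOp) toolDef {
	props := map[string]any{}
	for name, typ := range op.Params {
		props[name] = map[string]any{"type": typ}
	}
	return toolDef{
		Name:        op.OperationID,
		Description: op.Summary,
		Schema: map[string]any{
			"type":       "object",
			"properties": props,
		},
	}
}

func main() {
	op := openAPIOp{
		OperationID: "calculator.Add",
		Summary:     "Adds two integers",
		Params:      map[string]string{"x": "integer", "y": "integer"},
	}
	tool := toTool(op)
	fmt.Println(tool.Name, len(tool.Schema["properties"].(map[string]any))) // calculator.Add 2
}
```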
Internally, the LLM service is itself implemented as an agentic workflow (ChatLoop) with tasks for initialization, LLM calls, response processing, and tool execution — demonstrating the workflow engine in a real use case.
## Restructured Agent Coding Rules
The agent coding rules in `.claude/rules/` have been restructured to reduce their footprint in LLM context windows. Topic-specific guidance, such as `workflows.txt` and `sequel.txt`, has been split into dedicated files that are loaded on demand, only when relevant to the microservice being modified.