---
name: pocketflow
description: Build LLM-powered workflows using the PocketFlow framework — a minimalist Python library for chaining AI nodes into graphs and pipelines. Use when the user wants to create an AI agent, multi-step LLM pipeline, agentic loop, batch processing workflow, or async AI flow. Triggers on "build a pocketflow", "create an AI workflow", "make an agent with pocketflow", "pocketflow".
---
# PocketFlow Skill
PocketFlow is a ~100-line Python framework for building LLM workflows as graphs of nodes. The library is bundled at `scripts/pocketflow.py` — copy it into the user's project.
## Core Concepts

- `shared`: dict passed through all nodes (global state)
- `Node`: single step (`prep → exec → post`)
- `Flow`: graph of nodes with conditional transitions
- `BatchNode`: process a list of items through one node
## Node Lifecycle

```python
def prep(self, shared): ...        # read inputs from shared
def exec(self, prep_res): ...      # do the work (e.g., an LLM call)
def post(self, shared, prep_res, exec_res): ...  # store results, return next action
```

`post` returns an action string that determines the next node in the flow (`"default"`, `"retry"`, `"done"`, etc.).
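The lifecycle needs no framework to demonstrate; a minimal sketch in plain Python (the `MiniNode` and `Greet` classes below are hypothetical stand-ins, not pocketflow's actual code) shows how the three hooks hand data along:

```python
class MiniNode:
    """Stand-in illustrating the prep -> exec -> post lifecycle only."""
    def prep(self, shared):                      # read inputs from shared
        return None
    def exec(self, prep_res):                    # do the work
        return None
    def post(self, shared, prep_res, exec_res):  # write results, pick next action
        return "default"
    def run(self, shared):
        prep_res = self.prep(shared)
        exec_res = self.exec(prep_res)
        return self.post(shared, prep_res, exec_res)

class Greet(MiniNode):
    def prep(self, shared):
        return shared["name"]
    def exec(self, name):
        return f"Hello, {name}!"
    def post(self, shared, prep_res, exec_res):
        shared["greeting"] = exec_res
        return "default"

shared = {"name": "Ada"}
action = Greet().run(shared)
print(shared["greeting"], action)  # → Hello, Ada! default
```

Note that `exec` never touches `shared` directly: all reads happen in `prep`, all writes in `post`, which keeps the work step easy to retry.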
## Node Types

| Class | Use case |
|---|---|
| `Node` | Single step with optional retries |
| `BatchNode` | Run `exec()` on each item of a list |
| `Flow` | Orchestrate nodes into a graph |
| `BatchFlow` | Run a flow for each item in a batch |
| `AsyncNode` | Async version of `Node` |
| `AsyncBatchNode` | Async batch processing |
| `AsyncParallelBatchNode` | Parallel async batch (`asyncio.gather`) |
| `AsyncFlow` | Async orchestrator |
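The batch variants change only how `exec` is applied: `prep` returns a list and `exec` runs once per item. A sketch of that semantics (hypothetical `MiniBatchNode`, not the library's code):

```python
class MiniBatchNode:
    """Sketch of BatchNode semantics: prep returns a list, exec runs per item."""
    def prep(self, shared):
        return shared["items"]
    def exec(self, item):
        return item * 2
    def run(self, shared):
        items = self.prep(shared)
        return [self.exec(i) for i in items]  # results collected in input order

print(MiniBatchNode().run({"items": [1, 2, 3]}))  # → [2, 4, 6]
```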
## Usage Pattern

### 1. Copy the Library

```bash
cp path/to/skills/pocketflow/scripts/pocketflow.py ./pocketflow.py
```
### 2. Build Nodes

```python
from pocketflow import Node, Flow

class FetchData(Node):
    def prep(self, shared):
        return shared["query"]

    def exec(self, query):
        return call_llm(query)

    def post(self, shared, prep_res, exec_res):
        shared["result"] = exec_res
        return "default"

class Validate(Node):
    def prep(self, shared):
        return shared["result"]        # pass the fetched result to exec

    def exec(self, prep_res):
        return check_quality(prep_res)

    def post(self, shared, prep_res, exec_res):
        return "done" if exec_res else "retry"
```
### 3. Wire the Flow

```python
fetch = FetchData(max_retries=3, wait=1)
validate = Validate()

fetch >> validate              # on "default", go to validate
validate - "retry" >> fetch    # on "retry", loop back to fetch

flow = Flow(start=fetch)
```
### 4. Run

```python
shared = {"query": "What is the capital of France?"}
flow.run(shared)
print(shared["result"])
```
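Under the hood, running a flow is just a loop over action strings. A simplified sketch of the orchestration (hypothetical `Step`/`MiniFlow` classes, not pocketflow's exact implementation):

```python
class Step:
    def __init__(self, fn):
        self.fn = fn            # fn(shared) -> action string
        self.successors = {}    # action -> next Step

    def run(self, shared):
        return self.fn(shared)

class MiniFlow:
    """Sketch of Flow orchestration: follow action strings until no successor."""
    def __init__(self, start):
        self.start = start

    def run(self, shared):
        node = self.start
        while node is not None:
            action = node.run(shared) or "default"
            node = node.successors.get(action)   # stop when action has no edge

fetch = Step(lambda s: s.update(result=s["query"].upper()) or "default")
validate = Step(lambda s: "done")        # "done" has no successor, so the flow ends
fetch.successors["default"] = validate

shared = {"query": "hi"}
MiniFlow(fetch).run(shared)
print(shared["result"])  # → HI
```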
## Retry Pattern

```python
class RobustLLMCall(Node):
    def __init__(self):
        super().__init__(max_retries=3, wait=2)  # 3 attempts, 2 s between them

    def exec(self, prompt):
        return call_llm(prompt)

    def exec_fallback(self, prep_res, exc):
        return "fallback response"               # used once retries are exhausted
```
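The retry behavior can be sketched as a plain loop (assumed semantics: retry `exec` up to `max_retries` times, sleeping `wait` seconds between attempts, then fall back; the helper below is illustrative, not the library's code):

```python
import time

def run_with_retries(exec_fn, prep_res, max_retries=3, wait=0, fallback=None):
    """Sketch of Node retry semantics: max_retries attempts, then the fallback."""
    for attempt in range(max_retries):
        try:
            return exec_fn(prep_res)
        except Exception as exc:
            if attempt == max_retries - 1:      # attempts exhausted
                if fallback is not None:
                    return fallback(prep_res, exc)
                raise
            time.sleep(wait)                    # back off before retrying

calls = []
def flaky(prompt):
    calls.append(prompt)
    if len(calls) < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(run_with_retries(flaky, "hello", max_retries=3, wait=0))  # → ok
```

Because `exec` is isolated from `shared`, retrying it is safe: no partial writes leak into the global state on a failed attempt.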
## Async Pattern

```python
import asyncio

from pocketflow import AsyncNode, AsyncFlow

class AsyncLLMNode(AsyncNode):
    async def prep_async(self, shared):
        return shared["prompt"]            # input for exec_async

    async def exec_async(self, prompt):
        return await async_llm_call(prompt)

    async def post_async(self, shared, prep_res, exec_res):
        shared["result"] = exec_res
        return "default"

node = AsyncLLMNode()
flow = AsyncFlow(start=node)

shared = {"prompt": "What is the capital of France?"}
asyncio.run(flow.run_async(shared))
```
## Parallel Batch Pattern

```python
from pocketflow import AsyncParallelBatchNode, AsyncFlow

class ParallelSummarizer(AsyncParallelBatchNode):
    async def prep_async(self, shared):
        return shared["docs"]             # list of items; exec_async runs per item

    async def exec_async(self, doc):
        return await summarize(doc)       # all docs summarized concurrently

    async def post_async(self, shared, prep_res, exec_res):
        shared["summaries"] = exec_res    # results collected in input order
        return "default"

flow = AsyncFlow(start=ParallelSummarizer())
```
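Stripped of the framework, the heart of the parallel batch pattern is `asyncio.gather`: one coroutine per item, run concurrently, results kept in input order. A self-contained sketch (the `summarize` stand-in below is hypothetical):

```python
import asyncio

async def summarize(doc):
    await asyncio.sleep(0)      # stand-in for a real async LLM call
    return doc[:10]

async def run_parallel(docs):
    # fan out: one task per item, gathered concurrently, results in input order
    return await asyncio.gather(*(summarize(d) for d in docs))

summaries = asyncio.run(run_parallel(["first document", "second document"]))
print(summaries)  # → ['first docu', 'second doc']
```

This is why the parallel variant pays off for I/O-bound work like LLM calls: total latency approaches the slowest single call rather than the sum of all calls.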
## Workflow for Building a PocketFlow App

- Copy `scripts/pocketflow.py` into the project
- Map the workflow — identify each distinct step as a node
- Design `shared` — define what data flows through (keys in the `shared` dict)
- Implement nodes — one class per step
- Wire transitions — `>>` for linear, `- "action" >>` for conditional
- Choose async if making parallel LLM calls
- Add retries on unreliable nodes (LLM calls, APIs)
- Test with `flow.run(shared)` and inspect the `shared` dict after
## Operator Shortcuts

```python
a >> b             # go to b when post returns "default"
a - "retry" >> b   # go to b when post returns "retry"
```
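These shortcuts rely on Python operator overloading: `-` binds tighter than `>>`, so `a - "retry" >> b` parses as `(a - "retry") >> b`. A sketch of how such chaining can be implemented (hypothetical `ChainNode`, not pocketflow's exact code):

```python
class ChainNode:
    """Sketch of >> and - "action" >> chaining via __rshift__ and __sub__."""
    def __init__(self, name):
        self.name = name
        self.successors = {}

    def __rshift__(self, other):          # a >> b: transition on "default"
        self.successors["default"] = other
        return other                      # return target so chains compose

    def __sub__(self, action):            # a - "retry": capture the action name
        return _Transition(self, action)

class _Transition:
    def __init__(self, node, action):
        self.node, self.action = node, action

    def __rshift__(self, other):          # (a - "retry") >> b: conditional edge
        self.node.successors[self.action] = other
        return other

a, b = ChainNode("a"), ChainNode("b")
a >> b
a - "retry" >> a                          # retry loops back to a itself
print(sorted(a.successors))  # → ['default', 'retry']
```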