arc-dispatching-parallel
// Use when dispatching multiple independent features within a worktree session
| name | arc-dispatching-parallel |
|---|---|
| description | Use when dispatching multiple independent features within a worktree session |
Dispatch multiple agents for independent tasks in parallel.
```dot
digraph when_to_use {
    "Multiple tasks or failures?" [shape=diamond];
    "Independent?" [shape=diamond];
    "DAG context available?" [shape=diamond];
    "Use DAG readiness path" [shape=box];
    "Use independent failures path" [shape=box];
    "Use sequential/other skill" [shape=box];

    "Multiple tasks or failures?" -> "Independent?" [label="yes"];
    "Multiple tasks or failures?" -> "Use sequential/other skill" [label="no"];
    "Independent?" -> "DAG context available?" [label="yes"];
    "Independent?" -> "Use sequential/other skill" [label="no"];
    "DAG context available?" -> "Use DAG readiness path" [label="yes"];
    "DAG context available?" -> "Use independent failures path" [label="no"];
}
```
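The decision graph above can be expressed as a small helper function (the name `choose_path` and its boolean inputs are illustrative, not part of the skill's API):

```python
def choose_path(multiple_tasks: bool, independent: bool, has_dag: bool) -> str:
    """Mirror the when_to_use decision graph above."""
    if not multiple_tasks:
        return "sequential/other skill"
    if not independent:
        return "sequential/other skill"
    if has_dag:
        return "DAG readiness path"
    return "independent failures path"

print(choose_path(True, True, True))   # DAG readiness path
print(choose_path(True, True, False))  # independent failures path
```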
Use when:
- Multiple independent tasks or failures can be worked on at the same time

Don't use when:
- Tasks depend on each other or must run in a specific order (use a sequential skill instead)
Group by independence:
- Batch together tasks that touch different files or subsystems and have no ordering requirement

Independence checks (examples):
- Different files, different subsystems, no shared state, no dependency between fixes

Each agent gets:
- A focused, self-contained prompt with context, constraints, and an expected return format
Prompt template:
```
Fix <problem-domain> in <file-or-subsystem>.

Context:
- Failure 1: <name> (<error/message>)
- Failure 2: <name> (<error/message>)

Constraints:
- Don't change unrelated files
- Avoid refactors outside this scope

Return:
- Root cause
- Fix summary
- Files changed
```
```
Task tool (general-purpose): "Fix issue A in file X"
Task tool (general-purpose): "Fix issue B in file Y"
Task tool (general-purpose): "Fix issue C in file Z"
# All three dispatch simultaneously
```
When dag.yaml exists (from /arc-planning), use this structured workflow.
If you don't have a DAG, skip to Without DAG: Independent Failures below.
Identify the spec you are working within (from the worktree's .arcforge-epic marker or the --spec-id argument passed by the lead), then:
```shell
cat specs/<spec-id>/dag.yaml
```
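The exact dag.yaml schema is defined by `/arc-planning`; as an illustration only, a minimal sketch of the assumed shape:

```yaml
# Hypothetical dag.yaml shape -- the actual schema comes from /arc-planning
features:
  - id: feature-a
    status: ready
    dependencies: []
  - id: feature-d
    status: pending
    dependencies: [feature-a]
```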
Parse the structure to understand which features exist, their current statuses, and the dependency edges between them.
A feature is "ready" when:
- Its status is already `ready`, or
- It is still pending and all of its dependencies have status `complete`
```python
# Pseudocode: collect features whose dependencies are satisfied.
# The "pending" guard keeps already-complete features from being re-dispatched.
ready_features = []
for feature in all_features:
    if feature.status == "ready":
        ready_features.append(feature)
    elif feature.status == "pending" and all(
        dep.status == "complete" for dep in feature.dependencies
    ):
        ready_features.append(feature)
```
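A self-contained version of the readiness check; the `Feature` dataclass is a hypothetical stand-in for a dag.yaml entry, not the actual arcforge type:

```python
from dataclasses import dataclass, field

@dataclass
class Feature:
    # Illustrative shape only; real entries come from parsing dag.yaml
    name: str
    status: str = "pending"  # "pending" | "ready" | "complete"
    dependencies: list = field(default_factory=list)

def ready_features(all_features):
    """Return features that can be dispatched now."""
    ready = []
    for f in all_features:
        if f.status == "ready":
            ready.append(f)
        elif f.status == "pending" and all(
            d.status == "complete" for d in f.dependencies
        ):
            ready.append(f)
    return ready

a = Feature("A", status="complete")
b = Feature("B", status="ready")
d = Feature("D", dependencies=[a])   # dep complete -> dispatchable
e = Feature("E", dependencies=[b])   # dep only "ready" -> must wait
print([f.name for f in ready_features([a, b, d, e])])  # ['B', 'D']
```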
Features are independent when no dependency path connects them in the DAG: neither depends on the other, directly or transitively.
Example:
Features A, B, C have no dependencies → Group 1 (parallel)
Feature D depends on A → Must wait for Group 1
Feature E depends on B and C → Must wait for Group 1
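The grouping in the example above amounts to topological leveling. A minimal sketch, assuming dependencies are given as a name-to-deps mapping (not the actual dag.yaml parser):

```python
def parallel_groups(deps):
    """Split features into waves that can run in parallel.

    deps maps each feature name to the list of features it depends on.
    Each wave contains only features whose deps are in earlier waves.
    """
    done, groups = set(), []
    remaining = dict(deps)
    while remaining:
        wave = sorted(f for f, d in remaining.items() if set(d) <= done)
        if not wave:
            raise ValueError("cycle detected in dependencies")
        groups.append(wave)
        done.update(wave)
        for f in wave:
            del remaining[f]
    return groups

print(parallel_groups({
    "A": [], "B": [], "C": [],
    "D": ["A"], "E": ["B", "C"],
}))  # [['A', 'B', 'C'], ['D', 'E']]
```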
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
✅ Parallelization analysis complete
**Can run in parallel NOW:**
- Feature A: <description>
- Feature B: <description>
- Feature C: <description>
**Must wait (blocked):**
- Feature D: <description> (depends on: A)
- Feature E: <description> (depends on: B, C)
**Execution approach:**
Option 1: Sequential (safer)
Implement A → B → C → D → E
Option 2: Parallel Group 1 (faster)
Dispatch A, B, C in parallel
Then implement D, E after Group 1 complete
Which approach? (1-2)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Use arc-implementing to implement features one at a time in dependency order.
For each feature in the parallel group, dispatch a separate subagent:
For each feature in Group 1:
- Use the Task tool with `subagent_type=general-purpose`
- Prompt: "Implement feature <feature-id> from specs/<spec-id>/epics/<epic>/features/<feature>.md"

Dispatch all of them at once, wait for every agent to complete, review each implementation, then proceed to the next group.
Use arc-coordinating to fetch next work:
```shell
# Get next parallelizable tasks
arc-coordinating parallel

# Or get next single task
arc-coordinating next
```
When you have multiple independent failures but no dag.yaml:
Review found 3 independent issues:
1. Missing validation in auth.py
2. Wrong error message in api.py
3. Missing test for utils.py
[Dispatch 3 agents in parallel]
Agent 1: Fixed auth.py validation
Agent 2: Fixed api.py error message
Agent 3: Added utils.py test
[Verify no conflicts]
[Run full test suite]
All fixes integrated successfully.
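The "[Verify no conflicts]" step can be approximated by checking that no two agents touched the same file. A sketch with illustrative agent names and file lists (not the skill's actual verification mechanism):

```python
from itertools import combinations

def find_conflicts(changes):
    """changes maps agent name -> files it modified; return overlapping pairs."""
    conflicts = []
    for (a, files_a), (b, files_b) in combinations(changes.items(), 2):
        shared = set(files_a) & set(files_b)
        if shared:
            conflicts.append((a, b, sorted(shared)))
    return conflicts

changes = {
    "agent-1": ["auth.py"],
    "agent-2": ["api.py"],
    "agent-3": ["tests/test_utils.py"],
}
print(find_conflicts(changes))  # [] -> safe to integrate
```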
| Excuse | Reality |
|---|---|
| "Sequential prevents conflicts" | Parallel is safe when no deps |
| "Parallelization too complex" | DAG makes it clear |
| "User knows the dependencies" | Present structured analysis |
| "Worktrees handle parallelization" | That's epic-level, not feature-level |
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
✅ Parallel execution planned
Group 1: [Features A, B, C] (in parallel)
Group 2: [Features D, E] (after Group 1)
Approach: [Sequential/Parallel]
Next feature: [Feature ID]
Next: Begin implementation with `/arc-implementing`
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
⚠️ Parallelization analysis blocked
Issue: [dag.yaml not found / No features ready / Parse error]
Location: [file or epic]
To resolve:
1. Create dag.yaml with `/arc-planning`
2. Complete dependency features first
3. Fix dag.yaml syntax
Then retry: `/arc-dispatching-parallel`
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
- `/arc-planning` creates dag.yaml
- `/arc-implementing` executes features
- `arc-coordinating parallel` (CLI command)

| Type | Scope | Tool |
|---|---|---|
| Epic-level | Multiple epics at once | Git worktrees (arc-using-worktrees) |
| Feature-level | Multiple features within epic | This skill (arc-dispatching-parallel) |
Example: two epics run in separate worktrees (epic-level, via arc-using-worktrees), while within a single epic, the independent features of Group 1 are dispatched in parallel with this skill (feature-level).