# brainstorm
| Field | Value |
|---|---|
| name | brainstorm |
| description | Use when exploring a vague idea, finding inspiration, or researching a technique before planning. Triggers on "I want to add...", "what if...", "research", "look up", or when uncertain about what to build. |
Turn a vague idea into a researched concept through collaborative dialogue. Explore together, then fetch real references and write a research document.
---

## Phase 1: Context

**Goal**: Understand starting point and context
@docs/effects.md
**Actions**:
- Check `docs/research/` for existing research on this topic

---

## Phase 2: Exploration

**Goal**: Build up understanding through dialogue
**Actions**:
Note: Skip audio reactivity questions; all parameters are exposed to the modulation engine, and users configure their own routes. The interaction patterns above concern structural relationships between parameters, not which audio source maps where.
STOP: Do not proceed until concept direction is clear enough to research.
---

## Phase 3: Classification

**Goal**: Determine domain type for research
**Actions**:
| Type | Characteristics | Example |
|---|---|---|
| Effect (Transform) | Reads input texture, outputs modified pixels | Kuwahara, kaleidoscope |
| Effect (Feedback) | Reads previous frame, accumulates over time | Flow field, blur |
| Effect (Output) | Post-process after transforms | Chromatic aberration, bloom |
| Simulation | Per-agent or field-based compute, writes to texture | Physarum, boids, curl flow |
| Drawable | Generates geometry drawn to accumulation buffer | Waveforms, particle trails |
| General | Software/architecture, not a visual pipeline component | UI redesign, serialization |
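To make the first two effect categories concrete, here is a minimal GLSL sketch of a transform effect versus a feedback effect. The uniform and varying names (`uInput`, `uPrevFrame`, `vUv`) are hypothetical, not taken from this codebase:

```glsl
#version 300 es
precision highp float;
// Transform effect (sketch): reads only the current input texture.
uniform sampler2D uInput;
in vec2 vUv;
out vec4 fragColor;

void main() {
    // Kaleidoscope-style warp: fold the angle into a 60-degree wedge, resample.
    vec2 p = vUv - 0.5;
    float a = mod(atan(p.y, p.x), radians(60.0));
    fragColor = texture(uInput, 0.5 + length(p) * vec2(cos(a), sin(a)));
}
```

```glsl
#version 300 es
precision highp float;
// Feedback effect (sketch): also samples the previous frame's output
// (a ping-pong texture), so state accumulates over time.
uniform sampler2D uInput;
uniform sampler2D uPrevFrame;
in vec2 vUv;
out vec4 fragColor;

void main() {
    vec4 current = texture(uInput, vUv);
    vec4 history = texture(uPrevFrame, vUv);
    fragColor = mix(history, current, 0.1); // slow blend -> trails
}
```

The structural difference is the second sampler: a feedback effect reads its own previous output, so its behavior accumulates across frames instead of depending only on the current input.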
Present classification with one sentence explaining why.
Read memories from MEMORY.md matching the classified domain.
STOP: Do not proceed until user confirms classification.
---

## Phase 4: Research

**Goal**: Find real, fetchable references
**Actions**:
Search with WebSearch, prioritizing bot-friendly sources:
Blocked sites (do NOT attempt to fetch):
Shadertoy attribution (REQUIRED): When the user pastes Shadertoy code, you MUST collect the shader title, author, URL (e.g. https://www.shadertoy.com/view/XXXX), and license.

Fetch promising URLs. For each attempt, report:
If ALL fetches fail: STOP. Ask user to provide a source or paste code.
Pick one approach. If multiple sources describe different methods, select the one that best fits this codebase's complexity level and pipeline. State why. Do not document alternatives.
---

## Phase 5: Compatibility Check

**Goal**: Verify the technique works in this pipeline
STOP: Do not proceed until user approves compatibility.
---

## Phase 6: Research Document

**Goal**: Create `docs/research/<name>.md`
**Actions**:
# [Name]
[One paragraph: what the viewer sees or what emerges]
## Classification
- **Category**: [e.g. TRANSFORMS > Warp, SIMULATIONS, GENERATORS > Texture]
- **Pipeline Position**: [From inventory pipeline order]
- **Compute Model**: [Compute shader + trail texture / Texture ping-pong] (simulations only)
## Attribution
If this effect derives from a Shadertoy shader or other CC-licensed source:
- **Based on**: "[Shader Title]" by [Author]
- **Source**: [URL]
- **License**: [CC BY-NC-SA 3.0 / other]
Omit this section if the effect is original or based only on academic techniques.
## References
- [Title](URL) - [What this source provides]
## Reference Code
Include the EXACT working code from references here — user-pasted Shadertoy shaders, fetched GLSL snippets, algorithm implementations. This is the source of truth for implementing agents. Do NOT paraphrase or rewrite into pseudocode.
```glsl
// Paste complete reference code here, unmodified
```

[Adaptation notes: what changes are needed to fit this codebase's conventions.]
## Parameters

| Parameter | Type | Range | Default | Effect |
|---|---|---|---|---|
## Parameter Interactions

Document parameter pairs with cascading thresholds, competing forces, or resonance dynamics (defined in Phase 2). If no meaningful interactions exist, omit this section.
DO NOT prescribe audio sources or mapping recommendations. Users configure their own modulation routes.
## Notes

[Caveats, performance, edge cases]
For drawables or general features: use the template above, adjusted for the domain. Skip pipeline position for general features.
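As an illustration, a filled-in parameter table for a hypothetical kaleidoscope effect might look like this (the names, ranges, and defaults are invented for the example, not taken from any real effect):

```markdown
| Parameter | Type  | Range   | Default | Effect                                 |
|-----------|-------|---------|---------|----------------------------------------|
| segments  | int   | 2-16    | 6       | Number of mirror wedges                |
| rotation  | float | 0-360   | 0       | Wedge rotation in degrees              |
| mixAmount | float | 0.0-1.0 | 1.0     | Blend between input and warped output  |
```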
---
## Phase 7: Summary
**Actions**:
1. Tell user:
- Research document location
- Classification
- "To plan implementation: `/feature-plan docs/research/<name>.md`"
---
## Output Constraints
- ONLY create `docs/research/<name>.md`
- Do NOT create plan documents
- Do NOT proceed past gates without user approval
- Do NOT present alternative approaches in the research doc
- Do NOT hallucinate algorithms—real references or stop
---
## Red Flags - STOP
| Thought | Reality |
|---------|---------|
| "They need to pick one or the other" | "Both" is valid. Explore the combination. |
| "I'll skip exploration, the idea is clear" | Phase 2 builds shared understanding. Do it. |
| "I can invent the algorithm" | Real references only. Ask for sources if fetches fail. |
| "I'll describe the algorithm in prose" | Include the EXACT reference code. Agents invent broken alternatives from prose. |
| "I'll include multiple approaches in the doc" | One approach. The chosen one. |
| "This technique is obviously compatible" | Run the compatibility gate anyway. |
| "I know what they want" | Ask. One question at a time. |