---
name: llm
description: Access and interact with Large Language Models from the command line using Simon Willison's llm CLI tool. Supports OpenAI, Anthropic, Gemini, Llama, and dozens of other models via plugins. Features include chat sessions, embeddings, structured data extraction with schemas, prompt templates, conversation logging, and tool use. This skill is triggered when the user says things like "run a prompt with llm", "use the llm command", "call an LLM from the command line", "set up llm API keys", "install llm plugins", "create embeddings", or "extract structured data from text".
---
# LLM CLI Tool Skill
A CLI tool and Python library for interacting with Large Language Models including OpenAI, Anthropic's Claude, Google's Gemini, Meta's Llama, and dozens of others via remote APIs or locally installed models.
## When to Use This Skill

Use this skill when:
- Running prompts against LLMs from the command line
- Managing conversations and chat sessions
- Working with embeddings for semantic search
- Extracting structured data using schemas
- Installing and configuring LLM plugins
- Managing API keys for various providers
- Using templates for reusable prompts
- Logging and analyzing LLM interactions
## Quick Reference

### Basic Commands

```bash
llm "Your prompt here"                    # prompt the default model
llm -m claude-4-opus "Your prompt"        # pick a model with -m
llm chat -m gpt-4.1                       # start an interactive chat session
llm "describe this" -a image.jpg          # attach an image with -a
cat file.py | llm -s "Explain this code"  # pipe stdin; -s sets a system prompt
```
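A few options compose well with the basics above: `-o` passes per-model options and `--extract` returns only the first fenced code block in the response. A sketch (prompts and model choice are illustrative):

```bash
# Combine a system prompt with a model option (-o key value)
llm -s "Answer in one sentence" -o temperature 0.2 "What is a monad?"

# Return only the first fenced code block from the response
llm "Write a Python hello world script" --extract
```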
### Key Management

```bash
llm keys set openai
llm keys set anthropic
llm keys set gemini
```
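Keys set this way are stored in a `keys.json` file managed by the tool; provider environment variables are honored as well. For example:

```bash
llm keys path   # print the location of keys.json
llm keys        # list the names of keys that have been set

# Environment variables also work, e.g. for OpenAI:
export OPENAI_API_KEY="sk-..."
llm "test prompt"
```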
### Plugin Management

```bash
llm install llm-anthropic   # Claude models
llm install llm-gemini      # Gemini models
llm install llm-ollama      # local models via Ollama
llm plugins                 # list installed plugins
```
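After installing a plugin, the models it provides appear in the model list; plugins can be removed with `llm uninstall`:

```bash
llm models                   # list all available models, including plugin models
llm uninstall llm-gemini -y  # remove a plugin, skipping the confirmation prompt
```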
## Documentation Index

### Core Documentation

- `README.md` - Project overview and quick start guide
- `docs/setup.md` - Installation and initial configuration
- `docs/usage.md` - Comprehensive CLI usage guide (prompts, chat, attachments, conversations)
- `docs/help.md` - Complete command reference and help text
### Additional Topics

- Model Configuration
- Advanced Features
- Embeddings
- Plugins
- Python API & Development
- Reference
## Common Workflows

### Starting a Conversation

```bash
llm chat -m gpt-4.1 -s "You are a helpful coding assistant"
llm -c "Follow up question"   # -c continues the most recent conversation
```
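Every prompt and response is logged to a local SQLite database by default; `llm logs` inspects and controls that history:

```bash
llm logs -n 3   # show the three most recent logged exchanges
llm logs path   # print the location of the SQLite logs database
llm logs off    # disable logging (llm logs on re-enables it)
```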
### Working with Files

```bash
cat script.py | llm "Review this code for bugs"
cat *.md | llm "Summarize these documents"
```
### Structured Output

```bash
llm -m gpt-4.1 "Extract person info" -a photo.jpg --schema 'name, age int, occupation'
```
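The concise schema syntax also accepts types and per-field descriptions, and `--schema-multi` requests a list of matching objects. A sketch with illustrative prompts:

```bash
# Typed fields plus a field description after the colon
llm --schema 'name, age int, occupation: their job title' "Invent a character"

# Ask for an array of objects matching the schema
llm --schema-multi 'name, age int' "Invent three characters"
```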
### Template Usage

```bash
llm templates                   # list available templates
llm -t summarize < article.txt  # run a template against stdin
```
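Embeddings follow a similar pattern: `llm embed` embeds a single string, `llm embed-multi` embeds in bulk into a named collection, and `llm similar` searches that collection. A sketch, assuming an embedding model is configured and the `notes` collection name and `data.csv` file are illustrative:

```bash
# Embed one string and print the vector
llm embed -c "Hello world"

# Embed rows of a CSV into a collection, then search it
llm embed-multi notes data.csv
llm similar notes -c "search query"
```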
## Included Templates

This skill includes ready-to-use prompt templates in `templates/`.

### audio-to-article.yaml

Transforms raw audio transcripts into polished, readable articles. Used by the /audio-to-article command.

```bash
cat transcript.txt | llm -t templates/audio-to-article.yaml
python3 ../parakeet/srt_to_text.py audio.srt | llm -t templates/audio-to-article.yaml
```
What it does:
- Removes filler words (um, uh, like, you know)
- Fixes transcription errors from context
- Adds paragraph breaks at topic transitions
- Creates section headers where topics shift
- Preserves the speaker's voice and meaning
- Outputs clean markdown with title
Related skills:
- parakeet - Transcribes audio to SRT; includes the `srt_to_text.py` helper for conversion
- yt-dlp - Downloads audio from URLs (YouTube, podcasts, etc.)