distil-cli

Train task-specific small language models (SLMs) using the Distil Labs CLI and platform.

Activate this skill when the user asks about:
- distil labs, the distil CLI, or the distil command
- training small language models or knowledge distillation
- creating SLMs from production traces
- data preparation for model training
- fine-tuning student models from teacher models
- deploying trained SLMs
- model evaluation metrics
- task-specific model training (classification, QA, tool calling)
- synthetic data generation for training
- any question about the distil platform, its configuration, or its workflows

Also activate when the user mentions: distil model, distil login, distil register, uploading training data, teacher evaluation, model retuning, model deployment with llama-cpp or vLLM, or when working with config.yaml / job_description.json / train.csv files for model training.
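The description above mentions a config.yaml used for model training jobs. As a purely hypothetical sketch of what such a file might contain (the key names below are assumptions for illustration, not the actual distil schema), it could tie together the pieces the skill references: a task type, a teacher/student pair for distillation, and the train.csv data file.

```yaml
# Hypothetical example only -- these keys are assumptions, NOT the real distil schema.
# Illustrates the kind of settings a task-specific SLM training config might hold.
task: classification          # assumed: one of the task types named above (classification, QA, tool calling)
teacher_model: my-teacher     # assumed: teacher model distilled from
student_model: my-student     # assumed: small student model to fine-tune
data:
  train_file: train.csv       # training-data file named in the skill description
```

Consult the distil platform documentation for the real configuration format before using anything like this.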

Stars: 158 · Forks: 9 · Updated: May 6, 2026 at 07:29
SKILL.md