$ pwd
autonomyx-llm-gateway

// Autonomyx LLM Gateway — end-to-end LiteLLM proxy setup for 13 providers: Ollama, vLLM, TGI, OpenAI, Claude, Gemini, Mistral, Groq, Fireworks, Together.ai, OpenRouter, Azure, Bedrock. Produces config.yaml, docker-compose.yml, .env.example, a token-count test script, a Prometheus/Grafana billing stack, Langflow wiring, and autonomyx-mcp wiring. Deploys to Coolify or generic Docker. ALWAYS trigger for: LiteLLM, LLM proxy, LLM gateway, model routing, virtual keys, token tracking, cost tracking, LLM billing, "route to multiple models", "unified LLM API", "OpenAI-compatible endpoint", "connect Langflow to LLMs", "LiteLLM config", "docker compose for LLM", or any request to configure, deploy, or extend multi-model LLM infrastructure.
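// For orientation, a LiteLLM proxy config.yaml for this kind of gateway might look like the minimal sketch below. The model names, env-var names, and the Ollama base URL are illustrative assumptions, not the Skill's actual generated output:

```yaml
# config.yaml — minimal LiteLLM proxy sketch (hypothetical values)
model_list:
  # Route the public alias "gpt-4o" to OpenAI, key pulled from the environment
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  # Route "local-llama" to a local Ollama instance (assumed default port 11434)
  - model_name: local-llama
    litellm_params:
      model: ollama/llama3
      api_base: http://localhost:11434

general_settings:
  # Master key that authorizes creating virtual keys and querying spend
  master_key: os.environ/LITELLM_MASTER_KEY
```

// Clients then hit the proxy's OpenAI-compatible /v1/chat/completions endpoint with a virtual key, and the proxy handles routing, token counting, and per-key cost tracking.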

stars:1
forks:0
updated:May 6, 2026 at 09:27
SKILL.md