feat(external-llm): add external LLM integration (fc-004)
Implements external LLM routing via the opencode CLI for:

- GitHub Copilot (gpt-5.2, claude-sonnet-4.5, claude-haiku-4.5, o3, gemini-3-pro)
- Z.AI (glm-4.7 for code generation)
- OpenCode native (big-pickle)

Components:

- mcp/llm-router/invoke.py: main router with task-based model selection
- mcp/llm-router/delegate.py: agent delegation helper (respects external mode)
- mcp/llm-router/toggle.py: enable/disable external-only mode
- mcp/llm-router/providers/: CLI wrappers for opencode and gemini

Features:

- Persistent toggle via state/external-mode.json
- Task routing: reasoning -> gpt-5.2, code-gen -> glm-4.7, long-context -> gemini
- Claude tier mapping: opus -> gpt-5.2, sonnet -> claude-sonnet-4.5, haiku -> claude-haiku-4.5
- Session-start hook announces when external mode is active
- Natural-language toggle support via the component registry

Plan: gleaming-routing-mercury

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
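For orientation, here is a minimal sketch of the task routing and Claude tier mapping described above. The table names and the select_model helper are illustrative assumptions, not the actual invoke.py interface.

```python
# Illustrative sketch only; TASK_ROUTES, TIER_MAP and select_model are
# assumed names, not the real invoke.py API.

# Task-based model selection described in the commit message.
TASK_ROUTES = {
    "reasoning": "gpt-5.2",
    "code-gen": "glm-4.7",
    "long-context": "gemini-3-pro",
}

# Claude tier mapping used when external-only mode is active.
TIER_MAP = {
    "opus": "gpt-5.2",
    "sonnet": "claude-sonnet-4.5",
    "haiku": "claude-haiku-4.5",
}


def select_model(task: str | None = None, tier: str | None = None) -> str:
    """Pick an external model by task type, falling back to the tier map."""
    if task and task in TASK_ROUTES:
        return TASK_ROUTES[task]
    if tier and tier in TIER_MAP:
        return TIER_MAP[tier]
    # Default to the OpenCode native model when nothing more specific matches.
    return "big-pickle"
```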
@@ -143,6 +143,23 @@
         "block dangerous",
         "protect"
       ]
     },
+    "external-mode": {
+      "description": "Toggle between Claude and external LLMs (Copilot, Z.AI, Gemini)",
+      "script": "~/.claude/mcp/llm-router/toggle.py",
+      "triggers": [
+        "external",
+        "use external",
+        "switch to external",
+        "external models",
+        "external only",
+        "use copilot",
+        "use opencode",
+        "back to claude",
+        "use claude again",
+        "disable external",
+        "external mode"
+      ]
+    }
   },
   "commands": {
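The registry entry above wires the natural-language triggers to toggle.py. A minimal sketch of how the persistent toggle could work follows, assuming state/external-mode.json lives under ~/.claude/ and holds a single "enabled" flag; the path, schema, and helper names are assumptions, not the real script.

```python
# Sketch of the persistent external-only toggle; the state file location,
# its JSON schema, and these helper names are assumptions.
import json
from pathlib import Path

STATE_FILE = Path.home() / ".claude" / "state" / "external-mode.json"


def is_external_mode() -> bool:
    """Return True when external-only mode has been toggled on."""
    try:
        return bool(json.loads(STATE_FILE.read_text()).get("enabled", False))
    except (FileNotFoundError, json.JSONDecodeError):
        return False


def set_external_mode(enabled: bool) -> None:
    """Persist the toggle so it survives across sessions."""
    STATE_FILE.parent.mkdir(parents=True, exist_ok=True)
    STATE_FILE.write_text(json.dumps({"enabled": enabled}, indent=2))
```

Under this assumption, delegate.py and the session-start hook would consult is_external_mode() before deciding whether to route a request to an external provider.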