feat(external-llm): add external LLM integration (fc-004)
Implements external LLM routing via opencode CLI for:
- GitHub Copilot (gpt-5.2, claude-sonnet-4.5, claude-haiku-4.5, o3, gemini-3-pro)
- Z.AI (glm-4.7 for code generation)
- OpenCode native (big-pickle)

Components:
- mcp/llm-router/invoke.py: Main router with task-based model selection
- mcp/llm-router/delegate.py: Agent delegation helper (respects external mode)
- mcp/llm-router/toggle.py: Enable/disable external-only mode
- mcp/llm-router/providers/: CLI wrappers for opencode and gemini

Features:
- Persistent toggle via state/external-mode.json
- Task routing: reasoning -> gpt-5.2, code-gen -> glm-4.7, long-context -> gemini
- Claude tier mapping: opus -> gpt-5.2, sonnet -> claude-sonnet-4.5, haiku -> claude-haiku-4.5
- Session-start hook announces when external mode is active
- Natural language toggle support via component registry

Plan: gleaming-routing-mercury

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
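Below is a minimal Python sketch of the routing behaviour described in the commit message (task routing, Claude tier mapping, and the persistent toggle in state/external-mode.json). The function names, the state-file schema, and the fallback behaviour are assumptions for illustration only, not the actual contents of mcp/llm-router/invoke.py.

# Sketch only: routing rules taken from the commit message above.
# Helper names, the state-file schema, and the fallback are assumptions.
import json
from pathlib import Path

STATE_FILE = Path("state/external-mode.json")

# Task routing from the commit message.
TASK_MODELS = {
    "reasoning": "gpt-5.2",
    "code-gen": "glm-4.7",
    "long-context": "gemini",  # exact Gemini model id not specified in the commit
}

# Claude tier mapping from the commit message.
TIER_MODELS = {
    "opus": "gpt-5.2",
    "sonnet": "claude-sonnet-4.5",
    "haiku": "claude-haiku-4.5",
}

def external_mode_enabled() -> bool:
    """Read the persistent toggle; a missing file means external mode is off."""
    if not STATE_FILE.exists():
        return False
    return bool(json.loads(STATE_FILE.read_text()).get("enabled", False))

def select_model(task: str | None = None, tier: str | None = None) -> str | None:
    """Return an external model name, or None to keep the request on Claude."""
    if not external_mode_enabled():
        return None
    if task in TASK_MODELS:
        return TASK_MODELS[task]
    if tier in TIER_MODELS:
        return TIER_MODELS[tier]
    return "big-pickle"  # assumed default: OpenCode's native model

toggle.py presumably writes something like {"enabled": true} to state/external-mode.json; the real schema and the mapping of these model names onto opencode provider ids are not shown in this commit.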
@@ -37,10 +37,11 @@
       "category": "integration",
       "title": "External LLM integration",
       "description": "Support for non-Claude models in the agent system",
-      "priority": "low",
-      "status": "deferred",
+      "priority": "medium",
+      "status": "planned",
       "created": "2024-12-28",
-      "notes": "For specialized tasks or cost optimization"
+      "plan": "gleaming-routing-mercury",
+      "notes": "Cloud APIs via subscription (Copilot, Z.AI, Gemini). Uses opencode/gemini CLIs. Session toggle for external-only mode."
     },
     {
       "id": "fc-005",