feat(external-llm): add external LLM integration (fc-004)
Implements external LLM routing via opencode CLI for:
- GitHub Copilot (gpt-5.2, claude-sonnet-4.5, claude-haiku-4.5, o3, gemini-3-pro)
- Z.AI (glm-4.7 for code generation)
- OpenCode native (big-pickle)

Components:
- mcp/llm-router/invoke.py: Main router with task-based model selection
- mcp/llm-router/delegate.py: Agent delegation helper (respects external mode)
- mcp/llm-router/toggle.py: Enable/disable external-only mode
- mcp/llm-router/providers/: CLI wrappers for opencode and gemini

Features:
- Persistent toggle via state/external-mode.json
- Task routing: reasoning -> gpt-5.2, code-gen -> glm-4.7, long-context -> gemini
- Claude tier mapping: opus -> gpt-5.2, sonnet -> claude-sonnet-4.5, haiku -> claude-haiku-4.5
- Session-start hook announces when external mode is active
- Natural language toggle support via component registry

Plan: gleaming-routing-mercury

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
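As a rough illustration of the routing described in the commit message, the sketch below shows how a router like mcp/llm-router/invoke.py might resolve a model from a task type or Claude tier and honor the persistent toggle in state/external-mode.json. This is a minimal, hypothetical sketch: the function names, the state-file schema, and the exact opencode CLI invocation are assumptions, not the actual implementation.

# Hypothetical sketch of the task/tier routing described above; not the real invoke.py.
import json
import subprocess
from pathlib import Path

STATE_FILE = Path("state/external-mode.json")  # persistent toggle (path from the commit message)

# Task routing and Claude tier mapping as listed in the commit message.
TASK_MODELS = {
    "reasoning": "gpt-5.2",
    "code-gen": "glm-4.7",
    "long-context": "gemini-3-pro",  # "gemini" in the commit message; exact model id assumed
}
TIER_MODELS = {
    "opus": "gpt-5.2",
    "sonnet": "claude-sonnet-4.5",
    "haiku": "claude-haiku-4.5",
}

def external_mode_enabled() -> bool:
    """Read the persistent toggle; treat a missing file as 'off'. Schema is assumed."""
    if not STATE_FILE.exists():
        return False
    return bool(json.loads(STATE_FILE.read_text()).get("enabled", False))

def pick_model(task: str | None = None, tier: str | None = None) -> str:
    """Resolve a model name from a task type or a Claude tier, falling back to OpenCode native."""
    if task and task in TASK_MODELS:
        return TASK_MODELS[task]
    if tier and tier in TIER_MODELS:
        return TIER_MODELS[tier]
    return "big-pickle"

def invoke(prompt: str, task: str | None = None, tier: str | None = None) -> str:
    """Route a prompt to an external model via the opencode CLI.
    The exact flags ("opencode run --model ...") are an assumption, not verified."""
    if not external_mode_enabled():
        raise RuntimeError("external mode is off; use the local model path instead")
    model = pick_model(task=task, tier=tier)
    result = subprocess.run(
        ["opencode", "run", "--model", model, prompt],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

For example, invoke("Summarize this diff", task="reasoning") would route to gpt-5.2 when the toggle file marks external mode as enabled.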
@@ -121,6 +121,13 @@
       "implemented": "2026-01-07",
       "category": "enhancement",
       "notes": "This plan - meta!"
     },
+    "gleaming-routing-mercury": {
+      "title": "External LLM integration",
+      "status": "pending",
+      "created": "2026-01-08",
+      "category": "feature",
+      "notes": "fc-004 - Cloud API integration via opencode/gemini CLIs with session toggle"
+    }
   }
 }