claude-code/mcp/llm-router/providers/opencode.py
OpenCode Test · df6cf94dae · feat(external-llm): add external LLM integration (fc-004)
Implements external LLM routing via opencode CLI for:
- GitHub Copilot (gpt-5.2, claude-sonnet-4.5, claude-haiku-4.5, o3, gemini-3-pro)
- Z.AI (glm-4.7 for code generation)
- OpenCode native (big-pickle)

Components:
- mcp/llm-router/invoke.py: Main router with task-based model selection
- mcp/llm-router/delegate.py: Agent delegation helper (respects external mode)
- mcp/llm-router/toggle.py: Enable/disable external-only mode (sketched after this list)
- mcp/llm-router/providers/: CLI wrappers for opencode and gemini

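A minimal sketch of how the toggle component above could persist external-only mode; the state/external-mode.json path is named under Features below, while the JSON schema and function names are assumptions:

import json
from pathlib import Path

# Hypothetical state file location; the relative layout under llm-router is assumed.
STATE_FILE = Path(__file__).parent / "state" / "external-mode.json"

def is_external_mode() -> bool:
    """Return True if external-only mode is currently enabled."""
    if not STATE_FILE.exists():
        return False
    return json.loads(STATE_FILE.read_text()).get("enabled", False)

def set_external_mode(enabled: bool) -> None:
    """Persist the toggle so it survives across sessions."""
    STATE_FILE.parent.mkdir(parents=True, exist_ok=True)
    STATE_FILE.write_text(json.dumps({"enabled": enabled}, indent=2))
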
Features:
- Persistent toggle via state/external-mode.json
- Task routing: reasoning -> gpt-5.2, code-gen -> glm-4.7, long-context -> gemini
- Claude tier mapping: opus -> gpt-5.2, sonnet -> claude-sonnet-4.5, haiku -> claude-haiku-4.5 (routing tables sketched after this list)
- Session-start hook announces when external mode is active
- Natural language toggle support via component registry

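A minimal sketch of how the task routing and Claude tier mapping above might be encoded in invoke.py; the model names come from this commit message, while the table structure and the non-Copilot provider slugs are assumptions:

# Hypothetical lookup tables for invoke.py (structure assumed).
TASK_ROUTES = {
    "reasoning":    ("opencode", ["-m", "github-copilot/gpt-5.2"]),
    "code-gen":     ("opencode", ["-m", "zai/glm-4.7"]),      # Z.AI slug is an assumption
    "long-context": ("gemini",   ["-m", "gemini-3-pro"]),     # gemini wrapper args are an assumption
}

TIER_ROUTES = {
    "opus":   "github-copilot/gpt-5.2",
    "sonnet": "github-copilot/claude-sonnet-4.5",   # Copilot slugs for the Claude tiers are assumed
    "haiku":  "github-copilot/claude-haiku-4.5",
}

invoke.py could then look up the entry for the requested task or tier and pass the CLI args to the matching provider wrapper.
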
Plan: gleaming-routing-mercury

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-08 13:34:35 -08:00

53 lines · 1.4 KiB · Python · Executable File

#!/usr/bin/env python3
"""OpenCode CLI wrapper for GitHub Copilot, Z.AI, and other providers."""
import subprocess
from typing import List


def invoke(cli_args: List[str], prompt: str, timeout: int = 300) -> str:
    """
    Invoke opencode CLI with given args and prompt.

    Args:
        cli_args: Model args like ["-m", "github-copilot/gpt-5.2"]
        prompt: The prompt text
        timeout: Timeout in seconds (default 5 minutes)

    Returns:
        Model response as string

    Raises:
        RuntimeError: If opencode CLI fails
        TimeoutError: If request exceeds timeout

    Example invocation:
        opencode -m github-copilot/gpt-5.2 -p "Hello world"
    """
    cmd = ["opencode"] + cli_args + ["-p", prompt]
    try:
        result = subprocess.run(
            cmd,
            capture_output=True,
            text=True,
            timeout=timeout
        )
    except subprocess.TimeoutExpired:
        raise TimeoutError(f"opencode timed out after {timeout}s")
    if result.returncode != 0:
        raise RuntimeError(f"opencode failed (exit {result.returncode}): {result.stderr}")
    return result.stdout.strip()


if __name__ == "__main__":
    # Quick test
    import sys

    if len(sys.argv) > 1:
        response = invoke(["-m", "github-copilot/gpt-5.2"], sys.argv[1])
        print(response)
    else:
        print("Usage: opencode.py 'prompt'")