OpenCode Test · commit df6cf94dae
feat(external-llm): add external LLM integration (fc-004)
Implements external LLM routing via opencode CLI for:
- GitHub Copilot (gpt-5.2, claude-sonnet-4.5, claude-haiku-4.5, o3, gemini-3-pro)
- Z.AI (glm-4.7 for code generation)
- OpenCode native (big-pickle)

Components:
- mcp/llm-router/invoke.py: Main router with task-based model selection
- mcp/llm-router/delegate.py: Agent delegation helper (respects external mode)
- mcp/llm-router/toggle.py: Enable/disable external-only mode (a persistence sketch follows this list)
- mcp/llm-router/providers/: CLI wrappers for opencode and gemini
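The persistent toggle is the simplest piece of state here. A minimal sketch of what toggle.py might persist, assuming external mode is stored as a small JSON object in state/external-mode.json; the "enabled" key and the helper names below are assumptions, not the actual implementation:

#!/usr/bin/env python3
"""Sketch of toggle.py persistence; field and function names are assumptions."""
import json
from pathlib import Path

STATE_FILE = Path("state/external-mode.json")


def set_external_mode(enabled: bool) -> None:
    """Persist the external-only flag so later sessions and hooks can read it."""
    STATE_FILE.parent.mkdir(parents=True, exist_ok=True)
    STATE_FILE.write_text(json.dumps({"enabled": enabled}, indent=2))


def is_external_mode() -> bool:
    """Return True when external-only mode is currently enabled."""
    if not STATE_FILE.exists():
        return False
    return bool(json.loads(STATE_FILE.read_text()).get("enabled", False))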

Features:
- Persistent toggle via state/external-mode.json
- Task routing: reasoning -> gpt-5.2, code-gen -> glm-4.7, long-context -> gemini
- Claude tier mapping: opus -> gpt-5.2, sonnet -> claude-sonnet-4.5, haiku -> claude-haiku-4.5 (both mappings sketched after this list)
- Session-start hook announces when external mode is active
- Natural language toggle support via component registry
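A minimal sketch of how invoke.py might encode the two mappings above. Only the model assignments come from this commit; the dict names, the select_model helper, and the fallback to the sonnet mapping are assumptions:

from typing import Optional

# Task-based routing, as described in this commit.
TASK_MODELS = {
    "reasoning": "gpt-5.2",
    "code-gen": "glm-4.7",
    "long-context": "gemini-3-pro",
}

# Claude tier -> external model mapping, as described in this commit.
TIER_MODELS = {
    "opus": "gpt-5.2",
    "sonnet": "claude-sonnet-4.5",
    "haiku": "claude-haiku-4.5",
}


def select_model(task: Optional[str] = None, tier: Optional[str] = None) -> str:
    """Pick a model by task first, then by Claude tier, else fall back to the sonnet mapping."""
    if task in TASK_MODELS:
        return TASK_MODELS[task]
    if tier in TIER_MODELS:
        return TIER_MODELS[tier]
    return TIER_MODELS["sonnet"]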

Plan: gleaming-routing-mercury

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
2026-01-08 13:34:35 -08:00

50 lines · 1.2 KiB · Python · Executable File

#!/usr/bin/env python3
"""Gemini CLI wrapper for Google models."""
import subprocess
from typing import List


def invoke(cli_args: List[str], prompt: str, timeout: int = 300) -> str:
    """
    Invoke gemini CLI with given args and prompt.

    Args:
        cli_args: Model args like ["-m", "gemini-3-pro"]
        prompt: The prompt text
        timeout: Timeout in seconds (default 5 minutes)

    Returns:
        Model response as string

    Raises:
        RuntimeError: If gemini CLI fails
        TimeoutError: If request exceeds timeout
    """
    cmd = ["gemini"] + cli_args + ["-p", prompt]
    try:
        result = subprocess.run(
            cmd,
            capture_output=True,
            text=True,
            timeout=timeout,
        )
    except subprocess.TimeoutExpired:
        raise TimeoutError(f"gemini timed out after {timeout}s")
    if result.returncode != 0:
        raise RuntimeError(f"gemini failed (exit {result.returncode}): {result.stderr}")
    return result.stdout.strip()


if __name__ == "__main__":
    # Quick test from the command line
    import sys

    if len(sys.argv) > 1:
        response = invoke(["-m", "gemini-3-pro"], sys.argv[1])
        print(response)
    else:
        print("Usage: gemini.py 'prompt'")