Implements external LLM routing via opencode CLI for:

- GitHub Copilot (gpt-5.2, claude-sonnet-4.5, claude-haiku-4.5, o3, gemini-3-pro)
- Z.AI (glm-4.7 for code generation)
- OpenCode native (big-pickle)

Components:

- mcp/llm-router/invoke.py: Main router with task-based model selection
- mcp/llm-router/delegate.py: Agent delegation helper (respects external mode)
- mcp/llm-router/toggle.py: Enable/disable external-only mode
- mcp/llm-router/providers/: CLI wrappers for opencode and gemini

Features:

- Persistent toggle via state/external-mode.json
- Task routing: reasoning -> gpt-5.2, code-gen -> glm-4.7, long-context -> gemini
- Claude tier mapping: opus -> gpt-5.2, sonnet -> claude-sonnet-4.5, haiku -> claude-haiku-4.5
- Session-start hook announces when external mode is active
- Natural language toggle support via component registry

Plan: gleaming-routing-mercury

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
# External LLM Integration Implementation Plan

> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.

**Goal:** Enable agents to use external LLMs (Copilot, Z.AI, Gemini) via CLI tools with a session toggle.

**Architecture:** A Python router reads `model-policy.json`, invokes the appropriate CLI (opencode or gemini), and returns the response. A state file controls Claude vs. external mode; the PA exposes toggle commands.

**Tech Stack:** Python 3, subprocess, JSON state files, opencode CLI, gemini CLI
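
For orientation, the decision these components implement is sketched below. This is a hedged, illustrative sketch only; the real logic is built in `invoke.py` and `delegate.py` (Tasks 6-7), and it assumes the state and policy files from Tasks 1-2 exist:

```python
# Illustrative sketch only -- not the final implementation.
import json
import subprocess
from pathlib import Path

STATE = Path.home() / ".claude/state"

def route(tier: str, prompt: str) -> str:
    """Route a prompt to an external CLI or to Claude, based on the mode toggle."""
    mode = json.loads((STATE / "external-mode.json").read_text())
    policy = json.loads((STATE / "model-policy.json").read_text())
    if mode.get("enabled"):
        model = policy["claude_to_external_map"][tier]         # e.g. "copilot/gpt-5.2"
        cfg = policy["external_models"][model]
        cmd = [cfg["cli"]] + cfg["cli_args"] + ["-p", prompt]  # external provider CLI
    else:
        cmd = ["claude", "--print", "--model", tier, prompt]   # stay on Claude
    return subprocess.run(cmd, capture_output=True, text=True).stdout.strip()
```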

---

## Task 1: Create External Mode State File

**Files:**

- Create: `~/.claude/state/external-mode.json`

**Step 1: Create state file**

```bash
cat > ~/.claude/state/external-mode.json << 'EOF'
{
  "enabled": false,
  "activated_at": null,
  "reason": null
}
EOF
```

**Step 2: Verify file**

Run: `cat ~/.claude/state/external-mode.json | jq .`

Expected: Valid JSON with `enabled: false`

**Step 3: Commit**

```bash
git -C ~/.claude add state/external-mode.json
git -C ~/.claude commit -m "feat(external-llm): add external-mode state file"
```

---

## Task 2: Extend Model Policy

**Files:**

- Modify: `~/.claude/state/model-policy.json`

**Step 1: Read current file**

Run: `cat ~/.claude/state/model-policy.json | jq .`

**Step 2: Add external_models section**

Add to model-policy.json (after `skill_delegation` section):

```json
"external_models": {
  "copilot/gpt-5.2": {
    "cli": "opencode",
    "cli_args": ["--provider", "copilot", "--model", "gpt-5.2"],
    "use_cases": ["reasoning", "fallback"],
    "tier": "opus-equivalent"
  },
  "copilot/sonnet-4.5": {
    "cli": "opencode",
    "cli_args": ["--provider", "copilot", "--model", "sonnet-4.5"],
    "use_cases": ["general", "fallback"],
    "tier": "sonnet-equivalent"
  },
  "copilot/haiku-4.5": {
    "cli": "opencode",
    "cli_args": ["--provider", "copilot", "--model", "haiku-4.5"],
    "use_cases": ["simple"],
    "tier": "haiku-equivalent"
  },
  "zai/glm-4.7": {
    "cli": "opencode",
    "cli_args": ["--provider", "zai", "--model", "glm-4.7"],
    "use_cases": ["code-generation"],
    "tier": "sonnet-equivalent"
  },
  "gemini/gemini-3-pro": {
    "cli": "gemini",
    "cli_args": ["-m", "gemini-3-pro"],
    "use_cases": ["long-context"],
    "tier": "opus-equivalent"
  }
},
"claude_to_external_map": {
  "opus": "copilot/gpt-5.2",
  "sonnet": "copilot/sonnet-4.5",
  "haiku": "copilot/haiku-4.5"
},
"task_routing": {
  "reasoning": "copilot/gpt-5.2",
  "code-generation": "zai/glm-4.7",
  "long-context": "gemini/gemini-3-pro",
  "default": "copilot/sonnet-4.5"
}
```
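
If you prefer not to hand-edit the file, the three new keys can be merged in with `jq`. This is a hedged sketch: it assumes the snippet above has been saved, wrapped in a pair of outer braces, to a temporary file (`/tmp/external-models.json` is a hypothetical name) and that `model-policy.json` is a single top-level object:

```bash
# Deep-merge the new keys into the existing policy (right side wins on conflicts).
jq -s '.[0] * .[1]' \
  ~/.claude/state/model-policy.json \
  /tmp/external-models.json > /tmp/model-policy.merged.json \
  && mv /tmp/model-policy.merged.json ~/.claude/state/model-policy.json
```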

**Step 3: Validate JSON**

Run: `cat ~/.claude/state/model-policy.json | jq .`

Expected: Valid JSON, no errors

**Step 4: Commit**

```bash
git -C ~/.claude add state/model-policy.json
git -C ~/.claude commit -m "feat(external-llm): add external model definitions to policy"
```

---

## Task 3: Create Router Directory Structure

**Files:**

- Create: `~/.claude/mcp/llm-router/`
- Create: `~/.claude/mcp/llm-router/providers/`
- Create: `~/.claude/mcp/llm-router/providers/__init__.py`

**Step 1: Create directories**

```bash
mkdir -p ~/.claude/mcp/llm-router/providers
```

**Step 2: Create `__init__.py`**

```bash
touch ~/.claude/mcp/llm-router/providers/__init__.py
```

**Step 3: Verify structure**

Run: `ls -la ~/.claude/mcp/llm-router/`

Expected: `providers/` directory exists

**Step 4: Commit**

```bash
git -C ~/.claude add mcp/llm-router/
git -C ~/.claude commit -m "feat(external-llm): create llm-router directory structure"
```

---

## Task 4: Create OpenCode Provider

**Files:**

- Create: `~/.claude/mcp/llm-router/providers/opencode.py`

**Step 1: Write provider**

```python
#!/usr/bin/env python3
"""OpenCode CLI wrapper for Copilot, Z.AI, and other providers."""

import subprocess
from typing import List


def invoke(cli_args: List[str], prompt: str, timeout: int = 300) -> str:
    """
    Invoke opencode CLI with given args and prompt.

    Args:
        cli_args: Provider/model args like ["--provider", "copilot", "--model", "gpt-5.2"]
        prompt: The prompt text
        timeout: Timeout in seconds (default 5 minutes)

    Returns:
        Model response as string

    Raises:
        RuntimeError: If opencode CLI fails
        TimeoutError: If request exceeds timeout
    """
    cmd = ["opencode", "--print"] + cli_args + ["-p", prompt]

    try:
        result = subprocess.run(
            cmd,
            capture_output=True,
            text=True,
            timeout=timeout
        )
    except subprocess.TimeoutExpired:
        raise TimeoutError(f"opencode timed out after {timeout}s")

    if result.returncode != 0:
        raise RuntimeError(f"opencode failed (exit {result.returncode}): {result.stderr}")

    return result.stdout.strip()


if __name__ == "__main__":
    # Quick test
    import sys
    if len(sys.argv) > 1:
        response = invoke(["--provider", "copilot", "--model", "gpt-5.2"], sys.argv[1])
        print(response)
    else:
        print("Usage: opencode.py 'prompt'")
```

**Step 2: Make executable**

```bash
chmod +x ~/.claude/mcp/llm-router/providers/opencode.py
```

**Step 3: Verify syntax**

Run: `python3 -m py_compile ~/.claude/mcp/llm-router/providers/opencode.py`

Expected: No output (success)
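
Optionally, an end-to-end smoke test can be run here. This is hedged: it assumes the `opencode` CLI is installed and authenticated for the Copilot provider, and it simply exercises the `__main__` block above:

```bash
python3 ~/.claude/mcp/llm-router/providers/opencode.py "Reply with the single word: pong"
```

Expected: a short model response (skip this step if opencode is not installed yet).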

**Step 4: Commit**

```bash
git -C ~/.claude add mcp/llm-router/providers/opencode.py
git -C ~/.claude commit -m "feat(external-llm): add opencode provider wrapper"
```

---

## Task 5: Create Gemini Provider

**Files:**

- Create: `~/.claude/mcp/llm-router/providers/gemini.py`

**Step 1: Write provider**

```python
#!/usr/bin/env python3
"""Gemini CLI wrapper for Google models."""

import subprocess
from typing import List


def invoke(cli_args: List[str], prompt: str, timeout: int = 300) -> str:
    """
    Invoke gemini CLI with given args and prompt.

    Args:
        cli_args: Model args like ["-m", "gemini-3-pro"]
        prompt: The prompt text
        timeout: Timeout in seconds (default 5 minutes)

    Returns:
        Model response as string

    Raises:
        RuntimeError: If gemini CLI fails
        TimeoutError: If request exceeds timeout
    """
    cmd = ["gemini"] + cli_args + ["-p", prompt]

    try:
        result = subprocess.run(
            cmd,
            capture_output=True,
            text=True,
            timeout=timeout
        )
    except subprocess.TimeoutExpired:
        raise TimeoutError(f"gemini timed out after {timeout}s")

    if result.returncode != 0:
        raise RuntimeError(f"gemini failed (exit {result.returncode}): {result.stderr}")

    return result.stdout.strip()


if __name__ == "__main__":
    # Quick test
    import sys
    if len(sys.argv) > 1:
        response = invoke(["-m", "gemini-3-pro"], sys.argv[1])
        print(response)
    else:
        print("Usage: gemini.py 'prompt'")
```

**Step 2: Make executable**

```bash
chmod +x ~/.claude/mcp/llm-router/providers/gemini.py
```

**Step 3: Verify syntax**

Run: `python3 -m py_compile ~/.claude/mcp/llm-router/providers/gemini.py`

Expected: No output (success)

**Step 4: Commit**

```bash
git -C ~/.claude add mcp/llm-router/providers/gemini.py
git -C ~/.claude commit -m "feat(external-llm): add gemini provider wrapper"
```

---

## Task 6: Create Main Router (invoke.py)

**Files:**

- Create: `~/.claude/mcp/llm-router/invoke.py`

**Step 1: Write router**

```python
#!/usr/bin/env python3
"""
Invoke external LLM via configured provider.

Usage:
    invoke.py --model copilot/gpt-5.2 -p "prompt"
    invoke.py --task reasoning -p "prompt"
    invoke.py --task code-generation -p "prompt" --json

Model selection priority:
    1. Explicit --model flag
    2. Task-based routing (--task flag)
    3. Default from policy
"""

import argparse
import json
import sys
from pathlib import Path

STATE_DIR = Path.home() / ".claude/state"
ROUTER_DIR = Path(__file__).parent


def load_policy() -> dict:
    """Load model policy from state file."""
    policy_file = STATE_DIR / "model-policy.json"
    with open(policy_file) as f:
        return json.load(f)


def resolve_model(args: argparse.Namespace, policy: dict) -> str:
    """Determine which model to use based on args and policy."""
    if args.model:
        return args.model
    if args.task and args.task in policy.get("task_routing", {}):
        return policy["task_routing"][args.task]
    return policy.get("task_routing", {}).get("default", "copilot/sonnet-4.5")


def invoke(model: str, prompt: str, policy: dict, timeout: int = 300) -> str:
    """Invoke the appropriate provider for the given model."""
    external_models = policy.get("external_models", {})

    if model not in external_models:
        raise ValueError(f"Unknown model: {model}. Available: {list(external_models.keys())}")

    model_config = external_models[model]
    cli = model_config["cli"]
    cli_args = model_config.get("cli_args", [])

    # Import and invoke appropriate provider
    if cli == "opencode":
        sys.path.insert(0, str(ROUTER_DIR))
        from providers.opencode import invoke as opencode_invoke
        return opencode_invoke(cli_args, prompt, timeout)
    elif cli == "gemini":
        sys.path.insert(0, str(ROUTER_DIR))
        from providers.gemini import invoke as gemini_invoke
        return gemini_invoke(cli_args, prompt, timeout)
    else:
        raise ValueError(f"Unknown CLI: {cli}")


def main():
    parser = argparse.ArgumentParser(
        description="Invoke external LLM via configured provider"
    )
    parser.add_argument(
        "-p", "--prompt",
        required=True,
        help="Prompt text"
    )
    parser.add_argument(
        "--model",
        help="Explicit model (e.g., copilot/gpt-5.2)"
    )
    parser.add_argument(
        "--task",
        choices=["reasoning", "code-generation", "long-context", "general"],
        help="Task type for automatic model routing"
    )
    parser.add_argument(
        "--json",
        action="store_true",
        help="Output as JSON with model info"
    )
    parser.add_argument(
        "--timeout",
        type=int,
        default=300,
        help="Timeout in seconds (default: 300)"
    )

    args = parser.parse_args()

    try:
        policy = load_policy()
        model = resolve_model(args, policy)
        result = invoke(model, args.prompt, policy, args.timeout)

        if args.json:
            output = {
                "model": model,
                "response": result,
                "success": True
            }
            print(json.dumps(output, indent=2))
        else:
            print(result)

    except Exception as e:
        if args.json:
            output = {
                "model": args.model or "unknown",
                "error": str(e),
                "success": False
            }
            print(json.dumps(output, indent=2))
            sys.exit(1)
        else:
            print(f"Error: {e}", file=sys.stderr)
            sys.exit(1)


if __name__ == "__main__":
    main()
```

**Step 2: Make executable**

```bash
chmod +x ~/.claude/mcp/llm-router/invoke.py
```

**Step 3: Verify syntax**

Run: `python3 -m py_compile ~/.claude/mcp/llm-router/invoke.py`

Expected: No output (success)
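
The routing logic can also be checked without any external CLI installed, since `resolve_model` only reads the policy. A hedged check (assumes the Task 2 policy changes are already in place):

```bash
python3 -c "
import sys, argparse
sys.path.insert(0, '$HOME/.claude/mcp/llm-router')
from invoke import load_policy, resolve_model
args = argparse.Namespace(model=None, task='code-generation')
print(resolve_model(args, load_policy()))
"
```

Expected: `zai/glm-4.7`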

**Step 4: Commit**

```bash
git -C ~/.claude add mcp/llm-router/invoke.py
git -C ~/.claude commit -m "feat(external-llm): add main router invoke.py"
```

---

## Task 7: Create Delegation Helper

**Files:**

- Create: `~/.claude/mcp/llm-router/delegate.py`

**Step 1: Write delegation helper**

```python
#!/usr/bin/env python3
"""
Agent delegation helper. Routes to external or Claude based on mode.

Usage:
    delegate.py --tier sonnet -p "prompt"
    delegate.py --tier opus -p "complex reasoning task" --json
"""

import argparse
import json
import subprocess
import sys
from pathlib import Path

STATE_DIR = Path.home() / ".claude/state"
ROUTER_DIR = Path(__file__).parent


def is_external_mode() -> bool:
    """Check if external-only mode is enabled."""
    mode_file = STATE_DIR / "external-mode.json"
    if mode_file.exists():
        with open(mode_file) as f:
            data = json.load(f)
        return data.get("enabled", False)
    return False


def get_external_model(tier: str) -> str:
    """Get the external model equivalent for a Claude tier."""
    policy_file = STATE_DIR / "model-policy.json"
    with open(policy_file) as f:
        policy = json.load(f)
    mapping = policy.get("claude_to_external_map", {})
    if tier not in mapping:
        raise ValueError(f"No external mapping for tier: {tier}")
    return mapping[tier]


def delegate(tier: str, prompt: str, use_json: bool = False) -> str:
    """
    Delegate to appropriate model based on mode.

    Args:
        tier: Claude tier (opus, sonnet, haiku)
        prompt: The prompt text
        use_json: Return JSON output

    Returns:
        Model response as string
    """
    if is_external_mode():
        # Use external model
        model = get_external_model(tier)
        invoke_script = ROUTER_DIR / "invoke.py"

        cmd = [sys.executable, str(invoke_script), "--model", model, "-p", prompt]
        if use_json:
            cmd.append("--json")

        result = subprocess.run(cmd, capture_output=True, text=True)

        if result.returncode != 0:
            raise RuntimeError(f"External invoke failed: {result.stderr}")

        return result.stdout.strip()
    else:
        # Use Claude
        cmd = ["claude", "--print", "--model", tier, prompt]

        result = subprocess.run(cmd, capture_output=True, text=True)

        if result.returncode != 0:
            raise RuntimeError(f"Claude failed: {result.stderr}")

        response = result.stdout.strip()

        if use_json:
            return json.dumps({
                "model": f"claude/{tier}",
                "response": response,
                "success": True
            }, indent=2)

        return response


def main():
    parser = argparse.ArgumentParser(
        description="Delegate to Claude or external model based on mode"
    )
    parser.add_argument(
        "--tier",
        required=True,
        choices=["opus", "sonnet", "haiku"],
        help="Claude tier (maps to external equivalent when in external mode)"
    )
    parser.add_argument(
        "-p", "--prompt",
        required=True,
        help="Prompt text"
    )
    parser.add_argument(
        "--json",
        action="store_true",
        help="Output as JSON"
    )

    args = parser.parse_args()

    try:
        result = delegate(args.tier, args.prompt, args.json)
        print(result)
    except Exception as e:
        if args.json:
            print(json.dumps({"error": str(e), "success": False}, indent=2))
            sys.exit(1)
        else:
            print(f"Error: {e}", file=sys.stderr)
            sys.exit(1)


if __name__ == "__main__":
    main()
```

**Step 2: Make executable**

```bash
chmod +x ~/.claude/mcp/llm-router/delegate.py
```

**Step 3: Verify syntax**

Run: `python3 -m py_compile ~/.claude/mcp/llm-router/delegate.py`

Expected: No output (success)
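
A quick tier-mapping check that needs no CLI at all (hedged; it assumes the `claude_to_external_map` entries from Task 2 exist):

```bash
python3 -c "
import sys
sys.path.insert(0, '$HOME/.claude/mcp/llm-router')
from delegate import get_external_model
print(get_external_model('opus'))
"
```

Expected: `copilot/gpt-5.2`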

**Step 4: Commit**

```bash
git -C ~/.claude add mcp/llm-router/delegate.py
git -C ~/.claude commit -m "feat(external-llm): add delegation helper"
```

---

## Task 8: Create Toggle Script

**Files:**

- Create: `~/.claude/mcp/llm-router/toggle.py`

**Step 1: Write toggle script**

```python
#!/usr/bin/env python3
"""
Toggle external-only mode.

Usage:
    toggle.py on [--reason "user requested"]
    toggle.py off
    toggle.py status
"""

import argparse
import json
import sys
from datetime import datetime
from pathlib import Path

STATE_FILE = Path.home() / ".claude/state/external-mode.json"


def load_state() -> dict:
    """Load current state."""
    if STATE_FILE.exists():
        with open(STATE_FILE) as f:
            return json.load(f)
    return {"enabled": False, "activated_at": None, "reason": None}


def save_state(state: dict):
    """Save state to file."""
    with open(STATE_FILE, "w") as f:
        json.dump(state, f, indent=2)


def enable(reason: str = None):
    """Enable external-only mode."""
    state = {
        "enabled": True,
        "activated_at": datetime.now().isoformat(),
        "reason": reason or "user-requested"
    }
    save_state(state)
    print("External-only mode ENABLED")
    print(f" Activated: {state['activated_at']}")
    print(f" Reason: {state['reason']}")
    print("\nAll agent requests will now use external LLMs.")
    print("Run 'toggle.py off' or '/pa --external off' to disable.")


def disable():
    """Disable external-only mode."""
    state = {
        "enabled": False,
        "activated_at": None,
        "reason": None
    }
    save_state(state)
    print("External-only mode DISABLED")
    print("\nAll agent requests will now use Claude.")


def status():
    """Show current mode status."""
    state = load_state()
    if state.get("enabled"):
        print("External-only mode: ENABLED")
        print(f" Activated: {state.get('activated_at', 'unknown')}")
        print(f" Reason: {state.get('reason', 'unknown')}")
    else:
        print("External-only mode: DISABLED")
        print(" Using Claude for all requests.")


def main():
    parser = argparse.ArgumentParser(description="Toggle external-only mode")
    subparsers = parser.add_subparsers(dest="command", required=True)

    # on command
    on_parser = subparsers.add_parser("on", help="Enable external-only mode")
    on_parser.add_argument("--reason", help="Reason for enabling")

    # off command
    subparsers.add_parser("off", help="Disable external-only mode")

    # status command
    subparsers.add_parser("status", help="Show current mode")

    args = parser.parse_args()

    if args.command == "on":
        enable(args.reason)
    elif args.command == "off":
        disable()
    elif args.command == "status":
        status()


if __name__ == "__main__":
    main()
```

**Step 2: Make executable**

```bash
chmod +x ~/.claude/mcp/llm-router/toggle.py
```

**Step 3: Test toggle**

Run: `~/.claude/mcp/llm-router/toggle.py status`

Expected: Shows "External-only mode: DISABLED"
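
To see the persistent state behind the toggle, a quick round-trip can be run; this only exercises the commands defined above and inspects the Task 1 state file:

```bash
~/.claude/mcp/llm-router/toggle.py on --reason "demo"
jq . ~/.claude/state/external-mode.json    # "enabled" should now be true
~/.claude/mcp/llm-router/toggle.py off
```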

**Step 4: Commit**

```bash
git -C ~/.claude add mcp/llm-router/toggle.py
git -C ~/.claude commit -m "feat(external-llm): add toggle script"
```

---

## Task 9: Update Session Start Hook

**Files:**

- Modify: `~/.claude/hooks/scripts/session-start.sh`

**Step 1: Read current hook**

Run: `cat ~/.claude/hooks/scripts/session-start.sh`

**Step 2: Add external mode check**

Add before the final output section:

```bash
# Check external mode
if [ -f ~/.claude/state/external-mode.json ]; then
  EXTERNAL_ENABLED=$(jq -r '.enabled // false' ~/.claude/state/external-mode.json)
  if [ "$EXTERNAL_ENABLED" = "true" ]; then
    echo "external-mode:enabled"
  fi
fi
```

**Step 3: Verify hook**

Run: `bash -n ~/.claude/hooks/scripts/session-start.sh`

Expected: No output (success)
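
A functional check is also possible (hedged: it assumes the hook script can be run standalone and that the toggle script from Task 8 is in place):

```bash
~/.claude/mcp/llm-router/toggle.py on --reason "hook test"
bash ~/.claude/hooks/scripts/session-start.sh | grep external-mode
~/.claude/mcp/llm-router/toggle.py off
```

Expected: the grep prints `external-mode:enabled`.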

**Step 4: Commit**

```bash
git -C ~/.claude add hooks/scripts/session-start.sh
git -C ~/.claude commit -m "feat(external-llm): announce external mode in session-start"
```

---

## Task 10: Add Component Registry Triggers

**Files:**

- Modify: `~/.claude/state/component-registry.json`

**Step 1: Read current registry**

Run: `cat ~/.claude/state/component-registry.json | jq '.skills'`

**Step 2: Add external-mode skill entry**

Add to skills array:

```json
{
  "id": "external-mode-toggle",
  "name": "External Mode Toggle",
  "description": "Toggle between Claude and external LLMs",
  "path": "mcp/llm-router/toggle.py",
  "triggers": [
    "use external",
    "switch to external",
    "external models",
    "external only",
    "use copilot",
    "use opencode",
    "back to claude",
    "use claude again",
    "disable external",
    "external mode"
  ]
}
```
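
If you prefer not to edit the registry by hand, the entry can be appended with `jq`. This is a hedged sketch: it assumes the registry has a top-level `skills` array and that the entry above has been saved to a temporary file (`/tmp/external-mode-skill.json` is a hypothetical name):

```bash
jq --slurpfile entry /tmp/external-mode-skill.json '.skills += $entry' \
  ~/.claude/state/component-registry.json > /tmp/component-registry.json \
  && mv /tmp/component-registry.json ~/.claude/state/component-registry.json
```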

**Step 3: Validate JSON**

Run: `cat ~/.claude/state/component-registry.json | jq .`

Expected: Valid JSON

**Step 4: Commit**

```bash
git -C ~/.claude add state/component-registry.json
git -C ~/.claude commit -m "feat(external-llm): add external-mode triggers to registry"
```

---

## Task 11: Integration Test

**Step 1: Test toggle**

```bash
~/.claude/mcp/llm-router/toggle.py status
~/.claude/mcp/llm-router/toggle.py on --reason "testing"
~/.claude/mcp/llm-router/toggle.py status
~/.claude/mcp/llm-router/toggle.py off
```

Expected: Status changes correctly

**Step 2: Test router (mock - will fail without actual CLIs)**

```bash
~/.claude/mcp/llm-router/invoke.py --model copilot/gpt-5.2 -p "Say hello" --json 2>&1 || echo "Expected: CLI not found (normal if opencode not installed)"
```

**Step 3: Test delegation helper status check**

```bash
python3 -c "
import sys
sys.path.insert(0, '$HOME/.claude/mcp/llm-router')
from delegate import is_external_mode
print(f'External mode: {is_external_mode()}')
"
```

Expected: "External mode: False"

**Step 4: Final commit**

```bash
git -C ~/.claude commit --allow-empty -m "feat(external-llm): integration complete - gleaming-routing-mercury"
```

---

## Summary

After completing all tasks:

| Component | Status |
|-----------|--------|
| `state/external-mode.json` | Created |
| `state/model-policy.json` | Extended |
| `mcp/llm-router/invoke.py` | Created |
| `mcp/llm-router/delegate.py` | Created |
| `mcp/llm-router/toggle.py` | Created |
| `mcp/llm-router/providers/opencode.py` | Created |
| `mcp/llm-router/providers/gemini.py` | Created |
| `hooks/scripts/session-start.sh` | Updated |
| `state/component-registry.json` | Updated |

**To use:**

```bash
# Enable external mode
~/.claude/mcp/llm-router/toggle.py on

# Or via PA
/pa --external on
/pa switch to external models

# Invoke directly
~/.claude/mcp/llm-router/invoke.py --task reasoning -p "Explain quantum computing"

# Delegate (respects mode)
~/.claude/mcp/llm-router/delegate.py --tier sonnet -p "Check disk space"
```
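
Other components can also call the delegation helper programmatically. A minimal hedged sketch (the tier and prompt are illustrative):

```python
import subprocess
import sys
from pathlib import Path

# Run the delegation helper as a subprocess; it decides Claude vs. external
# based on state/external-mode.json.
delegate = Path.home() / ".claude/mcp/llm-router/delegate.py"
result = subprocess.run(
    [sys.executable, str(delegate), "--tier", "haiku", "-p", "Summarize today's log"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())
```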