Add LLM routing principles to MEMORY.md

William Valentin
2026-01-26 22:35:00 -08:00
parent 3a3838e19b
commit 4e4abcf6cd


@@ -51,6 +51,22 @@
**Why different from Claude Code:** Claude Code is CLI-first with routing-heavy design. Clawdbot is conversational — single capable assistant who calls for backup when needed.
### 2026-01-26 - LLM Routing Principles
**Local First** — Use local LLMs (llama-swap @ :8080) when:
1. **Privacy/Confidentiality** — Sensitive data never leaves the machine
2. **Long-running tasks** — No API costs, no rate limits, no timeouts
3. **Parallel work** — Spawn multiple agents hitting local endpoint
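A minimal sketch of the parallel case, assuming llama-swap exposes the usual OpenAI-compatible `/v1/chat/completions` route and has a model named `default` in its config (both are assumptions about the local setup, not checked here):
```bash
# Sketch only: fan several prompts out to the local endpoint in parallel.
# The /v1/chat/completions route and the "default" model name are assumed;
# adjust to whatever models llama-swap is actually serving.
for prompt in "summarize inbox" "draft changelog" "review TODOs"; do
  curl -s http://127.0.0.1:8080/v1/chat/completions \
    -H 'Content-Type: application/json' \
    -d "{\"model\":\"default\",\"messages\":[{\"role\":\"user\",\"content\":\"$prompt\"}]}" &
done
wait  # no API costs or rate limits locally, so fan-out is cheap
```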
**Always check availability** — Local LLMs may not be running:
```bash
# -s silences progress output; -f makes curl exit non-zero on HTTP errors,
# so this works directly as a guard before routing to the local tier
curl -sf http://127.0.0.1:8080/health
```
**Routing priority:**
1. Local (free, private) → 2. Copilot (free-ish) → 3. Cloud APIs (paid)
**Never send sensitive data to cloud APIs without explicit permission.**
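A rough sketch of that priority order in shell; only the local health check reflects the real setup, while the `COPILOT_TOKEN` variable and the tier names are illustrative assumptions, not existing Clawdbot config:
```bash
# Hypothetical helper: report which tier to route a task to, in priority order.
# Copilot/cloud checks are placeholders for whatever credentials are in use.
pick_llm_tier() {
  if curl -sf http://127.0.0.1:8080/health >/dev/null; then
    echo "local"      # free, private, no rate limits
  elif [ -n "${COPILOT_TOKEN:-}" ]; then
    echo "copilot"    # free-ish, but data leaves the machine
  else
    echo "cloud"      # paid APIs; never for sensitive data without permission
  fi
}
```
Call it as `tier=$(pick_llm_tier)` before dispatching a task, then apply the sensitive-data rule above before anything leaves the machine.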
## Preferences Discovered
*(Add as I learn them)*