Multi-agent parallelism applies to all LLMs, not just local

William Valentin
2026-01-26 22:37:39 -08:00
parent 4e4abcf6cd
commit 81d8ece62f
2 changed files with 12 additions and 4 deletions


@@ -55,7 +55,12 @@
**Local First** — Use local LLMs (llama-swap @ :8080) when:
1. **Privacy/Confidentiality** — Sensitive data never leaves the machine
2. **Long-running tasks** — No API costs, no rate limits, no timeouts
3. **Parallel work** — Spawn multiple agents hitting the local endpoint (request sketch below)
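A rough sketch of the parallel-work point: the request below assumes llama-swap exposes the usual OpenAI-compatible `/v1/chat/completions` route on :8080, and the model id `qwen2.5-coder` is a placeholder for whatever is configured locally.
```bash
# Sketch: one agent request against the local llama-swap endpoint.
# Assumptions: llama-swap serves an OpenAI-compatible API on :8080 and the
# model id "qwen2.5-coder" stands in for whatever is configured locally.
curl -s http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "qwen2.5-coder",
        "messages": [{"role": "user", "content": "Summarize the repo layout."}]
      }'
```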
**Multi-agent parallelism** — Spawn multiple agents for speed:
- Works with local AND cloud LLMs
- Local: best for privacy, no rate limits
- Cloud: best for complex tasks needing quality
- Mix based on task requirements (see the fan-out sketch below)
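A minimal fan-out sketch under the same assumptions: several background curl jobs against one endpoint, collected with `wait`. Pointing any job at a cloud provider's base URL, plus an auth header, is the only change needed to mix local and cloud.
```bash
#!/usr/bin/env bash
# Sketch: fan several agent prompts out in parallel against one endpoint.
# Assumptions: OpenAI-compatible /v1/chat/completions on :8080 (llama-swap)
# and a placeholder model id; swap BASE_URL and add an Authorization header
# to point any of these jobs at a cloud provider instead.
BASE_URL="http://localhost:8080/v1/chat/completions"
PROMPTS=("Review module A for bugs" "Draft tests for module B" "Summarize module C")

for i in "${!PROMPTS[@]}"; do
  curl -s "$BASE_URL" \
    -H "Content-Type: application/json" \
    -d "{\"model\": \"qwen2.5-coder\", \"messages\": [{\"role\": \"user\", \"content\": \"${PROMPTS[$i]}\"}]}" \
    > "agent_$i.json" &
done
wait   # block until every parallel agent request has returned
```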
**Always check availability** — Local LLMs may not be running:
```bash