Multi-agent parallelism applies to all LLMs, not just local

William Valentin
2026-01-26 22:37:39 -08:00
parent 4e4abcf6cd
commit 81d8ece62f
2 changed files with 12 additions and 4 deletions


@@ -65,14 +65,17 @@ curl http://127.0.0.1:8080/v1/chat/completions \
 - No rate limits
 - No cost accumulation
-### 🚀 Parallel Work → **MULTI-AGENT LOCAL**
+### 🚀 Parallel Work → **MULTI-AGENT**
 When speed matters, spawn multiple workers:
 ```bash
-# Flynn can spawn sub-agents hitting local LLMs
+# Flynn can spawn sub-agents targeting any LLM
 # Each agent works independently, results merge
 ```
 - Use for: bulk analysis, multi-file processing, research tasks
-- Coordinate via sessions_spawn with local model routing
+- Coordinate via `sessions_spawn` with model param
+- **Local:** best for privacy + no rate limits
+- **Cloud:** best for complex tasks needing quality
+- Mix and match based on task requirements
 ### ⚡ Quick One-Shot → **COPILOT or LOCAL**
 ```bash
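For context, a minimal sketch of what "spawn multiple workers" can look like at the HTTP level, independent of the `sessions_spawn` tool itself. It assumes llama-swap serves an OpenAI-compatible API at :8080 (as in the hunk header above); the model name and prompts are placeholders:

```bash
# Sketch: fan out three independent prompts to the local endpoint in parallel,
# then merge the per-worker results afterwards. "local-model" is a placeholder
# for whatever model llama-swap is configured to route.
for prompt in "summarize src/a.py" "summarize src/b.py" "summarize src/c.py"; do
  curl -s http://127.0.0.1:8080/v1/chat/completions \
    -H "Content-Type: application/json" \
    -d "{\"model\": \"local-model\", \"messages\": [{\"role\": \"user\", \"content\": \"$prompt\"}]}" \
    > "result_$(echo "$prompt" | tr ' /' '__').json" &   # one background worker per prompt
done
wait   # block until all workers finish; result_*.json files hold the outputs
```

The same fan-out pattern works against a cloud endpoint; only the base URL and model param change.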


@@ -55,7 +55,12 @@
 **Local First** — Use local LLMs (llama-swap @ :8080) when:
 1. **Privacy/Confidentiality** — Sensitive data never leaves the machine
 2. **Long-running tasks** — No API costs, no rate limits, no timeouts
-3. **Parallel work** — Spawn multiple agents hitting local endpoint
+**Multi-agent parallelism** — Spawn multiple agents for speed:
+- Works with local AND cloud LLMs
+- Local: best for privacy, no rate limits
+- Cloud: best for complex tasks needing quality
+- Mix based on task requirements
 **Always check availability** — Local LLMs may not be running:
 ```bash
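The availability-check command itself is cut off by the hunk. A hedged sketch of such a check, assuming llama-swap exposes the standard OpenAI-compatible `/v1/models` route:

```bash
# Probe the local endpoint before routing work to it.
# Assumes an OpenAI-compatible /v1/models route on llama-swap; adjust if configured differently.
if curl -s -o /dev/null -w "%{http_code}" http://127.0.0.1:8080/v1/models | grep -q "^200$"; then
  echo "local LLM endpoint is up"
else
  echo "local LLM endpoint not reachable, fall back to cloud"
fi
```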