TOOLS.md - Local Notes

Skills define how tools work. This file is for your specifics — the stuff that's unique to your setup.


🏠 Homelab Infrastructure

Kubernetes Cluster

  • Distribution: k0s
  • Nodes: 3 (2x Pi5 8GB, 1x Pi3 1GB)
  • Architecture: arm64
  • Storage: Longhorn (StorageClass: longhorn)
  • GitOps: ArgoCD
  • Monitoring: kube-prometheus-stack, Loki, Grafana, Alertmanager
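
Quick health check with plain kubectl (nothing cluster-specific assumed):

kubectl get nodes -o wide                            # node status, arch, k0s version
kubectl get pods -A | grep -vE 'Running|Completed'   # anything not healthy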

Network

  • Tailnet: taildb3494.ts.net
  • MetalLB Pool: 192.168.153.240-254
  • Ingress (nginx): 192.168.153.240
  • Ingress (haproxy): 192.168.153.241
  • DNS Pattern: <app>.<ns>.<ip>.nip.io
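
nip.io resolves any hostname suffixed with an IP back to that IP, so a quick sanity check (hostname taken from the table below):

dig +short grafana.monitoring.192.168.153.240.nip.io   # should return 192.168.153.240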

Key Services & URLs

Service           URL
Grafana           grafana.monitoring.192.168.153.240.nip.io
Longhorn UI       ui.longhorn-system.192.168.153.240.nip.io
Open WebUI        oi.ai-stack.192.168.153.240.nip.io
ArgoCD            via Tailscale
Home Assistant    ha.home-assistant.192.168.153.241.nip.io
n8n               n8n.ai-stack.192.168.153.240.nip.io

Namespaces

ai-stack, argocd, monitoring, loki-system, longhorn-system, metallb-system, minio, nginx-ingress-controller, tailscale-operator, gitea, home-assistant, pihole, plex, ghost, kubernetes-dashboard, docker-registry

AI Stack

  • Namespace: ai-stack
  • Components: Open WebUI, Ollama, LiteLLM, SearXNG, n8n, vLLM
  • Ollama Host: 100.85.116.57:11434
  • Models: gpt-oss:120b, qwen3-coder
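
To list what the cluster Ollama is actually serving (standard Ollama API, host as above):

curl http://100.85.116.57:11434/api/tags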

💻 Workstation

  • Hostname: willlaptop
  • IP: 192.168.153.117
  • OS: Arch Linux
  • Desktop: GNOME
  • Shell: fish
  • Terminals: ghostty, alacritty
  • Theme: Dracula
  • Dotfiles: chezmoi

Dev Tools

  • Editors: VSCode, Zed, Vim
  • Languages: Go, Rust, Python, TypeScript, Zig
  • K8s Tools: k9s, kubectl, argocd CLI, krew, kubecolor
  • Containers: Docker, Podman, Distrobox

Local AI (llama-swap)

  • Endpoint: http://127.0.0.1:8080
  • Service: systemctl --user status llama-swap
  • Config: ~/.config/llama-swap/config.yaml
  • GPU: RTX 5070 Ti (12GB VRAM)
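
After editing the config, restart the user service so llama-swap picks it up:

systemctl --user restart llama-swap
systemctl --user status llama-swap     # confirm it came back healthy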

Available Models:

Alias        Model                        Notes
qwen3        Qwen3-30B-A3B                General purpose MoE, 8k ctx
coder        Qwen3-Coder-30B-A3B          Code specialist MoE
glm          GLM-4.7-Flash                Fast reasoning
gemma        Gemma-3-12B                  Balanced, fits fully
reasoning    Ministral-3-14B-Reasoning    Reasoning specialist
gpt-oss      GPT-OSS-20B                  Experimental

Usage:

curl http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gemma", "messages": [{"role": "user", "content": "Hello"}]}'

Web UI: http://127.0.0.1:8080/ui
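
The endpoint is OpenAI-compatible, so the configured aliases can also be listed (assuming llama-swap exposes the standard /v1/models route):

curl http://127.0.0.1:8080/v1/models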


🛠️ Workspace Scripts

Scripts in ~/clawd/scripts/ — add to PATH with:

export PATH="$HOME/clawd/scripts:$PATH"
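
Since the default shell is fish (see Workstation above), the fish-native equivalent is:

fish_add_path ~/clawd/scripts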

llm — Local LLM wrapper

llm "What is 2+2?"                    # Uses default model (gemma)
llm -m qwen3 "Explain kubernetes"     # Use specific model
llm coder "Write a Python function"   # Short alias

homelab-status — Quick cluster check

homelab-status         # Quick overview
homelab-status --full  # Include storage details

calc — Code REPL

calc "2 + 2"                      # Quick math
calc "sum([1,2,3,4,5])"           # Python functions
calc "math.sqrt(144)"             # With imports
calc -j "[1,2,3].map(x => x*2)"   # JavaScript mode

transcribe — Audio transcription

transcribe meeting.mp3              # Quick transcription (base model)
transcribe -m medium interview.wav  # Better accuracy
transcribe -l en -f srt podcast.mp3 # English subtitles

📁 Key Repos

Homelab GitOps

  • Path: ~/Code/active/devops/homelab/homelab
  • Remote: git@github.com:will666/homelab.git
  • Structure:
    • argocd/ — ArgoCD Application manifests
    • charts/<svc>/values.yaml — Helm values
    • charts/<svc>/manifests/ — Raw K8s manifests
    • ansible/ — Node provisioning
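
A typical change flow against this layout (placeholder <svc> as above; assumes the argocd CLI is logged in and the app may also auto-sync):

cd ~/Code/active/devops/homelab/homelab
$EDITOR charts/<svc>/values.yaml                         # adjust Helm values
git add -A && git commit -m "update <svc>" && git push
argocd app sync <svc>                                    # or wait for auto-sync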

Workstation Config

  • Path: ~/Code/active/devops/willlaptop
  • Remote: git@gitea-gitea-ssh.taildb3494.ts.net:will/willlaptop.git
  • Purpose: Ansible playbooks, package lists, scripts

🔧 Claude Code Integration

William also has a Claude Code setup at ~/.claude/ with:

  • Multi-agent hierarchy (PA → Master Orchestrator → Domain Agents)
  • Skills for gmail, gcal, k8s-quick-status, rag-search
  • Component registry for routing
  • Guardrails (PreToolUse safety hooks)
  • future-considerations.json for deferred work

When working on k8s or homelab tasks, you can reference:

  • ~/.claude/state/kb.json — Infrastructure facts
  • ~/.claude/state/future-considerations.json — Deferred features
  • ~/.claude/repos/homelab/ — Symlink to homelab repo
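
Quick peek at the stored facts (only assumes the file is JSON):

jq . ~/.claude/state/kb.json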

📧 Google Workspace (gog CLI)

Use gog for Gmail, Calendar, Drive, Contacts, Sheets, Docs.

Gmail

gog gmail search 'newer_than:7d is:unread' --max 10
gog gmail search 'from:github.com' --max 20
gog gmail search 'subject:urgent' --max 5

Calendar

# Week ahead
gog calendar events primary --from "$(date -I)" --to "$(date -I -d '+7 days')"

# Today only
gog calendar events primary --from "$(date -I)" --to "$(date -I -d '+1 day')"

Tasks

# List task lists
gog tasks lists list

# List tasks in main list
gog tasks list "MTM5MjM0MjM3MzYwODI4NzQ3NDE6MDow"

Auth Status

  • Gmail: authorized
  • Calendar: authorized
  • Tasks: authorized

🤖 LLM Routing

See LLM-ROUTING.md for full guide. Quick reference:

Task                 Use
Simple parsing       opencode run -m github-copilot/claude-haiku-4.5
Code work            opencode run -m github-copilot/claude-sonnet-4.5
Complex reasoning    claude -p --model opus
Long context         gemini -m gemini-2.5-pro

Principle: Start cheap, escalate if needed.
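
Example of starting cheap (assuming opencode's run subcommand takes the prompt as a trailing argument):

opencode run -m github-copilot/claude-haiku-4.5 "Extract the date from: deployed 2026-01-26 22:53"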


🎙️ Preferences

(Add voice, speaker, camera preferences as discovered)


Add whatever helps you do your job. This is your cheat sheet.