These aren’t reviews — they’re working configs. Each note has the setup I run, the plugins I kept, and the stuff I tried and dropped. Most of this runs on WSL and gets deployed to my home server.
- Running LLMs Locally with Ollama — What actually runs well on 8GB VRAM, when to use local vs cloud, and WSL GPU setup.
- Open WebUI — A ChatGPT-like interface for local models. The real value is custom system prompts per model.
- Oh My Zsh Plugins — Out of 275+ plugins, here are the ~10 that matter. Plus Powerlevel10k.
- WSL for Development — Simple rule: if it touches a server, use WSL. If it talks to Windows hardware, stay native.
- Obsidian Dataview Queries — The queries I actually use daily, not the ones I bookmarked and forgot. These power my project system.