Opencode Usability with Local LLMs on an iGPU with 128 GB VRAM: My Tests
Testing and configuring opencode for use with local LLMs
Strix Halo, 128 GB RAM, 100% local LLM agents: my tests