Opencode: Usability with Local LLMs on an iGPU with 128GB VRAM: My Tests
- Mar 7, 2026 • 4 min