Opencode: Usability with Local LLMs on an iGPU with 128GB VRAM: My Tests
Testing and configuring opencode for use with local LLMs
Strix Halo 128GB RAM, 100% local LLM agents, my tests
A comparison of Antigravity, Cursor, Windsurf, and Codium + Continue for agentic coding tasks.