Opencode Usability with Local LLMs on an iGPU with 128GB VRAM: My Tests
Testing and configuring opencode for use with local LLMs