
The AI Horde is a free, open-source, privacy-focused distributed computing service that offers what you're looking for. If you want to run models locally and have ~16GB VRAM, Broken-Tutu-24B and Cydonia-24B-V4 are pretty popular right now. For 8GB VRAM, Super-Nova-8B and L3-Lunaris-8B are surprisingly decent.
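
If you want to poke at the Horde before committing to anything, it exposes a public REST API you can hit with nothing installed locally. Here's a minimal Python sketch assuming the v2 text-generation endpoints and the anonymous API key `0000000000` from the Horde docs; the model name, params, and Client-Agent string are just illustrative placeholders, so check https://aihorde.net/api for what's current.

```python
# Minimal sketch: submit a text generation job to the AI Horde and poll for the result.
# Assumes the v2 endpoints (generate/text/async + status) and the anonymous key
# "0000000000" -- verify paths/fields against the current docs at aihorde.net/api.
import time
import requests

API_BASE = "https://aihorde.net/api/v2"
HEADERS = {
    "apikey": "0000000000",                              # anonymous key (low priority)
    "Client-Agent": "example-client:1.0:you@example.com",  # placeholder identifier
}

def generate(prompt: str, model: str) -> str:
    # Submit the request; the Horde answers immediately with a job id.
    payload = {
        "prompt": prompt,
        "params": {"max_length": 120, "max_context_length": 2048},
        "models": [model],  # model name is illustrative; list live ones via GET /v2/status/models
    }
    r = requests.post(f"{API_BASE}/generate/text/async", json=payload, headers=HEADERS, timeout=30)
    r.raise_for_status()
    job_id = r.json()["id"]

    # Poll until a volunteer worker picks up and finishes the job.
    while True:
        status = requests.get(f"{API_BASE}/generate/text/status/{job_id}",
                              headers=HEADERS, timeout=30).json()
        if status.get("done"):
            return status["generations"][0]["text"]
        time.sleep(5)

if __name__ == "__main__":
    print(generate("Write one sentence about distributed computing.", "koboldcpp/Cydonia-24B-V4"))
```

Registering your own (free) API key instead of the anonymous one gets you better queue priority, especially if you also run a worker.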