Comment by chrisischris 21 hours ago
I was hitting Claude Code's rate limit pretty often while paying for their max subscription. Started thinking – I've got a decent GPU sitting at home doing nothing most of the day.
So I'm building a distributed AI inference platform where you run models on your own hardware and access them privately from anywhere. Keep your data on infrastructure you control, but also use a credit system to tap into more powerful compute when you need it: your idle GPU time earns credits you can spend on bigger models. The goal is making it dead simple to use your home hardware from wherever you're working.
It's for anyone who wants infrastructure optionality: developers who don't want vendor lock-in, businesses with compliance requirements, or just people who don't want their data sent to third parties.
Get notified when we launch: https://sporeintel.com
I don't want to get into blockchain shenanigans, but I did love the Folding@home model for democratized compute. Could spare cycles on home GPUs be used for a P2P inference network?