Comment by bdhcuidbebe 2 hours ago
> using the local model (Ollama) is 'free' in terms of watts since my laptop is on anyway
Now that’s a cursed take on power efficiency
efficiency is just a mindset. if i save 3 seconds of my own attention by burning 300 watts of gpu, the math works out in my favor!