Comment by js8 a day ago

You could put 64TB of storage in your pocket with current technology. There are already 4TB microSD cards available.

Not sure about the stated GFLOPS, but I suspect we'll find that AI doesn't need that much compute to begin with.

fragmede a day ago

You can run models locally on high-end smartphones today with apps like PocketPal or Local LLM.