Comment by js8
You could put 64 TB of storage in your pocket with current technology; 4 TB microSD cards are already available.
I'm not sure about the stated GFLOPS, but I suspect we'll find that AI doesn't need that much compute to begin with.
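The storage claim is just back-of-the-envelope arithmetic; a minimal sketch (the card count and capacities are taken straight from the comment, nothing else is implied):

```python
# How many 4 TB microSD cards it takes to reach 64 TB of pocketable storage.
card_capacity_tb = 4
target_tb = 64
cards_needed = target_tb // card_capacity_tb
print(cards_needed)  # 16 cards, easily pocket-sized
```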
You can already run models locally on high-end smartphones today with apps like PocketPal or Local LLM.