Comment by netdur (2 months ago):

Yes, 99% of the size is the weights of the two models required to run inference offline; there's no way around it.