gazarsgo, 15 days ago: I dunno, I ran `ollama run gpt-oss:20b` locally and it only used 16 GB, and inference was decent enough on my MacBook.
latchkey, 15 days ago (reply): Now do the 120b model.
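A rough back-of-envelope sketch of why the 120b model is a much taller order than the 20b one: memory is dominated by the weights, so at roughly 4-bit quantization the footprint scales with parameter count. The bits-per-weight and overhead figures below are illustrative assumptions, not measured values.

```python
def approx_mem_gb(params_billion: float,
                  bits_per_weight: float = 4.25,
                  overhead_gb: float = 2.0) -> float:
    """Rough memory estimate: quantized weights plus a small
    fixed allowance for KV cache and runtime (assumed figures)."""
    weights_gb = params_billion * 1e9 * bits_per_weight / 8 / 1e9
    return weights_gb + overhead_gb

# ~20B params lands in the low teens of GB, consistent with the
# ~16 GB observed for gpt-oss:20b; ~120B params needs several times that.
print(f"20b:  ~{approx_mem_gb(20):.1f} GB")
print(f"120b: ~{approx_mem_gb(120):.1f} GB")
```

By this crude estimate the 120b model wants on the order of 60+ GB, which is why it fits comfortably only on high-memory machines while the 20b variant runs on a 16 GB+ laptop.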