Comment by DanHulton

Comment by DanHulton 2 days ago

2 replies

You can build this today exactly as efficiently as you can when inference is 1000x faster, because the only things you can build with this is things that absolutely don't matter. The first bored high schooler who realizes that there's an LLM between them and the database is going to WRECK you.

feifan 2 days ago

this assumes the application is hosted as SaaS, but if the application makes sense as a personal/"desktop" app, that likely wouldn't matter.

  • hyperadvanced a day ago

    It’s actually extremely motivating to consider what LLM or similar AI agent could do for us if we were to free our minds from 2 decades of SaaS brainrot.

    What if we ran AI locally and used it to actually do labor-intensive things with computers that make money rather than assuming everything were web-connected, paywalled, rate-limited, authenticated, tracked, and resold?