Comment by avbanks
I don't think people fully realize how good the open source models are and how easy it is to switch.
Everyone cares about OSS as in "free". The capital spending of AI firms, and their market capitalization, hinges on the idea that they will save enterprises tons of money by replacing employees.
You think we have these crazy valuations because the market thinks OpenAI will get Joe Schmoe to buy enough of their services? (Introducing "shopping" into the product honestly feels like a bit of a panicky move to target Google.)
We're prototyping some LLM-assisted products, but right now the cost model isn't entirely there: we need to use more expensive models to get good results, which leaves a small margin. Spinning up a moderately sized VM and hosting an open model ourselves would probably be more cost-effective, and more people will probably run into this and start building easy-to-set-up model/service VMs (maybe not just yet, but it'll come).
Sure, they could start hosting things themselves, but what's stopping anyone from finding a cheaper but "good enough" alternative?
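To make the "easy to switch" point concrete, here's a minimal sketch (not our actual setup): most self-hosted stacks such as vLLM or Ollama expose an OpenAI-compatible endpoint, so switching is often just pointing the existing client at a different base URL. The hostname, port, model name, and prompt below are placeholders.

    # Minimal sketch: point the standard OpenAI Python client at a self-hosted,
    # OpenAI-compatible server (e.g. a vLLM or Ollama instance on our own VM).
    # Hostname, port, model name, and prompt are placeholders.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://our-llm-vm:8000/v1",  # was https://api.openai.com/v1
        api_key="not-needed-locally",          # most local servers ignore the key
    )

    resp = client.chat.completions.create(
        model="some-open-weights-model",       # whatever the VM is serving
        messages=[{"role": "user", "content": "Summarize this support ticket: ..."}],
    )
    print(resp.choices[0].message.content)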
My input to our recent AI strategy workshop was basically:
- OpenAI, etc. will go bankrupt (unless one of them manages to capture search from a struggling Google)
- We will have a new AI winter, with a corresponding research slowdown like in the 1980s, once funding dries up
- Open-source LLM instances will be deployed to properly manage privacy concerns.