Comment by dangus a day ago
Perhaps the real takeaway is that there's only one product, two if you count image generation.
Perhaps the only reason Cursor is so good is that editing code is so close to the basic function of an LLM without anything wrapped around it.
Like, someone prove me wrong by linking 3 transformative AI products that:
1. Have nothing to do with "chatting" to a thin wrapper (i.e., couldn't just be done inside a plain LLM with a couple of file uploads for extra context)
2. Don't rely on traditional ML that has existed for years and isn't part of the LLM "revolution"
3. Have nothing to do with writing code
For example, I recently used an AI chatbot that was supposed to help me troubleshoot a consumer IoT device. It basically regurgitated steps from the manual and ran around in circles because my issue simply wasn't covered by the documentation. I then had to tell it to send me to a human. The human had suggestions the AI hadn't thought of, but still couldn't help because the product was a piece of shit.
Or just look at Amazon Q. Ask it a basic AWS question and it'll give you a bogus "sorry, I can't help with that" response, when you know that running over to chatgpt.com will get you a legitimate answer. Most AI "products" seem to be castrated versions of ChatGPT/Claude/Gemini.
That sort of garbage experience seems to be what's most frequently associated with AI: a futile attempt to replace low-wage employees that didn't end up delivering value to anyone. And any company eliminating employees just because "fuck it, why not," without any real strategy, probably has a busted-ass product to begin with.
Putting me on hold for 15 minutes would have been more effective at getting me to go away and no compute cycles would have been necessary.
Outside of coding, Google's NotebookLM is quite useful for analysing complex documentation: things like standards and complicated API specs.
But yes, an AI chatbot that can't actually take any actions is effectively just regurgitating documentation. I normally contact support because the thing I need help with is either not covered in the documentation or requires an intervention. If the AI can't make interventions, it's just a fancy kind of search with an annoying interface.