nottorp 17 hours ago

You remember when Google used to do the same thing for you way before "AI"?

Okay, maybe sometimes the post about the stack trace was in Chinese, but a plain search used to be capable of giving the same answer as an LLM.

It's not that LLMs are better; it's that search got enshittified.

socalgal2 16 hours ago

I remember when I could paste an error message into Google and get an answer. I do not remember pasting a 60-line stack trace into Google and getting an answer, though, honestly, I'm pretty sure I never tried that. Did it work?

  • 0x000xca0xfe 12 hours ago

    Yes, pasting lots of seemingly random context into Google used to work shockingly well.

    I could break most passwords of an internal company application by googling the SHA1 hashes.

    It was possible to reliably identify plants or insects by just googling all the random words or sentences that would come to mind describing it.

    (None of that works nowadays, not even remotely)
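
    Why the hash trick worked, as a minimal sketch: unsalted SHA-1 digests of common passwords show up verbatim in leaked tables and paste dumps that search engines had indexed, so the hex digest itself was a usable search query. The example passwords and the search URL below are illustrative assumptions, not anything from the comment above.

        # Sketch: turn candidate passwords into the hex digests one would
        # have pasted into a search box. Purely illustrative.
        import hashlib

        def sha1_hex(password: str) -> str:
            """Return the hex SHA-1 digest, as it would appear in a leaked hash dump."""
            return hashlib.sha1(password.encode("utf-8")).hexdigest()

        if __name__ == "__main__":
            for pw in ["password1", "letmein", "hunter2"]:
                digest = sha1_hex(pw)
                # Searching for this digest used to surface pages that listed
                # the plaintext next to it (hash dumps, cracking forums).
                print(f"{pw:10s} -> {digest}")
                print(f"             https://www.google.com/search?q={digest}")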

averageRoyalty 15 hours ago

A horse used to get you places just like a car could. A whisk worked as well as a blender.

We have a habit of finding efficiencies in our processes, even if the original process did work.

Philpax 17 hours ago

Google has never identified the logical error in a block of code for me. I could find what an error code was, yes, but it's of very little help when you don't have a keyword to search.

chasd00 12 hours ago

I don't think search used to do everything LLMs do now, but you have a very good point. Search has gotten much worse. I would say search is about the quality it was just before Google launched. My general search needs are being met more and more by Claude; I only use Google when I know very specific keywords, because of SEO spam and ads.

jasode 16 hours ago

>You remember when Google used to do the same thing for you way before "AI"? [...] stack trace [...], but a plain search used to be capable of giving the same answer as an LLM.

The "plain" Google Search before LLM never had the capability to copy&paste an entire lengthy stack trace (e.g. ~60 frames of verbose text) because long strings like that exceeds Google's UI. Various answers say limit of 32 words and 5784 characters: https://www.google.com/search?q=limit+of+google+search+strin...

Before LLMs, you had to visually hunt through the entire stack trace, guess at a relevant smaller substring, and paste that into the Google search box. Of course, that's doable, but it's a different workflow than having an LLM do it for you.

To clarify, I'm not arguing that the LLM method is "better". I'm just saying it's different.
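
A rough sketch of that pre-LLM workflow, assuming a Python-style traceback and the 32-word cap cited above; the "keep the last line" heuristic is an assumption about what people typically pasted, not a documented rule:

    # Trim a long stack trace down to a query-sized substring before
    # pasting it into a search box.
    GOOGLE_WORD_LIMIT = 32  # word limit cited in the comment above

    def searchable_snippet(stack_trace: str, word_limit: int = GOOGLE_WORD_LIMIT) -> str:
        """Keep the final non-empty line (usually 'SomeError: message')
        and truncate it to fit the search engine's word limit."""
        lines = [line.strip() for line in stack_trace.splitlines() if line.strip()]
        if not lines:
            return ""
        last_line = lines[-1]  # e.g. "ValueError: invalid literal for int() ..."
        return " ".join(last_line.split()[:word_limit])

    if __name__ == "__main__":
        trace = "\n".join([
            "Traceback (most recent call last):",
            '  File "app.py", line 42, in <module>',
            "    main()",
            '  File "app.py", line 17, in main',
            "    value = int(raw)",
            "ValueError: invalid literal for int() with base 10: 'abc'",
        ])
        print(searchable_snippet(trace))
        # -> ValueError: invalid literal for int() with base 10: 'abc'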

  • nottorp 13 hours ago

    That's a good point, because now that I think of it, I never pasted a full stack trace into a search engine. I selected what looked like the relevant part and pasted that.

    But I did it subconsciously. I never thought of it until today.

    Another skill that LLM use can kill? :)

  • swader999 12 hours ago

    Those truly were the dark ages. I don't know how people did it. They were a different breed.

FranzFerdiNaN 16 hours ago

It was just as likely that Google would point you towards a Stack Overflow question that was closed because it was considered a duplicate of a completely different question.

nsonha 15 hours ago

> when Google used to do the same thing for you way before "AI"?

Which is never? Do you often just lie to win arguments? An LLM gives you a synthesized answer; a search engine only returns what already exists. By definition it cannot give you anything that is not a super obvious match.

  • nottorp 14 hours ago

    > Which is never?

    In my experience it was "a lot", because my stack traces in that period were mostly hardware-related problems on ARM Linux.

    But I suppose your stack traces were much different and superior and no one can have stack traces that are different from yours. The world is composed of just you and your project.

    > Do you often just lie to win arguments?

    I do not enjoy being accused of lying by someone stuck in their own bubble.

    When you said "Which is never" did you lie consciously or subconsciously btw?

    • SpaceNugget 10 hours ago

      According to a quick Google search, which is not very useful these days, the maximum query length is 32 words or 2000 characters and change, depending on which answer you trust.

      Whatever it is specifically, the idea that you could just paste a 60-line stack trace unmodified into Google, especially "way before AI", and get pointed to the relevant bit for your exact problem is obviously untrue.