Comment by warkdarrior 11 hours ago

The consumer apps use RAG and traditional search to put recent information into the prompt when the LLM answers your query. This bridges the knowledge gap between the end of training and today.
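
Roughly, the flow looks like this (a toy sketch in Python; the search and LLM calls are placeholders standing in for whatever providers the app actually uses):

    def web_search(query: str) -> list[str]:
        # Placeholder: call a real search API here and return text snippets.
        return ["(snippet about last night's game)", "(another snippet)"]

    def ask_llm(prompt: str) -> str:
        # Placeholder: call a real LLM completion endpoint here.
        return "(answer grounded in the snippets above)"

    def answer_with_fresh_data(question: str) -> str:
        # Retrieve recent results, stuff them into the prompt, then ask the model.
        snippets = web_search(question)
        prompt = (
            "Use only the context below to answer.\n\n"
            "Context:\n" + "\n".join(snippets) +
            "\n\nQuestion: " + question
        )
        return ask_llm(prompt)

    print(answer_with_fresh_data("Who won the basketball game last night?"))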

vel0city 10 hours ago

I'm fully aware; I just want to point out that the actual AI apps people will use can and do return recent information thanks to integrations like that. Lots of people think AI can only answer things in its training set, but it can answer questions about whatever data you hand it, including data from the internet.

Lots of AI tools can easily answer "who won the basketball game last night".

  • reilly3000 6 hours ago

    GCP is sort of blending this into their Gemini APIs.

    https://ai.google.dev/gemini-api/docs/google-search

    You don’t have to do RAG or use a SERP-scraper MCP; just add the Search Grounding tool to the API request and it does the rest at the model’s discretion, for $0.014 per search.

    I think that’s generally a fair price for my time vs doing my own search queries at 1/100th the speed. It could get expensive for deep research type queries.
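
    For reference, it’s just the normal generate call with the search tool attached (a sketch with the google-genai Python SDK; the model name here is only an example, and GEMINI_API_KEY has to be set):

      # pip install google-genai
      from google import genai
      from google.genai import types

      client = genai.Client()  # reads GEMINI_API_KEY from the environment
      response = client.models.generate_content(
          model="gemini-2.0-flash",  # example model; use whichever you prefer
          contents="Who won the basketball game last night?",
          config=types.GenerateContentConfig(
              tools=[types.Tool(google_search=types.GoogleSearch())],
          ),
      )
      print(response.text)  # answer grounded in live search results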

    • dragonwriter 6 hours ago

      The pricing may be reasonable, but Grounding with Google Search has extremely restrictive terms of use.

      https://ai.google.dev/gemini-api/terms#grounding-with-google...

      • verdverm 4 hours ago

        Yup, I looked at GwGS and it's not interesting. I want actual results, not a summary and some links to sites (not pages) that I then have to go crawl myself.

        Don't want Google as the middleman between the internet and my agent. Their search has become so bad I don't even use it any more