Comment by warkdarrior 11 hours ago
The consumer apps use RAG and traditional search to inject recent information into the LLM's prompt when it answers your query. That bridges the knowledge gap between the end of training and today.
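Roughly, the flow looks like the sketch below. This is just an illustration of the idea: the `web_search` helper is a placeholder I'm assuming for whatever retrieval backend an app actually uses, not any particular product's API.

```python
# Minimal sketch of retrieval-augmented prompting.
# `web_search` stands in for a real search index or web search integration.

def web_search(query: str) -> list[str]:
    """Placeholder retriever: a real app would query a search engine or index."""
    return [
        "Snippet: final score and winner from last night's game ...",
        "Snippet: box score and key plays ...",
    ]

def build_prompt(question: str) -> str:
    snippets = web_search(question)
    context = "\n".join(f"- {s}" for s in snippets)
    # Fresh snippets go directly into the prompt, so the model answers from
    # them instead of relying on its (stale) training data.
    return (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\n"
    )

if __name__ == "__main__":
    # The resulting prompt is what actually gets sent to the model.
    print(build_prompt("Who won the basketball game last night?"))
```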
I'm fully aware; I just want to point out to people that the actual AI apps they'll use can and do return recent information because of integrations like that. Lots of people think AI can only answer things from its training set, but it can answer from whatever data you hand it, including anything pulled from the internet.
Lots of AI tools can easily answer "who won the basketball game last night?"