Comment by nobunaga 18 hours ago

Yeah, doing web searches could be called research, but that's not what we are talking about. Read the parent of the parent. It's about being able to answer questions that aren't in its training data. People are talking about LLMs making scientific discoveries that humans haven't. A ridiculous take. It's not possible, and with the current state of the tech it never will be. I know what LLMs are trained on. That's not the topic of conversation.

oezi 16 hours ago

A large part of research is just about creatively re-arranging symbolic information, and LLMs are great at this kind of research. For example, discovering relevant protein sequences.

semiquaver 15 hours ago

> It's about being able to answer questions that aren't in its training data.

This happens all the time via RAG. The model “knows” certain things via its weights, but it can also inject much more concrete post-training data into its context window via RAG (e.g. web searches for documentation), from which it can usefully answer questions about information that may be “not in its training data”.
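The retrieve-then-inject loop described above can be sketched in a few lines. This is a toy illustration, not any particular framework's API: the document names, the `frobnicate()`/`transmogrify()` examples, and the keyword-overlap retriever are all made up for the demo (real systems typically use embedding similarity and then pass the assembled prompt to an LLM).

```python
# Toy RAG sketch (all names hypothetical): rank documents by keyword overlap
# with the query, then splice the top hits into the prompt so the model can
# answer from data that was never in its training set.

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by how many whitespace-split words they share
    with the query, and keep the top k."""
    query_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(query_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Inject the retrieved context into the prompt ahead of the question."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Post-training "documentation" the model has never seen.
docs = [
    "The frobnicate() API was deprecated in v2.3 of the library.",
    "Install the package with pip install example-lib.",
    "Use transmogrify() instead of frobnicate() since v2.3.",
]
print(build_prompt("Why is frobnicate() deprecated?", docs))
```

The point is only the shape of the mechanism: the model's weights stay fixed, but the context window carries fresh, concrete facts retrieved at query time.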