Comment by deepdarkforest 10 hours ago


On the foundational level: test-time compute (reasoning), heavy RL post-training, 1M+ context lengths, etc.

On the application layer, connecting LLMs with sandboxes/VMs is one of the biggest shifts (Cloudflare's Code Mode, etc.). Giving an LLM a sandbox unlocks on-the-fly computation, calculations, RPA, anything really.
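The core loop is simple: instead of doing arithmetic or data munging in its head, the model emits code and the harness executes it in an isolated process. A minimal sketch (the `run_in_sandbox` helper is hypothetical; a real sandbox like a container or microVM adds filesystem/network isolation):

```python
import subprocess
import sys

def run_in_sandbox(code: str, timeout: float = 5.0) -> str:
    """Execute model-generated Python in a separate process, return stdout.

    This is only process-level separation; production sandboxes
    (containers, VMs, Wasm) add real isolation on top.
    """
    result = subprocess.run(
        [sys.executable, "-c", code],
        capture_output=True, text=True, timeout=timeout,
    )
    if result.returncode != 0:
        return f"error: {result.stderr.strip()}"
    return result.stdout.strip()

# e.g. the model offloads a calculation instead of guessing it
print(run_in_sandbox("print(sum(i*i for i in range(10)))"))
```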

MCP, or rather standardized function calling, is another one.
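The standardization is mostly about the shape of the tool contract: a name, a description, and a JSON-Schema input spec that any client can advertise and any model can target. A rough sketch (the `get_weather` tool and handler are made-up examples, not a real MCP server):

```python
# A tool definition in the JSON-Schema style that MCP and most
# function-calling APIs converge on: name + description + input schema.
get_weather_tool = {
    "name": "get_weather",  # hypothetical tool
    "description": "Get the current weather for a city.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "city": {"type": "string", "description": "City name"},
        },
        "required": ["city"],
    },
}

# Hypothetical local handlers keyed by tool name.
HANDLERS = {"get_weather": lambda city: f"22C in {city}"}

def dispatch(call: dict) -> str:
    """Route a structured tool call from the model to its handler."""
    return HANDLERS[call["name"]](**call["arguments"])

# The model emits a structured call; the client just dispatches it.
print(dispatch({"name": "get_weather", "arguments": {"city": "Tokyo"}}))
```

Because the schema is standardized, the same tool definition works across clients instead of being re-wrapped per provider.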

Also, local LLMs are becoming almost viable thanks to better and better distillation, plus leaning on quick web search for facts, etc.