ThrowawayR2 2 days ago

Humans are already investigating whether LLMs might work more efficiently if they operate directly on latent space representations for the entirety of the calculation: https://news.ycombinator.com/item?id=43744809. It doesn't seem unlikely that two LLM instances using the same underlying model could communicate directly in latent space representations, and from there it's not much of a stretch to imagine that two LLMs with different underlying models could do the same, as long as some sort of conceptual mapping between the two models could be computed.
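
The "conceptual mapping" imagined here could, in the simplest case, be a linear alignment between the two latent spaces. A minimal sketch using orthogonal Procrustes on paired anchor embeddings of the same concepts, with toy data standing in for the two models (all names and dimensions are illustrative, not from any real model):

```python
import numpy as np

def procrustes_map(anchors_a, anchors_b):
    # Solve W = argmin ||A W - B||_F subject to W orthogonal,
    # via SVD of the cross-covariance matrix.
    m = anchors_a.T @ anchors_b
    u, _, vt = np.linalg.svd(m)
    return u @ vt

rng = np.random.default_rng(0)
d = 64  # toy latent dimension

# Simulated "conceptual mapping": pretend model B's latent space
# is a rotation of model A's.
true_rot = np.linalg.qr(rng.standard_normal((d, d)))[0]

# Paired anchor embeddings: the same 200 concepts as seen by each model.
anchors_a = rng.standard_normal((200, d))
anchors_b = anchors_a @ true_rot

w = procrustes_map(anchors_a, anchors_b)

# Translate a fresh latent vector from model A into model B's space.
msg_a = rng.standard_normal(d)
msg_b = msg_a @ w
print(np.allclose(msg_b, msg_a @ true_rot, atol=1e-6))  # prints True
```

This is the same trick used for aligning cross-lingual word-embedding spaces; for real models the mapping would likely need to be nonlinear and learned layer by layer, but the anchor-pair idea carries over.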

lbhdc 2 days ago

Such an underrated movie. Great watch for anyone interested in classic scifi.

nowittyusername 2 days ago

First time in a while I've watched a movie from the 70's in full. Thanks for the gem...

geraneum 2 days ago

> talking to each other in something other than English

WiFi?

otabdeveloper4 a day ago

They don't have an internal representation that isn't English. The embedding-arithmetic meme is a lie promulgated by disingenuous people.