Comment by extr 2 days ago

Modeling the distribution that produced a piece of text is what LLMs literally exist for, so in some sense this is unsurprising. But it calls into question almost all existing alignment research.