throwanem 2 days ago

Mistaking model for meaning is a mistake I very rarely see a human make, at least in the sense, as here, of literally referring to the map ("text") in what ostensibly strives to be a discussion of the presence or absence of underlying territory, a concept the model gives no sign of attempting to invoke or manipulate. It is also a behavior I would expect from something capable of producing valid utterances but not of testing their soundness.

I'm glad you didn't write that paragraph by yourself; I would be concerned on your behalf if you had.

fc417fc802 2 days ago

"Concerned on your behalf" seems a bit of an overstatement. Getting caught up on the textual representation and failing to notice that the issue is fundamental and generalizes is indeed an error, but it's not at all uncharacteristic of even fairly intelligent humans.

  • throwanem 2 days ago

    All else equal, I wouldn't find it cause for concern. But in a discussion where keeping that distinction clear in mind at all times is table stakes? I could fairly be blamed for a sprinkle of hyperbole, perhaps, but surely you see how an error that is trivial in many contexts would prove an uncommonly severe flaw in this one. I would also reiterate the unusually obtuse nature of the error in this example.

    (For those no longer able to follow complex English grammar: yes, I exaggerate, but there is no point in trying to participate in this kind of discussion if that is the sort of basic error one has to start from, and the especially weird nature of this instance of the mistake also points to LLMs synthesizing the products of consciousness rather than experiencing it.)