Comment by jrm4 3 hours ago

For what it's worth, I'm stuck on the very first x = x + 1 thing.

Not sure if you want to call it a screwup or bad grammar or whatnot, but it was perhaps a huge mistake that the "equals" sign was used for something that feels like, but emphatically DOES NOT mean, "is equal to."

It's "put this into that". It's an action verb. Should have perhaps insisted on x <- x + 1 or maybe better x + 1 -> x

9rx 39 minutes ago

> It's an action verb.

The difference is that it is an instruction. Conventional mathematical notation, while declarative by default, switches into instruction mode just the same with the "let" keyword. The usage of the "=" operator then becomes equivalent: e.g. let x = 1.

But since the aforementioned x = x + 1 comes from notations that are never declarative, where every statement is an instruction, the "let" keyword becomes redundant: you already know you are looking at an instruction.

> Should have perhaps insisted on x <- x + 1 or maybe better x + 1 -> x

While that would obviously work, it strays further from the conventional notation. Which doesn't matter to the universe in any way, but the topic of discussion here is about trying to stay true to conventional notation, so...

  • jrm4 7 minutes ago

    Is that the topic?

    I do think what I'm trying to say here is: sorry, but your "conventional notation" sucks, because "what is actually happening" is so very different from how the symbol is overwhelmingly used by most people.

jrm4 9 minutes ago

Doubling down on my downvotes, then:

Look, I teach IT (to both novices and interested folks at a college) for a living; this is one of those ideas that, when I present it this simply, turns on many a lightbulb for people new to programming. Much as with zero-indexed arrays, I do very well with "look, this is stupid, but here's why it got that way; we'll just deal with it."

And while on occasion I take pride in nerd-dom, I feel like -- especially with the advent of AI for coding -- this is dinosaur stuff, where we would do better to go along with what's WAY more intuitive than to try to stick with mathematical or programming purity, etc.

munificent 27 minutes ago

I believe you can blame Ken Thompson for this. In DMR's paper about early C history, he says:

> Other fiddles in the transition from BCPL to B were introduced as a matter of taste, and some remain controversial, for example the decision to use the single character = for assignment instead of :=.

I think Ken did most of the early design of B, and DMR came along later to help with C. Ken has a famously terse style, so I can definitely see it being on brand to shave a character off of `:=`, which is what BCPL uses.

It's sort of a tricky little syntax problem. It makes perfect sense to use `=` for declarations:

    int x = 1;

At that point, you really are defining a thing, and `=` is the natural syntax for a definition. (It's what BCPL uses for defining most named things.)

You also need a syntax for mutating assignment. You can define a separate operator for that (like `:=`). Then there is the question of equality, where `=` is again the natural notation, because in math that operator is often overloaded as an equality predicate.

Now you're in a funny spot. Declaring a variable and assigning to it later are semantically very similar operations. Both ultimately calculate a value and store it in memory. But they have different syntax. Meanwhile, defining a variable and testing two expressions for equality use the same syntax but have utterly unrelated semantics.
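
To make the trade-off concrete, here is a quick sketch (in OCaml rather than C, purely for illustration) of a language that spells all three operations differently, roughly the `=`/`:=` split described above:

    let x = ref 1              (* definition: = introduces the name x *)
    let () = x := !x + 1       (* mutating assignment: a separate := operator *)
    let () = assert (!x = 2)   (* equality: here = is purely a predicate *)

The cost is the funny spot just described: defining x and assigning to it later are semantically close cousins, yet they no longer look alike.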

Given that math notation has grown chaotically over hundreds of years, it's just really hard to build an elegant notation that is both familiar to people and consistent and coherent.

mock-possum 2 hours ago

That’s funny, because to me, it was always immediately obvious that once that line runs, then it must be true: once that line runs, x is now equal to whatever it was before, plus 1. It’s the precise opposite of a lie; it’s the literal definition of truth, in the sense that what is true is set by that line.

Programming is telling things to the computer, and in this case you’re telling it what x is. Whatever you tell a computer is all the computer knows; whatever a computer does can only be what it’s told. If you never told it what x was, then x wouldn’t be anything… that’s the truth.

  • munificent 40 minutes ago

    > once that line runs

    This is the key point. Some people have a mental model that algebraic syntax is describing a set of immutable properties. You are defining things and giving them names, but not changing anything. There is no notion of time or sequence, because the universe is unchanging anyway. All you're doing is elucidating it.

    Then there is a model where you are molding the state of a computer one imperative modification at a time. Sequence is essential because everything is a delta based on the state of the world before.

    You have the latter model, which is indeed how the hardware behaves. But people with a mathematical mindset often have the former and find the latter very unintuitive.
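
    To put the two models side by side (a small sketch in OCaml, just for illustration):

        (* Model 1: definitions, no notion of time. This introduces a second,
           distinct x, defined in terms of the first; nothing is changed. *)
        let x = 1
        let x = x + 1

        (* Model 2: one mutable state, modified in sequence. Order is
           essential, because each line is a delta on the state before it. *)
        let y = ref 1
        let () = y := !y + 1   (* the cell named y now holds 2 *)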

    • jrm4 4 minutes ago

      It's hard for me not to suggest, though, that essentially the math people are "right." Which is to say, "=" meant what it meant for 400 years, and then computers came along and redefined it as an action verb rather than an identity. And I think it's fair to consider that a mistake.