Comment by ghaff 6 days ago
Itanium needs a lot longer discussion than can be covered in an HN comment.
https://bitmason.blogspot.com/2024/02/the-sinking-of-itanic-...
I’m curious what kind of code his 30 lines were - I’m betting something FP-heavy, based on the public focus benchmarks gave that over branchy business logic. I still remember getting the pitch that you had to buy Intel’s compilers to get decent performance. I worked at a software vendor and later a computational research lab, and both times that torpedoed any interest in buying the hardware, because it boiled down to paying a couple of times more upfront and hoping you could optimize at least an equivalent gain back … or just buying an off-the-shelf system which performed well now and doing literally anything else with your life.
One really interesting related angle is the rise of open source software in business IT which was happening contemporaneously. X86 compatibility mattered so much back then because people had tons of code they couldn’t easily modify whereas later switches like Apple’s PPC-x86 or x86-ARM and Microsoft’s recent ARM attempts seem to be a lot smoother because almost everyone is relying on many of the same open source libraries and compilers. I think Itanium would still have struggled to realize much of its peak performance but at least you wouldn’t have had so many frictional costs simply getting code to run correctly.
I think you're right. The combination of open source and public clouds has really tended to reduce the dominance of specific hardware/software ecosystems, especially Wintel. Especially with the decline of CMOS process scaling as a performance lever, I expect that we'll see more heterogeneous computing in the future.
This form versus substance issue is a really deeply embedded problem in our industry, and it is getting worse.
Time and again, I run into professionals who claim X, only to find out that the assertion was based only upon the flimsiest interpretation of what it took to accomplish the assertion. If I had to be less charitable, then I’d say fraudulent interpretations.
Promo Packet Princesses are especially prone to getting caught out doing this. And as the above story illustrates, you’d better catch and tear down these “interpretations” as the risks to the enterprise they are, well before they obtain visible executive sponsorship, or the political waters get choppy.
IMHE, if you catch these in time, then estimate the risk along with a solution, it usually defuses them and “prices” their proposals closer to a “market clearing rate” for the actual risk. They’re usually hoping to pass the hot potato to the poor suckers forced to handle the sustaining work streams on their “brilliant vision” before anyone notices the emperor has no clothes.
I’d love to hear others’ experiences around this and how they defused the risk time bombs.
> “You're predicting the entire future of this architecture on 30 lines of hand generated code?"
It’s comforting to know that massively strategic decisions based on very little information that may not even be correct are made in other organizations and not just mine.
Everybody does it. Information only comes after you’ve made your strategic decision, never before it.
I don’t think it is that simple. Itanium was supported for years by, for example, RHEL (including a working GCC, of course - if anybody had cared enough, they could have invested in optimizing it), so it is not as if the whole fiasco happened in one moment. No, Itanium was genuinely a bad design, which never got fixed, because it apparently couldn’t be.
Well, yes, the market didn't care all that much for various reasons. (There were reasons beyond technology.) RHEL/GCC supported it but, while I wasn't there at the time, I'm not sure how much focus there was. Other companies were hedging their bets on Itanium at the time--e.g. Project Monterey. Aside from Sun, most of the majors were placing Itanium bets to some degree, if only to hedge other projects.
Even HP dropped it eventually. And the former CEO of Intel (who was CTO during much of the time Itanium was active) said in a trade press interview that he wished they had just done a more enterprisey Xeon--which is what happened eventually anyway.
We're not living through this again at all with generative AI, right?
I think Bob Colwell's account is the clearest short synopsis.
https://www.sigmicro.org/media/oralhistories/colwell.pdf
'And I finally put my hand up and said I just could not see how you're proposing to get to those kind of performance levels. And he said well we've got a simulation, and I thought Ah, ok. That shut me up for a little bit, but then something occurred to me and I interrupted him again. I said, wait I am sorry to derail this meeting. But how would you use a simulator if you don't have a compiler? He said, well that's true we don't have a compiler yet, so I hand assembled my simulations. I asked "How did you do thousands of line of code that way?" He said “No, I did 30 lines of code”. Flabbergasted, I said, "You're predicting the entire future of this architecture on 30 lines of hand generated code?" [chuckle], I said it just like that, I did not mean to be insulting but I was just thunderstruck. Andy Grove piped up and said "we are not here right now to reconsider the future of this effort, so let’s move on".'