Comment by stevenjgarner 12 hours ago

"It is 1958. IBM passes up the chance to buy a young, fledgling company that has invented a new technology called xerography. Two years later, Xerox is born, and IBM has been kicking themselves ever since. It is ten years later, the late '60s. Digital Equipment DEC and others invent the minicomputer. IBM dismisses the minicomputer as too small to do serious computing and, therefore, unimportant to their business. DEC grows to become a multi-hundred-million dollar corporation before IBM finally enters the minicomputer market. It is now ten years later, the late '70s. In 1977, Apple, a young fledgling company on the West Coast, invents the Apple II, the first personal computer as we know it today. IBM dismisses the personal computer as too small to do serious computing and unimportant to their business." - Steve Jobs [1][2][3]

Now, "IBM CEO says there is 'no way' spending on AI data centers will pay off". IBM has not exactly had a stellar record at identifying the future.

[1] https://speakola.com/ideas/steve-jobs-1984-ad-launch-1983

[2] https://archive.org/details/1983-10-22-steve-jobs-keynote

[3] https://theinventors.org/library/inventors/blxerox.htm

bodge5000 an hour ago

This is the exact kind of thinking that got us into this mess in the first place, and I'm not blaming you for it; it seems to be something all of us do to an extent. We don't look to Meta, who only a few years ago thought that the Metaverse would be the "next big thing", as an example of failure to identify the future; we look to IBM, who made that mistake almost 50 years ago. Underestimating a technology seems to stick much harder than overestimating one.

If you want to be seen as relevant in this industry, or as a kind of "thought leader", the easy trick seems to be to hype up everything. If you do that and you're wrong, people will quickly forget. If you don't and you're wrong, that will stain your reputation for decades.

  • axegon_ an hour ago

    The amount of hate I've received here for similar statements is astonishing. What is even more astonishing is that it takes third-grade math skills to work out that the current AI costs (even ignoring the fact that there is nothing intelligent about the current AI) are astronomical, that they do not deliver on the promises, and that everyone is operating at wild losses. At the moment we are at "if you owe 100k to your bank, you have a problem, but if you owe 100M to your bank, your bank has a problem". It's the exact same bullshitter economy that people like Musk have been exploiting for decades: promise a ton, never deliver, make a secondary promise for "next year", rinse and repeat -> infinite profit. Especially when you rope in fanatical followers.

    • consp 9 minutes ago

      The last sentence sounds a lot like a (partial?) Ponzi scheme.

  • mistersquid 43 minutes ago

    > We don't look to Meta, who only a few years ago thought that the Metaverse would be the "next big thing", as an example of failure to identify the future; we look to IBM, who made that mistake almost 50 years ago.

    The grandparent points to a pattern of failures whereas you point to Meta's big miss. What you miss about Meta, and I am no fan, is that Facebook purchased WhatsApp and Instagram.

    In other words, two out of three ain’t bad; IBM is zero for three.

    While that's not the thrust of your argument, which is about the problem of jumping on every hype train, the post to which you reply is not about the hype cycle. Rather, that post calls out IBM for a failure to understand the future of technology and does so by pointing to a history of failures.

    • bodge5000 36 minutes ago

      > In other words, two out of three ain’t bad; IBM is zero for three.

      Many others in this thread have pointed out IBM's achievements but regardless, IBM is far from "zero for three".

helsinkiandrew 5 hours ago

> IBM has not exactly had a stellar record at identifying the future.

IBM invented/developed/introduced magnetic stripe cards, UPC barcodes, the modern ATM, hard drives, floppies, DRAM, SQL, the 360 family of mainframes, the PC, Apollo guidance computers, Deep Blue. IBM created a fair share of the future we're living in.

I'm no fan of much of what IBM is doing at the moment, but it could be argued that its consultancy/service orientation gives it a good view of how businesses are using, and plan to use, AI.

  • consp 3 minutes ago

    I've heard some second-hand stories about IBM's way of using "AI", and it is pretty much business-oriented, without much of the glamorous promises the other companies make (of course you still have shiny new things in business terms). It's actually good entertainment hearing about all the internal struggles of business vs fancy during the holidays.

  • hinkley 4 hours ago

    They also either fairly accurately predicted the death of HDDs by selling off their research division before the market collapsed, or they caused the end of the HDD era by selling off their research division. They did a lot of research.

    • highwaylights an hour ago

      The retail market is maybe dead, but datacenters are still a fairly large customer, I'd think. HDDs really shine at scale, where they can be fronted by flash and DRAM cache layers.

      • rbanffy an hour ago

        They are still cheaper than flash for cold data, but that's not going to hold for long. Flash is so much denser that the acquisition cost difference for a multi-petabyte store becomes small next to the datacenter space and power needed by HDDs. HDDs require dedicated research to increase density, while flash can rely on silicon manufacturing advances for that - not that it doesn't require specific research, but being able to apply the IP across a vast space makes better economic sense.

    • ReptileMan 2 hours ago

      The HDD being dead will surely come as a surprise to the couple of 12TB rusties spinning joyously in my case right now.

      • newsclues an hour ago

        HDDs would be much more important today if flash storage didn’t exist.

        • busssard 44 minutes ago

          Did you know that SSDs are not stable storage if they don't get electricity every now and again...

  • Aperocky 5 hours ago

    The other way to look at it is that the entire consulting industry is teetering on catastrophe. And IBM, being largely a consulting company now, is not being spared.

    • kelnos 3 hours ago

      IBM isn't failing, though. They're a profitable company with healthy margins, and enterprises continue to hire them for all sorts of things, in large numbers.

    • ahartmetz 3 hours ago

      > The other way to look at it is that the entire consulting industry is teetering on catastrophe

      Oh? Where'd you get that information?

      If you mean because of AI, it doesn't seem to apply much to IBM. They are probably not great at what they do, like most such companies, but they are respectable and can take the blame if something goes wrong. AI doesn't have these properties.

      • leoc an hour ago

        If anything there’s likely plenty of work for body shops like IBM in reviewing and correcting AI-generated work product that has been thrown into production recently.

    • carlmr 5 hours ago

      This is a separate argument though. A failing company may still be right in identifying other companies' failure modes.

      You can be prescient about failure in one area and still fail yourself. There's no gotcha.

      • esseph 4 hours ago

        IBM is not a failing company though, they are a Goliath in the Enterprise space.

        • carlmr 2 hours ago

          Still beside the point. The company failing or not is orthogonal to them being able to identify failure in others.

      • marliechiller 4 hours ago

        > A failing company may still be right in identifying other companies' failure modes.

        Agreed, if this is what they are doing, but what if they're spewing claims to try to discredit an industry in order to quell their shareholders' concerns?

        • rbanffy an hour ago

          They are not the only ones looking at the money spent in AI datacentres and concluding most of the investment will not be recovered anytime soon.

          A lot of the silicon being deployed is great for training but inefficient for inference, and the training-to-inference ratio of usage shows a clear tendency to shift toward inference. Furthermore, that silicon, with the workloads it runs, doesn't last long and needs replacement.

          The first ones to go online might recover the investment, but the followers better have a plan to pivot to other uses.

    • iso1631 17 minutes ago

      The whole point of a consultant is to let the execs blame someone else.

      Nobody got fired for buying something Gartner recommended, or for following EY's advice to lay off/hire

      I don't see AI taking that blame away.

  • Glemkloksdjf 2 hours ago

    For a company that built Deep Blue, they are really struggling with AI.

    • catwell 2 hours ago

      Their Granite family of models is actually pretty good! They just aren't working on the mainstream large LLMs that capture all the attention.

      • rbanffy an hour ago

        IBM is always very conscious of what their clients need (and the large consultancy business provides a very comprehensive view). It just turns out their clients don’t need IBM to invest in large frontier models.

    • oedemis 20 minutes ago

      IBM developed SSM/Mamba models and is also releasing training datasets, I think. Quantum computing is a strategic option as well.

  • meekaaku 4 hours ago

    IBM is/was good at inventing a lot of tech.

    It may not be good at recognizing good tech invented by others, or paradigm changes driven by others.

  • jrflowers 5 hours ago

    > IBM invented/developed/introduced magnetic stripe cards, UPC barcodes, the modern ATM, hard drives, floppies, DRAM, SQL, the 360 family of mainframes, the PC, Apollo guidance computers, Deep Blue. IBM created a fair share of the future we're living in.

    Well put. “IBM was wrong about computers being a big deal” is a bizarre take. It’s like saying that Colonel Sanders was wrong about chicken because he, uh… invented the pressure fryer.

EagnaIonat 5 hours ago

I read the actual article.

He is pointing out that the current cost of building the data centres means you will never be able to make enough profit to cover those costs - $800 billion just to cover the interest.
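
A rough back-of-envelope on that figure (my own sketch, not numbers from the article; assuming a cost of capital somewhere between 5% and 10%):

    # implied debt-financed capex if $800B/year goes to interest alone
    interest = 800e9              # $800 billion per year
    for rate in (0.05, 0.10):     # assumed cost of capital, 5-10%
        print(f"{rate:.0%}: ${interest / rate / 1e12:.0f}T of financed build-out")

So $800B a year in interest only makes sense if something on the order of $8-16 trillion of data centre build-out ends up being debt-financed.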

OpenAI is already haemorrhaging money, and the idea of space data centres has already been debunked. There is even a recent paper that points out that LLMs will never become AGI.

The article also finishes with some other experts reaching the same conclusions.

[edit] Fixed $80 to $800

  • rbanffy 37 minutes ago

    While AGI might be the Holy Grail, AI doesn’t need to be general human-level to be useful and profitable.

  • ta12653421 5 hours ago

    >> There is even a recent paper that points out that LLMs will never become AGI.

    can you share a link?

    • EagnaIonat 4 hours ago

      Took me a while to find again, as there are a lot of such papers in this area.

      https://www.arxiv.org/pdf/2511.18517

      • will4274 4 hours ago

        Is this AI paper written by a reputable subject-matter expert? It seems to be written by a physicist, and also to be the only academic work by this author in English.

      • mkl 4 hours ago

        A single author, in a physics department. Seems unlikely to be groundbreaking or authoritative.

  • Glemkloksdjf 2 hours ago

    Sorry to say, but the fact that you argue LLMs will never become AGI shows you are not up to date.

    People don't assume LLMs will be AGI; people assume that world models will lead us to AGI.

    I personally never assumed LLMs would become AGI. I always assumed that LLMs broke the dam for investment and research into massive-scale ML compute, and LLMs are very, very good at showing where the future is going, because they are already so crazy good that people can now imagine a future where AGI exists.

    And that was already very clear as soon as GPT-3 came out.

    The next big thing will probably be either a LOT more RL or self-propelling AI architecture discovery. Both need massive compute to work well, but could then provide even faster progress once humans are out of the loop.

    • EagnaIonat 2 hours ago

      > People don't assume LLMs will be AGI;

      I wish that was true.

      > people assume that world models will lead us to AGI.

      Who are these people? There is no consensus around this that I have seen. You have anything to review regarding this?

      > as soon as GPT-3 came out.

      I don't think that was true at all. It was impressive when it came out, but people in the field clearly saw the limitations and what it was.

      RL isn't magical either. Google's AlphaGo, as an example, often required human intervention to get the RL to work correctly.

      • Glemkloksdjf an hour ago

        AlphaGo Zero doesn't need much human intervention at all.

        Regarding world models: all the big names. LeCun, Demis Hassabis, Fei-Fei Li too. And they are all working on it.

        LLMs will definitely play some type of role in AGI. After all, you can already ask an LLM a lot of basic things, like 'what are the common steps to make tea'. A type of guide, long-term fact memory, or whatever this can be called.

skissane 11 hours ago

> In 1977, Apple, a young fledgling company on the West Coast, invents the Apple II, the first personal computer as we know it today. IBM dismisses the personal computer as too small to do serious computing and unimportant to their business.

IBM released the 5100 in September 1975 [0] which was essentially a personal computer in feature set. The biggest problem with it was the price tag - the entry model cost US$8975, compared to US$1298 for the entry Apple II released in June 1977 (close to two years later). The IBM PC was released in August 1981 for US$1565 for the most basic system (which almost no one bought, so in practice they cost more). And the original IBM PC had model number 5150, officially positioning it as a successor to the 5100.

IBM's big problem wasn't that they were uninterested in the category - it was that they initially insisted on using expensive IBM-proprietary parts (often shared technology with their mainframe/midrange/minicomputer systems and peripherals), which resulted in a price that made the machine unaffordable for everyone except large businesses, governments, and universities (and even those customers often balked at the price tag). The secret of the IBM PC's success is that they told the design team to use commercial off-the-shelf chips from vendors such as Intel and Motorola instead of IBM's own silicon.

[0] https://en.wikipedia.org/wiki/IBM_5100

  • meekaaku 4 hours ago

    And outsourcing the operating system to Microsoft, because they didn't consider it that important.

rchaud 12 hours ago

Got anything vis-a-vis the message as opposed to the messenger?

I'm not sure these examples are even the gotchas you're positing them as. Xerox is a dinosaur that was last relevant at the turn of the century, and IBM is a $300bn company. And if it wasn't obvious, the Apple II never made a dent in the corporate market, while IBM and later Windows PCs did.

In any case, these examples are almost half a century old and don't relate to capex ROI, which was the topic of discussion.

  • stevenjgarner 11 hours ago

    If it's not obvious, Steve's quote is ENTIRELY about capex ROI, and I feel his quote is more relevant to what is happening today than anything Arvind Krishna is imagining. The quote is posted in my comment not to grandstand for Apple in any sense, but to highlight just how consistently wrong IBM has been about so many opportunities that it has failed to read correctly - reprography, minicomputers and microcomputers being just three.

    Yes it is about ROI: "IBM enters the personal computer market in November ’81 with the IBM PC. 1983 Apple and IBM emerged as the industry’s strongest competitors each selling approximately one billion dollars worth of personal computers in 1983, each will invest greater than fifty million dollars for R&D and another fifty million dollars for television advertising in 1984 totaling almost one quarter of a billion dollars combined, the shakeout is in full swing. The first major firm goes bankrupt with others teetering on the brink, total industry losses for 83 out shadow even the combined profits of Apple and IBM for personal computers."

    • IgorPartola 7 hours ago

      I have no horse in this race.

      I don’t think this is really a fair assessment. IBM is in fact a huge company today and it is possible that they are because they took the conservative approach in some of their acquisition strategy.

      It is a bit like watching someone play poker and fold, and then it turns out they had the high hand after all. In hindsight you can of course see that the risk would have been worth it, but in the moment perhaps it did not seem like it, given the money the player would be risking.

      • bojan 3 hours ago

        > I don’t think this is really a fair assessment. IBM is in fact a huge company today and it is possible that they are because they took the conservative approach in some of their acquisition strategy.

        I can also imagine IBM was being approached with hundreds, if not thousands, of propositions. That they missed three that turned out to be big is statistically to be expected.

    • somenameforme 7 hours ago

      A big difference is that in the past things like the potential of the PC were somewhat widely underestimated. And then the internet was as well.

      But in modern times it's rather the opposite scenario. The average entity is diving head first into AI, simply expecting a revolutionary jump in capability that a more 'informed', for lack of any less snooty term, perspective would suggest is quite unlikely to occur anytime in the foreseeable future. Basically we have a modern-day gold rush where companies are taking out unbelievably massive loans to invest in shovels.

      The only way this doesn't catastrophically blow up is if AI companies manage to convince the government they're too big to fail, and get the Boeing, banks, et al. treatment. And I expect that's exactly the current strategy, but it's rather a high-risk, low-reward strategy.

      • fuzzfactor an hour ago

        >things like the potential of the PC were somewhat widely underestimated.

        The potential of the AI that comes within reach at maximum expenditure levels may just be more widely overestimated.

        The potential to make "that much money" even more challenging.

        A very opposite scenario.

        I think so many corporations are looking at how expensive actual humans always have been, and can be sure always will be, so much so that it's a major cost item that cannot be ignored. AI opens up the possibility of a whole new level of automation, or outright replacement, for routine simple-minded tasks, to a degree that never existed before. More jobs could possibly be eliminated than in previous waves of mechanical and digital automation.

        When you do the business math, the savings could be enormous.

        But you can only realistically save as much as you are actually wasting, otherwise if you go too far you shoot yourself in the foot.

        Even with all that money to work with, if you're in practice hunkering down for savings because you can't afford real people any more, you surely can't say the sky's the limit. Not like selling PCs or anything that's capable of more unbridled growth.

        When PCs arrived they flew off the shelves even at their high initial retail prices.

        People in droves (but not the silent majority) are shunning free AI and the movement is growing with backlash in proportion to the foisting.

    • davidmanescu 7 hours ago

      I have no special knowledge about IBM vs Apple historically, but: a quarter billion in capex when you've earned a billion in revenue in a single year is extremely different from what we're seeing now. These companies are spending all of their free cash flow, then taking on debt, to the tune of percentage points of world GDP, and multiples of any revenue they've seen so far. That kind of oversupply is a sure-fire way to kill any ROI.

  • jstummbillig 5 hours ago

    > Got anything vis-a-vis the message as opposed to the messenger?

    Sure: people disagree. It's not like there is anything particularly clever that the IBM CEO provided here. The guy not investing in something saying it won't work is about as credible as the people who do invest saying it will. It's simply different assumptions about the future.

  • fuzzfactor an hour ago

    >the message as opposed to the messenger?

    Exactly.

    The message is plain to see with very little advanced math.

    The only news is that it is the CEO of IBM saying it out loud.

    IMHO he has some of the most credible opinions at this scale that many people have seen.

    It's "highly unlikely" that all this money will be paid back to everyone that invested at this point. The losers probably will outnumber the winners, and nobody knows whether it will end up becoming a winner-take-all situation yet. A number of wealthy players remain at the table, raising stakes with each passing round.

    It's so much money that it's already too late to do anything about it, and the full amount hasn't even changed hands yet.

    And the momentum from something so huge can mean that almost the entire amount will have to change hands a second time before a stable baseline can be determined relative to pre-existing assets.

    This can take longer than anyone gives it credit for, just because of the sheer massiveness. In the meantime, established real near-term growth opportunities may languish or even fade as the skew in the rationality/solvency balance awaits the rolling dice coming to rest.

  • killingtime74 7 hours ago

    Would you read this if I (a nobody) told you and not the "CEO of IBM"? In that case it's completely fair to question the messenger.

bayindirh 5 hours ago

IBM is an interesting beast when it comes to business decisions. While I can't give exact details, their business intelligence and ability to predict monetary things is uncannily spot-on at times.

So, when their CEO says that this investment will not pay off, I tend to believe them, because they most probably have the knowledge, insight and data to back that claim, and they have run the numbers.

Oh, also, please let's not forget that they dabbled in "big AI" before everyone else. Anyone remember Deep Blue and Watson, the original chatbot backed by big data?

kelnos 3 hours ago

We can cherry-pick blunders made by any big company to make a point. Maybe it would be more honest to also list companies IBM passed on that turned out to be rubbish? And all the technologies that IBM did invest in that made them a ton of money and became industry standards?[0]

Today, Xerox has less total revenue than IBM has profit. DEC went out of business 27 years ago. Apple is in an astoundingly great place right now, but Jobs got kicked out of his own company, and then returned when it was about to fail, having to take investment from Microsoft(!) in order to stay afloat.

Meanwhile, IBM is still here, making money hand over fist. We might not have a ton of respect for them, being mostly a consulting services company these days, but they're doing just fine.

[0] As another commenter points out: https://news.ycombinator.com/item?id=46131245

akst 2 hours ago

"The amount being spent on AI data centres not paying off" is a different statement to "AI is not worth investing in". They're effectively saying the portions people are investing are disproportionately large to what the returns will end up being.

It's a difficult thing to predict, but I think there's almost certainly some wasteful competition here. And some competitors are probably going to lose hard. If models end up being easy to switch between and the better model is significantly better than its competitors, then anything invested in weaker models will effectively be for nothing.

But there's also a lot to gain from investing in the right model. Even so, it's possible those who invested in the winner may have to wait a long time to see a return on their investment, and could still over-allocate their capital at the expense of other investment opportunities.

elnatro 6 hours ago

Were Xerox, DEC, or Apple burning investor money by the billions of dollars?

  • chroma205 5 hours ago

    > Were Xerox, DEC, or Apple burning investor money by the billions of dollars?

    Shhh. You are not allowed to ruin OpenAI’s PPU value. Can’t make the E7’s feel bad.

  • spiderfarmer 6 hours ago

    No, but the comment above and variations of it are mentioned in every thread about IBM, so it’s probably just a reflex at this point without much thought behind it.

  • beambot 5 hours ago

    Xerox is clearly crushing it in 2025... /s

    • raducu 5 hours ago

      That's completely beside the point, though? Kodak invented the digital camera, did not think much of it, and others then ate their lunch. Those others are also not crushing it in 2025. The point is IBM is not the go-to to listen to about AI. I'm also not saying they are not right; even a broken clock is right twice a day.

      • kelnos 3 hours ago

        > The point is IBM is not the go-to to listen to about AI.

        Why not, though? For better or worse, they're a consulting services company these days, and they work with an eye-wateringly large number of companies. I would expect them to have a very good view as to what companies use AI for, and plan/want to use AI for in the future. They may not be experts in the tech itself, but I think they're decently well-positioned to read the tea leaves.

pacifika 7 hours ago

50-year grudges are not relevant; there is no one still at IBM who worked there in 1977, IMHO.

  • vlovich123 6 hours ago

    It’s the ship of Theseus in corporate form. Even if all the people are gone but the culture hasn’t changed, is the criticism inaccurate?

    • EagnaIonat 5 hours ago

      > Even if all the people are gone but the culture hasn’t changed

      Can you expand on this? What was the culture then versus now?

      For example back then it was the culture to have suit inspectors ensure you had the right clothes on and even measure your socks. (PBS Triumph of the Nerds)

    • altmanaltman 6 hours ago

      I mean, okay, but you're taking the current leadership's words and claiming they are incorrect because IBM management was not great at identifying trends decades ago. Historical trend is not an indicator of the future, and it's not engaging in good faith with the conversation about whether overspending on AI can be backed by revenue in the future. You're attacking the messenger instead of the message.

      • vlovich123 4 hours ago

        I'm saying: given IBM's track record of repeatedly failing at innovation and failing to invest in the correct technologies, why do you assume today's CEO is better at it and has bucked the milquetoast culture that has pervaded IBM? It's a company that has largely divested its ability to provide solutions and technology and turned more into a large consultancy shop (+ milking their legacy mainframe contracts and whatnot).

      • alex77456 6 hours ago

        I didn't read the top-level comment as dismissive or 'proving it wrong', but rather as adding context, or even as somewhat humorous.

        • altmanaltman 5 hours ago

          I don't understand how calling something "a ship of Theseus in corporate form" or "culture hasn't changed", etc., is not dismissive of the actual comment by the CEO on AI overspending. They dismissed the content of the message by saying IBM's culture sucks, is how I read it. Also, things can be funny and dismissive at the same time; they often are.

  • ndr 6 hours ago

    Culture evolution can be very fast, yet some cultures stick around for a very long time.

mattacular 8 hours ago

What does that have to do with the current CEO's assessment of the situation?

  • pinnochio 8 hours ago

    Nothing. It's just BS rhetoric to bias you against it in favor of The Obvious-to-Everyone-But-the-Laggards AI Revolution.

    • hansmayer 4 hours ago

      A revolution means radical changes executed over a short period of time. Well, four years in, this has got to be one of the smallest "revolutions" we have ever witnessed in human history. Maybe it's revolutionary for people who get excited about crappy pictures they can insert into their slides to impress management.

    • Atlas667 7 hours ago

      The AI astroturfing campaign.

      If you had billions to gain, would you invest a few 100k or millions in an astroturfing campaign?

    • venturecruelty 7 hours ago

      You definitely want to be standing in front of a chair when the music stops.

  • echelon 7 hours ago

    IBM sees the funding bubble bursting and the next wave of AI innovation as about to begin.

    IBM was too early with "Watson" to really participate in the 2018-2025 rapid scaling growth phase, but they want to be present for the next round of more sensible investment.

    IBM's CEO is attempting to poison the well for funding, startups, and other ventures so IBM can collect itself and take advantage of any opportunities to insert itself back into the AI game. They're hoping timing and preparation pay off this time.

    It's not like IBM totally slept on AI. They had Kubernetes clusters with GPUs. They had models and notebooks. But their offerings were the absolute worst. They weren't in a position to service real customers or build real products.

    Have you seen their cloud offerings? Ugh.

    They're hoping this time they'll be better prepared. And they want to dunk on AI to cool the playing field as much as they can. Maybe pick up an acquisition or two on the cheap.

    • hansmayer 4 hours ago

      How exactly are they poisoning the well? OpenAI committed to $1.4 trillion in investments... with revenue of ~$13B - how is the IBM CEO contributing to that already absolutely poisoned situation? Steve Jobs did not care about naysayers when he introduced the iPhone - because his product was so innovative for its time. According to AI boosters, we now have a segment of supposedly incredibly powerful and at the same time "dangerous" AI products. Why are they not sweeping the floor with the "negators", "luddites", "laggards", etc.? After so many hundreds of billions of dollars and supposedly so many "smart" AI researchers... Where are the groundbreaking results, man? Where are the billion-dollar startups launched by a single person (heck, I'd settle even for a small team)? Where are the ultimate applications, etc.?

hansmayer 5 hours ago

Right, you just missed the part where DEC went out of business in the 90s. And IBM is still around, with a different business model.

jacquesm 6 hours ago

Steve Jobs, the guy who got booted out of his own company and who required a lifeline from his arch-nemesis to survive?

This is all true, but it was only true in hindsight and as such does not carry much value.

It's possible that you are right and AI is 'the future', but with the present-day AI offerings I'm skeptical as well. It isn't at a level where you don't have to be constantly on guard against BS, and in that sense it's very different from computing so far, where reproducibility and accuracy of the results were what mattered, not the language that they are cast in.

AI has killed the NLP field and it probably will kill quite a few others, but for the moment I don't see it as the replacement of general computing that the proponents say that it is. Some qualitative change is still required before I'm willing to check off that box.

In other news: Kodak declares digital cameras a fad, and Microsoft saw the potential of the mp3 format and created a killer device called the M-Pod.

ActionHank 42 minutes ago

Cool story, but it’s more than just the opinion of this CEO. It’s logic.

Hardware is not like building railroads: the hardware is already out of date once deployed, and the clock has started ticking on writing off the expense or turning a profit on it.

There are fundamental discoveries needed to make the current tech financially viable, and an entire next generation of discoveries needed to deliver on the over-inflated promises already made.

Jean-Papoulos 6 hours ago

But how many companies did IBM pass on that did crash and burn? And how many did it not pass on that did decently? They're still around after more than three generations' worth of tech industry. They're doing something right.

TL;DR: cherry-picking.

zorked 2 hours ago

This isn't even a great argument at a literal level. Nowadays nobody cares about Xerox, whose business is selling printers; DEC was bought by Compaq, which was bought by HP. Apple is important today because of phones, and it was itself struggling to sell personal computers and needed an (antitrust-motivated) bailout from Microsoft to survive during the transition.

jojobas 8 hours ago

DEC went down the drain, and Xerox is 1/1000th of IBM's market cap. IBM made its own personal computer, superior by virtue of its relative openness, that ended up running the world, mostly maintaining direct binary compatibility for 40+ years, even without IBM really paying attention.

  • cylemons 3 hours ago

    How much did IBM itself benefit from the PC? I thought the clones ate their lunch there.

zkmon 3 hours ago

So, is the napkin math wrong, or are you just going by the company's history?

jrflowers 5 hours ago

> IBM has not exactly had a stellar record at identifying the future.

This would be very damning if IBM had only considered three businesses over the course of seventy years and made the wrong call each time.

This is like only counting three times that somebody got food poisoning and then confidently asserting that diarrhea is part of their character.

nosianu 2 hours ago

For some strange reason a lot of people were attracted to a comment that speaks about everything else BUT the actual topic, and it's the top comment now. Sigh.

If you think that carefully chosen anecdotes out of many, many more are relevant, there needs to be at least an attempt at reasoning. There is nothing here. It's really just a bare-bones mention of stuff intentionally selected to support the preconceived point.

I think we can, and should, do better in HN discussions, no? This is "vibe commenting".

jasonwatkinspdx 4 hours ago

You could try addressing the actual topic of discussion vs this inflammatory and lazy "dunk" format that, frankly, doesn't reflect favorably on you.

delis-thumbs-7e 2 hours ago

I'm sorry, but this is stupid; you understand that you have several logical errors in your post? I was sure Clinton was going to win in 2016. Does that mean that when I say 800 is bigger than 8, I am not to be trusted?

Do people actually think that running a business is some magical realism where you can manifest yourself to become a billionaire if you just believe hard enough?

  • walt_grata 2 hours ago

    The post is almost worse than you give it credit for. It doesn't even take into account that different people are making the decisions.

zaphirplane 3 hours ago

The idea that a company's DNA somehow lives on for over 100 years and maintains the same track record is far-fetched.

That the OpenAI tech bros are investing in AI using grown-up ROI analysis is similarly far-fetched; they are burning money to pull ahead of the rest, and they assume the world will be in the palm of the winner and that there is only one winner. Will the investment pay off if there are three neck-and-neck companies?

camillomiller 2 hours ago

IBM is still alive and kicking, and definitely more relevant than Xerox or DEC. You are completely misconstruing Jobs' point to justify the current AI datacenter tulip fever.

esseph 4 hours ago

Yet here they are at the forefront of quantum computing research.

otikik 3 hours ago

Even a broken watch is right twice a day.