jabl 18 hours ago

To comment on using direct current rather than alternating current, at least NVIDIA is working on 800VDC:

https://developer.nvidia.com/blog/nvidia-800-v-hvdc-architec...

https://developer.nvidia.com/blog/building-the-800-vdc-ecosy...

More generally, a bunch of the electrical industry heavyweights are involved in a standard for DC microgrids called Current/OS: https://currentos.org/

There are certainly advantages to DC vs AC, but there's of course a huge amount of sunk cost in contemporary AC networks.
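As a rough illustration of why the industry is pushing distribution voltage up (numbers below are invented for illustration, not taken from the linked posts): for a fixed power draw, current scales as 1/V, so resistive loss I²R drops with the square of the voltage.

```python
# Illustrative only: resistive loss in a DC distribution path at different rail
# voltages, for the same delivered power. Doubling the voltage quarters the loss.
def busbar_loss_w(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    current_a = power_w / voltage_v   # I = P / V
    return current_a**2 * resistance_ohm  # P_loss = I^2 * R

R = 0.001  # 1 milliohm of distribution resistance, an assumed figure
for volts in (48, 400, 800):
    print(f"{volts:>4} V: {busbar_loss_w(100_000, volts, R):,.1f} W lost")
```

The quadratic scaling is the whole story: going from 48 V to 800 V cuts resistive loss by a factor of roughly 278 for the same wire.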

bob1029 a day ago

From a purely engineering perspective, I think it becomes difficult to argue with the gas turbine once you get into the gigawatt class of data center. The amount of land required for this much solar should not be understated. In many practical scenarios the solar array would need to be located some distance away from the actual data center. This implies transmission infrastructure, which is often the hardest part of any electrical engineering project. You can put a gigawatt of N+1 generation on a 50-acre site with gas. It's dispatchable 24/7/365 and you can store energy for pennies on the dollar at incredible scale.

Having both forms of generation available at the same time is the best solution. Once you put a data center on the grid you can mix the fuel however you want upstream. This should be the ultimate goal and I believe it is for all current AI projects. I am not aware of any data center builds that intend to operate on parking lot generators indefinitely.
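For a sense of scale on the land claim above, a rough back-of-envelope (assumed figures: ~25% solar capacity factor and ~6 acres of site per MW of nameplate, both typical ballpark values rather than anything from the comment):

```python
# Rough land-area estimate for solar matching a 1 GW (average) data center load.
# Assumed, illustrative figures: ~25% capacity factor, ~6 acres per nameplate MW.
avg_load_mw = 1000
capacity_factor = 0.25
acres_per_mw = 6.0

nameplate_mw = avg_load_mw / capacity_factor   # 4000 MW of panels needed
solar_acres = nameplate_mw * acres_per_mw      # ~24,000 acres of site
print(f"nameplate: {nameplate_mw:.0f} MW, land: {solar_acres:.0f} acres "
      f"(~{solar_acres * 0.004047:.0f} km^2) vs ~50 acres for gas")
```

Even before storage, that is on the order of a hundred square kilometres, which is why siting and transmission dominate the discussion.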

  • dan-robertson 15 hours ago

    For inference you don’t need GPUs to be clustered together as much, since they can handle separate tasks in parallel (training generally has lots of synchronisation steps, so you can be bottlenecked on those instead of ‘real’ work). But maybe other economies of scale still make you want to put them together (and therefore, on average, further from the power).

    I guess there was a bit of thought about transmission with the reference to high voltages. Another interesting thing: batteries let you reduce the needed capacity of transmission lines – with batteries near generation you can transmit at a lower peak rate but the same average rate as with batteries only near the point of use, making more efficient use of the available transmission.

    I guess the main reason for gas to be a problem is if you can’t get new generation (eg lack of turbines).
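The battery-near-generation point above can be sketched with toy numbers (the solar profile below is invented for illustration): with storage at the generation site absorbing the midday peak, the line only needs to be rated for the average flow.

```python
# Illustrative daily solar output in 3-hour buckets (MW), invented numbers.
solar_profile_mw = [0, 0, 400, 900, 1000, 900, 400, 0]

peak = max(solar_profile_mw)                         # line rating w/o storage
avg = sum(solar_profile_mw) / len(solar_profile_mw)  # rating w/ ideal storage

print(f"line rating without batteries: {peak} MW")
print(f"line rating with batteries at the solar site: ~{avg:.0f} MW")
```

In this toy case the transmission line can be less than half the size, which is the efficiency gain the comment is pointing at.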

  • hjoutfbkfd a day ago

    they are talking about covering the desert with solar panels. why would you not put the data center in the middle of it?

    • bob1029 17 hours ago

      Dust, high ambient temperatures, latency, accessibility.

      I think the dust is the worst part in terms of operational concerns.

    • sethops1 a day ago

      Simply because latency is a competitive advantage, one worth paying for. At the speed of light, making a trip out to the desert and back is too slow.

      • hjoutfbkfd 21 hours ago

        20 ms extra, for models which respond in 5 minutes

        • stogot 20 hours ago

          Right. It's a use case where humans are not latency-sensitive.
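For what it's worth, the "20 ms extra" figure upthread is plausible. A quick check, assuming a site ~1500 km away and light in fiber travelling at roughly two-thirds of c (both assumptions, not figures from the thread):

```python
# Round-trip fiber latency to a hypothetical desert site ~1500 km away.
distance_km = 1500
fiber_speed_km_s = 200_000  # light in fiber, ~2/3 of c

round_trip_ms = 2 * distance_km / fiber_speed_km_s * 1000
print(f"round trip: {round_trip_ms:.0f} ms")  # ~15 ms
```

Real routes are longer than great-circle distance and add switching delay, but the order of magnitude holds: tens of milliseconds, not seconds.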

  • matt-p 20 hours ago

    Sadly, I agree, at least until we get SMRs (I think we are a few years off). Obviously it would be more ideal to use grid+solar with curtailment, but that's not super realistic.

  • cinntaile a day ago

    If you have predictable demand at that scale, nuclear might make more sense than the combination of gas and solar.

    • leetrout 21 hours ago

      I am hoping nuclear batteries make a comeback, driven by the desire for all this compute and its voracious appetite for energy.

      • alansaber 21 hours ago

        We have Rolls-Royce small modular reactors (SMRs) driving similar functionality in the UK

        • stephen_g 19 hours ago

          For context, at the moment they hope to have them operating some time in the 2030s. That's a best case, as are the cost estimates (operating them practically and safely may cost more than what people are forecasting).

          Not operating today, as the comment makes it sound.

  • [removed] a day ago
    [deleted]
Havoc a day ago

Slightly OT, but I see the Chinese are talking about space DCs now too, which would suggest they reckon it could work (unlike me and others here).

  • hhh a day ago

    Datacenters in space are a great way to claim vast amounts of viable orbital space for a stupid project, then eventually sell the slots for something else once they become rarer.

    • alansaber 21 hours ago

      This is basically the same argument people make about domain-specific language models, but with mind-share rather than physical space (in space), so actually your argument makes more sense? lol.

  • numpad0 19 hours ago

    It can't work if you're launching from Earth. Datacenters are too heavy with or without the solar and radiator panels.

    If you could make those panels and chips on the Moon, Deimos, Mars, high Jupiter, wherever, then space datacenters can totally work.

  • bgnn 19 hours ago

    Except none of those data-center-grade chips can work in space. No GPUs, no memory, no SSDs. They are not radiation-hardened (rad-hard). Rad-hard chips generally cost an order of magnitude or more compared to normal commercial chips, are in general an order of magnitude less complex, and operate at much lower frequencies. Data centers in space are straight up stupid.

  • Galanwe a day ago

    Not a physician, but wouldn't space be terrible for heat dissipation?

    • ampersandwhich 21 hours ago

      Also not a rocket surgeon, but to my understanding, modern satellites already have solar panels and radiators that account for the system's overall energy absorption and dissipation in low Earth orbit [1]. Therefore, plugging a supercomputer into the solar array instead of another instrument would likely not affect the overall heat profile meaningfully. Most energy in LEO is ultimately derived from solar irradiance and passes through the spacecraft regardless of internal usage. That said, take this with a grain of salt due to the aforementioned lack of astrochirurgical bona fides.

      Edit: Added some primary sources [2][3][4], including an interactive website by Andrew McCalip which lets you play around with the unit economics of orbital 'datacenters' at various price points [4].

      [1] https://youtu.be/DCto6UkBJoI

      [2] https://services.google.com/fh/files/misc/suncatcher_paper.p...

      [3] https://starcloudinc.github.io/wp.pdf

      [4] https://andrewmccalip.com/space-datacenters

      • [removed] 19 hours ago
        [deleted]
    • preisschild 21 hours ago

      Yes, you would need massive amounts of radiators

      • Incipient 10 minutes ago

        And solar panels. I don't think we can put 1GW+ of solar panels into orbit?
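A rough sense of "massive": radiating 1 GW of waste heat via the Stefan-Boltzmann law, assuming an emissivity of 0.9 and a 300 K radiator temperature (both assumptions, and absorbed solar input is ignored):

```python
# Radiator area needed to reject 1 GW of heat in orbit.
# Stefan-Boltzmann: P = eps * sigma * A * T^4, radiating from both faces.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)
eps, temp_k, power_w = 0.9, 300.0, 1e9  # assumed emissivity, temp, heat load

flux = eps * SIGMA * temp_k**4   # W per m^2 per face (~413 W/m^2)
area_m2 = power_w / (2 * flux)   # two-sided radiator
print(f"flux: {flux:.0f} W/m^2 per face, area: {area_m2 / 1e6:.2f} km^2")
```

On these assumptions, over a square kilometre of radiator, before accounting for the sunlit side absorbing heat, which only makes it worse.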

  • alansaber 21 hours ago

    I think it's more of a classic mirror move: IF they do work, they're in danger of falling behind.

    • xyzsparetimexyz 19 hours ago

      Falling behind? No, they're shadowing us, waiting until we make a mistake.

phtrivier 21 hours ago

I'm curious about Handmeier's opinion on location of data centers.

Should they be close to the solar arrays (that is, in the desert, with data networks connecting them to where the tokens are used)?

Or close to their customers (which means far from the solar arrays, with electricity networks)?

He talks a lot about removing moving parts, but aren't the wires going to be a limiting factor?

  • bgnn 20 hours ago

    Fiber is much, much less of a cost and technical challenge compared to transferring gigawatts of power. Unless the customer cannot handle up to 100 ms of latency, it's totally logical to place the data centers close to the power source, or vice versa (the power source close to the data center).

hambes a day ago

it is difficult for me to comprehend that someone spends all this time thinking through and calculating how to harness as much energy as possible and then wants to use it for large language models instead of something useful, like food production, communication, transport or any other way of satisfying actual human material needs. what weird priorities.

  • Hendrikto 19 hours ago

    Whether you like it or not, we are burning a lot of electricity on datacenters. That is a fact. And energy consumption is likely going to significantly increase in the near future. If we can reduce that energy usage, that is a good thing and a big improvement.

    I do not think I even understand your complaint. Different people can work on different problems. We do not have to pick only one.

    > My improvement is more important than yours.

    We can just do both.

    • ufish235 19 hours ago

      We don’t do both. We spend trillions on AI.

    • xyzsparetimexyz 19 hours ago

      Reducing consumption is just a case of using A) smaller models and B) not shoving AI into everything, e.g. ads, search results, email summaries

  • samus 20 hours ago

    LLMs and other IT applications have the distinct advantage that they require no other raw materials as input, aside from initial setup, extension, and maintenance. Under these conditions the requirements essentially boil down to real estate and high bandwidth internet connections. Also, demand for AI is currently so high that the solution can be scaled up far enough to be viable.

    All the other concerns require more subtle approaches because human requirements are much more messy.

    • hambes 16 hours ago

      demand for AI is not high, which is the current problem of the industry and the reason that AI companies are trying to shoehorn their technology into products everywhere.

      these companies and the author of the article are trying to increase capacity for something that barely anyone wants in the software they use, which makes it all the more wasteful.

      • samus 15 hours ago

        I agree, the author seems wildly optimistic that all that capacity will indeed be needed in the long run. But I personally hope that it will lead to a breakthrough of solar and battery storage even if demand for AI tanks. If that happens, one could still shunt all that solar energy to other places, either with alternators and overland lines, or by shipping charged grid scale batteries via train.

  • dan-robertson 15 hours ago

    I don’t think that’s a great description of what’s going on here. I think there are two things:

    1. The actual thing the authors spend a lot of time thinking about seems to be, more generally, how to make good use of solar power for things that people find valuable – synthetic fuels, desalination, etc – and the implications of the sun only shining some of the time – maybe you don’t want to pay more for more efficient systems, since those want steady power, which is more expensive.

    2. I think the blog post is a bit of a response to lots of public discussion about AI data centres. IMO it seems better to see what someone who thinks a lot about energy has to say than, e.g., a government suggestion that you delete old pictures to reduce water consumption.

  • stingraycharles a day ago

    Sometimes (often) solving the problem is the most fun part, regardless of how it’s used.

    The scale of AI energy consumption is quite unique from what I heard, and there’s a lot of money flowing into that direction. So that seems to me a decent reason to think about that.

    I haven’t heard yet that food production is constrained by these kinds of things.

    It appears to me that you’re just taking a cheap jab at AI.

    • alansaber 21 hours ago

      Exactly this, you need a (big) problem to motivate people to actually take a serious jab at a (big) new idea

  • compass_copium 20 hours ago

    Well, I've never seen anything written by AI evangelists that doesn't sound like it was written in day three of an adderall binge. This essay is no different.

  • 627467 18 hours ago

    I share the reaction, but I'm also aware how easy it is to incentivize (aka subsidize) ineffective old processes in the name of "productive" priorities. The problem is not LLMs/DCs; the problem is that food production, transport and communications are not sexy in "post-scarcity" (entitled/distracted) societies. People take too many things for granted.

  • sandworm101 20 hours ago

    Tell that to the 1000-watt space heater in the corner that i tasked with upscaling some old home movies! Four GPUs worked very hard all night to get footage of my first dog up to 1080p. My living room is a little warm this morning.

  • gruez 19 hours ago

    >instead of something useful, like food production, communication, transport or any other way of satisfying actual human material needs. what weird priorities.

    You realize that even pre-AI, that this complaint would still hold for most of tech? Adtech, enterprise SaaS, and B2C apps are hardly "actual human material needs". Even excluding tech, the next lucrative sector would be banking, and same complaint would be applicable. In other words, this is a decades (centuries?) old complaint, repackaged for the current thing.

    • hambes 16 hours ago

      yes, i do realize that. thank you for expanding on my point.

  • fnord77 19 hours ago

    the saying goes something like: the brightest minds in the world are getting together to figure out how to deliver more ads

  • hjoutfbkfd a day ago

    if anything we are producing too much food

    and what communications do you find lacking?

    • phtrivier 21 hours ago

      Food distribution is still a problem in vast part of the world.

      Handling food waste is another issue.

      Climate-related shortages are coming soon for us (at the moment they only manifest as occasional price hikes - mustard a few years ago, coffee and chocolate more recently, etc.):

      https://www.euronews.com/green/2025/02/13/goodbye-gouda-and-...

      https://www.fao.org/newsroom/detail/adverse-climatic-conditi...

      https://www.forbes.com/sites/noelfletcher/2024/11/03/how-cli...

      I don't know if the electricity going into compute centers could be put to better use, to help alleviate climate change impacts, or to create more resilient and distributed supply chains, etc...

      But I would not say that this is "not a problem", or that it's completely obvious that allocating those resources instead to improving chatbots is smart.

      I understand why we allocate resources to improving chatbots - first-world consumers are using them, and the stock markets assume this usage is soon going to be monetized. So it's not that different from "using electricity to build radios / movie theaters / TVs / 3D gaming cards, etc. instead of desalinating water / pulling CO2 out of the air / transporting beans, etc."

      But at least Nvidia did not have the nerve to claim that using electricity to play Quake in higher res would solve world hunger, as some people claim:

      https://www.forbes.com/sites/johnwerner/2024/05/03/sam-altma...

      • EQmWgw87pw 20 hours ago

        It feels like you didn’t read your own link, as he somewhat addresses your concern directly. The idea is simply that AI investment is an “up front cost” for future improvements. To argue against it, you would have to explain convincingly why you think AI will not advance other technologies at all.

        • phtrivier 16 hours ago

          I usually don't try to prove things won't happen. I leave the burden of proof to the salesmen. In this case they have extraordinary claims, so, as the saying goes, I wait for extraordinary proof.

          So far they have failed to convince me.

    • scellus a day ago

      the main bottleneck for the civilization in communications currently is the sparsity of cynical, negative HN comments

      • cornhole 20 hours ago

        nerds' favorite pastime is to go "um, actually"

ErroneousBosh a day ago

Why are we wasting resources on toy chatbots?

  • alansaber 20 hours ago

    Because fusion energy isn't cool anymore.

  • adamsb6 19 hours ago

    Why are we wasting resources hosting countless replicas of alt.tv.simpsons?

  • boxed 21 hours ago

    If you think this is what LLMs are, then you are a bit behind the times. Opus 4.5 is a huge step up. The previous generation was good for starting basic hobby projects; now we can do pretty big, time-consuming changes with it.

    I have been extremely skeptical and dismissive of LLMs for a long time, but after a certain level of improvement you have to realize that at least for programming the advantages are substantial.

    • ErroneousBosh 19 hours ago

      Okay, that's great. LLMs offer no benefit though.

      • boxed 18 hours ago

        Ok, let's take it this way:

        What evidence could convince you there is some benefit?

        • ErroneousBosh 14 hours ago

          Well, what would those benefits be? I genuinely don't see what it's useful for.

          From what I've seen, people spend an inordinate amount of time typing in "prompts", and the chatbot goes off and pretends to generate some code, and then the people have to work out what's wrong with it and type in another prompt.

          So it looks like the humans get to do all the slow, time-consuming drudge work of typing stuff in and debugging the result, but the chatbot does the interesting part.

          Why would I voluntarily just do the drudge work?

  • Joel_Mckay 21 hours ago

    Borrowing state money that ultimately indentures a country with over-engineered massive boondoggle projects.

    That regulatory capture con strangled more emerging economies than most like to admit. =3

    "The Dictator's Handbook: Why Bad Behavior is Almost Always Good Politics" (Bruce Bueno de Mesquita, Alastair Smith)

    • gruez 19 hours ago

      >Borrowing state money that ultimately indentures a country with over-engineered massive boondoggle projects.

      The datacenters of Meta, Google, Amazon, etc. are primarily funded by the government?

      • Joel_Mckay 19 hours ago

        Do they get tax breaks, subsidies, loan deals, and naive non-voting investor money?

        My point was these folks never gamble with their own cash from revenue. It is always the taxpayer that ends up holding the gamblers' debts. =3

        https://www.usdebtclock.org/