kevdev 4 days ago

As someone with a similar background to the writer of this post (I did avionics work for NASA before moving into more “traditional” software engineering), this post does a great job of summing up my thoughts on why space-based data centers won’t work. The SEU issues were my first thought, followed by the thermal concerns, and both are addressed here fantastically.

On the SEU issue I’ll add in that even in LEO you can still get SEUs - the ISS is in LEO and gets SEUs on occasion. There’s also the South Atlantic Anomaly where spacecraft in LEO see a higher number of SEUs.

RobotToaster 4 days ago

As someone with only a basic knowledge of space technology, my first thought when I read the idea was "how the hell are they going to cool it".

inejge 3 days ago

> On the SEU issue I’ll add in that even in LEO you can still get SEUs

As a sibling post noted, SEUs are possible all the way down to sea level. The recent Airbus mass intervention was essentially a fix for a badly handled SEU in a corner case.

hedora 4 days ago

Single event upsets are already commonplace at sea level well below data center scale.

The section of the article that talks about them isn’t great. At least for FPGAs, the state of the art is to run 2-3 copies of the logic, and detect output discrepancies before they can create side effects.

I guess you could build a GPU that way, but it’d have 1/3 the parallelism of a normal one for the same die size and power budget. The article says it’d be a 2-3 order of magnitude loss.

It’s still a terrible idea, of course.
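
For the curious, here’s a toy sketch of the majority-vote idea in plain Python (the logic, bit width, and upset rate are all made up for illustration; real TMR happens in the FPGA fabric itself):

    import random

    def logic(x):
        # Stand-in for the replicated logic: any deterministic function.
        return (x * 3 + 7) & 0xFF

    def with_possible_upset(value, p=0.01):
        # Simulate an SEU by occasionally flipping one output bit of one copy.
        if random.random() < p:
            value ^= 1 << random.randrange(8)
        return value

    def tmr(x):
        # Run three redundant copies and majority-vote each output bit,
        # so a single upset in any one copy never reaches the output.
        a, b, c = (with_possible_upset(logic(x)) for _ in range(3))
        return (a & b) | (a & c) | (b & c)

    trials = 100_000
    bad = sum(tmr(i) != logic(i) for i in range(trials))
    print(f"uncorrected outputs: {bad} / {trials}")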

  • sdenton4 3 days ago

    It strikes me that neural network inference loads are probably pretty resilient to these kinds of problems (as we see the bits per activation steadily decreasing), and where they aren't, you can add bitflips as augmentations at training time and they will essentially act as regularization.
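
    A rough sketch of what that augmentation could look like with numpy (int8-quantized weights; the flip rate and tensor shape are purely illustrative):

        import numpy as np

        rng = np.random.default_rng(0)

        def flip_random_bits(q, rate=1e-3):
            # Flip one random bit in roughly `rate` of the elements of an int8
            # tensor, mimicking SEUs as a training-time augmentation/regularizer.
            mask = rng.random(q.shape) < rate
            bits = rng.integers(0, 8, size=q.shape, dtype=np.uint8)
            flipped = (q.view(np.uint8) ^ (np.uint8(1) << bits)).view(np.int8)
            return np.where(mask, flipped, q)

        w = rng.integers(-128, 128, size=(64, 64), dtype=np.int8)
        w_aug = flip_random_bits(w)
        print("weights perturbed:", int((w != w_aug).sum()), "of", w.size)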

  • ACCount37 3 days ago

    If you're using GPUs, you're running AI workloads. In which case: do you care?

    One of the funniest things about modern AI systems is just how many random bitflips they can tank before their performance begins to really suffer.

  • jeltz 3 days ago

    Sounds like it would remove a lot of the benefit gained from the extra solar power.

foobarian 4 days ago

The only advantage I can come up with is the background temperature being much colder than Earth surface. If you ignored the capex cost to get this launched and running in orbit, could the cooling cost be smaller? Maybe that's the gimmick being used to sell the idea. "Yes it costs more upfront but then the 40% cooling bill goes away... breakeven in X years"

  • jcranmer 4 days ago

    Strictly speaking, the thermosphere is actually much warmer than the atmosphere we experience--on the order of hundreds or even a thousand degrees Celsius, if you're measuring by temperature (the average kinetic energy of molecules). However, since particle density is so low, the total heat content of the thermosphere is tiny. And because the particle count is so low, conduction and convection are essentially nonexistent, which means cooling has to rely entirely on radiation, which is far less effective at shedding heat than the other modes.

    In other words, a) background temperature (to the extent it's even meaningful) is much warmer than Earth's surface and b) cooling is much, much more difficult than on Earth.

    • MadnessASAP 4 days ago

      Technically radiation cooling is 100% efficient. And remarkably effective: you can cool an inert object to the temperature of the CMB (about 3 K) without doing anything at all. However, it is rather slow, and works best if there are no nearby planets or stars.

      Fun fact though: make your radiator hotter and you can dump just as much energy as, if not more than, you typically would via convective cooling. At 1400C (just below the melting point of steel) you can shed about 450kW of heat per square meter; all you need is a really fancy heat pump!
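
      A quick sanity check of that figure with the Stefan-Boltzmann law (assuming an ideal emissivity of 1 and ignoring anything radiating back):

          SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W / (m^2 K^4)
          T = 1400 + 273.15         # radiator temperature, K
          flux = SIGMA * T**4       # emitted power per unit area, W / m^2
          print(f"~{flux / 1e3:.0f} kW per square meter")   # ~444 kW/m^2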

      • fsh 3 days ago

        Your hypothetical liquid metal heat pump would have a Carnot efficiency of only 25%.
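
        Roughly where that number comes from, assuming the heat is lifted from electronics at ~60 C up to the 1400 C radiator (ideal Carnot limit, no real-world losses):

            T_cold = 60 + 273.15              # chip-side temperature, K (assumed)
            T_hot = 1400 + 273.15             # radiator temperature, K
            cop = T_cold / (T_hot - T_cold)   # ideal (Carnot) heat-pump COP, ~0.25
            print(f"each watt of chip heat needs ~{1 / cop:.1f} W of work,"
                  " all of which also has to be radiated away")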

      • wat10000 4 days ago

        How much power would a square meter at 1400C shed from convection?

  • noselasd 4 days ago

    Is it an advantage though? One of the main objections in the article is exactly that.

    There's no atmosphere to help with heat loss through convection and nowhere to shed heat through conduction; all you have is radiation. It is a serious engineering challenge for spacecraft to get rid of even the little heat they generate and to avoid being overheated by the sun.

    • foobarian 4 days ago

      I think it is an advantage; the question is just how big, assuming we look only at ongoing operating cost.

      - Earth temperatures are variable, and radiation only works at night

      - The required radiator area is much smaller for the space installation

      - The engineering is simple: CPU -> cooler -> liquid -> pipe -> radiator. We're assuming no constraint on capex so we can omit heat pumps

      • Hikikomori 4 days ago

        Radiators on Earth mainly dump heat to the air; there's no air in space.

      • noselasd 3 days ago

        A typical CPU heatsink dissipates 10-30% of its heat through radiation and the rest through convection. In space you're in a vacuum, so you can't dissipate heat through convection at all.

        You need to rework your physical equipment quite substantially to make up for the fact that you can't shed 70-90% of the heat in the same manner as you can down here on Earth.
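
        A back-of-the-envelope version of that split, with assumed numbers (70 C fins, 25 C air, emissivity 0.9, and a forced-air convection coefficient of ~30 W/m^2/K):

            SIGMA = 5.670e-8                             # W / (m^2 K^4)
            T_fin, T_air = 70 + 273.15, 25 + 273.15      # K
            rad = 0.9 * SIGMA * (T_fin**4 - T_air**4)    # radiative flux, W/m^2
            conv = 30 * (T_fin - T_air)                  # convective flux, W/m^2
            print(f"radiation {rad:.0f} W/m^2, convection {conv:.0f} W/m^2,"
                  f" radiative share ~{rad / (rad + conv):.0%}")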

  • skywhopper 4 days ago

    But the cooling cost wouldn’t be smaller. There’s no good way to dump the waste heat into space. It’s actually far, far harder to radiate the waste heat away in space than it is to get rid of it on Earth.

    • buildbot 4 days ago

      Which is why vacuum flasks for hot/cold drinks work. Empty space is a pretty good insulator, as it turns out.

      It’s a little worrying that so many people don’t know that.

    • foobarian 4 days ago

      I don't know about that. Look at where the power goes in a typical data center: for a 10MW DC you might spend 2MW just to blow air around. A radiative cooler in space would almost eliminate that. The problem is that the initial investment is probably impractical.

      • nick238 3 days ago

        >99.999% of the power put into compute turns into heat, so you're going to need to reject 8 MW of power into space with pure radiation. The ISS EATCS radiators reject 0.07 MW of power in 85 sq. m, so you're talking about 9700 sq. m of radiators, or bigger than a football field/pitch.
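
        The scaling behind those numbers, taking the ISS EATCS figures above at face value:

            heat_to_reject = 8e6     # W of waste heat from the 10 MW DC
            eatcs_power = 0.07e6     # W rejected by the ISS EATCS
            eatcs_area = 85          # m^2 of EATCS radiator
            area = heat_to_reject / eatcs_power * eatcs_area
            print(f"~{area:,.0f} m^2 of radiator")   # ~9,714 m^2; a football pitch is ~7,100 m^2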

        • mercutio2 3 days ago

          Yes, so?

          Everyone keeps talking past each other on this, it seems.

          “Generating power in space is easy, but ejecting heat is hard!”

          Yes.

          “That means you’d need huge radiators!”

          Yes.

          OK, we’re back to “how expensive/reliable is your giant radiator with a data center attached?”

          We don’t know yet, but with low launch costs, it isn’t obviously crazy.

      • Hikikomori 4 days ago

        Now scale the radiator size for your 8MW datacenter.

      • wat10000 3 days ago

        How do you propose to get 10MW of heat from the computers out to the radiators?

  • andrewflnr 4 days ago

    This question is thoroughly covered in the linked article.

    • foobarian 4 days ago

      Pardon, but the question of "could the operational cost be smaller in space" is barely touched on in the article. The article mostly argues that designing thermal management systems for space applications is hard, and that the radiators required would be big, which speaks to the upfront investment cost, not the ongoing opex.

      • andrewflnr 4 days ago

        Ok, sure, technically. To be fair, you can't really assess the opex of technology that doesn't exist yet, but I find it hard to believe that operating brand-new, huge machines that have to move fluid around (and not nice fluids either) will ever cost less than it does on the surface. Better hope you never get a coolant leak. Heck, it might even be that opex=0 still isn't enough to offset the "capex". Space is already hard when you're not trying to launch record-breaking structures.

        Even optimistically, capex goes up by a lot to reduce opex, which means you need a really really long breakeven time, which means a long time where nothing breaks. How many months of reduced electricity costs is wiped out if you have to send a tech to orbit?

        Oh, and don't forget the radiation slowly destroying all your transistors. Does that count as opex? Can you break even before your customers start complaining about corruption?

  • cmptrnerd6 4 days ago

    Cooling is more difficult in space: yes, it's colder, but actually transferring the heat away is harder.

  • wat10000 4 days ago

    Things on earth also have access to that coldness for about half of each day. How many data centers use radiative cooling into the night sky to supplement their regular cooling? The fact that the answer is “zero” should tell you all you need to know about how useful this is.

    • foobarian 4 days ago

      The atmosphere is in the way even at night, and re-radiates the energy back; the effective background temperature is the temperature of the air, and it would only work at night anyway. I think you'd need something like 50 acres of radiators for a 50MW datacenter radiating at 60C against a 30C background. This would be a lot smaller in space due to the bigger temperature delta. Either way opex would be much, much less than an average Earth DC (PUE of almost 1 instead of the run-of-the-mill 1.5, or as low as 1.1 for hyperscalers). But yeah, the upfront cost would be immense.
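
      Rough numbers behind that estimate (net Stefan-Boltzmann exchange between a 60 C radiator and a 30 C background, emissivity of 1, one-sided):

          SIGMA = 5.670e-8                           # W / (m^2 K^4)
          T_rad, T_sky = 60 + 273.15, 30 + 273.15    # K
          net_flux = SIGMA * (T_rad**4 - T_sky**4)   # net radiated flux, W/m^2
          area_m2 = 50e6 / net_flux                  # area needed for 50 MW
          print(f"{net_flux:.0f} W/m^2 -> ~{area_m2 / 4047:.0f} acres")   # ~220 W/m^2 -> ~56 acres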

      • tstrimple 4 days ago

        I think you’re ignoring a huge factor in how radiative cooling actually works. I thought the initial question was fine if you hadn’t read the article, but I understand the downvotes for doubling down. Think of it this way: why do thermoses have a vacuum-sealed chamber between two walls to insulate the contents of the bottle? Because a vacuum is a fucking terrible conductor of heat. Putting your data center into space in order to cool it is like putting a computer inside of a thermos to cool it. It makes zero fucking sense. There is nowhere for the heat to go except slow radiation, so it stays inside.

    • oceanplexian 3 days ago

      Look up Tech Ingredients episode on Radiative Paint.

      The fact that people aren’t using something isn’t evidence that it’s not possible or even a great idea, it could be that a practical application didn’t exist before or someone enterprising enough hasn’t come along yet.

      • wat10000 3 days ago

        When something has been known for millennia and hasn’t been put to a particular use even after decades where it could have been used, that is pretty good evidence that this use isn’t a good idea. Especially when it’s something really simple.

        Radiative cooling is great for achieving temperatures a bit below ambient at night when you don’t have any modern refrigeration equipment. That’s about all. It’s used in space applications because it’s literally the only option.

  • dzhiurgis 4 days ago

    Breakeven in X years probably makes sense for storage (slow depreciation), not GPUs (which depreciate in something like 4 years).

    • foobarian 4 days ago

      I think by far the most mass in this kind of setup would go into the heat management, which could probably last a long time and could be amortized separately from the electronics.

      • tsimionescu 3 days ago

        How would the radiators be useful if the electronics no longer are? Unless you can repurpose the radiators once the electronics are useless (which you can't in space), the radiators' useful lifetime is hard-limited by the electronics' lifetime.