Former NASA engineer: Building a space data center is the worst idea I've ever heard.

A former NASA engineer and Google Cloud veteran explains why building data centers in space is a thoroughly impractical idea, riddled with challenges from power and heat rejection to radiation tolerance. This article is based on a piece by Taranis, compiled, translated, and written up by Dongqu. (Background: This man wants to send Bitcoin mining rigs to space: infinite sunlight plus zero cooling costs as the holy land of BTC mining) (Context: Moving the "Three Gorges Dam" into space? China plans to build a solar power station in orbit; will humanity welcome energy freedom?)

To be clear, I am a former NASA engineer/scientist with a PhD in space electronics. I also worked at Google for 10 years across several departments, including YouTube and the cloud organization responsible for deploying AI compute, so I am well qualified to speak on this topic.

In short: this is absolutely a bad idea, and it really makes no sense at all. There are many reasons, but the summary is that the electronics required to run a data center, especially one deploying AI compute in the form of GPUs and TPUs, are completely unsuited to operating in space. If you have not worked in this field before, I would caution you against relying on intuition, because what it actually takes to get hardware working in space is not at all obvious.

Power

The main reason people seem to want to do this is the idea that there is ample power in space. That is not true. Essentially you have only two choices: solar power and nuclear power.

Solar power means deploying arrays of photovoltaic panels, basically the same equipment that sits on the roof of my house in Ireland, just in space. It works, but it is not magically better than putting solar panels on the ground: the atmosphere does not absorb all that much of the incoming power, so ground-based intuition about the required panel area is roughly correct.

The largest solar array ever deployed in space is the one on the International Space Station (ISS), which provides just over 200 kW at peak. It is worth noting that deploying it took several Space Shuttle flights and a great deal of work; it covers about 2,500 square meters, more than half the size of a football field. For reference, a single NVIDIA H200 GPU draws about 0.7 kW per chip. These chips cannot run on their own, and power conversion is not 100% efficient, so realistically about 1 kW per GPU is a better budget. A huge, ISS-sized array can therefore power about 200 GPUs.

That may sound like a lot, but keep some perspective: the data center OpenAI plans to build in Norway is intended to house 100,000 GPUs, each of which may consume more power than an H200. To reach that capacity you would need to launch 500 satellites the size of the ISS. For comparison, a single server rack (like the pre-configured ones NVIDIA sells) holds 72 GPUs, so each of these giant satellites corresponds to only about three racks (the arithmetic is sketched below).

Nuclear power is no help either. We are not talking about full nuclear reactors here, but about radioisotope thermoelectric generators (RTGs), which typically output around 50 W to 150 W. That is not even enough to run a single GPU, even if you could convince someone to hand you a lump of subcritical plutonium and didn't mind the very real chance of scattering it over a wide area when the launch vehicle explosively self-destructs.
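As a quick sanity check, the solar-power arithmetic above reduces to a few lines. All inputs are the article's own rough figures, including the 1 kW-per-GPU budget:

```python
# Back-of-envelope check of the power numbers quoted above.
# Treat all inputs as rough, order-of-magnitude figures from the article.

ISS_ARRAY_PEAK_KW = 200        # peak output of the ISS solar arrays (~2,500 m^2)
GPU_BUDGET_KW = 1.0            # H200 draws ~0.7 kW; ~1 kW/GPU after conversion losses
TARGET_GPUS = 100_000          # planned GPU count for the Norway data center
GPUS_PER_RACK = 72             # NVIDIA's pre-configured rack size

gpus_per_iss_array = ISS_ARRAY_PEAK_KW / GPU_BUDGET_KW
satellites_needed = TARGET_GPUS / gpus_per_iss_array
racks_per_satellite = gpus_per_iss_array / GPUS_PER_RACK

print(f"GPUs powered by one ISS-sized array: {gpus_per_iss_array:.0f}")           # ~200
print(f"ISS-sized satellites for {TARGET_GPUS:,} GPUs: {satellites_needed:.0f}")  # ~500
print(f"Ground racks' worth of GPUs per satellite: {racks_per_satellite:.1f}")    # ~2.8
```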
Thermal Regulation

ISS Active Thermal Control System (Boeing)

I have seen many comments on this concept along the lines of "Well, space is cold, so cooling should be easy, right?" Uh... no... really not.

Cooling on Earth is relatively simple. Air convection works quite well: blow air over a surface, especially one designed with a large surface-area-to-volume ratio like a heat sink, and heat moves from the heat sink into the air quite effectively. If you need a higher power density than direct air cooling can handle (and high-power GPUs definitely fall into this category), you can use liquid cooling to carry heat from the chip to larger heat exchangers or radiators elsewhere. Terrestrial data centers typically set this up as a cooling loop: a coolant (usually water) is pumped around the racks, picks up heat, and returns through the loop, where it is usually cooled back down by convection to the air. That is how it works on Earth.

In space there is no air. The environment is close to a perfect vacuum, so convection simply does not exist. In space engineering we usually talk about thermal management rather than just cooling. The fact is, space itself does not have a temperature; only matter does. It may surprise you, but almost anything in the Earth-Moon system settles at roughly the same average temperature as the Earth, which is exactly why the Earth has the temperature it does. If a satellite rotates, a bit like a chicken on a rotisserie, it tends to hold a fairly uniform temperature close to that of the Earth's surface. If it does not rotate, the side facing away from the Sun gradually cools toward about 4 Kelvin, just above absolute zero, the limit set by the cosmic microwave background, while the sunward side can get quite hot, reaching several hundred degrees Celsius. Thermal management therefore requires very careful design to make sure heat is deliberately routed to where it needs to go, and because there is no convection in a vacuum, that can only happen through conduction or some kind of heat pump.

I have designed space hardware that has flown. In one particular case I designed a camera system that had to be very compact and lightweight while still delivering scientific-grade imaging. Thermal management was central to the design process. It had to be, because power is scarce on a small spacecraft and the thermal design had to work while keeping mass to a minimum. So for me there was no heat pump or anything fancy: I took another approach and designed the system to draw about 1 watt at peak, dropping to about 10% of that when the camera was idle. All of that power ends up as heat, so by shutting off the image sensor the moment an image lands in RAM I could halve the power draw, and once the image had been downloaded to the flight computer I could turn off the RAM as well, bringing the power down to a fairly small level. The only thermal management required was to bolt the edges of the circuit board to the chassis so the copper layers inside the board could conduct away whatever heat was generated.

Cooling even a single H200 would be an absolute nightmare. Heat sinks and fans obviously will not work, but there is a liquid-cooled version of the H200, so assume that is the one being used. The heat still has to be carried to a radiator, and this is nothing like the radiator in your car (remember, no convection): it has to radiate the heat away into space, assuming we can keep it pointed away from the Sun.
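For a sense of scale, here is a minimal sketch of the radiator area an ideal panel would need to reject heat purely by radiation, using the Stefan-Boltzmann law. The 300 K radiator temperature and 0.9 emissivity are illustrative assumptions on my part, not figures from the article:

```python
# Idealised radiative-cooling estimate via the Stefan-Boltzmann law.
# Emissivity and radiator temperature below are assumed, not from the article.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(power_w, temp_k, emissivity=0.9):
    """Single-sided radiator area needed to reject power_w watts.

    Assumes the panel sees only deep space (no Sun or Earth in view) and
    ignores the ~4 K background, so this is a best-case lower bound.
    """
    flux_w_per_m2 = emissivity * SIGMA * temp_k ** 4
    return power_w / flux_w_per_m2

# One GPU at the ~1 kW budget, with the radiator held near 300 K:
print(f"{radiator_area_m2(1_000, 300):.1f} m^2 per GPU")  # ~2.4 m^2, best case
```

Even in this best case, a single roughly 1 kW GPU needs a couple of square meters of radiator that never sees the Sun; real systems need more, as the ISS numbers below show.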
The Active Thermal Control System (ATCS) on the ISS is an example of such a system. It is a very complex design that uses ammonia cooling loops and large thermal radiators. Its heat-rejection limit is 16 kW, enough for about 16 H200 GPUs, a bit less than a quarter of a ground rack. The radiator assembly measures 13.6 m x 3.12 m, about 42.5 square meters. If we take 200 kW as the benchmark and assume all of that power goes to the GPUs, we need a system 12.5 times larger, or about 531 …
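That scaling, as a quick sketch (the figures come from the article, with the 1 kW-per-GPU budget carried over from the power section):

```python
# Scaling the ISS ATCS figures quoted above to a 200 kW payload.

ATCS_REJECTION_KW = 16.0       # heat the ISS ATCS can reject
ATCS_RADIATOR_M2 = 42.5        # 13.6 m x 3.12 m radiator assembly
GPU_BUDGET_KW = 1.0            # per-GPU budget from the power section
PAYLOAD_KW = 200.0             # one ISS-array-sized satellite's worth of power

gpus_per_atcs = ATCS_REJECTION_KW / GPU_BUDGET_KW        # ~16 GPUs
scale_factor = PAYLOAD_KW / ATCS_REJECTION_KW            # 12.5x
radiator_needed_m2 = ATCS_RADIATOR_M2 * scale_factor     # ~531 m^2

print(f"GPUs one ATCS-class system can cool: {gpus_per_atcs:.0f}")
print(f"Radiator area for a {PAYLOAD_KW:.0f} kW payload: {radiator_needed_m2:.0f} m^2")
```

This lands in the same ballpark as the idealised estimate earlier (about 2.4 m^2 per GPU, so roughly 480 m^2 for 200 GPUs), which is a useful cross-check.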
