Elon Musk Talks About "Space GPU" Vision Again: Space Will Be the Cheapest Place to Deploy AI Within 36 Months
In the early hours of Friday, Beijing time, a new episode of the well-known tech interview podcast Dwarkesh Podcast was released, featuring Elon Musk as a guest. As a hot topic in global capital markets, “space GPUs” received a detailed explanation from the world’s richest person.
“Remember my words: within 36 months, space will become the cheapest place to deploy artificial intelligence.”
Regarding the recent heated discussion about “space data centers,” Musk offered explanations from multiple angles.
The world’s richest person stated that the main reason for sending data centers into space is that the increase in power supply cannot keep up with chip production. Musk said, “Chip output is growing almost exponentially, but power output is flat. So how do you power these chips? Rely on magic power sources? Magic power elves?”
He also made a rather alarming prediction: by the end of this year, the industry will hit a point where large clusters cannot even be powered on. Chips will pile up, but there won’t be enough electricity to run them.
He also pointed out the advantages of putting solar panels in space: besides generating more energy there and needing no additional batteries, it avoids the complicated approval process for building photovoltaic farms on the ground. From this perspective, large-scale expansion on the ground is actually harder than in space.
Musk also offered a timetable for the economic viability of space GPUs: “In space, the power output of any given solar panel is about five times what it is on the ground. At the same time, you don’t bear the cost of batteries to get through the night. In fact, deploying in space is much cheaper. My judgment is that in the future, running AI in space will be the most cost-effective option, and overwhelmingly so. This shift will happen within 36 months, perhaps in as little as 30.”
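The roughly fivefold figure follows from capacity factors: a ground panel loses output to night and weather, while a panel in a suitable orbit sees near-continuous sun. A back-of-envelope sketch in Python; the capacity-factor numbers below are illustrative assumptions, not figures from the interview:

```python
# Illustrative comparison of energy delivered per solar panel in orbit
# vs. on the ground. All numbers are assumptions for the sketch.

def daily_energy_kwh(panel_kw: float, capacity_factor: float) -> float:
    """Energy one panel delivers per day at a given capacity factor."""
    return panel_kw * capacity_factor * 24

panel_kw = 1.0     # nameplate rating of one panel (assumed)
ground_cf = 0.20   # typical ground capacity factor: night, weather, latitude
space_cf = 0.99    # near-continuous sunlight in a suitable orbit (assumed)

ground = daily_energy_kwh(panel_kw, ground_cf)
space = daily_energy_kwh(panel_kw, space_cf)
ratio = space / ground   # ~5x, matching the figure Musk cites

print(f"ground: {ground:.1f} kWh/day, space: {space:.1f} kWh/day, ratio: ~{ratio:.0f}x")
```

Under these assumptions the ratio comes out near 5x before even counting the battery cost that a ground installation needs to ride through the night.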
On maintaining GPUs in orbit, Musk said that chips may suffer some early-life failures, which can be weeded out on the ground first; after this initial burn-in, they can be sent to space. Once chips survive that stage, they become quite reliable, so maintenance won’t be an issue.
Then Musk enthusiastically reiterated: “Remember what I said. Within 36 months, or even closer to 30 months, placing AI in space will be the most economically attractive. And after that, the advantages of being in space will become ridiculously good.”
The other thing the world’s richest person worries he cannot buy: power generation equipment.
Following on the power-shortage issue, Musk also explained why data centers cannot simply build large-scale on-site generation: gas turbines are hard to buy, and US tariffs make imported solar panels too expensive.
Musk explained: “The bottleneck for gas turbines lies in the guide vanes and turbine blades, because if you’re using gas power generation, these turbine blades and guide vanes are cast using highly specialized processes. Other forms of power generation are actually difficult to scale up. Solar energy can theoretically be expanded, but currently, the tariffs on imported photovoltaic panels are astronomically high, and domestic capacity is painfully low.”
As for the solar panels destined for space, Musk said that because there is no weather in orbit, they don’t need much glass or heavy mounting structures; in fact, they will be 5-10 times cheaper than their ground-based counterparts.
Musk also lamented that outsiders simply don’t understand how power-hungry data center operations are.
He stated that, beyond the Nvidia chips themselves, all networking and storage hardware draws power, and power planning must also account for peak cooling demand, which varies by location. He said, “In xAI’s Memphis data center, cooling alone adds 40% to electricity consumption, and power equipment also needs offline maintenance, which means another 20%-25%.” By that math, about 1 gigawatt of capacity serves roughly 330,000 GB300 units.
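The overhead factors stack multiplicatively, which is how 330,000 GPUs end up needing about a gigawatt. A minimal sanity-check sketch: the per-unit wall draw below is an assumed figure chosen for illustration, while the 40% cooling and 20-25% maintenance overheads are the ones quoted above.

```python
# Back-of-envelope check of the power budget described in the article.
# it_load_w is an ASSUMED average wall draw per GB300 unit; the overhead
# factors are the percentages Musk quotes for the Memphis site.

gpus = 330_000
it_load_w = 1_750          # assumed average draw per GB300 unit, in watts

cooling_factor = 1.40      # cooling adds ~40% on top of IT load
headroom_factor = 1.225    # midpoint of the 20%-25% maintenance headroom

required_w = gpus * it_load_w * cooling_factor * headroom_factor
print(f"required capacity: {required_w / 1e9:.2f} GW")  # ~0.99 GW
```

Under these assumptions the total lands almost exactly at 1 GW, consistent with the figure in the interview; a different per-unit draw would shift the supported GPU count proportionally.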
Musk also voiced concern about soaring memory-chip prices, saying the path to manufacturing logic chips is clearer than the path to securing enough memory to feed them. He joked, “If you’re stranded on a deserted island and write ‘Help’ on the beach, no one will come. But if you write ‘DDR memory,’ ships will swarm.”
In his vision, before AI goes to space the limiting factor is energy; after that, it is chips. Accordingly, he plans for future TeraFabs to produce not only logic chips in-house but possibly memory and packaging as well.
(Source: Caixin)