| ▲ | esseph 5 hours ago |
| This is such a hypebeast paragraph. Datacenters in space are a TERRIBLE idea. Figure out how to get rid of the waste heat and get back to me. |
|
| ▲ | elihu 4 hours ago | parent | next [-] |
| That's not a new problem that no one has dealt with before. The ISS for instance has its External Active Thermal Control System (EATCS). It's not so much a matter of whether it's an unsolvable problem but more like: how expensive is it to solve, what are its limitations, and does the project still make economic sense once you factor all that in? |
| |
| ▲ | OneDeuxTriSeiGo 3 hours ago | parent | next [-]
| It's worth noting that the EATCS can dissipate at most 70kW of waste heat, and the EEATCS (the original heat exchange system) can only dissipate another 14kW. Together that's less than the waste heat of a single AI inference rack. And to achieve that, the EATCS needs 6 radiator ORUs, each spanning 23 meters by 11 meters with a mass of 1100 kg. So that's 1500 square meters and six and a half metric tons before you factor in any of the actual refrigerant, pumps, support beams, valve assemblies, rotary joints, or cold-side heat exchangers, all of which together will probably double the mass you need to put in orbit. There is no situation where that makes sense.
| Manufacturing in space makes sense (all kinds of techniques are theoretically easier in zero G and hard vacuum). Mining asteroids, etc. makes sense. Datacenters in space for people on Earth? That's just stupid.
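| For a sense of scale, here's a rough back-of-envelope sketch using the figures above (the 100kW-per-rack heat load and the 2x system-mass multiplier are assumptions, and ISS-era hardware is obviously not optimized for this job):

    # Naive extrapolation of the ISS numbers quoted above. The 100 kW per-rack
    # heat load and the 2x multiplier for pumps, fluid, and support structure
    # are assumptions, not figures from the comment's sources.
    EATCS_CAPACITY_KW = 70 + 14      # EATCS + EEATCS combined heat rejection
    RADIATOR_AREA_M2 = 6 * 23 * 11   # six radiator ORUs, ~23 m x 11 m each
    RADIATOR_MASS_KG = 6 * 1100      # panel mass alone, ~6.6 metric tons
    SYSTEM_MASS_MULT = 2.0           # refrigerant, pumps, joints, exchangers (assumed)

    RACK_HEAT_KW = 100               # assumed waste heat of one inference/training rack

    area_per_kw = RADIATOR_AREA_M2 / EATCS_CAPACITY_KW                     # ~18 m^2 per kW
    mass_per_kw = RADIATOR_MASS_KG * SYSTEM_MASS_MULT / EATCS_CAPACITY_KW  # ~157 kg per kW

    print(f"per rack: {area_per_kw * RACK_HEAT_KW:,.0f} m^2 of radiator, "
          f"{mass_per_kw * RACK_HEAT_KW / 1000:,.1f} t of cooling hardware")
    # per rack: 1,807 m^2 of radiator, 15.7 t of cooling hardware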
| ▲ | marcus_holmes 4 minutes ago | parent [-]
| I'm a total noob on this. I get that vacuum is a really good insulator, which is why we use it to insulate our drinks bottles. So disposing of the heat is a problem. Can't we use it, though? Like, I dunno, to take a really stupid example: boil water and run a turbine with the waste heat? Convert some of it back to electricity?
| |
| ▲ | hyperbovine 3 hours ago | parent | prev [-]
| The ISS consumes roughly 90kW. That’s about *one* modern AI/ML server rack. To reject that heat they need 1000 m^2 of radiator panels (EATCS). So that’s the math: every rack needs another square kilometer of stuff put into orbit. Doesn’t make sense to me.
| ▲ | dnqthao 26 minutes ago | parent | next [-]
| 1000 m^2 is not a square kilometer (1 square kilometer is 1 million m^2).
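| Redoing the parent's math with that correction (a sketch, reusing the parent's ~90kW-per-rack and ~1000 m^2 figures):

    # One rack's worth of radiator, per the parent comment's assumed figures
    # (~90 kW per rack needing ~1000 m^2 of radiator).
    area_per_rack_m2 = 1_000
    m2_per_km2 = 1_000_000
    racks_per_km2 = m2_per_km2 / area_per_rack_m2
    print(racks_per_km2)   # 1000.0 -- about a thousand racks per km^2 of radiator

| So the penalty is on the order of a thousand square meters of radiator per rack, i.e. roughly a thousand racks per square kilometer, not one rack per square kilometer. Still a lot of hardware, but three orders of magnitude less than stated.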
| ▲ | jcgrillo 3 hours ago | parent | prev [-]
| And what happens every time a rack (or node) fails? Does someone go out and try to fix it? Do we just "deorbit" it? How many tons per second of crap would we be burning in the upper atmosphere then? What are the consequences of that? How do the racks (or nodes) talk to each other? Radios? Lasers? What about Kessler Syndrome? Not a rocket scientist, but I 100% agree this sounds like a dead end.
| ▲ | elihu 2 hours ago | parent [-]
| Communication is a well-understood problem, and SpaceX already has Starlink. They might need pretty high bandwidth, but that's not necessarily much of a problem in space. Latency could be a problem, except that AI training isn't the sort of problem where you care about latency.
| I'd be curious where exactly they plan to put these datacenters... In low Earth orbit they would eventually reenter, which makes them a pollution source, and you'd have no solar power half the time. Parking them at the Earth-Sun L1 point would be better for solar power, but it would be more expensive to get stuff there.
| ▲ | WalterBright an hour ago | parent [-]
| > you'd have no solar power half the time
| Polar orbit.
| ▲ | woooooo 43 minutes ago | parent [-]
| Seasons mess that up unless you're burning fuel to make minor plane changes every day. Otherwise you have an equinox where your plane faces the sun (equivalent to an equatorial orbit) and a solstice where your plane is parallel to the sun (the ideal case).
|
| ▲ | fnord77 4 hours ago | parent | prev | next [-] |
| I agree that data centers in space are nuts. But I think there are solutions to the waste heat issue: https://www.nasa.gov/centers-and-facilities/goddard/engineer...
| |
| ▲ | OneDeuxTriSeiGo 3 hours ago | parent | next [-]
| The distinction is that what they are doing for Webb is dissipating small amounts of heat that would otherwise warm the sensors past cryogenic temperatures. Like on the order of tens or hundreds of watts, but at around -100C. Dissipating heat for an AI datacenter is a different game: a single AI inference or training rack is going to be putting out somewhere around 100kW of waste heat. Temps don't have to be cryogenic, but it's the difference between chiselling a marble or jade statue and excavating a quarry.
| ▲ | boutell 4 hours ago | parent | prev [-]
| That's a solution for minuscule amounts of heat that nevertheless disturb extremely sensitive scientific experiments. Using gold, no less. This does not scale to a crapton of GPU waste heat.
|
| ▲ | everfrustrated 4 hours ago | parent | prev [-] |
| Just have to size the radiators correctly. Not a physics problem, just an economic one. The main physics issue is actually that the math works out better at higher GPU temps (radiated power scales with the fourth power of radiator temperature), and running hotter might have reliability trade-offs.
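| A minimal sketch of why the math prefers hot radiators (the 100kW rack, the emissivity value, and the single-sided ideal radiator are all assumptions):

    # Minimal sketch of the temperature effect: an ideal radiator rejects heat
    # per the Stefan-Boltzmann law, P = eps * sigma * A * T^4, ignoring
    # sunlight/Earth heat input and assuming a single radiating face.
    # The 100 kW rack and the emissivity are assumptions, not from the thread.
    SIGMA = 5.670e-8      # W / (m^2 K^4), Stefan-Boltzmann constant
    EPS = 0.9             # emissivity of a typical radiator coating (assumed)
    RACK_HEAT_W = 100_000

    for temp_c in (20, 60, 100):   # radiator surface temperature
        temp_k = temp_c + 273.15
        area_m2 = RACK_HEAT_W / (EPS * SIGMA * temp_k ** 4)
        print(f"{temp_c:>4} C radiator -> {area_m2:6.0f} m^2 per rack")
    # 20 C -> ~265 m^2, 60 C -> ~159 m^2, 100 C -> ~101 m^2

| Going from a 20C to a 100C radiator cuts the required area by more than half; the catch is that the radiator can only run that hot if the GPUs run hotter still, which is where the reliability trade-off comes in.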
| |
| ▲ | kadoban 3 hours ago | parent [-]
| Anything is possible here, it's just that there's no goddamn reason to do any of this. You're giving up the easiest means of cooling for no benefit and adding other big downsides. It's sci-fi nonsense for no purpose other than to sound cool.
| ▲ | everfrustrated 2 hours ago | parent [-]
| It's about creating a flywheel for scale. Getting better at building and erecting solar panels & AI datacenters on Earth is all well and good, but it doesn't advance SpaceX or humanity very much; a lot of the bottlenecks there are around moving physical mass and paperwork. Whereas combining SpaceX & xAI means the margins from AI are used to force the economies of scale that drive the manufacturing efficiencies needed to bring down launch costs, etc. Which opens up new markets like Mars.
| It also pushes their competitive advantage and leaves a massive moat that makes it very hard for competitors. If xAI ends up with a lower cost of capital (a big if; like Amazon, this might take a 20-year horizon to realize), being vertically integrated would give them a massive moat. OpenAI and others would be priced out.
| If xAI wants to double AI capacity, then it's purely a manufacturing-automation problem, which plays to Elon's strengths (Tesla & automation). For anyone on Earth, doubling capacity means working through electricity restrictions, licensing, bureaucracy, etc. For example, the turbines needed for power plants are sold out years in advance; you can't get a new thermal plant built and online within 5 years even with infinite money, because turbines are highly complex and simply not available.
| ▲ | michaelmrose 2 minutes ago | parent | next [-]
| There is nothing we need on Mars other than science. It's not a market, because there isn't money to be made beyond whatever economically useless but scientifically valuable efforts we can convince people to fund. We can't build an independent colony, and we can't live there any time soon. Arguably it may never make sense to live there.
| ▲ | amluto an hour ago | parent | prev [-]
| Hmm, Elon really did run that flywheel pretty well. He built the Roadster to drum up some cash and excitement so he could develop the Model S, then he used that success to do the Model X, then he expanded capacity to develop the 3 and Y, and he reinvested the profits to develop the Model 2, finally bringing EVs to the masses, displacing ICEs everywhere, and becoming the undisputed leader of both EV and battery manufacturing.
| Oh wait, that didn’t actually happen, because he got distracted or something? He doesn’t really have battery capacity worth writing home about, the Chinese are surpassing Tesla in EV manufacturing, and Waymo is far ahead in self-driving. The amazing space computation cost reduction process sounds rather more challenging than the Model 2, and I’m not sure why anyone should bet on Elon pulling it off.