SpaceX and Blue Origin race to put data centers in orbit as scientists question the physics



The pitch is alluring in its simplicity: AI needs more power than terrestrial grids can provide, so move data centers into orbit, where the sun never sets and electricity is free. SpaceX, Blue Origin and a growing constellation of startups are now racing to make that vision a reality. The problem, according to the scientists and engineers who would have to make the physics work, is that the vision skips several unwritten chapters of thermodynamics, economics and orbital mechanics.

SpaceX filed with the Federal Communications Commission on January 30 for permission to launch up to one million satellites into low Earth orbit, promising “unprecedented computing power to power advanced AI models.” The satellites would operate at altitudes between 500 and 2,000 kilometers, in orbits designed to maximize time in sunlight, and would route traffic through SpaceX’s existing Starlink network. SpaceX requested a waiver from the FCC’s standard deployment milestones, which typically require half a constellation to be operational within six years.

Seven weeks later, Blue Origin submitted its own application. Project Sunrise proposes 51,600 satellites in sun-synchronous orbits between 500 and 1,800 kilometers, complementing the previously announced TeraWave constellation of 5,408 satellites. Where SpaceX’s filing emphasized raw scale, Blue Origin emphasized architecture: the system would perform computations in orbit and transmit the results to Earth via TeraWave’s mesh network.

The startup ecosystem is moving even faster. Starcloud, formerly Lumen Orbit, raised $170 million at a $1.1 billion valuation in March, just 17 months after completing Y Combinator’s program, making it the fastest unicorn in Y Combinator history. The company launched its first satellite carrying an Nvidia H100 GPU in November 2025, and in February filed with the FCC for a constellation of up to 88,000 satellites. Aethero, a defense-focused startup that builds space-grade computers around radiation-shielded Nvidia Orin NX chips, has raised $8.4 million and is testing hardware in orbit this year.

The commercial logic rests on a real problem. Global data center electricity consumption reached about 415 terawatt-hours in 2024, and the International Energy Agency projects it could surpass 1,000 TWh by 2026, driven by 30 percent annual growth in accelerated AI servers. In Virginia alone, data centers consume 26 percent of total electricity; Ireland’s share may reach 32 percent by the end of the year. Grid constraints are real, permitting delays are real, and political resistance to building more ground capacity is real.

Equally real, scientists say, is the physics that makes orbital computing spectacularly difficult at any meaningful scale. The first problem is heat. In space there is no air to carry heat away from processors; only radiative cooling is available, and it requires enormous surface area. Dissipating just one megawatt of waste heat while keeping electronics at a steady 20 degrees Celsius requires about 1,200 square meters of radiators, roughly four tennis courts. A data center of several hundred megawatts, the bare minimum for commercial significance, would require radiators thousands of times larger than anything ever deployed on the International Space Station.
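The 1,200-square-meter figure can be sanity-checked with the Stefan-Boltzmann law. This is a back-of-envelope sketch, not any company's thermal model: it assumes a double-sided flat panel with emissivity 0.9 radiating to deep space, and ignores absorbed sunlight and Earth's infrared glow, both of which would make the real panel larger.

```python
# Rough check of the article's ~1,200 m^2 radiator figure using the
# Stefan-Boltzmann law. Assumptions (not from the article): double-sided
# flat panel, emissivity 0.9, radiating to deep space, no absorbed
# sunlight or Earthshine.
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W / (m^2 K^4)
EMISSIVITY = 0.9    # typical for white thermal-control coatings
T_PANEL_K = 293.15  # 20 degrees Celsius

def radiator_area_m2(heat_watts: float, sides: int = 2) -> float:
    """Panel area needed to reject heat_watts by radiation alone."""
    flux_per_side = EMISSIVITY * SIGMA * T_PANEL_K ** 4  # W per m^2 of face
    return heat_watts / (sides * flux_per_side)

area = radiator_area_m2(1.0e6)  # one megawatt of waste heat
print(f"{area:.0f} m^2")        # roughly 1,300 m^2 under these assumptions
```

The result lands around 1,300 square meters, the same ballpark as the article's figure; once sunlight absorption and single-sided mounting are accounted for, real panels only grow.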

Radiation presents a second structural problem. Low Earth orbit exposes unshielded chips to cosmic rays and trapped particles, causing bit flips and permanent circuit damage. Radiation hardening adds 30 to 50 percent to hardware costs and cuts performance by 20 to 30 percent. The alternative, triple-modular redundancy, means running three copies of each chip: three times the cooling, three times the power, and three times the mass. Starcloud’s approach of flying commercial GPUs with external shielding is an interesting experiment, but no one has yet demonstrated it working at scale or over hardware lifetimes measured in years rather than months.
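Triple-modular redundancy is conceptually simple: run the same computation on three modules and take the majority of the outputs, so a single radiation-induced fault is outvoted. A minimal illustrative sketch of the voting step (not any vendor's implementation, which is done in hardware):

```python
from collections import Counter

def tmr_vote(results):
    """Majority vote over three redundant module outputs.

    Masks a single faulty module; if all three disagree there is
    no majority, and the computation must be retried or the
    hardware flagged.
    """
    winner, count = Counter(results).most_common(1)[0]
    if count < 2:
        raise RuntimeError("no majority: retry or flag the hardware")
    return winner

# A bit flip corrupting one module's output is masked by the other two.
print(tmr_vote([42, 42, 40]))  # -> 42
```

The sketch also shows why the technique triples the bill: all three results must actually be computed before the vote, so the power, cooling and mass costs are paid on every operation.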

Latency is the third limitation. A million satellites spread across 500- to 2,000-kilometer orbits cannot achieve the tight coupling required for frontier model training, where inter-node communication delays must remain in the microsecond range. Low Earth orbit offers minimum delays of a few milliseconds for inter-satellite links and 60 to 190 milliseconds for ground-to-orbit round trips, compared with 10 to 50 milliseconds for terrestrial content delivery networks. That makes orbital infrastructure potentially suitable for inference workloads, but not for training, where the vast majority of AI computing demand currently sits.
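The millisecond floor is not an engineering shortfall but a speed-of-light bound, which a short calculation makes concrete. The distances below are illustrative assumptions, not figures from any filing: GPUs about 100 meters apart in a single terrestrial hall versus satellites 1,000 kilometers apart on a laser cross-link.

```python
# Speed-of-light floor on propagation delay: no routing, queuing,
# or serialization overhead, so real links are strictly slower.
C = 299_792_458.0  # speed of light in vacuum, m/s

def one_way_delay_ms(distance_km: float) -> float:
    """Physical lower bound on one-way propagation delay, in ms."""
    return distance_km * 1_000 / C * 1_000

# Illustrative distances (assumptions, not from the article):
# two GPUs ~100 m apart in one hall vs. two satellites ~1,000 km apart.
print(f"{one_way_delay_ms(0.1) * 1000:.2f} us")  # sub-microsecond
print(f"{one_way_delay_ms(1000):.2f} ms")        # a few milliseconds
```

Even with perfect hardware, a 1,000-kilometer hop costs over three milliseconds one way, three orders of magnitude above the microsecond budgets that tightly coupled training interconnects assume.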

Then there is the cost. IEEE Spectrum estimated that a one-gigawatt orbital data center would cost $50 billion including five years of operation, about three times the cost of an equivalent ground-based facility. Google has said launch costs would need to fall below $200 per kilogram before space-based computing could make economic sense; SpaceX’s current Starlink economics work out to about $1,000 to $2,000 per kilogram. Some analysts argue that the true threshold for competing with terrestrial renewables is $20 to $30 per kilogram, a figure no credible forecast puts within reach in the next two decades. The picture looks even less favorable set against the deep-tech funding landscape on the ground, where infrastructure projects can draw on established supply chains and proven unit economics.
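A rough sensitivity calculation shows why the dollars-per-kilogram thresholds dominate the debate. The specific-power figure below is an assumption of this sketch, not a number from the article: roughly 10 watts of delivered computing power per kilogram launched, covering solar arrays, radiators, structure and electronics. Real designs could differ widely in either direction.

```python
# Launch-bill sensitivity for a 1 GW orbital facility.
# ASSUMPTION (not from the article): ~10 W of facility power per kg
# of launched mass, including panels, radiators, structure, chips.
FACILITY_POWER_W = 1.0e9
SPECIFIC_POWER_W_PER_KG = 10.0

mass_kg = FACILITY_POWER_W / SPECIFIC_POWER_W_PER_KG  # 100,000 tonnes

# The three price points discussed in the article: today's Starlink-era
# cost, Google's stated threshold, and the renewable-parity estimate.
for dollars_per_kg in (2_000, 200, 30):
    cost = mass_kg * dollars_per_kg
    print(f"${dollars_per_kg}/kg -> launch bill ${cost / 1e9:.0f}B")
```

Under this assumption, today's roughly $2,000 per kilogram implies a launch bill alone in the hundreds of billions of dollars for a gigawatt-class facility, which is why the concept hinges on launch prices falling by one to two orders of magnitude.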

Even OpenAI’s Sam Altman, who has invested billions in rocket manufacturer Stoke Space as a potential rival to SpaceX for orbital data centers, called the concept “ridiculous” for the current decade. Altman told reporters that the rough math of launch costs relative to ground power costs doesn’t work yet, and he asked exactly how anyone planned to fix a broken GPU in space.

The astronomical community adds an entirely separate objection. The vast majority of nearly 1,000 public comments on SpaceX’s FCC filing urged the commission not to proceed. If approved, the constellation would put more satellites in the sky than visible stars for large parts of the night throughout the year. Astronomers also warn of further militarization and commercialization of an orbital environment already strained under the weight of existing megaconstellations.

None of this means orbital data centers will never exist. SpaceX’s Starship, if it meets its cost targets, could fundamentally change the mass-to-orbit economics that currently make the concept unworkable. Starcloud’s incremental approach of flying small payloads and validating radiation performance in orbit is the kind of engineering path that sometimes produces breakthroughs. And the terrestrial grid constraints driving the interest are not going away.

But the gap between filing an FCC application for a million satellites and making orbital computing economically competitive with a warehouse full of GPUs in Iowa isn’t measured in years. It is measured in unsolved physics problems, and no pace of AI infrastructure investment, no matter how many billionaires are willing to try, offers a shortcut. The question scientists are asking is not whether space data centers are theoretically possible. It is why, given the magnitude of the unsolved engineering, anyone treats them as a near-term solution to a problem that requires near-term answers. It turns out the sky is not the limit. It is a radiator.


