Space-based AI compute has been positioned by the tech industry as a solution to several challenges facing current Earth-based datacenter infrastructure, including energy supply, cooling, and water. However, is the technology as feasible and attainable as it’s touted to be?
We spoke to Sean McDevitt, a Partner at consulting firm Arthur D. Little, who shared his insights on what the technology’s future could look like in the short and long term.
A Targeted, Modular Deployment
Expanding on the feasibility of orbital datacenters given current economic and technological realities, McDevitt said they were feasible only in “a limited sense”: today’s technology can support “specialized workloads,” but is not yet at “hyperscale or at a cost” that would let a satellite cluster replace terrestrial datacenters.
“Today, the strongest case is for targeted, modular deployments rather than giant clouds in space,” he said. McDevitt also shared that the promise of orbital datacenters was attractive because Earth-based AI infrastructure is running into bottlenecks like “long power interconnection queues, cooling and water constraints, permitting delays, etc.”
However, some of the enthusiasm and interest could be driven by “hype around one variable [energy], while underestimating all the others,” McDevitt said.
Challenges Along The Way
To scale orbital AI datacenters, the tech industry still faces several technological and economic challenges. “Thermal management, launch economics, radiation tolerance, networking, servicing, and refresh cycles” are all major hurdles, McDevitt said, as is building a “reliable, upgradeable, economically competitive compute system at scale.”
The potential debris caused by the datacenters, McDevitt shared, was also cause for concern. “If the industry starts talking about thousands of orbital compute assets, debris management moves from a side issue to a core design and regulatory requirement,” he said. McDevitt also shared that the satellites could increase collision risk unless there was “strong debris mitigation.”
Cooling is another issue that McDevitt sees posing a challenge for orbital datacenters. “In orbit, you do not cool by convection or evaporation,” he said. While that eliminates the need for water, it means thermal design takes center stage: “heat spreading, radiators, conservative power density, and workload scheduling” all become important, he said.
Potential For Physical AI And Autonomy?
As physical AI and autonomy efforts ramp up, McDevitt said he did see potential for orbital datacenters to work as an extension of ground-based systems for operating autonomous technology and physical AI. “The best use case is likely a distributed architecture,” he said, adding that orbital AI compute “will most likely be a complement to Earth-based infrastructure.”
He added that in such a system, orbital compute would support “global sensing, preprocessing, model updates, and resilience for autonomous systems on Earth, in the air, at sea, and in space.”
The technology’s best applications are in “Earth observation processing, RF signal analysis, spacecraft telemetry, in-orbit edge inference,” and similar workloads, as these processes are “space-native,” he said. However, he cautioned that “early [orbital datacenter] systems would have a shorter useful economic life” of roughly five to seven years, adding that commercial systems were still several years away.
Concerns With AI
McDevitt did have some concerns about the tech industry’s AI push and orbital datacenters. He worried about the “concentration of power” and “regulatory avoidance” the infrastructure could enable, and also flagged “cybersecurity, dual-use military applications,” and “governance gaps” over the control and ownership of compute infrastructure “beyond national borders.”
Raising alarm over privacy, McDevitt said that “there is also the possibility that some actors may see orbital infrastructure as a way to sidestep national legal and data rules.”
In the near term, however, orbital AI datacenters do not warrant massive investments such as deployments of thousands of AI satellites, he said. “The smarter near-term strategy is disciplined experimentation: identify orbit-native workloads, quantify avoided terrestrial bottlenecks, and pilot selectively,” he said.
Space Exploration?
The technology also holds potential for furthering humanity’s space exploration ambitions, according to McDevitt, as the infrastructure could help spacecraft with onboard autonomous operations, navigation support, and more.
“For exploration missions, local or near-local compute would potentially reduce the need to send every raw dataset back to Earth and could speed up decisions in bandwidth-constrained environments,” he shared.
He also shared that the technology could initially provide more value for space operations before it sees commercial applications for “mainstream cloud workloads.”
Photo courtesy: Alones from Shutterstock