Markets Deep Dive

Orbital AI Compute Gets Its First Institutional Series B: What Cowboy Space's $275M Raise Changes for Infrastructure...

$355M total raised
5 min read · SiliconANGLE · Partial · Moderate
Six months ago, orbital AI compute was a plausible thesis backed by early-stage money. Today it's a $2B company with Index Ventures, NEA, and IVP on the cap table. The question Cowboy Space's Series B actually answers isn't whether space-based data centers are technically feasible; it's whether institutional capital has decided the power crisis facing terrestrial AI infrastructure is severe enough to make orbit a serious alternative.

Key Takeaways

  • Cowboy Space's $275M Series B at $2B valuation is the first institutional round of this size for orbital AI compute, confirmed by SiliconANGLE page content
  • The power crisis driving this bet is verifiable: terrestrial data center permitting runs 3-7 years, and Epoch AI tracks hyperscaler capex trending toward $770B, a figure that assumes grid capacity that isn't being built fast enough
  • GPU/compute specs (~1MW, ~800 GPUs per module) remain company-stated only via TechCrunch, not independently verified; infrastructure architects shouldn't model on these figures until independent evaluation exists
  • First orbital deployment is the real validation event, not this raise; a successful LEO operational milestone creates comps for every subsequent orbital infrastructure raise

Funding Round

$275M
Company: Cowboy Space Corp.
Round: Series B
Lead Investors: Index Ventures, NEA, IVP + institutional co-investors
Valuation: $2B
Sector: Orbital AI Compute / Space-Based Data Centers
Total outside funding: $355M (cumulative across all rounds; $275M in this Series B)

The $355M number is the one that matters.

Not the $275M in this round. The cumulative $355M in total outside funding that SiliconANGLE confirmed when Cowboy Space Corp. disclosed its Series B. That’s a number large enough to actually build and launch orbital hardware. It’s not a proof-of-concept budget. It’s a deployment budget, and the investors who wrote the checks (Index Ventures leading, with NEA and IVP alongside) have underwritten that interpretation.

Cowboy Space builds AI data centers designed for low-earth orbit. The power source is orbital solar: direct sunlight, continuous exposure, no grid connection required. CEO Baiju Bhatt co-founded Robinhood Markets, which means he’s demonstrated he can take a technically complex, regulation-adjacent infrastructure product from concept to mass adoption. Index Ventures doesn’t need to be told who Bhatt is. They wrote the check knowing the pedigree.

Why Orbit? The Power Argument Made Concrete

Epoch AI’s 2026 tracking shows hyperscaler capex trending toward $770B annually, a figure that assumes terrestrial power infrastructure can keep pace. At the current rate of grid buildout, it probably can’t.

The U.S. electrical grid permitting process for large-scale data centers now runs 3-7 years in most jurisdictions. Power purchase agreements for renewable energy are oversubscribed in every major data center market. Water cooling requirements are generating regulatory pushback in drought-affected regions. These aren’t hypothetical constraints; they’re the reason Meta signed a 1GW space-based solar agreement and the reason fuel cell deployments are accelerating across the hyperscaler ecosystem.

Orbital solar doesn’t route through any of those bottlenecks. A module in LEO generates power continuously, at predictable output, without land permitting, grid interconnection queues, or water. The latency tradeoff is real: LEO introduces round-trip communication delays that make orbital compute unsuitable for applications requiring sub-10ms response times. But for batch inference, model training runs, and asynchronous agentic workloads, the latency is tolerable. Cowboy Space is positioning for exactly that segment of the market.
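The latency claim is easy to sanity-check with speed-of-light arithmetic. A minimal sketch, assuming a ~550 km orbit; Cowboy Space hasn’t disclosed an altitude, so the figures below are illustrative, not company numbers:

```python
# Minimum ground<->satellite propagation delay at LEO-like distances.
# The 550 km / 1900 km slant ranges are illustrative assumptions,
# not disclosed Cowboy Space figures.
C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

def round_trip_ms(slant_range_km: float) -> float:
    """Round-trip propagation delay in milliseconds (physics floor only)."""
    return 2 * slant_range_km / C_KM_PER_S * 1000

print(f"overhead pass: {round_trip_ms(550):.1f} ms")   # satellite directly overhead
print(f"near horizon:  {round_trip_ms(1900):.1f} ms")  # typical max slant range
```

Even before queueing, routing, and ground-segment hops, the physics floor sits in the low single-digit milliseconds directly overhead and climbs past 10 ms near the horizon, which is consistent with ruling out sub-10ms-sensitive applications while leaving batch and asynchronous workloads in scope.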

Three Models, Updated for May 2026

The framework for evaluating alternative AI compute power models has three branches: grid (traditional utility power, renewable PPA where available), fuel cell (on-site hydrogen or natural gas generation, bypassing grid interconnection), and orbit (LEO solar, bypassing terrestrial infrastructure entirely). A prior brief on these three models laid out the comparative landscape in April.

What’s changed since then:

*Grid:* Still the lowest-cost option at scale for established hyperscaler campuses where power is already contracted. The constraint is new capacity: permitting timelines mean new grid-connected data centers that break ground in 2026 won’t be operational until 2029 or later in most U.S. markets.

AI Data Center Power Model Comparison (May 2026)

  • Grid (terrestrial utility): Lowest $/watt-hr at scale; 3-7yr permitting for new capacity
  • Fuel Cell (on-site hydrogen/gas): Faster permitting than grid; supply chain dependent; bridge solution
  • Orbit (LEO solar): $2B institutionally backed; no grid; higher latency; deployment unproven

Verification

Status: Partial. Sources: SiliconANGLE (T3, page content confirmed); TechCrunch (cited by Wire, not in verification report). Core financials and company facts confirmed; ~1MW / ~800 GPU module specs are company-stated via TechCrunch only.

Orbital Compute Category, Key Players

  • Cowboy Space Corp. (for): Orbital AI compute hardware; $355M total funding; Series B at $2B valuation
  • True Anomaly (for): Defense-adjacent orbital compute; separate market segment
  • Overview Energy (for): Orbital power generation for terrestrial transmission; Meta 1GW agreement reported
  • Hyperscalers (Microsoft, Google, Amazon) (neutral): Evaluating orbital as grid-alternative overflow; no confirmed orbital compute contracts

*Fuel cell:* Gaining traction as a bridge solution. Faster permitting than grid in many jurisdictions, lower cost than orbit, but still dependent on hydrogen or gas supply chains. Several hyperscaler campuses are now deploying fuel cell capacity specifically for AI workload overflow.

*Orbit:* Before today, institutionally interesting but pre-Series B. After today, $2B valued and institutionally backed. That doesn’t make orbit cost-competitive with grid power per watt-hour; it isn’t, not yet. What it does is move orbit from speculative to evaluable. Infrastructure investors and enterprise architects can now model against a real company with a funded deployment roadmap.

The $355M total doesn’t include launch costs for operational scale, and Cowboy Space hasn’t disclosed its deployment timeline publicly. Those are the two variables that will determine whether orbital compute can actually compete with fuel cell on cost by 2028.
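The shape of that cost question can be sketched with a toy amortization model. Every input below is a placeholder for illustration (launch cost, module hardware cost, and lifetime are all undisclosed); the point is the structure: launch economics and on-orbit lifetime dominate whether orbit can approach fuel-cell cost per unit of delivered energy.

```python
# Toy cost-parity model. All dollar figures and lifetimes are illustrative
# placeholders, NOT disclosed Cowboy Space numbers. Only the 1 MW module
# power comes from the (company-stated, unverified) spec.

def orbital_usd_per_mwh(launch_usd: float, module_usd: float,
                        power_mw: float, lifetime_years: float,
                        capacity_factor: float = 0.95) -> float:
    """Amortized hardware + launch cost per MWh delivered by one module."""
    mwh_delivered = power_mw * 8760 * lifetime_years * capacity_factor
    return (launch_usd + module_usd) / mwh_delivered

# Placeholder scenario: $30M launch, $50M module hardware, 7-year lifetime.
baseline = orbital_usd_per_mwh(30e6, 50e6, 1.0, 7)
# Sensitivity: cheaper launch plus a 10-year lifetime.
cheap_launch = orbital_usd_per_mwh(10e6, 50e6, 1.0, 10)
print(f"baseline:     ${baseline:,.0f}/MWh")
print(f"cheap launch: ${cheap_launch:,.0f}/MWh")
```

Cost per token additionally folds in GPU throughput per watt, so parity with fuel cells hinges on both launch $/kg and module lifetime; in this structure, doubling lifetime roughly halves the amortized energy cost.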

The Competitive Landscape

Cowboy Space isn’t alone in the orbital compute category. True Anomaly, which closed a new round reported in May 2026 coverage, operates in the defense-adjacent orbital compute space. Overview Energy, which signed the reported 1GW space-based solar agreement with Meta, is in the infrastructure supply chain for orbital power. The category is forming around three distinct layers: compute hardware in orbit (Cowboy Space’s play), orbital power generation for terrestrial transmission (Overview Energy’s play), and defense-specific orbital compute (True Anomaly’s play). These aren’t direct competitors; they’re different bets on which part of the orbital value chain captures the most value.

Index Ventures chose the compute hardware layer. That’s a specific read on where the margin lives.

What the Baiju Bhatt Factor Actually Means

It’s worth naming directly: Bhatt’s Robinhood pedigree doesn’t guarantee Cowboy Space succeeds, but it meaningfully reduces the execution-risk discount that early-stage deep tech usually carries. Robinhood went from zero to regulated financial infrastructure at consumer scale. The operational complexity of that (regulatory navigation, infrastructure reliability, user trust) is legitimately transferable to the challenge of deploying AI compute hardware in orbit.

The investor thesis isn’t just “orbit is interesting.” It’s “this is the operator who could actually pull it off.” That’s a different investment than backing a technically credentialed team with no prior scaling experience.

Compute Specs: What’s Verified and What Isn’t

TechCrunch reported, and the Wire included, that Cowboy Space modules are designed to provide approximately 1 megawatt of computing power via roughly 800 onboard GPUs. That figure didn’t appear in the SiliconANGLE page content that forms the primary verification basis for this brief, and TechCrunch wasn’t in the source verification report. Treat the 1MW/800-GPU figure as company-stated specifications, not independently verified hardware performance. The distinction matters for enterprise architects evaluating this as a serious option: vendor-stated compute density and actual delivered performance in LEO are different numbers, and no independent evaluation exists yet.
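The vendor-stated density is at least internally consistent; a quick arithmetic check on the company-stated figures, treated here as claims rather than measured performance:

```python
# Sanity-check the company-stated module spec: ~1 MW powering ~800 GPUs.
# Both numbers are vendor claims via TechCrunch, not verified figures.
module_power_w = 1_000_000
gpu_count = 800

per_gpu_budget_w = module_power_w / gpu_count
print(f"{per_gpu_budget_w:.0f} W available per GPU, including all overhead")
```

A 1,250 W per-GPU budget sits above current flagship accelerator board power in the ~700-1,200 W range, but thermal management, avionics, and networking all draw from the same 1 MW envelope, so the implied compute density stays optimistic until independently measured.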

What to Watch

  • First operational LEO deployment confirmation: 2027-2028
  • Cost-per-token benchmarking vs. fuel cell for async workloads: post-deployment
  • Hyperscaler infrastructure partnership announcement: 18 months
  • Independent technical evaluation of module compute specs: post-deployment

Analysis

The Cowboy Space Series B doesn't resolve the orbit-vs-fuel-cell cost question; it funds the experiment. The critical threshold is 2028 cost-per-token parity with fuel cell solutions for asynchronous AI workloads. If Cowboy Space reaches that threshold with operational hardware, the orbital compute category becomes a permanent line item in enterprise infrastructure planning. If it doesn't, it stays a premium solution for power-constrained geographies and defense applications.

What Investors and Infrastructure Teams Should Track

The Series B validation is done. The open questions are operational.

First deployment date is the critical milestone. Everything about Cowboy Space’s investment thesis depends on demonstrating that orbital AI compute actually works at the module specs described. A successful first deployment (hardware in LEO, compute workloads running, power output confirmed) creates a comps anchor that every subsequent orbital infrastructure raise will price against.

Cost-per-token benchmarking against terrestrial alternatives is the second milestone. Once modules are operational, the question becomes whether orbital compute can reach cost parity with fuel cell solutions for async workloads. If it can by 2028, the addressable market expands significantly. If it can’t, orbital compute stays a premium niche for constrained geographies or defense workloads.

The third variable is the hyperscaler response. If Microsoft, Google, or Amazon evaluate orbital compute as a credible grid-alternative for overflow AI workloads, the market validation is complete. A hyperscaler infrastructure partnership announcement would be the signal that orbital compute has graduated from “serious startup” to “category.”

Don’t bet on a 2027 hyperscaler partnership. Do watch for the first operational deployment confirmation: that’s what converts this from a funded thesis to a proven infrastructure category.

More from May 13, 2026
