
Sam Altman didn’t step down from Helion’s board to avoid a conflict of interest. He stepped down because the conflict has already been won.
The real story isn’t about ethics or corporate governance. It’s about the final merger of compute and energy — the moment when AI stopped being a software business and became a utility play. OpenAI isn’t buying “electricity” from Helion. It’s buying the infrastructure of intelligence itself.
The $720 Billion Question Nobody Asked
Here’s what the headlines missed: when Axios reported OpenAI is in “advanced talks” with Helion for fusion power, every analyst treated it as a procurement story. Cute. Irrelevant.
The actual number you should care about is $720 billion — the collective AI infrastructure capital expenditure projected for 2026 alone. That’s not R&D. That’s not software development. That’s concrete, steel, cooling systems, and increasingly, power generation.
Microsoft has already committed to restarting Three Mile Island. Google signed a nuclear power purchase agreement with Kairos Power. Amazon bought a data center campus directly attached to a nuclear facility. And now OpenAI — the company that supposedly exists to “benefit humanity” — is bypassing the grid entirely by going straight to fusion.
This isn’t vertical integration. This is infrastructure capture.
The Physics of the Problem
Let’s be clear about what AI actually consumes. Training GPT-4 reportedly required roughly 50 million kilowatt-hours of electricity. That’s the annual consumption of a small city. Running inference at scale — the billions of queries hitting ChatGPT daily — multiplies that demand many times over.
But here’s the uncomfortable truth: the cost of intelligence is now measured in megawatts, not parameters.
Every major AI lab has hit the same wall. They can buy more GPUs (NVIDIA’s backlog is 18 months). They can hire more engineers (compensation packages now exceed $2 million for senior researchers). What they cannot buy is permission to use more power.
Data centers already consume 2% of global electricity. By 2030, that number could reach 8%. Grid operators are saying no. Environmental regulations are tightening. The only solution left is to become the power company.
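The back-of-envelope math above can be sketched. All figures are the article’s own rough estimates plus one ballpark assumption (typical US household consumption); none are measured data:

```python
# Back-of-envelope energy math. All figures are rough estimates,
# not measured values.

TRAINING_KWH = 50_000_000       # ~50 GWh claimed for a GPT-4-class training run
US_HOME_KWH_PER_YEAR = 10_500   # ballpark annual consumption of a US household

homes_for_a_year = TRAINING_KWH / US_HOME_KWH_PER_YEAR
print(f"One training run ~= a year of electricity for {homes_for_a_year:,.0f} homes")

# Data-center share of global electricity: ~2% today, up to ~8% by 2030.
GLOBAL_TWH = 25_000             # rough global annual electricity generation, TWh
today_twh = 0.02 * GLOBAL_TWH
projected_2030_twh = 0.08 * GLOBAL_TWH
print(f"Data centers: ~{today_twh:,.0f} TWh today -> ~{projected_2030_twh:,.0f} TWh by 2030")
```

The point of the sketch is the ratio, not the absolute numbers: a single frontier training run sits in household-city territory, and the projected 2030 data-center load quadruples today’s.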
The Helion Gambit
Helion Energy is not a typical fusion startup. It’s building a direct energy capture system — no steam turbines, no thermal conversion. If it works (a massive “if”), it would be the first commercially viable fusion reactor in history.
But Altman’s relationship with Helion predates OpenAI. He invested $375 million in the company back in 2021. He served as board chair. He pushed Helion toward a specific vision: small, modular, fast-deployable fusion units designed for one customer — AI data centers.
The “recusal” is a legal fiction. OpenAI isn’t negotiating with an arms-length vendor. It’s negotiating with a company that was built around Altman’s vision of AI-energy convergence. The only thing that changed is that now OpenAI needs to formalize what was always inevitable.
The End of Software Margins
Software companies historically enjoyed 80-90% gross margins. Write code once, sell it infinitely. The marginal cost of serving one more user was effectively zero.
That era is over.
Running a single ChatGPT query costs roughly $0.003 in compute. At billions of daily queries, that adds up to millions of dollars a day, over a billion a year, in operational costs. The economics of AI have inverted the software playbook:
- Cost structure: Hardware + Energy + Maintenance (capital-intensive)
- Revenue model: Subscription (recurring but thin-margin)
- Moat: Not code — it’s access to power
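The inversion is easy to see in numbers. A minimal sketch, using the article’s $0.003-per-query figure and an assumed volume of one billion queries a day:

```python
# Inference unit economics, sketched from the article's own per-query figure.
# Query volume is an assumption ("billions daily" -> take 1 billion).

COST_PER_QUERY = 0.003            # rough compute cost per query, USD
QUERIES_PER_DAY = 1_000_000_000   # assumed daily query volume

daily_cost = COST_PER_QUERY * QUERIES_PER_DAY
annual_cost = daily_cost * 365

print(f"Daily compute cost:  ${daily_cost:,.0f}")
print(f"Annual compute cost: ${annual_cost:,.0f}")
```

Classic software had a marginal cost of serving near zero; here the marginal cost alone compounds into a billion-dollar annual line item before a single GPU is depreciated or an engineer is paid.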
This is why Microsoft, Google, and Amazon are all buying nuclear assets. They’re not hedging against energy prices. They’re securing the only input that actually limits their growth.
The New Infrastructure Feudalism
What emerges from this convergence is a new hierarchy:
Tier 1 — Energy-AI Sovereigns: Companies that own both compute AND power generation (Microsoft, Google, Amazon, now potentially OpenAI-Helion).
Tier 2 — Compute Tenants: Companies that own models but rent infrastructure (Anthropic, xAI, most startups).
Tier 3 — Model Refugees: Companies that use API access to Tier 1/2 models, paying rent on intelligence they don’t control.
Apple’s recent deal with Google perfectly illustrates the feudal structure. Apple — the world’s most valuable company — is distilling Google’s Gemini to train smaller models for on-device inference. They’re not building foundation models. They’re becoming a refinery for Google’s crude.
Even Apple, with $200 billion in cash, cannot compete at the infrastructure layer. That’s the new reality of the AI economy.
The $1,000 GPU Nobody Wanted
Intel’s new Arc Pro B70, priced at $949, reveals the same logic from the hardware side. With 32GB VRAM and “up to 32 Xe2 cores,” Intel explicitly positioned this as an AI workstation card. Gaming is an afterthought.
The launch coverage buried the real story: “Gaming-focused versions of these cards sure would sound swell.” Translation: We can’t compete with NVIDIA in gaming GPUs, but we can sell you a card that runs local LLMs.
This is the hardware equivalent of the software margin collapse. General-purpose computing is being cannibalized by AI-specific silicon. The consumer PC market — once the engine of technological progress — is now a secondary consideration for chipmakers.
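Why 32GB of VRAM targets local LLMs rather than gamers comes down to simple sizing arithmetic. A rough sketch, where the model sizes, quantization widths, and overhead factor are all illustrative rules of thumb, not vendor specifications:

```python
# Rough sizing: which local LLMs fit on a 32 GB card?
# Parameter counts, bit-widths, and the overhead factor are rules of thumb.

def vram_needed_gb(params_billions: float, bits_per_weight: int,
                   overhead: float = 1.2) -> float:
    """Weights-only footprint plus a fudge factor for KV cache/activations."""
    bytes_for_weights = params_billions * 1e9 * bits_per_weight / 8
    return bytes_for_weights * overhead / 1e9

CARD_GB = 32  # the workstation card discussed above

for model_b, bits in [(7, 16), (13, 8), (70, 4)]:
    need = vram_needed_gb(model_b, bits)
    verdict = "fits" if need <= CARD_GB else "does not fit"
    print(f"{model_b}B model at {bits}-bit: ~{need:.1f} GB -> {verdict} in {CARD_GB} GB")
```

A 7B model at full 16-bit precision or a quantized 13B model fits comfortably; a 70B model does not even at 4-bit. That sizing band, mid-size open models at moderate quantization, is exactly the market a $949, 32GB card addresses.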
What Zuckerberg Understood
Meta’s quiet project to build a “CEO agent” for Mark Zuckerberg is the logical endpoint. If intelligence requires energy, and energy is finite, then efficiency becomes the only competitive advantage.
A CEO agent that retrieves information “he would typically have to go through layers of people to get” is not a productivity tool. It’s a layer eliminator. Every executive, middle manager, and knowledge worker who serves primarily as an information router is now redundant.
Zuckerberg isn’t building AI to help him be a better CEO. He’s building AI to replace the information supply chain that makes modern management necessary.
The Personal Verdict
I’ve spent years analyzing AI as a technology story. The models, the benchmarks, the training runs — it was all distraction.
The real story was always physics.
AI companies are behaving like 19th-century railroad tycoons. They’re not competing on routes or schedules. They’re competing on who owns the land, the steel, and the coal. The “intelligence” part — the actual models — is becoming a commodity layer on top of infrastructure oligarchies.
When Altman stepped down from Helion’s board, he wasn’t resolving a conflict. He was formalizing a convergence that has been obvious for years: the future of AI is not software. It’s nuclear.
The question isn’t whether fusion will power AGI. The question is whether you — the user, the developer, the investor — will ever see the power bill.
The Infrastructure Hawk watches from the cooling tower.