The Moonshots crew opened Ep. 247 with the Musk v. Altman lawsuit. They closed it in Star Trek uniforms doing a boy-band skit. Between those two beats sat the most coherent argument the panel has made in months: the binding constraint on the singularity has flipped from intelligence to infrastructure, and the next twelve months belong to whoever owns the plumbing.
Three numbers anchor that read. $3 billion a day in AI investment. A $100 billion lawsuit against a CEO with no equity. A 99% answer when an MIT panel was asked how many white-collar jobs AI can replace inside two years. Each one points at the same thing. The capability is here. The systems built around it are not.
Jury selection in Musk v. Altman begins April 27 in Oakland federal court. Greg Brockman’s 2017 diary entry, surfaced in discovery, is the document that lets the case proceed at all. The diary calls the nonprofit commitment a lie. Judge Yvonne Gonzalez Rogers let the case move forward on that single piece of evidence.
Dave Blundin’s prediction is concrete. They settle, Sam steps down as CEO, the for-profit continues. Alex Wissner-Gross’s read is structural. The defense has a real argument that the OpenAI of today is the only OpenAI that could have funded $130 billion worth of research. The plaintiffs have a real argument that, absent precedent, taking nonprofit capital and converting it to PBC equity is fraud.
There is no precedent. The case is going to set one for every research lab that wants to do what OpenAI did. Alex’s calculation puts Harvard’s value at three to four times its current level under PBC reincorporation. If the trial blesses the conversion path, the unlock across American research universities is on the order of a trillion dollars. If it does not, the next generation of frontier labs will be born as PBCs from day one, and Anthropic’s structure becomes the template.

Anthropic’s lead is now structural
Secondary-market demand for Anthropic is $2 billion. For OpenAI, $600 million. The implied prices are $600 billion for Anthropic, up from $380 billion last round, against an OpenAI secondary trading roughly 10% below its last $852 billion print. Anthropic is at $30 billion ARR. OpenAI sits at $24 to $25 billion. The lead changed hands while everyone was watching the lawsuit.
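The arithmetic behind those implied prices is easy to check. A minimal sketch, using only the figures quoted above (the 10% discount and the round valuations are the episode's numbers, not exchange quotes):

```python
# Sanity-check the secondary-market math quoted above.
# All figures are in $B and come from the episode, not from any live feed.
openai_last_print = 852.0      # last primary round valuation
secondary_discount = 0.10      # secondary trading "roughly 10% below" that print

openai_implied = openai_last_print * (1 - secondary_discount)

anthropic_last_round = 380.0   # prior round valuation
anthropic_implied = 600.0      # implied secondary price

step_up = anthropic_implied / anthropic_last_round - 1

print(f"OpenAI implied secondary price: ~${openai_implied:.0f}B")   # ~$767B
print(f"Anthropic step-up over last round: {step_up:.0%}")          # 58%
```

The point of the exercise: one name is trading at a 58% premium to its last round while the other trades below its print, which is what "the lead changed hands" means in price terms.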
Salim called Claude Managed Agents the pivot from “AI that answers” to “AI that does.” Alex called it the race to become the de facto OpenClaw provider, the headless 24/7 multimodal long-horizon agent host, before OpenAI can. Anthropic’s $400 million acqui-hire of Coefficient Bio in eight months and the $2.75 billion Eli Lilly deal with Insilico Medicine, with phase-one success rates at 85% versus 52% for traditional drug discovery, are not separate stories. They are the same story. The compute lead converted into an enterprise lead, which converted into a biology lead.
The Anthropic structural advantage is not the model. It is that compute scarcity early forced focus on recursively self-improving codegen, and codegen turned out to be the unhobbling. Every other capability falls out of that one.

The white-collar timeline already collapsed
Dave moderated a panel at MIT the morning of recording, with Peter Norvig, formerly of Google, and Alexander Amini of Liquid AI on stage. The prompt: take a random white-collar worker today and give AI two years; what are the odds it can replace them at 10x productivity? Norvig answered 99%. Amini’s correction was sharper. That is today, not in two years.
Andreessen’s “AI job loss is fake” position and the Q1 2026 layoff data, 80,000 layoffs reported with open software roles up 30% to 67,000, are both true at different timescales. By 2030 the Industrial Revolution comparison probably holds. Between now and 2030 it does not, because the Industrial Revolution had decades of friction to absorb the dislocation. The current curve has 24 months.
The political response that fits inside that window is what Andrew Yang told the panel at A360. Politics writes checks. It cannot write thoughtful policy. Dave’s prediction for the first social-contract update is therefore a UBI bidding war in the next election cycle. Alex called the redistributive frame a failure of imagination. The disagreement is not whether something is coming. It is whether the private sector can turn the technologically unemployed into macro-entrepreneurs faster than politicians can write checks.
The supply chain is winding its own motors
Chase Lockmiller, building gigawatt data centers in Abilene for Stargate, told Dave he is melting metal to make electronic components because there is no supply chain for what he needs. Brett Adcock at Figure has said the same. Bernt Bornich at 1X has said the same. Unitree’s IPO is $610 million, which would have been a basement filing on Wall Street six months ago and now is one of the larger robotics raises of the year.
Agibot has shipped 10,000 humanoids. Xiaomi displayed CyberOne. UniXAI launched a home robot. Alex’s read is that China owns physical-unit production while the US owns the foundation models, and the question is whether China can build a robot foundation model lab faster than the US can rebuild a robotics manufacturing base. He spent an hour the morning of recording walking a Unitree around the MIT Media Lab as a quasi-shame play directed at US robotics startups.
Liquidity is the same problem at higher altitude. UBS CIO Ulrike Hoffmann-Burchardi told Dave at a private lunch that the $242 billion in Q1 AI investment cannot be sourced from idle capital. Things have to get sold. If you are a public industrial or a regional bank, you are the source of funds for the AI capex cycle, not the recipient.
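That quarterly figure also reconciles with the "$3 billion a day" framing at the top of the piece. A quick consistency check, where the 90-day quarter is my assumption rather than anything stated on the pod:

```python
# Convert the quoted Q1 2026 AI investment into a daily run rate.
# $242B for the quarter is the episode's number; the 90-day quarter is an assumption.
q1_investment_b = 242.0   # $B invested in Q1 2026
days_in_quarter = 90

per_day = q1_investment_b / days_in_quarter
print(f"~${per_day:.1f}B per day")   # ~$2.7B/day, roughly the "$3B a day" headline
```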

Stores of wealth in a singularity economy
Alex does not hold gold. He does not hold Bitcoin. His portfolio is index funds plus startup equity, and that is it. The argument is that any non-productive store of wealth is a bet that capital allocation moves slower than the singularity, and that bet is now structurally bad.
His Bitcoin take is the sharper one. The community’s quantum-decryption fear is the wrong fear. The actual existential risk for Bitcoin is irrelevance. AI agents, given speed and a clean slate, will invent their own currencies. A Bitcoin Policy Institute study finding six out of ten preferring Bitcoin today is the kind of result that does not survive contact with agent-to-agent transaction volume. Mike Saylor’s “quantum hardens Bitcoin” position assumes the upgrade path runs faster than agent-native alternatives. That assumption is doing a lot of work.
The hot take Alex landed at the end of the segment was post-scarce land. Coastal Assembly, where he has a financial interest, is using AI to grow new coastline. If land becomes post-scarce in the singularity, the asset class that drove most of the last century of household wealth flips overnight.
The honest counterargument
The strongest read against the panel is that none of this has actually happened yet. The trial has not started. Mythos has not shipped. OpenAI’s $120 billion in cash has not been deployed. The Andreessen camp on jobs may end up being right at every timescale that matters politically, because the layoff data is concentrated in a few sectors and the new-job data is concentrated in others. The 99% answer at MIT was a panel with two people on stage and a buzzy prompt.
The panel’s own data shows the dispersion. Salim’s read on Iran, oil prices, and global macro is that AI is a contributing factor, not a sole cause. Dave’s MicroStrategy-only Bitcoin position is a real hedge against Alex’s irrelevance argument. Peter’s tour-guide story from Morocco, where ChatGPT helped a guide build a bicycle business, is the abundance counter to the dislocation framing.
You can read the whole episode as a forecast that is half-right by 2027 and three-quarters-right by 2030. That is still the most consequential forecast available, because the half that lands first is the corporate-form precedent and the agent-economy plumbing.
The trial starts in two weeks. The plumbing builds quietly underneath. The interesting question is no longer whether intelligence arrives. It is who gets to own what comes after.
Sources
- Moonshots with Peter Diamandis, Episode #247: Elon Musk vs. Sam Altman, AI Job Loss, and OpenAI’s $852B Valuation. Episode date April 14, 2026. Hosts: Peter Diamandis, Salim Ismail (OpenExO), Dave Blundin (Link Ventures), Dr. Alexander Wissner-Gross.
- Musk v. Altman, US District Court, Northern District of California, Oakland Division. Trial date April 27, 2026. Judge Yvonne Gonzalez Rogers presiding.
- Q1 2026 global VC investment in AI: $242 billion, 64% concentrated in OpenAI, Anthropic, xAI, and Waymo.
- Anthropic vs OpenAI ARR: $30B vs $24 to $25B; secondary-market demand $2B vs $600M.
- Insilico Medicine deal with Eli Lilly: $2.75 billion total, $115 million up front, balance on milestones; phase-one AI-discovered drug success rates 85% vs 52% traditional, phase-two 70% vs 38%.
The Moonshots crew spent two hours building an “infrastructure not intelligence” frame. The frame is half-right. The half it gets wrong is the half worth inspecting.
The trial may set a narrow precedent
Musk v. Altman is being read as the case that decides whether nonprofit-to-PBC conversion is legal. It probably is not. Settlements rarely set durable precedent. Sam stepping down and the company continuing as a for-profit is what Dave predicts, and that outcome is silent on the legal question for everyone outside this exact set of facts.
The Harvard PBC calculation Alex ran is interesting math. It is also a thought experiment that requires every state attorney general to bless the idea, hundreds of donor lawsuits to fail, and tax-exempt regulators to rewrite a century of guidance. None of that is gated on the OpenAI trial. The trial may matter for OpenAI and not much else.
The honest read is that this is a high-profile case with limited generalizability. The press cycle will be loud. The case law that comes out the other side will be narrow.

Anthropic’s lead is fragile
$30B vs $24 to $25B in ARR is real. So are $120 billion in cash sitting on OpenAI’s balance sheet, 900 million users, and a CEO with a track record of making the right pivot under pressure. Secondary-market demand and implied valuation are also lagging indicators. Sora was a strategic miss. Sora is also fixable in a quarter, and the Disney deal cancellation is a write-down, not a death sentence.
The Coefficient Bio and Insilico numbers are exciting in a press release. They are not yet shipped products. Anthropic’s lead is real on the metrics that compound monthly. It is fragile on a 12-month horizon, especially if SPUD or Codex closes the codegen gap. The structural-lead framing assumes the current trajectory holds across product, governance, and capital. Each of those is a non-trivial assumption.

99% is a panel answer, not a labor-market one
Dave’s 99% number is from a panel of two people on stage responding to a leading prompt. Norvig is one of the smartest people in computer science. He is also at a conference, fielding “give me a number” questions, with Liquid AI’s founder next to him. The room rewards confident answers.
The actual labor-market data is messier. 80,000 layoffs is a real number. So are software-engineer postings, up 30% to 67,000. Salim’s read about Iran, oil prices, and macro effects is the more careful one. Attribution is hard, a point reinforced by Dave’s own observation that he could not reconcile the layoff data with the new-grad hiring data.
The 99% answer is sticky because it is dramatic. The labor data is sticky because it is messy. Dramatic numbers tend to overpredict short-term change and underpredict long-term change. Andreessen’s case may be wrong on cadence and right on direction, and the right play is to model both timelines instead of picking one.
The supply chain story is partial
Chase Lockmiller winding his own motors is a great anecdote. It is also one company in one stack at one moment. The broader supply chain runs through TSMC, ASML, Samsung, and a handful of bottlenecks the panel covered earlier on Intel-NVIDIA TerraFab. Those bottlenecks are not solved by hard-tech founders refusing to wait for vendors. They are solved by capital deployment at the foundry layer, which is exactly what is happening at the geopolitics scale.
China shipping 10,000 humanoids is impressive. It is also a small number against a labor force of hundreds of millions. The “Chinese physical units, US foundation models” frame Alex used is real on a quarterly view. It is unlikely to hold past 2027, when Chinese foundation models close the gap and Chinese units start to look more like commodity hardware than strategic capability. The right way to read this gap is as a transient inflection, not a structural moat.

Investment doctrine is over-confident
Alex’s “no gold, no Bitcoin, only index funds and startup equity” position is logically clean. It also assumes that the singularity arrives on his investment horizon and that public markets price superintelligence faster than private markets do. Both are reasonable assumptions. Neither is established.
The Bitcoin-as-irrelevance case is sharper than the quantum case. It is also an argument from speed about a system explicitly designed to be slow. Bitcoin’s value proposition is not transaction throughput. It is monetary policy resistance, custody resilience, and a settlement layer that trades latency for assurance. AI agents inventing their own currencies is plausible. AI agents settling final value into something that does not have ten thousand independent validators is less plausible.
Coastal Assembly making land post-scarce is a long-tail bet. Calling it the obvious endgame on a podcast is overcommitting to a single startup’s thesis.
The honest read
The “infrastructure not intelligence” frame is a useful correction to capability-only thinking. It is also a frame that will be partially refuted by the next big release, the way every frame on a quarterly podcast eventually is.
The trial may set a narrow precedent or a broad one. Anthropic’s lead may compound or evaporate. The 99% number may be 60% in the labor data and 40% in the political response. The supply chain may bottle up at a foundry layer the panel did not examine in depth.
What the panel got most right is timing. The next twelve months matter. What it got most wrong is certainty. The plumbing thesis is a useful prior. It is not a forecast.
The Moonshots crew spent two hours arguing whether the singularity is bottlenecked on intelligence or on the systems around it. The argument is already settled. Capability is here. Whoever owns the legal, organizational, supply-chain, and cognitive plumbing wins the next decade.
If your roadmap is still about benchmarks, you are already behind.
The trial is the precedent
Jury selection in Musk v. Altman begins April 27 in Oakland. Greg Brockman’s 2017 diary entry, the one that calls the nonprofit commitment a lie, is the document that lets the case proceed. There is no case law on whether converting nonprofit assets to PBC equity is legal. This trial sets it.
Dave’s prediction is the right one. They settle, Sam steps down, the for-profit continues. The settlement structure determines whether the corporate-form precedent is restrictive or permissive. Permissive unlocks a trillion dollars of stranded value across American research universities. Alex ran the Harvard PBC math live on the pod, three to four times current book value. Restrictive forces the next generation of frontier labs to incorporate as PBCs from day one and ends OpenAI’s organizational template.
Either way, the lesson for any research lab considering a similar conversion is binary. Move now or wait until precedent locks the path.

Anthropic is no longer catching up
$30B ARR vs $24 to $25B. $2B in secondary-market demand vs $600M. $600B implied valuation against an OpenAI secondary trading 10% below its last round. Claude Managed Agents shipped. The $400M Coefficient Bio acqui-hire closed in eight months. The Eli Lilly Insilico deal at $2.75B has phase-one success rates of 85% versus 52% traditional, phase-two 70% versus 38%.
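Those phase rates compound. A small sketch of what the quoted numbers imply for a drug clearing both phases, assuming phase outcomes are independent (a simplification; real trial outcomes are correlated, so treat this as an upper-bound illustration):

```python
# Probability of clearing both phase one and phase two, using the quoted rates
# and assuming phase outcomes are independent (a simplifying assumption).
ai_p1, ai_p2 = 0.85, 0.70          # AI-discovered drugs, phase one and two
trad_p1, trad_p2 = 0.52, 0.38      # traditional discovery, phase one and two

ai_through_p2 = ai_p1 * ai_p2          # 0.595
trad_through_p2 = trad_p1 * trad_p2    # ~0.198

print(f"AI pipeline through phase two:          {ai_through_p2:.1%}")
print(f"Traditional pipeline through phase two: {trad_through_p2:.1%}")
print(f"Relative advantage: {ai_through_p2 / trad_through_p2:.1f}x")   # 3.0x
```

Under that assumption, the per-phase edges multiply into roughly a 3x advantage end to end, which is why the Lilly deal is priced the way it is.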
Anthropic compounded a single advantage, recursive self-improvement on codegen, into every adjacent vertical. OpenAI’s $120 billion in cash and 900 million users are still real, and Sam Altman is still a generational operator. Neither fact changes the structural picture. The lead changed hands while the cap-table soap opera was on.
If you sell into enterprise and your capability roadmap assumes parity between vendors, you are pricing yesterday’s market.

White-collar replacement is here
Norvig answered 99% on Dave’s panel at MIT. Amini’s only correction was that two years is the wrong horizon, today is. The Q1 layoff data shows 80,000 jobs gone, concentrated in marketing, sales, and consumer relations. Software-engineer postings up 30% in the same period are the recursive self-improvement signal. AI is not slowing white-collar replacement. It is accelerating it while building the next layer of jobs faster than the bureau can count them.
The political plumbing for what comes next is the constraint, not the capability. Andrew Yang’s read at A360 is correct. Politicians can write checks. They cannot legislate retraining. Whoever turns the technologically displaced into operators of agent fleets first wins the political and economic alignment of the 2028 cycle. The companies that figure that out, not the ones still arguing whether jobs are net negative, are the ones that scale through this.
The supply chain decides the next layer
Chase Lockmiller is melting metal in Abilene because there is no supply chain for what gigawatt data centers need. Bernt Bornich is winding his own motors at 1X. Brett Adcock is doing the same at Figure. Unitree’s $610M IPO is small only by old-economy standards.
Agibot at 10,000 humanoids shipped. Xiaomi CyberOne. UniXAI’s home robot launch. China is producing physical units while the US produces VLA foundation models. Whichever side closes the other’s gap first owns the next decade of physical labor automation. UBS’s CIO told the panel that the $242B Q1 AI investment cannot be sourced from idle capital. Things have to be sold. If you are a public industrial or a regional bank, you are the source of funds. If you are a hard-tech founder, you are inheriting the receiving end of a forced reallocation.

Stores of wealth flip
Alex’s portfolio is index funds plus startup equity. No gold, no Bitcoin. The case is that any non-productive store of wealth bets capital reallocation runs slower than the singularity, and that bet is structurally wrong now.
The cleaner take is on Bitcoin specifically. The risk to Bitcoin is not quantum decryption. It is irrelevance. Agent-to-agent transaction volume will invent its own currencies. Saylor’s “quantum hardens Bitcoin” take depends on the upgrade path running faster than agent-native alternatives, and there is no reason to assume that.
The post-scarce land argument lands the segment. Coastal Assembly is using AI to grow new coastline. If land becomes post-scarce in the singularity, the asset class that drove household wealth for a century flips. Position accordingly.
What to do this quarter
Three things follow if you take the panel seriously.
Stop benchmarking against released models. The internal-versus-released gap is now wide enough to matter. Roadmap against leaked numbers, not API ones.
Treat codegen as the unhobbling, not a vertical. If your product still ships AI as a copilot, your spend is wrong by an order of magnitude.
Build for the agent economy. The supply-chain gap and the legal-form question are both more interesting than the next benchmark, and the operators who build there will be on the right side of the next twelve months.
The trial starts in two weeks. The plumbing builds underneath. The interesting question is no longer whether intelligence arrives. It is who gets to own what comes after.