Several long-running technology bets resolved in a single week, not as forecasts but as measurable numbers. Episode 240 of Moonshots brought Peter Diamandis, Dave Blundin, Salim Ismail, and Dr. Alexander Wissner-Gross (AWG) together to read the ledger after NVIDIA’s GTC 2026, the Abundance Summit in Palos Verdes, and a week of data that forced several of those debates to a close.
Jensen Huang’s Trillion-Dollar Vision
NVIDIA’s GPU Technology Conference drew 30,000 attendees to San Jose, and Jensen Huang’s keynote outgrew the convention center and moved to SAP Center. The headline was direct: NVIDIA is on track for $1 trillion in revenue by 2027. Dave Blundin clarified that the figure represents bookings recognized over the life of contracts, spread across two years, but $350 billion this calendar year already makes the direction clear. TSMC capacity is the only constraint left.
Jensen also announced NVIDIA’s support for OpenClaw. The chart he displayed put the scale in perspective: OpenClaw surpassed in weeks what Linux accumulated on GitHub over 30 years. AWG noted that each AI unhobbling, from ChatGPT to reasoning models to OpenClaw, arrives faster than the last, because each one builds on everything that came before it. The stack compounds.

NVIDIA positioned itself as infrastructure for physical AI: robotaxi partnerships with BYD, Hyundai, Nissan, and Uber; an AI-RAN agreement with T-Mobile; orbital data centers planned alongside SpaceX. Peter asked how long before regulators treat NVIDIA as critical infrastructure. AWG’s answer: export controls on NVIDIA compute are already heavy, and GTC 2026 is, in its scope, the Western response to China’s AI five-year plan.
The Inference Explosion
Sam Altman stated at the Abundance Summit that the cost of getting the same answer to a hard reasoning problem dropped 1,000x between O1 and GPT-5.4 in 16 months. AWG confirmed the number is consistent with the 40x annual hyper-deflation the crew has tracked, and attributed most of the gain to reasoning models specifically. Before O1, almost no effort went into scaling inference-time compute. The overhang was enormous.
Dave pressed on why inference scales so much faster than training. The answer: compute spent on inference was near zero before reasoning models, so orders-of-magnitude scaling costs almost nothing in absolute terms. That free lunch is ending. Some frontier models now spend more compute on inference than training, and from here, efficiency gains will require genuine architectural breakthroughs, not just more tokens at reasoning time.
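The consistency claim can be sanity-checked with simple compounding. The split below between baseline deflation and reasoning-specific gains is a back-of-envelope sketch, not a breakdown from the episode, and it assumes the 40x annual rate compounds smoothly:

```python
# Sketch: how much of the 1,000x drop does baseline deflation explain?
# Assumption (ours, not the episode's): the 40x annual rate compounds smoothly.
annual_deflation = 40            # crew's tracked annual cost hyper-deflation
months = 16                      # O1 -> GPT-5.4 window cited by Altman
observed_drop = 1_000

baseline = annual_deflation ** (months / 12)  # general deflation alone
residual = observed_drop / baseline           # left over, on top of the trend

print(f"baseline over {months} months: {baseline:.0f}x, residual: {residual:.1f}x")
```

Under that assumption, smooth 40x-a-year deflation alone compounds to roughly 137x over 16 months, so the cited 1,000x implies an additional ~7x on top of the baseline trend.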
“We did a thousand X. What are you expecting in the next year? Oh, 2X?” — Dave Blundin

Peter Diamandis put the case for abundance directly: six billion people hold smartphones, AI inference costs are collapsing, and intelligence will be available to everyone. The remaining gap is not compute. It is finding the consumer use case that makes individuals use reasoning as aggressively as enterprises already do.
Anthropic’s Enterprise Surge
Time magazine named Anthropic the most disruptive company in the world. The supporting data is specific: Anthropic now captures 73% of first-time enterprise customers. OpenAI sits at 26%. Three months ago, those numbers were roughly reversed, 60% for OpenAI and 40% for Anthropic.
“This is an absolute ass kicking.” — Dave Blundin
The crew traced the shift to strategy. Sam Altman is building an empire across chip design, consumer devices, and Stargate data centers while running a frontier AI lab. Dario Amodei stayed focused on the software and the enterprise. AWG offered the structural explanation: Anthropic was forced into enterprise focus by resource constraints early on, then turned that constraint into a coherent identity. OpenAI’s original bet was that consumers would need reasoning compute as urgently as enterprises do. That bet was wrong.

OpenAI is now scaling back Stargate. The $1.6 trillion data center plans are being throttled, and the company is switching from building its own facilities to renting existing capacity. AWG cautioned against writing any epitaph: GPT-5.4 Pro is strong, Codex is growing rapidly, and all five frontier labs will likely be worth trillions eventually. But the quarter belongs to Anthropic.
TerraFab and the Chip Bottleneck
Elon Musk announced TerraFab, a chip fabrication initiative starting at 100,000 wafer starts per month and targeting one million, roughly 70% of TSMC’s current annual global output. Dave Blundin explained the wall: ASML produces around 700 EUV machines per year and might push to 1,000 under maximum effort. Elon’s response to that hard ceiling will define how fast TerraFab actually scales.
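The wafer figures can be cross-checked with unit conversion alone. The implied TSMC total below is derived from the episode's own numbers, not reported independently:

```python
# Sketch: TerraFab targets vs TSMC output, using only figures from the episode.
terrafab_monthly = 1_000_000                 # target wafer starts per month
terrafab_annual = terrafab_monthly * 12      # 12 million wafer starts per year
share_of_tsmc = 0.70                         # "roughly 70% of TSMC's annual output"

# Derived, not a reported figure: TSMC annual output if the 70% claim holds.
implied_tsmc_annual = terrafab_annual / share_of_tsmc

scale_up = terrafab_monthly / 100_000        # initial 100k starts -> 1M target
print(f"implied TSMC output: {implied_tsmc_annual / 1e6:.1f}M wafers/yr; "
      f"TerraFab scale-up: {scale_up:.0f}x")
```

If the 70% figure holds, TSMC's implied output is about 17.1 million wafer starts per year, and the stated ramp from 100,000 to one million is a clean 10x.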
AWG added the geopolitical case. If TerraFab succeeds in scaling domestic chip production in the United States, it reduces dependency on TSMC in Taiwan and materially reduces the incentive for a Chinese military move on the island. The strategic value of that outcome, AWG argued, may outweigh the direct business economics of the fab itself.
Get Physics Done
AWG announced Get Physics Done (GPD), an open-source agentic physicist released by Physical Superintelligence under the Apache 2.0 license. Within days of launch, the former chair of Harvard’s Astronomy Department recommended that all faculty, postdocs, and students begin using it. VCs flooded AWG’s inbox. Someone is already using GPD to design a rocket engine for the Future Vision XPRIZE.
“Math is cooked. PSI is cooking physics.” — Dr. Alexander Wissner-Gross
The goal is to compress decades of physics research into years. AWG has argued that there has been a drought of genuinely new physics since the early 1970s. GPD, combined with the reasoning model revolution, is the first serious tool aimed at ending it. The project is available at psi.inc under an open license.
CS Placement: From 89% to 19%
A professor shared cohort data across three years of his own graduating classes. Fall 2023: 89% placed, median salary $94,000. Spring 2024: 71%. Fall 2024: 43%. Spring 2025: 31%. Spring 2026: 19%, median salary below $61,000.
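Treating the five cited cohorts as consecutive data points (the series skips Fall 2025), the decline is not one bad semester but a steady compounding decay. The calculation below is ours, run on the professor's numbers:

```python
# Sketch: per-cohort decay of the placement rate, from the cited figures.
# Assumption: cohorts are comparable; note the cited series skips Fall 2025.
cohorts = [
    ("Fall 2023", 89), ("Spring 2024", 71), ("Fall 2024", 43),
    ("Spring 2025", 31), ("Spring 2026", 19),
]

rates = [r for _, r in cohorts]
# Cohort-over-cohort retention of the placement rate.
ratios = [b / a for a, b in zip(rates, rates[1:])]
# Geometric-mean decay per step across the whole window.
avg_decay = (rates[-1] / rates[0]) ** (1 / (len(rates) - 1))

print("step ratios:", [round(r, 2) for r in ratios])
print("average decay per cohort:", round(avg_decay, 2))
```

Every step loses 20-40% of the prior rate, averaging about 0.68x per cohort. Four consecutive declines of similar size look more like the crew's structural read than a single cliff, though a multi-semester cyclical downturn could produce a similar curve.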

Dave Blundin’s read: GitHub meritocracy replaced university credentialing six or seven years ago, and the salary data has been confirming it since. The credential no longer predicts placement. What predicts placement is demonstrated capability in the tools that matter now, and that bar is set by what AI can already do.
The crew’s verdict was uniform: get on cap tables. Elon’s projection of 10x economic growth in ten years means the gains flow to equity holders. W2 paychecks do not capture that growth. Dave’s advice was direct: if you are a parent, show your kids these numbers. The path of high school to college to CS degree to stable job is closed. The open path runs through startups, founding teams, and building.
“There is no other path forward.” — Dave Blundin
The placement data does not describe a temporary freeze that will thaw when the hiring cycle turns. It describes a structural shift already underway. Anthropic’s enterprise share, NVIDIA’s booking pipeline, and the inference cost curve are all moving in the same direction at the same time. What is happening is visible in the numbers for anyone willing to read them.
Sources
- Moonshots with Peter Diamandis. Episode 240: NVIDIA’s $1 Trillion Prediction, Anthropic Beats OpenAI, Tesla vs. TSMC and The CS Job Collapse. Recorded March 19, 2026, published March 21, 2026. Guests: Dave Blundin (Link Ventures), Salim Ismail (OpenExO), Dr. Alexander Wissner-Gross (Computer Scientist, Reified).
- OpenClaw growth data. Jensen Huang, GTC 2026 keynote. OpenClaw surpassed Linux’s 30-year GitHub record in weeks.
- Anthropic enterprise share. Time magazine, March 2026. 73% vs. 26% of first-time enterprise AI customers, reversed from 60/40 three months prior.
- CS placement data. Tech Layoff Tracker, citing professor cohort records. Fall 2023 through Spring 2026.
- Get Physics Done (GPD). Open-source agentic physicist, Apache 2.0. Available at psi.inc.
- TerraFab announcement. Elon Musk, March 2026. Target: 1 million wafer starts per month, approximately 70% of TSMC’s annual global output.
The headlines from GTC 2026 and the Abundance Summit were large, and some deserve scrutiny before acceptance. NVIDIA’s trillion-dollar projection is real, contingent, and hard-capped by fab capacity. Anthropic’s enterprise share is a three-month snapshot. The CS job market data reflects genuine pain but may not be the permanent structural shift the crew described. And Get Physics Done launched four days ago.
NVIDIA’s Trillion Depends on Capacity That Does Not Exist Yet
Jensen Huang’s $1 trillion figure is bookings recognized over the life of contracts, spread across two years. Dave Blundin confirmed NVIDIA will do around $350 billion this calendar year. That is not $1 trillion. The path to $1 trillion runs entirely through TSMC’s ability to expand 3nm and 2nm node capacity, and that expansion is gated by ASML EUV machines, of which roughly 700 are produced per year.
NVIDIA already holds a large share of TSMC’s 3nm capacity. The growth from here requires new nodes that take years to bring online. Jensen can book demand. He cannot book fab capacity that does not yet exist. The trillion-dollar projection is a ceiling defined by supply, and that supply is controlled by a company NVIDIA does not own.

OpenClaw’s growth is genuinely extraordinary, and Jensen’s support for it is a smart move. The honest question is whether NVIDIA’s ecosystem expansion into robotics, robotaxis, and orbital data centers is compounding strength or compounding exposure to sectors where the revenue timelines are much longer than GPU sales cycles.
1,000x Cost Reduction Announced by the Company That Benefits From It
Sam Altman cited a 1,000x cost reduction in reasoning tasks between O1 and GPT-5.4. AWG confirmed the number is internally consistent with tracked deflation rates. What was not discussed: OpenAI has an obvious incentive to frame its own models in the most favorable comparative terms, and O1 was a high-cost model at launch. Comparing GPT-5.4 to O1 at its most expensive operating point gives the biggest possible ratio.
The underlying trend is real. Inference costs are collapsing, and the 40x annual deflation AWG has tracked across multiple models provides independent confirmation. The 1,000x number is consistent with that trend over 16 months. But the specific figure comes from OpenAI, selected by OpenAI, benchmarked by OpenAI, against a baseline OpenAI chose.

Dave Blundin is right that the 2x forecast for next year is probably wrong in the conservative direction. He is also right that the free lunch on inference scaling is ending. The next gains require architectural breakthroughs that do not yet exist publicly. Betting on the pattern continuing is a reasonable bet. It is still a bet.
One Quarter of Enterprise Data Is Not a Trend
Anthropic holds 73% of first-time enterprise AI customers according to Time magazine’s March 2026 report. Three months ago the split was approximately reversed. That is a dramatic shift in a short period, and it is worth asking what it actually measures.
First-time enterprise customers skew toward companies that have not yet committed to a provider. They are, by definition, the most undecided buyers. A three-month snapshot of first-time purchases among undecided enterprises may reflect temporary factors: a specific procurement cycle, a pricing promotion, a competitor’s capacity constraints during the measurement window. The metric does not capture total enterprise spending, renewal rates, or contract size.

The crew is right that Dario’s focused strategy is working and that OpenAI’s empire-building has created real costs. AWG also made the correct observation: it is too early to write OpenAI’s epitaph. GPT-5.4 Pro is competitive, Codex is growing, and OpenAI’s vertical integration gives it options Anthropic does not have. One quarter of enterprise market share data, from a single publication, does not settle the question of who wins the enterprise AI market over five years.
TerraFab’s Targets Are Staggeringly Ambitious
One million wafer starts per month is roughly 70% of TSMC’s current annual global output. The starting point is 100,000, which is itself a substantial operation. The path from 100,000 to 1,000,000 requires a 10x scale-up of a manufacturing process that is among the most technically demanding in existence.
ASML produces around 700 EUV machines per year, perhaps 1,000 under maximum effort. Elon has no special access to those machines beyond what his purchasing power can buy. TerraFab’s targets require either a decade-scale buildout of EUV capacity across the industry, a shift to older process nodes that do not require EUV but produce less capable chips, or some manufacturing breakthrough that does not yet exist. All three are possible. None is fast.
GPD Is Four Days Old
Get Physics Done is an open-source agentic physicist that AWG launched from Physical Superintelligence. The Harvard Astronomy Department recommendation came within days of launch. VCs are contacting AWG. The enthusiasm is real.
“Math is cooked. PSI is cooking physics.” — Dr. Alexander Wissner-Gross
AWG’s claim that there has been a drought of new physics since the early 1970s is contested territory. There is genuine debate among physicists about whether the drought is real or reflects the difficulty of the problems remaining. GPD may accelerate solutions to open problems in physics. It may also accelerate the production of plausible-sounding but incorrect physics, which is a known failure mode for reasoning models on hard scientific problems. Four days of usage data is not enough to distinguish between those outcomes.
The CS Job Market May Reflect a Hiring Freeze
The placement data is painful and the trend is unambiguous. 89% placed in Fall 2023 to 19% placed in Spring 2026 is a real collapse. Dave Blundin’s argument that credentials stopped predicting outcomes six years ago is directionally correct.

What is worth questioning is the permanence of the pattern. The tech hiring market has cycled sharply before: the post-2022 correction cut hiring dramatically, then it recovered. What looks like a structural end to CS employment may partially reflect a cyclical trough combined with structural change. The structural component is real: AI is genuinely displacing entry-level coding work. But the conclusion that there is no viable path for CS graduates except entrepreneurship is overstated. The people getting placed at 19% are getting placed somewhere, at reduced salaries that reflect a changed market, not at zero. The survivor question is what it takes to be in that 19%, not whether the 19% exists.
The crew’s advice on entrepreneurship and cap tables is sound. It is also advice from people who have spent decades in venture-backed environments. Not every CS graduate has access to the startup pipeline the crew describes, and recommending that everyone start a company sidesteps the question of what happens to the people for whom that path is not available.
The numbers from this week are not signals or indicators. They are outcomes. NVIDIA is selling a trillion dollars of compute. Anthropic is winning three out of four enterprise deals. CS graduates cannot find jobs. The transition everyone debated is complete; what remains is deciding what to do about it.
NVIDIA Has Already Won the Infrastructure Layer
Thirty thousand people showed up to GTC 2026. Jensen moved the keynote to an arena because the convention center could not hold the crowd. He announced $1 trillion in revenue by 2027, support for OpenClaw, physical AI partnerships spanning BYD, Hyundai, Nissan, and Uber, and orbital data center plans alongside SpaceX.
OpenClaw exceeded what Linux built in 30 years, and it did so in weeks. AWG’s point is worth sitting with: each unhobbling arrives faster than the last because it stacks on everything before it. The rate of rate-of-change is itself accelerating. NVIDIA is the substrate for all of it.

Peter raised the question of regulatory risk. AWG gave the only sensible answer: export controls on NVIDIA compute are already in place. GTC 2026 is the Western industrial AI policy. There is no credible alternative waiting in the wings. NVIDIA is critical infrastructure now, whether or not anyone has formally declared it.
The 1,000x Drop Is Just the Beginning
Sam Altman told the Abundance Summit that the cost of a hard reasoning task dropped 1,000x between O1 and GPT-5.4 in 16 months. AWG confirmed it. Dave Blundin explained why it happened so fast: almost no compute was going into inference before reasoning models, so scaling it by orders of magnitude cost almost nothing in absolute terms.
That free lunch is ending. The next orders of magnitude will require genuine architectural breakthroughs. But dismissing what comes next on that basis is the wrong move. The pattern since 2022 is that the next breakthrough arrives before the current one is fully absorbed. There is no reason to expect the pattern to break now.

The question Dave raised is the right one: if we just did 1,000x, what exactly makes anyone confident the next 12 months delivers only 2x? The honest answer is nothing. The confident forecast is another order of magnitude, probably from architectural work that is already underway inside the frontier labs.
Dario’s Focus Is Beating Sam’s Empire
Anthropic has 73% of first-time enterprise customers. OpenAI has 26%. The numbers flipped from roughly 60/40 in the other direction three months ago. That is a complete reversal, and it did not happen because of benchmarks.
“This is an absolute ass kicking.” — Dave Blundin
Dario Amodei built a company that is easy to partner with because it is not threatening anyone’s core business. He is not designing chips. He is not building data centers. He is not going after Apple on the device front. He makes AI software and partners with every cloud provider who will have him. Every enterprise buyer looking for a stable, focused AI partner lands at Anthropic.

OpenAI is scaling back Stargate and switching from building to renting. That is not a minor adjustment. That is the empire-building strategy revealing its cost. Sam Altman is talented and OpenAI will remain powerful. But the strategic choice Dario made, to do one thing well instead of everything at once, is winning in the market that matters most right now.
Elon Is Building Around the TSMC Chokepoint
TerraFab targets one million wafer starts per month, roughly 70% of TSMC’s entire annual output. The starting point is 100,000. The path from here to there runs straight through ASML’s EUV machine production, which tops out around 700 units per year and cannot easily scale faster.
Elon has solved harder constraints than this before. The relevant track record is Colossus: everyone said a coherent training cluster at that scale could not be built, and he built it anyway. AWG added the dimension that matters strategically: domestic US chip fabrication at scale de-risks a Chinese military move on Taiwan. That outcome alone justifies TerraFab’s ambition regardless of the fab economics.
Open-Source Physics Is Here
Get Physics Done (GPD) launched from Physical Superintelligence under the Apache 2.0 license. Within days, the former chair of Harvard Astronomy recommended it to every faculty member, postdoc, and student in the department. Someone is using it to design a rocket engine for the XPRIZE. VCs are flooding AWG’s inbox.
“Math is cooked. PSI is cooking physics.” — Dr. Alexander Wissner-Gross
There has been a drought of genuinely new physics since the early 1970s. The reasoning model revolution combined with an open-source agentic physicist is the first real tool capable of ending it. This is early. The results will compound in the same way that software tools compound, faster than any individual researcher can track.
CS Credentials Are Already Worthless for Employment
The placement data is not a warning. It is a closing statement. Fall 2023: 89% of CS graduates placed at $94,000 median. Spring 2026: 19% placed at below $61,000. The credential that was supposed to guarantee a career in technology is delivering a worse outcome than most trades.

Dave Blundin argued this started six or seven years ago when GitHub meritocracy replaced university credentialing. What is happening now is the salary data catching up to what was already true about placement quality. The degree never predicted good outcomes. It predicted an interview. That function is gone.
The advice the crew gave is the only advice that makes sense in this environment. Get on cap tables. Join startup teams. If you are a student or a recent graduate waiting for conditions to normalize, you are waiting for something that is not coming back. The 10x economic growth Elon described will land in equities, not W2 paychecks. The window to get positioned is open. Act accordingly.