
Sundar Pichai & John Collison: Google's AI Strategy, Agent Managers, and the $185B Capex Bet

April 19, 2026
8 min read
Tags: Sundar Pichai, John Collison, Google, Alphabet, Stripe, Gemini, AI Agents, AI Infrastructure, Capex, Waymo, Quantum Computing, Isomorphic Labs, AGI

In a wide-ranging conversation with Stripe co-founder John Collison, Sundar Pichai — CEO of Google and Alphabet — walked through Google's two-decade arc on artificial intelligence, the infrastructure build that will define the next one, and the handful of long-horizon bets he believes will still matter in 2036. The full interview is a rare unguarded look at how the person running the company that invented the Transformer actually thinks about what comes next.

The throughline: Google is not playing catch-up on AI. It is finishing a transition it began years before the rest of the industry understood the prize — and the next phase will be decided by capital, agents, and physics, not by demos.


1. The Transformer Paradox: Google's Quiet AI Foundation

One of the most persistent narratives about Google is that it "missed" the product window for Transformers — the architecture invented inside Google in 2017 that powers virtually every modern frontier model. Pichai rejects the framing, but with nuance.

Transformers, he reminded Collison, were not built as a chatbot. They were built to solve specific product problems that Google had at the time: higher-quality translation, faster speech recognition, and — most importantly — better search. The team that wrote "Attention Is All You Need" was not trying to replace Google. It was trying to make Google work better.

Search Quality, Not Showreel

That lens paid off in places the public rarely noticed. Transformer-based systems like BERT and later MUM drove what Pichai described as the biggest jumps in search quality in Google's history — improvements that lifted billions of daily queries by measurable margins. The work was foundational. It just wasn't a product launch.

LaMDA, AI Test Kitchen, and the Caution Trade-off

Long before ChatGPT's public moment, Google had its own internal conversational model, LaMDA. At Google I/O 2022, the company exposed parts of it through "AI Test Kitchen." Pichai was candid: Google was more cautious than its eventual competitors about toxicity, safety, and reputational risk, and that caution translated into a slower path to a mass-market chatbot.

The YouTube and Instagram Parallel

Pichai compared the ChatGPT moment to the arrival of YouTube or Instagram — disruptive consumer products that forced incumbents to internalize a new interaction pattern and adapt their own stack around it. Google has done that before, more than once. In his telling, the ChatGPT wave was not a moment of defeat; it was a moment of absorption.


2. From Search Box to Agent Manager

The most consequential product claim in the interview was about what search becomes over the next decade.

Pichai's framing: in ten years, "search" as most people understand it will have evolved into an "Agent Manager." Instead of typing a one-line query and getting back ten blue links, users will hand off multi-threaded, long-running tasks — comparing vendors, assembling a trip, drafting and reconciling a set of documents, running an ongoing process — and the system will coordinate the sub-agents required to finish the work.

That is a structurally different product from ten-blue-links search. It changes the unit of a "query" (a task, not a question), the unit of a "result" (a completed outcome, not a page), and eventually the unit of monetization. It also reframes ranking: the job is no longer picking the best page, it is orchestrating the best sequence of actions across tools, data sources, and other agents.
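To make the structural shift concrete, here is a minimal sketch of what an agent-manager loop might look like. Every name here (`SubAgent`, `AgentManager`, and so on) is a hypothetical illustration of the pattern Pichai describes — a task decomposed across coordinated sub-agents — not any real Google API.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: the "query" is a multi-step task, and the "result"
# is a completed outcome assembled from sub-agent work. Not a real API.

@dataclass
class SubAgent:
    name: str

    def run(self, step: str) -> str:
        # A real sub-agent would call a model or an external tool here.
        return f"{self.name} completed: {step}"

@dataclass
class AgentManager:
    agents: dict = field(default_factory=dict)

    def register(self, agent: SubAgent) -> None:
        self.agents[agent.name] = agent

    def execute(self, task_steps: list) -> list:
        # Orchestrate a sequence of actions across sub-agents, returning
        # one completed outcome rather than a page of links.
        return [self.agents[name].run(step) for name, step in task_steps]

manager = AgentManager()
manager.register(SubAgent("flights"))
manager.register(SubAgent("hotels"))
outcome = manager.execute([
    ("flights", "compare and hold an itinerary"),
    ("hotels", "reserve 5 nights near the venue"),
])
print(outcome)
```

The point of the sketch is the interface change: the caller hands off a task list and receives finished work, which is exactly the shift in the units of "query" and "result" described above.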

Pichai was clear that this is not a hypothetical. It is the direction the product roadmap is pointed, and the capex discussed later in the interview is the bill for getting there.


3. The Latency Budget: Why Milliseconds Still Matter

A characteristically Google detail emerged when Collison pressed on internal engineering culture: Google runs on an explicit latency budget.

Teams are told, in effect, that user-perceived speed is a shared resource. If a team wants to ship a new feature that adds to page latency, they have to earn the headroom — often by shaving milliseconds off somewhere else in the stack. Pichai noted that saving 3 milliseconds can buy a team enough credit to add a new capability. That discipline, invisible to users, is one of the reasons Google products feel fast under conditions where a naïvely built equivalent would feel slow.

Why Flash Changes the Agentic Math

This is also why Pichai is so focused on Gemini Flash. In his description, Flash models now reach roughly 90% of Pro model capability while running at dramatically higher speeds and lower cost. For chat, that trade-off is nice. For agents — which chain dozens or hundreds of model calls to finish a single task — it is the difference between viable and unviable. Latency is not a UX polish problem in the agent era. It is the gating factor on whether the product exists at all.
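Back-of-envelope arithmetic shows why per-call latency dominates agentic workloads. The call counts and per-call latencies below are assumptions chosen for illustration, not published Gemini benchmarks:

```python
# Why per-call speed gates agents: latency compounds across chained calls.
# All numbers are illustrative assumptions, not published benchmarks.

calls_per_task = 200        # an agent chaining many model calls per task
pro_latency_s = 4.0         # assumed per-call latency, large "Pro"-class model
flash_latency_s = 0.5       # assumed per-call latency, fast "Flash"-class model

pro_total = calls_per_task * pro_latency_s      # total seconds per task
flash_total = calls_per_task * flash_latency_s

print(f"Pro-class:   {pro_total / 60:.1f} min per task")
print(f"Flash-class: {flash_total / 60:.1f} min per task")
```

Under these assumed numbers, the same 200-call task takes over 13 minutes on the slow model and under 2 minutes on the fast one — the difference between a product users will wait for and one they won't.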


4. The $185B Question: Capex, Wafers, and HBM

The headline number from the interview is blunt: Alphabet plans to spend between $175 billion and $185 billion in capex in 2026. That is not a typo, and it is not a one-year anomaly. It is the scale of the infrastructure commitment Pichai believes the next phase of AI demands.

The Real Bottlenecks Aren't Electricity

The public conversation about AI capex has fixated on power. Pichai pushed back. In his view, the binding short-term constraints are further upstream:

  • Wafer starts — the sheer throughput of advanced-node silicon coming out of the world's fabs. You cannot buy a GPU or a TPU that wasn't fabricated first.
  • HBM — high-bandwidth memory, which sits next to the compute and feeds it, is in tighter supply than the accelerators themselves.

Electricity matters, and it will matter more over time. But as of 2026, the immediate ceiling on how much AI infrastructure Alphabet (or anyone else) can bring online is set by the wafer and HBM pipelines, not by substations.

"Build 10x Faster"

Pichai was pointed about a second constraint: the physical world. The United States, he argued, needs to learn to build at roughly 10x its current speed — data centers, transmission, substations, fabs, housing for the people who staff them — to keep pace with global competitors who are already moving faster. The AI race is not only a software race. It is a permitting, construction, and industrial-capacity race.

The Underleveraged Admission

Perhaps the most candid moment: asked what he would change if he could rewind the tape, Pichai admitted Alphabet has historically been a conservative "good steward" of cash. If he could go back, he said, he would have put more capital into Waymo earlier. The implication is that the current capex surge is partly a correction — Alphabet choosing to stop underleveraging its balance sheet against its best long-term bets.


5. Moonshots: Space, Quantum, Robots, Drugs

Much of the interview's second half covered projects that most CEOs would either not discuss publicly or would bury in an investor-day slide. Pichai treated them as core, not ornamental.

Data Centers in Space

A small internal team at Google is actively studying the feasibility of putting data centers in space. The motivation is long-horizon: terrestrial energy, land, and cooling constraints do not scale forever, and solar harvesting above the atmosphere is strictly more efficient. Pichai did not promise a timeline. He did make clear this is real engineering work, not a thought experiment.

Quantum

On quantum computing, Pichai was direct: he is "deeply committed." His thesis is that quantum will be essential for simulating domains — chemistry, materials, weather, biology — where classical compute hits exponential walls. Quantum, in his framing, is not a competitor to AI. It is a complement for problems AI alone cannot crack.

Robotics & Waymo

Google's robotics push is moving on two tracks. On the humanoid and general-purpose side, it is partnering with Boston Dynamics and Agility Robotics, pairing their hardware with Gemini models for spatial reasoning. On the autonomy side, Waymo's breakthrough — the reason it now looks like a runaway leader rather than a perpetual pilot — came from moving to end-to-end deep learning with Transformers a few years ago. The same architectural bet that rebuilt search also rebuilt self-driving.

Isomorphic Labs

Isomorphic Labs, Alphabet's AI-native drug discovery company, sits in a different category. Pichai described it as a methodical, compounding bet — not a moonshot in the "flashy demo" sense, but a bet that applying modern AI to the protein and small-molecule stack will change the base rate of successful drug programs over a decade. The unit of progress is pipeline depth, not headlines.


6. "Jet Ski" and the Shape of Work in 2027

The interview's most forward-looking segment was about how work itself changes.

The Internal Tool Called Jet Ski

Inside Google, employees use an agent-workflow tool known as "Jet Ski" — the internal nickname for what the company launched externally as Antigravity. It is not a chatbot. It is an environment in which an agent manages multi-step workflows on behalf of a human, surfacing decisions only when needed.

The 2027 Inflection Point

Pichai made an unusually specific prediction: by 2027, a meaningful class of business processes — his example was financial forecasting — will be "fully agentic." Humans will act as reviewers, not authors. The first draft, the reconciliation, the sensitivity analysis, the variance commentary: all produced by agents, then approved by a person whose job is judgment rather than production.
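The "humans as reviewers, not authors" pattern can be sketched as a small pipeline. Every function below is a hypothetical stand-in for illustration — there is no real product API here — but the structure matches the workflow Pichai describes: agents author the draft, the sensitivity analysis, and the commentary; a person only approves or rejects.

```python
# Sketch of a "fully agentic, human-reviewed" forecasting flow.
# All functions are hypothetical stand-ins, not a real product API.

def agent_draft_forecast() -> dict:
    # In a real system an agent would pull actuals, run models,
    # and write the variance commentary itself.
    return {"q3_revenue": 1_200_000, "variance_note": "up 4% vs plan"}

def agent_sensitivity(forecast: dict) -> dict:
    # Agent-produced sensitivity analysis layered onto the draft.
    return {**forecast, "downside_case": 1_100_000}

def human_review(package: dict) -> bool:
    # The human's job is judgment: approve or send back, not author.
    return package["downside_case"] > 1_000_000

package = agent_sensitivity(agent_draft_forecast())
approved = human_review(package)
print("approved" if approved else "sent back")
```

Notice where the human sits: at the end, with a binary decision over a finished package. That is the inversion — production is automated, judgment is not.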

If that prediction is even directionally right, it changes what enterprise software is for. Tools optimized for human producers — the "great UI for a skilled analyst" category — start to look like vestigial organs. The buyers who matter are the ones designing the reviewer's workflow, not the producer's.


7. The "AGI Pill": Google Has Always Been All-In

A persistent critique — mostly from outside Google — has been that the company is somehow lukewarm on AGI, more comfortable in the "useful AI product" lane than in the "build a general intelligence" lane. Pichai dismissed the characterization directly.

Google's founders, he reminded Collison, have talked about AGI-scale ambition since the company's earliest days. Demis Hassabis, whose DeepMind joined Google in 2014, has been pursuing AGI for roughly 20 years. Jeff Dean, one of the architects of Google's infrastructure and its AI systems, has been working on large-scale learning systems for nearly as long. AGI ambition inside Google, in Pichai's telling, is a continuity, not a pivot forced by a competitor's product launch.

That is worth taking seriously. A company can be late to a particular product moment and still be the company that has been building toward the underlying goal longest.


Key Takeaways from the Pichai × Collison Interview

  • Transformers were a product investment, not a missed chatbot. BERT and MUM drove the biggest search-quality jumps in Google's history; the lack of a public chatbot was caution, not absence of capability.
  • Search becomes an Agent Manager. Over the next decade, the unit of a query shifts from a question to a task, and the unit of a result shifts from a page to a completed outcome.
  • The bottleneck is upstream of electricity. Alphabet's $175–185B 2026 capex plan is gated short-term by wafer starts and HBM supply, and long-term by the speed at which the physical world can be built.
  • Moonshots are compounding, not theatrical. Space data centers, quantum, humanoid robotics, Waymo, and Isomorphic Labs are being treated as serious, staffed programs with decade-scale horizons.
  • 2027 is the agentic inflection point. Pichai expects whole categories of business work — starting with financial forecasting — to flip from human-authored to human-reviewed within roughly a year.

Whether you run a platform, build on one, or are simply trying to understand where the next decade of infrastructure spending will land, this is one of the most substantive CEO interviews of the year.


Watch the full interview here: Sundar Pichai × John Collison on Google's AI strategy


Related Reading

For more on Pichai's evolving view of Google's AI stack, see our earlier recap Sundar Pichai on Google's AI Future: 'Nano Banana Pro,' 'Vibe Coding,' and Data Centers in Space. And if the "Agent Manager" framing is new to you, start with our foundational explainer on What Are AI Agents.