The CPU's Quiet Comeback
For most of the last three years, the AI trade has been a GPU trade, and Intel was the company the AI era was supposed to leave behind. Lip-Bu Tan's Q1 2026 message inverts that framing. "In recent months we have seen clear signs that the CPU is reasserting itself as the indispensable foundation of the AI era," Tan told investors, pointing to inference and agentic workloads where orchestration, memory, and general-purpose compute live on the host CPU rather than on the accelerator. The numbers back the rhetoric: Data Center and AI revenue grew 22% year-over-year to $5.1 billion; AI-driven businesses now make up roughly 60% of total revenue and grew about 40% year-over-year; and Intel sold chips it had previously written off because demand exceeded supply.
The NVIDIA deal anchors the thesis architecturally. Jensen Huang described the partnership as one that "tightly couples NVIDIA's AI and accelerated computing stack with Intel's CPUs," and Intel Xeon 6 was named as the host CPU for NVIDIA's DGX Rubin NVL8 systems — meaning every Rubin-class rack NVIDIA ships pulls Xeon along with it. CFO David Zinsner reinforced the demand picture by noting that Q1 results would have been even stronger had supply kept up. The bullish read is that AI inference is not a GPU monopoly; it is a system problem, and Intel sits at the orchestration layer of that system.




