AI · 5 min read · 8 March 2024

Claude 3, Nvidia’s Rise, and the Infrastructure Behind the AI Race

In early March 2024, Anthropic released the Claude 3 model family and Nvidia briefly overtook Apple in market capitalisation. The two events were more than coincidental.

AI · Claude · Anthropic · Nvidia · Hardware

In early March 2024, Anthropic released the Claude 3 model family. Three variants were announced: Opus, the highest-capability model; Sonnet, the mid-tier balance of capability and cost; and Haiku, the small, fast model for high-volume use cases. The benchmarks for Opus put it at or above GPT-4 on a wide range of tasks, including some categories where it was clearly stronger.

The release was significant because it confirmed an emerging pattern: the frontier was now a competitive space rather than a single company in the lead. OpenAI had set the pace with GPT-4 in March 2023. Google had reached the frontier with Gemini Ultra in December 2023. Anthropic had now joined them with Claude 3. The benchmark gap between the leading models had narrowed to the point where comparisons depended heavily on which benchmarks you chose.

In the same month, Nvidia briefly overtook Apple as the second most valuable publicly traded company in the United States, behind only Microsoft. Nvidia's market capitalisation had grown several times over during the previous year, as the AI infrastructure boom translated into orders for AI accelerators that Nvidia was almost uniquely positioned to fill at scale.

The two events were related more than the timing alone suggested. The pace at which the frontier models had been improving depended on the availability of large-scale training infrastructure. The training runs for Claude 3 Opus, GPT-4, and Gemini Ultra all consumed enormous amounts of compute, almost all of which ran on Nvidia accelerators. The market reflected that dependence. Every dollar that went into training a frontier model was, in significant part, a dollar that ultimately found its way to Nvidia.

What the March 2024 moment captured was the maturation of the AI ecosystem into something with multiple distinct layers, each with its own set of leaders. Frontier providers were competing at the model layer. Nvidia was dominant at the silicon layer. Cloud providers were positioning themselves as the integration layer. None of these layers operated independently; the competitive dynamics within each affected the others in ways that would continue to play out.

The investors looking at Nvidia’s market cap were essentially betting that the AI training infrastructure boom was sustainable. Whether that bet would hold over the years to come depended on how quickly the demand for additional compute capacity continued to grow.
