China’s AI firms are cleverly innovating around chip bans
Tweaks to software blunt the shortage of powerful hardware
TODAY’S TOP artificial-intelligence (AI) models rely on large numbers of cutting-edge processors known as graphics processing units (GPUs). Most Western companies have no trouble acquiring them. Llama 3, the newest model from Meta, a social-media giant, was trained on 16,000 H100 GPUs from Nvidia, an American chipmaker. Meta plans to stockpile 600,000 more before year’s end. xAI, a startup backed by Elon Musk, has built a data centre in Memphis powered by 100,000 H100s. And though OpenAI, the other big model-maker, is tight-lipped about its GPU stash, it had its latest processors hand-delivered by Jensen Huang, Nvidia’s boss, in April.