I can do it with a distributed heart

Training AI models might not need enormous data centres

Eventually, models could be trained without any dedicated hardware at all

Illustration: Mariaelena Caputi

Once, the world’s richest men competed over yachts, jets and private islands. Now the size-measuring contest of choice is clusters. Just 18 months ago OpenAI trained GPT-4, its then state-of-the-art large language model (LLM), on a network of around 25,000 of Nvidia’s then most advanced graphics processing units (GPUs). These days Elon Musk and Mark Zuckerberg, bosses of xAI and Meta respectively, are waving their chips in the air: Mr Musk says he has 100,000 GPUs in one data centre and plans to buy 200,000; Mr Zuckerberg says he’ll get 350,000.
