How generative models could go wrong
A big problem is that they are black boxes
In 1960 Norbert Wiener published a prescient essay. In it, the father of cybernetics worried about a world in which “machines learn” and “develop unforeseen strategies at rates that baffle their programmers.” Such strategies, he thought, might involve actions that those programmers did not “really desire” and were instead “merely colourful imitation[s] of it.” Wiener illustrated his point with the German poet Goethe’s fable, “The Sorcerer’s Apprentice”, in which a trainee magician enchants a broom to fetch water to fill his master’s bath. But the trainee is unable to stop the broom once its task is complete. Lacking the common sense to know when to stop, it eventually brings so much water that it floods the room.