We prize originality, yet humans are natural-born copycats and only good imitators survive. Is it time to celebrate the rip-off?
Imitation might be a form of flattery, but it is also a good way to end up in legal trouble. More than 6,000 lawsuits over patent infringements were filed in the United States last year. Samsung and Apple, locked in what’s been called the bloodiest corporate war in history, have jointly spent more than $1 billion in the past four years trying to prove that one poached the other’s smartphone technology.
In today’s world, inventors are our heroes and our saviours – the geniuses who keep the world economy surging forward, who bring us the newest playthings and the latest comforts. We rely on inventors to build a cleaner, happier, more prosperous future. Copycats are a threat to this cheerful vision. Not for nothing do we call them pirates; by cheating and stealing, copiers undermine the system. By profiting from the hard work of others, they reduce the incentive to create. They are a threat to the social order.
But according to a cluster of like-minded researchers, we’ve misunderstood how innovation really works. Throughout human history, innovation – including the technological progress we cherish – has been fuelled and sustained by imitation. Copying is the mighty force that has allowed the human race to move from stone knives to remote-guided drones, from digging sticks to crops that manufacture their own pesticides. Plenty of animals can innovate, but no other species on earth can imitate with the skill and accuracy of a human being. We’re natural-born rip-off artists. To be human is to copy.
This claim emerges from findings in many different kinds of research: field observations of traditional small societies; comparative psychology experiments pitting humans against other primates; computational models of how civilisations bloom and die. It reveals that imitation allows good ideas to spread quickly and efficiently. By distributing good ideas among many brains, copying preserves them for future generations, allowing them to accumulate.
We think of innovation this way: a lone genius applies massive computational power to a problem, and a flash of insight brings about a world-changing breakthrough. But that’s a myth. Most innovation is mundane, the product of lots of copying and a little bit of creativity.
The history of technology shows that advances happen largely through tinkering, when somebody recreates a good thing with a minor upgrade that makes it slightly better. These humble improvements accrue over generations, so that the Bronze Age straight pin becomes a toga fastener becomes a safety pin. Money begins as seashells, evolves into metal coins, diversifies as paper, and eventually becomes virtual as bitcoins and abstruse financial derivatives. In this way, technologies arise that no one person could possibly invent on his own. When Isaac Newton talked about standing on the shoulders of giants, he should have said that we are dwarves, standing atop a vast heap of dwarves.
Researchers dub this iterative process ‘cumulative cultural evolution’: just as organisms evolve via repeated small changes in genes that provide a survival advantage, each human generation makes small modifications to the technology and traditions it inherits. This idea is most clearly articulated by the anthropologist Robert Boyd, of the Santa Fe Institute and Arizona State University, and the biologist and mathematical modeller Peter Richerson, of the University of California Davis. ‘When lots of imitation is mixed with a little bit of individual learning, populations can adapt in ways that outreach the abilities of any individual genius,’ they write in their book Not By Genes Alone (2005).
Lots of copying means that many minds get their chance at the problem; imitation ‘makes the contents of brains available to everyone’, writes the developmental psychologist Michael Tomasello in The Cultural Origins of Human Cognition (1999). Tomasello, who is co-director of the Max Planck Institute for Evolutionary Anthropology in Leipzig, calls the combination of imitation and innovation the ‘cultural ratchet’. It is like a mechanical ratchet that permits motion in only one direction – such as winding a watch, or walking through a turnstile. Good ideas push the ratchet forward one notch. Faithful imitation keeps the ratchet from slipping backward, protecting ideas from being forgotten or lost and keeping knowledge alive for the next round of improvement.
It turns out that creating something new is the easy part. What’s difficult – and what’s really important – is maintaining what we already know through copying. Luckily, we are very good at it.
The usual explanation of why humans are so successful as a species is simply that we are smarter, with huge brains and uniquely flexible intelligence that allows us to figure everything out through sheer force of logic.
But it turns out that other animals are not so different. Comparative psychologists now know that many creatures besides us can think on their feet, improvising shrewd solutions to new challenges. They’ve documented chimps that make tools, pigeons that understand probability, and octopuses that make their own shelters. Innovation is commonplace in the animal kingdom.
Meanwhile, without the benefit of cultural learning, humans have turned out to be not so clever after all. A few years ago, Tomasello ran a series of experiments that compared the raw brainpower of chimpanzees, orangutans and children aged two and a half, who haven’t yet had much formal teaching in literacy, mathematics or other accumulated knowledge. The species were pitted head-to-head against one another with concrete tasks that probed abilities such as spatial relations, tool use, and the understanding of quantity.
The chimps and the toddlers performed about the same on most tests (the orangutans didn’t do so well). But the children left the chimps in the dust on tests of social learning – for instance, watching someone open a puzzle box, then copying his actions. The upshot: without the benefit of learning and culture, humans aren’t that much more intelligent than chimps. But we do have extraordinary skills in social cognition, including a stellar ability to observe and imitate.
Other animals sometimes copy and can learn from one another. But only humans imitate indiscriminately, persistently, and with very high fidelity. We’re compulsive about it. Even before babies can walk, they start imitating adults. In the 1930s, a pair of psychologists raised an infant chimp alongside their own baby in an attempt to understand both species better. The chimp raised in this family (and others in other such experiments later in the century) never behaved much like a human. The human child, on the other hand, soon began knuckle-walking, biting, grunting and hooting – just like his new sibling.
Only humans ‘overimitate’, copying an action with precision even when it’s obviously a poor technique. In another of Tomasello’s studies, an experimenter demonstrated to both chimps and children how to use a rake-like tool to retrieve an out-of-reach reward. He wielded the tool upside down, making it ineffective at catching the reward. When given the tool, the chimps immediately flipped it over to use it more efficiently, while the children simply copied the adult’s clumsy actions.
We overimitate even when told not to – it seems to be part of the way we think. In experiments conducted in 2007 by the psychologist Derek Lyons when he was a graduate student at Yale, children were shown a jar with a toy dinosaur inside. An experimenter then demonstrated a ridiculous way to open the jar: first tapping it with a feather, then unscrewing the lid.
In a video of this experiment, the psychologist emphasises how useless his actions were: ‘Josh, did I have to tap on the jar with this feather to get the dinosaur out?’ he asks. The little boy shakes his head: ‘NO!’ Then the researcher asks Josh to name the gestures that were ‘silly’ and ‘extra’, and praises him when he answers correctly.
Clearly, Josh gets the point. So when the psychologist tells him to take the toy out however he wants, and then leaves the room, what does Josh do? He picks up the feather, taps the jar, and then unscrews the lid.
In variations of this experiment, children were explicitly forbidden to make any of the ‘silly extra’ gestures that researchers used; even so, between 75 and 94 per cent of the time, they copied the precise sequence of motions. Lyons argues that this is a perfectly rational way to behave, especially for children: puzzling out how something works through causal reasoning requires time, energy and knowledge about the world that they don’t yet have. Copying is a heuristic – a smart shortcut that, outside of a psychologist’s lab, usually yields the right answer. ‘Imitation is a remarkably potent learning strategy,’ writes Lyons.
Animals are obligatory empiricists: they learn almost everything about the world through trial and error, and when they die, that knowledge dies with them. Each individual is doomed to reinvent the wheel. Humans alone, by learning from those who came before us and doing what they do, can make use of other people’s hard-won expertise.
In the world’s harsher climates, this legacy of knowledge is essential for survival. Those who ignore it risk paying the ultimate price, as many vivid stories from the annals of exploration attest. Boyd’s favourite example is the fate of the British explorer Sir John Franklin and his 1845 expedition to find the Northwest Passage above Canada, which the British were wrongly convinced led to an open sea around the North Pole.
Franklin was a brave and intelligent man, well-equipped, with a skilled crew. But although Franklin had friendly interactions with some local Inuit, he did not take their technology seriously. The explorers did not learn to hunt for seal, wear fur, or build igloos or kayaks, instead sticking with their British woollens and canvas tents. If primitive Inuit had survived in the Arctic for generations, they reasoned, officers of the Royal Navy would certainly have no trouble figuring out how to get by. Overriding human nature, they refused to imitate.
Franklin’s ships became trapped in pack ice, and eventually the food ran out. The surviving crew tried to walk to safety across the frozen seas, but without the Inuit knowledge of how to hunt and stay warm, smarts alone could not save them. They all starved to death.
Innovation does matter: if everyone copied everything, nothing would ever improve, and we would be unable to respond as the world changed around us. But as the work of the behavioural scientist and evolutionary biologist Kevin Laland of the University of St Andrews reveals, innovation doesn’t matter anywhere near as much as we might expect.
In an effort to understand what drives the accumulation of cultural information, his colleague Hannah Lewis developed a mathematical model that can simulate how new cultural traits – technological inventions, traditions, or knowledge – arise and disappear over generations. With this model, Laland and Lewis plugged in various forms of innovation (inventing something outright, or modifying or combining existing inventions), as well as trait loss – losing knowledge through inaccurate transmission of information. They ran these simulations through 5,000 cycles, looking to see which factor had the biggest impact on the final richness and diversity of traits.
Accurate transmission of information had a massive impact on the outcome: with this model, increasing the fidelity of cultural transmission just a bit yielded huge increases in the amount and variety of culture. ‘It doesn’t matter how much novel invention or refinement is going on: if you don’t have accurate transmission you simply cannot build up culture,’ says Laland. ‘It was a real insight.’
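The logic of that result can be seen in a toy simulation. The sketch below is not Lewis and Laland’s actual model – their equations, parameters and trait types are not reproduced here – but a minimal stand-in that follows the same scheme: each generation, existing traits survive transmission with some fidelity, and new traits arise by outright invention or by combining existing ones. All parameter names and values are illustrative assumptions.

```python
import random

def simulate(generations, fidelity, p_invent, p_combine, seed=0):
    """Toy model of cumulative culture (illustrative only, not the
    published Lewis & Laland model). Returns the number of cultural
    traits surviving after the given number of generations."""
    rng = random.Random(seed)
    traits = 1  # start from a single cultural trait
    for _ in range(generations):
        # Transmission: each trait is passed on independently with
        # probability `fidelity`; the rest are lost through copying error.
        traits = sum(1 for _ in range(traits) if rng.random() < fidelity)
        # Outright invention adds a brand-new trait occasionally.
        if rng.random() < p_invent:
            traits += 1
        # Combination needs at least two existing traits to merge.
        if traits >= 2 and rng.random() < p_combine:
            traits += 1
    return traits

# Same rates of invention and combination; only fidelity differs.
high = simulate(5000, fidelity=0.999, p_invent=0.05, p_combine=0.2)
low = simulate(5000, fidelity=0.95, p_invent=0.05, p_combine=0.2)
print(high, low)
```

Even in this crude version, the repertoire settles near the point where losses balance gains – roughly the gain rate divided by the loss rate per trait – so a small improvement in fidelity multiplies the amount of culture a population can hold, while raising the invention rate merely nudges it.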
Combining existing traits (a strategy that merges imitation and innovation) was moderately effective, but outright innovation was not very effective at all. The authors aren’t too surprised by that outcome: other historical and experimental studies have repeatedly found the same thing. ‘The finding that novel invention turns out not to be so important is consistent with studies of human innovation, which find that innovation or discovery is often the result of chance, combination and incremental refinement rather than genius,’ they wrote in 2012.
There’s a message here for us: we’ve got it all wrong.
It’s time to retire the notion of genius and all the baggage that comes with it: the exaltation of big brains, the story of progress as a grand parade of exceptional thinkers, the myth that innovation happens with a lightning bolt of insight. We can stop worshipping at the altar of disruption. We can get rid of our posters of Einstein.
And we should give credit where credit is due. The mighty machine of cultural innovation turns out to be powered by an army of small minds, thinking unoriginal thoughts. It’s time to celebrate their mediocrity.
Let’s honour the dabblers and tinkerers, who together discover what one lone genius never could. Rather than one Nobel Prize, limited to three people at most, we should be awarding hundreds of tiny prizelets. The dilettante, the guy fiddling around in his basement, the two-bit inventor peddling a gizmo on Kickstarter: it turns out that they are the saviours of civilisation. Glory to these derivative thinkers, to those who kludge and muck about! The very long view – one that stretches all the way back to the first humans who struck two rocks together to create a blade – suggests the future is in their hands.