

Do cumulative culture & extreme imitation prevent breakthroughs?

The story of a man who lived a normal life with a hollowed-out brain, reduced to a thin membrane pressed against the skull, resurfaces here and there as any meme does, taking on a different flavor depending on the medium in which it pops up again.

If the news surfaces in, say, a mostly British corner of social media, the origin of this unlucky man living his John Doe life is more than anecdotal, since it makes for a good joke given the special animosity with which popular culture regards the neighbors on the other side of the English Channel.

“We diagnosed a non-communicating hydrocephalus, with probable stenosis of Magendie’s foramen. The leg weakness improved partly after neuroendoscopic ventriculocisternostomy, but soon recurred; however, after a ventriculoperitoneal shunt was inserted, the findings on neurological examination became normal within a few weeks. The findings on neuropsychological testing and CT did not change.” Feuillet et al./The Lancet

Reuters (conveniently signing as “Reuters Staff” for the occasion) ran the news in 2007 under the headline “Tiny brain no obstacle to French civil servant.” At this stage of digital culture, we readily recognize terms that pack the minimum viable cultural message for spreading through imitation or popularity on the web, so squeezing the keywords “tiny brain,” “French,” and “civil servant” into a single headline from a reliable source plays on old jokes.

As a journalist who was once an intern (that is, the precarious assistant in an especially despised, and almost by definition precarious, trade of the “information society”), I can see the laughs behind the making of this headline; one can almost picture a young crew of office reporters, mostly chained to their computers and sent around, say, London’s predictable press conferences and events (pre-Covid, of course), getting ready for the weekend.

Tiny brain, functioning life

Reuters covered this “healthcare” news on July 20, 2007 (a Friday, unsurprisingly); the case would appear in The Lancet one day later. Headlines can read very differently when written just before leaving the office for the weekend.

And so the rituals of journalistic predictability go on. “Man with tiny brain” reads like a derogatory oxymoron, yet what makes this discovery a meme is its surprising lack of dramatic effect on the professional and personal life of a 44-year-old civil servant: the brain’s plasticity is so extraordinary that he led a perfectly dull, fulfilling life, married and with two children, probably pondering which stepping stones might suit him best. Buying a second home? A promotion at work? A vanity purchase to indulge the predictable fantasies of a midlife crisis?

Managing to live “an entirely normal life” even though a fluid-filled chamber called a ventricle “took most of the room of his skull” shows more than mere brain adaptability, which we already knew to be extraordinary from the study of patients who had experienced severe damage to (sometimes removal of) entire brain areas and yet managed to relearn cognitive processes even at very advanced ages.

Described as a patient with “normal social functioning,” he has two children and works as a civil servant

Especially poignant stories regarding the brain’s potential for adaptation and (partial) cognitive reconstruction and regeneration became bestsellers as neuroimaging and neurology evolved in the seventies.

Neurologist and popularizer Oliver Sacks, among others, brought the field to the mainstream with his 1973 book Awakenings. Its compelling film adaptation starring Robin Williams (as the fictional doctor Malcolm Sayer, based on Sacks) and Robert De Niro (as a catatonic patient suffering from severe encephalitis) helped spur public interest and studies concerning head trauma and rehabilitation, interventional neuroradiology, and other overlapping disciplines.

A very particular midlife crisis

Our 44-year-old patient had never been perceived as a medical oddity. According to The Lancet, “on neuropsychological testing, he proved to have an intelligence quotient (IQ) of 75: his verbal IQ was 84, and his performance IQ 70.” The whole field of IQ testing deserves, and is getting, a skeptical look, since its framing and the results it produces only reveal trends and cannot (as is popularly maintained) measure actual “intelligence,” let alone any particular person’s intelligence. That said, we know our patient was a normally functioning adult, though not especially gifted (75-84 is considered “borderline,” whereas the average score is 100).

Doctor Lionel Feuillet and his colleagues at the Université de la Méditerranée in Marseille would never have stumbled upon the case had it not been for a concerning complaint about the patient’s motor function, along with a clinical history that deserved scrutiny.

It all started with two weeks of “mild” left leg weakness, a mere anecdote for almost any patient, but not for somebody who had had a shunt inserted as a baby to treat hydrocephalus (a buildup of excess fluid in the brain):

“When he was 14 years old, he developed ataxia and paresis of the left leg, which resolved entirely after shunt revision. His neurological development and medical history were otherwise normal.”

“Otherwise normal.” The shunt’s last revision dated from three decades back, so his doctors ordered CT and MRI scans, which revealed a diagnosis far from “normal”: severe dilatation of the ventricular system, with “massive enlargement” of the lateral, third, and fourth ventricles, resulting in “a very thin cortical mantle and a posterior fossa cyst”:

“We diagnosed a non-communicating hydrocephalus, with probable stenosis of Magendie’s foramen. The leg weakness improved partly after neuroendoscopic ventriculocisternostomy, but soon recurred; however, after a ventriculoperitoneal shunt was inserted, the findings on neurological examination became normal within a few weeks. The findings on neuropsychological testing and CT did not change.”

The place where thought and consciousness happen

The scan images are indeed shocking. Even nonspecialists can see that, at least visually, from half to two-thirds of the space inside the skull is occupied by fluid. This could lead us to speculate about the so-called mind-body problem of philosophy, in which we try to distinguish between thought, consciousness, and the physical vessel where these emergent phenomena take place.

If, as many scientists think, consciousness physically originates in the brain, how is it possible to experience severe brain damage and still be aware of one’s existence, leading the conventional, highly demanding life of a 44-year-old civil servant with two young children?

“Awakenings”: Penny Marshall’s 1990 film adapted Oliver Sacks’ 1973 memoir on the author’s struggle to help patients affected by the 1917-1928 epidemic of encephalitis lethargica

Researcher Max Muenke, a pediatric brain defect specialist, noted the rarity of the case, but also how relative “normalcy” can be:

“What I find amazing to this day is how the brain can deal with something which you think should not be compatible with life. If something happens very slowly over quite some time, maybe over decades, the different parts of the brain take up functions that would normally be done by the part that is pushed to the side.”

The traditional division of the labor force into white-collar (professional) and blue-collar (industrial) workers is misleading and dated, as heavy industry and manufacturing left the higher costs of so-called “industrialized” or “advanced” countries for conveniently located emerging economies (Eastern Europe and Northern Africa for Western Europe, Mexico for the United States, and East Asia overall).

Before specialization

The patient’s “white collar” profile is somewhat classic: he is a civil servant in a country that has historically relied on statism and a harmonized, centralized bureaucracy, and what the anonymous protagonist’s health issues reveal is how relatively undemanding much intellectual work can be in highly technical societies, a process to which early sociologists (such as Max Weber) paid special attention.

Modern bureaucratic societies favor extreme specialization, and the process is still evolving; philosopher José Ortega y Gasset theorized about the societal toll of losing humanists (well-educated cultural generalists) in favor of the “specialist”:

“He [the ‘specialist’] is one who, out of all that has to be known in order to be a man of judgment, is only acquainted with one science, and even of that one only knows the small corner in which he is an active investigator. He even proclaims it as a virtue that he takes no cognizance of what lies outside the narrow territory specially cultivated by himself and gives the name of ‘dilettantism’ to any curiosity for the general scheme of knowledge.”

Specialization in human societies began long before the more or less porous divide between white-collar and blue-collar workers of industrial and post-industrial societies; popularizers such as historian and ornithologist Jared Diamond trace it back to pre-Neolithic and Neolithic cultures.

Diamond dedicates an entire chapter of Guns, Germs, and Steel to explaining activity and roles in different types of pre-modern societies (he uses the nomenclature of hunter-gatherer “bands,” “tribes,” “chiefdoms,” and proto-states); semi-permanent settlements required a type of specialization that would eventually prevent each adult member of any small, dispersed group from learning (both orally and by imitation) all of the tribe’s cumulative knowledge.

“Arrival of Burke, Wills and King at the deserted camp at Cooper’s Creek, Sunday evening, 21 April 1861” (John Longstaff, oil on canvas, 1907)

Diamond offers an example he knows well: among societies in New Guinea’s interior, the lack of specialization and written culture forces every adult to develop a surprisingly rich set of cultural skills, a sharp contrast with any individual’s role within more populous, hierarchical societies.

As individuals, we stopped being portable encyclopedias of all the tribe’s cultural and practical knowledge millennia ago, yet with literacy we developed highly sophisticated codes that allowed us to rely on cumulative culture instead of learning “all there is worth knowing” anew in every generation. No wonder technocratic work allowed people to specialize in small knowledge domains, some of which don’t require especially brilliant competencies but rather rely on a certain discipline and attitude. Even somebody whose brain has been pushed against the skull’s internal perimeter by fluid buildup can manage.

On the shoulders of giants

Adult individuals in non-specialized societies depended on a wide range of knowledge and skills transmitted fully from one generation to another; there is evidence of early humans with disabilities surviving through social support, but physical skills were as important for survival as a deep knowledge of hunting and gathering techniques adapted to seasons and place.

As specialization advanced from the late Paleolithic onwards, the need to master both cultural knowledge and physical skills may have waned. When the first writing systems developed in Bronze Age cultures, early literary works associated virtue with cultivating as many skills as possible, some of which had already lost part of their practical value.

As societies prospered thanks to cumulative knowledge and specialization, it became clear that no individual could aspire to master an entire culture: new generations benefit from the mere fact that they can retrieve what major thinkers achieved generations back, a vantage point of cumulative knowledge described as “standing on the shoulders of giants.”

“Burke and Wills on the way to Mount Hopeless” (George Washington Lambert, watercolor, 1907)

Despite the efforts of early glossaries and medieval sages such as Isidore of Seville (considered the last scholar of the ancient and early Christian world), and despite Vincent de Beauvais’ Speculum Maius, a two-million-word treatise intended as a compendium of everything worth knowing in the 13th century, specialization further accelerated during the Enlightenment. French encyclopedists built upon the notion of a tree of wisdom that could encapsulate all knowledge, switching from divine sanction to the equally ethereal ideal of pure scientific verification.

Weight of cumulative culture

As literacy rose in centralized civilizations, oral culture and local tradition blended with practical knowledge in written almanacs, eclectic compendiums of applied wisdom, culture, and superstition. Such works became more statistical and descriptive in the modern world but could not fight specialization. However, practical advice remained popular among rural communities relying on the content sold by colporteurs, itinerant peddlers of books and pamphlets.

Learning and cumulative culture set humans apart, explains Carl Zimmer in his book on heredity, She Has Her Mother’s Laugh: The Powers, Perversions, and Potential of Heredity, but they also wire us to learn by customary imitation, sometimes to our disadvantage.

Zimmer dedicates a chapter to an experiment on how children learn, designed by Yale graduate students, in which children and chimpanzees watch an overly elaborate demonstration of how to get fruit from a secured box. Chimpanzees skip any extra step not necessary to reach the goal, whereas children uncritically follow the unnecessary steps previously shown in the demonstration:

“Charlotte’s tapping was not a childish mistake. It was actually a little window through which I could glimpse something profound about human nature. We are well adapted for inheriting culture. We pass genes down through generations, but we also pass down recipes, songs, knowledge, and rituals.”

There is an important reason, explains Zimmer, why we instinctively go along with the inner workings of cumulative culture:

“You can’t use your brain to reinvent thousands of years of technology and customs on your own.

“To say that culture is an important part of our lives doesn’t really do the word justice. Culture is not a part of our life. We are a part of it. Lyons’ experiment helped me see how adapted we humans are to immersing ourselves in culture.”

A day in the life of the Yandruwandha

Zimmer continues with an example highlighted by Harvard anthropologist Joseph Henrich, in which one can clearly see how dangerous it is for humans to try to survive outside of cumulative culture: it happened in 1861 in a remote outback region of deserts, mountains, and swamps in eastern Australia, inhabited by the Yandruwandha.

“Over hundreds of generations, the Yandruwandha built up knowledge about their part of the continent and transmitted it to their children.”

The Yandruwandha came to know every water hole where they could drink and fish, and how to use fire to cook and fight off cold winter nights. Through trial and error whose details we will never know, the Yandruwandha also learned to make bread and porridge from nardoo, a type of fern that grows in the region.

Painting depicting the moment when a group of Yandruwandha discovers Wills’ body: “Natives discovering the body of William John Wills, the explorer, at Coopers Creek, June 1861” (Eugene Montagu Scott, oil on canvas, 1862)

But nardoo is a tricky, clover-like fern: its cell walls are so thick that eating the plant without further preparation provides virtually no nourishment, while a toxic enzyme, thiaminase, can make one sick by destroying the body’s supply of vitamin B1. When we lose thiamine, we can develop extreme fatigue, hypothermia, and muscular atrophy.

“Yet the Yandruwandha could make nardoo a major part of their diet, because they learned how to make it safe to eat. They collected the plant’s seedlike sporocarps in the morning and immediately started roasting them in the embers of a fire. The fire destroyed some of the thiaminase, and the ashes likely eliminated more of it by altering the nardoo’s pH.”

Afterward, it was a matter of using grinding stones and adding water to break down the plant’s cell walls, yielding a nutritious flour.

Mentality and survival

Cumulative culture helped the Yandruwandha survive by mastering the secrets of their harsh surroundings, which contrasts with what happened in 1861, when three European men entered the area leading a sick camel, the remnants of an exploratory party that had pulled ahead of a bigger group which, departing from Melbourne, had set out to find a feasible overland route north through the outback all the way to the Gulf of Carpentaria.

The small advance group (part of what is now known as the Burke and Wills expedition) waited at a halfway point for the rest of the party to catch up. When no one arrived, they decided to continue. Tired, hungry, and disoriented, they struggled through never-ending, treacherous wetlands. When their supplies ran out, they had to get their meals from an environment they knew little about.

The Yandruwandha found them desperate. They let them camp and fed them fish and nardoo bread, with which the Europeans regained some strength. Feeling humiliated by “savage charity,” Burke, one of the group’s members, asserted the superiority of Western culture: intelligence alone should be enough to survive. When the Yandruwandha packed up and left, Burke, King, and Wills tried to catch fish but couldn’t (the Yandruwandha had used nets, which they had probably woven from some of the surrounding plants).

“Without enough fish to eat, the explorers turned to nardoo. They boiled the plants and ate four or five pounds’ worth of the stuff every day. But no matter how much they ate, they kept growing more gaunt. ‘I am weaker than ever although I have a good appetite and relish the nardu much but it seems to give us no nutriment,’ Wills wrote in his journal.”

Soon after, Wills was dead. Then it was Burke’s turn. King wandered alone until he ran into a second group of Yandruwandha. They took him in, and King recovered once again within their cumulative culture.

A month later, a rescue party from Melbourne found him. King told the story to reporters, and the explorers became public heroes:

“Burke and Wills were celebrated with statues, coins, and stamps. Yet their achievement was to have died in a place where others had thrived for thousands of years. The Yandruwandha got no honors for that,” writes Zimmer.

Taking the risk of a first

Zimmer enters the world of cultural evolutionism when he argues that, unlike our closest animal relatives, we rely on teaching and learning:

“Humans seem to have evolved to be especially good students. One of the most important adaptations we’ve evolved —and one of the strangest— is what Derek Lyons was testing by studying Charlotte: extreme imitation.”

There is an obvious cautionary tale derived from this human instinct, though:

“If children end up imitating people who don’t know what they’re doing, they’ll reproduce failure.”

Cumulative culture and extreme imitation were essential to survival in the past, but they also make our species easy prey for the bureaucratic, highly specialized frameworks of today, where those who conform uncritically seem the best fitted.

Burke, King, and Wills trying to find their way out of a never-ending environment they perceived as hostile

At least, they seem the best fitted to repeat old mistakes. We have to know the past so we don’t repeat it, but we also need to understand that some solutions won’t come from cumulative culture.

Before the Yandruwandha benefited from nardoo by imitating their elders, who in turn had imitated their own elders, back into time immemorial, there was at least one reckless individual trying to survive by doing something nobody they knew had done before.