Sarah Stilton
Review:
Nexus: A Brief History of Information Networks from the Stone Age to AI by Yuval Noah Harari (Signal, $45, 492 pages)
Yuval Noah Harari is something of a rock-star public intellectual who burst onto the scene with his 2014 book Sapiens: A Brief History of Humankind, originally published in Hebrew in 2011. Previously a military historian, the Oxford-trained Harari turned his introduction to world history course at the Hebrew University of Jerusalem into a book surveying the grand sweep of human history, from the first humans to today, in under 500 pages, arguing that mankind is essentially a storytelling species. In Sapiens he established his modus operandi: overwhelm the reader with details and engaging storytelling to distract from the flimsiness of the theorizing. Is it really plausible that narrative alone was responsible for collective achievements such as agriculture and democracy? He throws history, philosophy, psychology, political science, and natural science at readers with reckless abandon. He denies, for example, that anyone becomes happy because of winning the lottery or finding romantic love; rather, a person experiences joy because of “various hormones coursing through her bloodstream” and “the storm of electric signals flashing between different parts of her brain.” The reader may wonder whether the lottery win or the romance was the impetus for those physiological determinants of emotion.
The success of Sapiens led to Harari being fêted by Bill Gates and Barack Obama, to more bestselling books, and to his becoming something of an industry unto himself, with more than a dozen employees assisting him with research and event bookings. His follow-up, Homo Deus: A Brief History of Tomorrow, endorsed a soft transhumanism melding man and machine. His books have been translated into several dozen languages and adapted into children’s versions.
His latest work of history-cum-punditry is Nexus: A Brief History of Information Networks from the Stone Age to AI. The book surveys the history of human communication, from face-to-face discussion and cave wall scrawls to text messages. Each new technology allowed humanity to progress: hunter-gatherers were limited by face-to-face communication, but the arrival of documentation (tablets used to tally harvests) led to bureaucracy and thus to government. The printing press and radio facilitated the growth of both democracy and dictatorship.
These networks of information, the author argues, are less a truth-seeking enterprise than a storytelling mechanism. But is the tallying of the grain harvest that led to centralized governments really just a story, or is it hard data that gets at the truth about a society’s capabilities and possibilities? Harari explains what clay tablets and iPhones have in common and what differentiates them, though his focus is on the essential continuity of forms of communication over time, at least until now. Artificial intelligence, which Harari calls “alien” intelligence, could change human consciousness, and unlike the transhumanist cheerleaders of Silicon Valley, Harari thinks it will likely be a negative development, indeed a catastrophically negative one.
Harari seeks to refute the “naïve view of information,” the belief that more information inevitably brings us closer to truth. On that view, truth, if it does not set humanity free, should at least lead to greater wisdom and more power. For Harari, however, truth and order are often at odds, if not inversely related. Democracy, he claims, is good at allowing information to flow freely, which serves the pursuit of truth (a point that sits awkwardly with his view that much of the information being shared is false), but lousy at order; dictatorships are effective at imposing order but do so by limiting the free flow of information, which ultimately destroys truth.
Information need not be true. Misinformation (an oft-abused term in the 2020s) and genuine conspiracy theories that marshal selected data points to further a narrative are powerful. They can bring down regimes, prop them up, and incite genocidal attacks. Harari recounts how Heinrich Kramer’s tract The Hammer of the Witches, published in 1486, after Gutenberg invented the printing press, fueled the murderous European witch hunts. The tract was so popular that it went through eight editions in less than two decades and another 20 over the next two centuries. The lurid stories of witches made for an early bestseller, proving that “the dissemination of outrage and sensationalism at the expense of truth” is a problem that predates the internet. Copernicus’s On the Revolutions of the Heavenly Spheres, by contrast, a foundational text of science, did not sell out its initial print run of 400 copies.
AI is something entirely new because it could take on a life of its own, uncontrolled by human overseers. Algorithms could “exploit with superhuman efficiency the weaknesses, biases, and addictions of the human mind,” effectively turning mankind into easily manipulated putty in the hands of machines. Harari predicts humanity will be powerless to confront AI, extrapolating every possible negative consequence without considering the positive outcomes AI might deliver or the countermeasures and mitigations that might be developed. The likely outcome, in his telling, is the eradication of humanity from the face of the Earth.
Sometimes Harari’s writing can seem scattershot. In a section on “The Alignment Problem,” in a chapter entitled “Fallible: The Network is Often Wrong,” he begins one page praising the internet for “connecting me with my husband, whom I met on one of the first LGBTQ social media platforms back in 2002,” before veering into Carl von Clausewitz’s theory of war. The lack of information, whether it keeps a gay man in rural Israel from meeting a partner or leaves an army blind to its enemy’s tactics, is a recurring problem, but the point could have been made far more efficiently than Harari makes it.
In Homo Deus, Harari made bold pronouncements about nationalism becoming a relic of the past, about “real peace and not just the absence of war,” and about how “the era when humankind stood helpless before natural epidemics is probably over.” That was written in 2015, and none of those assertions has aged well. Such a track record should have produced greater humility in predicting where AI might lead.
Perhaps because he is doomsaying, Harari offers few suggestions of his own in Nexus for how to avoid the AI apocalypse. He wisely calls for banning bots that pretend to be human, vaguely recommends a rejuvenation of “institutional trust,” and feebly urges social media platforms to do a better job of mediating truth, forgetting the censorship project those platforms carried out during COVID.
Harari may impress readers with the information he packs into Nexus, and the story he tells is engaging enough, but he is likely to turn off his audience with his authoritarian inclinations. He insists that “democracies can regulate the information market” because “their very survival depends” upon their doing so, but his censorship regime presumes an elite leadership in possession of truth with a capital T, something he denies exists. He seems frightened by the prospect of free expression, warning that problems “created by information” can be “made worse by more information.” He worries that “free conversation” could “slip into anarchy.” He frets about ordinary people ignoring traditional gatekeepers such as the corporate media and experts (experts like Harari) without addressing why those institutions have lost the public’s trust.
At this point in his career, Harari is a brand. His Sapienship organization is a purveyor of platitudinous solutions to real-world problems. His books, including Nexus, trade in the same stock of Sapienship platitudes (“slow down,” “learn to distinguish reality from illusion”); they may impress the unimpressive, but for readers with any grounding in history, science, or philosophy, they are neither particularly insightful nor provocative.