Rick McGinnis:
I’m sure it’s not just because I have COVID as I write this, but I’ve found it hard to escape the sensation that modern life – society, culture, whatever you want to call it – is a lot less fun than it used to be, even if the circumstances of the last two years aren’t taken into account. “Fun” is, of course, a vague sort of word, and a claim like this is the essence of subjectivity, but I don’t think I’m the only person who surveys the world and wonders what’s gone missing.
There’s aging, of course – the slow but steady loss of friends and family, and the certainty that more will be gone, which pulls you up with a start when you realize that a chance encounter on the street or a hurried coffee was the last time you’d see someone. I’m particularly feeling the loss of my friend Kathy Shaidle, writer and contrarian; the last year and a half have been rich in incidents where a wry comment from her would have made the rankest newsfeed stupidity yield a laugh.
There’s also a disengagement from the currents of pop culture, which, mostly because they seem dismal when they aren’t merely revivals or retreads, look more stagnant than flowing. I’m sure this is part of aging as well – writing about “cultural pessimism” in 1998 for the Cato Institute, Tyler Cowen noted that the mind is most engaged with culture between 15 and 25. “Therefore, in the eyes of many individuals, culture appears to be drying up and declining, which creates yet further support for pessimism.”
From the perspective of 1998, Cowen thought the culture was in fine shape, at least at retail level: “A small Tower Records will offer at least 10,000 classical music titles, and the largest Tower branch in Manhattan has over 22,000 titles…Movies, including many silents, can be rented on videocassette very cheaply, or on laser discs for those who want higher quality picture and sound. Modern video stores, run on a private for-profit basis, are libraries full of classic works…The number of bookstores has jumped nearly 10-fold, and their average size has increased dramatically. Book superstores are now commonplace.”
Nearly 25 years later, Tower Records, the video rental shop and the bookstore – super or not – are gone from the streets. Their wares may thrive online, but fellowship has gone missing from culture – an intangible but crucial element, lost when we consume unprecedented abundance with unimagined convenience in discrete silos or turrets.
Late during the first year of lockdowns the Spectator US published “From Cool to Cringe: what’s happened to American Culture?” – an article inspired by the insipid, hapless, tone-deaf, wince-worthy lowlights that seem to punctuate the news cycle almost daily. If writer Will Lloyd had waited until the end of lockdowns to write it, he’d have led with Chris Rock, Will Smith and the Oscars slap; back in 2020 it was Gal Gadot inviting other A-list celebrities to trade off verses of John Lennon’s “Imagine,” singing into their cellphones and giving viewers accidental glimpses of the huge homes and green spaces where they were lucky enough to ride out stay-at-home orders.
How it backfired spectacularly is a memory we’ve all quietly buried, like most of that year; that it was a product of the supposed “best and brightest” of the cultural establishment of America was what set off Lloyd, whose jeremiad got a lot of traction at the time. He wanted to know how it had come to this. “Yes, the United States still gives off a massive light. But it’s not the hopeful shine of a beacon on a hill. It’s the flickering glare of a dumpster fire.”
Gradually, over the last half century, the heroes and icons, real and fictional, that America chose to speak for its ideals and ambitions and to deputize in its public battles had gone from Miles Davis, Steve McQueen, Susan Sontag, Sidney Poitier, John Wayne, Lenny Bruce, and Malcolm X to Cardi B, Will Ferrell, Beto O’Rourke, DJ Khaled, Mark Ruffalo, Hannah Gadsby, and Barack Obama.
Comic books were for kids and overgrown boys until they underwent an artistic renaissance in the ‘80s and ‘90s with titles like Maus and Ghost World, but that wasn’t what made them go from B-pictures and self-mocking camp to platform-spanning powerhouses with the new century. It was Iron Man and the Marvel Cinematic Universe. “Beset by feelings of powerlessness,” Lloyd wrote, “Americans invested their politicians with the attributes of the movie Übermenschen they worshipped. It was a bad sign.”
As a screed, it captured the frustration and demoralization many of us felt heading into the first anniversary of lockdown, and it pointed fingers at what was becoming a common enemy: “It was said that the American people were drifting apart from each other. This was wrong. The networks, Facebook and Twitter, were built to surface fresh examples of cringeworthy behavior, consumed through communal derision. They were algorithmically optimized for peer-to-peer surveillance, scapegoating and sacrificial rituals. In fact, all the psychotic grouplets of American life studied each other incestuously, searching for their enemies’ blunders and fails. And they called each other cringe.”
With lockdowns ending, vaccine passports suspended and mask mandates easing, you’d think the mood would be lightening, but last month The Atlantic published “After Babel: Why the Past 10 Years of American Life Have Been Uniquely Stupid,” an essay by social psychologist Jonathan Haidt that is to Lloyd’s screed what a bunker buster is to a knock-knock joke.
Haidt asserts that the founders of the American republic understood that the health of our political life (and by implication our social and cultural health) rested “in mechanisms to slow things down, cool passions, require compromise and give leaders some insulation from the mania of the moment while holding them accountable to the people periodically, on Election Day.”
This was rooted in an understanding of human nature – what conservative writers once called mankind’s tragic flaw – expressed by James Madison when he observed that even the freest citizens are “much more disposed to vex and oppress each other than to cooperate for their common good.”
“Where no substantial occasion presents itself,” Madison wrote, “the most frivolous and fanciful distinctions have been sufficient to kindle their unfriendly passions and excite their most violent conflicts.”
Social media, Haidt says, was destined to break down this system of restraint and reflection even before it became algorithmically designed to do precisely that. He quotes former CIA analyst Martin Gurri, who wrote about social networks in his 2014 book The Revolt of the Public and observed that they “can protest and overthrow, but never govern.”
We once consumed a shared, mass culture, whose products we understood even if we didn’t enjoy them, and that provided a sense of commonality. “The digital revolution has shattered that mirror,” Gurri wrote, “and now the public inhabits those broken pieces of glass. So, the public isn’t one thing; it’s highly fragmented, and it’s basically mutually hostile. It’s mostly people yelling at each other and living in bubbles of one sort or another.”
Twitter in particular gets singled out among the social media platforms, for a variety of reasons. First, tweets are short, their ideas simplified by necessity, often to little more than a verbalized shrug or sneer. Second, it has been the preferred social medium of journalists since its inception. Worst of all, its rise coincided with a period of severe decline and contraction in traditional journalism, producing journalists who tweeted more than they wrote (or only tweeted after they lost their jobs) and, ultimately, news stories composed largely of reprinted tweets.
Haidt writes that “people on social media platforms are highly concerned with gaining status and are willing to use aggression to do so.” This might not have been the destiny of platforms like Twitter, but it became their inevitable fate once their tone was set by journalists of all sorts – employed, unemployed and aspiring, pundits, influencers, wonks, self-promoting bloggers, self-appointed spokespeople – and the flacks, handlers, spin doctors, trolls, and political operatives who crowded in their wake.
The downfall – the collapse of the metaphorical tower in Haidt’s title – began with two small innovations: the “Like” button on Facebook, and then the “Retweet” button on Twitter, which Facebook copied with its “Share” button. Gathering data about what was liked and shared led to algorithms designed to increase engagement by feeding users posts and tweets that would encourage more liking and sharing. What the designers at Facebook and Twitter didn’t understand – but James Madison and the other constitutional authors did – was that people are engaged as much by anger as by pleasure, if not more.
Much of Haidt’s essay details just how bad things are, with a final dismal summation: “the norms, institutions, and forms of political participation developed during the long era of mass communication are not going to work well now that technology has made everything so much faster and more multidirectional, and when bypassing professional gatekeepers is so easy.”
He does provide a small number of solutions, but, unfortunately, they don’t seem either realistic or original. “Harden Democratic Institutions” sounds like something any candidate in opposition says until they gain power, after which the shoddy machinery that got them there is judged sufficient. “Reform Social Media” is something everyone says when they or someone they like is smeared or canceled, but the autonomy social media platforms enjoy as private corporations, and their usefulness in making street-level politics more effective (read: lethal), mean they’ll remain major players until the day we all, for whatever reason, tire of their background din.
“Prepare the Next Generation” is the windiest of Haidt’s prescriptions, mostly because it’s basically a rewrite of the old hand-wringing trope “What about the children?” I feel sorry for young people who don’t remember a world where information moved more slowly, but I’m old enough to remember times when, even though you had to wait for a morning newspaper or the evening news, people still managed to be at each other’s throats, though they had to do it in person.
I have no way of knowing if my children are having as much fun as I did when I was at my youngest and most carefree. (Whenever that was.) That’s because I’m more concerned with knowing whether they’re anywhere near as anxious and depressed as I was at the same age, over things that seem meaningless now but were of paramount importance then. Haidt notes that rates of anxiety and depression among young people have been surging since the early 2010s – a timeline that correlates with the rise of social media use.
Mostly I hope that my kids know something about the world we’re living in that I’m missing, and that they’ll find a way to deal with social media the way I dealt with the rise of the 24/7 cable news cycle – in my case by not owning a TV for most of the Reagan, Bush, and Clinton presidencies. Which is exactly the sort of thing you’d expect to hear from someone who still doesn’t “get” TikTok.
Haidt concludes that, in any case, “there is little evidence to suggest that America will return to some semblance of normalcy and stability in the next five or 10 years.” His hope is that one side will blink – but which one, and who decides which side you’re on? Perhaps we’re all waiting for the day when we notice that we’re all having a lot less fun, and that we’re all to blame for putting everyone else in a permanent bad mood.