
This year marks the 50th anniversary of the release of Stanley Kubrick’s 2001: A Space Odyssey, a film widely regarded as one of the best and most important in the history of cinema. I’m not here to dispute this judgment – I’m a big fan of the film, and have been since my brother-in-law took me to see a road show screening during its 10th anniversary. But any film with a date embedded in the title can’t help but inspire us to get out our calculators, fire up Google and do some calendar sums.

The technical hurdles involved in making 2001 – a film that director Kubrick intended to be the first really credible science fiction movie – were so considerable that the people in charge of actually putting astronauts into space and onto the moon dubbed the British soundstages where the film was being made “NASA East.” NASA and the U.S. space program had existed since 1958, but the goal of a manned moon landing had only been set in motion by President John F. Kennedy in his famous speech to Congress in May of 1961 – just three years before production on 2001 began.

It took four years for Kubrick and a vast team of technicians to make the film, which came out only a year before Apollo 11 finally landed on the moon in July of 1969. Manned lunar colonies were a major setting of 2001 – mankind’s first stepping stone off Earth and into space – but the last astronauts left the moon in 1972, never to return. With the death of Apollo 17 astronaut Gene Cernan last year, only half of the dozen men who walked on the moon remain alive.

It’s from here that the litany of disappointments with the future, as it deviated from the one imagined in Kubrick’s film, really begins.

2001 imagined regular passenger flights on Pan Am planes to orbiting Hilton hotels. The hotel chain remains, but the legendary airline ceased operations in 1991, ten years before its stewardesses were supposed to be walking the aisles in zero gravity wearing Velcro grip shoes. The closest thing we had to Pan Am’s commuter flights to orbiting space stations was the Space Shuttle program, which flew its last mission in 2011. There is an orbiting space station, but the cramped, modular International Space Station can support a crew of only six – a far cry from the gleaming white, curving corridors of the wheel-shaped Space Station V of Kubrick’s film, with its Hilton and HoJo.

When the calendar actually caught up with the title of Kubrick’s film, 33 years after its release, the world looked little like the sleek one imagined by Kubrick and his team of technical wizards. That year was, in sombre fact, defined by a terrorist attack on the twin towers of the World Trade Center – a complex that had only broken ground halfway through the film’s production, and another vision of the future that didn’t survive the brutal reality of actual history.

We are 17 years beyond the film’s “expiration date,” and the world doesn’t feel very futuristic unless you’re looking at the screen of your smartphone – the only piece of contemporary technology that matches or even exceeds the sort of thing dreamed up by Kubrick and his wizards back when phones had curly cables, TVs flickered in bad weather, a GPS was a map folded up in the glove box and people bought music on vinyl LPs. In some ways the world we live in feels stuck in a loop, spinning back as far as the time of 2001’s filming and picking up detritus to recycle on its way.

We could blame Kubrick and his crew of geniuses for imaginative overreach, but it helps to understand the world in which 2001 was made. Rockets and jets had only exploded into existence at the end of World War II, just 20 years before the first shots of the movie were filmed in a darkroom in New York City. Radio only became a commercial medium after World War I, and colour televisions were owned by just over three per cent of Americans in 1964. The big three networks in the U.S. would only feature all-colour programming in prime time the year before 2001 was released.

I often think of my grandfather, born in a dockside slum in Birkenhead in the middle of Queen Victoria’s reign. He died in 1963, in a tidy house on a suburban street that had been a forest when he was born, in a city on the other side of the ocean. My mother, born in the middle of the Edwardian era, had narrowly avoided being born in the same dockside slum thanks to her parents’ emigration, but the pace of innovation she’d witness in her life only really accelerated with World War I. The first Ford Model T (top speed 45 mph) was sold the year after she was born, but she never learned to drive, though she did end up working for Kodak, the Apple of its day.

Even John F. Kennedy, the man who fired the political starting gun in the race to space, was born only 10 years after my mother. (And the same year as Arthur C. Clarke, the British science fiction writer who would help Kubrick realize the whole story and concept of 2001.) The son of a rich man, he’d live with the benefits of every new technological advance, from modern medicine to air travel. (My mother died in the late ‘80s without ever taking a plane flight.)

Life experience might vary wildly, but even if you didn’t – or couldn’t – avail yourself of every new wonder the modern world was producing, from military experiment to retail service, living through the tumultuous events of the middle decades of the twentieth century meant accepting a certain fantastic pace of innovation as a given. Why wouldn’t you imagine moon bases just 20 years after our first footfall on the moon’s dusty surface, or nuclear-powered spaceships to take clean-cut crews of unsmiling men named Frank and Dave to the moons of Jupiter?

After decades of movie robots played by men in tin suits plodding through space opera sets, Kubrick and Clarke’s greatest innovation was imagining that artificial intelligence wouldn’t be embodied in a humanoid automaton but in a huge mainframe computer whose interface with its human creators was a curiously prissy voice and a single, unblinking red eye. The world of 2001 is, in retrospect, pretty hit-and-miss as a prediction of the future, but its really compelling aspect, after five decades of mostly disappointing future, is its imagining of mankind’s pivotal encounters with alien intelligence.

Inspired by a then-fashionable theory that human evolution might have been sped up by some interference from beyond our galaxy, the film opens with our hairy and (likely) doomed ancestors being given a cognitive kick start when a smooth, black monolith is deposited in their midst, prompting the first tool use, then protein-rich meals of meat, then the first war for space and resources. For people who had witnessed decades of previously unimaginable horror and violence amidst very real technological progress and widely imagined political progress – and who had come to accept Darwin’s radical theories just a few generations earlier – this concept resonated deeply.

A second monolith discovered buried on the moon begins the film’s second act, and propels Dave and Frank to a rendezvous with a forest of monoliths orbiting outside a “stargate” near Jupiter, which sends the film rushing to a trippy, mind-blowing conclusion. It was this final act that appealed to young people – like my brother-in-law – who made 2001 a blockbuster hit. By decade’s end they’d be avidly reading Erich von Daniken’s Chariots of the Gods?, which proposed a whole hidden history of alien interference in human affairs – and which, not coincidentally, was published the year 2001 was released.

Five decades later this all looks very dated and zeitgeisty, but our obsession with alien encounters begins in earnest with 2001, a film that made it acceptable to ponder intelligent life outside our solar system by making the aliens invisible except for their monolithic stand-ins. Kubrick and his crew had tried and failed to come up with a plausible alien to show onscreen, but subsequent filmmakers, benefitting from technology pioneered by 2001, would populate the stars with a whole encyclopedia of visually dazzling aliens, from the spindly Greys of Close Encounters of the Third Kind to the nightmare xenomorphs of the Alien films, which supplant mankind as intergalactic apex predators.

When 2001 was being made, astronomer Carl Sagan’s assertion that the existence of other intelligent life in the universe was a mathematical near-certainty was becoming what we’d now call a meme. (Kubrick and Clarke met with Sagan early in the pre-production of 2001, but the director found the young scientist arrogant and off-putting.) H.G. Wells and The War of the Worlds had primed us to anticipate space aliens, but 2001 let us imagine that their intentions were pure. That optimistic mood would barely last a decade.

What 2001 got right was assuming that announcing the discovery of alien intelligence would cause a global crisis – a sort of existential panic. (This is made blandly explicit during a top-secret briefing held at the Clavius moon base, which paints the depressing picture of a future where meetings around board tables are still a daily human reality.)

Nearly 50 years later, the film Arrival – a rare, truly worthy successor to 2001 – gives us a glimpse of that panic when a dozen alien ships arrive on Earth. People stop showing up for work and society worldwide teeters on the verge of collapse; the imposition of martial law only barely maintains something like order. Everyone, whether religious or secular, struggles to comprehend the implications of a superior species with both godlike powers and inscrutable intentions.

In the real world, reports of UFO sightings have been in decline since their peak in the ‘90s, along with ghost sightings – another 20th-century phenomenon that became mainstream with the trauma of a World War – both apparently victims of the spread of camera phones. Now there’s a growing reaction to Carl Sagan’s “mathematical certainty” of alien life, with books like Lee Billings’ Five Billion Years of Solitude, which depicts the frustrations of those searching for extraterrestrial life and dares to imagine a universe barren of life except for our own lonely, implausible planet.

It’s unlikely that our longing to imagine alien life forms, probably but not inevitably superior to ourselves, will go away. But our optimism about the subject will likely be constantly tempered and eroded by the experience of living through an ever-disappointing future that fails to deliver on our grandest dreams of social and technological progress, at least in the short term.

Which is the problem, after all: We live our lives in the short term, and while our phones or our cars might get visibly better, we are constantly disappointed by that sobering, underwhelming moment when the future becomes history. We may get our moon bases one day, but most people will still prefer visiting the cottage.