Cheers and mazel tov! We’ve made it halfway through January. Yes, our bodies took a pounding through the festive frivolities, but we somehow survived the excruciating cumulative hangover. Our recycling bins have been collected, those bottles of bubbly out of sight and out of mind. New-year-new-me resolutions can now be abandoned. Anyone fancy a pint?
Or this year, does another round feel less appealing? You’re far from alone if, in 2023, you’re considering calling time once and for all. Welcome to the era of the sober-curious: the apparently ever-growing movement of people exploring what life could look like alcohol-free. Among young Brits, the numbers look irrefutable: between 2002 and 2019, the proportion of 16- to 24-year-olds in England who reported monthly drinking fell from 67% to 41%. And while the stats don’t show older adults putting down the plonk on a permanent basis, something is shifting. According to Dry January’s organisers, this year one in six UK adults who drink alcohol is attempting to participate. Alcohol-free beers were once a fringe choice; today they’re found nationwide on supermarket shelves. No longer do 0% orders come with a side of pregnancy questions or bemused stares.
Until recently, I’d assumed my millennial peers to be distinct from this new generation of abstainers – that this was firmly the preserve of Gen Z. But recently, I’ve noticed a change. Now there’s a steady stream of posts appearing on my social media feeds in which friends – in their late 20s or early 30s – announce that they are embarking on sobriety journeys of their own.
At the start of the 22nd century, humanity left Earth for the stars. The enormous ecological and climatic devastation that had characterised the previous 100 years had left a world barren and inhospitable; we had used up Earth entirely. Rapid melting of ice caused the seas to rise, swallowing cities whole. Deforestation laid waste to woodlands around the globe, causing widespread destruction and loss of life. All the while, we continued to burn the fossil fuels we knew to be poisoning us, and so created a world no longer fit for our survival. And so we set our sights beyond Earth’s horizons to a new world, a place to begin again on a planet as yet untouched. But where are we going? What are our chances of finding the elusive planet B, an Earth-like world ready and waiting to welcome and shelter humanity from the chaos we created on the planet that brought us into being? We built powerful astronomical telescopes to search the skies for planets resembling our own, and very quickly found hundreds of Earth twins orbiting distant stars. Our home was not so unique after all. The universe is full of Earths!
This futuristic dream-like scenario is being sold to us as a real scientific possibility, with billionaires planning to move humanity to Mars in the near future. For decades, children have grown up with the daring movie adventures of intergalactic explorers and the untold habitable worlds they find. Many of the highest-grossing films are set on fictional planets, with paid advisors keeping the science ‘realistic’. At the same time, narratives of humans trying to survive on a post-apocalyptic Earth have also become mainstream.
Given all our technological advances, it’s tempting to believe we are approaching an age of interplanetary colonisation. But can we really leave Earth and all our worries behind? No. All these stories are missing what makes a planet habitable to us. What Earth-like means in astronomy textbooks and what it means to someone considering their survival prospects on a distant world are two vastly different things. We don’t just need a planet roughly the same size and temperature as Earth; we need a planet that spent billions of years evolving with us. We depend completely on the billions of other living organisms that make up Earth’s biosphere. Without them, we cannot survive. Astronomical observations and Earth’s geological record are clear: the only planet that can support us is the one we evolved with. There is no plan B. There is no planet B. Our future is here, and it doesn’t have to mean we’re doomed.
Student chefs kneed in the crotch. Kitchen staff sexually assaulted. Cooks put in trash cans as punishment. To quote famed 19th-century French chef Marie-Antoine Carême on the working conditions of fine dining establishments, “It is the burning charcoal that kills us. Does it matter? The shorter the life, the greater the glory.”
This is the picture of the great chef that’s endured in the popular imagination: actively destroying himself in pursuit of the best possible food. If he’s cruel to his staff, it’s only because he cares—and in any event, they probably care just as much, absorbing his cruelty with gusto. For those of us who want to eat at extraordinary restaurants without feeling guilty, it’s crucial to believe that this brutality is an unfortunate but unavoidable byproduct of the search for culinary perfection.
On Monday, Noma, a Copenhagen restaurant with three Michelin stars and five Best Restaurant designations from the World’s 50 Best Restaurants, announced that it was closing for good. The restaurant’s creator, chef René Redzepi, cited the financially and emotionally unsustainable nature of cooking at such a high level. He’d been reckoning with that unsustainability since at least 2015, when an essay for Lucky Peach saw him publicly exorcising his own kitchen demons. “Maybe the old way has worked so far,” he wrote, attempting to redeem his past as a kitchen screamer. “But in the long run, it burns people out.”
That theory has been borne out all the more now that the pandemic has worsened the existing pains of cooking professionally and chefs have begun ditching fine dining for more casual, lower-stakes ventures. Noma alum Kim Mikkola now runs a Helsinki chain of fried-chicken shops called KotKot. D.C.’s Michelin-starred Komi ditched its tasting menu and converted its operations to a takeout joint called Happy Gyro. As fine dining restaurant 63 Clinton’s chef-owner Sam Clonts tells me, “The level of intensity is lower in casual places. Chefs heading for more casual spots is a response to the market, but also to what’s needed in dining right now.”
In April 2022, Elon Musk acquired a 9.2 percent stake in Twitter, making him the company’s largest shareholder, and was offered a seat on the board. Luke Simon, a senior engineering director at Twitter, was ecstatic. “Elon Musk is a brilliant engineer and scientist, and he has a track record of having a Midas touch, when it comes to growing the companies he’s helped lead,” he wrote in Slack.
Twitter had been defined by the catatonic leadership of Jack Dorsey, a co-founder who simultaneously served as CEO of the payments business Block (formerly Square). Dorsey, who was known for going on long meditation retreats, fasting 22 hours a day, and walking five miles to the office, acted as an absentee landlord, leaving Twitter’s strategy and daily operations to a handful of trusted deputies. When he spoke about Twitter, it was often as if someone else were running the company. To Simon and those like him, it was hard to see Twitter as anything other than wasted potential.
In its early days, when Twitter was at its most Twittery, circa 2012, executives called the company “the free-speech wing of the free-speech party.” That was the era when the platform was credited for amplifying the Occupy Wall Street movement and the Arab Spring, when it seemed like giving everyone a microphone might actually bring down dictatorships and right the wrongs of neoliberal capitalism. That moment, which coincided with the rise of Facebook and YouTube, inspired utopian visions of how social networks could promote democracy and human rights around the world.
Twitter rode this momentum to become one of the most important companies in tech: an all-consuming obsession for those working or merely interested in politics, sports, and journalism around the world. Frequently, the platform set the news agenda and transformed nobodies into Main Characters. What it lacked in profits it more than made up for in influence.
I was nineteen, maybe twenty, when I realized I was empty-headed. I was in a college English class, and we were in a sunny seminar room, discussing “For Whom the Bell Tolls,” or possibly “The Waves.” I raised my hand to say something and suddenly realized that I had no idea what I planned to say. For a moment, I panicked. Then the teacher called on me, I opened my mouth, and words emerged. Where had they come from? Evidently, I’d had a thought—that was why I’d raised my hand. But I hadn’t known what the thought would be until I spoke it. How weird was that?
Later, describing the moment to a friend, I recalled how, when I was a kid, my mother had often asked my father, “What are you thinking?” He’d shrug and say, “Nothing”—a response that irritated her to no end. (“How can he be thinking about nothing?” she’d ask me.) I’ve always been on Team Dad; I spend a lot of time thoughtless, just living life. At the same time, whenever I speak, ideas condense out of the mental cloud. It was happening even then, as I talked with my friend: I was articulating thoughts that had been unspecified yet present in my mind.
My head isn’t entirely word-free; like many people, I occasionally talk to myself in an inner monologue. (Remember the milk! Ten more reps!) On the whole, though, silence reigns. Blankness, too: I see hardly any visual images, rarely picturing things, people, or places. Thinking happens as a kind of pressure behind my eyes, but I need to talk out loud in order to complete most of my thoughts. My wife, consequently, is the other half of my brain. If no interlocutor is available, I write. When that fails, I pace my empty house, muttering. I sometimes go for a swim just to talk to myself far from shore, where no one can hear me. My minimalist mental theatre has shaped my life. I’m an inveterate talker, a professional writer, and a lifelong photographer—a heady person who’s determined to get things out of my head, to a place where I can apprehend them.
I’m scarcely alone in having a mental “style,” or believing I do. Ask someone how she thinks and you might learn that she talks to herself silently, or cogitates visually, or moves through mental space by traversing physical space. I have a friend who thinks during yoga, and another who browses and compares mental photographs. I know a scientist who plays interior Tetris, rearranging proteins in his dreams. My wife often wears a familiar faraway look; when I see it, I know that she’s rehearsing a complex drama in her head, running all the lines. She sometimes pronounces an entire sentence silently before speaking it out loud.