The United States is experiencing an extreme teenage mental-health crisis. From 2009 to 2021, the share of American high-school students who say they feel “persistent feelings of sadness or hopelessness” rose from 26 percent to 44 percent, according to a new CDC study. This is the highest level of teenage sadness ever recorded.
The government survey of almost 8,000 high-school students, which was conducted in the first six months of 2021, found a great deal of variation in mental health among different groups. More than one in four girls reported that they had seriously contemplated attempting suicide during the pandemic, twice the rate among boys. Nearly half of LGBTQ teens said they had contemplated suicide during the pandemic, compared with 14 percent of their heterosexual peers. Sadness among white teens seems to be rising faster than among other groups.
But the big picture is the same across all categories: Almost every measure of mental health is getting worse, for every teenage demographic, and it’s happening all across the country. Since 2009, sadness and hopelessness have increased for every race; for straight teens and gay teens; for teens who say they’ve never had sex and for those who say they’ve had sex with males and/or females; for students in each year of high school; and for teens in all 50 states and the District of Columbia.
In March 2021, shortly after Jon Stewart joined Twitter, he tapped the microphone and used his new pulpit to make amends for an infamous act of aggression from his distant past.
“I called Tucker Carlson a dick on National television,” Stewart tweeted. “It’s high time I apologize…to dicks. Never should have lumped you in with that terrible terrible person.”
Stewart originally fired this shot 17 years ago, on October 15, 2004, but if you’re old enough, you surely remember what happened, in part because it was one of the first truly viral political videos of this century. Stewart was a guest on Tucker Carlson’s cacophonous CNN political-argument show, Crossfire, a half-hour nightly migraine of debate-club doublespeak, during which Stewart pleaded with Carlson to “stop hurting America.” “Wait, I thought you were gonna be funny,” Tucker sniffed. “No,” Stewart shot back, “I’m not gonna be your monkey.” Soon enough he was calling Tucker a dick on national television. “You’re as big a dick on your show,” he said, “as you are on any show.”
Tucker Carlson was actually the co-host of Crossfire, along with his left-leaning Clinton-era frenemy Paul Begala, but nobody remembers Begala, and why should they? The whole thing went down in history as Jon Stewart versus Tucker Carlson, with Stewart the champion by first-round knockout. Within months, CNN canceled Crossfire, catapulting Stewart into a position of political influence and superstardom that few comics in America have ever reached. Two weeks after Stewart humiliated Tucker on his own show, President George W. Bush won a narrow reelection over Senator John Kerry, and it would be no overstatement to say that, in the pre-Obama years that followed, the leader of the Democratic resistance was Jon Stewart, and he was holding rallies weeknights at 11 p.m. Eastern on Comedy Central.
A band shirt won't make or break an artist's bottom line, but it sure can help in the dismal economic landscape that is the indie music industry. At the same time, there is too much clothing on our planet: The fashion industry reportedly causes anywhere between 2 and 10 percent of global carbon emissions, and some experts say that the only solution is to reduce consumption and scale back manufacturing. So what's an eco-conscious musician looking to gas up the van by selling a few T-shirts to do? Like the efforts to make vinyl more sustainable, environmentally friendly merch isn't going to do much to reverse the ravages of global warming, but it couldn't hurt either. Producing Earth-friendly merch, however, is more expensive. For many acts barely getting by, choosing between manufacturers can mean the difference between a profitable tour and a not-so-profitable one.
Will your favorite band T-shirt outlive you? The answer, like every aspect of sustainable fashion, is complicated. (In conversations about ethical consumption, “sustainable” often becomes a vague buzzword; here, we mean clothing that has a minimal impact on the environment and is manufactured using ethical labor practices.) A majority of band shirts are made from 100 percent cotton, a naturally occurring and therefore technically biodegradable material, if left untreated. But the production of cotton requires massive amounts of water and land, rendering it environmentally unsustainable. Slightly better is organic cotton, which has a smaller environmental footprint since it is produced without synthetic chemicals. Ultimately, recycled cotton, which is derived from post-consumer or post-industrial waste, is considered to be the most sustainable, but it comes at a higher price point. Then there’s the ink that turns a boring tee into a statement of fan loyalty: water-based or plastic-based. Both have different environmental footprints, and while water-based might seem like the cleaner option, it can still cause environmental damage if handled and disposed of improperly.
It’s any given weekend in the early ’80s, and my mother is prepping. She pulls the air popper out of the woodgrain kitchen cabinets, puts it on the white tile counter, and measures out a quarter-cup of kernels. The machine whirs and popcorn jitters down the chute into a stainless-steel bowl, some spilling onto the counters and floor. She portions the popped corn into three standard-issue brown paper lunch bags — one for her, one for me, one for my younger sister — neatly folds their tops, and secures each one with a single central staple.
She loads a canvas tote bag with three cans of juice, bendy straws wrapped in flimsy white paper napkins, and the bagged popcorn before announcing that it’s time to go. She doesn’t bother to conceal the tote’s contents; should anyone ask to look inside, the cat — and the popcorn — will already be out of the bag.
We pile into the car to commit our first petty crime of the day. We’re going to the movies.
My mother refused to participate in what seemed like a trap: you pay to enter a space for one experience (a movie) and are encouraged, perhaps even compelled, to incur an even greater cost for another (eating and drinking). Buying concessions — or sneaking them in — is now fully wedded to the act of moviegoing. It’s a presumed part of the experience despite the fact that it’s a distraction from the reason we’re in a theater in the first place. We fully associate the sensory experiences of smell and taste with the sights and sounds of watching movies, even though we sacrifice a degree of appreciation for each activity for the sake of the combined pleasure.
This was not inevitable. Buying a movie ticket wasn’t always an upsell opportunity, a chance to exploit decadent impulses in the permissive darkness of the theater. You might say that the movie industry, over the course of almost a century, exerted its influence over our minds, bodies, and pocketbooks to convince us that watching and eating were twinned activities. You might also say that we, the watchers and eaters, were complicit in our gradual descent into the snack bar rabbit hole.
The push-pull between making moviegoing about seeing movies — projected on a big screen, in a comfortable or even luxurious setting, in the midst of others with whom we laugh, scream, and cry — and making moviegoing about nachos and super-sized sodas may have reached its breaking point. As streaming services keep more people at home than ever before and when going to the movies remains tinged with the risk of contagion (and seems likely to be for the foreseeable future), what will happen to the seemingly indestructible marriage of movies and snacks consumed by groups of people in a public space? Will movie theaters survive, with or without concessions? And if we lose the ritual of moviegoing, what else will we lose?
My AC went out in the middle of July in Loveland, Ohio. Luckily, the local Meineke was having a slow day.
I initially assumed my accumulator was leaking—a quick fix—but the mechanic unraveled a string of successive problems, each concealing the next. Behind a scrapped compressor was a broken high-pressure cut-off switch, and then a faulty pigtail. Each part took hours to arrive, and its installation only revealed the next thing we’d have to wait for.
Outside the Meineke lobby window, four lanes of highway bisected a familiar sprawl: Menards, Office Depot, Dunkin’ Donuts, O’Reilly Auto Parts, Chase Bank. Suburban Cincinnati’s commercial jungle would never let on that the land beneath it had once supported an anarchist commune.
Orson S. Murray founded Fruit Hills in 1845, near present-day Loveland, inspired by his personally held principles of atheism, socialist feminism, and economic cooperation. Murray hailed from the radical abolitionist movement, writing in The Struggle of the Hour that slavery “makes men into brutes, driving and being driven, crushing and being crushed.” He railed against church, state, and property as “a trio of monsters” in his newspaper, The Regenerator, and cofounded a group called the Society for Universal Inquiry and Reform. Fruit Hills was one of several efforts by Universal Reformers to translate theory into a practical utopia on the rural American frontier.
Murray once wrote that “Bibles and Constitutions are only the necessities of ignorance—things to be changed—to be outgrown and displaced by better things.” Change seems to have gotten the best of Fruit Hills, however; the commune collapsed within seven years. “All the necessaries of life could be raised in abundance,” wrote one contemporary observer, “but the laborers were mostly unused to agriculture and in many instances lacked industry.” From the vantage of the Meineke lobby, no definition of success seemed generous enough to encompass the project’s fate.
This story is fairly typical. Inland America is pocked with the unmarked graves of communitarian utopias—primitive socialist and communist experiments—that tried to rebuild the world on what was assumed to be virgin soil. Ephrata, Pennsylvania; Germantown, Tennessee; Utopia, Ohio; Brentwood, New York; Iowa’s Amana Colonies: these and many other towns were originally settled by communalists with lofty visions of abolishing private property, quashing material inequity, and transcending divisive individualism.
It makes sense that those seeking the fringe of a New World might be driven by powerful ideological convictions. But while European settlers dreamed of abolishing old hierarchies in the map’s blank spots, these blanks were always a fantasy. The allure of self-directed freedom in unsullied lands largely folded back into a vanguard of dispossession and genocide, with naïve radicals paving the way for the extension of the very structures they had hoped to escape. Their intentions complicate the mythic image of a land settled by rugged individualists, but their ultimate fates suggest bleak prospects for liberation conceived as escape, rather than transformational conflict.
With my AC in working order, I pressed on from Loveland to survey what remained of the utopians’ dreams.