When Leonardo DiCaprio’s relationship with model/actress Camila Morrone ended three months after she celebrated her 25th birthday, the lifestyle site YourTango turned to neuroscience. DiCaprio has a well-documented history of dating women under 25. (His current flame, who is 27, is a rare exception.) “Given that DiCaprio’s cut-off point is exactly around the time that neuroscientists say our brains are finished developing, there is certainly a case to be made that a desire to date younger partners comes from a desire to have control,” the article said. It quotes a couples therapist, who says that at 25, people’s “brains are fully formed and that presents a more elevated and conscious level of connection”—the type of connection, YourTango suggests, that DiCaprio wants to avoid.
YourTango was parroting a factoid that has taken a chokehold on pop science in the past decade: that 25 marks the age at which our brains become “fully developed” or “mature.” This assertion has been used as an explanation for a vast range of phenomena. After 25, it’s harder to learn, a Fast Company piece claimed. Because “the risk management and long-term planning abilities of the human brain do not kick into high gear” until 25, an op-ed in Mint argued, people shouldn’t get married before then. In early 2020, Slate’s sex columnists Jessica Stoya and Rich Juzwiak fielded a reader question about the ethics of having sex with people under 25. “I am told, at least once every couple weeks, that if you’re under 25, you’re incapable of consent because your ‘frontal lobes are still developing,’ ” the distressed reader wrote.
Earlier this fall, while riding the subway, I overheard two friends doing some reconnaissance ahead of a party. They were young and cool—intimidatingly so, dressed in the requisite New York all black, with a dash of Y2K revival—and trying to figure out how to find a mutual acquaintance online.
“Does she have Instagram?” one asked, before adding with a laugh: “Does anybody?”
“I don’t even have it on my phone anymore,” the other confessed.
Even just a couple of years ago, it would have been unheard-of for these 20-something New Yorkers to shrug off Instagram—a sanctimonious lifestyle choice people would have regretted starting a conversation about at that party they were headed to. But now it’s not so surprising at all. To scroll through Instagram today is to parse a series of sponsored posts from brands, recommended Reels from people you don’t follow, and the occasional picture from a friend that’s finally surfaced after being posted several days ago. It’s not what it used to be.
“Gen Z’s relationship with Instagram is much like millennials’ relationship with Facebook: Begrudgingly necessary,” Casey Lewis, a youth-culture consultant who writes the newsletter After School, told me over email. “They don’t want to be on it, but they feel it’s weird if they’re not.” In fact, a recent Piper Sandler survey found that, of 14,500 teens surveyed across 47 states, only 20 percent named Instagram their favorite social-media platform (TikTok came first, followed by Snapchat).
Simply being on Instagram is a very different thing from actively engaging with it. Participating means throwing pictures into a void, which is why it’s become kind of cringe. To do so earnestly suggests a blithe unawareness of your surroundings, like shouting into the phone in public.
In other words, Instagram is giving us the ick: that feeling when a romantic partner or crush does something small but noticeable—like wearing a fedora—that immediately turns you off forever.
“People who aren’t influencers only use [Instagram] to watch other people make big announcements,” Lee Tilghman, a former full-time Instagram influencer, told me over the phone. “My close friends who aren’t influencers, they haven’t posted in, like, two years.”
As is always the case, the ick came about quite suddenly—things were going great for Instagram, until they just weren’t. In 2014, the app hit 300 million monthly active users, surpassing Twitter for the first time. The Instagram Stories feature, a direct rip-off of Snapchat, was introduced in August 2016 and outpaced the original just one year later. But although Instagram now has 2 billion monthly users, it faces an existential problem: What happens when the 18-to-29-year-olds who are most likely to use the app, at least in America, age out or go elsewhere? Last year, The New York Times reported that Instagram was privately worried about attracting and retaining the new young users who would sustain its long-term growth—and whose growing shopping potential is catnip to advertisers. TikTok is already more popular among young American teens. Plus, a series of algorithm changes—and some questionable attempts to copy features from other apps—have disenchanted many of the users who are sticking around.
In the weeks since Sam Bankman-Fried’s cryptocurrency empire was revealed to be a house of lies, mainstream news organizations and commentators have often failed to give their readers a straightforward assessment of exactly what happened. August institutions including the New York Times and Wall Street Journal have uncovered many key facts about the scandal, but they have also repeatedly seemed to downplay the facts in ways that soft-pedaled Bankman-Fried’s intent and culpability.
It is now clear that what happened at the FTX crypto exchange and the hedge fund Alameda Research involved a variety of conscious and intentional fraud intended to steal money from both users and investors. That’s why a recent New York Times interview was widely derided for seeming to frame FTX’s collapse as the result of mismanagement rather than malfeasance. A Wall Street Journal article bemoaned the loss of charitable donations from FTX, arguably propping up Bankman-Fried’s strategic philanthropic pose. Vox co-founder Matthew Yglesias, court chronicler of the neoliberal status quo, seemed to whitewash his own entanglements by crediting Bankman-Fried’s money with helping Democrats in the 2020 elections – sidestepping the likelihood that the money was effectively embezzled.
Perhaps most perniciously, many outlets have described what happened to FTX as a “bank run” or a “run on deposits,” while Bankman-Fried has repeatedly insisted the company was simply overleveraged and disorganized. Both of these attempts to frame the fallout obfuscate the core issue: the misuse of customer funds.
Banks can be hit by “bank runs” because they are explicitly in the business of lending customer funds out to generate returns. They can experience a short-term cash crunch if everyone withdraws at the same time, without there being any long-term problem.
But FTX and other crypto exchanges are not banks. They do not (or should not) do bank-style lending, so even a very acute surge of withdrawals should not create a liquidity strain. FTX had specifically promised customers it would never lend out or otherwise use the crypto they entrusted to the exchange.
For refrigerators across America, the passing of Thanksgiving promises a major purge. The good stuff is the first to go: the mashed potatoes, the buttery remains of stuffing, breakfast-worthy cold pie. But what’s that in the distance, huddled gloomily behind the leftovers? There lie the marginalized relics of pre-Thanksgiving grocery runs. Heavy cream, a few days past its sell-by date. A desolate bag of spinach whose label says it went bad on Sunday. Bread so hard you wonder if it’s from last Thanksgiving.
The alimentarily unthinking, myself included, tend to move right past expiration dates. Last week, I considered the contents of a petite container in the bowels of my fridge that had transcended its best-by date by six weeks. Did I dare to eat a peach yogurt? I sure did, and it was great. In most households, old items don’t stand a chance. It makes sense for people to be wary of expired food, which can occasionally be vile and incite a frenzied dash to the toilet, but food scientists have been telling us for years—if not decades—that expiration dates are mostly useless when it comes to food safety. Indeed, an enormous portion of what we deem trash is perfectly fine to eat: The food-waste nonprofit ReFED estimated that 305 million pounds of food would be needlessly discarded this Thanksgiving.
Expiration dates, it seems, are hard to quit. But if there were ever a moment to wean ourselves off the habit of throwing out “expired” but perfectly fine items because of excessive caution, it is now. Food waste has long been a huge climate issue—rotting food’s annual emissions in the U.S. approximate those of 42 coal-fired power plants—and with inflation’s brutal toll on grocery bills, it’s also a problem for your wallet. People throw away roughly $1,300 a year in wasted food, Zach Conrad, an assistant professor of food systems at William and Mary, told me. In this economy? The only things we should be tossing are expiration dates themselves.
Expiration dates, part of a sprawling family of labels that includes the easily confused siblings “best before,” “sell by,” and “best if used by,” have long muddled our conception of what is edible. They do so by insinuating that food has a definitive point of no return, past which it is dead, kaput, expired—and you might be, too, if you dare eat it. If only food were as simple as that.
The problem is that most expiration dates convey only information about an item’s quality. With the exception of infant formula, where they really do refer to expiration, dates generally represent a manufacturer’s best estimate of how long food is optimally fresh and tasty, though what this actually means varies widely, not least because there is no federal oversight over labeling. Milk in Idaho, for example, can be “sold by” grocery stores more than 10 days later than in neighboring Montana, though the interim makes no difference in terms of quality. Some states, such as New York and Tennessee, don’t require labels at all.
Twelve years ago, when Sepp Blatter, then the president of FIFA, stood before a packed auditorium on a snowy afternoon in Zurich, clutching an envelope containing the name of the country chosen to host the 2022 World Cup, it was already an open secret that he presided over a rancid institution. Despite earning multibillion-dollar revenues, the governing body of world soccer had long exploited an arcane quirk of Swiss law to stay registered as a nonprofit association, insulating itself from the country’s anti-corruption laws. Thus, enjoying regulation as scant as that of an Alpine yodelling group, soccer’s ruling élite had enriched themselves with impunity. And, as long as the play on the field remained unscathed, it seemed that the fans didn’t much care how FIFA administrators filled their pockets. The announcement that the World Cup was to be sent to the tiny desert state of Qatar was the moment when that entente ended.
The World Cup is FIFA’s prized jewel, worth billions of dollars in TV rights and sponsorship deals. It is the biggest sporting tournament on earth, watched by around half of the planet’s population, and choosing the host country is arguably the organization’s most sacred duty. In 2010, Qatar was a repressive autocracy with a thin soccer tradition and barely any sports infrastructure. Yet it had beaten established footballing nations, among them the U.S.A., Australia, and Japan, which offered indisputably stronger bids. How had this happened?
Qatar had never even qualified for a World Cup, and it had not a single soccer stadium fit for hosting an event on this scale. Delivering the tournament would require a frenzy of construction by migrant workers, who make up around ninety-five per cent of the country’s workforce and toil under a labor system riddled with abuses. The country’s human-rights record was and remains chilling: same-sex relationships are criminalized, journalists and activists have repeatedly been detained, women live under the tight control of male guardians, and flogging is a legal form of punishment. In addition, a Qatar World Cup could be downright dangerous. FIFA’s own assessors had warned that a tournament in the country would be at high risk of a terror attack, and that the health of players and fans alike could be endangered by the desert heat.
For Qatar’s emir at the time, Sheikh Hamad bin Khalifa al-Thani, winning the rights to host the tournament was a huge coup, advancing his plan to diversify the country’s gas-rich economy, wash away criticisms of its human-rights record, and position Qatar as a serious global power. But, for many international onlookers, it was hard to imagine any legitimate basis on which the country could have been selected as the strongest bidder. Suspicions abounded that the right to host the 2022 World Cup had been bought. It wasn’t until three years later that the receipts surfaced.