The millennial internet first died in 2015. I remember the day exactly because I was one of seven staffers, in addition to many more permalancers, at Gawker Media who were laid off as part of a company-wide restructuring. I received a message on Slack, was asked to join a meeting in a nearby conference room...
IBM is one of the oldest technology companies in the world, with a raft of innovations to its credit, including mainframe computing, computer-programming languages, and AI-powered tools. But ask an ordinary person under the age of 40 what exactly IBM does (or did), and the responses will be vague at best.
It’s a grey November day; rain gently pocks the surface of the tidal pools. There is not much to see in this East Sussex nature reserve – a few gulls, a little grebe, a solitary wader on the shore – but already my breathing has slowed to the rhythm of the water lapping the shingle, my shoulders have dropped and I feel imbued with a sense of calm.
“It turns out that I’m young, and I have a whole life of shit I can do,” she says. “Maybe because my life became so adult very young, I forgot that I was still that young. I settled in a lot of ways: I lived my life as if I were in my 70s. I realized recently that I don’t need to do that.”
Cultural upheavals can be a riddle in real time. Trends that might seem obvious in hindsight are poorly understood in the present or not fathomed at all. We live in turbulent times now, at the tail end of a pandemic that killed millions and, for a period, reordered existence as we knew it. It marked, perhaps more than any other crisis in modern times, a new era, the world of the 2010s wrenched away for good.
Since the term nostalgia first became common currency, no area of life has been associated with it more than popular culture. From Alvin Toffler onward, intellectuals frequently drew on revivals of past styles in music and fashion or used films and television series set in the past as examples to substantiate their claims that nostalgia had become omnipresent.
During a reading project I undertook to better understand the “third wave of democracy” — the remarkable and rapid rise of democracies in Latin America, Asia, Europe and Africa in the 1970s and 80s — I came to realize that this ascendancy of democratic polities was not the result of some force propelling history toward its natural, final state, as some scholars have argued.
Here is a very dumb truth: for a decade, the default answer to nearly every problem in mass media communication involved Twitter. Breaking news? Twitter. Live sports commentary? Twitter. Politics? Twitter. A celebrity has behaved badly? Twitter. A celebrity has issued a Notes app apology for bad behavior? Twitter. For a good while, the most reliable way to find out what a loud noise in New York City was involved asking Twitter. Was there an earthquake in San Francisco? Twitter. Is some website down? Twitter.
Many academic fields can be said to ‘study morality’. Of these, the philosophical sub-discipline of normative ethics studies morality in what is arguably the least alienated way. Rather than focusing on how people and societies think and talk about morality, normative ethicists try to figure out which things are, simply, morally good or bad, and why.