The millennial internet first died in 2015. I remember the day exactly because I was one of seven staffers, along with many more permalancers, laid off at Gawker Media as part of a company-wide restructuring. I received a message on Slack, was asked to join a meeting in a nearby conference room...
IBM is one of the oldest technology companies in the world, with a raft of innovations to its credit, including mainframe computing, computer-programming languages, and AI-powered tools. But ask an ordinary person under the age of 40 what exactly IBM does (or did), and the responses will be vague at best.
A little more than a year ago, the world seemed to wake up to the promise and dangers of artificial intelligence when OpenAI released ChatGPT, an application that enables users to converse with a computer in a singularly human way.
It’s a grey November day; rain gently pocks the surface of the tidal pools. There is not much to see in this East Sussex nature reserve – a few gulls, a little grebe, a solitary wader on the shore – but already my breathing has slowed to the rhythm of the water lapping the shingle, my shoulders have dropped and I feel imbued with a sense of calm.
“It turns out that I’m young, and I have a whole life of shit I can do,” she says. “Maybe because my life became so adult very young, I forgot that I was still that young. I settled in a lot of ways: I lived my life as if I were in my 70s. I realized recently that I don’t need to do that.”
Cultural upheavals can be a riddle in real time. Trends that might seem obvious in hindsight are poorly understood in the present or not fathomed at all. We live in turbulent times now, at the tail end of a pandemic that killed millions and, for a period, reordered existence as we knew it. It marked, perhaps more than any other crisis in modern times, a new era, the world of the 2010s wrenched away for good.
Since the term nostalgia first became common currency, no area of life has been associated with it more than popular culture. From Alvin Toffler onward, intellectuals frequently drew on revivals of past styles in music and fashion or used films and television series set in the past as examples to substantiate their claims that nostalgia had become omnipresent.
During a reading project I undertook to better understand the “third wave of democracy” — the remarkable and rapid rise of democracies in Latin America, Asia, Europe and Africa in the 1970s and ’80s — I came to realize that this ascendancy of democratic polities was not the result of some force propelling history toward its natural, final state, as some scholars have argued.
At this point, it doesn’t matter how Napoleon does. Critics might love it or critics might hate it. It might crater at the box office, or it might single-handedly resuscitate the theatrical viewing experience.
For about five minutes a few months ago, people seemed to genuinely believe that our culture was entering the age of “deinfluencing.” “Step aside, influencers,” wrote CNN.
You are currently logged on to the largest version of the internet that has ever existed. By clicking and scrolling, you’re one of the 5 billion–plus people contributing to an unfathomable array of networked information—quintillions of bytes produced each day.
Here is a very dumb truth: for a decade, the default answer to nearly every problem in mass media communication involved Twitter. Breaking news? Twitter. Live sports commentary? Twitter. Politics? Twitter. A celebrity has behaved badly? Twitter. A celebrity has issued a Notes app apology for bad behavior? Twitter. For a good while, the most reliable way to find out what a loud noise in New York City was involved asking Twitter. Was there an earthquake in San Francisco? Twitter. Is some website down? Twitter.
Many academic fields can be said to ‘study morality’. Of these, the philosophical sub-discipline of normative ethics studies morality in what is arguably the least alienated way. Rather than focusing on how people and societies think and talk about morality, normative ethicists try to figure out which things are, simply, morally good or bad, and why.
Twenty-five years ago, the burgeoning science of consciousness studies was rife with promise. With cutting-edge neuroimaging tools leading to new research programmes, the neuroscientist Christof Koch was so optimistic that he bet a case of wine that we’d have uncovered the secrets of consciousness by now.
The myth of The Writer looms large in our cultural consciousness. When most readers picture an author, they imagine an astigmatic, scholarly type who wakes at the crack of dawn in a monastic, book-filled, shockingly affordable house surrounded by nature.