A little more than a year ago, Elon Musk began his reign at Twitter with an elaborately staged pun. On Wednesday, October 26, 2022, he posted a tweet with a video that showed him carrying a sink through the lobby of the company’s San Francisco headquarters. “Entering Twitter HQ—let that sink in!” he wrote.
For a preview of how AI will collide with creative industries, look to advertising. Amazon, Google, and Meta have all started encouraging advertisers to use AI tools to generate ad copy and imagery, promising higher performance, lower costs, and super-specific targeting. Now, brands are paying to advertise with AI-generated virtual influencers — synthetic characters that can offer at least some promotional juice at a fraction of the cost.
The millennial internet first died in 2015. I remember the day exactly because I was one of seven staffers, along with many more permalancers, laid off from Gawker Media as part of a company-wide restructuring. I received a message on Slack, was asked to join a meeting in a nearby conference room...
Cultural upheavals can be a riddle in real time. Trends that might seem obvious in hindsight are poorly understood in the present or not fathomed at all. We live in turbulent times now, at the tail end of a pandemic that killed millions and, for a period, reordered existence as we knew it. It marked, perhaps more than any other crisis in modern times, the start of a new era, the world of the 2010s wrenched away for good.
IBM is one of the oldest technology companies in the world, with a raft of innovations to its credit, including mainframe computing, computer-programming languages, and AI-powered tools. But ask an ordinary person under the age of 40 what exactly IBM does (or did), and the response will be vague at best.
Since the term nostalgia first became common currency, no area of life has been associated with it more than popular culture. From Alvin Toffler onward, intellectuals have frequently drawn on revivals of past styles in music and fashion, or used films and television series set in the past, as examples to substantiate their claims that nostalgia had become omnipresent.
Here is a very dumb truth: for a decade, the default answer to nearly every problem in mass media communication involved Twitter. Breaking news? Twitter. Live sports commentary? Twitter. Politics? Twitter. A celebrity has behaved badly? Twitter. A celebrity has issued a Notes app apology for bad behavior? Twitter. For a good while, if you wanted to find out what that loud noise in New York City was, the most reliable method was to ask Twitter. Was there an earthquake in San Francisco? Twitter. Is some website down? Twitter.
During a reading project I undertook to better understand the “third wave of democracy” — the remarkable and rapid rise of democracies in Latin America, Asia, Europe and Africa in the 1970s and ’80s — I came to realize that this ascendancy of democratic polities was not the result of some force propelling history toward its natural, final state, as some scholars have argued.
Many academic fields can be said to ‘study morality’. Of these, the philosophical sub-discipline of normative ethics studies morality in what is arguably the least alienated way. Rather than focusing on how people and societies think and talk about morality, normative ethicists try to figure out which things are, simply, morally good or bad, and why.
It’s a grey November day; rain gently pocks the surface of the tidal pools. There is not much to see in this East Sussex nature reserve – a few gulls, a little grebe, a solitary wader on the shore – but already my breathing has slowed to the rhythm of the water lapping the shingle, my shoulders have dropped and I feel imbued with a sense of calm.