A little more than a year ago, Elon Musk began his reign at Twitter with an elaborately staged pun. On Wednesday, October 26, 2022, he posted a tweet with a video that showed him carrying a sink through the lobby of the company’s San Francisco headquarters. “Entering Twitter HQ—let that sink in!” he wrote.
To be a man is to dominate others. This is what I absorbed as a boy: masculinity means mastery, power, control. To be socialized into manhood is to gain a love of hierarchy and a willingness to do whatever is necessary to preserve your own position within it.
For a preview of how AI will collide with creative industries, look to advertising. Amazon, Google, and Meta have all started encouraging advertisers to use AI tools to generate ad copy and imagery, promising better performance, lower costs, and super-specific targeting. Now, brands are paying to advertise with AI-generated virtual influencers — synthetic characters that can offer at least some promotional juice at a fraction of the cost.
The millennial internet first died in 2015. I remember the day exactly because I was one of seven Gawker Media staffers, in addition to many more permalancers, laid off as part of a company-wide restructuring. I received a message on Slack, was asked to join a meeting in a nearby conference room...
Cultural upheavals can be riddles in real time. Trends that might seem obvious in hindsight are poorly understood in the present or not fathomed at all. We live in turbulent times now, at the tail end of a pandemic that killed millions and, for a period, reordered existence as we knew it. It marked, perhaps more than any other crisis in modern memory, a new era, the world of the 2010s wrenched away for good.
You are currently logged on to the largest version of the internet that has ever existed. By clicking and scrolling, you’re one of the 5 billion–plus people contributing to an unfathomable array of networked information—quintillions of bytes produced each day.
IBM is one of the oldest technology companies in the world, with a raft of innovations to its credit, including mainframe computing, computer-programming languages, and AI-powered tools. But ask an ordinary person under the age of 40 what exactly IBM does (or did), and the responses will be vague at best.
Since the term nostalgia first became common currency, no area of life has been associated with it more than popular culture. From Alvin Toffler onward, intellectuals frequently drew on revivals of past styles in music and fashion or used films and television series set in the past as examples to substantiate their claims that nostalgia had become omnipresent.
Here is a very dumb truth: for a decade, the default answer to nearly every problem in mass media communication involved Twitter. Breaking news? Twitter. Live sports commentary? Twitter. Politics? Twitter. A celebrity has behaved badly? Twitter. A celebrity has issued a Notes app apology for bad behavior? Twitter. For a good while, asking Twitter was the most reliable way to find out what a loud noise in New York City was. Was there an earthquake in San Francisco? Twitter. Is some website down? Twitter.
A little more than a year ago, the world seemed to wake up to the promise and dangers of artificial intelligence when OpenAI released ChatGPT, an application that enables users to converse with a computer in a singularly human way.
During a reading project I undertook to better understand the “third wave of democracy” — the remarkable and rapid rise of democracies in Latin America, Asia, Europe and Africa in the 1970s and ’80s — I came to realize that this ascendancy of democratic polities was not the result of some force propelling history toward its natural, final state, as some scholars have argued.
Many academic fields can be said to ‘study morality’. Of these, the philosophical sub-discipline of normative ethics studies morality in what is arguably the least alienated way. Rather than focusing on how people and societies think and talk about morality, normative ethicists try to figure out which things are, simply, morally good or bad, and why.
It’s a grey November day; rain gently pocks the surface of the tidal pools. There is not much to see in this East Sussex nature reserve – a few gulls, a little grebe, a solitary wader on the shore – but already my breathing has slowed to the rhythm of the water lapping the shingle, my shoulders have dropped and I feel imbued with a sense of calm.
At this point, it doesn’t matter how Napoleon does. Critics might love it or critics might hate it. It might crater at the box office, or it might single-handedly resuscitate the theatrical viewing experience.
Twenty-five years ago, the burgeoning science of consciousness studies was full of promise. With cutting-edge neuroimaging tools leading to new research programmes, the neuroscientist Christof Koch was so optimistic, he bet a case of wine that we’d uncover the secrets of consciousness by now.