In early May 2009, 12 men arrived in La Rochelle on the west coast of France, carrying a few pairs of Speedos in their luggage. They had not come to swim but, as they liked to put it, to “fly”. Their sport, which involves diving from cliffs, buildings or bridges, always comes with an atmosphere of nervous excitement, but this time the stakes were higher than ever before. Cliff diving had long been at the obscure end of extreme sports, a pursuit for thrill-seekers with day jobs. Now, the energy drink company Red Bull was launching what it called a “cliff diving world series”, with eight events scheduled across the summer that would attract hundreds of thousands of spectators. Here was a chance at fame and, if not fortune, for the very best of the divers, a modest living.
In traditional pool diving, the highest event is the 10-metre platform, and even Olympic divers can find the height unsettling. In La Rochelle, the organisers had affixed a short platform to the ramparts of the medieval Saint Nicolas Tower, 26 metres above the frigid sea – as high as an eight-storey building. In their three seconds of flight, the divers would reach speeds of more than 50mph. At that speed, a head-first entry is too dangerous. They would need to break the water with their feet, trying to make as little splash as possible. In each of their three competitive jumps, the divers could take off facing forwards, backwards or, most terrifyingly, from a handstand position. As they fell, they would do as many twists and somersaults as they dared in order to impress the judges before hitting the sea. Make a mistake and it was like you’d “run full speed into a wall”, as the Colombian Orlando Duque, the favourite to win the new series, explained at the time.
Amid all the recent commentary about John Cleese resurrecting Fawlty Towers, one fact struck me as even more preposterous than the setting’s proposed relocation to a Caribbean boutique hotel: when the original series aired, Cleese was only 35 years old.
When it comes to screen culture, middle age isn’t what it used to be. People magazine gleefully reported last year that the characters in And Just Like That, the rebooted series of Sex and the City, were the same age (average 55) as the Golden Girls when they made their first outing in the mid-80s. How can that be possible? My recollection of the besequined Florida housemates was that they were teetering off this mortal coil, but then everyone seems old when you are young.
Meanwhile, a popular Twitter account, The Meldrew Point, has the sole purpose of celebrating people who, implausibly, have reached the age the actor Richard Wilson was when he appeared in the first episode of One Foot in the Grave (19,537 days). It’s hard to believe, but these 53-and-a-half-year-olds include J-Lo, Renée Zellweger, Molly Ringwald, Julia Sawalha and Ice Cube.
Back in the day, 40 was the marker for midlife, but now, finding consensus on when middle age begins and what it represents isn’t easy. The Collins English Dictionary gnomically defines it as “the period in your life when you are no longer young but have not yet become old”. The Encyclopaedia Britannica says it is between 40 and 60. Meanwhile, a 2018 YouGov survey reported that most Britons aged between 40 and 64 considered themselves middle-aged – but so did 44% of people aged between 65 and 69.
“There’s no point trying to impose chronological age on what is or is not middle age,” says Prof Les Mayhew, the head of global research at the International Longevity Centre UK. “With people living longer, your 30s are no longer middle age; that has switched to the 40s and 50s.” But even then, he believes putting a number on it is meaningless. “In some cases, in your 50s, you might be thinking about a second or even third career, but for others you might have serious health problems and be unable to work.
“Governments are always trying to impose these labels of administrative convenience for things that are supposed to happen at a certain age – for example, you are allegedly an adult at the age of 18 and you aren’t old enough to receive a state pension until 66. Totally arbitrary. Meanwhile, GPs want you to book in for a ‘midlife MOT’, which is a great jazzy concept to get out of what should be happening – an annual health check-up.”
In his 1976 book, Computer Power and Human Reason: From Judgment to Calculation, the computer scientist Joseph Weizenbaum observed some interesting tendencies in his fellow humans. In one now-famous anecdote, he described his secretary’s early interactions with his program ELIZA, a proto-chatbot he created in 1966. Running a simple script called DOCTOR, which applied a set of rules meant to approximate patient-directed Rogerian psychotherapy, the program made quite an impression:
I was startled to see how quickly and how very deeply people conversing with DOCTOR became emotionally involved with the computer and how unequivocally they anthropomorphized it. Once, my secretary who had watched me work on the program for many months and therefore surely knew it to be merely a computer program, started conversing with it. After only a few interchanges with it, she asked me to leave the room.
Weizenbaum took this first to mean something about people, rather than machines. He then observed:
I knew from long experience that the strong emotional ties many programmers have to their computers are often formed after only short exposures to their machines. What I had not realized is that extremely short exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.
Weizenbaum is today known as a computing visionary and a father of what is most commonly called artificial intelligence. You can play with ELIZA on various websites; a modern user, who has almost certainly interacted with numerous superior ELIZA successors on their computers, phones, and customer-support calls, won’t have trouble tripping up Weizenbaum’s program. His observation about people, however, remains durable. We — and I count myself among them even as I write this — keep getting owned by chatbots.
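For readers curious how so simple a program could be so convincing, ELIZA’s core trick – keyword patterns plus first-to-second-person “reflection” – can be sketched in a few lines. The rules and replies below are invented for illustration; the real 1966 DOCTOR script was far richer:

```python
import re

# Invented, simplified rules in the spirit of Weizenbaum's DOCTOR script.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
]

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones ("my" -> "your")."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(utterance: str) -> str:
    """Return the first matching rule's reply, or a stock fallback."""
    for pattern, template in RULES:
        match = pattern.match(utterance.strip())
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."
```

No memory, no understanding – just surface pattern matching, which is what makes the emotional reactions Weizenbaum describes so striking.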
Read the rest of this article at: New York Magazine
The 2022 Ford Bronco Raptor, among the most expensive offerings in the car manufacturer’s line of tough-guy throwback SUVs, features 418 horsepower, a 10-speed transmission, axles borrowed from off-road-racing vehicles, and 37-inch tires meant for driving off sand dunes at unnecessarily high speeds. But when the automotive site Jalopnik got its hands on a Bronco Raptor for testing, the writer José Rodríguez Jr. singled out something else entirely to praise about the $70,000 SUV: its buttons. The Bronco Raptor features an array of buttons, switches, and knobs controlling everything from its off-road lights to its four-wheel-drive mode to whatever a “sway bar disconnect” is. So much can be done by actually pressing or turning an object that Rodríguez Jr. found the vehicle’s in-dash touch screen—the do-it-all “infotainment system” that has become ubiquitous in new vehicles—nearly vestigial.
Then again, the ability to manipulate a physical thing, a button, has become a premium feature not just in vehicles, but on gadgets of all stripes. Although the cheapest models of the Amazon Kindle line are simple touch-screen slabs, the $250 Oasis features dedicated “Page Forward”/“Back” buttons, while the $370 version of the Kindle Scribe comes with a “premium pen” for note-taking that itself has a button. Or consider the Apple Watch, among the most expensive smartwatches around: All models come with a button and knob on their right side just below the bezel—plus a second button for the more expensive Ultra model. The bargain-bin knockoffs sold on Amazon, by contrast, offer nothing but a screen on a strap. Speaking of which, I recently bought an Amazon-brand smart thermostat with a touch screen that nearly burned my house down. Perhaps a dial, like the one on the primo Google Nest, could have helped.
There’s a reason the Star Trek: The Next Generation crew had touch screens way back in 1987: to remind you that it is a show that takes place in the future, which is where the touch screens are and buttons aren’t. At 33, I’m old enough to remember when my dad got a BlackBerry that had ditched its keyboard for a touch screen. Holding the device, with its translucent rubber cover and blank, reflective display, felt like cradling a new era. But although plenty of high-end gadgets, including the iPhone, are mostly screen, something seems to have changed in recent years. “It’s as if in the tech world it’s a sign of luxury: I have a button or a knob,” Douglas Rushkoff, a CUNY professor and the author of Survival of the Richest, told me. Of all things, buttons have seemed to become something like a status symbol in their own right.
Read the rest of this article at: The Atlantic