
News 02.11.22 : Today’s Articles of Interest from Around the Internets


Whether on British television news panels or late-night TV in America, it is hard to get away from talk about artificial intelligence (AI). Even the president of the United States has weighed in on AI, introducing an “AI Bill of Rights.”

The popular thing is to amplify the current media narrative — that AI will render millions jobless. It will eventually become more powerful than humans and destroy its creators. Just Google “AI doomsday” and then run and hide under the covers.

I will be the first to admit that all technology has a dark side. Email came with spam and scammers; mobile phones came with robocalls and endless tracking by companies like Facebook. Artificial intelligence, too, will be used for nefarious purposes.

But there is a more pragmatic way to think about all this. We should think of AI as augmented intelligence, not artificial intelligence, and open our eyes to the positive impacts this technology is having. We live in a complex, data-saturated, increasingly digital world. Our institutions and information flow at the speed of the network — and we need help to make sense of it all. As a species, we are forced to move and react faster. If anything, the demands on our time have multiplied.

Read the rest of this article at: The Spectator


On October 23rd, the Russian defense minister, Sergei Shoigu, made phone calls to the defense ministers of four NATO member countries to tell each of them that Ukraine was planning to detonate a “dirty bomb”—that is, a conventional weapon spiked with radioactive material—on its own territory. Three of the four recipients of this information—France, the United Kingdom, and the United States—responded that day with an unusual joint statement denouncing the claim. (Shoigu’s fourth interlocutor was Turkey.) Russian leaders and propagandists, who covered the phone calls in some detail, don’t necessarily think that anyone, anywhere, will believe that Ukraine would use a radioactive weapon against its own people just so it can blame Russia for the attack. Shoigu’s phone calls were preëmptive, another example of Russia creating information noise, sowing doubt, asserting the fundamental unknowability of the facts of war. On Thursday, Vladimir Putin said that he had personally directed Shoigu to make the calls, and this claim underscored their true meaning: Russia is preparing for a nuclear, or nuclearish, strike in Ukraine.

This was not the first, second, or third time that Moscow had sent this message. Putin has been rattling the nuclear sabre since the start of the full-scale invasion in February, and, indeed, for many years before. In 2014, months after annexing Crimea and at the height of engineering a pro-Russian insurgency war in eastern Ukraine, Russia changed its military doctrine to open up the possibility of a nuclear first strike in response to a threat from NATO. In 2018, Putin first proffered his promise—since reprised, and replayed many times by Russian television—that, in a world-scale nuclear event, Russians will go to heaven while Americans “just croak.” The threat of a nuclear strike has become more apparent—more frequently repeated on Russian propaganda channels—since the Ukrainian counter-offensive began, at the end of the summer.

The more the Kremlin has signalled its readiness to drop a nuclear bomb, the more the rest of the world has sought a reason to believe that it will not. Earlier this month, the U.K.’s defense secretary, Ben Wallace, reassured the audience at a Conservative Party conference that, although Putin’s actions could be “totally irrational,” he wouldn’t use nuclear weapons because he couldn’t risk losing the support of China and India—both of which, Wallace asserted, had put Putin on notice. President Biden has offered a different perspective: Putin, he said, is a “rational actor who has miscalculated significantly” in launching his offensive in Ukraine, and this was the reason he wouldn’t use nuclear arms. (On another occasion, Biden said that a Russian nuclear strike would unleash Armageddon.) Jake Sullivan, the U.S. national-security adviser, has consistently said that the White House takes Putin’s threats seriously and would respond decisively in the case of a nuclear attack. Still, in recent weeks, as Moscow has ramped up its warnings, it has become conventional wisdom, or perhaps just good form, to say that Putin isn’t really going to use nukes. “Russian President Vladimir Putin will probably not drop an atomic bomb on Ukraine,” a September Washington Post editorial began, axiomatically. Bloomberg’s European-affairs columnist Andreas Kluth started a recent column by instructing the reader to “put aside, if you can, the growing anxiety about Russian President Vladimir Putin going nuclear in his barbaric war in Ukraine” because, Kluth asserted, the risk “remains small.”

Read the rest of this article at: The New Yorker

Gamification — the use of ideas from game design for purposes beyond entertainment — is everywhere. It’s in our smartwatches, cajoling us to walk an extra thousand steps for a digital trophy. It’s in our classrooms, where teachers use apps to reward and punish children with points. And it’s in our jobs, turning the work of Uber drivers and call center staff into quests and missions, where success comes with an achievement and a $50 bonus, and failure — well, you can imagine.

Many choose to gamify parts of their lives to make them a little more fun, like learning a new language with Duolingo or going for a run with my own Zombies, Run! app. But the gamification we’re most likely to encounter in our lives is something we have no control over — in our increasingly surveilled and gamified workplaces, for instance, or through the creeping advance of manipulative gamification in financial, insurance, travel and health services.

In my new book, “You’ve Been Played,” I argue that governments must regulate gamification so that it respects workers’ privacy and dignity. Regulators must also ensure that gamified finance apps and video games don’t manipulate users into losing more money than they can afford. Crucially, I believe any gamification intended for schools and colleges must be researched and debated openly before deployment.

But I also believe gamification can strengthen democracies, by designing democratic participation to be accessible and to build consensus. The same game design ideas that have made video games the 21st century’s dominant form of entertainment — adaptive difficulty, responsive interfaces, progress indicators and multiplayer systems that encourage co-operative behaviour — can be harnessed in the service of democracies and civil society.

Read the rest of this article at: Noema


Last weekend, a friend forwarded me a video. I clicked on the link nonchalantly, expecting a joyful puppy or perhaps a triumphant head of lettuce. But as the clip played, I sat up straighter, a coldness creeping over my heart. It starts innocently enough, with a woman browsing in a store, but something catches her eye, and the chilling sight is revealed: a wall of ’90s Halloween costumes.

For $5, you can wrap a velvet choker around your neck, adorn your hair with butterfly clips, and clasp a fake Nokia 3310 to your ear. Ten dollars gets you a black slip dress, the kind I remember proudly pairing with purple Doc Martens for a Red Hot Chili Peppers concert back in ’96. The bomber jacket was a particular twist of the knife — at age 13, owning a bomber was my raison d’être, and I harassed my mum into buying me a particularly hideous sky-blue version that I wore diligently for the rest of the summer, rain or shine. (It really didn’t matter: puffed cloth proved equally ineffective against wet or cold.)

If I’d felt old when my local club changed ’80s Night to ’90s Night — presumably deciding those nostalgic for eighties classics should now be in bed by nine with a cup of cocoa — the Halloween wall made me feel like I’d picked the wrong chalice. Not only had my teenage outfits morphed into vintage costumes, but they’d done so just as I was swaddling myself in a cocoon of nostalgia, blissfully unaware of just how historical it was.

I had come across Buffy the Vampire Slayer on Disney+, and dipped in for a quick reunion. I was now on season four and in deep, reveling in all those knitted jumpers and chunky, chunky shoes. My teenage self would have been agog — whole seasons on tap was surely witchcraft only Willow could pull off. When Buffy first wielded a stake and a pun, back in 1997, I was lucky to even see a complete episode. Buffy aired on a Friday night. Since I was spending that timeslot getting rejected from bars, I would set up a video cassette to record it (you know the sort, you can buy a replica from your local Halloween store). Despite repeatedly instructing my parents to press record at eight, I was lucky to see half a show, with mum inevitably only remembering her mission by eight-thirty.

Read the rest of this article at: Longreads

Follow us on Instagram @thisisglamorous


Let me explain something about me: When I was 12, I started having panic attacks, brought on by fears that I couldn’t shake, even though I knew they were irrational. I was terrified, for example, that I’d become depressed—but I’d never been depressed before, and didn’t feel depressed. My junior high school devoted a series of assemblies to warning us budding teenagers that we were entering the most dangerous years of our lives, now ripe targets for cutting, suicide, eating disorders, overdoses, AIDS, and fatal car accidents. I would spend hours, even days, worrying that one of these things might be coming for me. My mind seemed to spin out of control—I couldn’t stop fixating, I couldn’t calm down, and I couldn’t understand what was happening.

Finding language to describe suffering of any kind is hard, but eventually, fearing I was going irreversibly insane, I tried—first for my mother, then for a doctor. Soon I was told there was a name for my particular distress: obsessive-compulsive disorder. Receiving this news at 13 was both relieving and shattering. (And surprising. There had been no assembly suggesting we watch out for anxiety disorders.) With the diagnosis came explanations and context for what I had not been able to interpret, as well as a body of scientific knowledge about treatment. Still, OCD can be an upsetting diagnosis, partly because according to current psychiatric understanding, it’s a chronic illness. You don’t typically get cured. You “learn to manage it” and, like most chronic conditions, it ebbs and flows based on a variety of factors. I felt horror at being indelibly marked and feared that I’d never get back to “my old self.” Who was I now?

Rachel Aviv begins her nonfiction debut, Strangers to Ourselves: Unsettled Minds and the Stories That Make Us, with the story of her own childhood introduction to psychiatry—briefer and more unusual than mine. At 6, she tells us, she abruptly stopped eating. She refused to say the names of food, “because pronouncing the words felt like the equivalent of eating,” and refused to say the number eight, because it sounded like ate. That year, she was admitted to the eating-disorders unit at Children’s Hospital of Michigan, thought to be the youngest person on record to be diagnosed with anorexia. Her doctors were perplexed, speculating that her anorexia might be related to the stress of her parents’ divorce. Aviv didn’t understand her diagnosis, its implications, or its cultural associations—she couldn’t even spell it. “I had a diseas called anexexia,” she wrote in her diary two years later.

In the ward, she was guided by older girls, from whom she learned the conventions of the disorder. “I hadn’t known that exercise had anything to do with body weight, but I began doing jumping jacks with Carrie and Hava at night.” She refused to sit down after the girls taught her the phrase couch potato. But eventually Aviv ate because she’d been forbidden to see or speak to her parents unless she did. “My goals realigned.” She was discharged from the hospital after six weeks, and assimilated back into her old life, eventually consenting to sit down with the rest of her class at school. The illness lifted as mysteriously as it had descended. From here, she suggests, she went back to a normal, healthy life.

“To use the terms of the historian Joan Jacobs Brumberg, who has written eloquently about the genesis of eating disorders, I was ‘recruited’ for anorexia, but the illness never became a ‘career,’ ” Aviv writes. “It didn’t provide the language with which I came to understand myself.” She proposes that she recovered because she was too young at the time of her illness to decipher or internalize the cultural and psychiatric narratives that attend it. She had no “insight,” a term used by psychiatrists to describe the quality of being self-aware and rational regarding one’s illness. Typically, insight is crucial to a good prognosis: If you have insight, you have what doctors would call the “correct attitude to a morbid change in oneself,” as a 1934 paper in The British Journal of Medical Psychology put it. But Aviv, pointing out that a correct attitude to a morbid change “depends on culture, race, ethnicity, and faith,” supplies a different, more acerbic definition of insight: “the degree to which a patient agrees with his or her doctor’s interpretation.”

Read the rest of this article at: The Atlantic

P.S. previous articles & more by P.F.M.