In the News 13.12.17 : Today’s Articles of Interest from Around the Internets



The Resulting Fallacy Is Ruining Your Decisions

Most poker players didn’t go to graduate school for cognitive linguistics. Then again, most poker players aren’t Annie Duke.

After pursuing a psychology Ph.D. on childhood language acquisition, Duke turned her skills to the poker table, where she has taken home over $4 million in lifetime earnings. For a time she was the leading female money winner in World Series of Poker history, and remains in the top five. She’s written two books on poker strategy, and next year will release a book called Thinking in Bets: Making Smarter Decisions When You Don’t Have All the Facts.

Don’t be so hard on yourself when things go badly and don’t be so proud of yourself when they go well.

In it, Duke parlays her experience with cards into general lessons about decision making that are relevant for all of us. If a well-reasoned decision leads to a negative outcome, was it the wrong decision? How do we distinguish between luck and skill? And how do we move beyond our cognitive biases?

Stuart Firestein, a professor of neuroscience at Columbia University, sat down with Duke in October to talk to her about life and poker.

Read the rest of this article at: Nautilus

Jim Simons, the Numbers King


A visit to a scientific-research center usually begins at a star professor’s laboratory that is abuzz with a dozen postdocs collaborating on various experiments. But when I recently toured the Flatiron Institute, which formally opened in September, in lower Manhattan, I was taken straight to a computer room. The only sound came from a susurrating climate-control system. I was surrounded by rows of black metal cages outfitted, from floor to ceiling, with black metal shelves filled with black server nodes: boxes with small, twinkling lights and protruding multicolored wires. Tags dangled from some of the wires, notes that the tech staff had written to themselves. I realized that I’d seen a facility like this only in movies. Nick Carriero, one of the directors of what the institute calls its “scientific-computing core,” walked me around the space. He pointed to a cage with empty shelves. “We’re waiting for the quantum-physics people to start showing up,” he said.

The Flatiron Institute, which is in an eleven-story fin-de-siècle building on the corner of Twenty-first Street and Fifth Avenue, is devoted exclusively to computational science—the development and application of algorithms to analyze enormous caches of scientific data. In recent decades, university researchers have become adept at collecting digital information: trillions of base pairs from sequenced human genomes; light measurements from billions of stars. But, because few of these scientists are professional coders, they have often analyzed their hauls with jury-rigged code that has been farmed out to graduate students. The institute’s aim is to help provide top researchers across the scientific spectrum with bespoke algorithms that can detect even the faintest tune in the digital cacophony.

Read the rest of this article at: The New Yorker

Tuscany Tote in Midnight

Shop the Tuscany Tote in Midnight at Belgrave Crescent

How Email Open Tracking Quietly Took Over the Web

“I just came across this email,” began the message, a long overdue reply. But I knew the sender was lying. He’d opened my email nearly six months ago. On a Mac. In Palo Alto. At night.

I knew this because I was running the email tracking service Streak, which notified me as soon as my message had been opened. It told me where, when, and on what kind of device it was read. With Streak enabled, I felt like an inside trader whenever I glanced at my inbox, privy to details that gave me maybe a little too much information. And I certainly wasn’t alone.

There are some 269 billion emails sent and received daily. That’s roughly 35 emails for every person on the planet, every day. Over 40 percent of those emails are tracked, according to a study published last June by OMC, an “email intelligence” company that also builds anti-tracking tools.

The tech is pretty simple. Tracking clients embed a line of code in the body of an email—usually in a 1×1 pixel image, so tiny it’s invisible, but also in elements like hyperlinks and custom fonts. When a recipient opens the email, the tracking client registers that the pixel has been downloaded, along with where and on what device it happened. Newsletter services, marketers, and advertisers have used the technique for years to collect data about their open rates; major tech companies like Facebook and Twitter followed suit in their ongoing quest to profile and predict our behavior online.
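The mechanism described above can be sketched end to end with nothing but the Python standard library. This is a minimal illustration of the general technique, not how Streak or any particular tracker is actually built; the message id, URL path, and handler names here are invented for the example. A tiny HTTP server serves a 1×1 transparent GIF and logs every request for it; the "email" is an HTML message whose body embeds that GIF's URL, so any mail client that loads remote images reveals the open.

```python
import email.message
import http.server
import threading
import urllib.request

# A 1x1 transparent GIF: the classic invisible "tracking pixel" payload.
PIXEL_GIF = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00"
             b"\x00\x00\x00!\xf9\x04\x01\x00\x00\x00\x00,"
             b"\x00\x00\x00\x00\x01\x00\x01\x00\x00\x02\x02D\x01\x00;")

opens = []  # log of (message_id, user_agent) pairs, one per pixel fetch


class PixelHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # The path encodes which message was opened, e.g. /open/msg-42.gif,
        # so fetching the image is itself the "message opened" signal.
        msg_id = self.path.rsplit("/", 1)[-1].removesuffix(".gif")
        opens.append((msg_id, self.headers.get("User-Agent", "")))
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Content-Length", str(len(PIXEL_GIF)))
        self.end_headers()
        self.wfile.write(PIXEL_GIF)

    def log_message(self, *args):
        pass  # silence the default per-request console logging


def build_tracked_email(msg_id, tracker_base):
    """Build an HTML email whose body embeds the tracking-pixel URL."""
    msg = email.message.EmailMessage()
    msg["Subject"] = "Hello"
    msg.set_content("Hello (plain-text fallback)")
    msg.add_alternative(
        f'<p>Hello</p>'
        f'<img src="{tracker_base}/open/{msg_id}.gif" '
        f'width="1" height="1" alt="">',
        subtype="html",
    )
    return msg


# Start the tracker on an ephemeral local port.
server = http.server.HTTPServer(("127.0.0.1", 0), PixelHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

tracked = build_tracked_email("msg-42", base)

# Simulate a recipient's mail client loading remote images:
urllib.request.urlopen(f"{base}/open/msg-42.gif").read()
server.shutdown()

print(opens[0][0])  # the tracker now knows msg-42 was opened
```

The server also sees the requester's IP address and `User-Agent` header for free, which is all it takes to infer the rough location and device type mentioned in the article; real services simply layer geolocation and device fingerprinting on top of this same request log.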

Read the rest of this article at: Wired


The Ghost and the Princess

There is an “official theory” about the nature of minds that “hails chiefly from Descartes,” wrote Gilbert Ryle, an Oxford philosopher. According to the theory, each person has a mind that is a private, inner world. It has no spatial dimensions and is not subject to laws that govern physical objects, yet it is mysteriously connected to a material body during a person’s earthly life. Ryle dubbed this “the dogma of the Ghost in the Machine.”

People have not always thought of the mind and the body in this way. Homer’s heroes are not depicted as composites that are only partly physical. Their awareness, intelligence, and other mental activities are part of their bodily lives. And although the shades of the dead lurk in the Homeric underworld, these etiolated creatures are little more than fading echoes of the living. Some later Greek philosophers explicitly stated that the soul is made of physical stuff. For Democritus, it was tiny units of solid matter. For the Stoics, it was a mixture of fire and air.

Unlike Homer and the Greek materialists, Plato did believe in something like René Descartes’ ghost in the machine. A person has an inner rational self, according to Plato, which can escape its bodily imprisonment with its powers intact. Yet Ryle was right to single out Descartes even though parts of the “official theory” can be traced to Plato. Descartes sharpened the concepts of mind and matter, crystallizing ideas that took shape in the seventeenth century and giving us the modern form of the so-called mind-body problem.

In his Meditations on First Philosophy, published in 1641, Descartes announced that he was essentially a thinking thing: “Thought…alone is inseparable from me.” There is an outer world, which includes my body, but I could still exist even if it were all destroyed. And what exactly is a thinking thing? Something that is aware. Descartes explained that by “thought” he meant “everything that is within us in such a way that we are immediately aware of it.” This included sensation, will, intellect, and imagination. Thus Descartes made consciousness the distinguishing mark of the mental.

Read the rest of this article at: Lapham’s Quarterly


Freud In The Scanner

An old therapist of mine had a signed photograph of Sigmund Freud hanging on her wall. A gift from a former patient who had employed forgery skills in a side business of dubious legality, it was the iconic Freud photo: full suit, blank scowl, half-smoked cigar. Once, mid-session, I asked my therapist what she thought of Freud’s theories. ‘I don’t think much of them,’ she replied.

Her dismissiveness wasn’t a surprise. By any measure, Freud was one of the most influential thinkers of the 20th century. Following his death in 1939, the British author W H Auden was able to declare in his poem ‘In Memory of Sigmund Freud’ (1939) that Freud had represented ‘a whole climate of opinion’, and the subsequent two decades represented the heyday of psychoanalysis. No longer. Outside academia, for those who give it any thought, psychoanalysis is generally regarded as having followed phrenology and mesmerism into the dustbin of psychological enquiry. Boys lusting after their mothers; girls desiring a penis – such are the luridly risible impressions that persist in the popular imagination.

What went wrong? In 1996, Tom Wolfe wrote that ‘the demise of Freudianism can be summed up in a single word: lithium’. The American author described how in the early 1950s, after years of psychoanalytic ineffectiveness, rapid physical relief for sufferers of bipolar disorder arrived in the form of a pill. Wolfe’s example is a microcosm of a wider state of affairs. The waning of psychoanalysis corresponds precisely to the rise of modern neuroscience, whose physicalist approach now drives psychiatry. Today, almost anyone could have a go at describing serotonin, or dopamine, or Prozac. Few of those same people could define the primal scene, or the super-ego. As the American author Siri Hustvedt puts it in The Shaking Woman, or a History of My Nerves (2010), Freud is now seen by many if not most as ‘a mystic, a man whose ideas bear no relation to physical realities, a kind of monster of mirage who derailed modernity by feeding all kinds of nonsense to a gullible public until his thought was finally shattered by a new scientific psychiatry founded on the wonders of pharmacology’.

But in recent decades, this picture of philosophical antagonism has been complicated. Around 20 years ago, there emerged a new field, bearing the predictably cumbersome name of neuropsychoanalysis. Adherents to this amorphous research programme – spearheaded by the South African neuropsychologist and psychoanalyst Mark Solms of the University of Cape Town – are keen to rehabilitate Freud’s reputation for the age of the brain. They remind us that the young Freud started his career in neurology, and spent two decades in the hard sciences. They point to Freud’s attempts during the 1890s to ‘furnish a psychology that shall be a natural science’ and stress his lifelong belief that one day his theories would be augmented and refined by the empirical study of our grey matter. Neuropsychoanalysis published the inaugural issue of its academic journal in 1999, and held its first conference a year later. Since then, an increasing number of psychoanalysts have begun to investigate what neuroscience can offer their theories and practice, and conciliatory positions have been adopted by some of the most influential brain scientists of the era: Antonio Damasio, Joseph LeDoux, Jaak Panksepp, V S Ramachandran and, above all, Eric Kandel.

Could it be that Wolfe was wrong when he declared that the era of lithium means the end for Freud? Might the analyst’s couch and the brain scanner have something to offer one another?

Read the rest of this article at: aeon

P.S. previous articles & more by P.F.M.