In the News 15.03.17 : Today’s Articles of Interest from Around the Internets
Wednesday 15th March, 2017
‘We smile, we say hello to everybody, we enjoy ourselves.’
Against the backdrop of all the perceived hysteria, the luxury Italian house of Gucci appears to be stubbornly – perhaps even gleefully – bucking the system, thumbing its nose at the general mood of uncertainty. Presenting itself as unshackled and reinvigorated, Gucci’s new era continues to steadily unfold with healthy sales and an aura pitched somewhere between carefree and supremely confident. Barely 15 months into a monumental reinvention, the overriding sensation within the company appears to be one of unity and fearlessness.
It wasn’t always so, of course. The events leading up to today’s newfound optimism can be traced back to long before December 2014, when faltering sales and lukewarm reviews finally left the couple Patrizio di Marco and Frida Giannini – Gucci CEO and creative director respectively – helpless in the face of their inevitable and unceremonious ousting. The brand had simply stagnated for too long, never truly able to sustain the phenomenal period of growth and gravitas that Tom Ford and CEO Domenico de Sole commanded between 1994 and 2004. When François-Henri Pinault, the chairman of Gucci’s owner Kering, replaced di Marco with Marco Bizzarri – a high-achieving CEO previously at Stella McCartney and Bottega Veneta – it felt as if he were drawing a line in the sand. Announcing a new era. A revolution, even. And when Bizzarri announced that Gucci’s new creative director would be Alessandro Michele, speculation surrounding the brand reached new heights, even as Michele’s name was met with a universal, ‘Who?’
Read the rest of this article at System
Are Liberals On The Wrong Side Of History?
Of all the prejudices of pundits, presentism is the strongest. It is the assumption that what is happening now is going to keep on happening, without anything happening to stop it. If the West has broken down the Berlin Wall and McDonald’s opens in St. Petersburg, then history is over and Thomas Friedman is content. If, by a margin so small that in a voice vote you would have no idea who won, Brexit happens; or if, by a trick of an antique electoral system designed to give country people more power than city people, a Donald Trump is elected, then pluralist constitutional democracy is finished. The liberal millennium was upon us as the year 2000 dawned; fifteen years later, the autocratic apocalypse is at hand. Thomas Friedman is concerned.
You would think that people who think for a living would pause and reflect that whatever is happening usually does stop happening, and something else happens in its place; a baby who is crying now will stop crying sooner or later. Exhaustion, or a change of mood, or a passing sound, or a bright light – something – always happens next. But for the parents the wait can feel the same as forever, and for many pundits, too, now is the only time worth knowing, for now is when the baby is crying and now is when they’re selling their books.
Read the rest of this article at The New Yorker
Escape To Another World
David Mullings was always a self-starter. Born in Jamaica, he moved to Florida to go to university, and founded his first company – a digital media firm that helped Caribbean content find a wider audience – before finishing business school at the University of Miami. In 2011 he opened a private-equity firm with his brother. In 2013 the two made their first big deal, acquiring an 80% stake in a Tampa-based producer of mobile apps. A year later it blew up in their faces, sinking their firm and their hopes.
Mullings struggled to recover from the blow. The odd consulting gig provided a distraction and some income. Yet depression set in as he found himself asking whether he had anything useful to contribute to the wider world.
Then Destiny called.
As for millions of people of a certain age, the Nintendo Entertainment System had occupied a crucial place in Mullings’s childhood. It introduced him to video gaming, gave him a taste for it, made him aware of the fact that he was good at it: a “born gamer”, in his words. Yet the pixelated worlds of the Mario brothers, for all their delights, were nothing like the experiences available to gamers today.
Read the rest of this article at 1843 Magazine
Materialism holds the high ground these days in debates over that most ultimate of scientific questions: the nature of consciousness. When tackling the problem of mind and brain, many prominent researchers advocate for a universe fully reducible to matter. ‘Of course you are nothing but the activity of your neurons,’ they proclaim. That position seems reasonable and sober in light of neuroscience’s advances, with brilliant images of brains lighting up like Christmas trees while test subjects eat apples, watch movies or dream. And aren’t all the underlying physical laws already known?
From this seemingly hard-nosed vantage, the problem of consciousness seems to be just one of wiring, as the American physicist Michio Kaku argued in The Future of the Mind (2014). In the very public version of the debate over consciousness, those who advocate that understanding the mind might require something other than a ‘nothing but matter’ position are often painted as victims of wishful thinking, imprecise reasoning or, worst of all, an adherence to mystical ‘woo’.
Read the rest of this article at aeon
Falling in Love with Words: The Secret Life of a Lexicographer
In an excerpt from her new book, Merriam-Webster lexicographer Kory Stamper describes how she fell in love with words and offers a peek into the complex process of making dictionaries.
We are in an uncomfortably small conference room. It is a cool June day, and though I am sitting stock-still on a corporate chair in heavy air-conditioning, I am sweating heavily through my dress. This is what I do in job interviews.
A month earlier, I had applied for a position at Merriam-Webster, America’s oldest dictionary company. The posting was for an editorial assistant, a bottom-of-the-barrel position, but I lit up like a penny arcade when I saw that the primary duty would be to write and edit English dictionaries. I cobbled together a résumé; I was invited to interview. I found the best interview outfit I could and applied extra antiperspirant (to no avail).
Steve Perrault, the man who sat opposite me, was (and still is) the director of defining at Merriam-Webster and the person I hoped would be my boss. He was very tall and very quiet, a sloucher like me, and seemed almost as shyly awkward as I was, even while he gave me a tour of the modest, nearly silent editorial floor. Apparently, neither of us enjoyed job interviews. I, however, was the only one perspiring lavishly.
“So tell me,” he ventured, “why you are interested in lexicography.”
I took a deep breath and clamped my jaw shut so I did not start blabbing. This was a complicated answer.
* * *
I grew up the eldest, book-loving child of a blue-collar family that was not particularly literary. According to the hagiography, I started reading at three, rattling off the names of road signs on car trips and pulling salad-dressing bottles out of the fridge to roll their tangy names around on my tongue: Blue Chee-see, Eye-tal-eye-un, Thouse-and Eyes-land. My parents cooed over my precociousness but thought little of it.
I chawed my way through board books, hoarded catalogs, decimated the two monthly magazines we subscribed to (National Geographic and Reader’s Digest) by reading them over and over until they fell into tatters. One day my father came home from his job at the local power plant, exhausted, and dropped down onto the couch next to me. He stretched, groaning, and plopped his hard hat on my head. “Whatcha reading, kiddo?” I held the book up for him to see: Taber’s Cyclopedic Medical Dictionary, a book from my mother’s nursing days of yore. “I’m reading about scleroderma,” I told him. “It’s a disease that affects skin.” I was about nine years old.
Read the rest of this article at Longreads