News 29.06.22 : Today’s Articles of Interest from Around the Internets

Images: @cove.house, @nastyanastya, @tessaneustadt

Strange as it sounds, scientists still do not know the answers to some of the most basic questions about how life on Earth evolved. Take eyes, for instance. Where do they come from, exactly? The usual explanation of how we got these stupendously complex organs rests upon the theory of natural selection.

You may recall the gist from school biology lessons. If a creature with poor eyesight happens to produce offspring with slightly better eyesight, thanks to random mutations, then that tiny bit more vision gives them more chance of survival. The longer they survive, the more chance they have to reproduce and pass on the genes that equipped them with slightly better eyesight. Some of their offspring might, in turn, have better eyesight than their parents, making it likelier that they, too, will reproduce. And so on. Generation by generation, over unfathomably long periods of time, tiny advantages add up. Eventually, after a few hundred million years, you have creatures who can see as well as humans, or cats, or owls.

This is the basic story of evolution, as recounted in countless textbooks and pop-science bestsellers. The problem, according to a growing number of scientists, is that it is absurdly crude and misleading.

For one thing, it starts midway through the story, taking for granted the existence of light-sensitive cells, lenses and irises, without explaining where they came from in the first place. Nor does it adequately explain how such delicate and easily disrupted components meshed together to form a single organ. And it isn’t just eyes that the traditional theory struggles with. “The first eye, the first wing, the first placenta. How they emerge. Explaining these is the foundational motivation of evolutionary biology,” says Armin Moczek, a biologist at Indiana University. “And yet, we still do not have a good answer. This classic idea of gradual change, one happy accident at a time, has so far fallen flat.”

There are certain core evolutionary principles that no scientist seriously questions. Everyone agrees that natural selection plays a role, as does mutation and random chance. But how exactly these processes interact – and whether other forces might also be at work – has become the subject of bitter dispute. “If we cannot explain things with the tools we have right now,” the Yale University biologist Günter Wagner told me, “we must find new ways of explaining.”

In 2014, eight scientists took up this challenge, publishing an article in the leading journal Nature that asked “Does evolutionary theory need a rethink?” Their answer was: “Yes, urgently.” Each of the authors came from cutting-edge scientific subfields, from the study of the way organisms alter their environment in order to reduce the normal pressure of natural selection – think of beavers building dams – to new research showing that chemical modifications added to DNA during our lifetimes can be passed on to our offspring. The authors called for a new understanding of evolution that could make room for such discoveries. The name they gave this new framework was rather bland – the Extended Evolutionary Synthesis (EES) – but their proposals were, to many fellow scientists, incendiary.

Read the rest of this article at: The Guardian

Old age is not exactly a time of life that most of us welcome, although globally speaking it is a privilege to reach it. In Western societies, the shocked realisation that we are growing old often fills us with alarm and even terror. As Simone de Beauvoir writes in her magisterial study of the topic, La vieillesse (1970) – translated in the UK as Old Age, and in the US as The Coming of Age (1972) – old age arouses a visceral aversion, often a ‘biological repugnance’. Many attempt to push it as far away as possible, denying that it will ever happen, even though we know it already dwells within us.

In fleeing from our own old age, we also seek to distance ourselves from its harbingers – from those who are already old: they are ‘the Other’. They are (with some exceptions) viewed as a ‘foreign species’, and as ‘outside humanity’. Excluded from the so-called normal life of society, most are condemned to conditions where their sadness, as Beauvoir puts it, ‘merges with their consuming boredom, with their bitter and humiliating sense of uselessness, and with their loneliness in the midst of a world that has nothing but indifference for them’. Beauvoir’s work sets out to show how old people are viewed and treated as the Other ‘from without’ and also – by drawing on memoirs, letters and other sources – to present their experiences ‘from within’. Her aim is to ‘shatter’ what she calls the ‘conspiracy of silence’ surrounding the old for, she insists, if their voices were heard, we would have to acknowledge that these were ‘human voices’ (emphasis added).

On Beauvoir’s view, most societies prefer to shut their eyes rather than see ‘abuses, scandals, and tragedies’ – they opt for the ease of accepting what is, instead of the self-scrutiny and struggle that are required to envision and enact what life could be. Speaking of her own society, she claims that it cared no more about orphans, young offenders or the disabled than it did about the old. However, what she finds astonishing about the latter case is that ‘every single member of the community must know that his future is in question; and almost all of them have close personal relationships with some old people’. So what explains this failure to face our future, to see the humanity in all human life?

Beauvoir’s answer is that famous existentialist phenomenon: bad faith. She believes that bad faith is a persistent human temptation, but that it does not take the same shape in all lives, or at all stages of life. In general terms, bad faith is the over-identification with one of two poles of human existence: on the one hand, there are all the contingent and unchosen facts about you, such as when and where you were born, your parents, your country, your material conditions, the shape, colour and ability of your body. They also, and importantly, include your dependency on other human beings and their dependence on you. This pole Beauvoir calls ‘facticity’.

The other pole, ‘freedom’, concerns your ability to act as you will, within the constraints of your situation, to take up and transform these facts. If you are a waitress with no corporate experience and you apply to be a CEO, this is likely to involve facticity-denying bad faith. If you are a waitress and conclude that you will never be anything but a waitress, this is likely to involve freedom-denying bad faith: you are foreclosing your possibilities by concluding that only what already is could ever be. So how does this temptation affect our attitudes toward the old? Beauvoir thinks that the not-yet-old are guilty of facticity-denying bad faith: their aversion to the already-old expresses an attempt to flee from their own ageing and mortality. This flight may offer them temporary refuge from unwanted futures but, for the old people they flee, it creates a hostile and lonely world.

Read the rest of this article at: Aeon

At the end of the Korean war in 1953, 21 American former prisoners of war chose to settle in the People’s Republic of China rather than return to the Land of the Free. The US government reacted with astonished horror at the way that these unfortunate dupes had been “brainwashed” – a term adapted by western journalists just three years earlier from the original Chinese – by their jailers. It had entirely missed the point that each man had arrived at a considered, individual decision about why his life might be nicer under Mao than Eisenhower.

Take Clarence Adams, an African-American soldier who had experienced vicious racism growing up in Tennessee and was in no hurry to return for an encore. Adams chose to settle in Beijing instead, worked as a publisher, married a university professor and enjoyed being called “comrade”. Only after 12 years did he start to feel that the time was right to return with his new family to the country of his birth. Far from welcoming him home as a man who had gone looking for opportunities in the approved American way, the FBI regarded him as somewhere between a psychiatric patient and a political traitor. Yet if anyone had shown evidence of being able to think for himself it was surely Adams.

In this frankly brilliant book, Daniel Pick sets out to explore why the idea of mind control became such a contested topic during the second half of the 20th century. Pick’s skills as a historian and a practising psychoanalyst allow him to move beyond a methodology in which human subjects are either reduced to data points or inflated into grand actors. In other words, he shows us Adams as neither a powerless pawn nor a figure of heroic resistance, but rather someone who muddled through the bewildering world as best he could, changing his mind certainly but never giving it away.

Read the rest of this article at: The Guardian

I was in my second year of university when I discovered the joy of making meals from scratch. My desire to cook was sparked, partly, by the sheer excitement of having access to a kitchen. After a year of mostly eating the same few vegetarian meals from a dorm cafeteria, the possibilities now felt endless. Between classes, I rushed home and delighted in the ritual of trying a new technique or recipe: the perfect temperature at which to roast cauliflower, how to fry tofu so the edges were just crispy enough, how to use up all those cans of chickpeas I bought on sale.

My curiosity was further bolstered by the fact that I was coming into young adulthood around the same time that YouTube cooking channels were becoming part of pop culture; this was 2017, when Bon Appétit had just launched its now-famous Gourmet Makes series, and shows like Binging with Babish were garnering millions of views. As a terminally online student, I spent far too much of my time watching their videos and experimenting with their recipes.

One afternoon, I was making a stir-fry in the small kitchen of a house where I lived. As vegetables sputtered in hot oil on the stove, I remember one of my roommates coming downstairs and poking her head into the kitchen.

“What smells so good?” She smiled.

“Probably the garlic,” I answered, stirring the bright contents of the pan.

I recall her eyeing the mess I’d made on the counter—discarded pieces of carrot and onion, spices spilled from plastic bags—and stopping when she saw the little jar. A spoon lay beside it, still slick with preservative oil.

“You use the pre-minced stuff?”

“Yeah,” I said.

She wrinkled her nose and receded. I could tell I’d done something unsavoury.

I’d been in the habit of buying jarred garlic—the kind that comes minced and suspended in oil—because it was easy to use. I was still getting accustomed to the patience and time that cooking required, and the jarred stuff seemed like a no-brainer: a way to save a few minutes. But, after that day, I stopped buying it. I’m not sure I even finished that container—it likely sat half-empty in the fridge for the rest of the semester. As I started to pride myself on my cooking, I also became hyperaware of wanting to do things the right way, and I noticed all the recipes and cooking shows I followed only ever used fresh ingredients.

Online, people affirmed my new belief. They joked that those who use jarred garlic can’t cook. “What if you met your soul mate, but then found out they cook with pre-minced garlic in a jar,” one tweet said. Media outlets published article after article condemning the stuff. An old quote by the celebrity chef Anthony Bourdain seemed to resurface every few months. “Avoid at all costs that vile spew you see rotting in screwtop jars,” he had written in his book Kitchen Confidential, first published in 2000. “Too lazy to peel fresh? You don’t deserve to eat garlic.”

This was the first of many haughty ideas I’d hear about cooking and how selective we should be with our food. Who would buy a bottle of lemon juice when you could buy fresh lemons? Shredded cheese when you could grate your own? Pre-sliced mushrooms when they were apparently cheaper and better whole? An image of pre-peeled oranges from Whole Foods caused so much outrage in 2016 that the product was pulled from stores. Then there was the notion that those who opted for these packaged foods or ingredients weren’t just lazy—they were wreaking havoc on the environment with all that unnecessary plastic. Under all of this criticism, there was always a minority voice telling people they were being ableist. Those voices reminded us that disabled people rely on these shortcuts—that not everyone can chop and peel and slice. Over and over again, those voices were drowned out by the majority.

I, too, ignored them, and I learned to love fresh garlic. I learned to love the feeling of smashing a clove between a cutting board and the flat side of a knife. Of mincing it: always by hand, never with a press. I learned to always use more than the recipe called for, to love the smell and sizzle of it frying, to make sure it didn’t get all browned and bitter in the pan. I learned to appreciate the slight (but usually unnoticeable) difference in taste when I used the fresh stuff, the allicin edge a little sharper or more present.

Read the rest of this article at: The Walrus

Follow us on Instagram @thisisglamorous

At the end of March, a book that had been condemned to die came back to life. There was no star-studded launch, and no great fanfare, although this book is now somewhat famous. The new publisher of the poet Kate Clanchy’s memoir Some Kids I Taught and What They Taught Me felt it wrong to cash in on the controversy that has engulfed it. So the new editions – with some intriguing changes to the original text – were quietly resupplied to bookshops willing to stock them.

What follows is a tale that reverberates well beyond publishing. It’s about whose voice is heard, which stories are told, and by whom. But it has broader implications for working life, too, particularly in industries where so-called culture wars raging through the outside world can no longer be left at the office door.

When Some Kids first emerged in 2019, Clanchy was much admired for her work at an Oxford comprehensive, teaching children from diverse backgrounds to write poetry, with sometimes luminous results. A celebration of multicultural school life, coupled with candid reflections on her own flaws, Some Kids was lauded by reviewers and won the Orwell prize for political writing, with judges praising a “brilliantly honest writer” whose reflections were “moving, funny and full of love”. But then things began to unravel.

In November 2020, a teacher posted on the amateur reviewers’ website Goodreads that the book was “centred on this white middle-class woman’s harmful, judgmental and bigoted views on race, class and body image”, using “racist stereotypes” to describe pupils. The author, she said, wrote of their “chocolate skin” and “almond eyes”.

Clanchy hit back, initially on Goodreads and then in July 2021 on Twitter, claiming “someone made up a racist quote and said it was in my book” and urging her followers to challenge reviews she said had caused threats against her. Literary giants, including the 75-year-old children’s author (and president of the Society of Authors) Philip Pullman, rose to her defence. Yet it quickly emerged that those phrases (although not, as we will later hear from Clanchy, everything attributed to her) were in the book. Her prickly response not only sat awkwardly with Some Kids’ theme of a narrator open to learning about herself – one who believed, she wrote, that deep down “most people are prejudiced; that I am, that prejudice happens in the reading of poetry as well as everything else” – but had unintended consequences for her critics, too.

Three writers of colour, Monisha Rajesh, Prof Sunny Singh and Chimene Suleyman, who had challenged Clanchy on Twitter, endured months of racist abuse and sometimes violent threats, despite Clanchy’s own publisher, Picador, describing their criticisms as “instructive and clear-sighted”. An 18-year-old autistic writer named Dara McAnulty, who had questioned Clanchy’s description of two autistic pupils as “jarring company”, was forced off social media by abusive messages. Picador, having initially apologised, saying Clanchy would rewrite the book, then announced this January that it was parting company with her by mutual consent. (She has suggested Some Kids would have been pulped had Mark Richards, co-founder of the new publishing house Swift, not bought the rights.) Clanchy, who lost both her parents and got divorced in the same year her career imploded, meanwhile disclosed in December that she had, at times, felt suicidal.

The row erupted at an anxious time for publishing, following similar pushback against novels ranging from Jeanine Cummins’s 2020 book American Dirt – whose portrayal of a migrant Mexican family was critically acclaimed, until Latin American writers accused its author (who is of Irish and Puerto Rican heritage) of peddling stereotypes and inaccuracies – to the queer black author Kosoko Jackson’s A Place for Wolves, a gay love story set during the Kosovo war that was withdrawn in 2019 at the writer’s request after Goodreads reviewers attacked his representation of Muslim characters.

The Nobel prize winner Kazuo Ishiguro has recently suggested authors are running scared of an “anonymous lynch mob” online, while the novelist Sebastian Faulks vowed no longer to describe female characters’ appearance after being criticised for doing so in the past. Debate rages over whether these are long overdue correctives, or represent the stifling of imagination; whether art has the right to offend, and whether publishing would be navigating all this less clumsily if it weren’t a predominantly white middle-class industry itself.

That Some Kids got so far without ringing alarm bells merely confirms some of its critics’ suspicions of a business employing many people like Clanchy, but few who resemble her pupils. Yet others in the industry are troubled that one writer was seemingly left to face the fallout alone, as a scapegoat for wider collective sins.

“It was a group fail,” says one veteran agent, who asks to remain anonymous. “I think the publisher failed in their duty of care to the writer. I think the author failed in her duty of care to her pupils, and in saying that she didn’t write what she did. Nobody emerges from that story well. Harm has been done, and now everyone’s afraid.”

Read the rest of this article at: The Guardian

P.S. previous articles & more by P.F.M.