THE WAY YOU talk can reveal a lot about you—especially if you’re talking to a chatbot. New research shows that chatbots like ChatGPT can infer a great deal of sensitive information about the people they chat with, even if the conversation is utterly mundane.
The phenomenon appears to stem from the way the models’ algorithms are trained with broad swathes of web content, a key part of what makes them work, likely making it hard to prevent. “It’s not even clear how you fix this problem,” says Martin Vechev, a computer science professor at ETH Zurich in Switzerland who led the research. “This is very, very problematic.”
Vechev and his team found that the large language models that power advanced chatbots can accurately infer an alarming amount of personal information about users—including their race, location, occupation, and more—from conversations that appear innocuous.
Vechev says that scammers could use chatbots’ ability to guess sensitive information about a person to harvest sensitive data from unsuspecting users. He adds that the same underlying capability could portend a new era of advertising, in which companies use information gathered from chatbots to build detailed profiles of users.
Some of the companies behind powerful chatbots also rely heavily on advertising for their profits. “They could already be doing it,” Vechev says.
The Zurich researchers tested language models developed by OpenAI, Google, Meta, and Anthropic. They say they alerted all of the companies to the problem. OpenAI spokesperson Niko Felix says the company makes efforts to remove personal information from training data used to create its models, and fine-tunes them to reject requests for personal data. “We want our models to learn about the world, not private individuals,” he says. Individuals can request that OpenAI delete personal information surfaced by its systems. Anthropic referred to its privacy policy, which states that it does not harvest or “sell” personal information. Google and Meta did not respond to requests for comment.
“This certainly raises questions about how much information about ourselves we’re inadvertently leaking in situations where we might expect anonymity,” says Florian Tramèr, an assistant professor also at ETH Zurich who was not involved with the work but saw details presented at a conference last week.
Tramèr says it is unclear to him how much personal information could be inferred this way, but he speculates that language models may be a powerful aid for unearthing private information. “There are likely some clues that LLMs are particularly good at finding, and others where human intuition and priors are much better,” he says.
Read the rest of this article at: Wired
Travel and history can both inspire a sense of moral relativism, as they did for the Greek historian and traveller Herodotus in the 5th century BCE. What should one make of the fact that what counts as adultery, for example, differs around the world? In Lust in Translation (2007), the contemporary writer Pamela Druckerman chronicles how the rules of infidelity vary ‘from Tokyo to Tennessee’. It can be tempting to conclude that the correct answer to moral questions is ultimately settled by convention, perhaps like matters of etiquette such as how to eat your food. For Herodotus, the recognition of cultural difference led him to declare, echoing the words of the Greek poet Pindar, that ‘custom is king of all.’
The acclaimed British philosopher Bernard Williams, writing in the 1970s, showed that a common way of arguing for moral relativism is confused and contradictory. Nonetheless, he went on to defend a philosophical worldview that incorporated some of relativism’s underlying ideas. There is much to learn, when we think about the ongoing culture wars over moral values, from the encounters with relativism that recur throughout Williams’s work. First, however, it’s useful to understand why a prevalent feature of the culture wars, arguing over which words to use, itself quickly leads to arguments over relativism.
Consider the following memorable scene in Sally Rooney’s novel Conversations with Friends (2017). The central character, Frances, who is sleeping with Bobbi, rejects her friend Philip’s insistence that ‘in basic vocabulary she is your girlfriend.’ Frances is right to resist Philip’s attempt to put a familiar label on things: she is trying to live in a way for which there aren’t words yet. Elsewhere in the book, Frances questions not only the word ‘couple’ but even the term ‘relationship’ to depict her life with Bobbi. If she isn’t sure how to describe her complicated situation, it’s in part because it doesn’t easily fit into the grids of conventional thought. She wants, to use an image from James Joyce, to ‘fly by’ the nets of language.
The words your society uses, as Frances is highly aware, shape the self you can become. Language is loaded with ethical expectations. If you agree that you are in a ‘couple’ with someone, for instance, then that commonly (though not always) carries with it the expectation that you will not be in bed with anyone else. That norm can be challenged, and has been, by those who are in open relationships. However, if you are trying to live in a way that is new, and doesn’t fit into accustomed categories, then it’s likely that you will be misunderstood and deprived of social recognition. Even so, as the American philosopher Judith Butler has argued in Undoing Gender (2004), there are situations where it’s better to be unintelligible than to force oneself into the existing menu of social options.
Read the rest of this article at: Aeon
OSLO, Norway — With motor vehicles generating nearly a 10th of global CO2 emissions, governments and environmentalists around the world are scrambling to mitigate the damage. In wealthy countries, strategies often revolve around electrifying cars — and, for good reason, many are looking to Norway for inspiration.
Over the last decade, Norway has emerged as the world’s undisputed leader in electric vehicle adoption. With generous government incentives available, 87 percent of the country’s new car sales are now fully electric, a share that dwarfs that of the European Union (13 percent) and the United States (7 percent). Norway’s muscular EV push has garnered headlines in outlets like the New York Times and the Guardian while drawing praise from the Environmental Defense Fund, the World Economic Forum, and Tesla CEO Elon Musk. “I’d like to thank the people of Norway again for their incredible support of electric vehicles,” he tweeted last December. “Norway rocks!!”
Read the rest of this article at: Vox
Carpenter wrote one of the most iconic movie themes of all time in a rush, out of necessity, and basically by noodling on a keyboard.
The mother of invention forced his hand—Carpenter’s shoestring budget for Halloween, $300,000, meant that he was the only composer he could afford—and the best tool at hand for a director with no formal musical training and no money in 1978 was a synthesizer. So he popped into a synth studio operated by a fellow USC alum and quickly whipped together a lean, economical score that would give the film some of the tension and propulsion and electric jolts that were missing on the screen.
The result was one of the most spine-tingling yet downright danceable film scores ever produced. Carpenter played a handful of simple ideas on a grand piano and synth keyboard, conjuring them up in the moment while smarter techs made the actual machines work. “The stuff you hear on my early albums is all me playing,” Carpenter says, “and, boy, it has to be simple if I’m doing it.”
“John had a particular skill set that was unconscious,” says Dan Wyman, the synth programmer and orchestrator who helped Carpenter achieve the Halloween score. The director knew enough about music theory to take his haunting hook around a sequential series of harmonic keys known as the “circle of fifths,” which meant he could rotate the obsessive riff almost endlessly. “That’s what really allowed Halloween to work,” says Wyman. “To him, it was noodling—but it was really intelligent noodling.”
Carpenter’s the first person to play down his musical acuity: “I don’t have great chops, on anything,” he says. Nor does he claim any understanding of the analog contraptions he used to create his famous scores. “Oh, hell no. I just played the keyboard. I would say, ‘Let me have a bass sound, Dan,’ and he would do it. ‘Let me have a violin.’ Until recently, I’ve really never been conversant with the machinery at all.”
“He’s very humble about it,” says Alan Howarth, Carpenter’s other synth guru and co-composer for many years. When a record label approached Howarth about producing a soundtrack album for Escape From New York, Carpenter’s response was: “Really? Somebody would want to listen to that stuff?”
Yes—they would. Carpenter is the only filmmaker in history who is as beloved for his musical creations as he is for his movies. The hypnotic minimalism and tension he cooked up with delectable synth waves gave films like Assault on Precinct 13, Escape From New York, Big Trouble in Little China, and Prince of Darkness a sonic identity and electric drive that lingered in our psyches and even transcended the images to become almost a mini-genre. He may not have meant to do it, but the Master of Horror made himself a master movie musician.
Read the rest of this article at: The Ringer
Sara and David Ramirez didn’t expect to have children. It wasn’t that they didn’t want kids. They were in their 30s, newly married and eager to start a family together. But a doctor had told Sara that, due to an old car-crash injury, it was unlikely that she’d be able to carry to term. The diagnosis was crushing, but the couple were healthy and otherwise happy. David had two kids from a previous marriage. Life, they told themselves, could have been worse.
Then, in 2001, a miracle: Sara got pregnant. She gave birth the following spring to a perfect, pudgy-cheeked baby. Sara and David revelled in their unexpected parenthood, taking their smiley toddler to the beaches of south Texas, near their home in Houston, and to historic lighthouses along the Gulf of Mexico. He was a precocious kid. By age eight, he’d memorized the periodic table. A few years later, he built a DIY smelter and began melting scrap metal into ingots for fun. (His parents put an end to that hobby when a failed experiment sent shrapnel shooting off the balcony of their third-floor apartment.) He assembled gaming rigs from computer components, keeping one PC for himself and giving others away to friends so they could play together. That was just their son—selfless to a fault. In his teens, he started displaying unusual behaviour: changing his clothes multiple times a day, washing his hands excessively, refusing to sit down at restaurants because he thought the chairs were dirty. After Sara and David sought professional help for him, doctors diagnosed autism spectrum disorder and OCD. Kids sometimes mocked his peculiarities, but he never responded in kind. He knew what it was like to be an outsider, and he didn’t want anyone else to feel that way.
The Ramirez family moved around—from Colorado to Texas to Michigan and back—and, over the years, the couple had two more kids, Matthew and Elijah. The weekend their eldest child turned 18, he sat down with his parents and said, “I gotta tell you something.” He explained that, though he had been assigned male at birth and lived his childhood as a boy, he knew in his heart that he was a girl. There were hardly any other trans people in their tiny mountain town of Montrose, Colorado, so he’d been afraid to come out. But it was time: from now on, her name was Noelle. “There are three ways I can do this,” she said. “I can do it here. I can disappear for five years and come back. Or I can just kill myself.”
Aghast at the thought of Noelle harming herself, David and Sara told her that they loved her and would support her in any way they could. They offered to pay for gender-affirming surgery. “We tried everything to make her happy,” David says.
Noelle’s friends showed the same support. Even strangers approached her to applaud her courage. Still, somehow Noelle couldn’t shake the feeling that she didn’t belong in the world. She applied for college—she wanted to be a nurse—but, deep down, she was anxious about growing up. When she thought about all the responsibilities and stresses of adulthood, she felt scared, not excited. David and Sara sent her to therapy, and Noelle hoped that the cloud hovering over her would lift. But, in her darkest moments, she entertained another possibility: What if she ended her life?
Read the rest of this article at: Toronto Life