PARIS (Reuters) - Neuroscientists are making such rapid progress in unlocking the brain’s secrets that some are urging colleagues to debate the ethics of their work before it can be misused by governments, lawyers or advertisers.
The news that brain scanners can now read a person’s intentions before they are expressed or acted upon has given a new boost to the fledgling field of neuroethics that hopes to help researchers separate good uses of their work from bad.
The same discoveries that could help the paralyzed use brain signals to steer a wheelchair or write on a computer might also be used to detect possible criminal intent, religious beliefs or other hidden thoughts, these neuroethicists say.
“The potential for misuse of this technology is profound,” said Judy Illes, director of the Stanford University neuroethics program in California. “This is a truly urgent situation.”
The new boost came from a research paper published last week showing that neuroscientists can now not only locate the brain area where a certain thought occurs but also probe that area to read out some kinds of thoughts occurring there.
Its author, John-Dylan Haynes of the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig, Germany, compared this to being able to read books after previously only being able to find them. “That is a huge step,” he said.
Haynes hastened to add that neuroscience is still far from developing a scanner that could easily read random thoughts.
“But what we can do is read out some simple things that are quite useful for applications, such as simple intentions, attitudes or emotional states,” he said. “We’re finding we can read out yes-or-no situations.”
Haynes and his research team used a brain scanning technique called functional magnetic resonance imaging to detect a volunteer’s unspoken decision to add or subtract two numbers flashed on a screen. They got it right 70 percent of the time.
Barbara Sahakian, a clinical neuropsychologist at Cambridge University in Britain, said possible misuses of this technology recalled the plot of Steven Spielberg’s 2002 movie “Minority Report,” in which police arrest people who psychics predict will commit murder.
“We have to discuss how we want to use this technology and who should have access to it,” she said.
Martha Farah, director of the University of Pennsylvania’s Center for Cognitive Neuroscience, said advances such as Haynes’s way of reading more out of imaging data were opening the path to very rapid growth in understanding the brain.
“We’re just beginning to find out the power of these more fine-grained analyses,” she said. “From the neuroethics point of view, this could be really big.”
Farah, Illes and Sahakian are among a small group of neuroscientists who founded the Neuroethics Society in May 2006 to promote an international debate about the proper use of the discoveries their field was making and will make in future.
“As a neuroscientist, I’m trained to think about these issues once I have a result in hand,” Illes said. “But we need to think about those ethical implications right now.
“People want to know if, when they go to an airport, their luggage will go through one scanner and their brains will go through another. Do I think that’s around the corner? I do.”
NO TO COMMERCIAL MIND-READING
Haynes estimated his research into unspoken intentions could yield simple applications within the next five to 10 years, such as reading a person’s attitude to a company during a job interview or testing consumer preferences through “neuromarketing.”
There are already companies trying to use brain scanners to build a more accurate lie detector, a technology that could dazzle judges and juries so much that they could mistake it for the final word in deciding a case, the researchers said.
Law enforcement officials might use the technology, which tracks heightened activity in areas linked to mental responses to outside stimuli, to screen people for pedophilia, racial bias, aggression or other undesirable tendencies, they said.
“If you’re reading out something for neuromarketing or job interviews, or doing this against people’s wills, that could be considered unethical,” Haynes said.
Lie detection is more complex, he said, because it can violate mental privacy but can also prove innocence. In some cases, refusing to use it in order to uphold a right of mental privacy could end up denying an accused person’s right of self-defense.
Amid all these worrying scenarios, Haynes said people should learn about the promise and the limits of brain scanning so they can make informed decisions when new applications arise. They should also keep the technology in its proper context.
“I still strongly support the power of simple questions in psychology,” he said. “If you want to tell what someone is going to do, the best way to find out is to ask them.”