Wednesday, April 18, 2012

Big Brother: Can the fMRI Undermine Privacy?


Researchers at Dartmouth recently conducted a study in which they were able to predict future behavior of the participants based on fMRI scans of the brain. Yes, that’s right; it is now possible to make a pretty good guess as to how disposed an individual is to certain types of behavior.
I will summarize briefly. The researchers showed each subject a series of images while measuring activity in the pleasure center of the participant’s brain. The images ranged from the sexually arousing to the culinarily intriguing, and the degree of increased brain activity each one elicited turned out to predict the participant’s future behavior. For example, if a participant showed a particularly acute reaction to a picture of a cake, the researchers hypothesized that the individual would tend to gain weight. When the researchers surveyed the participants some months after the experiment (having weighed them before and after), these hypotheses held up statistically.
So, while this experiment did not attempt to predict thoughts and immediate actions, it did demonstrate a form of “mind-reading”: the ability to accurately gauge an individual’s mental proclivities. Consider what might happen if such fMRI scans became routine in law enforcement: just as we are fingerprinted and blood-typed now after a run-in with the police, perhaps in the future we might be brain-scanned to gauge our capacity for future criminal behavior.
The problem here is that, while we are guaranteed the right not to self-incriminate, brain scans don’t lie. If a person were subjected to such a scan, the law could force their psyche into the open — not in any great detail, but in enough detail to judge future behavior. And this is bad: bad for the person in question, and bad for society as a whole. The reason is that, despite the best intentions of the law — protecting the general welfare, for instance — when the law can “steal” information and make significant decisions based on it (decisions like whether or not to release a minor criminal), the privacy of everyone is threatened. What is to stop the government from extending such a program to try to identify potential criminals before they commit a crime? Sound familiar?
That’s pretty much the same function as the Thought Police in 1984, and I think that’s where our society — if we were to use brain scans in law enforcement — might end up. If we agree to compromise the individual’s privacy for the sake of general safety, we give up what makes our individual lives worth living; we sacrifice the individual to the aggregate. And while it might not be for the same reasons or have the same manipulative character as the government in 1984, the effects on society would be the same: demoralization and dehumanization, turning the spontaneous and free into the homogeneous and mechanical. It’s all well and good to study the brain and learn how it works, but when we can read each other’s minds — and the research cited is a far cry from that — we should be worried.
(1) http://www.popsci.com/technology/article/2012-04/brain-scans-can-reliably-predict-your-future-insatiable-appetites

1 comment:

dmrd said...

I completely agree that the prospect of thought crime and loss of privacy that could result from such technologies is frightening. I also believe, however, that the concerns surrounding this prospect are overblown in most cases. Technologies such as fMRI are relatively new, and although our understanding of the brain has advanced to the point where we can predict certain things, such as whether a person will make a purchase a few seconds before they decide, we know very little about interpreting more complicated decisions and proclivities. For most purposes, brain scans are little better than a polygraph test: error-prone and more flash than substance. In modern law, many judges actually refuse to accept neuroscientific evidence for exactly this reason. Many feel that the aura of “science” surrounding such evidence biases a jury more than it objectively should. You do note that we are a long way off from actual mind reading, but there is no evidence that a more thorough understanding of the brain will necessarily lead to reliable readings from brain scans. Such scans run into the same problems that plague more traditional techniques — for instance, a person who genuinely believes the lies they tell. In the end, while the dire predictions that come with integrating technologies such as brain scans into law enforcement are frightening, I believe the concerns are mostly overblown.

http://www.theneuroethicsblog.com/2012/03/daubert-and-frye-neuroscience-in.html