Dishonesty is one of the biggest obstacles investigators face when trying to solve a crime. People lie—especially when telling the truth spells trouble. As such, a prosecution’s case is often built around cutting through falsehoods: using DNA testing, eyewitness testimony, digital data and other evidence to convince a jury that a defendant did what they swear they didn’t.
But now, forensic science may be on the verge of a straighter path to exposing lies: functional magnetic resonance imaging (fMRI), a brain scan that maps cerebral activity by measuring blood flow. Early results of fMRI for lie detection are promising, with accuracy rates higher than 75 percent.
Searching for a reliable scientific safeguard against fibbing is hardly a new endeavor, says Jane Campbell Moriarty, a law professor at Duquesne University whose research focuses on the intersection between neuroscience and the law.
“There’s a really long history of trying to tell if people are telling the truth,” she says, with stories of lie-detector tests dating back to rice-chewing examinations in ancient Asia. Spit up rice that wasn’t sufficiently wet, and you were branded a liar.
“The theory was that when you lie, you get nervous and your salivary glands stop working as well,” says Moriarty.
By the early Middle Ages, Europeans had developed the “hot iron bar” test (where honesty was proven via your unburnt hand). Then, in the early 20th century, came the polygraph, whose inventor, William Moulton Marston, also created the comic book character Wonder Woman and her Lasso of Truth, a weapon that forced anyone captured by it to be honest.
And that’s how Marston envisioned his polygraph: the device would wrap around the taker’s chest and record blood pressure, and in doing so, it would record the truth, says Moriarty.
None of those lie-detector tests has stood the test of time. Since the 1980s, the polygraph has been inadmissible as evidence in most courts, in part because of its tendency to mark innocent people as liars. That’s because its measurements (of pulse, respiration and galvanic skin response) actually point to anxiety, not dishonesty. The rationale was that innocent people would remain calm under pressure, while the guilty would get nervous. Unfortunately for investigators (and the wrongfully accused), just being suspected of a crime can also trigger anxiety, thereby lessening the machine’s accuracy.
But unlike the polygraph (or the rice test), the fMRI doesn’t measure anxiety. Instead, Moriarty says experts look at brain activity, or “cognitive load,” because “it takes more effort to lie than to tell the truth.” The results seem to be outpacing any of the old tricks, with a 2016 study suggesting that the fMRI is 24 percent more likely to pick up on lies than the polygraph.
But there’s still a lot of work to be done before the fMRI can become a reliable indicator of dishonesty. All the studies thus far have been conducted in small, controlled laboratory environments, often with healthy college-student subjects instructed to lie by their questioners. That’s a far cry from, say, a schizophrenic drug addict with violent tendencies trying to willfully hide a homicide from the police.
“We have no idea if this works in the real world,” says Moriarty.
Even if it did, there are questions of its legality within the framework of the American justice system.
Marc Blitz, a law professor at Oklahoma City University and the author of Searching Minds by Scanning Brains, says lie-detection brain scans could be a direct affront to the Fourth Amendment (protection from unreasonable search and seizure) as well as the Fifth Amendment (protection from self-incrimination).
“My focus is on unreasonable search…[when] police or other government officials try to conduct surveillance in an area where it’s unreasonable for them to be,” says Blitz.
While the fMRI (and other brain-reading devices) might be more accurate than the polygraph in detecting lies, Blitz argues that brain scanning is also a fundamentally more invasive procedure, and courts may determine that the legal protections should reflect that, regardless of how accurate the scans prove to be.
“It’d be a more dystopian world for the government to be able to read your mind than to get information about your heart rate or your lung function,” he says.
But while brain-scan evidence might be a ways away stateside, it’s already reared its head abroad.
In 2008, an Indian judge sentenced a woman for the murder of her ex-fiancé after she failed a brain-scan test, the first time brain-scan lie detection was submitted as evidence in a criminal court. (That case involved an electroencephalogram test, or EEG, a brain scan that relies on electrical activity and has shown spottier results than the fMRI.)
Meanwhile, companies in the United States like the Truthful Brain Corporation and No Lie MRI have offered fMRI lie-detection services to defendants who want to bolster claims that their word is good, in the hopes they’ll eventually be permitted to submit their findings in court.
And whether that happens or not, brain scanning might still eventually become part of police work. The polygraph, after all, is regularly used by police, despite its inadmissibility as evidence. Often, cops use the information gathered from polygraphs to rule out suspects who pass the test, or, in the case of a failed test, point it out to a suspect, in the hope of coercing a confession.
So despite the legal obstacles standing in the way, fMRIs may still end up making for a safer world.
Moriarty isn’t so sure.
“We have a long history of misusing our ability to get to the truth of a matter,” she says. “Imagine if the government found a cheap and easy and super effective way to tell when you were lying. Would they know how to control themselves with it? Everything we’ve seen to date suggests no.”