
Many Forensic Psychologists Don’t Use Evidence-Based Protocols

According to a study by psychologists and lawyers, up to one-third of forensic psychologists don’t use evidence-based protocols for their evaluations. Even more distressing is the finding that lawyers rarely challenge such untested procedures, and that courts exclude the resulting evidence even more rarely. The researchers noted that their data “suggest that some of the weakest tools tend to get a pass from the courts.”

In reaching that conclusion, the study’s authors reviewed 22 surveys of psychologists about the tools they use in forensic work and identified 364 different assessment tools that the psychologists reported using. Most states now follow the federal Daubert standard for expert testimony, which requires the proponent to show a solid basis for the expert’s opinion. Using that framework, the first part of the study analyzed those 364 tools for general acceptance and for whether the protocol had been subjected to testing. The researchers found that 10% of the tools apparently had never been tested, and they could not find any literature addressing general acceptance for half of the tools. Of the half that did show up in the literature, 66% (or 33% of the total pool) were generally accepted, while the rest were disputed or not accepted. In other words, roughly two-thirds of the protocols that forensic psychologists use in court-related testimony are NOT generally accepted in the field.
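To put rough numbers on that (assuming the reported percentages apply evenly across the full pool): of the 364 tools, about half, roughly 182, showed up in the literature at all. Sixty-six percent of those, around 120 tools, or about 33% of the 364, were generally accepted. The remaining two-thirds or so, on the order of 240 tools, either had no published evidence of acceptance or were disputed or rejected in the literature.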

The second part of the study looked at how the legal system treats the protocols. For purposes of this analysis, the researchers narrowed the 364 tools down to 30 exemplars and searched legal databases for mentions of those 30 tools. They found that lawyers challenged the use of a tool in only 5% of the cases that discussed one. Within that subset of challenges, the judge excluded the testimony only 32% of the time. The courts often stated that counsel’s arguments went to the weight the jury should give the testimony rather than to its admissibility.
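A back-of-the-envelope reading of those figures, assuming the two rates can simply be multiplied: if a given tool is discussed in 100 cases, lawyers would challenge it in about 5, and the court would exclude the testimony in only 1 or 2 of those (32% of 5). In other words, questionable tools are actually kept out of evidence in well under 2% of the cases in which they appear.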

Psychology and the other soft sciences are harder to measure than the physical sciences, mainly because researchers cannot use many of the methods available to chemists and physicists. It is unethical, for example, to take baseline measurements of a group of people, deliberately subject them to trauma, and then measure them again to see what changed. The best that researchers can do is rely on indirect measures and look for strong correlations. That difficulty, however, does not excuse ignoring the evidence-based research that does exist.

This study indicates that many psychologists are doing exactly that, and that the legal system is allowing it to happen. Admittedly, it is a relatively limited study, with all of the ambiguities inherent in evaluating the soft sciences. Yet the legal system should pay attention to the possibilities it raises. Lawyers need to delve more deeply into the bases for experts’ opinions, and judges need to require evidence-based opinions from experts in the soft sciences just as they do from other experts.

Our bottom-line conclusion is that evidentiary challenges to psychological tools are rare, and challenges to the most scientifically suspect tools are even rarer or nonexistent.

Tags

ausburn_deborah, youth services law, expert testimony, insights