Judicial gatekeeping on expert testimony is often discussed as a concern about junk science – with an implication that anything not plainly lacking in scientific basis is a mere question of persuasiveness that should therefore go to the jury. The New Jersey Appellate Division recently took that concept to its extreme conclusion, holding that whenever a well-credentialed expert relies on some sort of scientific data and can offer an explanation for his conclusions, that testimony must be admitted, no matter the methodological flaws. Those flaws, the panel determined, go merely to the strength of the testimony, and weaknesses can be exposed on cross-examination and countered by opposing experts. As a result, the panel reversed the trial judge’s studied judgment barring flawed expert testimony in the ongoing In re Accutane Litigation.
In a brief filed this week in the New Jersey Supreme Court, we argue that this focus on “junk science” is misplaced and, in fact, has matters backwards. Identifying obvious “junk science” may be relatively easy for a jury. Judicial gatekeeping is most critical, and flawed expert testimony most dangerous, where a well-credentialed expert presents an interpretation of the underlying data based on flawed methodology that jurors are ill-suited to evaluate.
What Do Juries Do Best?
Juries function best when evaluating the credibility of ordinary witnesses, aided by the adversarial process. Inconsistencies can be revealed through cross-examination; erroneous and false testimony can be countered by opposing witnesses. Jurors can then reach their own conclusion as to whether the stop light was red or green.
But jurors are more easily misled when presented with competing interpretations of scientific data. Studies demonstrate that jurors struggle to make methodology-based distinctions when evaluating expert testimony. Juries presume the legitimacy of admitted evidence, and do not reliably distinguish between low- and high-quality evidence. And the adversarial process does not effectively assist jurors in recognizing flaws in expert testimony, in part because jurors lack the opportunity to delve into the underlying studies and make their own critical evaluations.
What Judges Do Better
Judges, by contrast, have the opportunity to immerse themselves in the relevant studies and make meaningful inquiries into the methodological soundness of proposed testimony. Multicounty litigation (MCL) judges, in particular, can develop familiarity with the types of scientific and technical disputes that arise frequently in medical causation cases. Of course, as part of that evaluation, judges must also justify their decisions – and their analysis is more readily reviewable on appeal, further enhancing both accuracy and predictability.
The focus on “junk science” therefore gets it precisely backwards. A standard that would permit well-credentialed experts to offer “plausible explanations” would eliminate judicial gatekeeping in exactly those cases where jurors are least equipped to assess scientific validity and methodological soundness, and where judges’ comparative advantage in evaluating complex scientific theories is highest.
The solution is to adopt a standard that does not merely screen for well-qualified experts opining on scientific data, but that requires rigorous judicial scrutiny of the analytical process itself and provides trial court judges reliable guidance in discharging their gatekeeping responsibilities. Adopting the Daubert standard – expressly aligning New Jersey courts with the prevailing standard in the federal courts and the vast majority of other state courts – would give New Jersey judges effective guidance on thorny methodological issues and ensure predictable and robust judicial gatekeeping in New Jersey state courts.