Getting beyond bias
- Subtitle: Legal Report: Forensics
Illustration: Mick Coulas
The issue was laid bare in the notorious case of Ontario pathologist Charles Smith, whose "think dirty" approach to pediatric pathology contributed to a string of wrongful convictions in the 1980s and 1990s. The Office of the Chief Coroner for Ontario identified 20 cases in which Smith had reached questionable conclusions, including 12 that ended in convictions, prompting a public inquiry led by Justice Stephen Goudge.
Goudge’s scathing report highlighted how badly public confidence in forensic pathology and the justice system had been shaken, but it also marked a turning point for forensic evidence collection and expert witness testimony, according to Frank Addario, a former president of the Criminal Lawyers’ Association. “Since the Goudge inquiry into Dr. Smith’s practices, everyone has tried to elevate their standards. You can see that in the way that Crown counsel approach cases and in the new skepticism that judges bring to forensic evidence. We need experts of course, but we also need to control their use so that we can evaluate whether they adhered to rigorous standards,” he says. “The issue of bias will always plague forensic science. The only proven way to combat it is through transparency and rigour.”
Stemming from one of Goudge’s recommendations, Ontario expert witnesses must now sign Form 53, which confirms their duty to the court above and beyond the party that retained them. Most good experts always worked under this regime anyway, but the reminder helps, says Peter Steger, a principal at Cohen Hamilton Steger, which specializes in business valuations, damages quantification, and forensic accounting. “When you’re collecting evidence, you should always be thinking about what your mandate is. And that’s to assist the court,” says Steger. “The court wants to know that your investigation was fair and unbiased, and that you’ve looked at all possible explanations for the wrongdoing, not that you’re just promoting one side of the argument. That doesn’t do anybody any good.”
Steger says clients sometimes struggle to grasp the neutral nature of his role, but the more sophisticated ones appreciate him standing his ground with a negative report. “They’d rather have the truth, instead of building up unrealistic expectations based on something rose-coloured that won’t stand up later,” he says.
Besides, any experts tempted to boost their clients’ cases are putting themselves at serious risk, according to Chris Milburn, the managing director of FTI Consulting, who is often called upon to quantify economic damages in commercial disputes. The cut and thrust of the adversarial process means any bias is bound to be exposed, and keeping in mind the opprobrium heaped upon Smith and other errant experts is as good a strategy as any for avoiding it, says Milburn. “If you get some really strong censure from a judge, that could be a career-ender. Essentially, we are in the reputation business, and there’s no single case that’s worth putting your reputation on the line.”
Ralph Balbaa, the president of HITE Engineering Corp., provides forensic engineering support for the Ministry of Labour, as well as for lawyers defending charges related to workplace accidents. He says accepting retainers from both sides of the typical investigation helps reduce any appearance of bias. “We’ve worked both sides of the fence, and it really depends on who calls us first,” he says.
At Giffen Koerth, materials engineer Okrutny is particularly interested in the more subtle, and often unintentional, biases that experts — like all people — bring to their work. While a minority of shady practitioners may consciously ignore evidence that doesn’t suit their position, even the most well-intentioned experts are subject to a series of innate biases that affect their approach to the evidence and the conclusions they reach. The simplest way to combat them is to acknowledge that they exist, and attempt to understand them, says Okrutny. “Without understanding bias, we think we’re unbiased and impartial, but we let our emotions unintentionally get in the way of our decisions,” he says. “I believe that most experts are naturally good. They don’t want to be biased, or screw other people and take advantage. They just want to work out what happened and help.”
He quotes Plato’s observation that all human behaviour flows from three sources: desire, emotion, and knowledge. “In theory our conclusions should be based only on the third one, knowledge. That means you’ve got to be able to control the other two when you’re collecting evidence, writing reports, or testifying,” says Okrutny. He identifies four major sources of bias that experts should look out for from the very beginning of every case:
1. Anchoring bias: “This happens when someone gives you an idea or a concept, and then you begin thinking based on that anchor they’ve provided,” says Okrutny. “If you’re not careful, then you can end up focussing the investigation all on that one spot.”
2. Bandwagon effect: “Statistically, the probability of any person doing something increases based on the number of people already doing it. If you go into a scene with 30 people talking about how the crane fell over because it was overloaded, you’re now going to feel awkward or intimidated for thinking it was something else that caused it to fall over,” says Okrutny. “In the best-case scenario, you waste some time by focussing too much on that idea, but in the worst case, you completely miss other evidence because you assumed everyone else was correct.”
3. Confirmation bias: “People will try to rationalize their actions in any aspect of life. You tend to confirm your initial instinct by looking for information that confirms it,” says Okrutny. “You end up subconsciously favouring data that fits your hypothesis, so that it’s easy to recall the 20 pieces of evidence that fit, versus the 100 that didn’t.”
4. Noble-cause bias: “This happens when people are trying to do something for the greater good. They want to help,” says Okrutny. He recalls one case where a lawyer who had hired him sent pictures of an injured party. “I didn’t need to see them, but I instantly felt a rush of emotion. Maybe I was able to move on because I was aware of different biases that can act on me.”
Addario says it’s easy for lawyers to inadvertently influence an expert’s analysis and open the door to an appearance of bias by posing hypotheticals or offering their own theories of the case. “I try to hold off where possible so that if asked, my expert can say: ‘I wasn’t told what I was looking for when I got hired,’” he says.
Rick Derbecker, the vice president of specialty services at Construction Control Inc., says he has been retained as an expert in construction cases where he has been given no information about suspected defects. “The cost for the forensics work is much greater because you do a very thorough analysis of a whole bunch of things,” he says. “And the hazard for all parties is that you’ll get this wonderfully unbiased report, with no content on the issue you actually wanted to know about.” He says a simple way to avoid the appearance of bias is to keep reports as quantitative as possible. “If I can get actual numbers that I have measured, those are much easier to defend than a pure opinion based on years of experience.”
In the business valuation field where Milburn plies his trade, an abundance of numbers makes quantification easy. But he says he has to be wary about his sources in order to avoid any appearance of bias. “You need to make sure the data and evidence you’re relying on when you produce a report is as authoritative as possible. There’s different weight that will go to certain evidence,” he says. “Sometimes you’ll be working with financial projections from your own client, and it can be tricky. Of course they think the business was going to make $10 million a year from now until forever, but what are the assumptions they’ve made and how reasonable were they? You’ve got to do the due diligence on the data you get.”
Published in Features