confirmation bias, where only the information that supports the goodness of the
beloved is accepted, and the information to the contrary is rejected.
Confirmation bias appears in science and engineering as well. The paper
"False-Positive Psychology" by Simmons et al. [38] describes the biased selection
and processing of test results to support a hypothesis. The authors write that
"flexibility in data collection, analysis, and reporting dramatically increases actual
false-positive rates." This practice is also referred to as "cherry-picking." Some
researchers have, in the past, selectively presented only the data that supported their
claims and discarded the data that refuted them. Such practices helped give rise to
the replication crisis of the early 2010s, when many scientific studies proved
difficult or impossible to reproduce in subsequent investigations.
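To see concretely how this kind of analytic flexibility inflates false positives,
consider a minimal Monte Carlo sketch in Python. The setup (group sizes, the
correlation between the two outcome measures, and the choice of exactly two
outcomes) is an illustrative assumption, not the exact design used by Simmons
et al.; the point is only that reporting whichever of two outcomes "works" pushes
the false-positive rate above the nominal 5%.

```python
# Monte Carlo sketch: under a true null effect, an "honest" analysis with one
# pre-specified outcome keeps the false-positive rate near alpha, while a
# "flexible" analysis that reports whichever of two correlated outcomes is
# significant inflates it. All parameters below are illustrative assumptions.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
n_experiments = 10_000   # simulated studies
n_per_group = 20         # subjects per group
alpha = 0.05
cov = [[1.0, 0.5], [0.5, 1.0]]  # two outcome measures, correlation 0.5

hits_single = 0    # significant on the single pre-specified outcome
hits_flexible = 0  # significant on either of the two outcomes

for _ in range(n_experiments):
    # Both groups are drawn from the same distribution: no real effect exists.
    a = rng.multivariate_normal([0.0, 0.0], cov, size=n_per_group)
    b = rng.multivariate_normal([0.0, 0.0], cov, size=n_per_group)

    p1 = ttest_ind(a[:, 0], b[:, 0]).pvalue  # outcome measure 1
    p2 = ttest_ind(a[:, 1], b[:, 1]).pvalue  # outcome measure 2

    hits_single += p1 < alpha
    hits_flexible += min(p1, p2) < alpha

print(f"pre-specified outcome: {hits_single / n_experiments:.3f}")   # ~ alpha
print(f"best of two outcomes:  {hits_flexible / n_experiments:.3f}") # > alpha
```

Running this shows the flexible analysis declaring "significance" in noticeably
more than 5% of studies even though no real effect exists, which is the mechanism
behind both cherry-picking and the inflated false-positive rates that Simmons et al.
describe.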
Confirmation bias is also the reason why, when an outlier appears in test results,
an effort is made to find the root cause of the aberrant measurement so that it can be
discarded, whereas when the test results meet expectations, no root-cause analysis
is done.
Anchoring bias—You may have experienced that when you are not very confident
in your own thought or opinion, hearing someone else's view sways you to their
side. This is called anchoring bias: another thought or piece of information anchors
your thinking and biases you toward the anchor. Imagine that you were about to
guess that a jar holds 300 jelly beans, but before you could say anything, a
respected, smarter person declared that the jar holds at least 1000 jelly beans.
Could you see yourself revising your estimate upward?
In working meetings, a few people typically dominate and anchor the thinking of
the other team members. Usually people with more authority or seniority have that
power, but those with the loudest voices, or those who are more self-assured or
impassioned, can anchor other people's thoughts as well.
Availability bias—When a thought or concept is more recent or easier to remember,
it is seen as more true or more relevant. This is called availability bias. For
example, when a serious adverse event happens in the field, e.g., a patient is
seriously injured by a medical device, the whole engineering team and the
management see that event as the highest risk that must be addressed. Yet it is
possible that an even worse risk, one that has not happened yet, is lurking in the
background.
The above examples of cognitive traps are only some of the many ways that our
minds can be led to poor decisions. With respect to risk management, we need to be
vigilant so that we do not miss Hazards, so that we estimate risks well, and so that
we make the design decisions that best reduce the risks of our medical devices. In
your daily work, be mindful of the cognitive biases that we all have, and try to be
objective in considering and evaluating your own thoughts as well as the thoughts
of others.