16. Law of Small Numbers. People are insufficiently sensitive to the quality of evidence. They often express greater confidence in predictions based on small samples of data with nondisconfirming evidence than in much larger samples with minor disconfirming evidence. Sample size and reliability often have little influence on confidence.
17. Order Effects. The order in which information is presented affects information retention in memory. Typically, the first piece of information presented (primacy effect) and the last presented (recency effect) assume undue importance in the mind of the decision-maker.
18. Outcome-Irrelevant Learning System. Use of an inferior processing or decision rule can lead to poor results that the decision-maker nevertheless believes are good, because he or she is unable to evaluate the impacts of the choices not selected and the hypotheses not tested.
19. Representativeness. When making inferences from data, too much weight is given to the results of small samples: they are taken to be as representative of the larger population as the results of much larger samples. The "laws" of representativeness differ considerably from the laws of probability, and violations of the conjunction rule, P(A ∧ B) ≤ P(A), are often observed; the sketch following this list illustrates both effects.
20. Selective Perceptions. People often seek only information that confirms their views and values and disregard or ignore disconfirming evidence. Issues are structured on the basis of personal experience and wishful thinking. There are many illustrations of selective perception. One is "reading between the lines": for example, denying the antecedent and, as a consequence, accepting "if you don't promote me, I won't perform well" as following inferentially from "I will perform well if you promote me."
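Two of these biases lend themselves to a simple numerical demonstration. The Python sketch below is illustrative only; the probabilities and sample sizes are assumed, not taken from the handbook. It simulates Bernoulli draws to show that small-sample estimates are far more variable than large-sample ones (items 16 and 19) and checks empirically that a conjunction of events is never more probable than either event alone.

```python
import random
import statistics

random.seed(42)

# (a) Law of small numbers: small samples yield far more variable
# estimates of a true proportion than large samples do.
P_TRUE = 0.6  # assumed true success probability (illustrative)

def sample_proportion(n):
    """Estimate P_TRUE from n Bernoulli draws."""
    return sum(random.random() < P_TRUE for _ in range(n)) / n

for n in (10, 100, 1000):
    estimates = [sample_proportion(n) for _ in range(500)]
    print(f"n={n:4d}: mean={statistics.mean(estimates):.3f}, "
          f"stdev={statistics.pstdev(estimates):.3f}")
# The mean estimate is similar at every n, but the spread shrinks
# roughly as 1/sqrt(n): confidence should depend on sample size.

# (b) Conjunction rule: P(A and B) <= P(A) for any events A and B.
trials = 100_000
count_a = count_ab = 0
for _ in range(trials):
    a = random.random() < 0.7  # event A occurs
    b = random.random() < 0.4  # event B occurs (independent here)
    count_a += a
    count_ab += a and b       # conjunction can never exceed A alone
print(f"P(A) ~= {count_a / trials:.3f}, "
      f"P(A and B) ~= {count_ab / trials:.3f}")
```

If intuition treats the n = 10 and n = 1000 estimates as equally trustworthy, that is precisely the insensitivity to sample size described in items 16 and 19.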
Of particular interest are the circumstances under which these biases occur and their effects on activities such as the identification of requirements for a system or for planning and design. Through such study, it may be possible to develop approaches that debias or ameliorate the effects of cognitive bias. A number of studies have compared unaided expert performance with simple quantitative models for judgment and decision making. While there is controversy, most studies have shown that simple quantitative models outperform holistic expert judgment on tasks such as information processing, prediction, and choice; an equal-weights model of this kind is sketched after the prescription list below. There are a number of prescriptions that might be given to encourage avoidance of possible cognitive biases and to debias those that do occur:
1. Sample information from a broad database and be especially careful to include databases that might contain disconfirming information.
2. Include sample size, confidence intervals, and other measures of information validity in addition to mean values, as illustrated in the confidence-interval sketch at the end of this list.
3. Encourage use of models and quantitative aids to improve upon information analysis
through proper aggregation of acquired information.
4. Avoid the hindsight bias by providing access to information at critical past times.
5. Encourage people to distinguish good and bad decisions from good and bad outcomes.
6. Encourage effective learning from experience. Encourage understanding of the decision situation and of the methods and rules used in practice to process information and make decisions, so as to avoid outcome-irrelevant learning systems.
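As a minimal illustration of prescription 3 and of the earlier claim that simple quantitative models often match or beat holistic judgment, the following sketch implements an equal-weights linear model in the spirit of Dawes's "improper linear models": each cue is standardized across the alternatives and the standardized values are simply summed. The cue names and data are hypothetical.

```python
import statistics

def equal_weights_score(cases, cue_names):
    """Score each case by summing its standardized cue values.

    cases: list of dicts mapping cue name -> raw cue value.
    Returns (score, case) pairs, best first.
    """
    # Standardize each cue across cases: z = (x - mean) / stdev.
    stats = {}
    for cue in cue_names:
        values = [c[cue] for c in cases]
        # Guard against zero spread with a unit divisor.
        stats[cue] = (statistics.mean(values),
                      statistics.pstdev(values) or 1.0)

    scored = []
    for case in cases:
        z_sum = sum((case[cue] - stats[cue][0]) / stats[cue][1]
                    for cue in cue_names)
        scored.append((z_sum, case))
    return sorted(scored, key=lambda pair: pair[0], reverse=True)

# Hypothetical design alternatives rated on three cues.
alternatives = [
    {"name": "A", "reliability": 0.92, "cost_margin": 0.10, "maturity": 7},
    {"name": "B", "reliability": 0.88, "cost_margin": 0.25, "maturity": 9},
    {"name": "C", "reliability": 0.95, "cost_margin": 0.05, "maturity": 5},
]
cues = ["reliability", "cost_margin", "maturity"]
for score, alt in equal_weights_score(alternatives, cues):
    print(f"{alt['name']}: {score:+.2f}")
```

Because no weights are fitted, the model cannot chase noise in a small sample, which is one reason such improper models prove robust relative to holistic expert judgment.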
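Prescription 2 can be made routine with a helper that reports an interval rather than a bare mean. The sketch below uses a normal approximation with z = 1.96; for very small samples a t-based interval would be more appropriate. The data are hypothetical.

```python
import math
import statistics

def mean_with_ci(data, z=1.96):
    """Return (mean, half_width) for an approximate 95% confidence
    interval on the mean, using a normal approximation."""
    n = len(data)
    mean = statistics.mean(data)
    se = statistics.stdev(data) / math.sqrt(n)  # standard error
    return mean, z * se

# The same mean deserves very different confidence at small n.
small = [10.2, 9.8, 11.1, 9.5, 10.9]
mean, hw = mean_with_ci(small)
print(f"n={len(small)}: {mean:.2f} +/- {hw:.2f}")
```

Reporting the half-width next to the mean keeps sample size and reliability visible to the decision-maker, directly countering the law of small numbers.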