Table 1.2 Types of bias and heuristics
Heuristic or bias – Description
Availability heuristic – Judging likelihood by instances most easily or vividly recalled
Availability bias – Overemphasizing available or salient instances
Hindsight bias – Exaggerating in retrospect what was known in advance
Anchoring and adjustment heuristic – Adjusting an initial probability to a final value
Insufficient adjustment – Insufficiently modifying the initial value
Conjunctive distortion – Misjudging the probability of combined events relative to their individual values
Representativeness heuristic – Judging likelihood by similarity to some reference class
Representativeness bias – Overemphasizing similarities and neglecting other information; confusing "probability of A given B" with "probability of B given A"
Insensitivity to predictability – Exaggerating the predictive validity of some method or indicator
Base-rate neglect – Overlooking frequency information
Insensitivity to sample size – Overemphasizing significance of limited data
Overconfidence bias – Greater confidence than warranted, with probabilities that are too extreme or distributions too narrow about the mean
Underconfidence bias – Less confidence than warranted in evidence with high weight but low strength
Personal bias – Intentional distortion of assessed probabilities to advance an assessor's self-interest
Organizational bias – Intentional distortion of assessed probabilities to advance a sponsor's interest in achieving an outcome

Source: From Vick, Steven G., Degrees of Belief: Subjective Probability and Engineering Judgment, ASCE Press, Reston, VA, 2002.
threats. We know the options in mitigating the threats. But in knowing these things, we also must know the uncertainty involved: we cannot know and control enough of the details to entirely eliminate risk. At any point in time, thousands of forces are acting on a pipeline, the magnitudes of which are "unknown and unknowable."

An operator will never have all of the relevant information he needs to absolutely guarantee safe operations. There will always be an element of the unknown. Managers must control the "right" risks with limited resources because there will always be limits on the amount of time, manpower, or money that can be applied to a risk situation. Managers must weigh their decisions carefully in light of what is known and unknown. It is usually best to assume that

Uncertainty = increased risks

This impacts risk assessment in several ways. First, when information is unknown, it is conservatively assumed that unfavorable conditions exist. This not only encourages the frequent acquisition of information, but it also enhances the risk assessment's credibility, especially to outside observers.

It also makes sense from an error analysis standpoint. Two possible errors can occur when assessing a condition: saying it is "good" when it is actually "bad," and saying it is "bad" when it is actually "good." If a condition is assumed to be good when it is actually bad, this error will probably not be discovered until some unfortunate event occurs. The operator will most likely be directing resources toward suspected deficiencies, not recognizing that an actual deficiency has been hidden by an optimistic evaluation. At the point of discovery by incident, the ability of the risk assessment to point out any other deficiency is highly suspect. An outside observer can say, "Look, this model is assuming that everything is rosy; how can we believe anything it says?!" On the other hand, assuming a condition is bad when it is actually good merely has the effect of highlighting the condition until better information makes the "red flag" disappear. Consequences are far less with this latter type of error. The only cost is the effort to get the correct information. So, this "guilty until proven innocent" approach is actually an incentive to reduce uncertainty.

Uncertainty also plays a role in inspection information. Many conditions continuously change over time. As inspection information gets older, its relevance to current conditions becomes more uncertain. All inspection data should therefore be assumed to deteriorate in usefulness and, hence, in its risk-reducing ability. This is further discussed in Chapter 2.

The great promise of risk analysis is its use in decision support. However, this promise is not without its own element of risk: the misuse of risk analysis, perhaps through failure to consider uncertainty. This is discussed as a part of risk management in Chapter 15. As noted in Ref. [74]:

    The primary problem with risk assessment is that the information on which decisions must be based is usually inadequate. Because the decisions cannot wait, the gaps in information must be bridged by inference and belief, and these cannot be evaluated in the same way as facts. Improving the quality and comprehensiveness of knowledge is by far the most effective way to improve risk assessment, but some limitations are inherent and unresolvable, and inferences will always be required.

IV. Risk process – the general steps

Having defined some basic terms and discussed general risk issues, we can now focus on the actual steps involved in risk management. The following are the recommended basic steps. These steps are all fully detailed in this text.

Step 1: Risk modeling

The acquisition of a risk assessment process, usually in the form of a model, is a logical first step. A pipeline risk assessment model is a set of algorithms or rules that use available information and data relationships to measure levels of risk along a pipeline. An assessment model can be selected
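
To make the idea of "a set of algorithms or rules" concrete, the following is a minimal, hypothetical Python sketch of a relative scoring rule for a single pipeline segment. It also reflects two practices discussed above: defaulting missing information to unfavorable (worst-case) values, and discounting inspection-based credits as the inspection data ages. The factor names, weights, and decay half-life are illustrative assumptions only, not values taken from this manual.

# Hypothetical sketch: a simple rule set that scores relative risk for one
# pipeline segment. Missing data defaults to the worst (most conservative)
# score, and inspection-based credits decay as the inspection ages.
# All factor names, weights, and rates are illustrative assumptions.

# Worst-case default score used when a factor is unknown (0 = best, 10 = worst)
WORST_CASE = 10.0

# Illustrative weights for a handful of threat factors
WEIGHTS = {
    "corrosion": 0.40,
    "third_party_damage": 0.35,
    "design": 0.25,
}

def aged_inspection_credit(credit, years_since_inspection, half_life_years=5.0):
    """Reduce an inspection-based credit as the data ages (assumed half-life)."""
    return credit * 0.5 ** (years_since_inspection / half_life_years)

def segment_risk_score(factor_scores, inspection_credit=0.0,
                       years_since_inspection=0.0):
    """Combine factor scores into a single relative risk score for a segment.

    Any factor missing from factor_scores is assumed to be at its worst
    value ("guilty until proven innocent").
    """
    weighted = sum(
        WEIGHTS[name] * factor_scores.get(name, WORST_CASE)
        for name in WEIGHTS
    )
    # Subtract whatever credit the (aged) inspection data still supports,
    # never letting the score fall below zero.
    credit = aged_inspection_credit(inspection_credit, years_since_inspection)
    return max(weighted - credit, 0.0)

# Example: corrosion data is missing, so it defaults to the worst case,
# and a 3-year-old inspection credit is partially discounted.
score = segment_risk_score(
    {"third_party_damage": 4.0, "design": 2.0},
    inspection_credit=1.5,
    years_since_inspection=3.0,
)
print(round(score, 2))

Under these assumptions, leaving a factor blank can only raise the segment's score, and an aging inspection steadily loses its risk-reducing credit, so the model itself rewards gathering current information.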