





[Figure 10.9  Reason's Swiss cheese model of system failure. Diagram labels: Active Failure (Human Error), Barriers, System Failure.]



In this model, the defenses built into a system are compared to slices of Swiss cheese. Some types of Swiss cheese, such as Emmental, have holes and so the analogy is that the latent conditions are comparable to the holes in cheese slices. The position of these holes is not static but changes depending on the state of the overall sociotechnical system. If each slice represents a barrier, failures can occur when the holes line up at the same time as a human operational error. An active failure of system operation gets through the holes and leads to an overall system failure.

Normally, of course, the holes should not be aligned, so operational failures are trapped by the system. To reduce the probability that system failure will result from human error, designers should:

1.  Design a system so that different types of barriers are included. This means that the 'holes' will probably be in different places and so there is less chance of the holes lining up and failing to trap an error.

2.  Minimize the number of latent conditions in a system. Effectively, this means reducing the number and size of system 'holes'.

Of course, the design of the system as a whole should also attempt to avoid the active failures that can trigger a system failure. This may involve designing the operational processes and the system to ensure that operators are not overworked, distracted, or presented with excessive amounts of information.
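The effect of including several independent barriers can be illustrated with a simple calculation. The short Python sketch below is not part of Reason's model itself; it assumes, purely for illustration, that each barrier independently fails to trap an error with a small probability, so the chance that an error passes every barrier is the product of those probabilities (all numbers are hypothetical).

    # Illustrative sketch: probability that one operational error passes every
    # barrier, assuming each barrier independently fails to trap it with the
    # given probability (hypothetical numbers).
    from math import prod

    def p_error_passes_all_barriers(hole_probabilities):
        # All the 'holes' must line up, so multiply the individual probabilities.
        return prod(hole_probabilities)

    print(p_error_passes_all_barriers([0.05, 0.02, 0.10]))  # 0.0001, i.e. 1 error in 10,000
    print(p_error_passes_all_barriers([0.01, 0.01, 0.05]))  # 5e-06 after shrinking the 'holes'

The independence assumption only holds when the barriers are of different types, with their 'holes' in different places; barriers that share a latent condition can fail together.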


                            10.5.2 System evolution

Large, complex systems have a very long lifetime. During their life, they are changed to correct errors in the original system requirements and to implement new requirements that have emerged. The system's computers are likely to be replaced with new, faster machines. The organization that uses the system may reorganize itself and hence use the system in a different way. The external environment of the system may change, forcing changes to the system. Hence evolution, where the system changes to accommodate environmental change, is a process that runs alongside normal system operational processes. System evolution involves reentering the development process to make changes and extensions to the system's hardware, software, and operational processes.