Critical systems

Some classes of system are ‘critical systems’, where system failure may result in injury to people, damage to the environment, or extensive economic losses. Examples of critical systems include embedded systems in medical devices, such as an insulin pump (safety-critical), spacecraft navigation systems (mission-critical), and online money transfer systems (business-critical).

Critical systems are very expensive to develop. Not only must they be developed so that failures are very rare, but they must also include recovery mechanisms that are used if and when failures occur.

http://www.SoftwareEngineering-9.com/Web/Dependability/CritSys.html
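
For example, the sketch below shows one possible form that such a recovery mechanism might take. It is loosely based on the insulin pump mentioned above, but the SensorReader and Alarm interfaces, the class name, and the dose rule are all invented for illustration; they are not taken from any real device.

   interface SensorReader { int readBloodSugar() throws Exception; }
   interface Alarm { void raise(String message); }

   public class PumpController {
      private final SensorReader sensor;
      private final Alarm alarm;

      public PumpController(SensorReader sensor, Alarm alarm) {
         this.sensor = sensor;
         this.alarm = alarm;
      }

      // Computes the next insulin dose. On any failure, the controller
      // recovers by moving to a known safe state (deliver nothing) and
      // alerting the user, rather than risking an incorrect dose.
      public int computeDose() {
         try {
            int bloodSugar = sensor.readBloodSugar();
            return bloodSugar > 120 ? (bloodSugar - 120) / 10 : 0; // illustrative rule only
         } catch (Exception e) {
            alarm.raise("Sensor failure: " + e.getMessage());
            return 0; // a missed dose is recoverable; an arbitrary one may not be
         }
      }
   }

Recovery here does not mean repairing the fault; it means containing its effects so that the system continues in a degraded but safe mode until the failure is dealt with.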




1. Hardware failure System hardware may fail because of mistakes in its design, because components fail as a result of manufacturing errors, or because the components have reached the end of their natural life.

2. Software failure System software may fail because of mistakes in its specification, design, or implementation.

3. Operational failure Human users may fail to use or operate the system correctly. As hardware and software have become more reliable, failures in operation are now, perhaps, the largest single cause of system failures.

These failures are often interrelated. A failed hardware component may mean system operators have to cope with an unexpected situation and additional workload. This puts them under stress, and people under stress often make mistakes. This can cause the software to fail, which means more work for the operators, even more stress, and so on.

As a result, it is particularly important that designers of dependable, software-intensive systems take a holistic systems perspective, rather than focus on a single aspect of the system such as its software or hardware. If hardware, software, and operational processes are designed separately, without taking into account the potential weaknesses of other parts of the system, then it is more likely that errors will occur at the interfaces between the different parts of the system.
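
The sketch below illustrates this point at the operator/software interface. The class name and the safety limit are invented for illustration. By validating an operator's command before acting on it, the software takes account of a known weakness in another part of the system, namely that operators under stress make mistakes, so a slip at the console does not propagate into a software failure.

   public class OperatorConsole {
      private static final int MAX_DOSE = 25; // assumed hard safety limit

      // Accepts an operator-entered dose only if it is within the safe
      // range, so a mistyped command is rejected rather than acted upon.
      public int acceptDose(int requestedDose) {
         if (requestedDose < 0 || requestedDose > MAX_DOSE) {
            throw new IllegalArgumentException("Requested dose " + requestedDose
                  + " is outside the safe range 0-" + MAX_DOSE);
         }
         return requestedDose;
      }
   }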




11.1 Dependability properties


All of us are familiar with the problem of computer system failure. For no obvious reason, our computers sometimes crash or go wrong in some way. Programs running on these computers may not operate as expected and occasionally may corrupt the data that is managed by the system. We have learned to live with these failures, but few of us completely trust the personal computers that we normally use.

The dependability of a computer system is a property of the system that reflects its trustworthiness. Trustworthiness here essentially means the degree of confidence a user has that the system will operate as they expect, and that the system will not ‘fail’ in normal use. It is not meaningful to express dependability numerically.