“legitimate” percepts. In a poorly designed system, however, false negatives or positives can cause erratic operation.
FAULT RESILIENCE

The term fault resilience can refer to either of two different characteristics of a computerized robotic system.

The first type of fault-resilient system can also be called sabotage-proof. Suppose that all the strategic (nuclear) defenses of the United States are placed under the control of a computer. It is imperative that it be impossible for unauthorized people to turn it off. Backup systems are necessary. No matter what anyone tries to do to cause the system to malfunction or become inoperative, the system must be capable of resisting or overcoming such an attack.
Some engineers doubt that it is possible to build a totally sabotage-proof computer. They quote the saying, “Build a more crime-proof system, and you get smarter criminals.” Also, any such system would have to be engineered and built by human beings. At least one of those people could be bribed or blackmailed into divulging information on how to defeat the security provisions. And of course, no one can anticipate all of the things that might go wrong with a system. According to Murphy’s law, which is usually stated tongue-in-cheek but which can often manifest itself as truth, “If something can go wrong, it will.” And the corollary, less often heard but perhaps just as true, is “If something cannot go wrong, it will.”
The second type of fault resilience is also known as graceful degradation. Many computers and computer-controlled robotic systems are designed so that if some parts fail, the system still works, although perhaps at reduced efficiency and speed. See GRACEFUL DEGRADATION.
FEEDBACK

Feedback is a means by which a closed-loop system regulates itself. Feedback is used extensively in robotics.
An example of feedback can be found in a simple thermostat mechanism, connected to a heating/cooling unit. Suppose the thermostat is set for 20 degrees Celsius (20°C). If the temperature rises much above 20°C, a signal is sent to the heating/cooling unit, telling it to cool the air in the room. If the temperature falls much below 20°C, a signal tells the unit to heat the room. This process is illustrated in the block diagram.
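A minimal Python sketch of one pass through this feedback loop might look like the following (the names are illustrative; only the 20°C setpoint comes from the example above):

    SETPOINT_C = 20.0  # thermostat setting from the example above

    def control_step(room_temp_c):
        # Compare the measured temperature to the setpoint and
        # command the heating/cooling unit accordingly.
        if room_temp_c > SETPOINT_C:
            return "cool"   # temperature too high: cool the room
        if room_temp_c < SETPOINT_C:
            return "heat"   # temperature too low: heat the room
        return "idle"

Note that this naive version switches modes the moment the temperature crosses 20°C, which leads directly to the problem discussed next.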
In a system that uses feedback to stabilize itself, there must be some leeway between the opposing functions. In the case of the thermostatically controlled heating/cooling system, if both thresholds are set for exactly 20°C, the system will constantly and rapidly cycle back and forth between heating and cooling.
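In code, that leeway amounts to two separate switching thresholds, a deadband around the setpoint. Continuing the sketch above, with an illustrative margin of 1°C on each side:

    COOL_ABOVE_C = 21.0  # begin cooling only above this temperature
    HEAT_BELOW_C = 19.0  # begin heating only below this temperature

    def control_step_with_deadband(room_temp_c):
        # Between 19°C and 21°C the unit stays idle, so the system
        # no longer chatters rapidly between heating and cooling.
        if room_temp_c > COOL_ABOVE_C:
            return "cool"
        if room_temp_c < HEAT_BELOW_C:
            return "heat"
        return "idle"

The width of the deadband is a design choice: a wider band means less frequent switching but larger temperature swings.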