
          Whether hardware or software emulation is used, any simulation will
        be dramatically slower than the physical processor; the number of pos-
        sible combinations of instructions and operands is far too vast to test
        completely even with full-speed processors. One hundred PCs running
        HDL simulations at 1 Hz for a year could simulate more than 10⁹ cycles.
        One hundred PCs testing 1 GHz prototypes for a year could simulate
        more than 10¹⁸ cycles, but the total number of possible input operands
        to a single double-precision floating-point divide is more than 10³⁸. The
        infamous Pentium FDIV bug is an example of how difficult it is to find
        logic errors. In 1994, after Intel had already shipped more than a million
        Pentium processors, it was discovered that the processor design con-
        tained a logic bug that caused the result of the floating-point divide
        instruction to be incorrect for some combinations of inputs. The chance
        of hitting one of the combinations that triggered the bug by applying
        random inputs was estimated at 1 in 9 billion.³
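
          A quick back-of-envelope check makes these numbers concrete. The
        sketch below is plain Python arithmetic using only the figures quoted
        above (the machine counts and simulation rates are the ones from this
        chapter, not measured data):

            SECONDS_PER_YEAR = 60 * 60 * 24 * 365          # roughly 3.15e7

            machines = 100
            hdl_rate = 1          # cycles/second for a full-chip HDL simulation
            proto_rate = 1e9      # cycles/second on a 1-GHz silicon prototype

            hdl_cycles = machines * hdl_rate * SECONDS_PER_YEAR
            proto_cycles = machines * proto_rate * SECONDS_PER_YEAR
            print(f"HDL sims, one year:   {hdl_cycles:.1e} cycles")    # > 10^9
            print(f"prototypes, one year: {proto_cycles:.1e} cycles")  # > 10^18

            # Two 64-bit operands to a double-precision divide: 2^128 pairs
            operand_pairs = 2.0 ** 128                     # about 3.4e38
            print(f"prototype-years for every divide: {operand_pairs / proto_cycles:.1e}")

            # An FDIV-class bug hit by one random input pair in 9 billion
            print(f"expected random trials to hit it: {9e9:.0e}")
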
          Testing instructions at random or even testing random pieces of real
        programs will never be able to thoroughly validate the design. The number
        of processor cycles that can be simulated is limited by the time required.
        The true art of design verification is not simulating lots of code, but choos-
        ing code that is likely to expose bugs in the HDL model. Although vali-
        dation engineers do not perform design themselves, they must become
        intimately familiar with the details of the design in order to find its weak
        points. Ultimately these engineers may have a better understanding of
        the overall logical operation of the processor than anyone else on the
        design team.
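
          One common way to act on that familiarity is directed-random
        stimulus: random tests deliberately biased toward the corner cases an
        engineer suspects are weak. The generator below is a toy sketch of the
        idea (the corner-case list and all function names are invented for
        illustration), biasing floating-point divide operands toward boundary
        values:

            import random
            import struct

            # Boundary values that stress corner cases of IEEE-754 divide logic
            CORNER_CASES = [
                0.0, -0.0, 1.0, -1.0,
                float("inf"), float("-inf"), float("nan"),
                5e-324,                   # smallest positive denormal
                2.2250738585072014e-308,  # smallest positive normal
                1.7976931348623157e+308,  # largest finite double
            ]

            def random_bits_double():
                """A double assembled from 64 uniformly random bits."""
                return struct.unpack("<d", struct.pack("<Q", random.getrandbits(64)))[0]

            def directed_operand(p_corner=0.5):
                """Half the time pick a known corner case, otherwise random bits."""
                if random.random() < p_corner:
                    return random.choice(CORNER_CASES)
                return random_bits_double()

            # Biased (dividend, divisor) pairs for a floating-point divide test
            test_vectors = [(directed_operand(), directed_operand()) for _ in range(10)]
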
          To measure the progress of design verification, most validation teams
        create measures of test coverage. The number of different key microar-
        chitectural states and events triggered by each test simulation is meas-
        ured. New tests are written specifically to exercise uncovered areas. Running
        the same types of tests over and over gives the illusion of a well-tested
        design without actually improving the chance of finding new bugs. Real
        validation progress is measured by tracking improvements in test cover-
        age rather than simply the total number of cycles simulated.
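
          That bookkeeping can be sketched in a few lines, assuming each
        simulation reports the microarchitectural events it triggered (the event
        names here are invented for illustration):

            # Coverage goals: key states and events the tests should reach
            COVERAGE_GOALS = {
                "branch_mispredict", "cache_miss_on_store",
                "divider_busy_stall", "tlb_miss_during_fetch",
            }

            covered = set()

            def run_test(name, events_triggered):
                """Record which goals a test hit and whether any were new."""
                new = (set(events_triggered) & COVERAGE_GOALS) - covered
                covered.update(new)
                pct = 100 * len(covered) / len(COVERAGE_GOALS)
                status = f"+{len(new)} new" if new else "no new coverage"
                print(f"{name}: {status}, total coverage {pct:.0f}%")

            run_test("random_loop_1", ["cache_miss_on_store"])
            run_test("random_loop_2", ["cache_miss_on_store"])  # repeat: no progress
            run_test("directed_div", ["divider_busy_stall", "branch_mispredict"])
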
          Design verification tests only the behavior of the HDL model. The
        other half of pre-silicon validation is implementation verification, which
        makes sure that the processor’s circuits and layout will faithfully repro-
        duce the behavior of the HDL. In the past, implementation verification
        was performed by simulating test vectors on both the HDL model and a
        netlist describing the circuit implementation, then comparing the two
        behaviors to make sure they were identical.
        The weakness in this approach is that behavior is proved to be the same
        only for the test vectors tried. There is always the chance that there are
        differences in behavior for other untested combinations of inputs.
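
          The vector-based approach can be illustrated with a toy design: a
        behavioral model and a gate-level model of the same 4-bit adder, checked
        only on the vectors actually simulated (both models and all names are
        invented for this sketch):

            import random

            def hdl_adder(a, b):
                """Behavioral (HDL-level) model: just add two 4-bit values."""
                return (a + b) & 0xF

            def netlist_adder(a, b):
                """Gate-level model: a ripple-carry adder built from the same
                AND/XOR/OR gates a netlist would describe."""
                result, carry = 0, 0
                for i in range(4):
                    x, y = (a >> i) & 1, (b >> i) & 1
                    s = x ^ y ^ carry                    # sum bit
                    carry = (x & y) | (carry & (x ^ y))  # carry out
                    result |= s << i
                return result

            # Equivalence is demonstrated only for the vectors tried; any
            # untested input combination could still hide a difference.
            vectors = [(random.randrange(16), random.randrange(16)) for _ in range(100)]
            for a, b in vectors:
                assert hdl_adder(a, b) == netlist_adder(a, b), f"mismatch at {a}+{b}"
            print("all tested vectors match (untested inputs remain unproven)")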


        ³ "Statistical Analysis of Floating Point Flaw."