For more information about the complexity of algorithms, consult the references, including [CoLeRiSt09], for this section listed at the end of this book. (Also, for a more formal discussion of computational complexity in terms of Turing machines, see Section 13.5.)

PRACTICAL CONSIDERATIONS Note that a big-Θ estimate of the time complexity of an algorithm expresses how the time required to solve the problem increases as the input grows in size. In practice, the best estimate (that is, the one with the smallest reference function) that can be shown is used. However, big-Θ estimates of time complexity cannot be directly translated into the actual amount of computer time used. One reason is that a big-Θ estimate, which states that f(n) is Θ(g(n)), where f(n) is the time complexity of an algorithm and g(n) is a reference function, means only that C₁g(n) ≤ f(n) ≤ C₂g(n) when n > k, where C₁, C₂, and k are constants. So without knowing the constants C₁, C₂, and k in this inequality, the estimate cannot be used to determine a lower bound and an upper bound on the number of operations used in the worst case. As remarked before, the time required for an operation depends on the type of operation and the computer being used. Often, instead of a big-Θ estimate on the worst-case time complexity of an algorithm, we have only a big-O estimate. Note that a big-O estimate on the time complexity of an algorithm provides an upper, but not a lower, bound on the worst-case time required for the algorithm as a function of the input size. Nevertheless, for simplicity, we will often use big-O estimates when describing the time complexity of algorithms, with the understanding that big-Θ estimates would provide more information.
    Table 2 displays the time needed to solve problems of various sizes with an algorithm using the indicated number n of bit operations, assuming that each bit operation takes 10⁻¹¹ seconds, a reasonable estimate of the time required for a bit operation using the fastest computers available today. Times of more than 10¹⁰⁰ years are indicated with an asterisk. In the future, these times will decrease as faster computers are developed. We can use the times shown in Table 2 to see whether it is reasonable to expect a solution to a problem of a specified size using an algorithm with known worst-case time complexity when we run this algorithm on a modern computer. Note that we cannot determine the exact time a computer uses to solve a problem with input of a particular size because of a myriad of issues involving computer hardware and the particular software implementation of the algorithm.
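    The entries in Table 2 can be checked with a short computation. The sketch below (not from the text) converts an operation count into a running time under the table's assumption of 10⁻¹¹ seconds per bit operation; it works with base-10 logarithms so that 2ⁿ and n! can be handled for large n, prints an asterisk when the time exceeds 10¹⁰⁰ years, and recomputes the rough orders of magnitude shown in the table.

import math

SECONDS_PER_OP = 1e-11                  # per-bit-operation estimate used in Table 2
SECONDS_PER_YEAR = 60 * 60 * 24 * 365.0

# log10 of the number of bit operations for each growth function; working with
# logarithms keeps 2^n and n! from overflowing for large n.
log10_ops = {
    "log n":   lambda n: math.log10(math.log2(n)),
    "n":       lambda n: math.log10(n),
    "n log n": lambda n: math.log10(n * math.log2(n)),
    "n^2":     lambda n: 2 * math.log10(n),
    "2^n":     lambda n: n * math.log10(2),
    "n!":      lambda n: math.lgamma(n + 1) / math.log(10),
}

def entry(lg_ops):
    """Format one entry, printing * when the time exceeds 10^100 years."""
    lg_seconds = lg_ops + math.log10(SECONDS_PER_OP)
    lg_years = lg_seconds - math.log10(SECONDS_PER_YEAR)
    if lg_years > 100:
        return "*"
    seconds = 10.0 ** lg_seconds
    if seconds >= SECONDS_PER_YEAR:
        return f"{seconds / SECONDS_PER_YEAR:.0e} yr"
    return f"{seconds:.0e} s"

for n in (10, 10**2, 10**3, 10**4, 10**5, 10**6):
    print(f"n = {n:>7}: ", "  ".join(entry(lg(n)) for lg in log10_ops.values()))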
    It is important to have a reasonable estimate for how long it will take a computer to solve a problem. For instance, if an algorithm requires approximately 10 hours, it may be worthwhile to spend the computer time (and money) required to solve this problem. But, if an algorithm requires approximately 10 billion years to solve a problem, it would be unreasonable to use resources to implement this algorithm. One of the most interesting phenomena of modern technology is the tremendous increase in the speed and memory space of computers. Another important factor that decreases the time needed to solve problems on computers is parallel processing, which is the technique of performing sequences of operations simultaneously.
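    As a rough illustration of this kind of back-of-the-envelope judgment, the following sketch (with invented numbers, not from the text) checks whether a computation fits a time budget, assuming the same 10⁻¹¹ seconds per operation and an idealized parallel speedup in which p processors divide the work evenly.

SECONDS_PER_OP = 1e-11          # same per-operation estimate as in Table 2
SECONDS_PER_HOUR = 3600.0
SECONDS_PER_YEAR = 60 * 60 * 24 * 365.0

def feasible(ops, budget_seconds, processors=1):
    """Rough feasibility check under an idealized even division of the work."""
    return ops * SECONDS_PER_OP / processors <= budget_seconds

# About 10 hours of work: may well be worth the computer time and money.
print(feasible(ops=3.6e15, budget_seconds=24 * SECONDS_PER_HOUR))              # True

# About 10 billion years of work: hopeless, even with a million processors.
print(feasible(ops=3e28, budget_seconds=SECONDS_PER_YEAR, processors=10**6))   # False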
    Efficient algorithms, including most algorithms with polynomial time complexity, benefit most from significant technology improvements. However, these technology improvements



TABLE 2  The Computer Time Used by Algorithms.

 Problem Size                              Bit Operations Used
      n          log n           n          n log n         n²            2ⁿ            n!

     10       3 × 10⁻¹¹ s     10⁻¹⁰ s    3 × 10⁻¹⁰ s      10⁻⁹ s        10⁻⁸ s      3 × 10⁻⁷ s
     10²      7 × 10⁻¹¹ s     10⁻⁹ s     7 × 10⁻⁹ s       10⁻⁷ s     4 × 10¹¹ yr        *
     10³      1.0 × 10⁻¹⁰ s   10⁻⁸ s     1 × 10⁻⁷ s       10⁻⁵ s          *             *
     10⁴      1.3 × 10⁻¹⁰ s   10⁻⁷ s     1 × 10⁻⁶ s       10⁻³ s          *             *
     10⁵      1.7 × 10⁻¹⁰ s   10⁻⁶ s     2 × 10⁻⁵ s        0.1 s          *             *
     10⁶      2 × 10⁻¹⁰ s     10⁻⁵ s     2 × 10⁻⁴ s       0.17 min        *             *