3.3 Complexity of Algorithms
size. A second measure is the amount of computer memory required to implement the algorithm
when input values are of a specified size.
Questions such as these involve the computational complexity of the algorithm. An analysis
of the time required to solve a problem of a particular size involves the time complexity of the
algorithm. An analysis of the computer memory required involves the space complexity of
the algorithm. Considerations of the time and space complexity of an algorithm are essential
when algorithms are implemented. It is obviously important to know whether an algorithm will
produce an answer in a microsecond, a minute, or a billion years. Likewise, the required memory
must be available to solve a problem, so that space complexity must be taken into account.
Considerations of space complexity are tied in with the particular data structures used to
implement the algorithm. Because data structures are not dealt with in detail in this book, space
complexity will not be considered. We will restrict our attention to time complexity.
Time Complexity
The time complexity of an algorithm can be expressed in terms of the number of operations
used by the algorithm when the input has a particular size. The operations used to measure time
complexity can be the comparison of integers, the addition of integers, the multiplication of
integers, the division of integers, or any other basic operation.
Time complexity is described in terms of the number of operations required instead of actual
computer time because of the difference in time needed for different computers to perform basic
operations. Moreover, it is quite complicated to break all operations down to the basic bit oper-
ations that a computer uses. Furthermore, the fastest computers in existence can perform basic
bit operations (for instance, adding, multiplying, comparing, or exchanging two bits) in 10⁻¹¹
second (10 picoseconds), but personal computers may require 10⁻⁸ second (10 nanoseconds),
which is 1000 times as long, to do the same operations.
We illustrate how to analyze the time complexity of an algorithm by considering Algorithm 1
of Section 3.1, which finds the maximum of a finite set of integers.
EXAMPLE 1 Describe the time complexity of Algorithm 1 of Section 3.1 for finding the maximum element
in a finite set of integers.
Solution: The number of comparisons will be used as the measure of the time complexity of the
algorithm, because comparisons are the basic operations used.
To find the maximum element of a set with n elements, listed in an arbitrary order, the
temporary maximum is first set equal to the initial term in the list. Then, after a comparison
i ≤ n has been done to determine that the end of the list has not yet been reached, the temporary
maximum and second term are compared, updating the temporary maximum to the value of
the second term if it is larger. This procedure is continued, using two additional comparisons
for each term of the list—one i ≤ n, to determine that the end of the list has not been reached
and another max < aᵢ, to determine whether to update the temporary maximum. Because two
comparisons are used for each of the second through the nth elements and one more comparison
is used to exit the loop when i = n + 1, exactly 2(n − 1) + 1 = 2n − 1 comparisons are used
whenever this algorithm is applied. Hence, the algorithm for finding the maximum of a set
of n elements has time complexity Θ(n), measured in terms of the number of comparisons
used. Note that for this algorithm the number of comparisons is independent of the particular input
of n numbers. ▲
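To make the count concrete, here is a minimal Python sketch (our own illustration, not the book's pseudocode; the function name find_max and the explicit while loop are choices made here for exposition) that mirrors Algorithm 1 while tallying both kinds of comparisons, the loop test i ≤ n and the test max < aᵢ.

    def find_max(a):
        """Return the maximum of a nonempty list together with the number of
        comparisons used, counted as in the analysis of Example 1."""
        n = len(a)
        comparisons = 0
        maximum = a[0]            # temporary maximum starts at the first term
        i = 2                     # 1-based index of the next term to examine
        while True:
            comparisons += 1      # count the loop test i <= n (checked here as i > n)
            if i > n:
                break
            comparisons += 1      # count the test max < a_i
            if maximum < a[i - 1]:    # a[i - 1] is a_i, since Python lists are 0-based
                maximum = a[i - 1]
            i += 1
        return maximum, comparisons

    # For n = 5 the counter reports 2 * 5 - 1 = 9 comparisons, whatever the
    # order of the input values.
    print(find_max([3, 9, 4, 1, 7]))   # prints (9, 9)

Running the sketch on any list of n integers reports exactly 2n − 1 comparisons, in agreement with the analysis above.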
Next, we will analyze the time complexity of searching algorithms.