EXAMPLE 2 Describe the time complexity of the linear search algorithm (specified as Algorithm 2 in
Section 3.1).
Solution: The number of comparisons used by Algorithm 2 in Section 3.1 will be taken as the measure of the time complexity. At each step of the loop in the algorithm, two comparisons are performed: one, i ≤ n, to see whether the end of the list has been reached, and one, x ≠ a_i, to compare the element x with a term of the list. Finally, one more comparison i ≤ n is made outside the loop. Consequently, if x = a_i, 2i + 1 comparisons are used. The most comparisons, 2n + 2, are required when the element is not in the list. In this case, 2n comparisons are used to determine that x is not a_i, for i = 1, 2, ..., n, an additional comparison is used to exit the loop, and one comparison is made outside the loop. So when x is not in the list, a total of 2n + 2 comparisons are used. Hence, a linear search requires Θ(n) comparisons in the worst case, because 2n + 2 is Θ(n). ▲
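Readers who want to check this count concretely can instrument the search with a counter. The Python sketch below is an illustration, not part of the text; the function name linear_search_comparisons and the explicit counter are hypothetical additions that mirror the comparisons counted in Example 2.

def linear_search_comparisons(x, a):
    """Linear search (in the style of Algorithm 2, Section 3.1) that counts comparisons.

    Returns (location, comparisons); location is 0 when x is not in a.
    """
    n = len(a)
    comparisons = 0
    i = 1
    while True:
        comparisons += 1              # the comparison i <= n
        if not (i <= n):
            break
        comparisons += 1              # the comparison x != a_i
        if x == a[i - 1]:             # Python lists are 0-indexed
            break
        i += 1
    comparisons += 1                  # the comparison i <= n made outside the loop
    location = i if i <= n else 0
    return location, comparisons

# If x = a_i, the count is 2i + 1; if x is not in the list, it is 2n + 2.
print(linear_search_comparisons(3, [5, 1, 3, 7]))   # (3, 7): 2*3 + 1
print(linear_search_comparisons(9, [5, 1, 3, 7]))   # (0, 10): 2*4 + 2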
WORST-CASE COMPLEXITY The type of complexity analysis done in Example 2 is a worst-
case analysis. By the worst-case performance of an algorithm, we mean the largest number of
operations needed to solve the given problem using this algorithm on input of specified size.
Worst-case analysis tells us how many operations an algorithm requires to guarantee that it will
produce a solution.
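As a further illustrative sketch (again not from the text), the worst-case count for linear search on a list of length n can be found by taking the maximum over the n + 1 essentially different inputs: x located at position 1, 2, ..., n, or x absent. It reuses the hypothetical counter defined above.

def linear_search_worst_case(n):
    a = list(range(1, n + 1))                            # a representative list of n distinct terms
    counts = [linear_search_comparisons(i, a)[1] for i in range(1, n + 1)]
    counts.append(linear_search_comparisons(0, a)[1])    # the case in which x is not in the list
    return max(counts)

print(linear_search_worst_case(10))   # 22 = 2n + 2 with n = 10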
EXAMPLE 3 Describe the time complexity of the binary search algorithm (specified as Algorithm 3 in
Section 3.1) in terms of the number of comparisons used (and ignoring the time required to
compute m = ⌊(i + j)/2⌋ in each iteration of the loop in the algorithm).
Solution: For simplicity, assume there are n = 2^k elements in the list a_1, a_2, ..., a_n, where k is a nonnegative integer. Note that k = log n. (If n, the number of elements in the list, is not a power of 2, the list can be considered part of a larger list with 2^(k+1) elements, where 2^k < n < 2^(k+1). Here 2^(k+1) is the smallest power of 2 larger than n.)
At each stage of the algorithm, i and j, the locations of the first term and the last term of
the restricted list at that stage, are compared to see whether the restricted list has more than one
term. If i < j, a comparison is done to determine whether x is greater than the middle term of
the restricted list.
At the first stage the search is restricted to a list with 2^(k−1) terms. So far, two comparisons have been used. This procedure is continued, using two comparisons at each stage to restrict the search to a list with half as many terms. In other words, two comparisons are used at the first stage of the algorithm when the list has 2^k elements, two more when the search has been reduced to a list with 2^(k−1) elements, two more when the search has been reduced to a list with 2^(k−2) elements, and so on, until two comparisons are used when the search has been reduced to a list with 2^1 = 2 elements. Finally, when one term is left in the list, one comparison tells us that there are no additional terms left, and one more comparison is used to determine if this term is x.
Hence, at most 2k + 2 = 2 log n + 2 comparisons are required to perform a binary search when the list being searched has 2^k elements. (If n is not a power of 2, the original list is expanded to a list with 2^(k+1) terms, where k = ⌊log n⌋, and the search requires at most 2⌈log n⌉ + 2 comparisons.) It follows that in the worst case, binary search requires O(log n) comparisons. Note that in the worst case, 2⌈log n⌉ + 2 comparisons are used by the binary search. Hence, the binary search uses Θ(log n) comparisons in the worst case, because 2⌈log n⌉ + 2 = Θ(log n). From this analysis it follows that in the worst case, the binary search algorithm is more efficient than the linear search algorithm, because we know by Example 2 that the linear search algorithm has Θ(n) worst-case time complexity. ▲
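The same kind of instrumented sketch can be written for binary search. The Python below is an illustration in the style of Algorithm 3 of Section 3.1, not the text's pseudocode; the name binary_search_comparisons is a hypothetical addition, and only the comparisons analyzed in Example 3 are counted, ignoring the computation of m.

def binary_search_comparisons(x, a):
    """Binary search on a sorted list a (with at least one term) that counts comparisons.

    Returns (location, comparisons); location is 0 when x is not in a.
    """
    n = len(a)
    i, j = 1, n
    comparisons = 0
    while True:
        comparisons += 1              # the comparison i < j
        if not (i < j):
            break
        m = (i + j) // 2              # floor((i + j)/2); this cost is ignored
        comparisons += 1              # the comparison x > a_m
        if x > a[m - 1]:
            i = m + 1
        else:
            j = m
    comparisons += 1                  # the comparison x = a_i made after the loop
    location = i if x == a[i - 1] else 0
    return location, comparisons

# With n = 16 = 2^4 the count is 2k + 2 = 10, whether or not x is found.
a = list(range(1, 17))
print(binary_search_comparisons(11, a))   # (11, 10)
print(binary_search_comparisons(17, a))   # (0, 10)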
AVERAGE-CASE COMPLEXITY Another important type of complexity analysis, besides
worst-case analysis, is called average-case analysis. The average number of operations used to
solve the problem over all possible inputs of a given size is found in this type of analysis. Average-
case time complexity analysis is usually much more complicated than worst-case analysis.
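For instance, continuing the illustrative sketches above, an average-case count for linear search can be computed under the assumed model that x is equally likely to occur at each of the n positions of the list; averaging the counts 2i + 1 for i = 1, ..., n gives n + 2. The function name and the model are illustrative assumptions, not part of the text.

def linear_search_average_case(n):
    a = list(range(1, n + 1))
    total = sum(linear_search_comparisons(i, a)[1] for i in range(1, n + 1))
    return total / n                  # average of 2i + 1 over i = 1, ..., n

print(linear_search_average_case(10))   # 12.0 = n + 2 with n = 10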