Divide and Conquer
Recall
• Divide the problem into a number of sub-problems that are smaller instances of the same problem.
• Conquer the sub-problems by solving them recursively. If the sub-problem sizes are small enough, however, just solve the sub-problems in a straightforward manner.
• Combine the solutions to the sub-problems into the solution for the original problem.
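The three steps can be sketched with merge sort, the canonical divide-and-conquer example (a minimal Python sketch, not the slides' pseudocode):

```python
def merge_sort(a):
    """Sort a list by divide and conquer."""
    if len(a) <= 1:                # conquer directly: base case
        return a
    mid = len(a) // 2              # divide into two smaller instances
    left = merge_sort(a[:mid])     # conquer each half recursively
    right = merge_sort(a[mid:])
    return merge(left, right)      # combine the sorted halves

def merge(left, right):
    """Combine two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```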
Recurrences
• A recurrence is an equation or inequality that describes a function in terms of its value on smaller inputs.
• We define the running time of MERGE-SORT via the recurrence T(n) = 2T(n/2) + Θ(n), which has solution T(n) = Θ(n lg n).
• We will study three methods for solving recurrences—that is, for obtaining asymptotic “Θ” or “O” bounds on the solution:
1. Substitution method
2. Recursion tree method
3. Master method
Strassen’s algorithm for matrix multiplication
• If A = (a_ij) and B = (b_ij) are square n × n matrices, then in the product C = A · B we define the entries c_ij, for i, j = 1, 2, …, n, by
  c_ij = Σ_{k=1}^{n} a_ik · b_kj
• We must compute n² matrix entries, and each is the sum of n values.
Matrix multiplication procedure
• For two square matrices, the straightforward procedure SQUARE-MATRIX-MULTIPLY computes their product directly from the definition, using three nested loops.
• Time complexity: Θ(n³), since each of the n² entries of C is a sum of n terms.
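A Python rendering of the straightforward procedure (a sketch; the loop structure follows the textbook's SQUARE-MATRIX-MULTIPLY):

```python
def square_matrix_multiply(A, B):
    """Compute C = A * B for n x n matrices given as lists of rows."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            # c_ij is the sum of n products a_ik * b_kj
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C
```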
A simple divide-and-conquer algorithm
• Assume n is an exact power of 2, and partition each of A, B, and C into four n/2 × n/2 submatrices:
  A = [A11 A12; A21 A22],  B = [B11 B12; B21 B22],  C = [C11 C12; C21 C22]
• The equation C = A · B then becomes the four equations
  C11 = A11·B11 + A12·B21    C12 = A11·B12 + A12·B22
  C21 = A21·B11 + A22·B21    C22 = A21·B12 + A22·B22
Recursive algorithm for matrix product
• SQUARE-MATRIX-MULTIPLY-RECURSIVE computes each Cij from the four equations above: eight recursive multiplications of n/2 × n/2 matrices, plus four matrix additions.
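A runnable sketch of the recursive algorithm; for readability it copies submatrices rather than using the index calculations the analysis assumes, and n is assumed to be an exact power of 2:

```python
def split(M):
    """Partition M into four n/2 x n/2 quadrants (M11, M12, M21, M22)."""
    n = len(M) // 2
    return ([row[:n] for row in M[:n]], [row[n:] for row in M[:n]],
            [row[:n] for row in M[n:]], [row[n:] for row in M[n:]])

def add(X, Y):
    """Entrywise matrix addition, Theta(n^2) per call."""
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def rec_multiply(A, B):
    """Eight recursive half-size products: T(n) = 8T(n/2) + Theta(n^2)."""
    if len(A) == 1:
        return [[A[0][0] * B[0][0]]]
    A11, A12, A21, A22 = split(A)
    B11, B12, B21, B22 = split(B)
    C11 = add(rec_multiply(A11, B11), rec_multiply(A12, B21))
    C12 = add(rec_multiply(A11, B12), rec_multiply(A12, B22))
    C21 = add(rec_multiply(A21, B11), rec_multiply(A22, B21))
    C22 = add(rec_multiply(A21, B12), rec_multiply(A22, B22))
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot
```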
Analysis
• T(1) = Θ(1).
• In lines 6–9, we recursively call SQUARE-MATRIX-MULTIPLY-RECURSIVE a total of eight times, so the time taken by all eight recursive calls is 8T(n/2).
• We must also account for the four matrix additions in lines 6–9. Each of these matrices contains n²/4 entries, so each of the four matrix additions takes Θ(n²) time.
• Hence T(n) = 8T(n/2) + Θ(n²), and the solution is T(n) = Θ(n³) — no faster than the straightforward procedure.
Strassen’s method
• The key idea is to make the recursion tree slightly less bushy: perform only seven recursive multiplications of n/2 × n/2 matrices instead of eight, at the cost of a constant number of extra Θ(n²) matrix additions.
• The resulting recurrence is T(n) = 7T(n/2) + Θ(n²), with solution T(n) = Θ(n^lg 7) = O(n^2.81).
Algorithm
Strassen’s method has four steps:
1. Partition the input matrices A and B, and the output matrix C, into n/2 × n/2 submatrices, in Θ(1) time using index calculations.
2. Create 10 matrices S1, S2, …, S10, each n/2 × n/2 and each the sum or difference of two of the submatrices created in step 1, in Θ(n²) time.
3. Using the submatrices and the 10 matrices S1, …, S10, recursively compute seven matrix products P1, P2, …, P7, each n/2 × n/2.
4. Compute the desired submatrices C11, C12, C21, C22 of C by adding and subtracting various combinations of the Pi matrices, in Θ(n²) time.
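The four steps can be sketched as follows (submatrix copies instead of index arithmetic, for readability; the S and P names follow the standard presentation):

```python
def madd(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def msub(X, Y):
    return [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

def quads(M):
    """Step 1: partition M into four n/2 x n/2 quadrants."""
    n = len(M) // 2
    return ([row[:n] for row in M[:n]], [row[n:] for row in M[:n]],
            [row[:n] for row in M[n:]], [row[n:] for row in M[n:]])

def strassen(A, B):
    """Seven recursive products: T(n) = 7T(n/2) + Theta(n^2)."""
    if len(A) == 1:
        return [[A[0][0] * B[0][0]]]
    A11, A12, A21, A22 = quads(A)
    B11, B12, B21, B22 = quads(B)
    # Step 2: ten sums/differences, Theta(n^2) time
    S1, S2 = msub(B12, B22), madd(A11, A12)
    S3, S4 = madd(A21, A22), msub(B21, B11)
    S5, S6 = madd(A11, A22), madd(B11, B22)
    S7, S8 = msub(A12, A22), madd(B21, B22)
    S9, S10 = msub(A11, A21), madd(B11, B12)
    # Step 3: seven recursive multiplications
    P1, P2, P3 = strassen(A11, S1), strassen(S2, B22), strassen(S3, B11)
    P4, P5 = strassen(A22, S4), strassen(S5, S6)
    P6, P7 = strassen(S7, S8), strassen(S9, S10)
    # Step 4: combine into the quadrants of C
    C11 = madd(msub(madd(P5, P4), P2), P6)   # P5 + P4 - P2 + P6
    C12 = madd(P1, P2)                       # P1 + P2
    C21 = madd(P3, P4)                       # P3 + P4
    C22 = msub(msub(madd(P5, P1), P3), P7)   # P5 + P1 - P3 - P7
    return ([r1 + r2 for r1, r2 in zip(C11, C12)] +
            [r1 + r2 for r1, r2 in zip(C21, C22)])
```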
Solving Recurrences
We have three methods to solve recurrence equations:
1. Substitution Method
2. Recursion Tree Method
3. Master Method
Substitution Method
The substitution method for solving recurrences comprises two steps:
1. Guess the form of the solution.
2. Use mathematical induction to find the constants and show that the solution works.
Example
Consider the recurrence T(n) = 2T(⌊n/2⌋) + n.
• We guess that the solution is T(n) = O(n lg n).
• The substitution method requires us to prove that T(n) ≤ cn lg n for some constant c > 0.
• We start by assuming that this bound holds for all positive m < n, in particular for m = ⌊n/2⌋.
• This gives T(⌊n/2⌋) ≤ c⌊n/2⌋ lg(⌊n/2⌋).
• Substituting into the recurrence yields
  T(n) ≤ 2(c⌊n/2⌋ lg(⌊n/2⌋)) + n
       ≤ cn lg(n/2) + n
       = cn lg n − cn lg 2 + n
       = cn lg n − cn + n
       ≤ cn lg n   for every c ≥ 1.
Example - continued
Now we must show that the solution holds for the boundary conditions.
• Assume, for the sake of argument, that T(1) = 1.
• For n = 1, the bound gives T(1) ≤ c · 1 · lg 1 = 0, which contradicts T(1) = 1 — so the guess fails at n = 1.
• We only need the bound T(n) ≤ cn lg n to hold for all n ≥ n₀, for some constant n₀ > 0 of our choosing.
• For n > 3, the recurrence does not depend directly on T(1), so we can use T(2) and T(3) as the base cases of the induction instead.
• Note the distinction between the base case of the recurrence (n = 1) and the base cases of the induction (n = 2 and n = 3).
• We derive from the recurrence that T(2) = 4 and T(3) = 5.
• We can complete the inductive proof that T(n) ≤ cn lg n for some constant c ≥ 1 by choosing c large enough that T(2) ≤ c · 2 lg 2 and T(3) ≤ c · 3 lg 3.
• Any c ≥ 2 suffices.
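The base-case arithmetic can be checked mechanically (assuming, as above, the recurrence T(n) = 2T(⌊n/2⌋) + n with T(1) = 1):

```python
import math

def T(n):
    # T(n) = 2*T(floor(n/2)) + n, with T(1) = 1
    return 1 if n == 1 else 2 * T(n // 2) + n

print(T(2), T(3))  # 4 5 -- the two induction base cases
# With c = 2, the bound T(n) <= c*n*lg(n) holds on a range of small n:
print(all(T(n) <= 2 * n * math.log2(n) for n in range(2, 200)))  # True
```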
Another Example
• Make sure you prove the exact form of the inductive hypothesis when doing a substitution proof.
• Consider the recurrence T(n) = 8T(n/2) + Θ(n²).
• For an upper bound: T(n) ≤ 8T(n/2) + cn².
• Guess: T(n) ≤ dn³. Then
  T(n) ≤ 8d(n/2)³ + cn²
       = 8d(n³/8) + cn²
       = dn³ + cn²,
  which is not ≤ dn³ for any choice of d — the proof doesn’t work!
Another Example - continued
• Remedy: subtract off a lower-order term. Guess: T(n) ≤ dn³ − d′n². Then
  T(n) ≤ 8(d(n/2)³ − d′(n/2)²) + cn²
       = 8d(n³/8) − 8d′(n²/4) + cn²
       = dn³ − 2d′n² + cn²
       = dn³ − d′n² − d′n² + cn²
       ≤ dn³ − d′n²   if −d′n² + cn² ≤ 0, i.e., d′ ≥ c.
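As a sanity check: for n a power of 2, taking the Θ(n²) term to be exactly n² and T(1) = 1 gives the closed form T(n) = 2n³ − n², which fits the guess dn³ − d′n² with d = 2 and d′ = 1 (these are illustrative assumptions, not from the slides):

```python
def T(n):
    # T(n) = 8*T(n/2) + n^2, with T(1) = 1 and n a power of 2
    return 1 if n == 1 else 8 * T(n // 2) + n * n

# Closed form T(n) = 2n^3 - n^2, checked for n = 1, 2, 4, ..., 512:
print(all(T(2**k) == 2 * (2**k)**3 - (2**k)**2 for k in range(10)))  # True
```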
Yet another example
• T(n) = cn + 3T(2n/3).
• How about F(n) = n lg n? Substituting kF(n) into the right-hand side:
  cn + 3k(2n/3) lg(2n/3)
  = cn + 2kn lg n − 2kn lg(3/2).
• There is no way to choose k to make the left-hand side (kn lg n) at least this large: the 2kn lg n term on the right already dominates kn lg n.
• Therefore n lg n is not the correct guess.
Yet another example - continued
• Try a higher order of growth like n² or n³ — but which one? Maybe n^x.
• We can solve for the correct exponent x by plugging in kn^x:
  cn + 3T(2n/3) ≈ cn + 3k(2/3)^x n^x.
• This is asymptotically less than kn^x as long as 3(2/3)^x < 1, which requires x > log_{3/2} 3.
• Let a = log_{3/2} 3 ≈ 2.71; then T(n) = O(n^{a+ε}) for any positive ε.
• Let’s try O(n^a) itself. The right-hand side after substituting kn^a is
  cn + 3(2/3)^a kn^a = cn + kn^a ≥ kn^a,
  since 3(2/3)^a = 1 by the definition of a.
• This tells us that kn^a is an asymptotic lower bound on T(n): T(n) is Ω(n^a). So the complexity is somewhere between Ω(n^a) and O(n^{a+ε}). It is in fact Θ(n^a).
• To show the upper bound, we will try F(n) = n^a + bn, where b is a constant to be filled in later.
Yet another example - continued
• The idea is to pick b so that the bn term compensates for the cn term that shows up in the recurrence.
• Because bn is O(n^a), showing T(n) is O(n^a + bn) is the same as showing that it is O(n^a). Substituting kF(n) for T(n) in the right-hand side of the recurrence, we obtain:
  cn + 3kF(2n/3) = cn + 3k((2n/3)^a + b(2n/3))
  = cn + 3k(2n/3)^a + 3kb(2n/3)
  = cn + kn^a + 2kbn
  = kn^a + (2kb + c)n.
• The substituted left-hand side of the recurrence is kn^a + kbn, which is at least kn^a + (2kb + c)n as long as kb ≥ 2kb + c, i.e., b ≤ −c/k. There is no requirement that b be positive, so choosing k = 1 and b = −c satisfies the recurrence.
• Therefore T(n) = O(n^a + bn) = O(n^a), and since T(n) is both O(n^a) and Ω(n^a), it is Θ(n^a).
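The exponent a = log_{3/2} 3 ≈ 2.71 can also be checked numerically (assuming c = 1 and T(1) = 1, which are illustrative choices): evaluating the recurrence at n = (3/2)^k, the ratio T(n)/n^a settles to a constant, consistent with T(n) = Θ(n^a):

```python
import math

a = math.log(3, 1.5)  # a = log_{3/2} 3 ~ 2.7095

def T(k):
    """T at n = (3/2)^k for T(n) = n + 3*T(2n/3), T(1) = 1, built bottom-up."""
    t = 1.0
    for i in range(1, k + 1):
        t = 3 * t + 1.5 ** i
    return t

for k in (5, 10, 20):
    n = 1.5 ** k
    print(k, T(k) / n ** a)  # ratio approaches a constant, so T(n) = Theta(n^a)
```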
Substitution method - warning
• Be careful when using asymptotic notation.
• Here is a false proof for the recurrence T(n) = 4T(n/4) + n that claims T(n) = O(n):
  T(n) ≤ 4(c(n/4)) + n ≤ cn + n = O(n)   ← wrong!
• Because we haven’t proven the exact form of our inductive hypothesis (which is T(n) ≤ cn), this proof is false: cn + n ≤ cn never holds, no matter how large we make c.
Recursion tree method
• Use a recursion tree to generate a good guess; then verify the guess by the substitution method.
• T(n) = T(n/3) + T(2n/3) + Θ(n).
• For an upper bound, rewrite as T(n) ≤ T(n/3) + T(2n/3) + cn;
• for a lower bound, as T(n) ≥ T(n/3) + T(2n/3) + cn.
• By summing across each level, the recursion tree shows the cost at each level of recursion (minus the costs of recursive calls, which appear in subtrees).
Recursion tree method
[Figure: recursion tree for T(n) ≤ T(n/3) + T(2n/3) + cn — the root costs cn, each full level sums to cn, the leftmost path reaches size 1 after log₃ n levels and the rightmost after log_{3/2} n levels.]
Recursion tree method
• There are log₃ n full levels, and after log_{3/2} n levels, the problem size is down to 1.
• Each level contributes ≤ cn.
• Lower-bound guess: T(n) ≥ dn log₃ n = Ω(n lg n) for some positive constant d.
• Upper-bound guess: T(n) ≤ dn log_{3/2} n = O(n lg n) for some positive constant d.
• Then prove by substitution.
Recursion tree method
Upper bound — guess T(n) ≤ dn lg n. Substitution:
  T(n) ≤ T(n/3) + T(2n/3) + cn
       ≤ d(n/3) lg(n/3) + d(2n/3) lg(2n/3) + cn
       = (d(n/3) lg n − d(n/3) lg 3) + (d(2n/3) lg n − d(2n/3) lg(3/2)) + cn
       = dn lg n − d((n/3) lg 3 + (2n/3) lg 3 − (2n/3) lg 2) + cn
       = dn lg n − dn(lg 3 − 2/3) + cn
       ≤ dn lg n   if −dn(lg 3 − 2/3) + cn ≤ 0, i.e., d ≥ c/(lg 3 − 2/3).
Lower bound?
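For the lower bound, a sketch (filling in the step the slide leaves as a question): the tree is complete down to depth log₃ n, the depth of the shallowest leaf, and each complete level sums to exactly cn, so

```latex
T(n) \;\ge\; \sum_{i=0}^{\log_3 n - 1} cn \;=\; cn \log_3 n \;=\; \Omega(n \lg n).
```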
Another example
• T(n) = 3T(n/4) + cn².
Another example
[Figure: recursion tree for T(n) = 3T(n/4) + cn² — level costs cn², (3/16)cn², (3/16)²cn², …, with the leaves at depth log₄ n.]
Another example
• The sub-problem size for a node at depth i is n/4^i.
• Thus, the sub-problem size hits n = 1 when n/4^i = 1 or, equivalently, when i = log₄ n.
• Thus, the tree has log₄ n + 1 levels (at depths 0, 1, 2, …, log₄ n).
• The number of nodes at depth i is 3^i.
• Each node at depth i, for i = 0, 1, 2, …, log₄ n − 1, has a cost of c(n/4^i)².
• Total cost at depth i is 3^i · c(n/4^i)² = (3/16)^i cn².
• The bottom level, at depth log₄ n, has 3^{log₄ n} = n^{log₄ 3} nodes, each with cost T(1), for a total cost of Θ(n^{log₄ 3}).
Another example
• Summing the costs over all levels, the per-level costs form a decreasing geometric series, so the total is dominated by the cost of the root: T(n) = O(n²) is a reasonable guess.
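Explicitly, bounding the finite geometric series by the infinite one:

```latex
T(n) \;=\; \sum_{i=0}^{\log_4 n - 1} \left(\tfrac{3}{16}\right)^{i} cn^2 \;+\; \Theta\!\left(n^{\log_4 3}\right)
\;<\; cn^2 \sum_{i=0}^{\infty} \left(\tfrac{3}{16}\right)^{i} \;+\; \Theta\!\left(n^{\log_4 3}\right)
\;=\; \tfrac{16}{13}\,cn^2 \;+\; \Theta\!\left(n^{\log_4 3}\right)
\;=\; O(n^2).
```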
Another example
• Verify the guess by substitution: assume T(m) ≤ dm² for all m < n. Then
  T(n) ≤ 3d(n/4)² + cn² = (3/16)dn² + cn² ≤ dn²
  whenever (13/16)d ≥ c, i.e., d ≥ (16/13)c.
Master Method
• Used for many divide-and-conquer recurrences of the form
  T(n) = aT(n/b) + f(n),
  where a ≥ 1, b > 1, and f(n) > 0.
• Based on the master theorem.
Master Method
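The statement of the master theorem: let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined on the nonnegative integers by the recurrence T(n) = aT(n/b) + f(n). Then T(n) has the following asymptotic bounds:

```latex
T(n) =
\begin{cases}
\Theta\!\left(n^{\log_b a}\right) & \text{if } f(n) = O\!\left(n^{\log_b a - \epsilon}\right) \text{ for some } \epsilon > 0,\\[4pt]
\Theta\!\left(n^{\log_b a} \lg n\right) & \text{if } f(n) = \Theta\!\left(n^{\log_b a}\right),\\[4pt]
\Theta\!\left(f(n)\right) & \text{if } f(n) = \Omega\!\left(n^{\log_b a + \epsilon}\right) \text{ for some } \epsilon > 0,\\
& \text{and } a f(n/b) \le c f(n) \text{ for some } c < 1 \text{ and all large } n.
\end{cases}
```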
Examples
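For driving functions of the form f(n) = n^d, the three cases reduce to comparing d with log_b a. A small helper illustrates this (a sketch: it ignores polylogarithmic refinements and relies on the fact that the case-3 regularity condition holds automatically for polynomial f):

```python
import math

def master_case(a, b, d):
    """Which master-theorem case applies to T(n) = a*T(n/b) + Theta(n^d)?"""
    crit = math.log(a) / math.log(b)    # critical exponent log_b a
    if math.isclose(d, crit):
        return 2                        # T(n) = Theta(n^d * lg n)
    return 1 if d < crit else 3         # 1: Theta(n^(log_b a)), 3: Theta(n^d)

# Classic examples:
print(master_case(9, 3, 1))    # 1 -> Theta(n^2)      (T(n) = 9T(n/3) + n)
print(master_case(1, 1.5, 0))  # 2 -> Theta(lg n)     (T(n) = T(2n/3) + 1)
print(master_case(2, 2, 1))    # 2 -> Theta(n lg n)   (merge sort)
print(master_case(3, 4, 2))    # 3 -> Theta(n^2)      (T(n) = 3T(n/4) + n^2)
print(master_case(8, 2, 2))    # 1 -> Theta(n^3)      (recursive matrix multiply)
```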