Reminder: Recursive algorithm
• Reduce the problem to one or more sub-problems of smaller size (linear or tree recursion) and solve them recursively
• Solve the very small problems directly
• Usually some computation is required in order to split the problem into sub-problems and/or to combine the results
Example 1
• Compute f(n) = n!
• Notice that f(n) = n * f(n-1) if n > 1, and f(1) = 1
• Algorithm:
1. If n = 1, return 1
2. If n > 1, compute the factorial of n-1
3. Multiply the result by n and return it
Example 2a
• Compute the average of a set of n = 2^k numbers
– Avg({x1..xn}) = (x1 + … + xn)/n
• Notice that:
– Avg({x1..xn}) = [Avg({x1..xn/2}) + Avg({xn/2+1..xn})]/2
• Algorithm
1. If the input set has size 1, return its single element as the average
2. If n > 1, divide the set into 2 subsets of size n/2
3. Find the average of each subset
4. Return the average of the two results
Example 2b
• Compute the average of a set of numbers
– Avg({x1..xn}) = (x1 + … + xn)/n
• Notice that:
– Avg({x1..xn}) = [xn + (n-1) * Avg({x1..xn-1})]/n
• Algorithm
1. If the input set has size 1, return its single element as the average
2. Find the average of the first n-1 numbers
3. Multiply the result by (n-1), add xn, and then divide by n
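Both averaging schemes can be sketched in Scheme, representing the set as a list (the names avg-dc, avg-lin, take, and drop are ours, not from the slides):

```scheme
; Helpers: first k elements / all but the first k elements of a list
(define (take xs k)
  (if (= k 0) '() (cons (car xs) (take (cdr xs) (- k 1)))))
(define (drop xs k)
  (if (= k 0) xs (drop (cdr xs) (- k 1))))

; Example 2a: divide and conquer; assumes the list length is a power of 2
(define (avg-dc xs)
  (if (null? (cdr xs))
      (car xs)                          ; size 1: the element is its own average
      (let ((half (quotient (length xs) 2)))
        (/ (+ (avg-dc (take xs half))
              (avg-dc (drop xs half)))
           2))))

; Example 2b: peel off one element at a time
; (recurses on the first element rather than the last, which is more
;  convenient with lists; the formula is symmetric)
(define (avg-lin xs)
  (let ((n (length xs)))
    (if (= n 1)
        (car xs)
        (/ (+ (car xs) (* (- n 1) (avg-lin (cdr xs)))) n))))
```

For example, (avg-dc '(1 3 5 7)) and (avg-lin '(1 3 5 7)) both evaluate to 4.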
Example 3
• Decide if the number x is a power of 2
F(x) = #t if x = 2^n for some n; #f otherwise
• Notice
– F(x) = F(x/2) if x is even; F(x) = #t if x = 1; #f otherwise
• Algorithm
1. If x = 1, return #t; if x is odd, return #f; else
2. Compute and return the result for x/2
Recursive calls
• A function/procedure that calls itself is called a recursive procedure
• It cannot call itself unconditionally; the recursion has to stop somewhere
• Recursive algorithms are usually implemented using recursive calls
The conditional form
(cond (<test-1> <consequent-1>)
      (<test-2> <consequent-2>)
      . . .
      (<test-n> <consequent-n>)
      (else <consequent-else>))

(define (abs x)
  (cond ((> x 0) x)
        ((= x 0) 0)
        (else (- x))))
Scheme Examples
• n!
(define (f n)
  (if (= n 1)
      1
      (* n (f (- n 1)))))
• Decide whether x is a power of 2
(define (is-pow2 x)
  (cond ((= x 1) #t)
        ((odd? x) #f)
        (else (is-pow2 (/ x 2)))))
Special case of recursive processes
• No need to do computations after the recursive call returns
(diagram: in the general case, each call computes something after its recursive call returns; in this special case each call simply returns the same value it received, with no calculation on the way back)
Converting a tail recursive call to an iterative process
(diagram: the caller does not wait for the callee; the computed value is returned directly)
In Scheme, a tail recursive procedure results in an iterative process!
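Factorial illustrates the conversion; a minimal sketch (the names fact, fact-i, and fact-iter are ours):

```scheme
; Recursive process: a multiplication is deferred at every call
(define (fact n)
  (if (= n 1)
      1
      (* n (fact (- n 1)))))

; Iterative process: the tail call passes the partial product along,
; so nothing is left to do after the recursive call returns
(define (fact-i n)
  (define (fact-iter product count)
    (if (> count n)
        product
        (fact-iter (* product count) (+ count 1))))
  (fact-iter 1 1))
```

(fact 5) and (fact-i 5) both evaluate to 120, but fact-i runs in constant space.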
Different Meanings of Recursion
Recursive Algorithms        | Iterative Algorithms
Recursive Scheme Function   | Tail Recursion
Recursive Process           | Iterative Process
Fibonacci Series
Every element is the sum of its two predecessors: 1, 1, 2, 3, 5, 8, 13, 21…
Recursive algorithm
fib(0) = 0
fib(1) = 1
fib(n) = fib(n-1) + fib(n-2), for n > 1
Tree recursion
(define (fib n)
  (cond ((= n 0) 0)
        ((= n 1) 1)
        (else (+ (fib (- n 1))
                 (fib (- n 2))))))
Iterative algorithm
Initialize: a = 1, b = 0, count = n
Step: a_new = a_old + b_old
      b_new = a_old
      count = count - 1
Tail Recursion
(define (fib n)
  (define (fib-iter a b count)
    (if (= count 0)
        b
        (fib-iter (+ a b) a (- count 1))))
  (fib-iter 1 0 n))
Tree Recursion
(diagram: the tree of recursive calls generated by fib)
Implementing loops using tail recursion
• Usually use a helper local function (iterator)
• Pass the loop's local variables as arguments (usually including some sort of counter)
• Check the stopping condition
• Make the initial call with the initial state as arguments
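Following the recipe above, a loop that sums the integers 1..n might look like this (the names sum-to and sum-iter are ours):

```scheme
; "for i = 1 to n: acc <- acc + i" written as a tail-recursive helper
(define (sum-to n)
  (define (sum-iter i acc)              ; loop variables passed as arguments
    (if (> i n)                         ; stopping condition
        acc
        (sum-iter (+ i 1) (+ acc i))))  ; next iteration of the loop
  (sum-iter 1 0))                       ; initial call with initial state
```

(sum-to 10) evaluates to 55.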
Example (Recursion)
• Input: an amount of money: $1.50
• Output: the number of different ways to construct the amount using coins of 1, 2, 5, 10, 20, 50 cents
• Assume there is an unlimited number of coins of each value
Counting Change
(illustration: the same amount built from several different combinations of coins of different denominations)
Counting Change
CC(amount, coin-set) = CC(amount, coin-set - {first-coin}) + CC(amount - first-coin, coin-set)
i.e., the ways to make change either do not use the first kind of coin at all, or use at least one coin of that kind
Counting Change
(define (count-change amount)
  (cc amount 6))

(define (cc amount kinds-of-coins)
  (cond ((= amount 0) 1)
        ((or (< amount 0) (= kinds-of-coins 0)) 0)
        (else (+ (cc amount (- kinds-of-coins 1))
                 (cc (- amount (first-denomination kinds-of-coins))
                     kinds-of-coins)))))

(define (first-denomination kinds-of-coins)
  (cond ((= kinds-of-coins 1) 1)
        ((= kinds-of-coins 2) 2)
        ((= kinds-of-coins 3) 5)
        ((= kinds-of-coins 4) 10)
        ((= kinds-of-coins 5) 20)
        ((= kinds-of-coins 6) 50)))
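A common variation, not shown on the slide, is to pass the denominations themselves as a list instead of encoding them in first-denomination; a sketch under that assumption:

```scheme
; cc over an explicit list of coin values: (car coins) plays the role of
; first-denomination, and (cdr coins) the role of "one fewer kind of coin"
(define (cc amount coins)
  (cond ((= amount 0) 1)
        ((or (< amount 0) (null? coins)) 0)
        (else (+ (cc amount (cdr coins))
                 (cc (- amount (car coins)) coins)))))

(define (count-change amount)
  (cc amount '(1 2 5 10 20 50)))
```

This makes it trivial to change the coin system without touching the algorithm.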
Orders of Growth
• We say that R(n) = O(f(n)) if there is a constant c > 0 such that for all n ≥ 1, R(n) ≤ c·f(n)
• We say that R(n) = Ω(f(n)) if there is a constant c > 0 such that for all n ≥ 1, c·f(n) ≤ R(n)
• We say that R(n) = Θ(f(n)) if there are constants c1, c2 > 0 such that for all n ≥ 1, c1·f(n) ≤ R(n) ≤ c2·f(n)
Asymptotic version
• Suppose f(n) < g(n) for all n ≥ N
• Then there is always a constant c such that f(n) < c·g(n) for all n ≥ 1
• For example, c = (max{f(n) : n < N} + 1) / min{g(n) : n < N}
Practice
5n³ + 12n + 300n^(1/2) + 1 ∈ Θ(n³)
n² + 30 ∈ Θ(n²)
350 ∈ Θ(1)
25n^(1/2) + 40·log₅ n + 100000 ∈ Θ(n^(1/2))
12·log₂(n+1) + 6·log(3n) ∈ Θ(log₂ n)
2^(2n+3) + 37n^15 + 9 ∈ Θ(4ⁿ)
Comparing algorithms
• The running time of a program implementing a specific algorithm may vary depending on
– Speed of the specific computer (e.g. CPU speed)
– Specific implementation (e.g. programming language)
– Other factors (e.g. input characteristics)
• Compare the number of operations required by different algorithms asymptotically, i.e. the order of growth as the size of the input increases to infinity
Efficient Algorithm
• An algorithm for which the number of operations and the amount of required memory do not grow very fast as the size of the input increases
– Which operations do we count?
– What is the input size (e.g. the number of digits representing the input, or the input number itself)?
– Different inputs of the same size can cause different running times.
Efficient algorithms cont'd
• Input size: usually the size of the input representation (number of bits/digits)
• Usually interested in the worst case running time, i.e. the number of operations for the worst case input
• Usually we count primitive operations (arithmetic and/or comparisons)
Complexity Analysis
• Reminder:
– Space complexity: number of deferred operations (in Scheme)
– Time complexity: number of primitive steps
• Method I – Qualitative Analysis
– Example: Factorial
– Each recursive call produces one deferred operation (a multiplication), which takes constant time to perform
– Total number of operations (time complexity): Θ(n)
– Maximum number of deferred operations: Θ(n), reached during the last call
• Method II – Recurrence Equations
Recurrence Equations
Code
(define (sum-squares n)
  (if (= n 1)
      1
      (+ (square n)
         (sum-squares (- n 1)))))
Recurrence
T(n) = T(n-1) + Θ(1)
Solve
T(n) ≤ T(n-1) + c ≤ T(n-2) + 2c ≤ T(n-3) + 3c ≤ … ≤ T(1) + (n-1)c ≈ nc, so T(n) = Θ(n)
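For contrast, the same sum written as an iterative process has the same time recurrence T(n) = T(n-1) + Θ(1), but defers no operations, so its space drops from Θ(n) to Θ(1). A sketch (the names sum-squares-i and iter are ours):

```scheme
(define (square x) (* x x))

; Tail-recursive version: the running total travels in acc,
; so nothing is deferred across the recursive call
(define (sum-squares-i n)
  (define (iter k acc)
    (if (= k 0)
        acc
        (iter (- k 1) (+ acc (square k)))))
  (iter n 0))
```

(sum-squares-i 3) evaluates to 14.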
Recurrence Equations
Code
(define (fib n)
  (cond ((= n 0) 0)
        ((= n 1) 1)
        (else (+ (fib (- n 1))
                 (fib (- n 2))))))
Recurrence
T(n) = T(n-1) + T(n-2) + Θ(1)
Solve
For simplicity, we will solve T(n) = 2T(n-1) + Θ(1), which gives us an upper bound
Recursion Trees
(diagram: the tree for T(n) = 2T(n-1) + Θ(1); the root contributes Θ(1), the next level 2·Θ(1), then 4·Θ(1), …, down to 2^(n-1)·Θ(1) at the leaves)
Total: Θ(2ⁿ)
Fibonacci - Complexity
• Time complexity
– T(n) < 2T(n-1) + Θ(1)
– Upper Bound: O(2ⁿ)
– Tight Bound: Θ(φⁿ) where φ = (√5 + 1)/2
• Space Complexity: Θ(n)
Common Recurrences
T(n) = T(n-1) + Θ(1)    →  Θ(n)
T(n) = T(n-1) + Θ(n)    →  Θ(n²)
T(n) = T(n/2) + Θ(1)    →  Θ(log n)
T(n) = T(n/2) + Θ(n)    →  Θ(n)
T(n) = 2T(n-1) + Θ(1)   →  Θ(2ⁿ)
T(n) = 2T(n/2) + Θ(n)   →  Θ(n log n)
Back to GCD
(define (gcd a b)
  (if (= b 0)
      a
      (gcd b (remainder a b))))
• Claim: the smaller number (b) is at least halved every second iteration
• For "double iterations" we have T(b) ≤ T(b/2) + Θ(1)
• Therefore T(a, b) = O(log b)
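To see the logarithmic bound concretely, one can count the remainder steps (the name gcd-steps is ours; a sketch):

```scheme
; Like gcd, but returns the number of remainder operations performed
(define (gcd-steps a b)
  (if (= b 0)
      0
      (+ 1 (gcd-steps b (remainder a b)))))
```

Consecutive Fibonacci numbers are the worst case for Euclid's algorithm (Lamé's theorem), and even there the step count grows only logarithmically in b.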