Complexity Analysis

2 Complexity
The complexity of an algorithm quantifies the resources it needs as a function of the size of the input data. The resource measured is usually time (CPU cycles), or sometimes space (memory). Complexity is not an absolute measure, but a bounding function that characterizes the behavior of the algorithm as the size of the data set increases.

3 Usefulness
Complexity allows algorithms to be compared for efficiency, and predicts their behavior as the data size increases. A particular algorithm may take years to compute; if you know that beforehand, you won't just wait for it to finish.

4 Complexity defined
Big-Oh notation indicates the family to which an algorithm belongs. Formally, T(n) is O(f(n)) if there exist positive constants k and n₀ such that |T(n)| ≤ k·|f(n)| for all n > n₀.
Examples: T(n) = n is linear; T(n) = n² is quadratic.
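As a worked check of the definition (the numbers here are illustrative, not from the slides), take T(n) = 3n² + 5n:

For n > 5 we have 5n < n·n = n², so
    T(n) = 3n² + 5n < 3n² + n² = 4n²
Choosing k = 4 and n₀ = 5 gives |T(n)| ≤ k·|n²| for all n > n₀, so T(n) is O(n²).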

5 Complexity Classes
There is a hierarchy of complexities. Here is some of the hierarchy:
O(1) < O(log n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)
An algorithm with smaller complexity is normally preferable to one with larger complexity. Tuning an algorithm without affecting its complexity is usually not as good as finding an algorithm with better complexity.

6 Importance of Complexity
Consider each operation (step) taking 1 μs:

n        log₂n     n·log₂n     n²         2ⁿ
2        1 μs      2 μs        4 μs       4 μs
16       4 μs      64 μs       256 μs     65 ms
256      8 μs      2 ms        65 ms      4 × 10⁶³ years
1024     10 μs     10 ms       1 sec      6 × 10²⁹⁴ years
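The figures above can be recomputed directly. Below is a minimal C sketch (not part of the original slides) that prints the same table under the 1 μs-per-step assumption; since 2¹⁰²⁴ steps overflows a double, durations are handled via log₁₀ of the step count.

#include <stdio.h>
#include <math.h>

static const double SECONDS_PER_YEAR = 3.156e7;

/* The step count is passed as log10(number of steps); 1 step = 1 microsecond. */
static void print_duration(double log10_steps)
{
    double log10_sec = log10_steps - 6.0;              /* 10^-6 s per step */
    if (log10_sec < -3.0)                              /* under 1 ms   */
        printf("%14.3g us", pow(10.0, log10_sec + 6.0));
    else if (log10_sec < 0.0)                          /* under 1 sec  */
        printf("%14.3g ms", pow(10.0, log10_sec + 3.0));
    else if (log10_sec < log10(SECONDS_PER_YEAR))      /* under 1 year */
        printf("%14.3g sec", pow(10.0, log10_sec));
    else
        printf("%14.3g yrs", pow(10.0, log10_sec) / SECONDS_PER_YEAR);
}

int main(void)
{
    int ns[] = { 2, 16, 256, 1024 };
    printf("%6s %14s %14s %14s %14s\n", "n", "log2 n", "n log2 n", "n^2", "2^n");
    for (int i = 0; i < 4; i++) {
        double n = ns[i];
        printf("%6.0f", n);
        print_duration(log10(log2(n)));        /* log2 n steps    */
        print_duration(log10(n * log2(n)));    /* n log2 n steps  */
        print_duration(log10(n * n));          /* n^2 steps       */
        print_duration(n * log10(2.0));        /* log10(2^n) = n*log10(2) */
        printf("\n");
    }
    return 0;
}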

7 But computers are getting faster, right?
CPU speed gets roughly 1.5× faster every 18 months (a real observation about circuit density: Moore's law, 1965).

8 Computers are getting faster, but ...
Consider each operation (step) taking 1 μs:

n        log₂n     n·log₂n     n²         2ⁿ
2        1 μs      2 μs        4 μs       4 μs
16       4 μs      64 μs       256 μs     65 ms
256      8 μs      2 ms        65 ms      4 × 10⁶³ years
1024     10 μs     10 ms       1 sec      6 × 10²⁹⁴ years

NP-complete: the class of problems whose best known algorithms take O(2ⁿ) time.

9 Importance of Complexity
Consider each operation (step) taking 1 μs, with n = 1 million:
Bubble sort will take O(n²), or ~10¹² operations, or about 277 hours.
Quicksort or mergesort will take O(n log n), or ~10⁷ operations, or about 10 sec.
Sequential search will take O(n), or 500,000 comparisons on average.
Binary search will take O(log n), or about 20 comparisons.
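The time estimates follow from the 1 μs-per-operation assumption; for instance:

10¹² operations × 1 μs = 10⁶ seconds ≈ 277.8 hours
log₂(10⁶) ≈ 19.9, so binary search needs about 20 comparisons.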

10 Importance of Complexity
First tune the algorithm, then tune the code!
Trivia: the first binary search algorithm was published in 1946; the first correct binary search algorithm was published in 1962.
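Binary search is famously easy to get subtly wrong, which is the point of the trivia above. Below is one standard C formulation, offered as an illustrative sketch (not the historical algorithm); it assumes a sorted array and runs in O(log n).

/* Illustrative sketch: iterative binary search on a sorted int array.
   Returns the index of key, or -1 if key is not present.  O(log n). */
int binary_search(const int a[], int n, int key)
{
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* avoids overflow of (lo + hi) */
        if (a[mid] == key)
            return mid;
        else if (a[mid] < key)
            lo = mid + 1;               /* key can only be in the upper half */
        else
            hi = mid - 1;               /* key can only be in the lower half */
    }
    return -1;                          /* not found */
}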

11 Deriving f(n) from T(n)
Reduce to the dominant term: as n becomes very large, what does T(n) approach? E.g. O(n² + 10⁶·n) = O(n²).
Eliminate multiplicative constants: it is the characteristic "shape" of f(n) that matters. E.g. O(3n) = O(n/3) = O(n).
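To see why the dominant term wins in the first example (a short worked check, not on the original slide): for n > 10⁶ the linear term is already smaller than the quadratic one, since 10⁶·n < n·n = n², so

    n² + 10⁶·n < 2n²   for all n > 10⁶

and the Big-Oh definition is satisfied with k = 2 and n₀ = 10⁶, giving O(n²).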

12 BubbleSort (non-recursive)

for (i = n; i >= 1; i--) {          /* Line 1 */
    for (j = 1; j < i; j++) {       /* Line 2 */
        if (a[j-1] > a[j]) {        /* Line 3 */
            temp = a[j-1];          /* Line 4 */
            a[j-1] = a[j];          /* Line 5 */
            a[j] = temp;            /* Line 6 */
        }
    }
}

13 BubbleSort

for (i = n; i >= 1; i--) {          /* Line 1 */
    for (j = 1; j < i; j++) {       /* Line 2 */
        if (a[j-1] > a[j]) {        /* Line 3 */
            temp = a[j-1];          /* Line 4 */
            a[j-1] = a[j];          /* Line 5 */
            a[j] = temp;            /* Line 6 */
        }
    }
}

Line 1: executed n times.
Lines 2 & 3: executed n + (n-1) + (n-2) + ... + 2 + 1 = n(n+1)/2 times.
O((n² + n)/2) = O(n²)
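For completeness, the loop above can be dropped into a small driver program; this is a sketch of my own (array contents and size chosen arbitrarily for illustration), not part of the slides.

#include <stdio.h>

/* Bubble sort exactly as on the slide, wrapped in a function: O(n^2) comparisons. */
void bubble_sort(int a[], int n)
{
    int i, j, temp;
    for (i = n; i >= 1; i--) {
        for (j = 1; j < i; j++) {
            if (a[j-1] > a[j]) {        /* swap an adjacent out-of-order pair */
                temp = a[j-1];
                a[j-1] = a[j];
                a[j] = temp;
            }
        }
    }
}

int main(void)
{
    int a[] = { 5, 2, 9, 1, 7, 3 };     /* example data, chosen arbitrarily */
    int n = sizeof a / sizeof a[0];
    bubble_sort(a, n);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);            /* prints: 1 2 3 5 7 9 */
    printf("\n");
    return 0;
}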

14 Factorial (recursive functions)

int fact (int n) {
    if (n == 0) return 1;
    return n * fact(n - 1);
}

T(n) = T(n - 1) + 1,   T(0) = 0
T(n) = 1 + T(n - 1) = 1 + 1 + T(n - 2) = 1 + 1 + ... + 1 = n
so fact is O(n).

15 More Formally: Solutions for Recursion
Method 1: Induction. Guess the solution (a constant k and a function f(n) such that |T(n)| ≤ k·|f(n)| for all n > n₀), then prove it by induction.
Method 2: Iterate the T(n) recurrence, then prove the result by induction.
Method 3: Recursion trees.

16 Recurrence Equations (Method 2)
MergeSort: what is T(n), the time for merge sort?
It calls itself recursively on the two halves (so 2·T(n/2)), and merges the two halves in n steps.
T(n) = n + 2·T(n/2)
T(1) = 1
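The recurrence mirrors the structure of the code. The following merge sort sketch (not from the slides; buffer and error handling simplified for illustration) shows the two recursive calls and the linear merge:

#include <stdlib.h>
#include <string.h>

/* Merge sort sketch: two recursive calls on halves (the 2*T(n/2))
   plus a linear-time merge (the "+ n" in the recurrence). */
void merge_sort(int a[], int n)
{
    if (n <= 1) return;                    /* T(1) = 1 */
    int mid = n / 2;
    merge_sort(a, mid);                    /* T(n/2) */
    merge_sort(a + mid, n - mid);          /* T(n/2) */

    int *tmp = malloc(n * sizeof *tmp);    /* merge the two sorted halves: n steps */
    int i = 0, j = mid, k = 0;             /* (malloc check omitted for brevity)   */
    while (i < mid && j < n)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i < mid) tmp[k++] = a[i++];
    while (j < n)   tmp[k++] = a[j++];
    memcpy(a, tmp, n * sizeof *tmp);
    free(tmp);
}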

17 Recurrence Equations (Method 2)
T(n), the time for merge sort:
T(n) = n + 2·T(n/2)
     = n + 2·(n/2 + 2·T(n/4)) = 2n + 4·T(n/4)
     = 2n + 4·(n/4 + 2·T(n/8)) = 3n + 8·T(n/8)
     = ...
     = n·log₂n + n·T(1) = n·log₂n + n, i.e. O(n log n)
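Completing the "then prove by induction" step (an added verification, for n a power of 2): claim T(n) = n·log₂n + n for T(n) = n + 2·T(n/2), T(1) = 1.

Base case: n = 1: 1·log₂1 + 1 = 0 + 1 = 1 = T(1).
Inductive step: assume T(n/2) = (n/2)·log₂(n/2) + n/2. Then
    T(n) = n + 2·T(n/2) = n + n·(log₂n - 1) + n = n·log₂n + n.
Since n ≤ n·log₂n for n ≥ 2, T(n) ≤ 2·n·log₂n, so T(n) is O(n log n) with k = 2 and n₀ = 2.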

18 Recurrence Trees: Method 3

int fibonacci (int n) {
    if (n <= 1) return 1;
    return fibonacci(n - 1) + fibonacci(n - 2);
}

1, 1, 2, 3, 5, 8, 13, ...
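A note on what the recursion tree shows (added here, consistent with the O(2ⁿ) class mentioned earlier in the deck): each call on n spawns calls on n-1 and n-2, so the running time obeys

    T(n) = T(n-1) + T(n-2) + 1,   T(0) = T(1) = 1

which means T(n) ≥ fibonacci(n) itself. The Fibonacci numbers grow like φⁿ ≈ 1.618ⁿ, so this recursive version takes exponential time, roughly O(2ⁿ), even though an iterative version would be O(n).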