CS 6045: Advanced Algorithms - Greedy Algorithms
![CS 6045: Advanced Algorithms - Greedy Algorithms](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-1.jpg)
CS 6045: Advanced Algorithms Greedy Algorithms
![Greedy Algorithms](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-2.jpg)
Greedy Algorithms • Main Concept – Divide the problem into multiple steps (sub-problems) – At each step, take the choice that is best at the current moment (the locally optimal, or greedy, choice) – A greedy algorithm always makes the choice that looks best at the moment – The hope: a locally optimal choice will lead to a globally optimal solution • For some problems this works; for others it does not
![Greedy Algorithms](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-3.jpg)
Greedy Algorithms • A greedy algorithm always makes the choice that looks best at the moment – The hope: a locally optimal choice will lead to a globally optimal solution – For some problems, it works • Activity-Selection Problem • Huffman Codes • Dynamic programming can be overkill (slow); greedy algorithms tend to be easier to code
![Activity-Selection Problem](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-4.jpg)
Activity-Selection Problem • Problem: get your money’s worth out of a carnival – Buy a wristband that lets you onto any ride – Lots of rides, each starting and ending at different times – Your goal: ride as many rides as possible • Another, alternative goal that we don’t solve here: maximize time spent on rides • Welcome to the activity selection problem
![Activity-Selection](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-5.jpg)
Activity-Selection • Formally: – Given a set S of n activities, S = {a1, …, an}, where si = start time of activity i and fi = finish time of activity i – Find a max-size subset A of compatible (non-overlapping) activities • Assume that f1 ≤ f2 ≤ … ≤ fn
![Example](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-6.jpg)
Example • Maximum-size mutually compatible set: {a1, a3, a6, a8} • Not unique: also {a2, a5, a7, a9}
![Activity Selection: Optimal Substructure](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-7.jpg)
Activity Selection: Optimal Substructure
![Activity Selection: Optimal Substructure](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-8.jpg)
Activity Selection: Optimal Substructure
![Activity Selection: Dynamic Programming](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-9.jpg)
Activity Selection: Dynamic Programming
![Greedy Choice Property](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-10.jpg)
Greedy Choice Property • Dynamic programming – Solve all the sub-problems • The activity selection problem also exhibits the greedy-choice property: – We should choose an activity that leaves the resource available for as many other activities as possible – The first greedy choice is a1, since f1 ≤ f2 ≤ … ≤ fn
![Activity Selection: A Greedy Algorithm](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-11.jpg)
Activity Selection: A Greedy Algorithm • So the actual algorithm is simple: – Sort the activities by finish time – Schedule the first activity – Then schedule the next activity in the sorted list that starts after the previous activity finishes – Repeat until no more activities remain • The intuition is even simpler: – Always pick the ride that will end soonest among those available
![Activity Selection: A Greedy Algorithm](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-12.jpg)
Activity Selection: A Greedy Algorithm • Greedy Choice: Select the next best activity (local optimum) – Select the activity that ends first (smallest end time) – Intuition: it leaves the largest possible empty space for more activities • Once an activity is selected – Delete all non-compatible activities – They cannot be selected • Repeat the algorithm for the remaining activities – Either using iteration or recursion • Sub-problem: We created one sub-problem to solve (find the optimal schedule after the selected activity) • Hopefully, when we merge the local optimum with the sub-problem's optimal solution, we get a global optimum
![Greedy Algorithm Correctness](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-13.jpg)
Greedy Algorithm Correctness • Theorem: – If Sk (the set of activities that start after ak finishes) is nonempty and am has the earliest finish time in Sk, then am is included in some optimal solution • How to prove it? – We can convert any other optimal solution (S’) into the greedy algorithm's solution (S) • Idea: – Compare the activities in S’ and S from left to right – If they match in the selected activity, skip ahead – If they do not match, we can replace the activity in S’ with the one in S, because the one in S finishes first
![Example](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-14.jpg)
Example • S: {a1, a3, a6, a8} • S’: {a2, a5, a7, a9} • We mapped S’ to S and showed that S is at least as good: a2, a5, a7, a9 in S’ can be replaced by a1, a3, a6, a8 from S (each finishes earlier)
![Recursive Solution](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-15.jpg)
Recursive Solution • Inputs: two arrays s and f containing the start and end times (assumption: they are sorted based on end times), k = the activity chosen in the last call, n = the problem size

    Recursive-Activity-Selection(s, f, k, n)
        m = k + 1
        while (m <= n) and (s[m] < f[k])    // find the next activity starting after the end of k
            m = m + 1
        if (m <= n)
            return {am} ∪ Recursive-Activity-Selection(s, f, m, n)
        else
            return Φ

Time Complexity: O(n), assuming the arrays are already sorted; otherwise add O(n log n) for sorting
![Iterative Solution](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-16.jpg)
Iterative Solution • Inputs: two arrays s and f containing the start and end times (assumption: they are sorted based on end times)

    Iterative-Activity-Selection(s, f)
        n = s.length
        A = {a1}
        k = 1
        for m = 2 to n
            if s[m] >= f[k]
                A = A ∪ {am}
                k = m
        return A
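The iterative solution above translates directly into runnable form. The sample start and finish times below are illustrative (they are not from the slides) and are assumed to be pre-sorted by finish time:

```python
def select_activities(starts, finishes):
    """Greedy activity selection over activities sorted by finish time.

    Returns 0-based indices of a maximum-size set of mutually
    compatible (non-overlapping) activities.
    """
    selected = [0]                     # always take the first activity to finish
    last_finish = finishes[0]
    for m in range(1, len(starts)):
        if starts[m] >= last_finish:   # compatible with the last chosen activity
            selected.append(m)
            last_finish = finishes[m]
    return selected

# Illustrative instance, sorted by finish time
starts = [1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]
finishes = [4, 5, 6, 7, 9, 9, 10, 11, 12, 14, 16]
print(select_activities(starts, finishes))  # → [0, 3, 7, 10]
```

After the O(n log n) sort, the scan itself is a single O(n) pass, since each activity is examined once.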
![Elements of Greedy Algorithms](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-17.jpg)
Elements of Greedy Algorithms • Greedy-Choice Property – At each step, we make a greedy (locally optimal) choice • Top-Down Solution – The greedy choice is usually made independently of the sub-problems – Usually made “before” solving the sub-problem • Optimal Substructure – The global optimal solution can be composed from the local optima of the sub-problems
![Elements of Greedy Algorithms](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-18.jpg)
Elements Of Greedy Algorithms • Proving a greedy solution is optimal – Remember: Not all problems have optimal greedy solution – If it does, you need to prove it – Usually the proof includes mapping or converting any other optimal solution to the greedy solution
![Review: The Knapsack Problem](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-19.jpg)
Review: The Knapsack Problem • 0-1 knapsack problem: – The thief must choose among n items, where the ith item is worth vi dollars and weighs wi pounds – Carrying at most W pounds, maximize value • Note: assume vi, wi, and W are all integers • “0-1” because each item must be taken or left in its entirety • A variation, the fractional knapsack problem: – The thief can take fractions of items – Think of the items in the 0-1 problem as gold ingots, and in the fractional problem as buckets of gold dust
![Review: The Knapsack Problem and Optimal Substructure](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-20.jpg)
Review: The Knapsack Problem and Optimal Substructure • Both variations exhibit optimal substructure • To show this for the 0-1 problem, consider the most valuable load weighing at most W pounds – If we remove item j from the load, what do we know about the remaining load? – A: the remainder must be the most valuable load weighing at most W − wj that the thief could take from the museum, excluding item j
![Solving the Knapsack Problem](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-21.jpg)
Solving The Knapsack Problem • The optimal solution to the 0 -1 problem cannot be found with the same greedy strategy – Greedy strategy: take in order of dollars/pound – Example: 3 items weighing 10, 20, and 30 pounds, knapsack can hold 50 pounds • Suppose 3 items are worth $60, $100, and $120. • Will greedy strategy work?
![0-1 Knapsack: Greedy Strategy Does Not Work](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-22.jpg)
0-1 Knapsack - Greedy Strategy Does Not Work • Items: item 1 weighs 10 lb and is worth $60 ($6/pound), item 2 weighs 20 lb and is worth $100 ($5/pound), item 3 weighs 30 lb and is worth $120 ($4/pound); the knapsack holds W = 50 lb • Greedy choice: – Compute the benefit per pound – Sort the items based on these values • Greedy takes items 1 and 2: $60 + $100 = $160, and item 3 no longer fits • Not optimal: items 2 and 3 fill the knapsack exactly and are worth $100 + $120 = $220
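The counterexample above (items of 10, 20, and 30 pounds worth $60, $100, and $120, with W = 50) is small enough to check by brute force. The sketch below compares the density-greedy value against the true optimum; the function names are my own:

```python
from itertools import combinations

WEIGHTS, VALUES, CAPACITY = [10, 20, 30], [60, 100, 120], 50

def greedy_01(weights, values, W):
    """Take whole items in decreasing value-per-pound order (not optimal in general)."""
    order = sorted(range(len(weights)), key=lambda i: values[i] / weights[i], reverse=True)
    load = value = 0
    for i in order:
        if load + weights[i] <= W:   # item fits whole, take it
            load += weights[i]
            value += values[i]
    return value

def best_01(weights, values, W):
    """Exhaustive search over all subsets; fine for tiny instances."""
    return max(
        sum(values[i] for i in s)
        for r in range(len(weights) + 1)
        for s in combinations(range(len(weights)), r)
        if sum(weights[i] for i in s) <= W
    )

print(greedy_01(WEIGHTS, VALUES, CAPACITY))  # → 160 (items 1 and 2)
print(best_01(WEIGHTS, VALUES, CAPACITY))    # → 220 (items 2 and 3)
```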
![Solving the Knapsack Problem](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-23.jpg)
Solving the Knapsack Problem • The optimal solution to the fractional knapsack problem can be found with a greedy algorithm • Greedy choice: – Compute the benefit per pound – Sort the items based on these values – Take as much as you can from the top items in the list • With W = 50: all of item 1 ($60), all of item 2 ($100), and 2/3 of item 3 ($80), for an optimal total of $240
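The fractional greedy strategy can be sketched the same way: take items by value density and split the first one that does not fully fit. A minimal sketch, with illustrative names:

```python
def fractional_knapsack(weights, values, W):
    """Greedy fractional knapsack: fill by decreasing value per pound,
    taking a fraction of the first item that does not fully fit."""
    order = sorted(range(len(weights)), key=lambda i: values[i] / weights[i], reverse=True)
    remaining, total = W, 0.0
    for i in order:
        take = min(weights[i], remaining)         # whole item, or whatever still fits
        total += values[i] * (take / weights[i])  # value is proportional to the fraction taken
        remaining -= take
        if remaining == 0:
            break
    return total

# The slide's instance: $60 + $100 + (2/3) x $120 = $240
print(fractional_knapsack([10, 20, 30], [60, 100, 120], 50))
```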
![The Knapsack Problem: Greedy vs. Dynamic](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-24.jpg)
The Knapsack Problem: Greedy Vs. Dynamic • The fractional problem can be solved greedily • The 0 -1 problem cannot be solved with a greedy approach – As you have seen, however, it can be solved with dynamic programming
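For contrast, here is a compact dynamic-programming sketch of the 0-1 problem, using the standard table over capacities; names and structure are illustrative, not from the slides:

```python
def knapsack_01(weights, values, W):
    """0-1 knapsack by dynamic programming.

    dp[w] = best value achievable with capacity w; capacities are
    scanned downward so each item is used at most once.
    """
    dp = [0] * (W + 1)
    for wt, val in zip(weights, values):
        for w in range(W, wt - 1, -1):
            dp[w] = max(dp[w], dp[w - wt] + val)
    return dp[W]

print(knapsack_01([10, 20, 30], [60, 100, 120], 50))  # → 220
```

This runs in O(nW) time, which is pseudo-polynomial: fine for small integer capacities, unlike the O(n log n) greedy that suffices for the fractional variant.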
![Huffman code](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-25.jpg)
Huffman code • Computer Data Encoding: – How do we represent data in binary? • Historical Solution: – Fixed length codes – Encode every symbol by a unique binary string of a fixed length. – Examples: ASCII (7 bit code), – EBCDIC (8 bit code), …
![American Standard Code for Information Interchange](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-26.jpg)
American Standard Code for Information Interchange
![ASCII Example: AABCAA](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-27.jpg)
ASCII Example: AABCAA With A = 1000001, B = 1000010, C = 1000011: A A B C A A → 1000001 1000001 1000010 1000011 1000001 1000001
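The 7-bit codes above can be reproduced with a one-liner (a quick sketch; the helper name is my own):

```python
def ascii7(text):
    """Encode each character as its 7-bit ASCII code, space-separated."""
    return " ".join(format(ord(c), "07b") for c in text)

print(ascii7("AABCAA"))
# → 1000001 1000001 1000010 1000011 1000001 1000001
```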
![Total space usage in bits](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-28.jpg)
Total space usage in bits: Assume an ℓ-bit fixed-length code. For a file of n characters, we need nℓ bits.
![Variable Length Codes](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-29.jpg)
Variable Length Codes • Idea: In order to save space, use fewer bits for frequent characters and more bits for rare characters. • Example: suppose an alphabet of 3 symbols {A, B, C} and a file of 1,000,000 characters. • A fixed-length code needs 2 bits per symbol, for a total of 2,000,000 bits.
![Variable Length Codes - Example](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-30.jpg)
Variable Length Codes - Example Suppose the frequency distribution of the characters is: A: 999,000 B: 500 C: 500 Encode: A = 0, B = 10, C = 11 Note that the code of A has length 1, and the codes for B and C have length 2
![Total space usage in bits Fixed code 1 000 x 2 2 000 Total space usage in bits: Fixed code: 1, 000 x 2 = 2, 000](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-31.jpg)
Total space usage in bits: Fixed code: 1,000,000 × 2 = 2,000,000 bits. Variable code: 999,000 × 1 + 500 × 2 + 500 × 2 = 1,001,000 bits. A savings of almost 50%.
![How do we decode In the fixed length we know where every character starts How do we decode? In the fixed length, we know where every character starts,](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-32.jpg)
How do we decode? In the fixed-length code, we know where every character starts, since they all have the same number of bits. Example: A = 00, B = 01, C = 10. 00 00 00 01 01 10 10 10 01 00 00 10 10 → A A A B B C C C B A A C C
![How do we decode In the variable length code we use an idea called How do we decode? In the variable length code, we use an idea called](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-33.jpg)
How do we decode? In the variable-length code, we use an idea called a prefix code, where no codeword is a prefix of another. Example: A = 0, B = 10, C = 11. None of the above codes is a prefix of another.
![How do we decode Example A 0 B 10 C 11 How do we decode? Example: A = 0 B = 10 C = 11](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-34.jpg)
How do we decode? Example: A = 0, B = 10, C = 11. So, for the string AAABBCCCBAACC, the encoding is: 0 0 0 10 10 11 11 11 10 0 0 11 11 = 000101011111110001111
![Prefix Code Example A 0 B 10 C 11 Decode the Prefix Code Example: A = 0 B = 10 C = 11 Decode the](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-35.jpg)
Prefix Code Example: A = 0, B = 10, C = 11. Decode the string 000101011111110001111: 0 0 0 10 10 11 11 11 10 0 0 11 11 → A A A B B C C C B A A C C
![Desiderata Construct a variable length code for a given file with the following properties Desiderata: Construct a variable length code for a given file with the following properties:](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-36.jpg)
Desiderata: Construct a variable length code for a given file with the following properties: 1. Prefix code. 2. Using shortest possible codes. 3. Efficient.
![Idea Consider a binary tree with 0 meaning a left branch 1 meaning a Idea Consider a binary tree, with: 0 meaning a left branch 1 meaning a](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-37.jpg)
Idea: Consider a binary tree, with 0 meaning a left branch and 1 meaning a right branch. (In the example tree, leaf A hangs off the root's 0-branch, and B, C, D hang successively deeper along the 1-branches.)
![Idea Consider the paths from the root to each of the leaves A B Idea Consider the paths from the root to each of the leaves A, B,](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-38.jpg)
Idea: Consider the paths from the root to each of the leaves A, B, C, D: A: 0, B: 10, C: 110, D: 111.
![Observe 1 This is a prefix code since each of the leaves has a Observe: 1. This is a prefix code, since each of the leaves has a](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-39.jpg)
Observe: 1. This is a prefix code, since each codeword ends at a leaf, with no continuation. 2. If the tree is full, then we are not “wasting” bits. 3. If we make sure that the more frequent symbols are closer to the root, then they will have shorter codes.
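Reading the codewords off such a tree is a simple traversal: going left appends 0, going right appends 1, and a leaf's accumulated path is its code. A sketch (the nested-tuple tree representation and `codes_from_tree` helper are assumptions, not from the slides):

```python
# Derive the codeword table from a code tree. A leaf is a symbol
# string; an internal node is a (left, right) tuple.
def codes_from_tree(node, prefix=""):
    if isinstance(node, str):            # leaf: path so far is its code
        return {node: prefix}
    left, right = node
    table = codes_from_tree(left, prefix + "0")
    table.update(codes_from_tree(right, prefix + "1"))
    return table

# The A/B/C/D tree from the slides above.
tree = ("A", ("B", ("C", "D")))
print(codes_from_tree(tree))  # {'A': '0', 'B': '10', 'C': '110', 'D': '111'}
```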
![Greedy Algorithm 1 Consider all pairs frequency symbol 2 Choose the two lowest frequencies Greedy Algorithm: 1. Consider all pairs: <frequency, symbol>. 2. Choose the two lowest frequencies,](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-40.jpg)
Greedy Algorithm: 1. Consider all pairs <frequency, symbol>. 2. Choose the two lowest frequencies, and make them siblings, with their new parent having the combined frequency. 3. Iterate.
![Greedy Algorithm Example Alphabet A B C D E F Frequency table A B Greedy Algorithm Example: Alphabet: A, B, C, D, E, F Frequency table: A B](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-41.jpg)
Greedy Algorithm Example: Alphabet: A, B, C, D, E, F. Frequency table: A: 10, B: 20, C: 30, D: 40, E: 50, F: 60. Total file length: 210.
![Algorithm Run A 10 B 20 C 30 D 40 E 50 F 60 Algorithm Run: A 10 B 20 C 30 D 40 E 50 F 60](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-42.jpg)
Algorithm Run: start with six single-node trees: A 10, B 20, C 30, D 40, E 50, F 60.
![Algorithm Run A X 30 10 B C 20 30 D 40 E 50 Algorithm Run: A X 30 10 B C 20 30 D 40 E 50](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-43.jpg)
Algorithm Run: merge the two lowest frequencies, A (10) and B (20), into X (30). Forest: X 30, C 30, D 40, E 50, F 60.
![Algorithm Run Y A X 30 10 B 60 D C 20 30 40 Algorithm Run: Y A X 30 10 B 60 D C 20 30 40](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-44.jpg)
Algorithm Run: merge X (30) and C (30) into Y (60). Forest: Y 60, D 40, E 50, F 60.
![Algorithm Run D 40 E 50 Y A X 30 10 B 60 F Algorithm Run: D 40 E 50 Y A X 30 10 B 60 F](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-45.jpg)
Algorithm Run: reorder by frequency: D 40, E 50, Y 60, F 60.
![Algorithm Run D Z 90 40 E Y 50 A X 30 10 B Algorithm Run: D Z 90 40 E Y 50 A X 30 10 B](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-46.jpg)
Algorithm Run: merge D (40) and E (50) into Z (90). Forest: Y 60, F 60, Z 90.
![Algorithm Run Y A X 30 10 B 60 F C 20 30 60 Algorithm Run: Y A X 30 10 B 60 F C 20 30 60](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-47.jpg)
Algorithm Run: reorder by frequency: Y 60, F 60, Z 90.
![Algorithm Run W 120 Y A X 30 10 B 60 F C 20 Algorithm Run: W 120 Y A X 30 10 B 60 F C 20](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-48.jpg)
Algorithm Run: merge Y (60) and F (60) into W (120). Forest: Z 90, W 120.
![Algorithm Run D Z 90 40 E W 120 50 Y A X 30 Algorithm Run: D Z 90 40 E W 120 50 Y A X 30](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-49.jpg)
Algorithm Run: reorder by frequency: Z 90, W 120.
![Algorithm Run V 210 0 Z 90 0 D 40 E 1 W 120 Algorithm Run: V 210 0 Z 90 0 D 40 E 1 W 120](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-50.jpg)
Algorithm Run: merge Z (90) and W (120) into the root V (210). Label each left branch 0 and each right branch 1: V → 0 → Z (Z → 0 → D, 1 → E); V → 1 → W (W → 0 → Y, 1 → F); Y → 0 → X, 1 → C; X → 0 → A, 1 → B.
![The Huffman encoding A B C D E F 1000 1001 101 00 01 The Huffman encoding: A: B: C: D: E: F: 1000 1001 101 00 01](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-51.jpg)
The Huffman encoding: A: 1000, B: 1001, C: 101, D: 00, E: 01, F: 11. File size: 10 × 4 + 20 × 4 + 30 × 3 + 40 × 2 + 50 × 2 + 60 × 2 = 40 + 80 + 90 + 80 + 100 + 120 = 510 bits.
![Note the savings The Huffman code Required 510 bits for the file Note the savings: • The Huffman code: • Required 510 bits for the file.](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-52.jpg)
Note the savings: • The Huffman code required 510 bits for the file. • Fixed-length code: need 3 bits for 6 characters; the file has 210 characters; total: 630 bits for the file.
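The arithmetic on this slide can be checked directly from the frequency table and the code lengths derived above (a quick verification sketch, not part of the original slides):

```python
# Verify the 510-bit vs 630-bit comparison from the slides.
freqs    = {"A": 10, "B": 20, "C": 30, "D": 40, "E": 50, "F": 60}
code_len = {"A": 4,  "B": 4,  "C": 3,  "D": 2,  "E": 2,  "F": 2}

huffman_bits = sum(freqs[s] * code_len[s] for s in freqs)
fixed_bits   = sum(freqs.values()) * 3   # 3 bits suffice for 6 symbols

print(huffman_bits)  # 510
print(fixed_bits)    # 630
```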
![Greedy Algorithm Initialize trees of a single node each Keep the roots Greedy Algorithm • Initialize trees of a single node each. • Keep the roots](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-53.jpg)
Greedy Algorithm • Initialize trees of a single node each. • Keep the roots of all subtrees in a priority queue. • Iterate until only one tree left: • Merge the two smallest frequency subtrees into a single subtree with two children, and insert into priority queue.
![Huffman Algorithm Total run time n lgn HuffmanC 1 n C 2 Q Huffman Algorithm Total run time: (n lgn) Huffman(C) 1. n = |C| 2. Q](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-54.jpg)
Huffman Algorithm. Total run time: Θ(n lg n).
Huffman(C)
1. n = |C|
2. Q = C // Q is a binary min-heap; Θ(n) Build-Heap
3. for i = 1 to n-1
4.   z = Allocate-Node()
5.   x = Extract-Min(Q) // Θ(lg n), Θ(n) times
6.   y = Extract-Min(Q) // Θ(lg n), Θ(n) times
7.   left(z) = x
8.   right(z) = y
9.   f(z) = f(x) + f(y)
10.  Insert(Q, z) // Θ(lg n), Θ(n) times
11. return Extract-Min(Q) // return the root of the tree
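The pseudocode above can be sketched as runnable Python, with `heapq` playing the role of the min-priority queue (the tuple-based tree encoding and tie-breaking counter are implementation choices, not from the slides):

```python
# A runnable sketch of the Huffman pseudocode above, using heapq as the
# min-priority queue. An insertion counter breaks frequency ties so that
# the heap never tries to compare two tree nodes directly.
import heapq
from itertools import count

def huffman(freqs):
    """freqs: {symbol: frequency} -> {symbol: codeword}."""
    tie = count()
    # Heap entries are (frequency, tiebreak, tree); a leaf is its symbol.
    q = [(f, next(tie), s) for s, f in freqs.items()]
    heapq.heapify(q)                       # Build-Heap, O(n)
    for _ in range(len(freqs) - 1):        # n-1 merges
        fx, _, x = heapq.heappop(q)        # Extract-Min, O(lg n)
        fy, _, y = heapq.heappop(q)
        heapq.heappush(q, (fx + fy, next(tie), (x, y)))
    _, _, root = q[0]

    table = {}
    def walk(node, prefix):
        if isinstance(node, tuple):        # internal node: recurse
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                              # leaf: record its codeword
            table[node] = prefix or "0"    # single-symbol edge case
    walk(root, "")
    return table

freqs = {"A": 10, "B": 20, "C": 30, "D": 40, "E": 50, "F": 60}
codes = huffman(freqs)
print(sum(freqs[s] * len(c) for s, c in codes.items()))  # 510
```

The exact tree can differ from the slides when frequencies tie, but the total cost (510 bits here) is the same for every valid tie-breaking.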
![Algorithm correctness Need to prove two things for greedy algorithms Greedy Choice Property The Algorithm correctness: Need to prove two things for greedy algorithms: Greedy Choice Property: The](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-55.jpg)
Algorithm correctness: Need to prove two things for greedy algorithms. Greedy Choice Property: the locally optimal choice is indeed part of some global optimum. Optimal Substructure Property: when we recurse on the remaining subproblem and combine its optimal solution with the greedy choice, we get a global optimum.
![Huffman Algorithm correctness Greedy Choice Property There exists a minimum cost prefix tree where Huffman Algorithm correctness: Greedy Choice Property: There exists a minimum cost prefix tree where](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-56.jpg)
Huffman Algorithm correctness: Greedy Choice Property: There exists a minimum cost prefix tree where the two smallest frequency characters are indeed siblings with the longest path from root. This means that the greedy choice does not hurt finding the optimum.
![Algorithm correctness Optimal Substructure Property An optimal solution to the problem once we choose Algorithm correctness: Optimal Substructure Property: An optimal solution to the problem once we choose](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-57.jpg)
Algorithm correctness: Optimal Substructure Property: take the two least frequent elements and combine them to produce a smaller problem; an optimal solution to the smaller problem, with the two elements added back, is an optimal solution to the original problem.
![Algorithm correctness Greedy Choice Property There exists a minimum cost tree where Algorithm correctness: • Greedy Choice Property: • There exists a minimum cost tree where](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-58.jpg)
Algorithm correctness: • Greedy Choice Property: • There exists a minimum cost tree where the two minimum-frequency elements are siblings on a longest path. • Proof by contradiction: • Assume that is not the situation. Then two other elements occupy the longest path. • Say a, b are the elements with smallest frequency, and x, y are the elements on the longest path.
![Algorithm correctness CT da dy da d y a x We know about Algorithm correctness: CT da dy da ≤ d y a x We know about](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-59.jpg)
Algorithm correctness: In tree CT, let d_a and d_y be the depths of a and y. We know about depth and frequency: d_a ≤ d_y and f_a ≤ f_y.
![Algorithm correctness CT We also know about code tree CT da dy a x Algorithm correctness: CT We also know about code tree CT: da dy a x](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-60.jpg)
Algorithm correctness: We also know about code tree CT: its cost, ∑_σ f_σ d_σ, is the smallest possible. Now exchange a and y to obtain CT′.
![Algorithm correctness CostCT fσdσ CT σ fσdσfadafydy da dy σa y y Algorithm correctness: Cost(CT) = ∑fσdσ = CT’ σ ∑fσdσ+fada+fydy≥ da dy σ≠a, y y](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-61.jpg)
Algorithm correctness: cost(CT) = ∑_σ f_σ d_σ = ∑_{σ≠a,y} f_σ d_σ + f_a d_a + f_y d_y ≥ ∑_{σ≠a,y} f_σ d_σ + f_y d_a + f_a d_y = cost(CT′). (Since d_a ≤ d_y and f_a ≤ f_y, we have f_a d_a + f_y d_y ≥ f_y d_a + f_a d_y.)
![Algorithm correctness CT db dx b x a Now do the same thing for Algorithm correctness: CT db dx b x a Now do the same thing for](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-62.jpg)
Algorithm correctness: Now do the same exchange for b and x.
![Algorithm correctness CT db dx x b a And get an optimal code tree Algorithm correctness: CT” db dx x b a And get an optimal code tree](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-63.jpg)
Algorithm correctness: The result is an optimal code tree CT″ in which a and b are siblings on a longest path.
![Algorithm correctness Optimal substructure property Let a b be the symbols with the smallest Algorithm correctness: Optimal substructure property: Let a, b be the symbols with the smallest](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-64.jpg)
Algorithm correctness: Optimal substructure property: Let a, b be the symbols with the smallest frequency. Let x be a new symbol whose frequency is f_x = f_a + f_b. Delete characters a and b, and find the optimal code tree CT for the reduced alphabet. Then CT′ = CT ∪ {a, b} is an optimal tree for the original alphabet.
![Algorithm correctness CT CT x fx f a f b x a Algorithm correctness: CT CT’ x fx = f a + f b x a](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-65.jpg)
Algorithm correctness: In CT, x is a leaf with f_x = f_a + f_b; in CT′, that leaf is expanded into an internal node with children a and b.
![Algorithm correctness costCTfσdσ fσdσ fada fbdb σ σa b fσdσ Algorithm correctness: cost(CT’)=∑fσd’σ = ∑fσd’σ + fad’a + fbd’b= σ σ≠a, b ∑fσd’σ +](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-66.jpg)
Algorithm correctness: cost(CT′) = ∑_σ f_σ d′_σ = ∑_{σ≠a,b} f_σ d′_σ + f_a d′_a + f_b d′_b = ∑_{σ≠a,b} f_σ d′_σ + f_a(d_x + 1) + f_b(d_x + 1) = ∑_{σ≠a,b} f_σ d′_σ + (f_a + f_b)(d_x + 1) = ∑_{σ≠a,b} f_σ d_σ + f_x d_x + f_x = cost(CT) + f_x.
![Algorithm correctness CT CT x fx f a f b x costCTfx Algorithm correctness: CT CT’ x fx = f a + f b x cost(CT)+fx](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-67.jpg)
Algorithm correctness: So, with f_x = f_a + f_b, we have cost(CT) + f_x = cost(CT′).
![Algorithm correctness Assume CT is not optimal By the previous lemma there is a Algorithm correctness: Assume CT’ is not optimal. By the previous lemma there is a](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-68.jpg)
Algorithm correctness: Assume CT′ is not optimal. By the previous lemma, there is a tree CT″ that is optimal and in which a and b are siblings. So cost(CT″) < cost(CT′).
![Consider Algorithm correctness CT CT x fx f a f b By Consider Algorithm correctness: CT’’’ CT” x fx = f a + f b By](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-69.jpg)
Algorithm correctness: Consider CT‴, obtained from CT″ by merging the siblings a and b back into x (f_x = f_a + f_b). By a similar argument: cost(CT‴) + f_x = cost(CT″).
![Algorithm correctness We get costCT costCT fx costCT fx Algorithm correctness: We get: cost(CT’’’) = cost(CT”) – fx < cost(CT’) – fx =](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-70.jpg)
Algorithm correctness: We get: cost(CT’’’) = cost(CT”) – fx < cost(CT’) – fx = cost(CT) and this contradicts the minimality of cost(CT).
![Greedy vs Dynamic Greedy Algorithms Can assemble a globally optimal solution by Greedy vs. Dynamic • Greedy Algorithms – Can assemble a globally optimal solution by](https://slidetodoc.com/presentation_image_h2/b40cf3d794d2876bb5d243647aa96c91/image-71.jpg)
Greedy vs. Dynamic • Greedy Algorithms – Can assemble a globally optimal solution by making locally optimal choices – Making the choice before solving the sub-problems – Top-down (simpler and more efficient) – Can solve some problems optimally • Dynamic Programming – Choice depends on knowing optimal solutions to subproblems. Solve all sub-problems – Bottom-up (slow) – Can solve more problems optimally