# Greedy Algorithms

Earlier this year, I took Stanford's Coursera class Algorithms: Design and Analysis, Part 1, and I was looking forward to Part 2. This week Part 2 began, and I just completed the Week 1 lectures, homework quiz, and programming assignment. The class is expertly taught by Professor Tim Roughgarden.

The first week of lectures consisted of an optional refresher on content covered in the previous class (Part 1), followed by an introduction to greedy algorithms, basic scheduling algorithms, and finally Prim's Minimum Spanning Tree algorithm. The homework consisted of a multiple-choice quiz and a programming assignment: one question on scheduling and one on Prim's algorithm.

Several greedy algorithms were compared, such as Dijkstra's shortest-path algorithm and Prim's and Kruskal's MST algorithms, since all of them build a solution through a sequence of myopic choices. The running time of a greedy algorithm is generally easy to analyze, but the algorithm is often difficult to prove correct. The optimal offline caching algorithm described by László Bélády in the 1960s is a greedy algorithm: it uses the heuristic that the cached item whose next access lies "farthest in the future" is the one that should be evicted.
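The farthest-in-the-future rule can be sketched in a few lines. This is my own illustrative Python (not from the course materials): given the full request sequence up front, on each miss with a full cache it evicts the cached item whose next request is farthest away, or never comes.

```python
def farthest_in_future(requests, cache_size):
    """Belady's offline caching rule: on a miss with a full cache,
    evict the item whose next request is farthest in the future."""
    cache, misses = set(), 0
    for i, item in enumerate(requests):
        if item in cache:
            continue  # cache hit
        misses += 1
        if len(cache) == cache_size:
            # Find each cached item's next use; inf means "never again".
            def next_use(c):
                for j in range(i + 1, len(requests)):
                    if requests[j] == c:
                        return j
                return float("inf")
            cache.remove(max(cache, key=next_use))
        cache.add(item)
    return misses
```

The rule needs the whole future request sequence, so it is not implementable online; its value is as a benchmark that practical policies like LRU try to approximate.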

Scheduling jobs on a single resource was done simply by sorting the jobs in descending order of the ratio of each job's importance (weight) to its length. This greedy approach was proved optimal: it minimizes the sum over all jobs of the job's weight multiplied by its completion time.
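As a quick sketch (my own code, with jobs represented as hypothetical `(weight, length)` pairs), the schedule and its objective look like this:

```python
def schedule(jobs):
    """Greedy order: sort (weight, length) jobs by decreasing
    weight/length ratio; this minimizes total weighted completion time."""
    return sorted(jobs, key=lambda job: job[0] / job[1], reverse=True)

def weighted_completion_time(jobs):
    """Sum of weight * completion_time over jobs run back to back."""
    total, elapsed = 0, 0
    for weight, length in jobs:
        elapsed += length          # this job finishes at time `elapsed`
        total += weight * elapsed
    return total
```

For example, jobs `[(3, 5), (1, 2)]` have ratios 0.6 and 0.5, so the heavy job runs first; swapping the order strictly increases the objective.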

Finally, Minimum Spanning Tree concepts were covered again. Both Prim's and Kruskal's algorithms run in O(m log n) time, where n is the number of vertices and m the number of edges. The straightforward implementation of Prim's takes O(m*n), while using heaps (e.g. Java's PriorityQueue) attains the asymptotically better running time.
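A minimal heap-based Prim's sketch, again my own code rather than the assignment solution: it uses lazy deletion (stale heap entries are skipped on pop) instead of the decrease-key operation the lectures describe, which keeps the same O(m log n) bound for a connected graph on vertices 0..n-1.

```python
import heapq

def prim_mst_cost(n, edges):
    """Total MST weight via Prim's algorithm with a binary heap.
    `edges` is a list of (u, v, weight) triples; graph must be connected."""
    adj = [[] for _ in range(n)]
    for u, v, w in edges:
        adj[u].append((w, v))
        adj[v].append((w, u))
    visited = [False] * n
    heap = [(0, 0)]  # (crossing-edge weight, vertex); grow from vertex 0
    cost = 0
    while heap:
        w, u = heapq.heappop(heap)
        if visited[u]:
            continue  # stale entry: u was already absorbed via a cheaper edge
        visited[u] = True
        cost += w
        for wv, v in adj[u]:
            if not visited[v]:
                heapq.heappush(heap, (wv, v))
    return cost
```

Each edge is pushed at most twice, so the heap operations dominate at O(m log m) = O(m log n).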