Lecture Notes 4 Analysis of Algorithms

The document discusses three algorithmic approaches: greedy algorithms, divide and conquer, and dynamic programming. Greedy algorithms make locally optimal decisions but may fail to reach a globally optimal solution, while divide and conquer breaks a problem into smaller sub-problems, solves them independently, and merges the results. Dynamic programming, on the other hand, solves overlapping sub-problems using memoization, contrasting with the other two approaches.


ANALYSIS OF ALGORITHMS

GREEDY ALGORITHMS

An algorithm is designed to achieve the optimum solution for a given problem. In the greedy algorithm
approach, decisions are made from the given solution domain. Being greedy, the choice that seems
to provide the optimum solution at each step is chosen.

A greedy algorithm tries to find a locally optimal solution, which may eventually lead to a
globally optimal solution. In general, however, greedy algorithms do not guarantee globally optimal
solutions.

Counting Coins
This problem is to count up to a desired value by choosing the fewest possible coins, and the greedy
approach forces the algorithm to pick the largest possible coin at each step. If we are provided coins of € 1, 2, 5 and 10
and we are asked to count € 18, then the greedy procedure will be −

 1 − Select one € 10 coin; the remaining count is 8
 2 − Then select one € 5 coin; the remaining count is 3
 3 − Then select one € 2 coin; the remaining count is 1
 4 − Finally, selecting one € 1 coin solves the problem

This seems to work fine: for this count we needed to pick only 4 coins, which is optimal. But if we
slightly change the problem, then the same approach may not be able to produce an equally
optimal result.

For a currency system where we have coins of value 1, 7 and 10, counting coins for the value 18 is still
absolutely optimal, but for a count like 15 the greedy approach uses more coins than necessary. For example −
the greedy approach will use 10 + 1 + 1 + 1 + 1 + 1, a total of 6 coins, whereas the same problem could be
solved by using only 3 coins (7 + 7 + 1).
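The coin-counting procedure above can be sketched in Python (the function name is my own choice); it shows both the success with the € 1/2/5/10 system and the failure with the 1/7/10 system:

```python
def greedy_coin_count(coins, amount):
    """Repeatedly pick the largest coin that still fits (greedy)."""
    picked = []
    for coin in sorted(coins, reverse=True):
        while amount >= coin:
            amount -= coin
            picked.append(coin)
    return picked

# Standard system: greedy happens to be optimal here
print(greedy_coin_count([1, 2, 5, 10], 18))  # [10, 5, 2, 1] -> 4 coins

# 1/7/10 system: greedy uses 6 coins for 15, but 7 + 7 + 1 needs only 3
print(greedy_coin_count([1, 7, 10], 15))     # [10, 1, 1, 1, 1, 1] -> 6 coins
```

Note that the greedy routine never reconsiders a choice once made, which is exactly why it misses the 7 + 7 + 1 answer.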

Hence, we may conclude that the greedy approach picks the immediately optimal choice and may fail
where global optimization is the major concern.

Examples

Many networking algorithms use the greedy approach. Here is a list of a few of them −

 Travelling Salesman Problem
 Prim's Minimal Spanning Tree Algorithm
 Kruskal's Minimal Spanning Tree Algorithm
 Dijkstra's Shortest Path Algorithm
 Graph - Map Coloring
 Graph - Vertex Cover
 Knapsack Problem
 Job Scheduling Problem

These, and many similar problems, use the greedy approach to try to find an optimum
solution.

DIVIDE AND CONQUER

In the divide and conquer approach, the problem at hand is divided into smaller sub-problems, and
then each sub-problem is solved independently. If we keep dividing the sub-problems into
even smaller sub-problems, we eventually reach a stage where no further division is
possible. Those "atomic", smallest possible sub-problems are solved directly. The solutions of all the
sub-problems are finally merged in order to obtain the solution of the original problem.

Broadly, we can understand the divide-and-conquer approach as a three-step process.

Divide/Break

 This step involves breaking the problem into smaller sub-problems. Each sub-problem should
represent a part of the original problem. This step generally takes a recursive approach to
divide the problem until no sub-problem is further divisible. At this stage, the sub-problems
become atomic in nature but still represent some part of the actual problem.
Conquer/Solve

 This step receives a lot of smaller sub-problems to be solved. Generally, at this level the
problems are small enough to be considered 'solved' on their own.

Merge/Combine

 When the smaller sub-problems are solved, this stage recursively combines them until
they form the solution of the original problem.

This algorithmic approach works recursively, and the conquer and merge steps work so closely together that they
appear as one.
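As an illustration of the three steps, here is a sketch of merge sort in Python, with the divide, conquer and merge phases marked in comments:

```python
def merge_sort(items):
    # Divide: an atomic sub-problem (0 or 1 element) is already sorted
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    # Conquer: solve each half independently (recursively)
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge: combine the two sorted halves into one sorted list
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```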

Examples

The following computer algorithms are based on divide-and-conquer programming approach −

 Merge Sort
 Quick Sort
 Binary Search
 Strassen's Matrix Multiplication
 Closest pair (points)

There are various ways to solve any computational problem, but the algorithms above are good
examples of the divide and conquer approach.

DYNAMIC PROGRAMMING

The dynamic programming approach is similar to divide and conquer in breaking the problem
down into smaller and smaller sub-problems. But unlike divide and conquer, these sub-problems
are not solved independently. Rather, the results of these smaller sub-problems are
remembered and re-used for similar or overlapping sub-problems.

Dynamic programming is used where we have problems which can be divided into similar sub-problems,
so that their results can be re-used. Mostly, these algorithms are used for optimization.
Before solving the sub-problem at hand, a dynamic programming algorithm will examine the results of
previously solved sub-problems. The solutions of the sub-problems are combined in order to achieve
the best solution.

So we can say that −

 The problem should be divisible into smaller, overlapping sub-problems.
 The optimum solution can be achieved by using the optimum solutions of the smaller sub-problems.
 Dynamic algorithms use memoization.
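A minimal sketch of memoization in Python, using the Fibonacci series, a classic overlapping sub-problem: each value is computed once and then remembered, so later requests are simple look-ups.

```python
def fib(n, memo=None):
    """Top-down Fibonacci with memoization."""
    if memo is None:
        memo = {}
    # Re-use a previously computed result if available (memoization)
    if n in memo:
        return memo[n]
    if n <= 1:
        return n
    memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]

print(fib(40))  # 102334155, computed in 40 steps instead of ~10^9 naive calls
```

Without the memo dictionary, the same recursion would re-solve fib(2), fib(3), ... exponentially many times.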

Comparison
In contrast to greedy algorithms, which address local optimization, dynamic programming algorithms
aim for overall optimization of the problem.

In contrast to divide and conquer algorithms, where the solutions of independent sub-problems are combined to obtain the overall
solution, dynamic programming algorithms use the output of smaller sub-problems to optimize
bigger sub-problems. Dynamic programming algorithms use memoization to remember the output of already
solved sub-problems.

Example

The following computer problems can be solved using dynamic programming approach −

 Fibonacci number series
 Knapsack problem
 Tower of Hanoi
 All pair shortest path by Floyd-Warshall
 Shortest path by Dijkstra
 Project scheduling

Dynamic programming can be used in both a top-down and a bottom-up manner. And of course,
most of the time, referring to a previously computed result is cheaper, in terms of CPU cycles,
than re-computing it.
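As a sketch of the bottom-up manner, the same Fibonacci values can be built from the smallest sub-problems upward, keeping only the last two results instead of a full memo table:

```python
def fib_bottom_up(n):
    """Bottom-up (tabulated) Fibonacci, O(n) time, O(1) space."""
    if n <= 1:
        return n
    prev, curr = 0, 1
    # Build each value from the two already-solved smaller sub-problems
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr

print(fib_bottom_up(40))  # 102334155, same result as the top-down version
```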

GRAPH

A graph is a pictorial representation of a set of objects where some pairs of objects are connected
by links. The interconnected objects are represented by points termed as vertices, and the links
that connect the vertices are called edges.

Formally, a graph is a pair of sets (V, E), where V is the set of vertices and E is the set of edges
connecting pairs of vertices. Consider, for example, the graph defined by the following sets −

V = {a, b, c, d, e}

E = {ab, ac, bd, cd, de}
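In code, such a graph is commonly stored as an adjacency list; here is a minimal Python sketch of the graph above (undirected, so each edge is recorded in both directions):

```python
# The graph above: V = {a, b, c, d, e}, E = {ab, ac, bd, cd, de}
V = {'a', 'b', 'c', 'd', 'e'}
E = [('a', 'b'), ('a', 'c'), ('b', 'd'), ('c', 'd'), ('d', 'e')]

adj = {v: [] for v in V}
for u, w in E:
    adj[u].append(w)
    adj[w].append(u)  # undirected: the edge connects both endpoints

print(sorted(adj['d']))  # ['b', 'c', 'e'], the three neighbours of d
```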
