Unit 1 Notes Lecture 1
An algorithm is simply a set of logical instructions for carrying out a predefined task.
Problems
This concept of all problems behaving like mathematical functions might not
match your intuition for the behavior of computer programs. You might know of
programs to which you can give the same input value on two separate
occasions, and two different outputs will result. For example, if you type date to
a typical Linux command line prompt, you will get the current date. Naturally the
date will be different on different days, even though the same command is given.
However, there is obviously more to the input for the date program than the
command that you type to run the program. The date program computes a
function. In other words, on any particular day there can only be a single answer
returned by a properly running date program on a completely specified input. For
all computer programs, the output is completely determined by the program's
full set of inputs. Even a "random number generator" is completely determined
by its inputs (although some random number generating systems appear to
get around this by accepting a random input from a physical process beyond the
user's control). The limits on what functions can be implemented by programs
are part of the domain of Computability.
Algorithm: a step-by-step procedure for performing some task in finite time (a list of
steps that solves a problem).
The more you think algorithmically, the more easily you can solve real-world problems.
Complex algorithms are constructed from simpler ones.
Algorithms
By definition, something can only be called an algorithm if it has all of the following
properties.
Programs
The requirement that an algorithm must terminate means that not all computer
programs meet the technical definition of an algorithm. Your operating system is
one such program. However, you can think of the various tasks for an operating
system (each with associated inputs and outputs) as individual problems, each
solved by specific algorithms implemented by a part of the operating system
program, and each one of which terminates once its output is produced.
• Conceptual simplicity
• Difficulty to implement
• Difficulty to modify
• Running time
• Space complexity: amount of memory/ number of memory cells that the algorithm
needs.
• Best case: Least amount of time needed by the algorithm to solve an instance of
the problem of size n.
• Worst case: Largest amount of time needed by the algorithm to solve an instance of
the problem of size n.
• Average case: the average time over all instances of size n, i.e. (time to solve
instance 1 of size n + time to solve instance 2 of size n + … + time to solve the last
instance of size n) / (number of instances of size n).
• The asymptotic time complexity is denoted with Landau's symbols O, Ω, and Θ
(theta), which abstract away the constant c used in the running-time bound.
Big-O Notation
This notation gives an asymptotic upper bound of the given function. Generally, it is
written f(n) = O(g(n)), meaning that at larger values of n, f(n) grows no faster than a
constant multiple of g(n).
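For instance (an illustrative example, not from the notes), the definition can be checked directly by exhibiting the constant c and a threshold n0:

```latex
f(n) = 3n^2 + 2n + 5 = O(n^2)
\quad\text{since}\quad
3n^2 + 2n + 5 \le 4n^2 \;\text{ for all } n \ge 4,
\quad\text{i.e. } c = 4,\; n_0 = 4.
```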
Omega-Ω Notation
This notation gives an asymptotic lower bound of the given function. It is written
f(n) = Ω(g(n)), meaning that at larger values of n, f(n) grows at least as fast as a
constant multiple of g(n).
Theta-Θ Notation
This notation indicates whether the upper and lower bounds of a given function
(algorithm) coincide. The average running time of an algorithm always lies between
the lower bound and the upper bound. If the upper bound (O)
and lower bound (Ω) give the same result, then the Θ
notation will also have that same rate of growth.
Examples of complexity
• O(n²) – quadratic time
• O(2ⁿ) – exponential time
Examples of Data Structures and the Associated time complexity
Greedy Algorithms
Brute Force
Dynamic programming
Backtracking
Genetic Algorithms
GREEDY ALGORITHM
- is an algorithm that follows the problem-solving heuristic of making
the locally optimal choice at each stage with the hope of finding a global optimum.
A "take what you can get now" strategy: work in phases, and in each phase make the
currently best decision.
The greedy method is a general algorithm design paradigm. Well-known greedy
algorithms include:
• Dijkstra's algorithm
• Prim's algorithm
• Kruskal's algorithm
int num;
if (denomination[i] <= amount) {
    num = amount / denomination[i];          /* take as many coins of this denomination as fit */
    amount = amount - num * denomination[i]; /* reduce the remaining amount */
}
Huffman Trees
Job scheduling problem on a computer: you have 9 jobs and 3 processors, so you might
start with the longest-running jobs first on whichever processor is available, or
alternatively start with the shortest job first.
DIVIDE AND CONQUER
Reduce the problem to smaller problems (by a factor of at least 2) solved recursively,
and then combine the solutions.
Divide: split the input data S into two or more disjoint subsets S1, S2, …, Sn.
Conquer: solve each subproblem recursively.
Combine: merge the solutions for S1, S2, …, Sn into a solution for S, e.g. pow(x, y)
and log(n, base).
- This approach can lead to an algorithm that is more efficient than the brute-force
algorithm.
E.g. find the closest pair of points:
(a) Divide: split the points into a left half and a right half of approximately n/2
points each (points sorted by x-coordinate).
(b) Recursively find the closest pair in each of these two sub-problems.
(c) Combine the sub-problem solutions to find the closest pair in the given point set.
The sorted arrays are permutations of the same point set; never duplicate points,
just reference them.
Always return the smaller of the two sub-problem distances:
Distance d = min(dL, dR)
*A cross pair (one point in the left half, one in the right) can only beat d if both
points lie within distance d of the dividing line, so points deep inside the
sub-divisions XL or XR need not be considered.
Combine steps:
Construct a Y-strip that references the points near the dividing line, in order of
y-coordinate (sorted by y).
Step 1:
Keep only the points with x-coordinate greater than XR − d and less than XL + d,
i.e. within d of the dividing line.
Step 2:
Find any pair within the Y-strip with distance < d (one point from each side).
Consider each point in the Y-strip looking upwards: only the points whose
y-coordinates differ by less than d need to be checked.
*As you process the points recursively, sort them by y-coordinate.
*Compare only the selected points, in y-coordinate order.
Examples of Divide and Conquer Algorithms:
Binary Search
Merge sort
Quick sort
Tree traversal
BRUTE FORCE
Enumerates all possible candidate solutions and tests each for validity until a
correct one is found. Algorithms using brute force are not always efficient.
e.g. the asymptotic time complexity of the brute-force closest-pair approach:
compare all pairs of points, compute each distance, and compare with the best so far.
Point 1 is compared with the n − 1 points that follow it (later points never look
back at point 1);
point 2 is compared with the n − 2 points after it, and so on;
point n has nothing to compare with, because all pairs are exhausted.
*Summing gives 1 + 2 + … + (n − 2) + (n − 1) = n(n − 1)/2 iterations, which equals
nC2, the number of pairs.
Examples:
Sequential search
Factorial(n)
  f ← 1
  FOR i ← 1 TO n DO
    f ← f * i
  END FOR
  RETURN f