DAA Lecture-1
Design and Analysis of Algorithms

Name: M Asif Mahmood
Lecturer, GDC, D.I.KHAN
E-Mail: [email protected]
Covered Topics
• Algorithm analysis
• Algorithm-related issues
• Algorithm design strategies
• Analysis of algorithms
• Order of growth
Introduction of Algorithm
• An algorithm is a sequence of unambiguous instructions for
solving a problem, i.e., for obtaining a required output for
any legitimate input in a finite amount of time.
• Recipe, process, method, technique, procedure, routine, …
with the following requirements:
• Finiteness: terminates after a finite number of steps
• Definiteness: rigorously and unambiguously specified
• Input: valid inputs are clearly specified
• Output: can be proved to produce the correct output given a
valid input
• Effectiveness: steps are sufficiently simple and basic
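As a concrete illustration (a Python sketch, not part of the original slides), Euclid's algorithm for the greatest common divisor satisfies all of these requirements:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace (a, b) by (b, a mod b)."""
    while b != 0:           # finiteness: b strictly decreases toward 0
        a, b = b, a % b     # definiteness: each step is unambiguous
    return a                # output: provably the greatest common divisor

print(gcd(60, 24))  # -> 12
```

Any valid pair of non-negative integers (not both zero) is a legitimate input, and each step is simple enough to carry out mechanically.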
Introduction of Algorithm
[Figure: a problem is solved by designing an algorithm, which maps any legitimate input to the required output.]
Basic Issues Related to Algorithms
• How to design algorithms
• Proving correctness
• Efficiency
  ◦ Theoretical analysis
  ◦ Empirical analysis
• Optimality
Algorithm Design Strategies
• Brute force
• Divide and conquer
• Decrease and conquer
• Transform and conquer
• Greedy approach
• Dynamic programming
• Backtracking and branch and bound
• Space and time tradeoffs
Factors Affecting/Masking Speed
• Number of input values
• Processor speed
• Number of simultaneously running programs, etc.
• The location of items in a data structure
• Different implementations (software differences)
Analysis of Algorithms
• How good is the algorithm?
1. Correctness
2. Time efficiency
3. Space efficiency
Theoretical Analysis of Time Efficiency
• Time efficiency is analyzed by determining the number of
repetitions of the basic operation as a function of input size.
Theoretical Analysis of Time Efficiency (Cont.)
Input size and basic operation examples:

Problem                                 Input size measure            Basic operation
Search for a key in a list of n items   Number of items in list, n    Key comparison
Multiply two matrices of                Dimensions of the matrices    Floating-point multiplication
floating-point numbers
Compute aⁿ                              n                             Floating-point multiplication
Graph problem                           #vertices and/or #edges       Visiting a vertex or
                                                                      traversing an edge
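For the first row of the table, here is a sketch (my own illustration, with hypothetical names) of sequential search instrumented to count its basic operation, the key comparison:

```python
def sequential_search(lst, key):
    """Return (index, comparisons); the basic operation is the key comparison."""
    comparisons = 0
    for i, item in enumerate(lst):
        comparisons += 1          # basic operation: compare item with key
        if item == key:
            return i, comparisons
    return -1, comparisons        # key not found

# Worst case (key absent): n comparisons for a list of n items.
print(sequential_search([3, 1, 4, 1, 5], 9))  # -> (-1, 5)
```

The comparison count grows with n, the number of items, which is exactly the input size measure in the table.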
Theoretical Analysis of Time Efficiency (Cont.)
• Types of formulas to count basic operations:
1) Exact formula
   e.g., C(n) = n(n-1)/2
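A count such as C(n) = n(n-1)/2 arises, for instance, when every pair of n items is compared once (as in a brute-force element-uniqueness check). A small check of this, in illustrative Python of my own:

```python
def count_pair_comparisons(n):
    """Comparisons made when every pair (i, j) with i < j is compared once,
    e.g. in a brute-force element-uniqueness check."""
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            count += 1  # basic operation performed for each distinct pair
    return count

# The loop count matches the exact formula C(n) = n(n-1)/2.
for n in (1, 5, 10):
    print(n, count_pair_comparisons(n), n * (n - 1) // 2)
```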
Time Efficiency of Non-Recursive Algorithms
Steps in mathematical analysis of non-recursive algorithms:
Order of Growth (Asymptotic Complexity)
Running time of an algorithm as a function of input size n, for large n.
Expressed using only the highest-order term in the expression for the
exact running time.
◦ Instead of the exact running time, say Θ(n²).
Describes the behavior of the function in the limit.
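To see why only the highest-order term matters, the sketch below (a hypothetical operation count of my own) divides an exact count by its highest-order term; the ratio settles toward a constant as n grows:

```python
def op_count(n):
    """A hypothetical exact operation count with lower-order terms."""
    return n * n / 2 - n / 2 + 7

# The lower-order terms -n/2 + 7 become negligible for large n,
# so op_count(n) / n^2 tends to the leading coefficient 1/2.
for n in (10, 100, 10_000):
    print(n, op_count(n) / n**2)
```

This is the sense in which op_count(n) is Θ(n²): in the limit, it behaves like a constant multiple of n².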
Asymptotic Growth Rate
A way of comparing functions that ignores constant factors and small
input sizes.
O(g(n)): class of functions t(n) that grow no faster than g(n).
Asymptotic Notation
Θ, O, Ω, o, ω
Defined for functions over the natural numbers.
◦ Ex: f(n) = Θ(n²).
◦ Describes how f(n) grows in comparison to n².
Each defines a set of functions; in practice used to compare the sizes
of two functions.
The notations describe different rate-of-growth relations between the
defining function and the defined set of functions.
Θ-notation
For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set:
Θ(g(n)) = { f(n) :
    ∃ positive constants c1, c2, and n0, such that ∀ n ≥ n0,
    we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)
}
Intuitively: the set of all functions that have the same rate of growth
as g(n).
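The definition can be checked mechanically for a concrete pair of functions. The sketch below (my own example; the witness constants are one valid choice, not unique) verifies that f(n) = 3n² + 2n is in Θ(n²):

```python
def f(n):
    return 3 * n * n + 2 * n   # candidate function

def g(n):
    return n * n               # comparison function

# Witness constants: for n >= 2 we have 2n <= n^2, hence
# 3n^2 <= 3n^2 + 2n <= 4n^2.
c1, c2, n0 = 3, 4, 2
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000))
print("3n^2 + 2n is in Theta(n^2) with c1=3, c2=4, n0=2")
```

A finite check is not a proof, of course, but the inequality 2n ≤ n² for n ≥ 2 makes the sandwich hold for all n ≥ n0.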
Technically, f(n) ∈ Θ(g(n)).
Older usage: f(n) = Θ(g(n)).
I’ll accept either…
Ω-notation
For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set:
Ω(g(n)) = { f(n) :
    ∃ positive constants c and n0, such that ∀ n ≥ n0,
    we have 0 ≤ c·g(n) ≤ f(n)
}
Intuitively: the set of all functions whose rate of growth is the same
as or higher than that of g(n).
Relations Between Θ, Ω, O
Theorem: For any two functions g(n) and f(n),
f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
I.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
Running Times
“Running time is O(f(n))” ⇒ worst case is O(f(n))
An O(f(n)) bound on the worst-case running time ⇒ an O(f(n)) bound on
the running time of every input.
An Ω(f(n)) bound on the worst-case running time ⇏ an Ω(f(n)) bound on
the running time of every input.
“Running time is Ω(f(n))” ⇒ best case is Ω(f(n))
Can still say “worst-case running time is Ω(f(n))”
◦ Means the worst-case running time is given by some unspecified
function g(n) ∈ Ω(f(n)).
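To see why an O bound on the worst case carries over to every input while an Ω bound does not, consider insertion sort (an illustrative sketch of my own): its worst-case comparison count is n(n-1)/2, but an already-sorted input needs only n-1 comparisons.

```python
def insertion_sort_comparisons(a):
    """Sort a copy of a with insertion sort; return the number of
    element comparisons (the basic operation)."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1        # basic operation: compare a[j] with key
            if a[j] <= key:
                break
            a[j + 1] = a[j]         # shift larger element right
            j -= 1
        a[j + 1] = key
    return comparisons

n = 100
print(insertion_sort_comparisons(range(n)))         # sorted:   n-1 comparisons
print(insertion_sort_comparisons(range(n, 0, -1)))  # reversed: n(n-1)/2 comparisons
```

So the O(n²) worst-case bound holds for every input, while the Ω(n²) bound on the worst case says nothing about the sorted input, whose running time is only Θ(n).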
Asymptotic Notation in Equations
Can use asymptotic notation in equations to replace expressions
containing lower-order terms.
For example,
4n³ + 3n² + 2n + 1 = 4n³ + 3n² + Θ(n)
                   = 4n³ + Θ(n²) = Θ(n³). How to interpret?
In equations, Θ(f(n)) always stands for an anonymous function
g(n) ∈ Θ(f(n)).
◦ In the example above, Θ(n²) stands for 3n² + 2n + 1.
o-notation
For a given function g(n), the set little-o:
o(g(n)) = { f(n) : ∀ c > 0, ∃ n0 > 0 such that
    ∀ n ≥ n0, we have 0 ≤ f(n) < c·g(n) }.
f(n) becomes insignificant relative to g(n) as n approaches infinity:
lim (n→∞) [f(n) / g(n)] = 0
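The limit characterization can be observed numerically. The sketch below (my own example) shows that f(n) = n log n is o(n²): the ratio f(n)/g(n) shrinks toward 0 as n grows.

```python
import math

def ratio(n):
    """f(n) / g(n) for f(n) = n log n and g(n) = n^2."""
    return (n * math.log(n)) / (n * n)

# The ratio keeps shrinking, consistent with lim f(n)/g(n) = 0,
# so n log n grows strictly slower than n^2.
for n in (10, 1_000, 1_000_000):
    print(n, ratio(n))
```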
ω-notation
For a given function g(n), the set little-omega:
ω(g(n)) = { f(n) : ∀ c > 0, ∃ n0 > 0 such that
    ∀ n ≥ n0, we have 0 ≤ c·g(n) < f(n) }.
f(n) becomes arbitrarily large relative to g(n) as n approaches infinity:
lim (n→∞) [f(n) / g(n)] = ∞
Comparison of Functions
The relation between f and g is analogous to that between numbers a and b:

f(n) = O(g(n))    a ≤ b
f(n) = Ω(g(n))    a ≥ b
f(n) = Θ(g(n))    a = b
f(n) = o(g(n))    a < b
f(n) = ω(g(n))    a > b