CS 316: Algorithms (Introduction)
Dr. Ensaf Hussein
Department of Computer Science
Spring 2015 (2/10/2015)
GRADING POLICY
RESOURCES
Textbook:
• Thomas Cormen, Charles Leiserson, Ronald Rivest, and Clifford Stein. Introduction to Algorithms, 3rd ed. MIT Press, 2009.
• Anany Levitin. Introduction to the Design and Analysis of Algorithms, 2nd ed., 2007.
Handouts
COURSE SYLLABUS
• Algorithm Design and Analysis
• Asymptotic Notations
• Computing Complexity for:
  • Non-recursive algorithms
  • Recursive algorithms
    • Substitution Method
    • Recursion Tree
    • Master Method
• Divide and Conquer Algorithms (Merge Sort, Quick Sort)
• Greedy Approach
• Graphs
  • BFS, DFS
  • MST: Prim, Kruskal
  • Shortest Path: Dijkstra, Bellman-Ford
• Hash Tables
• Backtracking
COURSE OBJECTIVES
Design algorithms using pseudocode.
The algorithm design process:
Understand the problem → Design the algorithm → Prove correctness → Analyze the algorithm → Code the algorithm
ALGORITHM DESIGN:
3- An identifier begins with a letter. The data types of variables are not explicitly declared.
4- Compound data types can be formed with records:
node = record
{
    node *link;
}
Here link is a pointer to the record type node. Individual data items of a record can be accessed with a period.
5- Assignment of values to variables is done using the assignment statement:
<variable> := <expression>;
While Loop:
while <condition> do
{
    <statement-1>
    ...
    <statement-n>
}
For Loop:
for variable := value-1 to value-2 step step do
{
    <statement-1>
    ...
    <statement-n>
}
Repeat-Until:
repeat
    <statement-1>
    ...
    <statement-n>
until <condition>
8- A conditional statement has the following forms:
if <condition> then <statement>
if <condition> then <statement-1>
else <statement-2>
Case statement:
select case (expression)
{
    case 1: <statement-1>
    ...
    case n: <statement-n>
    default: <statement-n+1>
}
9- Input and output are done using the instructions read and write.
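As a small illustration of these conventions (an example added here, not from the slides), an algorithm to find the maximum of n array elements could be written:

```
Algorithm Max(a, n)
{
    result := a[1];
    for i := 2 to n step 1 do
    {
        if (a[i] > result) then result := a[i];
    }
    write result;
}
```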
SPACE & TIME COMPLEXITIES
Time complexity depends on:
- Machine speed
- Size of the data (n)
- Number of operations needed
Space complexity depends on:
- Size of the data
- Size of the program
TIME COMPLEXITY
NUMBER OF OPERATIONS
Each "simple" operation (+, -, =, <, >=) counts as one operation.
Loops and function calls are not simple operations; their cost depends on the size of the data and the contents of the function. We do not want "sort" to count as a single-step operation.
Each memory access is one operation.
We measure T(n) of an algorithm by counting the number of operations.
Number of Operations T(n)
Example (1): Factorial
factorial (n)
{
    f = 1;
    if (n > 0)
        for (i = 1 to n) f = f * i;
    return f;
}
COMPLEXITY OF THE FACTORIAL
ALGORITHM
Counting the multiplication f = f * i as the basic operation, the loop executes exactly n times for every input, so T(n) = n in all cases; therefore T(n) = Θ(n).
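The counting argument can be reproduced in Python with an explicit operation counter (a sketch added here, not from the slides; only multiplications are counted):

```python
def factorial(n):
    """Compute n! while counting the multiplications performed."""
    f = 1
    ops = 0                  # number of multiplications (the basic operation)
    if n > 0:
        for i in range(1, n + 1):
            f = f * i
            ops += 1
    return f, ops

# The multiplication count equals n for every input: T(n) = n.
print(factorial(5))   # (120, 5)
print(factorial(0))   # (1, 0)
```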
NUMBER OF OPERATIONS T(N)
Example (2): Linear Search in an array a[ ]
linSearch (a, key, n)
{
    for (i = 0 to n-1)
        if (a[i] == key) return i;
    return -1;
}
T(n) = number of array element comparisons.
Best case: T(n) = 1 (key found at the first position).
Worst case: T(n) = n (key at the last position, or not present).
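The best- and worst-case counts can be observed directly in a Python version of the slide's algorithm that tallies element comparisons (a sketch added for illustration):

```python
def lin_search(a, key):
    """Return (index of key in a, or -1; number of comparisons made)."""
    comparisons = 0
    for i in range(len(a)):
        comparisons += 1          # one a[i] == key comparison
        if a[i] == key:
            return i, comparisons
    return -1, comparisons

a = [7, 3, 9, 5]
print(lin_search(a, 7))   # (0, 1)   best case: T(n) = 1
print(lin_search(a, 5))   # (3, 4)   worst case: T(n) = n
print(lin_search(a, 8))   # (-1, 4)  key absent: also n comparisons
```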
COMPLEXITY OF THE LINEAR SEARCH
ALGORITHM
In the worst case T(n) is O(n); in the best case T(n) is Ω(1).
EIGHT GROWTH FUNCTIONS
Eight growth functions occur frequently in the analysis of algorithms (in order of increasing rate of growth relative to n):
• Constant ≈ 1
• Logarithmic ≈ log n
• Linear ≈ n
• Log-linear ≈ n log n
• Quadratic ≈ n²
• Cubic ≈ n³
• Exponential ≈ 2ⁿ
• Factorial ≈ n!
GROWTH RATES COMPARED
          n=1     n=2     n=4     n=8      n=16       n=32
1         1       1       1       1        1          1
log n     0       1       2       3        4          5
n         1       2       4       8        16         32
n log n   0       2       8       24       64         160
n²        1       4       16      64       256        1024
n³        1       8       64      512      4096       32768
2ⁿ        2       4       16      256      65536      4294967296
n!        1       2       24      40320    20.9T      Don't ask!
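The table's entries can be checked quickly in Python (a small sketch added here; note that "20.9T" in the table is 16! ≈ 2.09 × 10¹³):

```python
import math

n_values = [1, 2, 4, 8, 16, 32]

# n log n for each n, using log base 2 (log 1 = 0)
nlogn = [n * int(math.log2(n)) for n in n_values]
print(nlogn)               # [0, 2, 8, 24, 64, 160]

print(2 ** 32)             # 4294967296
print(math.factorial(16))  # 20922789888000, i.e. about 20.9 trillion
```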
A Display of the Growth of Functions Commonly Used
in Big-O Estimates.
BOUNDS (ASYMPTOTIC NOTATIONS)
Bounds describe the limiting behavior of algorithm complexity for large n. They are: O (upper bound), Θ (tight bound), and Ω (lower bound).
O-NOTATION
DEFINITION
f(n) ∈ O(g(n)) if there exist positive constants c and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀.
The notation O(g(n)) does not imply that g(n) is the worst-case running time; it simply means that the worst-case running time never exceeds the upper limit determined by g(n).
O-NOTATION
BASIC METHOD
Example (1): Using the basic definition, we show that 3n² + 10n ∈ O(n²).
Consider, 10 ≤ n for n ≥ 10 (obvious!)
10n ≤ n² for n ≥ 10 (multiplying both sides by n)
3n² + 10n ≤ 3n² + n² for n ≥ 10 (adding 3n² to both sides)
= 4n² (simplifying)
3n² + 10n ≤ c·n² for n ≥ n₀, where c = 4 and n₀ = 10 (solution)
Therefore, it follows from the basic definition that 3n² + 10n = O(n²).
The choice of the constant c is not unique. However, for each different c there is a corresponding value of n₀ that satisfies the basic relation. This behavior is illustrated by the next example.
Example (2): In the preceding example it was shown that 3n² + 10n ≤ c·n² for c = 4, n₀ = 10. We now show that the relation holds for a different value of c and a corresponding n₀.
Consider n ≤ n² for n ≥ 1 (obvious)
10n ≤ 10n² for n ≥ 1 (multiplying both sides by 10)
3n² + 10n ≤ 3n² + 10n² for n ≥ 1 (adding 3n² to both sides)
3n² + 10n ≤ 13n² for n ≥ 1 (simplifying)
Or, 3n² + 10n ≤ c·n², for n ≥ n₀, where c = 13, n₀ = 1 (solution)
Therefore, by the basic definition, 3n² + 10n = O(n²).
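Both witness pairs can be verified numerically (a quick sketch added here; checking the inequality 3n² + 10n ≤ c·n² over a finite range illustrates, but of course does not prove, the asymptotic claim):

```python
def bound_holds(c, n0, upto=10_000):
    """Check 3n^2 + 10n <= c * n^2 for all n in [n0, upto]."""
    return all(3 * n * n + 10 * n <= c * n * n for n in range(n0, upto + 1))

print(bound_holds(c=4, n0=10))   # True
print(bound_holds(c=13, n0=1))   # True
print(bound_holds(c=4, n0=1))    # False: c = 4 fails for small n
```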
O-NOTATION EXAMPLE
COMPARISON OF GROWTH RATES
For n ≥ 0, n³ ≥ n².