
CS 316: ALGORITHMS

(INTRODUCTION)
Dr. Ensaf Hussein
Department Of Computer Science
SPRING 2015

2/10/2015
GRADING POLICY

• Mid-term Exam: 20%
• Assignments: 10%
• Practical Task: 10%
• Final Exam (written): 60%

RESOURCES

Textbooks:
Thomas Cormen, Charles Leiserson, Ronald Rivest, and Clifford Stein. Introduction to Algorithms. 3rd ed. MIT Press, 2009.
Anany Levitin. Introduction to the Design and Analysis of Algorithms. 2nd ed., 2007.

Handouts

COURSE SYLLABUS
• Algorithms Design and Analysis
• Asymptotic Notations
• Compute Complexity for:
• Non-recursive algorithms
• Recursive Algorithms
• Substitution Method
• Recursion Tree
• Master Method
• Divide and Conquer Algorithms (Merge Sort, Quick Sort)
• Greedy Approach
• Graph
• BFS - DFS
• MST – Prim - Kruskal
• Shortest path – Dijkstra – Bellman Ford
• Hash Tables
• Backtracking
COURSE OBJECTIVES
 Design algorithms using pseudocode.

 Demonstrate and apply asymptotic notations in the best, worst, and average cases.

 Define and analyze the complexity of recursive and non-recursive algorithms.

 Understand, analyze, and apply standard algorithms involving searching, sorting, trees, graphs, greedy methods, backtracking, and dynamic programming.
ALGORITHMS
ABU JA‘FAR MOHAMMED IBN MUSA
AL-KHOWARIZMI (C. 780 – C. 850)
al-Khowarizmi, an astronomer and mathematician, was a member of the House of
Wisdom, an academy of scientists in Baghdad. The name al-Khowarizmi means
"from the town of Khowarizm," which was then part of Persia but is now called
Khiva and is part of Uzbekistan. al-Khowarizmi wrote books on mathematics,
astronomy, and geography. Western Europeans first learned about algebra from
his works. The word algebra comes from al-jabr, part of the title of his book
Kitab al-jabr w'al muquabala. This book was translated into Latin and was a
widely used textbook. His book on the use of Hindu numerals describes
procedures for arithmetic operations using these numerals. European authors
used a Latin corruption of his name, which later evolved into the word
algorithm, to describe the subject of arithmetic with Hindu numerals.
ALGORITHMS
An algorithm is a sequence of unambiguous
instructions for solving a computational problem,
i.e., for obtaining a required output for any
legitimate input in a finite amount of time.
problem → algorithm
input → "computer" → output

Algorithm design and analysis process:
1. Understand the problem
2. Decide on: exact vs. approximate solving, data structures
3. Design the algorithm
4. Prove correctness
5. Analyze the algorithm
6. Code the algorithm
ALGORITHM DESIGN:

An algorithm can be described in three ways:

1- Natural language, like English:
When this way is chosen, care should be taken to ensure that each and every statement is definite.
2- Graphic representation, called a flowchart:
This method works well when the algorithm is small and simple.
3- Pseudo-code method:
In this method, we describe the algorithm in program-like notation resembling languages such as Pascal and Algol.
PSEUDO-CODE CONVENTIONS:
1- Comments begin with // and continue until the end of line.

2- Blocks are indicated with matching braces {and}.

3- An identifier begins with a letter. The data types of variables are not explicitly declared.

4- Compound data types can be formed with records. Here is an example:

node = record
{
    datatype_1 data_1;
    ...
    datatype_n data_n;
    node *link;
}

Here link is a pointer to the record type node. Individual data items of a
record can be accessed with → and period.
5- Assignment of values to variables is done using the assignment statement.

<Variable>:= <expression>; Or <Variable> ← <expression>;

6- There are two Boolean values, TRUE and FALSE.

 Logical operators: AND, OR, NOT
 Relational operators: <, <=, >, >=, =, !=
7- The following looping statements are employed:
for, while, and repeat-until.

While loop:
while <condition> do
{
    <statement-1>
    ...
    <statement-n>
}

For loop:
for variable := value-1 to value-2 step step do
{
    <statement-1>
    ...
    <statement-n>
}

Repeat-until:
repeat
    <statement-1>
    ...
    <statement-n>
until <condition>
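For comparison, a sketch of how these three constructs map onto Python (the variable names are illustrative; repeat-until has no direct Python form and is conventionally written as `while True` with a `break`):

```python
# while loop: while <condition> do { ... }
i = 0
while i < 3:
    i += 1          # i ends at 3

# for loop: for variable := value-1 to value-2 step step do { ... }
# Python's range is half-open, so the upper bound must be value-2 + step
for j in range(1, 3 + 1):
    pass            # j takes the values 1, 2, 3

# repeat-until: the body executes at least once,
# and the loop ends when the condition becomes true
k = 0
while True:
    k += 1
    if k >= 3:      # until <condition>
        break
```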
8- A conditional statement has the following forms:
 if <condition> then <statement>
 if <condition> then <statement-1>
   else <statement-2>

Case statement:
select case (expression)
{
    case 1: <statement-1>
    ...
    case n: <statement-n>
    default: <statement-n+1>
}
9- Input and output are done using the instructions read and write.

10- There is only one type of procedure: Algorithm. The heading takes the form

Algorithm Name(<parameter list>)

As an example, the following algorithm finds and returns the maximum of n given numbers:
EXAMPLE 1

algorithm Max(A, n)
// A is an array of size n
{
    Result := A[1];
    for i := 2 to n do
        if A[i] > Result then
            Result := A[i];
    return Result;
}

In this algorithm (named Max), A and n are procedure parameters; Result and i are local variables.
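The same algorithm can be sketched in Python, where zero-based indexing replaces the pseudocode's 1-based arrays (the name `max_value` is illustrative):

```python
def max_value(a):
    """Return the maximum of a non-empty list, mirroring algorithm Max(A, n)."""
    result = a[0]               # Result := A[1] (pseudocode arrays are 1-based)
    for i in range(1, len(a)):  # for i := 2 to n
        if a[i] > result:
            result = a[i]
    return result
```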
ANALYSIS OF ALGORITHMS

 The main goal is to determine the cost of running an algorithm and how to reduce that cost. Cost is expressed as complexity:

 Time Complexity
 Space Complexity

SPACE & TIME COMPLEXITIES

 Time Complexity depends on:
- Machine speed
- Size of data and number of operations needed (n)

 Space Complexity depends on:
- Size of data
- Size of program
TIME COMPLEXITY

 Expressed as T(n) = number of operations required.

 n is the problem size:
n could be the number of specific operations, or the size of the data (e.g., an array), or both.

NUMBER OF OPERATIONS
Each "simple" operation (+, -, =, <, >=) is one operation.
Loops and function calls are not simple operations; their cost depends on the
size of the data and the contents of the function. We do not want "sort" to
count as a single-step operation.
Each memory access is one operation.
We measure T(n) of an algorithm by counting the number of operations.

Number of Operations T(n)

Example (1): Factorial Function

factorial(n)
{
    f := 1;
    if (n > 0)
        for i := 1 to n do
            f := f * i;
    return f;
}

Let T(n) = number of multiplications.
For a given n, T(n) = n (always).
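A runnable sketch of the same function in Python, with a counter added to confirm that the number of multiplications is exactly n (the counter and the tuple return are illustrative additions, not part of the pseudocode):

```python
def factorial(n):
    """Compute n! and count multiplications, mirroring the pseudocode above."""
    f = 1
    mults = 0
    if n > 0:
        for i in range(1, n + 1):
            f = f * i
            mults += 1       # one multiplication per loop iteration
    return f, mults
```

For every n the multiplication count equals n, so the best and worst cases coincide: T(n) = n.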

COMPLEXITY OF THE FACTORIAL
ALGORITHM
Because T(n) = n always, then T(n) = Θ(n)


NUMBER OF OPERATIONS T(N)
Example (2): Linear Search in an array a[ ]

linSearch(a, key, n)
{
    for i := 0 to n-1 do
        if (a[i] == key) return i;
    return -1;
}

T(n) = number of array element comparisons.
Best case: T(n) = 1
Worst case: T(n) = n
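A Python sketch of the search with a comparison counter added, so the best and worst cases can be observed directly (the counter and tuple return are illustrative additions):

```python
def lin_search(a, key):
    """Return (index of key in a, comparisons made), or (-1, comparisons)."""
    comparisons = 0
    for i in range(len(a)):
        comparisons += 1            # one element comparison per iteration
        if a[i] == key:
            return i, comparisons
    return -1, comparisons
```

Searching for the first element costs one comparison (best case); searching for an absent key costs n comparisons (worst case).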
COMPLEXITY OF THE LINEAR SEARCH
ALGORITHM

T(n) = 1 in the best case; T(n) = n in the worst case.
We write that as: T(n) = Ω(1) and T(n) = O(n).
EIGHT GROWTH FUNCTIONS
Eight functions that occur frequently in the analysis of algorithms, in order
of increasing rate of growth relative to n:

• Constant ≈ 1
• Logarithmic ≈ log n
• Linear ≈ n
• Log-linear ≈ n log n
• Quadratic ≈ n^2
• Cubic ≈ n^3
• Exponential ≈ 2^n
• Factorial ≈ n!
GROWTH RATES COMPARED
          n=1   n=2   n=4   n=8     n=16    n=32
1         1     1     1     1       1       1
log n     0     1     2     3       4       5
n         1     2     4     8       16      32
n log n   0     2     8     24      64      160
n^2       1     4     16    64      256     1024
n^3       1     8     64    512     4096    32768
2^n       2     4     16    256     65536   4294967296
n!        1     2     24    40320   20.9T   Don't ask!
A Display of the Growth of Functions Commonly Used
in Big-O Estimates.
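The table's entries can be reproduced with a short sketch (the function name `growth` is illustrative; `int(math.log2(n))` is exact here because every n in the table is a power of 2):

```python
import math

def growth(n):
    """Values of the eight common growth functions at n, as in the table."""
    return {
        "1": 1,
        "log n": int(math.log2(n)),
        "n": n,
        "n log n": int(n * math.log2(n)),
        "n^2": n ** 2,
        "n^3": n ** 3,
        "2^n": 2 ** n,
        "n!": math.factorial(n),
    }
```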
BOUNDS (ASYMPTOTIC NOTATIONS)

Ο, Θ, Ω

Bounds describe the limiting behavior of algorithm complexity at large n. They are:

Worst case: upper bound (Big-O complexity)
Best case: lower bound (Big-Ω complexity)
Exact: tight bound (Big-Θ complexity)
O-NOTATION
DEFINITION

If f(n) is running time of an algorithm, and g(n) is some


standard growth function such that for some positive
constants c and integer n0 ,
f(n) ≤ c.g(n) for all n ≥ n0

then f(n) = O(g(n)) (Read f(n) is Big-Oh of g(n) )


The behavior of f(n) and g(n) is portrayed in the
diagram. It follows that for n<n0, f(n) may lie above or
below g(n), but for all n ≥ n0, f(n) falls consistently
below g(n)..
[Diagram: trend of running time, f(n) versus c·g(n)]
O-NOTATION
ASYMPTOTIC UPPER BOUND

If f(n) = O(g(n)), then the function g(n) is called an asymptotic upper bound
of f(n).

Since the worst-case running time of an algorithm is the maximum running time
for any input, g(n) provides an upper bound on the worst-case running time.

The notation O(g(n)) does not imply that g(n) is the worst running time; it
simply means that the worst running time never exceeds the upper limit
determined by c·g(n).
O-NOTATION
BASIC METHOD
Example (1): Using the basic definition, we show that 3n^2 + 10n ∈ O(n^2).

Consider:   10 ≤ n                       for n ≥ 10  (obvious)
            10n ≤ n^2                    for n ≥ 10  (multiplying both sides by n)
            3n^2 + 10n ≤ 3n^2 + n^2      for n ≥ 10  (adding 3n^2 to both sides)
                       = 4n^2                        (simplifying)
            3n^2 + 10n ≤ c·n^2           for n ≥ n0, where c = 4 and n0 = 10  (solution)

Therefore, it follows from the basic definition that 3n^2 + 10n = O(n^2).

The choice of the constant c is not unique. However, for each different c there
is a corresponding value of n0 which satisfies the basic relation. This
behavior is illustrated by the next example.
Example (2): In the preceding example it was shown that 3n^2 + 10n ≤ c·n^2 for
c = 4, n0 = 10. We now show that the relation holds for a different value of c
and a corresponding n0.

Consider:   n ≤ n^2                      for n ≥ 1  (obvious)
            10n ≤ 10n^2                  for n ≥ 1  (multiplying both sides by 10)
            3n^2 + 10n ≤ 3n^2 + 10n^2    for n ≥ 1  (adding 3n^2 to both sides)
            3n^2 + 10n ≤ 13n^2           for n ≥ 1  (simplifying)

Or, 3n^2 + 10n ≤ c·n^2 for n ≥ n0, where c = 13 and n0 = 1 (solution).

Therefore, by the basic definition, 3n^2 + 10n = O(n^2).
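Both constant choices can be spot-checked numerically over a range of n (a sanity check, not a proof; the helper name `f` is illustrative):

```python
def f(n):
    """The running-time function from the examples: f(n) = 3n^2 + 10n."""
    return 3 * n**2 + 10 * n

# c = 4, n0 = 10: the bound holds from n = 10 onward
assert all(f(n) <= 4 * n**2 for n in range(10, 1000))

# c = 13, n0 = 1: a looser constant works from n = 1 onward
assert all(f(n) <= 13 * n**2 for n in range(1, 1000))

# below n0 the tighter bound can fail: f(5) = 125 > 4 * 25 = 100
assert f(5) > 4 * 5**2
```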
O-NOTATION EXAMPLE
COMPARISON OF GROWTH RATES

The results of the analysis in the preceding examples are plotted in the
diagram below. It can be seen that the function c·g(n) = 4n^2 (c = 4)
overshoots the function f(n) = 3n^2 + 10n for n ≥ n0 = 10. Also, the function
c·g(n) = 13n^2 (c = 13) grows faster than f(n) = 3n^2 + 10n for n ≥ n0 = 1.

Growth of the functions 13n^2 and 4n^2 versus the function 3n^2 + 10n
Ω-NOTATION
DEFINITION
If f(n) is the running time of an algorithm, and g(n) is some standard growth
function such that for some positive constant c and positive integer n0,

c·g(n) ≤ f(n) for all n ≥ n0

then f(n) = Ω(g(n)) (read: f(n) is Big-Omega of g(n)).

The behavior of f(n) and g(n) is portrayed in the graph. It follows that for
n < n0, f(n) may lie above or below c·g(n), but for all n ≥ n0, f(n) falls
consistently above c·g(n). It also implies that g(n) grows more slowly than
f(n).
Ω-NOTATION
ASYMPTOTIC LOWER BOUND

If f(n) = Ω(g(n)), then the function g(n) is called an asymptotic lower bound
of f(n).

Since the best-case running time of an algorithm is the minimum running time
for any input, g(n) provides a lower bound on the best-case running time.

As before, the notation Ω(g(n)) does not imply that g(n) is the best running
time; it simply means that the best running time is never lower than c·g(n).
Ω-NOTATION
BASIC METHOD
Example (1): Using the basic definition, we show that n^3 ∈ Ω(n^2).

Since n^3 ≥ n^2 for all n ≥ 1, we can select c = 1 and n0 = 1 (the inequality
in fact holds for all n ≥ 0).
Θ-NOTATION
DEFINITION
If f(n) is the running time of an algorithm, and g(n) is some standard growth
function such that for some positive constants c1, c2 and positive integer n0,

0 < c2·g(n) ≤ f(n) ≤ c1·g(n) for all n ≥ n0

then f(n) = Θ(g(n)) (read: f(n) is Theta of g(n)).

The behavior of f(n) and g(n) is portrayed in the graph. It follows that for
n < n0, f(n) may lie above or below g(n), but for all n ≥ n0, f(n) falls
consistently between c2·g(n) and c1·g(n). It also implies that g(n) grows as
fast as f(n). The function g(n) is said to be an asymptotic tight bound for
f(n).
Θ-NOTATION
DEFINITION
Example: We show that ½n(n−1) ∈ Θ(n^2).

We need to prove that c2·g(n) ≤ f(n) ≤ c1·g(n) for all n ≥ n0.

First, consider the upper bound, f(n) ≤ c1·g(n):
½n(n−1) = ½n^2 − ½n ≤ ½n^2 for all n ≥ 0.

Second, consider the lower bound, c2·g(n) ≤ f(n):
½n(n−1) = ½n^2 − ½n ≥ ½n^2 − (½n)(½n) for all n ≥ 2 (since ½n ≤ ¼n^2 when n ≥ 2)
        = ¼n^2.

Hence we can select c2 = ¼, c1 = ½, and n0 = 2.
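The derived constants can be spot-checked numerically over a range of n (a sanity check, not a proof; the helper name `f` is illustrative):

```python
def f(n):
    """f(n) = n(n-1)/2; n(n-1) is always even, so integer division is exact."""
    return n * (n - 1) // 2

# c2 = 1/4 (lower), c1 = 1/2 (upper), n0 = 2:
# the sandwich 1/4 n^2 <= f(n) <= 1/2 n^2 holds from n = 2 onward
assert all(0.25 * n**2 <= f(n) <= 0.5 * n**2 for n in range(2, 1000))
```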
COMPARISON OF GROWTH RATES
REFERENCES

Anany Levitin. Introduction to the Design and Analysis of Algorithms. 2nd ed.
Chapter 1, Sections 1.1, 1.2; Chapter 2, Sections 2.1, 2.2.

Kenneth H. Rosen. Discrete Mathematics and Its Applications. 7th ed.
Chapter 3: Algorithms.
