Advanced Midterms
Dr. S. S. Shehaby
Why Analyze Algorithms?
An algorithm can be analyzed in terms of time efficiency or
space utilization. We will consider only the former right now.
The running time of an algorithm is influenced by several
factors:
• Speed of the machine running the program
• Language in which the program was written. For
example, programs written in assembly language
generally run faster than those written in C or C++,
which in turn tend to run faster than those written in
Java.
• Efficiency of the compiler that created the program
• The size of the input: processing 1000 records will take
more time than processing 10 records.
• Organization of the input: if the item we are searching
for is at the top of the list, it will take less time to find it
than if it is at the bottom (see the search sketch after this list).
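As a concrete illustration of the last point, here is a small C sketch of linear search with a comparison counter (the function, array, and values below are illustrative, not taken from the slides):

#include <stdio.h>

/* Linear search over a[0..n-1]; *comparisons records how many items were examined. */
int linear_search(const int *a, int n, int key, int *comparisons)
{
    *comparisons = 0;
    for (int i = 0; i < n; i++) {
        (*comparisons)++;
        if (a[i] == key)
            return i;          /* best case: key at the top, 1 comparison */
    }
    return -1;                 /* worst case: key at the bottom or absent, n comparisons */
}

int main(void)
{
    int a[] = {7, 3, 9, 1, 5}, c;
    linear_search(a, 5, 7, &c);
    printf("key at the top:    %d comparison(s)\n", c);   /* prints 1 */
    linear_search(a, 5, 5, &c);
    printf("key at the bottom: %d comparison(s)\n", c);   /* prints 5 */
    return 0;
}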
Generalizing Running Time
Analyzing Running Time
T(n), the running time of a particular algorithm on an input of size n, is taken to be the
number of times the instructions in the algorithm are executed. The pseudocode below
illustrates the calculation of the mean (average) of a set of n numbers:
1. n = read input from user
2. sum = 0
3. i = 0
4. while i < n
5.     number = read input from user
6.     sum = sum + number
7.     i = i + 1
8. mean = sum / n
Counting executions: lines 1-3 run once each, the loop test on line 4 runs n + 1 times, lines 5-7 each run n times, and line 8 runs once. The computing time of this algorithm in terms of input size n is therefore T(n) = 3 + (n + 1) + 3n + 1 = 4n + 5.
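A minimal C version of the same algorithm (the step counter is added here purely to mirror the count above; variable names are illustrative):

#include <stdio.h>

int main(void)
{
    int n = 0, i;
    double sum, number, mean;
    long steps = 0;                      /* counts executed statements */

    scanf("%d", &n);          steps++;   /* line 1: read n               */
    sum = 0;                  steps++;   /* line 2                       */
    i = 0;                    steps++;   /* line 3                       */
    for (;;) {
        steps++;                         /* line 4: test, runs n+1 times */
        if (!(i < n)) break;
        scanf("%lf", &number); steps++;  /* line 5                       */
        sum = sum + number;    steps++;  /* line 6                       */
        i = i + 1;             steps++;  /* line 7                       */
    }
    mean = sum / n;           steps++;   /* line 8                       */

    printf("mean = %g, steps = %ld, 4n+5 = %d\n", mean, steps, 4 * n + 5);
    return 0;
}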
The basic idea of big-Oh notation is this: Suppose f and g are both
real-valued functions of a real variable x. If, for large values of x, the
graph of f lies closer to the horizontal axis than the graph of some
multiple of g, then f is of order g, i.e., f(x) = O(g(x)). So, g(x)
represents an upper bound on f(x).
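Stated with explicit constants (a standard formal phrasing of the informal idea above; the instance below is an illustrative choice of f and g):

\[
f(x) = O(g(x)) \iff \exists\, C > 0,\ \exists\, k \ \text{such that}\ |f(x)| \le C\,|g(x)| \ \text{for all } x > k.
\]
\[
\text{For example, } 3x^2 + 5 \le 4x^2 \ \text{for } x \ge 3, \ \text{so } 3x^2 + 5 = O(x^2) \ \text{with } C = 4,\ k = 3.
\]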
Comparison Notations
If there is a positive constant C such that f(n) OPERATOR C·g(n) for all
positive integers n > k, where k is some fixed number, the following
definitions are given:
if OPERATOR is <  then : f(n) = o(g(n))   (little-o)
if OPERATOR is >= then : f(n) = Ω(g(n))   (big-Omega)
if OPERATOR is >  then : f(n) = ω(g(n))   (little-omega)
if OPERATOR is <= then : f(n) = O(g(n))   (big-O)
if OPERATOR is == then : f(n) = Θ(g(n))   (Theta)
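The single-constant statement above matches the usual definitions of O and Ω directly; for Θ, o, and ω the standard definitions quantify the constants a little differently (a sketch for reference):

\[
\begin{aligned}
f(n) = \Theta(g(n)) &\iff \exists\, C_1, C_2 > 0,\ k:\ C_1\, g(n) \le f(n) \le C_2\, g(n) \ \text{for all } n > k,\\
f(n) = o(g(n)) &\iff \forall\, C > 0\ \exists\, k:\ f(n) < C\, g(n) \ \text{for all } n > k,\\
f(n) = \omega(g(n)) &\iff \forall\, C > 0\ \exists\, k:\ f(n) > C\, g(n) \ \text{for all } n > k.
\end{aligned}
\]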
Conventional methods to prove O( )
• n² + 100n + 5 = O(n³)
  n² + 100n + 5 ≤ n² + 105n            for n ≥ 1
                ≤ n² + 105n² = 106n²   for n ≥ 1
                ≤ 106n³, hence n² + 100n + 5 = O(n³)
• n² + 3n − 1 = O(n²)
  n² + 3n − 1 ≤ n² + 3n ≤ n² + 3n² = 4n²   for n ≥ 1, hence n² + 3n − 1 = O(n²)
• 2n⁷ − 6n⁵ + 10n² − 5 = O(n⁷)
  2n⁷ − 6n⁵ + 10n² − 5 ≤ 2n⁷ + 6n⁵ + 10n²
                       ≤ 2n⁷ + 6n⁷ + 10n⁷ = 18n⁷   for n ≥ 1, hence it is O(n⁷)
• To prove f = Θ(g), prove both f = O(g) and g = O(f):
  n² + 3n − 1 = O(n²) and n² = O(n² + 3n − 1), hence n² + 3n − 1 = Θ(n²).
  However, n³ ≠ O(n²), hence n³ ≠ Θ(n²).
Prove O( ) using lim n→∞
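One standard way to establish these relations without hunting for explicit constants is the limit test (a sketch of the usual statement, with an earlier example reused as the worked instance):

\[
\lim_{n\to\infty} \frac{f(n)}{g(n)} = c
\quad\Longrightarrow\quad
\begin{cases}
f(n) = o(g(n)), & c = 0,\\
f(n) = \Theta(g(n)), & 0 < c < \infty,\\
f(n) = \omega(g(n)), & c = \infty.
\end{cases}
\]
\[
\text{Example: } \lim_{n\to\infty} \frac{n^2 + 3n - 1}{n^2} = 1, \ \text{so } n^2 + 3n - 1 = \Theta(n^2) \ \text{(and hence } O(n^2)\text{)}.
\]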
Counting how many times count++ executes in a set of nested loops gives the operation count directly:

count = 0;
for (i = 1; i <= n; i++)
    for (j = 1; j <= i; j++)
        for (k = 1; k <= j; k++)
            count++;
printf("\nCount=%d\n", count);

The statement count++ executes
∑_{i=1}^{n} ∑_{j=1}^{i} j = ∑_{i=1}^{n} i(i+1)/2
                          = 1/2 [ n(n+1)(2n+1)/6 + n(n+1)/2 ]
                          = n(n+1)(n+2)/6 times.

A loop that halves (or doubles) its counter runs a logarithmic number of times:

count = 0; N = 128;
// for (i = 1; i <= N; i *= 2) count++;   /* doubling version, same count */
for (i = N; i >= 1; i /= 2) count++;
printf("\nCount=%d [log2(N)+1=%d]\n", count, (int)round(log(N) / log(2)) + 1);

Output: Count=8 [log2(N)+1=8]
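For completeness, a self-contained program that reproduces both counts (n = 10 for the nested loops is an illustrative choice; N = 128 matches the value above):

#include <stdio.h>
#include <math.h>

int main(void)
{
    /* Triple nested loop: count++ runs sum_{i=1..n} sum_{j=1..i} j = n(n+1)(n+2)/6 times. */
    int n = 10;
    long count = 0;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= i; j++)
            for (int k = 1; k <= j; k++)
                count++;
    printf("Count=%ld  [n(n+1)(n+2)/6 = %ld]\n", count, (long)n * (n + 1) * (n + 2) / 6);

    /* Halving loop: i takes the values N, N/2, ..., 1, i.e. floor(log2 N) + 1 iterations. */
    int N = 128;
    count = 0;
    for (int i = N; i >= 1; i /= 2)
        count++;
    printf("Count=%ld  [log2(N)+1 = %d]\n", count, (int)round(log(N) / log(2)) + 1);

    return 0;
}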
Don’t panic
Integration approximation
[Figure: area under f(x) compared with unit-width rectangles built from the terms f(k)]
For a nonincreasing f, each term f(k) is at least the area under the curve on [k, k+1] and at most the area on [k−1, k], so

∑_{k=1}^{n} f(k) ≥ ∫_1^{n+1} f(x) dx

∑_{k=2}^{n} f(k) ≤ ∫_1^{n} f(x) dx
For a nonnegative, nondecreasing f the inequalities run the other way, and the sum is sandwiched between two integrals:

∫_2^{n+1} f(x) dx + f(1) ≥ ∑_{k=1}^{n} f(k) ≥ ∫_1^{n} f(x) dx

Hence

∑_{k=1}^{n} f(k) = O(g(n+1)), where g(n) = ∫_1^{n} f(x) dx.

Examples:
1 + 2 + 3 + ⋯ = ∑_{i=1}^{n} i        = O((n+1)²/2) = O(n²)
1 + 4 + 9 + ⋯ = ∑_{i=1}^{n} i²       = O((n+1)³/3) = O(n³)
1 + 3 + 6 + ⋯ = ∑_{i=1}^{n} i(i+1)/2 = O(n³/6)     = O(n³)
Examples
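One illustration in the spirit of the integral bound above (the choice f(x) = 1/x is an added example, using the nonincreasing case):

\[
\sum_{k=1}^{n} \frac{1}{k} \;\le\; 1 + \int_{1}^{n} \frac{dx}{x} \;=\; 1 + \ln n,
\qquad\text{so}\qquad
\sum_{k=1}^{n} \frac{1}{k} = O(\log n).
\]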
Thanks