Advanced Midterms

The document discusses the analysis of algorithms, focusing on time complexity. It highlights the factors that influence running time, such as input size and organization, introduces Big-Oh notation and the related comparison notations used to describe the growth of functions, works through time-complexity calculations for several algorithms, and shows how to prove time-complexity bounds using limits and integration approximations.


Time Complexity

Dr. S. S. Shehaby

S:1
Why Analyze Algorithms?
An algorithm can be analyzed in terms of time efficiency or
space utilization. We will consider only the former right now.
The running time of an algorithm is influenced by several
factors:
• Speed of the machine running the program
• Language in which the program was written. For
example, programs written in assembly language
generally run faster than those written in C or C++,
which in turn tend to run faster than those written in
Java.
• Efficiency of the compiler that created the program
• The size of the input: processing 1000 records will take
more time than processing 10 records.
• Organization of the input: if the item we are searching
for is at the top of the list, it will take less time to find it
than if it is at the bottom.

S:2
Generalizing Running Time

Input size n   log n   n       n log n   n²      n³       2ⁿ
5              3       5       15        25      125      32
10             4       10      33        100     10³      10³
100            7       100     664       10⁴     10⁶      10³⁰
1000           10      1000    10⁴       10⁶     10⁹      10³⁰⁰
10000          13      10000   10⁵      10⁸      10¹²     10³⁰⁰⁰

Comparing the growth of the running time as the input grows to the growth of known functions.

S:3
Analyzing Running Time
T(n), the running time of a particular algorithm on an input of size n, is taken to be the
number of times the instructions in the algorithm are executed. The pseudocode below
illustrates the calculation of the mean (average) of a set of n numbers:
1. n = read input from user
2. sum = 0
3. i = 0
4. while i < n
5. number = read input from user
6. sum = sum + number
7. i = i + 1
8. mean = sum / n
The computing time for this algorithm in terms of input size n is: T(n) = 4n + 5.

Statement   Number of times
1           1
2           1
3           1
4           n + 1
5           n
6           n
7           n
8           1
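
A minimal C rendering of the pseudocode above (the stdin-based I/O calls are an illustrative assumption, since the slide only says "read input from user"):

#include <stdio.h>

int main(void) {
    int n, i = 0;                    /* statement 3: i = 0 */
    double number, sum = 0.0, mean;  /* statement 2: sum = 0 */

    scanf("%d", &n);            /* statement 1: n = read input (assumes n > 0, as the pseudocode does) */
    while (i < n) {             /* statement 4: the test runs n + 1 times */
        scanf("%lf", &number);  /* statement 5: number = read input */
        sum = sum + number;     /* statement 6 */
        i = i + 1;              /* statement 7 */
    }
    mean = sum / n;             /* statement 8 */
    printf("mean = %f\n", mean);
    return 0;
}

Adding the counts from the table, 1 + 1 + 1 + (n + 1) + n + n + n + 1, reproduces T(n) = 4n + 5.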
S:4
Big-Oh Notation

Definition 1: Let f(n) and g(n) be two functions. We write:


f(n) = O(g(n)) or f = O(g)
(read "f of n is big oh of g of n" or "f is big oh of g") if there is a positive
constant C such that f(n) <= C * g(n) for all positive integers n > k,
where k is some fixed number.

The basic idea of big-Oh notation is this: Suppose f and g are both
real-valued functions of a real variable x. If, for large values of x, the
graph of f lies closer to the horizontal axis than the graph of some
multiple of g, then f is of order g, i.e., f(x) = O(g(x)). So, g(x)
represents an upper bound on f(x).
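
As a worked instance of Definition 1 (the choice C = 5, k = 1 is just one of many valid choices):

3n + 2 ≤ 3n + 2n = 5n for all n > 1, hence 3n + 2 = O(n) with C = 5 and k = 1.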

S:5
Comparison Notations

If there is a positive constant C such that f(n) OPERATOR C * g(n) for all
positive integers n > k, where k is some fixed number, the following
definitions are given:
if OPERATOR is <  then: f(n) = o(g(n))   (little-oh)
if OPERATOR is >= then: f(n) = Ω(g(n))   (big-Omega)
if OPERATOR is >  then: f(n) = ω(g(n))   (little-omega)
if OPERATOR is <= then: f(n) = O(g(n))   (big-Oh)
if OPERATOR is == then: f(n) = Θ(g(n))   (big-Theta)
(Strictly, o and ω require the inequality to hold for every positive C, not just one, and == should be read as "equal up to constant factors".)
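
A few concrete instances of these notations (standard facts, not taken from the slides): n = o(n²) and n² = ω(n) are strict bounds, n = O(n²) and n² = Ω(n) are loose bounds, and 3n² + n = Θ(n²) is a tight bound.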

S:6
Conventional methods to prove O( )
• n² + 100n + 5 = O(n³)
  n² + 100n + 5 < n² + 105n        for n > 1
                < n² + 105n²       for n > 1
                = 106n² ≤ 106n³ = O(n³)
• n² + 3n − 1 = O(n²)
  n² + 3n − 1 ≤ n² + 3n ≤ 4n² = O(n²)
• 2n⁷ − 6n⁵ + 10n² − 5 = O(n⁷)
  2n⁷ − 6n⁵ + 10n² − 5 < 2n⁷ + 6n⁵ + 10n²
                       ≤ 2n⁷ + 6n⁷ + 10n⁷ = 18n⁷ = O(n⁷)
• To prove f = Θ(g), prove f = O(g) and g = O(f):
  n² + 3n − 1 = O(n²) and n² = O(n² + 3n − 1), hence n² + 3n − 1 = Θ(n²).
  However, n³ ≠ O(n²), hence n³ ≠ Θ(n²).
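
Equivalently, a standard restatement not spelled out on the slide: f = Θ(g) iff there exist positive constants C₁, C₂ and a number k such that

C₁ · g(n) ≤ f(n) ≤ C₂ · g(n) for all n > k.

For n² + 3n − 1 = Θ(n²), one valid choice is C₁ = 1, C₂ = 4, k = 1.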

S:7
Prove O( ) using lim as n → ∞

An easy way to prove f(n) = o(g(n)) is to show that lim_{n→∞} f(n)/g(n) = 0. Read left to right, each class below grows strictly slower than the next:

O(1) = o(log_α n) = o(n^β) = o(2ⁿ) = o(3ⁿ) = ⋯ = o(nⁿ)
where α is any real number > 1 and β is any real number > 0.

Also: log_{α₁} n = Θ(log_{α₂} n), since logarithms with different bases differ only by a constant factor.
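
A worked instance of the limit test, using L'Hôpital's rule:

\[
\lim_{n\to\infty} \frac{\log_{\alpha} n}{n^{\beta}}
= \lim_{n\to\infty} \frac{1/(n \ln \alpha)}{\beta \, n^{\beta - 1}}
= \lim_{n\to\infty} \frac{1}{\beta \ln \alpha \cdot n^{\beta}} = 0,
\]

so log_α n = o(n^β) for every α > 1 and β > 0.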
S:8
Loop Examples
int count = 0, N = 10, i, j, k;

for (i = 1; i <= N; i++)
    for (j = 1; j <= i; j++) count++;
printf("Count=%d [N(N+1)/2 = O(N^2)]\n", count);

Output: Count=55 [N(N+1)/2=O(N^2)]

\[
T(n) = \sum_{i=1}^{n} \sum_{j=1}^{i} O(1) = \sum_{i=1}^{n} i = \frac{n(n+1)}{2} = O(n^2)
\]
(1 + 2 + 3 + ⋯ + 10 = 55 for n = 10)

count = 0;
for (i = 1; i <= N; i++)
    for (j = 1; j <= i; j++)
        for (k = 1; k <= i; k++) count++;
printf("\nCount=%d [N(N+1)(2N+1)/6 = O(N^3) = %d]\n",
       count, N*(N+1)*(2*N+1)/6);

Output: Count=385 [N(N+1)(2N+1)/6=O(N^3)=385]

\[
T(n) = \sum_{i=1}^{n} \sum_{j=1}^{i} \sum_{k=1}^{i} O(1) = \sum_{i=1}^{n} i^2 = \frac{n(n+1)(2n+1)}{6} = O(n^3)
\]
(1 + 4 + 9 + ⋯ + 100 = 385 for n = 10)
S:9
Loop Examples
count = 0;
for (i = 1; i <= N; i++)
    for (j = 1; j <= i; j++)
        for (k = 1; k <= j; k++) count++;
printf("\nCount=%d [N(N+1)(N+2)/6 = O(N^3) = %d]\n",
       count, N*(N+1)*(N+2)/6);

Output: Count=220 [N(N+1)(N+2)/6=O(N^3)=220]

Time complexity of the algorithm with input size n:
\[
T(n) = \sum_{i=1}^{n} \sum_{j=1}^{i} \sum_{k=1}^{j} O(1)
     = \sum_{i=1}^{n} \sum_{j=1}^{i} j
     = \sum_{i=1}^{n} \frac{i(i+1)}{2}
     = \frac{1}{2}\left[\frac{n(n+1)(2n+1)}{6} + \frac{n(n+1)}{2}\right]
     = \frac{n(n+1)(n+2)}{6} = O(n^3)
\]
(1 + 3 + 6 + ⋯ + 45 + 55 = 220 for n = 10)

count = 0; N = 128;
// for (i = 1; i <= N; i *= 2) count++;   /* counting upward gives the same count */
for (i = N; i >= 1; i /= 2) count++;
printf("\nCount=%d [log2(N)+1=%d]\n",
       count, (int)round(log(N)/log(2)) + 1);

Output: Count=8 [log2(N)+1=8]
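
More generally (a standard count, not derived on the slide): the loop body runs once for each value i = N, ⌊N/2⌋, ⌊N/4⌋, …, 1, that is ⌊log₂ N⌋ + 1 times, so the loop is O(log N); for N = 128 this is 7 + 1 = 8.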
S:10
Don’t panic
Integration approximation
[Figure: a monotonic curve f(x) with unit-width rectangles of height f(k) bounding the area under the curve.]

For a non-decreasing f(x):
\[
\sum_{k=1}^{n} f(k) \le \int_{1}^{n+1} f(x)\,dx
\]
\[
\sum_{k=2}^{n} f(k) \ge \int_{1}^{n} f(x)\,dx
\]

S:11
Don’t panic
Integration approximation
Combining the two bounds, for a non-decreasing f(x):
\[
\int_{2}^{n+1} f(x)\,dx + f(1) \;\ge\; \sum_{k=1}^{n} f(k) \;\ge\; \int_{1}^{n} f(x)\,dx
\]

\[
\sum_{k=1}^{n} f(k) = O\big(g(n+1)\big), \quad \text{where } g(n) = \int_{1}^{n} f(x)\,dx
\]

\[
1 + 2 + 3 + \cdots = \sum_{i=1}^{n} i = O\!\left(\frac{(n+1)^2}{2}\right) = O(n^2)
\]
\[
1 + 4 + 9 + \cdots = \sum_{i=1}^{n} i^2 = O\!\left(\frac{(n+1)^3}{3}\right) = O(n^3)
\]
\[
1 + 3 + 6 + \cdots = \sum_{i=1}^{n} \frac{i(i+1)}{2} = O\!\left(\frac{n^3}{6}\right) = O(n^3)
\]
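
The same technique handles sums with no elementary closed form; for example (a standard application, not on the slide), since ln x is non-decreasing:
\[
\log 1 + \log 2 + \cdots + \log n = \sum_{k=1}^{n} \ln k \le \int_{1}^{n+1} \ln x \, dx = (n+1)\ln(n+1) - n = O(n \log n)
\]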

S:12
Examples

O(1) = o(log n) = o(n) = o(n log n) = o(n²) = o(n³) = o(2ⁿ)

O(g₁(n)) + O(g₂(n)) = O(g₁(n) + g₂(n)) = O(g₁(n)) if g₂(n) = o(g₁(n))
O(g₁(n)) × O(g₂(n)) = O(g₁(n) × g₂(n))

• for (i = 0; i < n; i += d)      O(g(n));  // n/d iterations ⇒ O(n · g(n))
• for (i = 0; i <= 10000; i += d) O(g(n));  // constant iteration count ⇒ O(g(n))
• for (i = 1; i <= n; i *= d)     O(g(n));  // ⇒ O(log_d(n) · g(n))
• for (i = 2; i*i <= n; i++)      O(g(n));  // ⇒ O(n^(1/2) · g(n))
• for (i = 2; i < n; i *= i)      O(g(n));  // ⇒ O(log₂ log₂(n) · g(n)); see the check below
• for (int i = 0; i < n; i++)
      for (int j = 1; j < n; j *= 2) O(1);  // ⇒ O(n log₂(n))
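
A small C program checking the doubling (log) and squaring (log log) iteration counts from the list above empirically (the value of n and the printed text are illustrative choices, not from the slides):

#include <math.h>
#include <stdio.h>

int main(void) {
    int n = 1000000, count, i;
    long long li;  /* wide type so that li *= li does not overflow early */

    /* doubling loop: roughly log2(n) iterations */
    count = 0;
    for (i = 1; i <= n; i *= 2) count++;
    printf("i *= 2 loop: %d iterations, log2(n) = %.2f\n",
           count, log2((double)n));

    /* squaring loop: roughly log2(log2(n)) iterations */
    count = 0;
    for (li = 2; li < n; li *= li) count++;
    printf("i *= i loop: %d iterations, log2(log2(n)) = %.2f\n",
           count, log2(log2((double)n)));
    return 0;
}

Running it prints 20 iterations against log2(10⁶) ≈ 19.93, and 5 iterations against log2(log2(10⁶)) ≈ 4.32.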

S:13
Thanks

S:14
