
The following table summarizes the time complexity of Binary search in various cases:

Case        Element x present in array A        Element x not present in A
Best        1 step required                     O(log n) steps
Worst       O(log n) steps                      O(log n) steps
Average     O(log n) steps                      O(log n) steps
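
As an aside (this sketch is not part of the original unit), the logarithmic behaviour comes from the fact that binary search halves the remaining range on every comparison, so at most about log2(n) iterations are needed. A minimal Python sketch, assuming a sorted array a and a search key x:

    # Minimal iterative binary search (illustrative sketch, not the unit's own listing).
    # Each iteration halves the remaining range, so at most about log2(n) iterations run.
    def binary_search(a, x):
        low, high = 0, len(a) - 1
        while low <= high:
            mid = (low + high) // 2
            if a[mid] == x:
                return mid          # best case: x found at the first probe
            elif a[mid] < x:
                low = mid + 1       # discard the left half
            else:
                high = mid - 1      # discard the right half
        return -1                   # x is not present in a

For example, binary_search([2, 5, 8, 12, 16], 12) returns 3, while searching for a missing key returns -1 after about log2(n) probes.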

1.7 ASYMPTOTIC NOTATIONS (O, Ω, and Θ)

Asymptotic notations have been used briefly in earlier sections. In this section we elaborate on these notations in detail. They will be taken up further in the next unit of this block.

These notations are used to describe the running time of an algorithm in terms of functions whose domain is the set of natural numbers, N = {1, 2, …}. Such notations are convenient for describing the worst-case running time function T(n), where n is the problem size (input size).

The complexity function can also be used to compare two algorithms P and Q that perform the same task.

The basic asymptotic notations are:

1. O (Big-“Oh”) notation [maximum number of steps to solve a problem (upper bound)].

2. Ω (Big-“Omega”) notation [minimum number of steps to solve a problem (lower bound)].

3. Θ (Theta) notation [average number of steps to solve a problem; used to express both an upper and a lower bound of a given function f(n)].

1.7.1 THE NOTATION O (BIG ‘Oh’)

Big ‘Oh’ notation is used to express an asymptotic upper bound (maximum number of steps) for a given function f(n). Let f(n) and g(n) be two positive functions, each from the set of natural numbers (domain) to the positive real numbers.

We say that the function f(n) = O(g(n)) [read as “f of n is big ‘Oh’ of g of n”], if there exist two positive constants C and n0 such that

    f(n) ≤ C · g(n) for all n ≥ n0.     …(1)
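
The definition can also be checked numerically. The following Python sketch is only an illustration; the functions f(n) = 3n + 2 and g(n) = n and the constants C = 4, n0 = 2 are assumed here, not taken from the examples below.

    # Hedged illustration of the O definition: f, g, C and n0 are assumed values.
    def f(n): return 3 * n + 2
    def g(n): return n

    C, n0 = 4, 2
    # Verify f(n) <= C * g(n) for every n from n0 up to 1000.
    print(all(f(n) <= C * g(n) for n in range(n0, 1001)))   # prints True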

The intuition behind O-notation is shown in Figure 1.

[Figure 1: the curves f(n) and C·g(n) plotted against n, with the crossover point n0 marked.]

For all values of n to the right of n0, the value of f(n) always lies on or below C·g(n).

To understand O-notation, let us consider the following examples:

Example 1.1: For the function f(n) defined by …, show that …

Solution:

(i) To show that …, we have to show that … ; since we have found the required constants C and n0. Hence … .

Remark: The values of C and n0 are not unique. For example, to satisfy the above inequality (1), we can also take … . So, depending on the value of C, the value of n0 also changes. Thus any values of C and n0 which satisfy the given inequality form a valid solution.
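
To make the remark concrete, here is a small hedged check (again using the assumed f(n) = 3n + 2 and g(n) = n, not the unit's own example): both the pair (C = 4, n0 = 2) and the pair (C = 5, n0 = 1) satisfy the inequality, so neither C nor n0 is unique.

    # Two different valid (C, n0) pairs for the assumed f(n) = 3n + 2, g(n) = n.
    def f(n): return 3 * n + 2
    def g(n): return n

    for C, n0 in [(4, 2), (5, 1)]:
        ok = all(f(n) <= C * g(n) for n in range(n0, 1001))
        print(C, n0, ok)   # both lines end with True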

(ii) To show that …, we have to show that … . Let C = 3.

Value of n        f(n)        3 · g(n)
…                 6           3
…                 15          24
…                 27          81
…                 …           …

Thus we have found the required constants C and n0. Hence … .

Do questions (iii), (iv) and (v) as an assignment.

1.7.2 THE NOTATION Ω (BIG ‘Omega’)

O-notation provides an asymptotic upper bound for a function; Ω-notation provides an asymptotic lower bound for a given function.

We say that the function f(n) = Ω(g(n)) [read as “f of n is big ‘Omega’ of g of n”], if and only if there exist two positive constants C and n0 such that

    f(n) ≥ C · g(n) for all n ≥ n0.     …(1)
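
As with O-notation, the definition can be checked numerically. The sketch below is an illustration only; the choices f(n) = 3n + 2, g(n) = n, C = 3 and n0 = 1 are assumed, not taken from the example that follows.

    # Hedged illustration of the Ω definition: f, g, C and n0 are assumed values.
    def f(n): return 3 * n + 2
    def g(n): return n

    C, n0 = 3, 1
    # Verify f(n) >= C * g(n) for every n from n0 up to 1000.
    print(all(f(n) >= C * g(n) for n in range(n0, 1001)))   # prints True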

The intuition behind Ω-notation is shown in Figure 2.

[Figure 2: the curves f(n) and C·g(n) plotted against n, with n0 marked; f(n) = Ω(g(n)).]

Note that for all values of n ≥ n0, f(n) always lies on or above C·g(n).

To understand Ω-notation, let us consider the following examples:

Example 1.1: For the function f(n) defined by …, show that …

Solution:

(i) To show that …, we have to show that … ; since we have found the required constants C and n0. Hence … .

(ii) To show that …, we have to show that no values of C and n0 exist which satisfy the following equation (1). We can prove this result by contradiction.

Let … ; then there exist positive constants C and n0 such that … .

But for any …, inequality (1) is not satisfied ⇒ contradiction.

Thus … .

Do questions (iii), (iv) and (v) as an assignment.

1.7.3 THE NOTATION Θ (Theta)

Θ-notation simultaneously provides both an asymptotic lower bound and an asymptotic upper bound for a given function.

Let f(n) and g(n) be two positive functions, each from the set of natural numbers (domain) to the positive real numbers. In some cases, we have … .

We say that the function f(n) = Θ(g(n)) [read as “f of n is Theta of g of n”], if and only if there exist three positive constants C1, C2 and n0 such that

    C1 · g(n) ≤ f(n) ≤ C2 · g(n) for all n ≥ n0.     …(1)

(Note that this inequality (1) represents two conditions to be satisfied simultaneously, viz. C1 · g(n) ≤ f(n) and f(n) ≤ C2 · g(n).)

The following Figure 3 shows the intuition behind the Θ-notation.

[Figure 3: the curves C1·g(n), f(n) and C2·g(n) plotted against n, with n0 marked; f(n) = Θ(g(n)).]

Note that for all values of n to the right of n0, the value of f(n) lies at or above C1·g(n) and at or below C2·g(n).
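
Again as a hedged numerical check (the choices f(n) = 3n + 2, g(n) = n, C1 = 3, C2 = 4 and n0 = 2 are assumed here, not taken from the example below), the two-sided inequality can be verified directly:

    # Hedged illustration of the Θ definition: f, g, C1, C2 and n0 are assumed values.
    def f(n): return 3 * n + 2
    def g(n): return n

    C1, C2, n0 = 3, 4, 2
    # Verify C1 * g(n) <= f(n) <= C2 * g(n) for every n from n0 up to 1000.
    print(all(C1 * g(n) <= f(n) <= C2 * g(n) for n in range(n0, 1001)))   # prints True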

To understand Θ-notation, let us consider the following examples:

Example 1.1: For the function f(n) defined by …, show that …

Solution: To show that …, we have to show that …

To satisfy this inequality (1) simultaneously, we have to find the values of C1, C2 and n0, using the following inequalities … .

Inequality (2) is satisfied for …

Inequality (3) is satisfied for …


Hence inequality (1) is simultaneously satisfied for … .

Hence … .

(ii) We can prove by contradiction that no values of C1, C2 and n0 … .

Let … .

Left-side inequality: … ; this is satisfied for … .

Right-side inequality: … ; for … this inequality is not satisfied.

Thus … .

(iii) Do this yourself.

1.8 SOME USEFUL THEOREMS FOR O, Ω AND Θ

The following theorems are quite useful when you are dealing with (or solving problems involving) these asymptotic notations.

Theorem 1: If f(n) = a_m n^m + a_(m-1) n^(m-1) + … + a_1 n + a_0 and a_m > 0, then f(n) = O(n^m).

Proof:

f(n) ≤ (|a_m| + |a_(m-1)| + … + |a_1| + |a_0|) · n^m for all n ≥ 1.

Let us assume C = |a_m| + |a_(m-1)| + … + |a_1| + |a_0| and n0 = 1.

Then f(n) ≤ C · n^m for all n ≥ n0, i.e. f(n) = O(n^m).

Theorem 2: If f(n) = a_m n^m + a_(m-1) n^(m-1) + … + a_1 n + a_0 and a_m > 0, then f(n) = Ω(n^m).

Proof:

Since a_m > 0, f(n)/n^m → a_m as n → ∞; hence there exists an n0 such that f(n) ≥ (a_m/2) · n^m for all n ≥ n0. Taking C = a_m/2, we get f(n) = Ω(n^m).

Theorem 3: If f(n) = a_m n^m + a_(m-1) n^(m-1) + … + a_1 n + a_0 and a_m > 0, then f(n) = Θ(n^m).

Proof: From Theorem 1 and Theorem 2,

f(n) = O(n^m)     …(1)
f(n) = Ω(n^m)     …(2)

From (1) and (2) we can say that f(n) = Θ(n^m).

Example 1: By applying the above theorems, find the O-notation, Ω-notation and Θ-notation for the following functions.

(i) …

(ii) …

Solution:

(i) Here …, so by Theorems 1, 2 and 3: …, … and … .

(ii) …, so by Theorems 1, 2 and 3: …, … and … .
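
As a hedged illustration of how the theorems are applied (the polynomial below is assumed for demonstration and is not the function used in the example above), consider f(n) = 4n^3 + 2n + 5, where m = 3. Theorems 1, 2 and 3 give f(n) = O(n^3), f(n) = Ω(n^3) and f(n) = Θ(n^3), which the following sketch verifies numerically with C1 = 4, C2 = 11 and n0 = 1:

    # Hedged check of the Theta bound for an assumed polynomial f(n) = 4n^3 + 2n + 5.
    # C2 = |4| + |2| + |5| = 11 comes from the Theorem 1 argument; C1 = 4 works since 2n + 5 > 0.
    def f(n): return 4 * n**3 + 2 * n + 5

    C1, C2, n0 = 4, 11, 1
    print(all(C1 * n**3 <= f(n) <= C2 * n**3 for n in range(n0, 1001)))   # prints True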

Check Your Progress 1

Q.1: …, where O(n) stands for order n, is:
a) …  b) …  c) …  d) none of these

Q.2: What is the running time to retrieve an element from an array of size n (in the worst case)?
a) …  b) …  c) …  d) none of these

Q.3: The time complexity for the following function is:
a) …  b) …  c) …  d) …

Q.4: The time complexity for the following function is:

………

a) …  b) …  c) …  d) …

Q.5: Define an algorithm. What are the various properties of an algorithm?

Q.6: What are the various fundamental techniques used to design an algorithm efficiently? Also, write two problems for each that follow these techniques.

Q.7: Differentiate between profiling and debugging.

Q.8: Differentiate between the asymptotic notations O, Ω and Θ.

Q.9: Define time complexity. Explain how the time complexity of an algorithm is computed.

Q.10: Let f(n) and g(n) be two asymptotically positive functions. Prove or disprove the following (using the basic definitions of O, Ω and Θ):

g) …

k) … where …

