
Algorithms and Problem Solving (15B17CI411)

Module 1: Lecture 2

Jaypee Institute of Information Technology (JIIT)


A-10, Sector 62, Noida
Module 1
• Introduction to problem solving approach; Asymptotic Analysis:
Growth of Functions and Solving Recurrences; Notations: Big O, Big
Omega, Big Theta, Little o; Empirical analysis of sorting and searching
algorithms – Merge sort, Quick sort, Heap sort, Radix sort, Count
sort, Binary search, and Median search
Outline – Introduction to Course

1. Algorithms
2. How to analyze an Algorithm
3. Growth of Functions - Asymptotic Notations
Algorithms
• Expectation from an algorithm
  ❖ Correctness
    ⮚ Correct: The algorithm must produce the correct result.
    ⮚ Produces an incorrect answer: Even if it fails to give the correct result every time,
      there is control on how often it gives a wrong result.
    ⮚ Approximation algorithm: The exact solution is not found, but a near-optimal solution
      can be found.
  ❖ Less resource usage
Algorithms
• Time taken by an algorithm
  o Performance measurement or a posteriori analysis: implementing the
    algorithm on a machine and then measuring the time taken by the system to
    execute the program successfully (a timing sketch in C follows below).
  o Performance evaluation or a priori analysis: analyzing the algorithm before
    implementing it on a system.
    ❑ How long the algorithm takes is represented as a function of the size of the
      input: f(n) gives how long it takes if 'n' is the size of the input.
    ❑ How fast the function that characterizes the running time grows with the input
      size is the "rate of growth of the running time".
    The algorithm with the lower rate of growth of running time is considered better.
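A minimal a posteriori (empirical) timing sketch in C, not from the slides: it times the standard library qsort on a random input using clock(); any algorithm under study (for example, the insertion sort analyzed next) could be timed the same way.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* comparator for qsort; qsort is used here only as a stand-in algorithm to time */
static int cmp_int(const void *a, const void *b) {
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

int main(void) {
    int n = 1000000;                       /* input size */
    int *A = malloc(n * sizeof *A);
    for (int i = 0; i < n; i++)
        A[i] = rand();                     /* random input of size n */

    clock_t start = clock();
    qsort(A, n, sizeof *A, cmp_int);       /* run the algorithm on the machine */
    clock_t end = clock();

    /* a posteriori analysis: measured CPU time for this input on this machine */
    printf("n = %d, time = %.3f s\n", n, (double)(end - start) / CLOCKS_PER_SEC);

    free(A);
    return 0;
}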
Algorithms & Technology
• Latest processor vs. a good algorithm
How to analyze an algorithm??
Example: Insertion Sort
Pseudo code:
for j = 2 to A.length ---------------------------------------------- C1
    key = A[j] ------------------------------------------------------ C2
    // Insert A[j] into sorted array A[1...j-1] --------------------- C3
    i = j - 1 ------------------------------------------------------- C4
    while i > 0 and A[i] > key -------------------------------------- C5
        A[i+1] = A[i] ----------------------------------------------- C6
        i = i - 1 --------------------------------------------------- C7
    A[i+1] = key ---------------------------------------------------- C8
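A runnable C version of the pseudocode above (not from the slides), using 0-based array indexing, so the outer loop starts at j = 1 rather than 2:

#include <stdio.h>

/* Insertion sort following the pseudocode; A holds n elements indexed 0..n-1. */
void insertion_sort(int A[], int n) {
    for (int j = 1; j < n; j++) {          /* C1: for j = 2 to A.length        */
        int key = A[j];                    /* C2: key = A[j]                   */
                                           /* C3: insert A[j] into A[0..j-1]   */
        int i = j - 1;                     /* C4: i = j - 1                    */
        while (i >= 0 && A[i] > key) {     /* C5: scan left while larger       */
            A[i + 1] = A[i];               /* C6: shift element right          */
            i = i - 1;                     /* C7: move left                    */
        }
        A[i + 1] = key;                    /* C8: drop key into its position   */
    }
}

int main(void) {
    int A[] = {5, 2, 4, 6, 1, 3};
    int n = (int)(sizeof A / sizeof A[0]);
    insertion_sort(A, n);
    for (int k = 0; k < n; k++)
        printf("%d ", A[k]);               /* prints: 1 2 3 4 5 6 */
    printf("\n");
    return 0;
}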
How to analyze an algorithm??
Example: Insertion Sort
Let Ci be the cost of the ith line.
Since comment lines will not incur any cost, C3 = 0.
Let t_j be the number of times the while-loop test (line C5) is executed for that value of j.

Cost    No. of times executed
C1      n
C2      n-1
C3      0
C4      n-1
C5      Σ_{j=2..n} t_j
C6      Σ_{j=2..n} (t_j - 1)
C7      Σ_{j=2..n} (t_j - 1)
C8      n-1
How to analyze an algorithm??
Example: Insertion Sort
Run time T(n) = C1·n + C2·(n-1) + 0·(n-1) + C4·(n-1) + C5·Σ_{j=2..n} t_j
                + C6·Σ_{j=2..n} (t_j - 1) + C7·Σ_{j=2..n} (t_j - 1) + C8·(n-1)

Best Case: When the array is already sorted, all t_j values are 1, so
T(n) = (C1 + C2 + C4 + C5 + C8)·n - (C2 + C4 + C5 + C8),
which is a linear function of n.

Worst Case: It occurs when the array is reverse sorted and t_j = j. Then
Σ_{j=2..n} t_j = n(n+1)/2 - 1 and Σ_{j=2..n} (t_j - 1) = n(n-1)/2, so
T(n) has the form a·n² + b·n + c, a quadratic function of n.

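As a small check of the best and worst cases (illustrative, not from the slides), the following C program counts how many times the while-loop body executes, i.e. the Σ(t_j - 1) term, for an already-sorted array versus a reverse-sorted one:

#include <stdio.h>

/* Insertion sort that counts how many times the while-loop body executes. */
static long sort_and_count_shifts(int A[], int n) {
    long shifts = 0;
    for (int j = 1; j < n; j++) {
        int key = A[j];
        int i = j - 1;
        while (i >= 0 && A[i] > key) {
            A[i + 1] = A[i];
            i--;
            shifts++;              /* one execution of the loop body */
        }
        A[i + 1] = key;
    }
    return shifts;
}

int main(void) {
    enum { N = 1000 };
    int sorted[N], reversed[N];
    for (int i = 0; i < N; i++) {
        sorted[i] = i;             /* best case: already sorted  */
        reversed[i] = N - i;       /* worst case: reverse sorted */
    }
    printf("sorted:   %ld shifts\n", sort_and_count_shifts(sorted, N));   /* 0 */
    printf("reversed: %ld shifts\n", sort_and_count_shifts(reversed, N)); /* N(N-1)/2 = 499500 */
    return 0;
}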
How to analyze an algorithm??
Why consider worst-case running time???
• The worst-case running time gives a
guaranteed upper bound on the running time
for any input.
• For some algorithms, the worst case occurs
often. For example, when searching, the worst
case often occurs when the item being
searched for is not present, and searches for
absent items may be frequent.

• Why not analyze the average case? Because it's often about as bad as the worst case.
Growth of Functions - Asymptotic Notations
• It is a way to describe the characteristics of a function in the limit.
• It describes the rate of growth of functions.
• It is a way to compare “sizes” of functions
Growth of Functions - Asymptotic Notations
Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are
positive constants c and n0 such that
f(n) ≤ c·g(n) for all n ≥ n0

Example: 7n - 2 is O(n)

f(n) = 7n - 2 and g(n) = n
f(n) = O(n) means f(n) ≤ c·n
7n - 2 ≤ c·n (need c > 0 and n0 ≥ 1 such that 7n - 2 ≤ c·n for n ≥ n0)
This is true for c = 7 and n0 = 1.
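As a quick illustrative check (not part of the slides), a few lines of C confirm the chosen witnesses c = 7, n0 = 1 over a test range:

#include <stdio.h>

int main(void) {
    const long c = 7, n0 = 1;                 /* witnesses from the example */
    int holds = 1;
    for (long n = n0; n <= 1000000; n++) {    /* spot-check a range of n */
        if (7 * n - 2 > c * n) {              /* would violate f(n) <= c*g(n) */
            holds = 0;
            break;
        }
    }
    printf("7n-2 <= 7n for all tested n >= 1: %s\n", holds ? "yes" : "no");
    return 0;
}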
Growth of Functions - Asymptotic Notations
To prove using Big-O:
1. Determine f(n) and g(n).
2. Write the inequality based on the definition.
3. Choose a c such that the inequality is true.
4. If you can find such a c, then f(n) = O(g(n)).
5. If not, then f(n) ≠ O(g(n)).

Example 2: 3n² − 100n + 6 = O(n²)
f(n) = 3n² − 100n + 6
g(n) = n²
⇒ 3n² − 100n + 6 ≤ c·n² for some c
If c = 3, then 3n² − 100n + 6 ≤ 3·n² (true for n ≥ 1)
Hence, 3n² − 100n + 6 = O(n²)

Example 3: 3n² − 100n + 6 = O(n³)
f(n) = 3n² − 100n + 6
g(n) = n³
⇒ 3n² − 100n + 6 ≤ c·n³ for some c
⇒ If c = 1: 3n² − 100n + 6 ≤ n³ (true for n ≥ 1)
Hence, 3n² − 100n + 6 = O(n³)

Example 4: 3n² − 100n + 6 ≠ O(n)
f(n) = 3n² − 100n + 6
g(n) = n
For any constant c, c·n < 3n² − 100n + 6 once n is large enough, so no c works.
Hence, 3n² − 100n + 6 ≠ O(n)
Big-Oh and Growth Rate
• The big-Oh notation gives an upper bound on the growth rate of a
function
• The statement “f(n) is O(g(n))” means that the growth rate of f(n) is
no more than the growth rate of g(n)
• We can use the big-Oh notation to rank functions according to their
growth rate

                    f(n) is O(g(n))    g(n) is O(f(n))
g(n) grows more     Yes                No
f(n) grows more     No                 Yes
Same growth         Yes                Yes
Big-Oh Rules
• If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.,
1. Drop lower-order terms
2. Drop constant factors
• Use the smallest possible class of functions
• Say "2n is O(n)" instead of "2n is O(n²)"
• Use the simplest expression of the class
• Say “2n + 10 is O(n)” instead of “2n + 10 is O(2n)”

Math you need to Review
• Summations
• Logarithms and Exponents
  properties of logarithms:
    log_b(xy) = log_b x + log_b y
    log_b(x/y) = log_b x - log_b y
    log_b(x^a) = a·log_b x
    log_b a = log_x a / log_x b
  properties of exponentials:
    a^(b+c) = a^b · a^c
    a^(bc) = (a^b)^c
    a^b / a^c = a^(b-c)
    b = a^(log_a b)
    b^c = a^(c·log_a b)
• Proof techniques
• Basic probability
Growth of Functions - Asymptotic Notations
big-Omega
f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 ≥ 1
such that
f(n) ≥ c·g(n) for n ≥ n0
Example 1: show that f(n) = Ω(n)

To show that f(n) = Ω(n), we have to show that
f(n) ≥ c·n for some c > 0 and all n ≥ n0   ...(1)

⇒ Inequality (1) is satisfied for c = 1 and for all n
⇒ Hence, f(n) = Ω(n)
Example 2: 3n² − 100n + 6 = Ω(n²)

f(n) = 3n² − 100n + 6
g(n) = n²
⇒ 3n² − 100n + 6 ≥ c·n² for some c
If c = 2, then 3n² − 100n + 6 ≥ 2·n² (true for n ≥ 100)
Hence, 3n² − 100n + 6 = Ω(n²)
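A similar illustrative spot-check (not from the slides) for this Ω claim, finding the first n at which 3n² − 100n + 6 ≥ 2n² holds within a test range:

#include <stdio.h>

int main(void) {
    const long c = 2;                              /* witness constant from the example */
    long n0 = -1;
    for (long n = 1; n <= 1000; n++) {
        long f = 3 * n * n - 100 * n + 6;          /* f(n) = 3n^2 - 100n + 6 */
        long g = c * n * n;                        /* c * g(n) = 2n^2        */
        if (f >= g && n0 < 0)
            n0 = n;                                /* first n where the bound holds */
        else if (f < g)
            n0 = -1;                               /* reset if the bound fails later */
    }
    printf("f(n) >= 2n^2 holds for all n >= %ld (within tested range)\n", n0);
    return 0;
}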
Growth of Functions - Asymptotic Notations
Theta - notation
A function f(n) is Θ(g(n))
if there exist positive constants c1, c2, and n0 such that
c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0
Upper and lower bounds

Big O    O(g) – Upper bound:  f(n) ≤ c·g(n)
Omega    Ω(g) – Lower bound:  f(n) ≥ c·g(n)
Theta    Θ(g) – Exact limit:  c1·g(n) ≤ f(n) ≤ c2·g(n)
