02 Analysis
cse.iitkgp.ac
Algorithms – I (CS21203)
Analysis of Algorithms
Computer Science and Engineering | Indian Institute of Technology Kharagpur
Analyzing Algorithms
• Predict how your algorithm performs in practice
• By analyzing several candidate algorithms for a problem
we can identify efficient ones
• Criteria
• Running time
• Space usage
• Cache I/O
• Main memory I/O
• Lines of code
• In the insertion-sort cost table, tⱼ denotes the number of
times the test in line 5 is executed for that value of j
Worst-Case Analysis
• Major reasons for focusing on worst-case running time
• It gives an upper bound on the running time for any input
• For some algorithms the worst case occurs fairly often, e.g., in
searching
• The average case is often roughly as bad as the worst case
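The searching example can be made concrete. A minimal Python sketch (the name `linear_search` and the comparison counter are my illustration, not lecture code) shows why the worst case of linear search, an absent key, arises naturally: every element must be examined before the search can report failure.

```python
def linear_search(arr, key):
    """Return (index of key, comparisons made); index is -1 if key is absent."""
    comparisons = 0
    for i, x in enumerate(arr):
        comparisons += 1
        if x == key:
            return i, comparisons
    return -1, comparisons

# Worst case: the key is absent, so all n elements are compared.
print(linear_search([3, 1, 4, 1, 5], 9))   # → (-1, 5)
# A hit can end early, but an unlucky key still scans everything.
print(linear_search([3, 1, 4, 1, 5], 5))   # → (4, 5)
```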
Order of Growth
• In analyzing the running time of `insertion sort’, we started
with constants representing the cost of each statement
• Then we observed that they give more detail than we
need, and we discarded them
• We now make a further simplifying abstraction:
the rate (order) of growth
• For a function f(n), we care about its behavior when n is
large enough; when n is small, the running time is small anyway
• Constant factors and lower-order terms do not affect
the order of growth of the function
• One algorithm is more efficient than another if its worst-case
running time has a lower order of growth
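As a sketch of the insertion-sort analysis above (Python rather than the lecture's pseudocode; the test counter is instrumentation I have added for illustration), counting how often the inner loop's condition is evaluated shows the quadratic growth that survives once constants and lower-order terms are discarded:

```python
def insertion_sort(a):
    """Sort list a in place; return how many times the inner loop test ran."""
    tests = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while True:
            tests += 1                  # one evaluation of the loop condition
            if not (i >= 0 and a[i] > key):
                break
            a[i + 1] = a[i]             # shift the larger element one slot right
            i -= 1
        a[i + 1] = key                  # drop key into its sorted position
    return tests

data = [5, 4, 3, 2, 1]                  # reversed input: the worst case
t = insertion_sort(data)
print(data, t)                          # → [1, 2, 3, 4, 5] 14
```

On a reversed array of length n the count is 2 + 3 + … + n = n(n+1)/2 − 1, so the n²/2 term dominates; on an already sorted array it is only n − 1. Both facts match the Θ(n²) worst case and Θ(n) best case of insertion sort.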
Order of Growth
[Plot comparing f₁(n) = 3n, f₂(n) = 9n, g₁(n) = n², and g₂(n) = 0.1 × 2ⁿ]
Order of Growth
[Plot comparing f(n) = 6n, g₁(n) = n², and g₂(n) = n² − 6n]
Asymptotic Notations
• These notations are used to describe the asymptotic
running time of an algorithm
• However, asymptotic notations can also apply to functions
that have nothing whatsoever to do with algorithms
O (Big-O)
O(g(n)) = { f(n) : ∃ c > 0, n₀ > 0 such that 0 ≤ f(n) ≤ c·g(n) ∀ n ≥ n₀ }
• Asymptotic upper bound: after some constant n₀, f(n) ≤ c·g(n)
• Note the ≤: f(n) can be of the same order as g(n), but it can
be smaller as well
n = O(n²), n² = O(n²), n³ ≠ O(n²)
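The definition can be exercised numerically: once witnesses c and n₀ are chosen, membership in O(g) is just the inequality 0 ≤ f(n) ≤ c·g(n) for n ≥ n₀. The helper below (a name and witness constants I chose for illustration) checks the inequality over a finite range, which is evidence for, not a proof of, membership:

```python
def check_big_o(f, g, c, n0, up_to=1000):
    """Check 0 <= f(n) <= c*g(n) for n0 <= n <= up_to (evidence, not proof)."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, up_to + 1))

# n = O(n^2): the witnesses c = 1, n0 = 1 work, since n <= n^2 for n >= 1.
print(check_big_o(lambda n: n, lambda n: n * n, c=1, n0=1))        # → True
# n^3 vs n^2: for any fixed c, n^3 > c*n^2 once n > c, so the check fails.
print(check_big_o(lambda n: n**3, lambda n: n * n, c=50, n0=1))    # → False
```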
Jul 25, 26, 2024
CS21203 / Algorithms - I | Introduction
O (Big-O)
• How can we show n = O(n²)?
• Let c = 1 and n₀ = 1
• Then 0 ≤ n ≤ n² for all n ≥ 1, so n = O(n²)
• Similarly, how can we show n³ ≠ O(n²)?
• For any choice of c > 0, n³ is always larger than c·n² once
n > c, so no constants can work and n³ ≠ O(n²)
Ω (Big-Ω)
Ω(g(n)) = { f(n) : ∃ c > 0, n₀ > 0 such that 0 ≤ c·g(n) ≤ f(n) ∀ n ≥ n₀ }
n ≠ Ω(n²), n² = Ω(n²), n³ = Ω(n²)
Θ (Big-Theta)
Θ(g(n)) = { f(n) : ∃ c₁, c₂ > 0, n₀ > 0 s.t. 0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) ∀ n ≥ n₀ }
n ≠ Θ(n²), n² = Θ(n²), n³ ≠ Θ(n²)
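Θ combines both bounds: f ∈ Θ(g) requires a two-sided sandwich c₁·g(n) ≤ f(n) ≤ c₂·g(n). The sketch below (a hypothetical `check_theta`; the witness constants are my choices for illustration) tests the sandwich over a finite range, reusing g₂(n) = n² − 6n from the earlier growth plot:

```python
def check_theta(f, g, c1, c2, n0, up_to=1000):
    """Check c1*g(n) <= f(n) <= c2*g(n) for n0 <= n <= up_to (evidence, not proof)."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, up_to + 1))

# n^2 - 6n = Θ(n^2): witnesses c1 = 1/2, c2 = 1, n0 = 12
# (n^2/2 <= n^2 - 6n holds exactly when n >= 12).
print(check_theta(lambda n: n * n - 6 * n, lambda n: n * n,
                  c1=0.5, c2=1, n0=12))                                # → True
# n = Θ(n^2) fails: no c1 > 0 keeps c1*n^2 <= n for large n.
print(check_theta(lambda n: n, lambda n: n * n, c1=0.01, c2=1, n0=1))  # → False
```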
Time Complexity
• Consider an algorithm running on a computer that can
execute a given number of ops/sec
• For an input of size n, what amount of time will be required?
[Diagram: parts of the algorithm annotated with running times Θ(n²), O(n²), and O(n²)]
• What can you tell about the complexity of the overall
algorithm?
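The arithmetic behind such estimates is just operations divided by machine speed. A minimal sketch, assuming (my assumption, not a figure from the slide) a machine executing 10⁹ simple operations per second and an input size n = 10⁶:

```python
OPS_PER_SEC = 10**9        # assumed machine speed; not a figure from the lecture

def seconds_for(op_count):
    """Seconds needed to execute op_count operations at OPS_PER_SEC."""
    return op_count / OPS_PER_SEC

n = 10**6                  # illustrative input size
print(seconds_for(n))      # linear,    10^6  ops → 0.001 s
print(seconds_for(n * n))  # quadratic, 10^12 ops → 1000.0 s (~16.7 minutes)
```

The gap between milliseconds and minutes at the same n is the practical content of a Θ(n) vs Θ(n²) distinction.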
Theorem
• If t₁(n) = O(g₁(n)) and t₂(n) = O(g₂(n)), then
t₁(n) + t₂(n) = O(max{g₁(n), g₂(n)})
• The algorithm’s overall efficiency is determined by the part
with the larger order of growth, i.e., its least efficient part
• For example, 5n² + 3n log n = O(n²)
Proof. There exist constants c₁, c₂ > 0 and n₁, n₂ > 0 such that
t₁(n) ≤ c₁·g₁(n) for all n ≥ n₁ and t₂(n) ≤ c₂·g₂(n) for all n ≥ n₂.
Then for all n ≥ max{n₁, n₂},
t₁(n) + t₂(n) ≤ c₁·g₁(n) + c₂·g₂(n) ≤ (c₁ + c₂)·max{g₁(n), g₂(n)}.
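The example 5n² + 3n log n = O(n²) can be checked with explicit witnesses: since log₂ n ≤ n for n ≥ 1, we have 3n log₂ n ≤ 3n², hence t(n) ≤ 8n² with c = 8, n₀ = 1 (a sufficient, not tight, choice; the base of the logarithm only changes the constant). A quick numerical sanity check over a finite range:

```python
import math

def t(n):
    """t(n) = 5n^2 + 3n*log2(n), the example from the slide."""
    return 5 * n * n + 3 * n * math.log2(n)

# Witnesses c = 8, n0 = 1, verified here numerically (evidence, not proof).
print(all(t(n) <= 8 * n * n for n in range(1, 10001)))   # → True
```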
Thank You!!