Lecture 3. Growth of Functions Asymptotic Notation

This document discusses assessing the performance and correctness of algorithms. It covers asymptotic analysis notation (Big O, Theta, Omega) and how it can be used to analyze the running time of sorting algorithms like insertion sort and merge sort as the input size n approaches infinity. It also discusses analyzing the worst-case, average-case, and best-case performance of algorithms and how asymptotic analysis ignores machine dependencies to focus on the growth of the running time. The learning outcomes are to apply asymptotic notation to analyze algorithms and prove the correctness of an algorithm.

Algorithmics

CT065-3-3

Assessing Algorithmic
Performance And Correctness
Level 3 – Computing (Software Engineering)
Topic and Structure of the Lesson

1. Running-time of algorithms
2. Kinds of performance analyses
3. Asymptotic Analysis
4. Analysis of Insertion and Merge Sort
5. Correctness of algorithms
6. Group Exercise
- Analysis of Selection Sort
- Q&A on correctness of algorithms
Learning Outcomes

• Apply O, Θ, Ω notation to the analysis of algorithms

• Prove the correctness of an algorithm

Key Terms

1. Asymptote
2. Big O (O)
3. Theta (Θ)
4. Omega (Ω)

Running Time of Algorithms
• The running time of an algorithm is one
indicator of how efficiently it has been
implemented (slide 14, Lecture 1)
• For example, Merge Sort's running time grew
more slowly than Insertion Sort's and Selection
Sort's as n increased (see the timing sketch below).
• Running time depends on the input: an already
sorted sequence is easier to sort!
• We generally seek upper bounds on the running
time, because we want a guarantee that the
algorithm will run within a certain time frame.
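
To make the comparison concrete, here is a minimal timing sketch (not part of the original slides). The implementations of insertion_sort and merge_sort below are illustrative, and the absolute times will differ from machine to machine; only the trend as n grows matters.

```python
import random
import time

def insertion_sort(a):
    # Textbook insertion sort, working on a copy of the input list.
    a = list(a)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def merge_sort(a):
    # Textbook top-down merge sort, returning a new sorted list.
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

for n in (1000, 2000, 4000, 8000):
    data = [random.random() for _ in range(n)]
    t0 = time.perf_counter(); insertion_sort(data)
    t1 = time.perf_counter(); merge_sort(data)
    t2 = time.perf_counter()
    print(f"n={n:5d}  insertion: {t1 - t0:.3f}s  merge: {t2 - t1:.3f}s")
```

As n doubles, the insertion sort time roughly quadruples while the merge sort time little more than doubles, matching the behaviour described on the slide.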

Kinds of Performance Analyses
Worst-case: (usually)
• T(n) = maximum time of algorithm on any input
of size n.
Average-case: (sometimes)
• T(n) = expected time of algorithm over all
inputs of size n.
• Need assumption of statistical distribution of
inputs
Best-case: (bogus)
• Cheat with a slow algorithm that works fast on
some input (see the comparison-count sketch below).
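
A small illustrative sketch (not from the slides): counting the key comparisons insertion sort makes on an already-sorted input (its best case) and on a reverse-sorted input (its worst case). Comparison counts are machine-independent, unlike wall-clock time.

```python
def insertion_sort_comparisons(a):
    # Returns the number of key comparisons insertion sort performs on list a.
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1            # compare key with a[j]
            if a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 1000
best = insertion_sort_comparisons(list(range(n)))           # already sorted input
worst = insertion_sort_comparisons(list(range(n, 0, -1)))   # reverse-sorted input
print(best)    # n - 1 = 999 comparisons (linear)
print(worst)   # n(n-1)/2 = 499500 comparisons (quadratic)
```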

Asymptotic Analysis

• What if we looked at Insertion Sort's worst-case time?
– It would vary based on the speed of our
computer:
• relative speed (on the same machine)
• absolute speed (on different machines)
– If we ignore machine dependency, we can
look at the growth of T(n) as n → ∞
• This is called Asymptotic Analysis (a small
numeric sketch of this idea follows below)
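
As a sketch of what "looking at the growth of T(n)" means, suppose (purely hypothetically) that counting the instructions of some algorithm gives T(n) = 2n² + 3n + 7. The ratio T(n)/n² settles towards the constant 2 as n grows, so the n² term dominates and the lower-order terms and machine-dependent constants can be ignored.

```python
def T(n):
    # Hypothetical instruction count; the constants 2, 3 and 7 are illustrative only.
    return 2 * n**2 + 3 * n + 7

for n in (10, 100, 1000, 10000, 100000):
    print(n, T(n) / n**2)   # the ratio tends to 2, so T(n) grows like n^2
```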
Asymptotic Notation
• O – notation (upper bounds)
– f(n) = O(g(n)) if there exist constants c > 0,
n0 > 0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0.
– Example: 2n² = O(n³) (c = 1, n0 = 2; checked numerically below)
• Ω – notation (lower bounds)
– Ω(g(n))= { f(n) : there exist constants c > 0,
n0 > 0 such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0 }
– Example: √n = Ω(lg n) (c = 1, n0 = 16)
• Θ – notation (tight bounds)
– Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
– Example: ½n² – 2n = Θ(n²)
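
A quick numeric spot-check of the two example bounds above (a sketch over a finite range, not a proof): with c = 1 and n0 = 2 we have 2n² ≤ n³ for all n ≥ 2, and with c = 1 and n0 = 16 we have lg n ≤ √n for all n ≥ 16.

```python
import math

# O example: 2n^2 <= 1 * n^3 for all n >= n0 = 2
assert all(2 * n**2 <= n**3 for n in range(2, 10001))

# Omega example: 1 * lg(n) <= sqrt(n) for all n >= n0 = 16
assert all(math.log2(n) <= math.sqrt(n) for n in range(16, 10001))

print("Both sample bounds hold on the tested range.")
```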
Asymptotic Performance
When n gets large enough, a Θ(n²) algorithm
always beats a Θ(n³) algorithm (see the crossover
sketch below).
• We should not ignore asymptotically slower
algorithms, however.
• Real-world design situations often call for a
balancing of objectives.
• Asymptotic analysis is a useful tool to help
structure our thinking.
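
To illustrate the crossover, suppose (hypothetically) that the Θ(n²) algorithm takes 100n² steps while the Θ(n³) algorithm takes only n³ steps. The quadratic algorithm is slower for small n and only pulls ahead once n exceeds 100, which is why asymptotically slower algorithms can still be the right choice for small inputs.

```python
def quadratic_steps(n):
    return 100 * n * n   # hypothetical Theta(n^2) algorithm with a large constant factor

def cubic_steps(n):
    return n ** 3        # hypothetical Theta(n^3) algorithm with constant factor 1

for n in (10, 50, 200, 1000):
    faster = "100n^2 algorithm" if quadratic_steps(n) < cubic_steps(n) else "n^3 algorithm"
    print(f"n={n:5d}: 100n^2={quadratic_steps(n):>13,}  n^3={cubic_steps(n):>13,}  faster: {faster}")
```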
