
CEN 235 Data Structures

1. Algorithm Analysis

Assoc. Prof. Dr. Volkan TUNALI


Faculty of Engineering and Natural Sciences
Department of Software Engineering
Outline

▪ Algorithm
▪ Algorithm Analysis
▪ Asymptotic Notations
▪ O (Big-Oh) Notation
▪ Rules for Algorithm Analysis
▪ Examples
Algorithm

▪ What is an algorithm?
Algorithm

▪ Various but similar definitions:


▪ An algorithm is a clearly specified set of simple instructions to be
followed to solve a problem.
▪ An algorithm is a finite sequence of instructions that the computer
follows to solve a problem.
Algorithm Analysis

▪ Once an algorithm is given for a problem and decided (somehow) to be correct, an important step is to determine its resource requirements.
▪ Why is that?
▪ To find out whether an algorithm is usable at all, or better than another one for solving the same problem.
▪ The process of determining the resources of an algorithm is called algorithm
analysis.
▪ Two essential resources, and hence performance criteria, of algorithms are:
▪ execution or running time
▪ memory space used.
Algorithm Analysis

▪ We shall discuss today:


▪ How to estimate the time required for a program.
▪ How to compare different algorithms designed to solve the same problem.

▪ Throughout this course:


▪ We shall explore and discuss a variety of algorithms and data structures to solve a variety of problems.
▪ We shall understand the importance of and approaches to algorithm
design and analysis.
Algorithm Analysis

▪ Generally, there are several algorithmic ideas, and we would like to eliminate the bad ones early, so an analysis is usually required.
▪ Furthermore, the ability to do an analysis usually provides insight
into designing efficient algorithms.
▪ The analysis also generally pinpoints the bottlenecks, which are
worth coding carefully.
Performance Analysis

▪ The actual execution time of an algorithm is hard to assess
▪ if one does not know the intimate details of the computer architecture, the operating system, the compiler, the quality of the program, the current load of the system, and other factors.

▪ The execution time of a given algorithm may be measured using special performance programs called benchmarks.
▪ A second alternative for performance assessment is to find the growth rate of the execution time or the memory space needed by an algorithm as the input size grows.
▪ Here, we define the execution time or the memory space used as a function of the input size, such as the number of elements in an array, the number of records in a file, etc.
What to Analyse

▪ The most important resource to analyse is generally the running time.


▪ Several factors affect the running time of a program.
▪ We ignore the factors of the physical computer or environment.

▪ The other main factors are the algorithm used and the input to the algorithm.
▪ Typically, the size of the input is the main consideration.
▪ We usually use N to denote the size of the input.
▪ Number of elements in an array, etc.

Asymptotic Notations

▪ We use asymptotic notations to symbolize the asymptotic running time of an algorithm in terms of the input size.
▪ The following notations are frequently used in algorithm analysis:
▪ O (Big-Oh) Notation (asymptotic upper bound)
▪ Ω (Omega) Notation (asymptotic lower bound)
▪ Θ (Theta) Notation (asymptotic tight bound)
▪ o (little-Oh) Notation (upper bound that is not asymptotically tight)
▪ ω (little-Omega) Notation (lower bound that is not asymptotically tight)
▪ The goal is to find a function that asymptotically bounds the execution time or the memory space of an algorithm.
Asymptotic Notations

▪ O (Big-Oh) Notation (asymptotic upper bound)


▪ used to define the worst-case running time of an algorithm and is concerned with very large values of N.

▪ Ω (Omega) Notation (asymptotic lower bound)


▪ used to describe the best-case running time of algorithms and is concerned with very large values of N.

▪ Θ (Theta) Notation (asymptotic tight bound)


▪ used to define the exact growth rate (both an upper and a lower bound).

▪ o (little-Oh) Notation (upper bound that is not asymptotically tight)


▪ used to state that the running time grows strictly more slowly than the bounding function.

▪ ω (little-Omega) Notation (lower bound that is not asymptotically tight)


▪ used to state that the running time grows strictly faster than the bounding function.
O (Big-Oh) Notation (asymptotic upper bound)

▪ In practice, we are mostly interested in the worst case (and sometimes in the average case).
▪ Big-Oh is the most common and useful notation for
algorithmic complexity analysis.
▪ Since Big-Oh is an upper bound, we must be careful never to
underestimate the running time of the program.
▪ In effect, the answer provided is a guarantee that the program will
terminate within a certain time period.
▪ The program may stop earlier than this, but never later.
A Simple Example

▪ Here is a simple piece of code to calculate the sum ∑_{i=1}^{N} i³:
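
The code itself did not survive extraction. Below is a minimal C sketch consistent with the line-by-line analysis on the next slide; the function name sum and the line numbering in the comments are assumptions, not taken from the source.

int sum(int N)
{
    int i, partialSum;            /* declarations: no time */

    partialSum = 0;               /* line 1: one unit */
    for (i = 1; i <= N; i++)      /* line 2: init + (N+1) tests + N increments */
        partialSum += i * i * i;  /* line 3: four units, executed N times */
    return partialSum;            /* line 4: one unit */
}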
A Simple Example – Analysis

▪ The declarations count for no time.


▪ Lines 1 and 4 count for one unit each.
▪ Line 3 counts for four units each time it is executed (two multiplications, one addition, and one assignment) and is executed N times, for a total of 4N units.
▪ Line 2 has the hidden costs of initializing i, testing i <= N, and
incrementing i. The total cost of all these is 1 to initialize, N + 1 for all
the tests, and N for all the increments, which is 2N + 2.
▪ We ignore the costs of calling the function and returning.
▪ For a total of 6N + 4, we say that this function is O(N).
A Simple Example – Analysis

▪ If we had to perform all this work every time we needed to analyse a program, the task would quickly become infeasible.
▪ Fortunately, since we are giving the answer in terms of Big-Oh, there are lots
of shortcuts that can be taken without affecting the final answer.
▪ For instance, line 3 is obviously an O(1) statement (per execution), so it is silly
to count precisely whether it is two, three, or four units; it does not matter.
▪ Line 1 is obviously insignificant compared with the for loop, so it is silly to
waste time here.
▪ This leads to several general rules.
Rule 0: Simple Statement

▪ O(1): a simple statement executes within a constant amount of time, regardless of the input size.
Rule 1: for Loops

▪ The running time of a for loop is at most the running time of the
statements inside the for loop (including tests) times the number
of iterations.
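
For instance (a sketch, not taken from the original slide), a loop whose body is O(1) and which iterates N times is O(N):

for (i = 0; i < N; i++)
    sum += i;     /* O(1) body, N iterations: O(N) in total */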
Rule 2: Nested for Loops

▪ Analyse these inside out.


▪ The total running time of a statement inside a group of nested
loops is the running time of the statement multiplied by the
product of the sizes of all the for loops.
▪ As an example, the following program fragment is O(N²):
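
The fragment on the original slide did not survive extraction; a minimal stand-in that fits the rule is shown below, where the O(1) statement executes N × N times:

for (i = 0; i < N; i++)
    for (j = 0; j < N; j++)
        k++;      /* executed N * N times: O(N^2) */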
Rule 3: Consecutive Statements

▪ These just add (which means that the maximum is the one that
counts).
▪ As an example, the following program fragment, which has O(N) work followed by O(N²) work, is also O(N²):
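
The original fragment was likewise lost in extraction; a sketch matching the description, an O(N) loop followed by an O(N^2) nested loop, is:

for (i = 0; i < N; i++)
    a[i] = 0;                  /* O(N) work */
for (i = 0; i < N; i++)
    for (j = 0; j < N; j++)
        a[i] += a[j] + i + j;  /* O(N^2) work dominates the total */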
Rule 4: if/else

▪ The running time of an if/else statement is never more than the running time of the condition test plus the larger of the running times of S1 and S2, for the fragment below:
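
The fragment referred to did not survive extraction; the generic shape the rule describes is:

if (condition)    /* cost of the test, plus...          */
    S1;           /* ...the larger of the running times */
else
    S2;           /* of the two branches S1 and S2      */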
Examples

▪ Find the execution time t(N) in terms of N:


▪ for (i=0; i<=N; i++)
for (j=0; j<=N; j++)
statement block;
▪ for (i=0; i<=N; i++)
for (j=0; j<=i; j++)
statement block;
▪ for (i=0; i<=N; i++)
for (j=1; j<=N; j*=2)
statement block;
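
One way to tally these (a sketch, assuming each statement block is O(1)):

▪ First fragment: the block runs (N+1)(N+1) times → t(N) is O(N²).
▪ Second fragment: the block runs ∑_{i=0}^{N} (i+1) = (N+1)(N+2)/2 times → O(N²).
▪ Third fragment: the inner loop doubles j, so it makes about log₂N + 1 passes for each of the N+1 outer iterations → O(N logN).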
Exercises

▪ Find the number of times the statement block is executed.


▪ for (i=0; i<=N; i++)
for (j=1; j<=i; j*=2)
statement block;

▪ for (i=1; i<=N; i*=3)
for (j=1; j<=N; j*=2)
statement block;
Examples

▪ Assume the running time of an algorithm is at most cN², where c is some positive constant.
▪ c becomes inconsequential as N gets bigger and bigger.
▪ specifying this constant does not bring about extra insight when comparing this function
with another one of different order.
▪ Therefore, we say that the running time of the algorithm above is of order N² → O(N²)

▪ Assume f(N) = N²logN + 10N² + N + 4


▪ Constant 4 is simply ignored.
▪ The larger the value of N, the less significant the contributions of the lower-order terms 10N² and N.
▪ Therefore, we say that the function f(N) above is of order N²logN → O(N²logN)
Growth of Some Typical Functions
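
The chart on this slide did not survive extraction. The ordering such charts typically illustrate, from slowest-growing to fastest, is: O(1) < O(logN) < O(N) < O(N logN) < O(N²) < O(N³) < O(2^N).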
What About Recursive Algorithms?

▪ Recursion and the complexity analysis of recursive algorithms will be covered next week.
Summary

▪ An algorithm is a finite sequence of instructions that the computer follows to solve a problem.
▪ We analyse the growth rate of the execution time or the memory space needed by an algorithm as the input size grows.
▪ Here, we define the execution time or the memory space used as a function of the input size, such as the number of elements in an array, the number of records in a file, etc.
▪ In practice, we are mostly interested in the worst case (and sometimes in the average case).
▪ Big-Oh is the most common and useful notation for algorithmic
complexity analysis.
Thank you for your attention!

Assoc. Prof. Dr. Volkan TUNALI


Faculty of Engineering and Natural Sciences
Department of Software Engineering
