01 An Introduction To Algorithmic Analysis
Algorithm Fundamentals
An Introduction to Algorithmic Analysis
Dr. Mourad Mars
Disclaimer: Original course by Dr. Rajasekaran, Sultan Almuhammadi, and other references
Grading Policy, Weekly Hours, and Textbook
• Grading: 40% Final Exam, 20% Midterm Exam, 20% Quizzes, 10% Homework, 10% Lab
• Weekly hours: 2 x 50-min. lectures, 3 x 50-min. labs
• Homework is submitted through E-learn; I do NOT accept e-mailed homework
• Late homework will NOT be accepted
• All programs must be written in Python:
• Self-contained, well-designed functions/classes
• Clean, well-documented code
Objectives
• On successful completion of this course, students will be able to:
• Explain fundamental computing algorithms
• Analyse algorithms and identify key algorithmic strategies
• Use algorithms to help solve programming problems in code
• This lecture covers:
• Performance Measures
• Asymptotic Analysis
Algorithms
• An algorithm is a technique used to solve a given problem: a finite set of precise instructions for
performing a computation or for solving a problem. In pseudocode, a typical building block is:
• while <condition> do
{ a sequence of statements }
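As a concrete instance of this definition (a minimal sketch in Python, the course language; not from the original slides), Euclid's algorithm for the greatest common divisor is a finite set of precise instructions built on exactly this while-loop shape:

```python
def gcd(a, b):
    # Euclid's algorithm: a finite set of precise instructions.
    # Termination is guaranteed because b strictly decreases toward 0.
    while b != 0:          # while <condition> do
        a, b = b, a % b    # { a sequence of statements }
    return a

print(gcd(48, 18))  # -> 6
```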
Basic Concepts: Algorithms and Data Structures
• A data structure is the organization of data in a computer’s memory or in a disk file.
• Examples: arrays, stacks, linked lists
• Algorithms are the procedures a software program uses to manipulate the data in these structures.
• Example: the sorting problem
• An array to store the input (the data structure)
• A for loop that accesses the array and produces an ordered permutation of the input (the algorithm)
Program (Write an Algorithm)
• Algorithms and data structures look at how data for computer programs can best be represented and processed.
Example Algorithms
• An elegant way to define a problem is to specify its input and output.
• How can we find such an algorithm? Learn more and more algorithms.
• How can one tell that an algorithm is more efficient than other algorithms?
• We compare algorithms mainly in terms of running time, but also in terms of other factors (e.g.,
memory requirements, programmer's effort, etc.)
Performance Measures
• Algorithm analysis: how do we produce better code?
• When you write a program or subprogram, you should be concerned about its resource needs.
• The two key resources are time and memory.
Performance Measures
• Two performance measures are popular:
• Time complexity (also known as running time): how long does it take?
• Imagine you are solving a maze. A short maze means a quick finish;
• a huge maze takes more time to solve. Algorithms are the same: the bigger the task, the more
time they may need to complete.
• Space complexity: how much memory does it need?
• Example: imagine you are doing a jigsaw puzzle:
• Small puzzle: you can finish it on a small table.
• Huge puzzle: you need a bigger table to hold all the pieces.
• Space complexity is like the size of the "table" (memory) an algorithm needs to solve a problem.
Some tasks need just a little memory, while others require a LOT!
• Note: An algorithm is a sequence of basic operations.
Performance Measures
• Time complexity of an algorithm is defined as the total number of basic
operations performed in the algorithm.
• Basic operations include +, -, /, =, comparisons, and memory lookups.
• The above definition is machine- and programming-language independent. We
could readily convert time complexity into seconds given a specific machine.
• Example:
• The time complexity of the minimum-finding algorithm is n - 1. Here we consider only the
comparisons performed in the algorithm.
• The time complexity of (i.e., the total number of comparisons done in) the selection sort
algorithm is:
(n - 1) + (n - 2) + ... + 1 = n(n - 1)/2.
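One possible Python rendering of these two counts (a sketch instrumented with comparison counters, not the slides' original code):

```python
def find_min(a):
    """Minimum finding: exactly n - 1 comparisons for n elements."""
    comparisons = 0
    smallest = a[0]
    for x in a[1:]:
        comparisons += 1
        if x < smallest:
            smallest = x
    return smallest, comparisons

def selection_sort(a):
    """Selection sort: (n-1) + (n-2) + ... + 1 = n(n-1)/2 comparisons."""
    a = list(a)
    comparisons = 0
    n = len(a)
    for i in range(n - 1):
        m = i
        for j in range(i + 1, n):
            comparisons += 1
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]
    return a, comparisons

data = [5, 1, 3, 2, 4, 6, 9, 13]
print(find_min(data))        # -> (1, 7): n - 1 = 7 comparisons
print(selection_sort(data))  # -> sorted list with 28 = 8*7/2 comparisons
```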
Performance Measures
• Space complexity refers to the total amount of memory an algorithm uses while running. We
assume that one element (such as a real number) occupies one memory cell.
• This includes:
• Input Space: Memory required to store the input data.
• Auxiliary Space: Extra memory used by the algorithm (e.g., temporary variables, recursive stack space).
• Different inputs of the same size can lead to different run times. This brings out the need for different types of time
complexities: the best case, the worst case, and the average case.
• The best-case time complexity refers to the least run time achievable across all possible inputs of a given size.
• The worst case (the largest such run time) is defined similarly.
• The average-case time complexity is defined as follows. Let D be the set of possible inputs and let T(I) be the run time
of the algorithm on instance I (for any I ∈ D). Then the average time complexity of the algorithm is (1/|D|) Σ_{I∈D} T(I).
More generally, it can be defined as Σ_{I∈D} p(I) T(I), where p(I) is the probability that I occurs as the input and T(I)
is the time complexity of the algorithm on input I (for any I ∈ D).
• For the (linear) searching algorithm, the best-case run time is 1 and the worst-case run time is n.
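A small Python check of these definitions (a sketch assuming each element is searched for with equal probability p(I) = 1/n):

```python
def comparisons_linear_search(a, x):
    # Count comparisons until x is found (or the whole list is scanned).
    for i, v in enumerate(a):
        if v == x:
            return i + 1
    return len(a)

a = [5, 1, 3, 2, 4, 6, 9, 13]
n = len(a)
# One successful search per element: the uniform-input case p(I) = 1/n.
costs = [comparisons_linear_search(a, x) for x in a]
best, worst = min(costs), max(costs)
avg = sum(costs) / n
print(best, worst, avg)  # -> 1 8 4.5, i.e., best 1, worst n, average (n + 1)/2
```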
Different Time Complexities: Types of Analysis
Algorithm Complexity Notation
• Formal notational methods for stating the growth of an algorithm's resource needs.
• In summary, as used informally in this course:
• Big-O: O(f(n)) gives an upper bound, used for the worst case
• Omega: Ω(f(n)) gives a lower bound, used for the best case
• Theta: Θ(f(n)) gives a tight bound, often quoted for the average case
• We are mostly interested in determining the worst case of a given algorithm, that is: when the input size grows large,
what is the time complexity of the algorithm?
Performance Measures: How do we compare algorithms?
• A simple cost model:
• Arithmetic operation: 1 unit of time
• Assignment: 1 unit of time
• Comparison: 1 unit of time
• Return statement: 1 unit of time
• Example 1: a three-operation function: f(n) = 3
• Example 2: a single loop over n elements:
• 1 assignment (initialize the accumulator)
• 1 loop initialization, n + 1 comparisons, n increments
• n additions (the sum) and n assignments in the loop body
• 1 return
• Total: f(n) = 4n + 4
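The code behind these counts did not survive extraction; a plausible Python loop matching them (an assumed reconstruction, not the original) is:

```python
def total(a):
    n = len(a)
    s = 0                 # 1 assignment
    for i in range(n):    # 1 loop initialization, n + 1 comparisons, n increments
        s = s + a[i]      # n additions, n assignments
    return s              # 1 return
    # grand total: 1 + 1 + (n + 1) + n + n + n + 1 = 4n + 4 units

print(total([1, 2, 3]))  # -> 6
```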
• Example 3: two nested loops over n elements:
• Outer loop: 1 initialization, n + 1 comparisons, n increments
• Each of the two statements in the inner body executes ~ n x n = n^2 times
• Total: f(n) = 2n^2 + n (keeping only the dominant terms)
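A nested-loop sketch consistent with this count (again an assumed reconstruction, since the original code is not in the text):

```python
def pair_count(n):
    count = 0                  # 1 assignment
    for i in range(n):         # outer loop: ~n units of overhead
        for j in range(n):     # inner body executes n * n times
            count = count + 1  # n^2 additions + n^2 assignments = 2n^2 units
    return count
    # dominant cost: f(n) = 2n^2 + n

print(pair_count(3))  # -> 9: the inner body ran n * n times
```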
• Example 4: a loop whose counter doubles each iteration:
• 2 assignments before the loop
• After k iterations the counter i is ~ 2^k; the loop stops when 2^k >= n, i.e., k = log(n)
• ~ log(n) comparisons, plus log(n) + log(n) operations in the body
• Total: f(n) = 3 log(n) + 2
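A doubling loop of this shape can be sketched as follows (a reconstruction matching the slide's counts, not the original code):

```python
def doubling_steps(n):
    """Count iterations of a loop whose counter doubles each pass."""
    i = 1          # 1 assignment
    steps = 0      # 1 assignment
    while i < n:   # ~log2(n) comparisons
        i = i * 2  # ~log2(n) multiplications + log2(n) assignments
        steps += 1
    return steps   # after k passes i = 2^k; the loop stops when 2^k >= n

print(doubling_steps(8))    # -> 3, since 2^3 >= 8
print(doubling_steps(100))  # -> 7, since 2^7 = 128 >= 100
```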
Performance Measures: How do we compare algorithms?
• The amount of resource needs often depends on the amount of data you have
• It makes sense that if you have more data, you will need more space to store the data and it will take more time for
an algorithm to run.
• To compare algorithms, we express running time as a function of the input size n (i.e., f(n)).
• Algorithm analysis does not answer the question: How much resource will be used to represent n pieces of data?
• The real question it answers: How much more resources will it take to process n+1 pieces of data?
• What we really care about is the growth rate of resources with respect to data
• As the amount of data gets bigger, how much more resource will my algorithm require?
• From best (least resource requirements) to worst, the rankings are O(1), O(log n), O(n), O(n log n), O(n^2), O(n^3), O(2^n).
Asymptotic Functions: Big-oh (O())
• Definition: Let f(n) and g(n) be any two non-negative integer functions of n. We say f(n) = O(g(n)) if there exist two
constants c > 0 and n0 such that f(n) <= c * g(n) for all n >= n0.
• This basically means that for all sufficiently large values of n, the function f(n) is no more than a constant multiple of g(n).
• For a given function whose time t increases as the input size increases, what is the worst case?
• Worst case: an upper bound on f(n) for all sufficiently large n:
• find another function g(n) such that, after some limit n0, f(n) <= c * g(n) for all n >= n0.
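A worked instance of the definition: for f(n) = 4n + 4 and g(n) = n, algebra gives 4n + 4 <= 5n exactly when n >= 4, so the constants c = 5 and n0 = 4 witness f(n) = O(n). The finite-range check below is a numeric sanity check of those witnesses, not a proof:

```python
f = lambda n: 4 * n + 4   # the function being bounded
g = lambda n: n           # the bounding function
c, n0 = 5, 4              # witnesses for the Big-O definition

# f(n) <= c * g(n) must hold for every n >= n0.
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
# Below n0 the inequality may fail, and here it does: f(3) = 16 > 15 = 5 * 3.
assert f(3) > c * g(3)
print("f(n) = O(g(n)) verified on the tested range")
```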
Growth rate for same previous functions showing larger input sizes
Performance Measures
• Constant growth rate: processing 1 piece of data takes the same amount of resource as processing 1 million pieces of data.
• Logarithmic growth rate: the resource needs grow by one unit each time the data is doubled. This effectively means that as
the amount of data gets bigger, the curve describing the growth rate gets flatter.
• Linear growth rate: the resource needs and the amount of data are directly proportional to each other.
Running Times for Different Sizes of Inputs of Different Functions
Practical Algorithm
• Suppose we have an array of numbers, where n, the size of the input, is 8:
5 1 3 2 4 6 9 13
• We want to search for an element x in this array using the linear search
algorithm, and we want to calculate the best, average, and worst times for this
algorithm.
• This means we will compare x with each individual element of the array to
check whether x exists or not.
• Best case: if x = 5, we do one comparison: Ω(1).
• Worst case: if x = 13, we do n comparisons: O(n), where n = 8.
• Average case: Θ(n/2) = Θ(n).
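The walkthrough above can be sketched in Python (a minimal illustration on the same array; the slides do not give this code):

```python
def linear_search(a, x):
    """Return the index of x in a, or -1 if absent."""
    for i in range(len(a)):
        if a[i] == x:
            return i          # found after i + 1 comparisons
    return -1                 # not found after n comparisons

a = [5, 1, 3, 2, 4, 6, 9, 13]
print(linear_search(a, 5))   # -> 0: best case, 1 comparison
print(linear_search(a, 13))  # -> 7: worst case, n = 8 comparisons
print(linear_search(a, 7))   # -> -1: also n comparisons (unsuccessful)
```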
Algorithm Design
• Unfortunately, there are no standard recipes for designing optimal
algorithms.
• There are some commonly used algorithm design techniques: divide-and-conquer, greedy, dynamic programming, backtracking, etc.
• There is no guarantee that using a subset of these techniques will
always yield optimal algorithms.
• In any algorithm there is typically an idea that does not fall under any
of these techniques. This idea is very much dependent on the problem
and the algorithm developer.
Running Times in Big O Notation
• Linear search: O(n)
• Binary search: O(log n)
• Sorting: O(n log n) for efficient comparison sorts (e.g., merge sort); O(n^2) for simple sorts (e.g., selection sort)
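Binary search's O(log n) comes from halving the remaining range on every step, like the doubling-loop example earlier. A minimal sketch (assumes the input is already sorted; not from the original slides):

```python
def binary_search(a, x):
    """Binary search on a sorted list: halves the search range each step."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == x:
            return mid        # found
        elif a[mid] < x:
            lo = mid + 1      # discard the lower half
        else:
            hi = mid - 1      # discard the upper half
    return -1                 # not found after ~log2(n) probes

a = [1, 2, 3, 4, 5, 6, 9, 13]  # the example array, sorted first
print(binary_search(a, 9))   # -> 6
print(binary_search(a, 7))   # -> -1
```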
Big-O Complexity Chart
Reference: http://bigocheatsheet.com/
References
• https://cathyatseneca.gitbooks.io/data-structures-and-algorithms/content/
• Reviews: rules of logarithms, for example:
• https://mathinsight.org/logarithm_basics
• https://www.chilimath.com/lessons/advanced-algebra/logarithm-rules/
• http://www.sosmath.com/algebra/logs/log4/log44/log44.html
• Etc.