Data Structure: Algorithms

The document discusses algorithms and complexity analysis. It provides an agenda for topics to be covered, including algorithms, complexity analysis, experimental approaches, worst-case versus average-case analysis, time complexity, asymptotic complexity, Big-O notation, and comparing common growth functions. The focus of the course will be on analysing algorithms based on the number of primitive operations they perform (assignments, comparisons, etc.) rather than on experimental testing. Worst-case analysis is prioritized over average- or best-case analysis, as it provides an upper bound on running time.

Data Structure

Algorithms

Instructor: Hasan Alkhatib


Agenda
● Algorithms
● Complexity Analysis
● Experimental Approach
● Algorithm Analysis
● Worst- / average- / best-case
● Time Complexity
● Asymptotic Complexity
● Big-O Notation
● Comparing Common Growth Functions
Algorithms
The factors affecting the speed of software are:
1. Type of inputs to the program, e.g., memory vs. disk access
2. Type of program code, e.g., high- vs. low-level languages
3. Processor speed
4. The complexity of the algorithm
The focus of this course will be on the last point.
Algorithms
In general, an algorithm is a set of steps that are followed in order to accomplish a specific task or goal, like following specific steps to make mashed potatoes.

In computer science, an algorithm is a step-by-step procedure for solving a problem in a finite amount of time.
Algorithms
● A data structure is a systematic way of organizing and accessing data.
● An algorithm is a step-by-step procedure for performing some task in a finite amount of time.
● These concepts are central to computing, but to be able to classify some data structures and algorithms as “good,” we must have precise ways of analysing them.
Algorithm sorting(X, n)
Input: array X of n integers
Output: array X sorted in non-decreasing order

for i = 0 to n - 2 do
    for j = i + 1 to n - 1 do
        if X[i] > X[j] then
            s = X[i]
            X[i] = X[j]
            X[j] = s
return X
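As a concrete illustration (our translation, not part of the slide), the pseudocode maps directly to Java:

public static int[] sorting(int[] X, int n) {
    for (int i = 0; i < n - 1; i++) {
        for (int j = i + 1; j < n; j++) {
            if (X[i] > X[j]) {   // move the smaller value to position i
                int s = X[i];
                X[i] = X[j];
                X[j] = s;
            }
        }
    }
    return X;   // now in non-decreasing order
}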
Agenda
● Algorithms
● Complexity Analysis
● Experimental Approach
● Algorithm Analysis
● Worst- / average- / best-case
● Time Complexity
● Asymptotic Complexity
● Big-O Notation
● Comparing Common Growth Functions
Complexity Analysis – Why do we need it?
● There are often many different algorithms which can be used to solve the same problem.
● Thus, it makes sense to develop techniques that allow us to:
o Compare different algorithms with respect to their “efficiency”
o Choose the most efficient algorithm for the problem
Complexity Analysis
● Algorithm analysis, or complexity analysis, is the process of determining the amount of time, resources, etc. required to execute an algorithm.
● The efficiency of any algorithmic solution to a problem is a measure of the:
○ Time efficiency: the time it takes to execute.
○ Space efficiency: the space (primary or secondary memory) it uses.

● We will focus on an algorithm’s efficiency with respect to time.


Complexity Analysis
● Problem: prefix averages
Given an array X, compute the array A such that A[i] is the average of the elements X[0], ..., X[i], for i = 0, ..., n-1.

For example:
for X = (1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
A = (1, 1.5, 2, 2.5, 3, 3.5, 4, 4.5, 5, 5.5)
Complexity Analysis
● Sol 1
At each step i, compute the element A[i] by traversing the prefix X[0..i], determining the sum of its elements and then the average.

● Sol 2
1. At each step i, update a running sum of the elements of X seen so far.
2. Compute the element A[i] as sum / (i + 1).
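As an illustration, a sketch of the two solutions in Java (the method names prefixAverage1 and prefixAverage2 are ours, not from the slide). Sol 1 uses a nested loop and takes O(n²) time; Sol 2 maintains a running sum and takes O(n) time:

public static double[] prefixAverage1(double[] X) {   // Sol 1: O(n²)
    int n = X.length;
    double[] A = new double[n];
    for (int i = 0; i < n; i++) {
        double sum = 0;                 // recompute X[0] + ... + X[i] from scratch
        for (int j = 0; j <= i; j++)
            sum += X[j];
        A[i] = sum / (i + 1);
    }
    return A;
}

public static double[] prefixAverage2(double[] X) {   // Sol 2: O(n)
    int n = X.length;
    double[] A = new double[n];
    double sum = 0;                     // running sum of the elements seen so far
    for (int i = 0; i < n; i++) {
        sum += X[i];
        A[i] = sum / (i + 1);
    }
    return A;
}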
Agenda
● Algorithms
● Complexity Analysis
● Experimental Approach
● Algorithm Analysis
● Worst- / average- / best-case
● Time Complexity
● Asymptotic Complexity
● Big-O Notation
● Comparing Common Growth Functions
Experimental Approach
● Write a program that implements the algorithm
● Run the program with data sets of varying size.
● Determine the actual running time using a system call to measure time (e.g. system date)
Experimental Approach
● To measure time in Java use:

long before = System.currentTimeMillis();
// ... code whose performance is to be measured ...
long after = System.currentTimeMillis();
System.out.println((after - before) + " milliseconds");
Experimental Approach - Example
● The first algorithm we consider performs repeated string concatenation, based on the +
operator.
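The slide's accompanying listing is not reproduced in this text; a sketch consistent with the description is:

public static String repeat1(char c, int n) {
    String answer = "";
    for (int j = 0; j < n; j++)
        answer += c;   // each += builds a brand-new string, copying everything so far
    return answer;     // overall O(n²) time
}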
Experimental Approach - Example
● The second algorithm relies on Java’s StringBuilder class
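Again the slide's listing is not reproduced; a sketch consistent with the description is:

public static String repeat2(char c, int n) {
    StringBuilder sb = new StringBuilder();
    for (int j = 0; j < n; j++)
        sb.append(c);      // amortized constant-time append
    return sb.toString();  // overall O(n) time
}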
Experimental Approach - Example
● As an experiment, we used System.currentTimeMillis(), in the style of the code above, to measure the efficiency of both repeat1 and repeat2 for very large strings.
● We executed trials to compose strings of increasing lengths to explore the relationship
between the running time and the string length.
Experimental Approach - Results
Experimental Approach - Problems
● It is necessary to implement and test the algorithm in order to determine its running
time.
● Experiments can be done only on a limited set of inputs, and may not be indicative of
the running time for other inputs.
● The same hardware and software should be used in order to compare two algorithms, a condition that is very hard to achieve!
Agenda
● Algorithms
● Complexity Analysis
● Experimental Approach
● Algorithm Analysis
● Worst- / average- / best-case
● Time Complexity
● Asymptotic Complexity
● Big-O Notation
● Comparing Common Growth Functions
Algorithm Analysis
● Machine independence
● Our goal is to develop an approach to analysing the efficiency of algorithms that:
1. Allows us to evaluate the relative efficiency of any two algorithms in a way that is
independent of the hardware and software environment.
2. Is performed by studying a high-level description of the algorithm without need for
implementation.
3. Takes into account all possible inputs.
Algorithm Analysis
● Counting Primitive Operations
● To analyse the running time of an algorithm without performing experiments, we
perform an analysis directly on a high-level description of the algorithm
● We define a set of primitive operations such as the following list (next slide):
Algorithm Analysis

Primitive Operations
● Assigning a value to a variable
● Following an object reference
● Performing an arithmetic operation (for example, adding two numbers)
● Comparing two numbers
● Accessing a single element of an array by index
● Calling a method
● Returning from a method
Algorithm Analysis
● The amount of time that any algorithm takes to run almost always depends on the
amount of input that it must process.
● We expect, for instance, that sorting 10,000 elements requires more time than sorting 10
elements.
● Hence, the efficiency of an algorithm is the number of primitive operations it performs.
● This number is a function of the input size n.
Algorithm Analysis
● We will associate, with each algorithm, a function f(n) that characterizes the number of
primitive operations that are performed as a function of the input size n.
Agenda
● Algorithms
● Complexity Analysis
● Experimental Approach
● Algorithm Analysis
● Worst- / average- / best-case
● Time Complexity
● Asymptotic Complexity
● Big-O Notation
● Comparing Common Growth Functions
Worst- / average- / best-case

Worst-case running time of an algorithm, T(n):
● The longest running time for any input of size n
● An upper bound on the running time for any input
=> a guarantee that the algorithm will never take longer
● Example: sorting a set of numbers in increasing order when the data is in decreasing order
● E.g., in searching a database for a particular piece of information: the element is at the end of the database table.
Worst- / average- / best-case

Best-case running time, B(n): the minimum time needed to execute the algorithm for an input of size n
● Example: sorting a set of numbers in increasing order when the data is already in increasing order
● E.g., in searching a database for a particular piece of information: the element is at the beginning of the database table.
Worst- / average- / best-case

Average-case running time, A(n): the average time needed to execute the algorithm over all inputs of size n
● May be difficult to define what “average” means.
Worst- / average- / best-case

Example: Linear Search Complexity

● Best case: item found at the beginning: one comparison
● Worst case: item found at the end: n comparisons
● Average case: item may be found at index 0, or 1, or 2, ..., or n - 1
  ○ Average number of comparisons: (1 + 2 + ... + n) / n = (n + 1) / 2
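A minimal Java sketch of linear search (an illustrative addition; the slide gives only the counts) makes the three cases easy to see:

public static int linearSearch(int[] arr, int target) {
    for (int i = 0; i < arr.length; i++) {
        if (arr[i] == target)   // one comparison per element inspected
            return i;           // best case: found at index 0, one comparison
    }
    return -1;                  // worst case: n comparisons
}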
Worst- / average- / best-case
● We are usually interested in the worst-case complexity: the maximum number of operations that might be performed for a given problem size.
● We will not discuss the other cases -- best and average case.
● Best case depends on the input
● Average case is difficult to compute
Worst- / average- / best-case

So we usually focus on worst case analysis:


● Easier to compute
● Usually close to the actual running time
● Crucial to real-time systems (e.g. air-traffic control)
Agenda
● Algorithms
● Complexity Analysis
● Experimental Approach
● Algorithm Analysis
● Worst- / average- / best-case
● Time Complexity
● Asymptotic Complexity
● Big-O Notation
● Comparing Common Growth Functions
Time Complexity

Time Complexity T(n), or the running time of a particular algorithm on input of size n, is
taken to be the number of times the instructions in the algorithm are executed.
Time Complexity - Example

1. n = read input from user .............. 1
2. sum = 0 ............................... 1
3. i = 0 ................................. 1
4. while i < n ........................... n + 1
5.     number = read input from user ..... n
6.     sum = sum + number ................ n + n
7.     i = i + 1 ......................... n + n
8. mean = sum / n ........................ 1 + 1

The computing time for this algorithm in terms of input size n is:
T(n) = 1 + 1 + 1 + (n + 1) + n + 2n + 2n + (1 + 1) = 6n + 6
Time Complexity - General Rules for Estimation

Loops: The running time of a loop is at most the running time of the statements inside the loop times the number of iterations.
Nested Loops: The running time of a statement in the innermost loop is the running time of that statement multiplied by the product of the sizes of all the loops.
Consecutive Statements: Just add the running times of the consecutive statements.
If/Else: Never more than the running time of the test plus the larger of the running times of S1 and S2.
Time Complexity - Loops
● We start by considering how to count operations in for-loops.
○ We use integer division throughout.

● First of all, we should know the number of iterations of the loop; say it is x.
○ Then the loop condition is executed x + 1 times.
○ Each of the statements in the loop body is executed x times
○ The loop-index update statement is executed x times.
We analyse the time complexity of getMax. Assume that the array is filled with n elements, where n >= 2.

public int getMax(int[] array, int n) {
    int currentMax = array[0];       // 1 + 1 + 1
    for (int i = 1; i < n; i++) {    // test executed n times (the loop starts from 1)
        if (array[i] > currentMax) { // 1 + 1 -> n + n
            currentMax = array[i];   // 1 + 1 + 1 -> n + n + n (executed every iteration in the worst case)
        }
    }
    return currentMax;               // 1
}

Hence we have the worst-case time complexity:
T(n) = 3 + n + 2n + 3n + 1 = 6n + 4

Time Complexity - Example

double x, y;                   // 1 + 1
x = 2.5; y = 3.0;              // 1 + 1
for (int i = 0; i < n; i++) {  // the test i < n is executed n + 1 times
    a[i] = x * y;              // 1 + 1 + 1
    x = 2.5 * x;               // 1 + 1
    y = y + a[i];              // 1 + 1 + 1
}

• There are 2 declarations and 2 assignments outside the loop => 4 operations.
• The for loop comprises:
  - a test i < n that is executed n + 1 times
  - a body of 8 operations (assignments, index accesses, arithmetic operations) executed n times

Thus the total number of basic operations is
4 + (n + 1) + 8n = 9n + 5

Time Complexity - Example

Simple Complexity Analysis: Examples
Determine the number of basic operations performed by the method myMethod():

static void myMethod(int n) {
    for (int i = 0; i < n; i++)
        System.out.print(helper(i));
}

static int helper(int n) {
    int sum = 0;
    for (int i = 0; i < n; i++)
        sum = sum + i;
    return sum;
}

The number of basic operations for the helper method on input n is:
1 + 1 + (n * 2 + 1) + 1 = 2n + 4
The myMethod loop calls helper(i) for i = 0, ..., n - 1, so the number of operations for myMethod is roughly:
(2*0 + 4) + (2*1 + 4) + ... + (2(n-1) + 4) + (n + 1) = n² + 4n + 1, which is O(n²).
Agenda
● Algorithms
● Complexity Analysis
● Experimental Approach
● Algorithm Analysis
● Worst- / average- / best-case
● Time Complexity
● Asymptotic Complexity
● Big-O Notation
● Comparing Common Growth Functions
Asymptotic Complexity
● Finding the exact complexity, T(n) = number of basic operations, of an algorithm is
difficult.

● We approximate T(n) by a function g(n) in a way that does not substantially change the magnitude of T(n); that is, g(n) is sufficiently close to T(n) for large values of the input size n.
Asymptotic Complexity
● This "approximate" measure of efficiency is called asymptotic complexity.
● Thus the asymptotic complexity measure does not give the exact number of operations
of an algorithm, but it shows how that number grows with the size of the input.
● This gives us a measure that will work for different operating systems, compilers and
CPUs.
Agenda
● Algorithms
● Complexity Analysis
● Experimental Approach
● Algorithm Analysis
● Worst- / average- / best-case
● Time Complexity
● Asymptotic Complexity
● Big-O Notation
● Comparing Common Growth Functions
Big-O Notation
● The most commonly used notation for specifying asymptotic complexity is the big-O
notation.

● The Big-O notation, O(g(n)), is used to give an upper bound (worst-case) on a positive
runtime function f(n) where n is the input size.
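The slides use big-O informally; for reference, the standard formal definition is: f(n) is O(g(n)) if there exist a real constant c > 0 and an integer constant n0 >= 1 such that f(n) <= c * g(n) for every n >= n0. For example, 7n + 3 is O(n), taking c = 8 and n0 = 3.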
Rules for using big-O
● For large values of the input n, the constants and the terms with lower degree of n are ignored.

1. Multiplicative Constants Rule: ignoring constant factors.
O(c f(n)) = O(f(n)), where c is a constant
Example:
O(20 n³) = O(n³)
Rules for using big-O

2. Addition Rule: ignoring smaller terms.
If O(f(n)) < O(h(n)) then O(f(n) + h(n)) = O(h(n)).
Example:
O(n² log n + n³) = O(n³)

3. Multiplication Rule: O(f(n) * h(n)) = O(f(n)) * O(h(n))
Big O: examples
● T(N) = N²/2 - 3N = O(N²)
● T(N) = 1 + 4N = O(N)
● T(N) = 7N² + 10N + 3 = O(N²)
● T(N) = log N + N = O(N)
Big O - Constant Time

The Big O notation estimates the execution time of an algorithm in relation to the input size. If the time is not related to the input size, the algorithm is said to take constant time, with the notation O(1).

For example, a method that retrieves an element at a given index in an array takes constant time, because the time does not grow as the size of the array increases.
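A minimal illustration (our example, not from the slide):

public static int getElement(int[] arr, int index) {
    return arr[index];   // one index computation and one access: O(1), independent of arr.length
}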
Big O - Repetition: Simple Loops

for (i = 1; i <= n; i++) {   // executed n times
    k = k + 5;               // constant time
}

Time Complexity T(n) = (1 + 1 + 1) * n = 3n = O(n)

Ignore multiplicative constants (e.g., 3).
Big O - Repetition: Nested Loops

for (i = 1; i <= n; i++) {        // outer loop executed n times
    for (j = 1; j <= 20; j++) {   // inner loop executed 20 times
        k = k + i + j;            // constant time
    }
}

Time Complexity T(n) = (n * 20 * 3) + 1 = 60n + 1

Big O: O(n). Ignore multiplicative constants (e.g., 20 * c).
Big O - Repetition: Loops

Loops: for, while, and do-while:
Complexity is determined by the number of iterations in the loop times the complexity of the body of the loop.
Examples:

for (int i = 0; i < n; i++)        // O(n)
    sum = sum - i;

for (int i = 0; i < n * n; i++)    // O(n²)
    sum = sum + i;

i = 1;                             // O(log n)
while (i < n) {
    sum = sum + i;
    i = i * 2;
}
Big O - Repetition: Nested Loops

Nested Loops:
Complexity of inner loop * complexity of outer loop.
Examples:

sum = 0;                           // O(n²)
for (int i = 0; i < n; i++)
    for (int j = 0; j < n; j++)
        sum += i * j;

i = 1;                             // O(n log n)
while (i <= n) {
    j = 1;
    while (j <= n) {
        // statements of constant complexity
        j = j * 2;
    }
    i = i + 1;
}
Big O - Sequence of Statements

Use the Addition Rule:
O(s1; s2; s3; ...; sk) = O(s1) + O(s2) + O(s3) + ... + O(sk) = O(max(s1, s2, s3, ..., sk))

for (int j = 0; j < n * n; j++)
    sum = sum + j;
for (int k = 0; k < n; k++)
    sum = sum - k;
System.out.print("sum is now " + sum);

Complexity is O(n²) + O(n) + O(1) = O(n²)

Big O - Switch

Take the complexity of the most expensive case:

char key;
int[] X = new int[n];
int[][] Y = new int[n][n];
switch (key) {
    case 'a':
        for (int i = 0; i < X.length; i++)      // O(n)
            sum += X[i];
        break;
    case 'b':
        for (int i = 0; i < Y.length; i++)      // O(n²)
            for (int j = 0; j < Y[0].length; j++)
                sum += Y[i][j];
        break;
} // End of switch block

Complexity is O(n²)
Big O - If

Take the complexity of the most expensive case:

char key;
int[][] A = new int[n][n];
int[][] B = new int[n][n];
int[][] C = new int[n][n];
if (key == '+') {
    for (int i = 0; i < n; i++)                     // O(n²)
        for (int j = 0; j < n; j++)
            C[i][j] = A[i][j] + B[i][j];
} // End of if block
else if (key == 'x')
    C = matrixMult(A, B);                           // O(n³)
else
    System.out.println("Error! Enter '+' or 'x'!"); // O(1)

Complexity is O(n³)
Big O – Bonus!!

int[] integers = new int[n];
// ........
if (hasPrimes(integers) == true)
    integers[0] = 20;
else
    integers[0] = -20;

public boolean hasPrimes(int[] arr) {
    for (int i = 0; i < arr.length; i++)
        // ..........
        // ..........
} // End of hasPrimes()
Agenda
● Algorithms
● Complexity Analysis
● Experimental Approach
● Algorithm Analysis
● Worst- / average- / best-case
● Time Complexity
● Asymptotic Complexity
● Big-O Notation
● Comparing Common Growth Functions
Growth-Rate

Comparing common growth functions, from slowest- to fastest-growing: O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ).
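To make the comparison concrete, the small table below (an illustrative addition; the original slides show comparison charts) evaluates these functions at a few input sizes:

n        log₂ n    n log₂ n    n²         n³         2ⁿ
10       ≈ 3.3     ≈ 33        100        1,000      1,024
100      ≈ 6.6     ≈ 664       10,000     10⁶        ≈ 1.3 × 10³⁰
1,000    ≈ 10.0    ≈ 9,966     10⁶        10⁹        ≈ 1.1 × 10³⁰¹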
