
Data Structure

Mr Zahi Al Chami
Chap 1: Algorithms, complexity and recursion
Overview
 The learning outcomes are:
 Algorithms, complexity and recursion
 Sorting algorithms
 Pointers and structures
 Singly Linked List
 Doubly Linked List
 Stack
 Queue
 Binary Tree

 All these notions are implemented in C++


Grade Distribution
 The grade is distributed as follows:

 Midterm exam (30%)

 Project (30%)

 Final exam (40%)


Introduction
 An algorithm is a finite series of instructions.

 These instructions are automatically executed.

 An algorithm takes data as input and produces a result.

 For example in C++:


int calculateAge(int yearOfBirth) {
    int age = 2020 - yearOfBirth;   // assumes the current year is 2020
    return age;
}
Introduction
 To write a program, we proceed as follows:
 Obtain a description of the problem
 Analyze the problem
 Develop a pseudo-code
 Write the pseudo-code in a programming language.

 The three criteria used to determine the quality of an algorithm are:
 Reuse: whether we can use it in a different context
 Result: whether the result it produces is good or bad
 Complexity: the amount of resources it uses
Example
 If we declare: int A[100]; // static allocation
 We reserve 100 elements in memory even if we do not need more than 10.
 Testing the equality between two variables requires additional CPU resources.

 Such instructions decrease the computer's performance and saturate its memory.

 It is therefore essential to compare the complexity of different algorithms,
 in order to keep only the most efficient one.
Complexity
 Given an algorithm, we call the following elementary operations:
 Addition and subtraction (+ / -)
 Multiplication, division, remainder, etc.
 Logical operations ( or, and, not, xor)
 Comparison
 Memory access ( to read or write in a variable)

 Algorithm complexity is denoted using O: the Landau (big-O) notation.

 Big O describes the worst-case scenario, and can be used to describe the execution time required or the space used by an algorithm.
Complexity
 We consider f(n) > 0 and g(n) > 0. We say that f(n) has the same order as g(n) if:
 There is a constant k such that f(n) <= k * g(n) for all large enough n -> we can then write f = O(g) (Landau notation).

 For example:
 f(n) = n^2 and g(n) = 3n^2 + 5n + 4 -> these two functions have the same order (each is bounded by a constant multiple of the other).
Time Complexity
 Let us consider an algorithm f(n) which reads an array and tests each cell:
 If the array has 100 cells, the algorithm performs 100 tests.
 If the array has 5000 cells, the algorithm performs 5000 tests.
 If the array has n cells, the algorithm performs n tests -> we say that its complexity is in O(n).

 Another example, g(n): an algorithm that reads an array and tests whether each cell is positive and even. How many tests will it perform?
 Twice as many as the first algorithm, i.e. 2 * n -> g(n) <= 2 * f(n).
 The cost of g is always proportional to that of f, and therefore they are both in O(n), as in the sketch below.
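A minimal sketch of the g(n) algorithm described above (the function name is illustrative): it performs two tests per cell, so roughly 2 * n elementary operations, which is still O(n).

// Counts the cells whose value is positive and even: two tests per cell -> O(n).
int countPositiveEven(int array[], int size) {
    int count = 0;
    for (int i = 0; i < size; i++) {
        if (array[i] > 0 && array[i] % 2 == 0)   // test 1: positive, test 2: even
            count++;
    }
    return count;
}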
Time Complexity
 Do all algorithms have a complexity in O(n)? The answer is NO.

 A factor of 2, 3, 20, ... can be considered negligible.
 More generally, any constant factor can be considered negligible.

 Let us take our algorithm again: it still processes an array with n elements.
 But at each cell, it goes through the whole array from the beginning,
 to find out whether there is another cell that has the same value -> how many tests will it do?
Time Complexity
 For each cell, it must do n tests.

 And since there are n cells, it will have to do n * n tests.

 The factor is not constant this time.

 This algorithm has a complexity of O(n^2), as in the sketch below.
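A minimal sketch of such an algorithm (the function name is illustrative): for every cell, it rescans the whole array looking for another cell with the same value, hence n * n tests and O(n^2).

// Returns true if the array contains a duplicated value.
// The inner loop rescans the whole array for every cell -> n * n tests -> O(n^2).
bool hasDuplicate(int array[], int size) {
    for (int i = 0; i < size; i++) {
        for (int j = 0; j < size; j++) {
            if (j != i && array[j] == array[i])
                return true;
        }
    }
    return false;
}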


Recursion
 Recursion is the process of repeating the same instructions on smaller and smaller instances of a problem.

 A recursive function is a function that calls itself during its execution.
Recursion
 A recursive function always has at least one base case and
one recursive call.
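For example, a minimal factorial sketch: the test on n is the base case, and the call to factorial(n - 1) is the recursive call. The complexity analysis below refers to this kind of function.

// factorial(n): base case when n == 0, recursive call otherwise.
// Each call costs 1 comparison, 1 subtraction and 1 multiplication.
int factorial(int n) {
    if (n == 0)                      // base case
        return 1;
    return n * factorial(n - 1);     // recursive call
}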
Recursion: How to find the complexity of a recursive function
 First, you need to find the recurrence relation.

 Then, expand the recurrence relation at each iteration.

 Let's take the factorial example:

Computing F(n) costs 1 comparison, 1 multiplication and 1 subtraction, plus the time for F(n-1):

T(n) = T(n-1) + 3                                   -> first iteration
     = T(n-2) + 6   (because T(n-1) = T(n-2) + 3)   -> second iteration
     = T(n-3) + 9
     = T(n-4) + 12
     = ...
     = T(n-k) + 3k                                  -> k-th iteration
We know that T(0) = 1, so we need the value of k for which n - k = 0, i.e. k = n:
T(n) = T(0) + 3n = 1 + 3n, which gives a time complexity of O(n).
Recursion: Exercises
 Write a recursive function that calculates the sum of the integers between a and b (given as parameters).

 Write a recursive function that calculates the power of a number.

 What is the output of the algorithm below?

void test(int a) {
    if (a > 0) {
        test(a / 2);
        cout << a % 2;
    }
}
Recursion : Exercise
 Fibonacci is a sequence such that each number is the sum of the
two preceding ones, starting from 0 and 1.

 Write the Fibonacci sequence using an iterative function. What is its complexity?
Recursion with arrays
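The original slide's content isn't reproduced here; below is a minimal sketch of recursion over an array (the function name is illustrative): summing the first n elements by recursing on a shorter prefix.

// Sum of the first n elements of the array, computed recursively.
int sumFirstN(int array[], int n) {
    if (n == 0)                                     // base case: empty prefix
        return 0;
    return array[n - 1] + sumFirstN(array, n - 1);  // recursive call on a shorter prefix
}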
Recursion: Dichotomic search
 A search algorithm that operates by selecting between two
distinct alternatives (dichotomies) at each step.

 It is a specific type of divide and conquer algorithm:


 Divide: split the problem into smaller sub-problems.
 Conquer: solve each sub-problem by calling the function recursively, until the sub-problem becomes trivial.
 Combine: combine the sub-problem solutions to obtain the solution of the original problem.
Example Divide and Conquer Algorithm
Recursion: Binary search
 A well-known example is binary search, which searches a sorted array by repeatedly dividing the search interval in half. Begin with an interval covering the whole array.

 If the value of the search key is less than the item in the middle of the interval, narrow the interval to the lower half. Otherwise, narrow it to the upper half.

 Repeat until the value is found or the interval is empty.
Recursion : Binary search
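The code from the original slide isn't reproduced here; a minimal recursive sketch, assuming a sorted array and inclusive bounds (the function name is illustrative):

// Recursive binary search in a sorted array; returns the index of key, or -1 if absent.
int binarySearch(int array[], int low, int high, int key) {
    if (low > high)                      // base case: empty interval
        return -1;
    int mid = low + (high - low) / 2;    // middle of the current interval
    if (array[mid] == key)
        return mid;
    if (key < array[mid])
        return binarySearch(array, low, mid - 1, key);   // search the lower half
    return binarySearch(array, mid + 1, high, key);      // search the upper half
}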
Recursion
 Find the maximum element in an array using a recursive
function and the idea of divide and conquer.
 Divide the array into sub-arrays
 Calculate the max of each sub-array
[Diagram: the array is split recursively into halves; the maxima of the left and right halves (MaxLeft, MaxRight) are combined at each level, e.g. MaxLeft = 7 and MaxRight = 8 give Max = 8.]
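A minimal sketch of this divide-and-conquer maximum (the function name is illustrative): split the range in half, compute the maximum of each half recursively, and combine by keeping the larger of the two.

// Maximum of array[low..high] by divide and conquer.
int maxElement(int array[], int low, int high) {
    if (low == high)                                      // base case: one element
        return array[low];
    int mid = low + (high - low) / 2;                     // divide
    int maxLeft = maxElement(array, low, mid);            // conquer the left half
    int maxRight = maxElement(array, mid + 1, high);      // conquer the right half
    return (maxLeft > maxRight) ? maxLeft : maxRight;     // combine
}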
Time Complexity
 The time complexity of the algorithm can be one of the following:

 O(1): Constant

 O(log n): Logarithmic

 O(n): Linear

 O(n log n): Linearithmic

 O(n^2): Quadratic

 O(n^3): Cubic

 O(2^n): Exponential

 O(n!): Factorial
Time Complexity: O(1)
 The time complexity of a function is considered O(1) if it contains no loop, no recursion and no call to any non-constant-time function.

 For example: a set of non-recursive, non-loop statements such as the swap function (see the sketch below).

 A loop or recursion that runs a constant number of times is also considered O(1).
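A minimal sketch of such a constant-time function, the classic swap (the name is illustrative):

// Swaps two integers: three assignments, no loop, no recursion -> O(1).
void swapValues(int& a, int& b) {
    int temp = a;
    a = b;
    b = temp;
}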
Time Complexity: O(log n)
 Logarithmic time complexities usually apply to algorithms that
divide problems in half every time.

 For instance, let’s say that we want to look for a person in an old
phone book. It has every name sorted alphabetically. There are at
least two ways to do it:

 Algorithm A
 Start at the beginning of the book and go in order until you find the contact you
are looking for. Run-time O(n)
 Algorithm B
 Open the book in the middle and check the first name on it.
 If the name that you are looking for is alphabetically bigger, look in the right half. Otherwise, look in the left half. Run-time O(log n)
Time Complexity: Example O(log n)
 The binary search algorithm has a complexity of O(log n).

At iteration 1, the length of the array is n.
At iteration 2, the length of the array is n/2.
At iteration 3, the length of the array is (n/2)/2 = n/2^2.
Therefore, after iteration k, the length of the array is n/2^k.
We also know that after k divisions, the length of the array becomes 1.
Therefore n/2^k = 1 => n = 2^k.
Applying the log function on both sides: log2(n) = log2(2^k) => log2(n) = k * log2(2).
Since log_a(a) = 1, we get k = log2(n).
Time Complexity: O(n)
 Linear time complexity O(n) means that as the input grows, the algorithm takes proportionally longer. A function with linear time complexity has a growth rate proportional to the size of its input.

 For example:
 Get the max/min value in an array
 Find a given element in a collection
Time Complexity: Example O(n)
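The code analyzed on this slide isn't reproduced here; below is a sketch consistent with the line-by-line counts (the function name and body are assumptions).

int sumOfDoubles(int array[], int size) {   // line 1
    int sum = 0;                            // line 2: 1 operation
    int i = 0;                              // line 3: 1 operation
    for (; i < size; i++) {                 // line 4: a loop of size n
        int value = array[i];               // line 5: 1 operation
        value = value * 2;                  // line 6: 1 operation
        sum += value;                       // line 7: 1 operation
    }
    return sum;
}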

 If you compute the time complexity of the sketch above, it would be something like this:

 Lines 2-3: 2 operations
 Line 4: a loop of size n
 Lines 5-7: 3 operations per iteration

 So this gives us 3n + 2 -> keeping only the most significant term, we get n. And finally, using big-O notation, we get O(n).
Time Complexity: O(n log n)
 A linearithmic algorithm is slightly slower than a linear one, but still much better than a quadratic one.

 Example of linearithmic algorithms:


 Efficient sorting algorithms like merge sort, quick sort and
others.
Time Complexity: Example O(n log n)
 We have already learned that whenever we halve the problem at every step, the number of steps can be represented by a logarithmic function (log n).

 Also, we perform a single-step operation to find the middle of any subarray, i.e. O(1).

 Finally, merging the n elements of the subarrays requires O(n) time.

 Hence, the total time is O(n log n), as in the merge sort sketch below.
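A compact merge sort sketch illustrating this analysis (using std::inplace_merge for the merge step; the function name is illustrative):

#include <algorithm>
#include <vector>

// Merge sort on v[left, right): log n levels of halving, O(n) merge work per level.
void mergeSort(std::vector<int>& v, std::size_t left, std::size_t right) {
    if (right - left <= 1)                        // base case: 0 or 1 element
        return;
    std::size_t mid = left + (right - left) / 2;  // O(1): find the middle of the subarray
    mergeSort(v, left, mid);                      // sort the left half
    mergeSort(v, mid, right);                     // sort the right half
    std::inplace_merge(v.begin() + left, v.begin() + mid, v.begin() + right);  // O(n) merge
}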


Time Complexity: O(n^2)
 A function with quadratic time complexity has a growth rate of n^2.

 Here are some examples of O(n^2) quadratic algorithms:


 Check if an array has duplicated values
 Sorting elements in an array using bubble sort, insertion sort
and selection sort.
Time Complexity: O(n^c)
 Polynomial running time is represented as O(n^c) with c > 1. As you already saw, two nested loops almost always translate to O(n^2), since the inner loop goes through the array for every iteration of the outer loop.

 Are three nested loops cubic? In most cases, yes!

 Usually, we want to stay away from polynomial running times (quadratic, cubic, O(n^c)) since their cost grows quickly as the input grows.

 However, they are not the worst.


Time Complexity: O(2^n)
 Exponential running time means that the number of calculations performed by an algorithm doubles each time the input grows by one element.

 Examples of exponential-runtime algorithms:

 Power set: finding all the subsets of a set
 Naive recursive Fibonacci (see the sketch below)
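A minimal sketch of the naive recursive Fibonacci: each call spawns two more calls, so the call tree roughly doubles at every level, giving O(2^n).

// Naive recursive Fibonacci: fib(n) calls fib(n - 1) and fib(n - 2),
// so the number of calls roughly doubles at each level -> O(2^n).
int fib(int n) {
    if (n <= 1)                        // base cases: fib(0) = 0, fib(1) = 1
        return n;
    return fib(n - 1) + fib(n - 2);    // two recursive calls
}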
Time Complexity: Example O(2^n)
 Let's imagine you are buying a pizza. The store has many toppings that you can choose from, like pepperoni, mushrooms, bacon, and pineapple.

 Let’s call each topping A, B, C, D. What are your choices?

 You can select no topping (you are on a diet ;), you can choose one topping, a combination of two, a combination of three, or all of them. With n toppings there are 2^n possible combinations (2^4 = 16 here).
Time Complexity: Example O(2^n)

 As expected, if you plot n against f(n), you will notice that the curve is exactly like the function 2^n.

 This algorithm has a running time of O(2^n).


Time Complexity: O(n!)
 The factorial of n is the product of all positive integers less than or equal to n.

 As you might guess, you want to stay away, if possible, from algorithms that have this running time.

 Example of O(n!) factorial runtime algorithms:


 Permutations of a string
 Brute-force search
Time Complexity: Example O(n!)
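The example from the original slide isn't reproduced here; a minimal recursive sketch that prints all permutations of a string (there are n! of them; the function name is illustrative):

#include <iostream>
#include <string>
#include <utility>

// Prints every permutation of s: n choices for the first position,
// n - 1 for the second, and so on -> n! permutations in total.
void permute(std::string& s, std::size_t pos) {
    if (pos == s.size()) {             // base case: one complete permutation
        std::cout << s << '\n';
        return;
    }
    for (std::size_t i = pos; i < s.size(); i++) {
        std::swap(s[pos], s[i]);       // place s[i] at position pos
        permute(s, pos + 1);           // recurse on the remaining suffix
        std::swap(s[pos], s[i]);       // undo the choice (backtrack)
    }
}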
All running complexities graphs
Space Complexity
 Space complexity is a measure of the amount of working
storage an algorithm needs.

 That means how much memory, in the worst case, is


needed at any point in the algorithm.

 The ability to calculate space complexity is essential in


considering an algorithm’s efficiency.
Space Complexity: Example
 Algorithm 1:

int sum(int a, int b) {
    return a + b;
}
 In this particular method, three variables are used and allocated in memory:
 The first int argument, a
 The second int argument, b
 The returned sum result which is also an int

 In C++, a single integer variable typically occupies four bytes of memory. In this example, we have three integer variables.

 This algorithm therefore always takes 12 bytes of memory (3 * 4).

 We can clearly see that the space complexity is constant. So, it can be expressed in big-O
notation as O(1)
Space Complexity: Example
 Algorithm 2:

int sumArray(int array[], int size) {
    int sum = 0;
    for (int i = 0; i < size; i++)
        sum += array[i];
    return sum;
}
 Let’s list all variables present in the above code:
 array – the space taken by the array is equal to 4*n bytes where n is the length of the array
 size – a 4 byte integer
 sum – a 4 byte integer
 i – a 4 byte integer

 The total space needed is 4*n + 4 + 4 + 4 = 4n + 12 bytes. The highest-order term is n, thus the space complexity is O(n).
