
Data Structures: An Introduction
William Fiset
What is a Data Structure?

A data structure (DS) is a way of organizing data so that it can be used effectively.
Why Data Structures?

They are essential ingredients in creating fast and powerful algorithms.

They help to manage and organize data.

They make code cleaner and easier to understand.
Abstract Data Types vs. Data Structures

Abstract Data Type

An abstract data type (ADT) is an abstraction of a data structure which provides only the interface to which a data structure must adhere.

The interface does not give any specific details about how something should be implemented or in what programming language.
Examples

Abstraction (ADT)   Implementation (DS)
List                Dynamic Array, Linked List
Queue               Linked List based Queue, Array based Queue, Stack based Queue
Map                 Tree Map, Hash Map / Hash Table
Vehicle             Golf Cart, Bicycle, Smart Car
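The ADT/DS split in the table above can be made concrete in code. A minimal Python sketch (not from the slides; the class names are illustrative): one Queue interface, realized by both an array-based and a linked-list-based implementation, exactly as the table pairs them.

```python
from collections import deque

# The ADT: only names the operations, says nothing about how they work.
class Queue:
    def enqueue(self, x): raise NotImplementedError
    def dequeue(self): raise NotImplementedError

# One implementation: backed by a dynamic array (a Python list).
class ArrayQueue(Queue):
    def __init__(self):
        self._data = []
    def enqueue(self, x):
        self._data.append(x)
    def dequeue(self):
        return self._data.pop(0)  # O(n) shift, kept simple for illustration

# Another implementation: backed by a doubly linked list (collections.deque).
class LinkedQueue(Queue):
    def __init__(self):
        self._data = deque()
    def enqueue(self, x):
        self._data.append(x)
    def dequeue(self):
        return self._data.popleft()  # O(1)

# Both adhere to the same interface, so client code cannot tell them apart.
for q in (ArrayQueue(), LinkedQueue()):
    q.enqueue(1)
    q.enqueue(2)
    assert q.dequeue() == 1
```

Client code written against the ADT keeps working no matter which implementation is swapped in underneath.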
Computational Complexity
William Fiset
Complexity Analysis

As programmers, we often find ourselves asking the same two questions over and over again:

How much time does this algorithm need to finish?

How much space does this algorithm need for its computation?
Big-O Notation

Big-O notation gives an upper bound of the computational complexity of an algorithm in the worst case.

This helps us quantify the performance of algorithms as the input size becomes arbitrarily large.
Big-O Notation

n - the size of the input
Complexities ordered from smallest to largest:

Constant Time: O(1)
Logarithmic Time: O(log(n))
Linear Time: O(n)
Linearithmic Time: O(n*log(n))
Quadratic Time: O(n^2)
Cubic Time: O(n^3)
Exponential Time: O(b^n), b > 1
Factorial Time: O(n!)
Big-O Properties

O(n + c) = O(n)
O(cn) = O(n), c > 0

Let f be a function that describes the running time of a particular algorithm for an input of size n:

f(n) = 7log(n)^3 + 15n^2 + 2n^3 + 8

O(f(n)) = O(n^3)

Practical examples coming up, don't worry :)
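A quick numeric check (my own sketch, not from the slides) of why only the dominant 2n^3 term survives in O(f(n)): as n grows, that single term accounts for nearly all of f(n).

```python
import math

def f(n):
    # f(n) = 7*log(n)^3 + 15*n^2 + 2*n^3 + 8, as on the slide
    return 7 * math.log(n) ** 3 + 15 * n ** 2 + 2 * n ** 3 + 8

# Fraction of f(n) contributed by the cubic term alone.
for n in (10, 1000, 100000):
    share = (2 * n ** 3) / f(n)
    print(n, round(share, 4))
```

The printed share climbs toward 1, which is exactly why O(f(n)) = O(n^3): the lower-order terms become negligible for large n.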


Big-O Examples

The following run in constant time: O(1)

a := 1
b := 2
c := a + 5*b

i := 0
While i < 11 Do
    i = i + 1
Big-O Examples

The following run in linear time: O(n)

i := 0
While i < n Do
    i = i + 1

f(n) = n
O(f(n)) = O(n)

i := 0
While i < n Do
    i = i + 3

f(n) = n/3
O(f(n)) = O(n)
Big-O Examples

Both of the following run in quadratic time. The first may be obvious since n work done n times is n*n = O(n^2), but what about the second one?

For (i := 0; i < n; i = i + 1)
    For (j := 0; j < n; j = j + 1)

f(n) = n*n = n^2, O(f(n)) = O(n^2)

For (i := 0; i < n; i = i + 1)
    For (j := i; j < n; j = j + 1)
          ^ replaced 0 with i
Big-O Examples

For a moment, just focus on the second loop. Since i goes from [0,n), the amount of looping done is directly determined by what i is. Remark that if i=0, we do n work, if i=1, we do n-1 work, if i=2, we do n-2 work, etc…

So the question then becomes, what is:

(n) + (n-1) + (n-2) + (n-3) + … + 3 + 2 + 1?

Remarkably, this turns out to be n(n+1)/2, so
O(n(n+1)/2) = O(n^2/2 + n/2) = O(n^2)

For (i := 0; i < n; i = i + 1)
    For (j := i; j < n; j = j + 1)
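The n(n+1)/2 closed form can be confirmed by brute force. A small Python sketch (my own, mirroring the loop structure above) that counts exactly how many times the inner loop body runs:

```python
def inner_iterations(n):
    # Count how many times the inner loop body executes
    # for the nested loops where j starts at i.
    count = 0
    for i in range(n):
        for j in range(i, n):
            count += 1
    return count

# The count matches the closed form n(n+1)/2 from the slide.
for n in (1, 5, 100):
    assert inner_iterations(n) == n * (n + 1) // 2
```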
Big-O Examples

Suppose we have a sorted array and we want to find the index of a particular value in the array, if it exists. What is the time complexity of the following algorithm?

low := 0
high := n-1
While low <= high Do
    mid := (low + high) / 2
    If array[mid] == value: return mid
    Else If array[mid] < value: low = mid + 1
    Else If array[mid] > value: high = mid - 1
return -1 // Value not found

Ans: O(log2(n)) = O(log(n))


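The pseudocode above translates directly into runnable Python (the binary search halves the remaining range each iteration, hence O(log(n))):

```python
def binary_search(array, value):
    # Binary search over a sorted array; returns the index of
    # value if present, otherwise -1.
    low, high = 0, len(array) - 1
    while low <= high:
        mid = (low + high) // 2
        if array[mid] == value:
            return mid
        elif array[mid] < value:
            low = mid + 1   # discard the lower half
        else:
            high = mid - 1  # discard the upper half
    return -1  # value not found

assert binary_search([1, 3, 5, 7, 9], 7) == 3
assert binary_search([1, 3, 5, 7, 9], 4) == -1
```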
Big-O Examples

i := 0
While i < n Do
    j = 0
    While j < 3*n Do
        j = j + 1
    j = 0
    While j < 2*n Do
        j = j + 1
    i = i + 1

f(n) = n * (3n + 2n) = 5n^2
O(f(n)) = O(n^2)
Big-O Examples

i := 0
While i < 3 * n Do
    j := 10
    While j <= 50 Do
        j = j + 1
    j = 0
    While j < n*n*n Do
        j = j + 2
    i = i + 1

f(n) = 3n * (40 + n^3/2) = 120n + 3n^4/2
O(f(n)) = O(n^4)
Big-O Examples

Finding all subsets of a set - O(2^n)

Finding all permutations of a string - O(n!)

Sorting using mergesort - O(n*log(n))

Iterating over all the cells in a matrix of size n by m - O(n*m)
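The first two counts above are easy to verify in Python (a sketch of mine using the standard library, not from the slides): a set of n elements has 2^n subsets, and a string of n distinct characters has n! permutations.

```python
from itertools import combinations, permutations

def all_subsets(items):
    # Enumerate every subset by taking combinations of each size 0..n.
    items = list(items)
    return [set(c) for r in range(len(items) + 1)
            for c in combinations(items, r)]

s = {1, 2, 3}
assert len(all_subsets(s)) == 2 ** len(s)        # 2^n subsets
assert len(list(permutations("abc"))) == 6        # n! = 3! permutations
```

Generating each subset or permutation takes at least constant work, so any algorithm that enumerates them all is bounded below by these exponential and factorial counts.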
