1 Introduction

Program: B. Tech
Course Code: AIML303/AIDS303/IOT303
Course Name: Design & Analysis of Algorithms

By:
Dr. Monika Bansal
Course Objectives
Course Outcomes
Course Prerequisites

• Data Structure (AIML201/AIDS201/IOT201)


Grading Scheme
• End-Term Exam: 75 marks
• Internal Assessment: 25 marks
  • Mid-Term Exam: 15 marks
  • Continuous Evaluation: 10 marks
    • Assignments
    • MCQ
    • Class Participation & Responses
Syllabus
Unit I: Introduction to Algorithms
Unit II: Divide and Conquer Algorithms, Greedy Algorithms
Unit III: Dynamic Programming
Unit IV: Graph Algorithms, Computational Complexity
Books

NPTEL Course:
https://nptel.ac.in/courses/106101060
Lab Exercises
Goals of the Course
• What is this course about?
• Algorithms
• Design - How do you create an algorithm?
• Analysis – How efficient is it?
• Correctness – How sure are you that it works for all input?
• Data Structures
• Role in efficient algorithms
• Data structures for common problems
Algorithm

• An algorithm is a finite set of instructions that accomplishes a particular task.
• It is a well-defined computational procedure that takes some value, or set of values, as input and produces some value, or set of values, as output: a sequence of computational steps that transform the input into the output.

[Figure: Input → Algorithm → Output]

Typically, an algorithm must also halt.


Characteristics of Algorithm

1. Input – 0 or more quantities are externally supplied.
2. Output – at least one quantity is produced.
3. Definiteness – each instruction is clear and unambiguous.
4. Finiteness – the algorithm terminates after a finite number of steps.
5. Effectiveness – every instruction must be basic and feasible.
Design and Analysis of Algorithms

• Analysis: predict the cost of an algorithm in terms of resources and performance.
• Design: design algorithms which minimize the cost.
Program vs Algorithm

• Program – programming-language dependent; it need not satisfy the finiteness condition (e.g., an operating system).
• Algorithm – programming-language independent; it has to satisfy all 5 conditions.

• Algorithm Specification
• Flowchart – useful for small and simple algorithms.
• Pseudocode – English-like statements that follow some conventions.
Pseudocode conventions
• Indentation – block structure.
• Loops – while, for, and repeat-until; their interpretation is similar to programming languages such as C, C++, Java, Python, etc.
• Conditions – if-else; its interpretation is similar to programming languages such as C, C++, Java, Python, etc.
• Comment – “//”
• return – immediately transfers control back to the point of call in the calling procedure.
• The keyword error indicates that an error occurred.
Loops
• for

• while

• repeat-until
Conditional Statements
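As a quick illustration (a sketch of our own, not from the slides), the loop and conditional conventions map onto Python like this; repeat-until is emulated with while True ... break, since Python has no repeat-until construct:

def sum_three_ways(n):
    # Sketch, assuming n >= 1: the same sum computed with each loop form.
    total_for = 0
    for i in range(1, n + 1):          # "for i <- 1 to n do"
        total_for += i

    total_while = 0
    i = 1
    while i <= n:                      # "while i <= n do": test before each pass
        total_while += i
        i += 1

    total_repeat = 0
    i = 1
    while True:                        # "repeat ... until i > n": body runs at least once
        total_repeat += i
        i += 1
        if i > n:
            break

    if total_for == total_while == total_repeat:   # if-else
        return total_for
    else:
        return -1                      # plays the role of the "error" keyword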
Algorithm
Pseudocode Example

• Algorithm to find and return maximum of n given numbers
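A minimal Python sketch of such an algorithm (the original pseudocode was shown as a figure, so the function name find_max and the list-based input are our assumptions):

def find_max(numbers):
    # Sketch: return the maximum of n given numbers (assumes a non-empty list).
    maximum = numbers[0]               # start with the first element
    for x in numbers[1:]:              # compare with every remaining element
        if x > maximum:
            maximum = x
    return maximum

# Example: find_max([3, 7, 2, 9, 4]) returns 9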


Pseudocode Example

• Selection Sort algorithm
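For reference, a minimal Python sketch of selection sort (the slide's pseudocode was an image; in-place operation and non-decreasing order are assumptions):

def selection_sort(a):
    # Sketch: sort list a in place into non-decreasing order.
    n = len(a)
    for i in range(n - 1):
        min_idx = i                    # index of the smallest remaining element
        for j in range(i + 1, n):
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]   # put it into position i
    return a

# Example: selection_sort([5, 2, 8, 1]) returns [1, 2, 5, 8]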
Exercise-1

1. Write an iterative algorithm to calculate the factorial of a number.
2. Write a recursive algorithm to calculate the factorial of a number.
3. Write an algorithm to calculate the length of a string.
4. Write an algorithm to sort n elements in non-decreasing order using insertion sort.
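One possible sketch for the first two items (an illustration, not a prescribed solution):

def factorial_iterative(n):
    # Sketch: factorial of a non-negative integer, iteratively.
    result = 1
    for i in range(2, n + 1):
        result *= i
    return result

def factorial_recursive(n):
    # Sketch: the same, recursively; base case 0! = 1! = 1.
    if n <= 1:
        return 1
    return n * factorial_recursive(n - 1)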
Algorithm Complexity

• Understand the Efficiency of Algorithms
• Time complexity + space complexity
• Time Complexity – The time complexity of an
algorithm is the amount of computer time it needs
to run to completion.
• Space Complexity - The space complexity of an
algorithm is the amount of computer memory it
needs to run to completion.
What do we mean by Analysis?

• Analysis is performed with respect to a computational model
• We will usually use a generic uniprocessor
random-access machine (RAM)
• All memory is equally expensive to access
• No concurrent operations
• All reasonable instructions take unit time
• Except, of course, function calls
• Constant word size
• Unless we are explicitly manipulating bits
Our Machine Model

Generic Random Access Machine (RAM)
• Executes operations sequentially
• Set of primitive operations:
  – Arithmetic, logical, comparisons, function calls
• Simplifying assumption: all ops cost 1 unit
  – Eliminates dependence on the speed of our computer, which would otherwise make algorithms impossible to verify and to compare
Space Complexity
• The space complexity of an algorithm is the sum of two components: a fixed part and a variable part, Sp(instance characteristics).
• When analyzing the space complexity of an algorithm, we concentrate solely on estimating Sp.
Space Complexity Computation: Example-1
Space Complexity Computation: Example-2
Space Complexity Computation: Example-3
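The worked examples above appeared as figures; a small Python sketch of the style of reasoning, on an assumed routine that sums an array of n numbers:

def sum_array(a):                      # a holds n numbers
    s = 0                              # one word for s
    for x in a:                        # one word for the loop variable x
        s += x
    return s

# Variable part Sp(n): the input array a itself -> n words.
# Fixed part: s, x and bookkeeping -> a constant number of words.
# So the space requirement is n + c words, i.e. O(n) in total
# (only O(1) extra space beyond the input).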
Exercise-2

Find space requirement for all algorithms given in Exercise-1


Time Complexity
• The time complexity of an algorithm is the sum of two components: compile time and run (execution) time.
• Compile time does not depend on the instance
characteristics.
• Also, we may assume that a compiled program will be run
several times without recompilation.
• Focus is on run time, denoted by tp (instance
characteristics).
Run-time Analysis

• Depends on
• input size
• input quality (partially ordered)
• Kinds of analysis
• Worst case (standard)
• Average case (sometimes)
• Best case (never)
Time Complexity Computation: Example-1
Time Complexity Computation: Example-2
Time Complexity Computation: Example-3
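The worked examples above appeared as figures; a sketch of a step-count (frequency) analysis on the same assumed array-summing routine:

def sum_array(a):                      # step cost x frequency
    s = 0                              #    1     x    1       = 1
    for x in a:                        #    1     x  (n + 1)   = n + 1  (loop test)
        s += x                         #    1     x    n       = n
    return s                           #    1     x    1       = 1

# Total step count f(n) = 2n + 3, so the running time is O(n).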
Exercise-3

Find run-time requirement for all algorithms given in Exercise-1


Order of Growth

• Order of Growth is a concept used to describe how the time or space requirements of an algorithm change relative to the size of the input.
• It provides a high-level understanding of the algorithm's efficiency and performance as the input size increases.
• The order of growth is typically expressed using Big O notation, which categorizes algorithms according to their upper-bound time or space complexity.
Common Functions

Name          Big-Oh         Comment
Constant      O(1)           Can't beat it!
Log log       O(log log N)   Extrapolation search
Logarithmic   O(log N)       Typical time for good searching algorithms
Linear        O(N)           About the fastest an algorithm can run, since we need O(N) just to read the input
N log N       O(N log N)     Most sorting algorithms
Quadratic     O(N^2)         Acceptable when the data size is small (N < 10000)
Cubic         O(N^3)         Acceptable when the data size is small (N < 1000)
Exponential   O(2^N)         Only good for really small input sizes (n <= 20)
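A small, purely illustrative script (our own, not from the slides) that prints a few of these growth rates side by side for increasing n:

import math

def growth_table(sizes=(10, 100, 1000)):
    # Print log n, n log n, n^2 and n^3 for a few input sizes.
    for n in sizes:
        print(f"n={n:>5}  log n={math.log2(n):6.1f}  "
              f"n log n={n * math.log2(n):10.1f}  n^2={n**2:10d}  n^3={n**3:12d}")

growth_table()
# 2^n is left out of the printout: at n = 100 it already has about 30 digits.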
Machine-independent time

BIG IDEAS:
• Ignore machine-dependent constants, otherwise impossible to verify and to compare algorithms

• Look at growth of T(n) as n → ∞ .

“Asymptotic Analysis”
Asymptotic Notation

• Asymptotic efficiency of algorithms
• To do so, we look at input sizes large enough to make only the order of growth of the running time relevant.
• That is, we are concerned with how the running time of an algorithm increases with the size of the input in the limit, as the size of the input increases without bound.
• Usually an algorithm that is asymptotically more efficient will be the best choice for all but very small inputs.
• Asymptotic notations
  • Big O, Ω, Θ notations
Big-Oh Notation: Asymptotic Upper Bound
For a given function g(n), O(g(n)) denotes the set of functions f(n) such that:

• T(n) = f(n) = O(g(n))
• if 0 <= f(n) <= c*g(n) for all n >= n0, where c and n0 are constants > 0
  (We want g(n) to be simple.)

[Figure: f(n) lies on or below c*g(n) for all n >= n0]
Big-Oh Notation: Asymptotic Upper Bound

• T(n) = f(n) = O(g(n)) if 0 <= f(n) <= c*g(n) for all n >= n0, where c and n0 are constants > 0 (want g(n) to be simple).

• O-notation is used to give an upper bound on a function, to within a constant factor. The figure shows the intuition behind O-notation: for all values of n at and to the right of n0, the value of the function f(n) is on or below c*g(n).
• We write f(n) = O(g(n)) to indicate that a function f(n) is a member of the set O(g(n)).
Big-Oh Notation: Asymptotic Upper Bound

– Example: T(n) = 2n + 5 is O(n). Why?
  – 2n + 5 <= 3n for all n >= 5, with c = 3

– T(n) = 5n^2 + 3n + 15 is O(n^2). Why?
  – 5n^2 + 3n + 15 <= 6n^2 for all n >= 6, with c = 6

Ω Notation: Asymptotic Lower Bound

• T(n) = f(n) = Ω(g(n))
• if f(n) >= c*g(n) >= 0 for all n > n0, where c and n0 are constants > 0

[Figure: f(n) lies on or above c*g(n) for all n > n0]

– Example: T(n) = 2n + 5 is Ω(n). Why?
  – 2n + 5 >= 2n for all n > 0, with c = 2
– T(n) = 5n^2 - 3n is Ω(n^2). Why?
  – 5n^2 - 3n >= 4n^2 for all n >= 3, with c = 4
Exercise-4
1. Set an upper bound (Big-Oh ‘O’) for the following functions, i.e. find a function g(n) and constants (c and n0) such that the condition f(n) = O(g(n)) is satisfied.

2. Set a lower bound (Big-Omega ‘Ω’) for the above functions, i.e. find a function g(n) and constants (c and n0) such that the condition f(n) = Ω(g(n)) is satisfied.

Θ Notation: Asymptotic Tight Bound

• T(n) = f(n) = Θ(g(n))
• if 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n > n0, where c1, c2 and n0 are constants > 0

[Figure: f(n) lies between c1*g(n) and c2*g(n) for all n > n0]

Θ-notation

DEF:
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0 }

Basic manipulations:
• Drop low-order terms; ignore leading constants.
• Example: 3n^3 + 90n^2 - 5n + 6046 = Θ(n^3)

Θ Notation: Asymptotic Tight Bound

– Example: T(n) = 2n + 5 is Θ(n). Why?
  – 2n <= 2n + 5 <= 3n for all n >= 5, with c1 = 2, c2 = 3

– T(n) = 5n^2 - 3n is Θ(n^2). Why?
  – 4n^2 <= 5n^2 - 3n <= 5n^2 for all n >= 3, with c1 = 4, c2 = 5
Example-1
What is the time complexity of the code below (matrix multiplication)?

[The code was shown as a figure; the "times" column of its step-count table reads: n+1, n(n+1), n^2, n^2(n+1), n^3]

f(n) = 2n^3 + 3n^2 + 2n + 1

Ans: O(n^3) (c = 3, n0 = 4)

Will it be Ω(n^3) and/or Θ(n^3) also? Why or why not?
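A sketch of the usual triple-nested-loop matrix multiplication whose step counts match the column above (the slide's code was an image, so this reconstruction is an assumption):

def matrix_multiply(A, B, n):                     # times executed
    C = [[0] * n for _ in range(n)]
    for i in range(n):                             # loop test: n + 1
        for j in range(n):                         # loop test: n(n + 1)
            C[i][j] = 0                            # n^2
            for k in range(n):                     # loop test: n^2(n + 1)
                C[i][j] += A[i][k] * B[k][j]       # n^3
    return C

# Adding up the column: (n+1) + n(n+1) + n^2 + n^2(n+1) + n^3 = 2n^3 + 3n^2 + 2n + 1.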


Practice Exercise-1

What is the time complexity of the code below?

function func(n)
1. x ← 0;
2. for i ← 1 to n do
3. for j ← 1 to n do
4. x ← x + (i - j);
5. return(x);
Practice Exercise-2
What is the time complexity of the code below?

function func(n)
1. x ← 0;
2. for i ← 1 to n do
3. for j ← 1 to i do
4. x ← x + (i - j);
5. return(x);
Practice Exercise-3

What is the time complexity of the code below?

function func(n)
1. x ← 0;
2. for i ← 1 to n do
3. for j ← i to n do
4. x ← x + (i - j);
5. return(x);
Practice Exercise-4
What is the time complexity of the code below?

function func(n)
1. x ← 0;
2. for i ← 1 to n do
3. for j ← 1 to  n  do
4. x ← x + (i - j);
5. return(x);
Practice Exercise-5
What is the time complexity of the code below?

function func(n)
1. x ← 0;
2. for i ← 1 to n do
3. for j ← 1 to  i  do
4. x ← x + (i - j);
5. return(x);
Exercise-5

Write pseudocode and find upper bound, lower bound, and tight
bound (θ) for following problems:

1.Binary Search
2.Linear Search
3.Selection Sort
4.Bubble Sort
5.Matrix Multiplication (two and three dimensional)
Big-Oh, Theta, Omega

Tips to guide your intuition:

• Think of O(g(n)) as "greater than or equal to" f(n)
  • Upper bound: "grows slower than or at the same rate as" f(n)
• Think of Ω(g(n)) as "less than or equal to" f(n)
  • Lower bound: "grows faster than or at the same rate as" f(n)
• Think of Θ(g(n)) as "equal to" f(n)
  • "Tight" bound: same growth rate

(True for large N and ignoring constant factors)


Notations which are not asymptotically tight
• Example: 2n = o(n^2)
• Example: n^2/2 = ω(n)
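For reference, the standard definitions of these two notations, written in the same style as the Θ definition above:

o(g(n)) = { f(n) : for any constant c > 0, there exists a constant n0 > 0 such that 0 <= f(n) < c*g(n) for all n >= n0 }
ω(g(n)) = { f(n) : for any constant c > 0, there exists a constant n0 > 0 such that 0 <= c*g(n) < f(n) for all n >= n0 }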
Asymptotic comparison of functions
Let f(n) and g(n) be asymptotically positive; then the following rules hold:
