DS CHP1 Notes
E.g.: data structures that are created at run time, such as dynamically allocated arrays, are called dynamic data structures.
Implementation Complexity: Linear data structures are relatively easier to implement. Non-linear data structures require a higher level of understanding and are more complex to implement.
Levels: All data elements in a linear data structure exist on a single level. Data elements in a non-linear data structure span multiple levels.
Examples: Linear: Linked List, Queue, Stack, Array. Non-linear: Tree, Graph, Hash Map.
COMPLEXITY OF ALGORITHM
Generally algorithms are measured in terms of time complexity and
space complexity.
1. Time Complexity
for i : 1 to length of A
    if A[i] is equal to x
        return TRUE
return FALSE
Each operation in a computer takes approximately constant time. Let each
operation take c time. The number of lines of code executed actually depends
on the value of x. During the analysis of an algorithm we will mostly consider the
worst-case scenario, i.e., when x is not present in the array A. In the worst case,
the if condition will run N times, where N is the length of the array A. So in the
worst case the total execution time will be (N*c + c): N*c for the if condition and c for
the return statement (ignoring some operations such as the assignment of i).
As we can see, the total time depends on the length of the array A: if the
length of the array increases, the execution time also increases.
Order of growth
Order of growth is how the time of execution depends on the length
of the input, we can clearly see that the time of execution is linearly
depends on the length of the array.
Order of growth helps us compute the running time with
ease. We ignore the lower-order terms, since they
are relatively insignificant for large inputs. We use different
notations to describe the limiting behavior of a function.
Asymptotic Notation
It is used to describe the running time of an algorithm - how much
time an algorithm takes with a given input, n.
There are three different notations:
1) big O, 2) big Theta (Θ), and 3) big Omega (Ω).
1) Time complexity notations
While analysing an algorithm, we mostly consider O-notation because it gives us an
upper limit on the execution time, i.e., the execution time in the worst case.
To compute the O-notation we ignore the lower-order terms, since they
are relatively insignificant for large inputs.
Let f(N) = 2N² + 3N + 5
O(f(N)) = O(2N² + 3N + 5) = O(N²)
Let's consider some example:

int count = 0;
for (int i = 0; i < N; i++)
    for (int j = 0; j < i; j++)
        count++;
1. mul <- 1
2. i <- 1
3. While i <= n do
4.     mul = mul * i
5.     i = i + 1
6. End while
Let T(n) be a function of the algorithm's time complexity.
Lines 1 and 2 have a time complexity of O(1).
Line 3 represents a loop.
As a result, lines 4 and 5 are repeated n times.
As a result, the time complexity of lines 4 and 5 is O(n).
Finally, adding the time complexity of all the lines yields the overall time
complexity of the multiplication function: T(n) = O(n).
The iterative method gets its name because it calculates an iterative
algorithm's time complexity by parsing it line by line and adding up the complexities.
Aside from the iterative method, several other techniques are used in various
cases.
The recursion-tree and substitution methods, for example, are excellent ways
to calculate the time complexity of recursive solutions.
The Master Theorem is another popular method for calculating time
complexity.
Methods for Calculating Space Complexity
This section works through how to calculate space
complexity with an example.
Here is an example that computes the multiplication of
array elements:
1. int mul <- 1, i <- 1
2. While i <= n do
3.     mul <- mul * array[i]
4.     i <- i + 1
5. end while
6. return mul
Space Complexity Explanation:
Let S(n) denote the algorithm's space complexity. In most systems, an integer
occupies 4 bytes of memory.
As a result, the number of allocated bytes is the space complexity.
Line 1 allocates memory space for two integers, so S(n) = 4 bytes
multiplied by 2 = 8 bytes.
Because the algorithm allocates an array of n integers (plus the variable n itself),
the final space complexity is S(n) = 4n + 12 = O(n).
As you progress through this tutorial, you will see some differences between
space and time complexity.
Q. Imagine a classroom of 100 students in which you gave your pen to one
person.
You have to find that pen without knowing to whom you gave it.
Here are some ways to find the pen and what the O order is.
•O(n²): You go and ask the first person in the class if he has the pen. Then you ask
that person about each of the other 99 people in the classroom, whether they have
the pen, and so on.
This is what we call O(n²).
•O(log n): Now I divide the class into two groups, then ask: "Is it on the left side, or
the right side of the classroom?" Then I take that group, divide it into two, and
ask again, and so on. Repeat the process until you are left with the one student who has
your pen. This is what we mean by O(log n).
constant − O(1)
logarithmic − O(log n)
linear − O(n)
quadratic − O(n²)
cubic − O(n³)
polynomial − n^O(1)
exponential − 2^O(n)
Asymptotic analysis of an algorithm refers to defining the mathematical
foundation/framing of its run-time performance.
Using asymptotic analysis, we can very well conclude the best case,
average case, and worst case scenario of an algorithm.
Big-Omega is an asymptotic notation for the best case, or a floor on the growth rate, of
a given function. It gives you an asymptotic lower bound on the growth rate of
an algorithm's runtime.
1) for (i = 0; i < n; i++)
       printf("%d\n", i);
   Statement              Frequency
1) Read n, m              1
2) Let x = 1              1
3) For i = 1 to m         m+1
4)     x = x * n          m
5) Print x                1
6) Stop
1) For i =1 to n do n
4) X = x+1 n *(n-1)*(n-2)
do {
    j++              20
    if (i == 20)     20
        break;       1
    i++              20
} while (i < n)      20
x = n/2        1
{              n/2
    k++;
    j++;       n/2
}
if (x > y)
{
    x = x + 1;
}
else
{
    for (i = 1; i <= N; i++)
    {
        x = x + 1;
    }
}
                                  Frequency
Statement                       x>y     x<=y
if (x > y)                       1       1
{                                1       -
    x = x + 1;                   -       -
}                                -       -
else                             -       -
{                                -       -
    for (i = 1; i <= N; i++)     -      N+1
    {                            -       N
        x = x + 1;               -       -
    }                            -       -
}                                -       -
Total                            2     2N+2
Simple Algorithms and their Complexity as Examples
a) double x[3]
Ans: 24 bytes (3 doubles × 8 bytes each, on typical platforms)