Algorithms-sheet

The document outlines various sorting algorithms such as Insertion Sort, Counting Sort, and Radix Sort, detailing their methodologies and complexities. It also discusses concepts like minimum subarray sums, expected values, and probability analysis in algorithms, including randomized algorithms and their applications. Additionally, it covers tree structures like Binary Search Trees and Red-Black Trees, emphasizing their properties and operations.


InsertionSort: the array is split into a sorted and an unsorted part; take an element from the unsorted part and place it in the sorted part.
InsertionSort(A): for j = 2 to A.length: key = A[j]; i = j - 1; while i > 0 and A[i] > key: A[i+1] = A[i]; i = i - 1; A[i+1] = key

Asymptotic notation:
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 <= f(n) <= c*g(n) for all n >= n0 }
Omega(g(n)): 0 <= c*g(n) <= f(n) for all n >= n0
Theta(g(n)): 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0

Minimum size subarray sum: In - an array of positive integers and a target integer v. Out - the minimum length of a subarray whose sum > v.
Brute force: Omega(n^2), since there are n(n+1)/2 possible subarrays.
Sliding window, O(n):
MinSubArrayLen(A, v): s = 1; e = 1; sum = 0; min = infinity; while e <= n: sum += A[e]; while sum > v: min = minimum(e - s + 1, min); sum -= A[s]; s = s + 1; e = e + 1; return min (0 if min is still infinity)

CountingSort: IP = F (uses extra arrays), S = T. Sorts integers from a small range A[a1..an], 0 <= ai <= M.
1. Construct a temp array C of size M+1: the indices of C are the values of the input array and C[i] is the number of times the value i appears in the input array.
2. Define an array S of size M+1, where S[i] is the number of elements equal to or smaller than i in A.
Complexity: Theta(n + M) (but M << n, so Theta(n)).
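A minimal Python sketch of the counting steps above, building the C and S arrays and then placing elements stably (the function name and the 0..M value range are assumptions for illustration):

def counting_sort(A, M):
    # C[i] = number of times value i appears in A.
    C = [0] * (M + 1)
    for a in A:
        C[a] += 1
    # S[i] = number of elements <= i (prefix sums of C).
    S = [0] * (M + 1)
    S[0] = C[0]
    for i in range(1, M + 1):
        S[i] = S[i - 1] + C[i]
    # Place elements right to left so equal keys keep their order (stable).
    out = [None] * len(A)
    for a in reversed(A):
        S[a] -= 1
        out[S[a]] = a
    return out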
Egg Dropping: use a dynamic step size, starting with d floors, then (d-1), (d-2), etc. If the egg breaks: search the remaining floors one by one. If the egg survives: jump to the next calculated floor. The maximum number of drops needed is d. To cover n floors: d + (d-1) + (d-2) + ... + 1 >= n; this sum simplifies to d(d+1)/2, so the minimum required d is in O(√n).
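A small sketch of the arithmetic above, returning the smallest step size d with d(d+1)/2 >= n (the helper name is hypothetical):

import math

def min_egg_drop_steps(n):
    # Smallest d such that d*(d+1)/2 >= n, i.e. the worst-case number of drops.
    return math.ceil((math.sqrt(8 * n + 1) - 1) / 2)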
Solving limits: always see if you can simplify first.
1. Only exponents: compare the highest degree in the numerator and denominator and divide both by it.
2. Exponential vs polynomial and log functions: use L'Hopital's rule. Use L'Hopital's rule when the limit has the form 0/0 or inf/inf.

Asymptotic notations:
Transitivity: f(n) = Theta(g(n)) and g(n) = Theta(h(n)) imply f(n) = Theta(h(n)). [Same for the other notations.]
Reflexivity: f(n) = Theta(f(n)), f(n) = O(f(n)), f(n) = Omega(f(n)).
Symmetry: f(n) = Theta(g(n)) if and only if g(n) = Theta(f(n)).
Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Omega(f(n)); f(n) = o(g(n)) if and only if g(n) = omega(f(n)).
Asymptotic classes: 1 < log n < n < n log n = Theta(log(n!)) < n^2 < n^3 < 2^n < n!

Probability:
Expected value: the weighted average of the possible values of X, where X is a function that maps the sample space S to a number: E[X] = sum over all x of x * P(X = x).
Linearity: E[aX + b] = aE[X] + b and E[X + Y] = E[X] + E[Y].
Indicator variables: I{A} = 1 if A occurs, 0 if A does not occur. If X_A = I{A}, then E[X_A] = P(A).
Number of matches: each card in the first deck has a 1 in n chance of matching its paired card, so E[X_i] = P(X_i = 1) = 1/n. The total number of matches is X = sum(i=1 to n) X_i, so the expected number of matches is E[X] = sum(i=1 to n) 1/n = 1.

Bucket Sort (O(n)): IP = F (uses linked lists), S = T. Sorts numbers in the interval (0,1] that are drawn independently from a uniform distribution.

RadixSort: sort from the least to the most significant digit, one digit at a time, using a stable sort per digit. Theta(dn). IP = F (uses counting sort). Stable: T (counting sort).
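A short Python sketch of LSD radix sort as described above; stable per-digit bucketing stands in for the stable counting-sort pass (the function name, base and digit count d are assumptions):

def radix_sort(A, d, base=10):
    # Sort non-negative integers with at most d digits, least significant digit first.
    for exp in range(d):
        buckets = [[] for _ in range(base)]
        for x in A:
            buckets[(x // base**exp) % base].append(x)  # stable: keeps input order within each digit
        A = [x for bucket in buckets for x in bucket]
    return A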
Recurrence analysis: solve recurrences by substitution (guess a bound and verify it by induction), by a recursion tree, or by the Master Theorem (stated below).

Coupon collector: let X_i be the number of coupons you need to buy to get a new coupon while already holding i-1 distinct coupons. The probability of picking a new coupon is p_i = 1 - (i-1)/n = (n-i+1)/n, so E[X_i] = 1/p_i = n/(n-i+1). Then E[X] = sum(i=1 to n) E[X_i] = n * sum(j=1 to n) 1/j. The sum is the n-th harmonic number, about log n + c, so E[X] = n(log n + c) = Theta(n log n).

Closest pair of points: brute force checks C(n,2) = n(n-1)/2 = Theta(n^2) pairs. Divide and conquer: split into subproblems of size 3 or smaller, which are solved in constant time, then join them to get the answer for the bigger set. While merging, check whether there is a point on the left and a point on the right at distance smaller than d, where d = min(d_l, d_r). Merging two subproblems of sizes n1 and n2 costs O(n1 + n2), giving O(n log n) overall.

Probability analysis of a deterministic algorithm (insertion sort): an inversion is a pair (i, j) with i < j but a_i > a_j. Worst case: n(n-1)/2 inversions. Swapping an adjacent pair reduces the number of inversions by 1, so #inversions = #swaps = sum(j=2 to n)(t_j - 1) for insertion sort. Average runtime: at most c*n + c1*E[#inversions] and at least c2*E[#inversions]. For a uniformly random input each pair is inverted with probability 1/2, so E[#inversions] = n(n-1)/4; plugging this into the lower bound gives an average case of Theta(n^2).

BST:
Pre-order traversal: node -> left -> right. In-order: left -> node -> right. Post-order: left -> right -> node.
Recovering a BST from its pre-order traversal (see the sketch after this BST block): the first number is the root; for each next number, if it is smaller than the current node it goes to the left, and if it is larger than the current node but smaller than its nearest greater ancestor it goes to the right.
Recovering a BST from its post-order traversal: reverse the list; the first number is the root; then insert based on greater/smaller as above.
Predecessor: for a node v, if v has a left subtree, the predecessor is the node with the largest key in it; with no left subtree, it is the lowest ancestor whose right subtree contains v.
Successor: for a node v, if v has a right subtree, the successor is the node with the smallest key in it; with no right subtree, it is the lowest ancestor whose left subtree contains v.
Insertion: same as search, but add the node at the end as a leaf.
Deletion: let v be the node to be deleted.
1. v is a leaf: just delete it.
2. v has one child: delete v and put its child in its place.
3. v has two children (let y be v's successor):
3.1 y is v's right child: delete v and put y in its place.
3.2 y is not v's right child: remove y from its place (its right child, possibly empty, takes its spot), then replace v with y.
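A sketch of the pre-order recovery rule above, using the nearest greater ancestor as an upper bound; the minimal Node class and function name are assumptions for illustration:

class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def bst_from_preorder(keys):
    # First key is the root; each later key goes left if smaller than the
    # current node, else right, bounded above by the nearest greater ancestor.
    def build(lo, hi):
        nonlocal i
        if i == len(keys) or not (lo < keys[i] < hi):
            return None
        node = Node(keys[i])
        i += 1
        node.left = build(lo, node.key)
        node.right = build(node.key, hi)
        return node
    i = 0
    return build(float("-inf"), float("inf"))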
Balanced BST: same insert and delete, but rebalance afterwards. Issue with plain BSTs: performance depends on the height h; the major operations take O(h) time, and in the worst case h = n (linear).
Height-balanced BST: the heights of the left and right subtrees differ by at most 1.
Size-balanced BST: the numbers of nodes in the left and right subtrees are approximately the same (ratio between 3/4 and 4/3).
Leaf-balanced BST: like size-balancedness, but counts the number of leaves in the subtrees.

AVL trees; for an unbalanced node:
RR (rotate left): when the right subtree is taller than the left subtree; rotate anticlockwise (child up, node down).
LL (rotate right): when the left subtree is taller than the right subtree; rotate clockwise (child up, node down).
LR: the right subtree of the left child is taller than its left subtree. 1. RR rotation on the left child and its right child, then 2. LL rotation on the unbalanced node and its new left child.
RL: the left subtree of the right child is taller than its right subtree. 1. LL rotation on the right child and its left child, then 2. RR rotation on the unbalanced node and its new right child.
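A sketch of the two basic rotations behind the RR and LL cases above; the minimal Node class is an assumption, and LR/RL are compositions of these two:

class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def rotate_left(node):
    # "RR" fix: right child moves up, node moves down (anticlockwise).
    child = node.right
    node.right = child.left
    child.left = node
    return child  # new subtree root

def rotate_right(node):
    # "LL" fix: left child moves up, node moves down (clockwise).
    child = node.left
    node.left = child.right
    child.right = node
    return child

For LR, apply rotate_left to the unbalanced node's left child and then rotate_right to the node itself; RL is the mirror image.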
Probability analysis of a randomized algorithm (RandQuickSort): let b_1 <= ... <= b_n be the input elements in sorted order and B_{i,j} = {b_i, ..., b_j}. Define X_{i,j} = I{b_i and b_j are compared} (1 if they are compared, 0 otherwise). b_i and b_j will only be compared if b_i or b_j is selected as the first pivot chosen from the elements in B_{i,j}, and all elements in B_{i,j} have the same probability of being the first one chosen as a pivot. Hence P(X_{i,j} = 1) = P(b_i is the first pivot in B_{i,j}) + P(b_j is the first pivot in B_{i,j}) = 2/(j - i + 1). Summing over all pairs, the expected number of comparisons is sum over i < j of 2/(j - i + 1) = O(n log n).
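A compact randomized quicksort sketch matching the analysis above; note that this list-building version is not in place, unlike the partition-based variant the IP = T entry below refers to:

import random

def rand_quicksort(A):
    # Expected O(n log n) comparisons; worst case Theta(n^2) with unlucky pivots.
    if len(A) <= 1:
        return A
    pivot = random.choice(A)
    smaller = [x for x in A if x < pivot]
    equal = [x for x in A if x == pivot]
    larger = [x for x in A if x > pivot]
    return rand_quicksort(smaller) + equal + rand_quicksort(larger)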
Randomised algorithm for picking a 1 from an array with n/2 entries equal to 0 and n/2 equal to 1: each round is independent and hits a 1 with probability 1/2, so the probability that it takes exactly i trials is (1/2)^i. E[T] = sum(i=1 to inf) i * P(it takes i trials) = sum(i=1 to inf) i/2^i = 2 (geometric series).

Verifying matrix multiplication: Theta(n^3) for the deterministic approach. Randomized algorithm (Freivalds' algorithm):
1. Generate an n x 1 random 0/1 vector r.
2. Compute P = A x (B x r) - C x r.
3. Output Yes if P equals the zero vector and No otherwise.
If A x B = C it always returns Yes; if A x B != C it returns Yes with probability at most 1/2. Repeating the algorithm k times, a wrong Yes survives with probability at most (1/2)^k.
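A Python/NumPy sketch of the Freivalds check described above, repeated k times to drive the error probability down to (1/2)^k (the function name and default k are assumptions):

import numpy as np

def freivalds(A, B, C, k=10):
    # One pass is wrong with probability at most 1/2 when A @ B != C.
    n = C.shape[1]
    for _ in range(k):
        r = np.random.randint(0, 2, size=(n, 1))  # random 0/1 column vector
        if not np.array_equal(A @ (B @ r), C @ r):
            return False  # definitely A @ B != C
    return True  # probably A @ B == C (error probability <= (1/2)**k)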
CNF: (x1 ∨ x2 ∨ x3) ∧ (¬x1 ∨ x2 ∨ x4) ∧ (x1 ∨ ¬x2 ∨ x3). In: a 3-CNF formula. Randomized O(n) algorithm: set each variable independently to 1 with probability 1/2 and to 0 otherwise; a clause with three distinct literals is then satisfied with probability 7/8, so in expectation 7/8 of the clauses are satisfied.

Master Theorem, for T(n) = a T(n/b) + f(n):
1. If the recursive part dominates the extra work, i.e. f(n) = O(n^(log_b a - eps)) for some eps > 0: T(n) = Theta(n^(log_b a)).
2. If they are equal, i.e. f(n) = Theta(n^(log_b a)): T(n) = Theta(n^(log_b a) log n).
3. If the extra work f(n) dominates, i.e. f(n) = Omega(n^(log_b a + eps)) and the regularity condition holds: T(n) = Theta(f(n)).
Example: merge sort has T(n) = 2T(n/2) + Theta(n), which is case 2, so T(n) = Theta(n log n).


Stable / In-place:
InsertionSort: IP = T, S = T; worst and average case Theta(n^2).
MergeSort: IP = F, S = T; worst case Theta(n log n).
RandQuickSort: IP = T, S = F; worst case Theta(n^2) (when the random pivot repeatedly picks the smallest element), average Theta(n log n).
When memory is limited, RandQuickSort is the better choice because it is in place; when the previous ordering matters, RandQuickSort is problematic because it is unstable.

Any comparison-based sorting algorithm is big-Omega(n log n) in the worst case. A decision tree represents all possible comparison sorts: non-leaf nodes show comparisons between elements, leaf nodes show final permutations, and each sorting run follows a path from the root to a leaf. The tree must have n! leaf nodes to cover all permutations. A binary tree with h levels has at most 2^(h-1) leaves, so n! <= 2^(h-1). Taking the log of both sides: log(n!) <= h - 1. Since log(n!) = Omega(n log n), h - 1 = Omega(n log n), which proves that comparison-based sorting needs Omega(n log n) comparisons in the worst case.

Binary search: BinarySearch(A, v): s = 1; e = n; while s <= e: m = floor((s + e)/2); if A[m] == v: return m; else if A[m] < v: s = m + 1; else: e = m - 1; return null. Runtime: O(log n).

Two-sum problem: In - a sorted array of integers and a target integer v. Out - Yes if there are a and b such that a + b = v, No otherwise.
Brute force: O(n^2). Better: fix a and use binary search for b = v - a, repeating for each a: O(n log n). Best: linear O(n) two-pointer algorithm: s = 1; e = n; while s < e: sum = A[s] + A[e]; if sum == v: return Yes; else if sum > v: e = e - 1; else: s = s + 1; return No.

Min in a rotated sorted array: use binary search and compare the middle element with the rightmost element; if A[mid] > A[rightmost], search the right half, otherwise search the left half (including mid).
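A sketch of the rotated-array minimum search above, assuming distinct elements (the function name is hypothetical):

def find_min_rotated(A):
    # Binary search: the half "broken" by the rotation holds the minimum.
    s, e = 0, len(A) - 1
    while s < e:
        m = (s + e) // 2
        if A[m] > A[e]:   # minimum must be to the right of m
            s = m + 1
        else:             # minimum is at m or to its left
            e = m
    return A[s]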

Red-Black Trees:
1. The root is black. 2. Each internal node is black or red. 3. Each internal node has two children. 4. Each leaf is an empty node, black by default. 5. Every path from the root to a leaf has the same number of black nodes (the black height). 6. No two consecutive red nodes.

Insertion: let v be the node to be inserted, w its parent, x its grandparent and s the sibling of w (v's uncle).
1. w is black: set v to red; done.
2. w is red and v < w < x or v > w > x; set v to red.
2.1 s is black (and the symmetric case): rotate as shown in the diagram.
2.2 s is red:
2.2.1 recolour (as shown);
2.2.2 if x's parent is black, done; otherwise recolour recursively up the tree.

3. w is red and w < v < x or w > v > x.
3.1 s is black: 3.1.1 rotate (as shown); 3.1.2 the resulting tree is the same as in case 2.1.
3.2 s is red: 3.2.1 recolour (as shown); 3.2.2 if x's parent is black, done; otherwise recolour x recursively.

Deletion: let v be the node to be deleted.
1. v is red and its children are empty nodes: just delete it.
2. v is black and its children are empty nodes: deleting it leaves a double-black empty node that must be repaired.
3. v is black and has exactly one non-empty child: that child must be red; delete v and move the child into its place, recoloured black.
4. v is red and has exactly one non-empty child: impossible.
5. v has two non-empty children (let y be v's successor):
5.1 y is red: then y must have two empty-node children; replace v by y and delete y's old position as in case 1.
5.2 y is black and y's right child z is non-empty (z must be red): 5.2.1 (as shown); 5.2.2 follow case 3 as if we were deleting y.
5.3 y is black and y's right child z is empty: 5.3.1 replace v by y and follow case 2 as if we were deleting y, then repair the double-black node.

Repairing (the symmetric cases are analogous): u = the double-black node, p = its parent, with u the right child of p; s = u's sibling (s cannot be empty); q and r = s's children.
1. s is black, q is red: handled as shown.
2. s is black, q is black, r is red: after the step shown, use case 1 on u.
3. s, q, r are all black:
3.1 p is red: recolour (as shown).
3.2 p is black: keep repairing the double-black node further up.
4. p, q, r are black and s is red: reduces to case 1, 2 or 3.1.
Heaps: greedy algorithms only need insert, find-min and extract-min.
Binary heap (find-min O(1); insert and extract-min O(log n)): a perfect tree satisfying the heap property (the key of a parent is less than the keys of its children). Written as an array indexed from 0, the two children of the node at index i are at 2i+1 and 2i+2.
Heapify (O(log n)): if a node's key is larger than the smallest key among its children, exchange the node with that child, repeating until the heap property is satisfied (see the sketch after this heaps block).
Decrease-key (O(log n)): when a key is inserted or updated, exchange nodes in the same way as heapify until the heap property is satisfied.
Fibonacci heap (find-min and insert O(1); extract-min O(log n)): instead of maintaining one tree, multiple non-binary trees are maintained, each with the heap property (not perfect trees). Each node has at most log n children; if a node has k children, then the size of the subtree rooted at it is at least Fibonacci(k+2). Goal: keep the number of trees small (so find-min is fast) and the tree heights small (so decrease-key, insert and extract-min are fast).
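A sketch of the sift-down step (heapify) for the array layout above, with children at 2i+1 and 2i+2:

def heapify(heap, i):
    # Sift the node at index i down until the min-heap property holds.
    n = len(heap)
    while True:
        smallest = i
        for child in (2 * i + 1, 2 * i + 2):
            if child < n and heap[child] < heap[smallest]:
                smallest = child
        if smallest == i:
            return
        heap[i], heap[smallest] = heap[smallest], heap[i]
        i = smallest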

Hashtables: data with key k is stored at index h(k).
Modulo arithmetic: a = qm + r, e.g. 1000 = 34 x 29 + 14, so 1000 = 14 (mod 29). Addition, subtraction and multiplication carry over to modular arithmetic.
h(k) = ak + b (mod m), where m is a prime number, 1 <= a < m, 0 <= b < m.
A pairwise collision occurs when two different keys hash to the same value; n keys give n(n-1)/2 pairs that could collide.
Observation 1: the sequence h(0), h(1), h(2), ..., h(u-1) has a cycle of period m; in other words, for any k, h(k) = h(k + m) = h(k + 2m) = ...
Observation 2: h(0), h(1), h(2), ..., h(m-1) are all different; hence each of 0, 1, 2, ..., m-1 appears once.
α = n/m (the load factor).
Collision resolution (chaining, see the sketch after this hashing block): each cell in the hash table points to a linked list and can store multiple values with the same hash value. Search, insert and delete cost O(n_i), where n_i is the length of the list at index i = h(k).
Universal hashing: if we pick h uniformly at random from a universal family H, the probability that h(k1) = h(k2) for distinct keys is at most 1/m.
Perfect hashing: construct a hash table with size O(n) in which no collision occurs (perform two levels of hashing).
Open addressing: keep trying until you find an available slot. The sequence h(k,0), h(k,1), h(k,2), ..., h(k,m-1) is called the probing sequence for key k.
Linear probing: h(k,i) = h'(k) + i (mod m), where h' is a usual hash function (bad: causes clustering).
Quadratic probing: h(k,i) = h'(k) + ai + bi^2 (mod m), where b != 0 (the clustering issue of linear probing suggests making "jumps" in the probing sequence).
Double hashing: h(k,i) = h1(k) + i * h2(k) (mod m), where h1 and h2 are two usual hash functions and h2(k) != 0.
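A minimal sketch of chaining as described above, using h(k) = (ak + b) mod m for integer keys; the constants a, b, m and the class name are illustrative assumptions, not prescribed values:

class ChainedHashTable:
    def __init__(self, m=29, a=7, b=3):
        # m should be prime; each bucket is a list acting as the chain.
        self.m, self.a, self.b = m, a, b
        self.buckets = [[] for _ in range(m)]

    def _h(self, k):
        return (self.a * k + self.b) % self.m

    def insert(self, k, value):
        bucket = self.buckets[self._h(k)]
        for pair in bucket:
            if pair[0] == k:
                pair[1] = value  # update an existing key
                return
        bucket.append([k, value])

    def search(self, k):
        for key, value in self.buckets[self._h(k)]:
            if key == k:
                return value
        return None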
Dynamic Programming:

def fibonacci(n):  # O(n)
    if n == 0 or n == 1:
        return 1
    F = [1 for i in range(n + 1)]
    for i in range(2, n + 1):
        F[i] = F[i - 1] + F[i - 2]
    return F[n]

Rod cutting: trying out all combinations gives O(2^n), but the recurrence gives O(n^2). Recurrence: v[n] = max over 1 <= l <= n of (p[l] + v[n - l]).

def rodcutting(n, p):  # assume p is an array with p[0] = 0
    if n <= 1:
        return p[n]
    v = [0 for i in range(n + 1)]
    v[1] = p[1]
    for k in range(2, n + 1):
        runningmax = -1
        for l in range(1, k + 1):
            runningmax = max(runningmax, p[l] + v[k - l])
        v[k] = runningmax
    return v[n]

Longest increasing subsequence (O(n^2)): there are 2^n ways of generating a subsequence, so brute force is big-Omega(2^n). Output: the length of the LIS.

def longest_increasing_subsequence(A):
    n = len(A)
    L = [1 for i in range(n)]
    for i in range(n - 2, -1, -1):
        runningmax = 1
        for j in range(i + 1, n):
            if A[j] >= A[i]:
                runningmax = max(runningmax, 1 + L[j])
        L[i] = runningmax
    return max(L)

Jump Game: compute the minimum number of jumps needed to reach the end of the array, i.e. index n-1.

Longest common subsequence: given two arrays (strings) A and B, find their LCS. The approach discussed runs in O(nm * min{n, m}); with m = n this is O(n^3). Starting from S[0, 0], we trace the pointers; each diagonal move means a new character is added to the final LCS.
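A standard O(nm) table for the LCS length, written over suffixes so the trace starts from S[0][0] as in the note above; this common formulation is an illustration and may differ from the O(nm * min{n, m}) approach the sheet refers to:

def lcs_length(A, B):
    # S[i][j] = length of the LCS of A[i:] and B[j:].
    n, m = len(A), len(B)
    S = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n - 1, -1, -1):
        for j in range(m - 1, -1, -1):
            if A[i] == B[j]:
                S[i][j] = 1 + S[i + 1][j + 1]  # "diagonal" move: the character joins the LCS
            else:
                S[i][j] = max(S[i + 1][j], S[i][j + 1])
    return S[0][0]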
