Algorithm Materials
Unit – 1
1. Asymptotic Notations and their Properties:
- Example: For T(n) = 2T(n/2) + n, guess T(n) = O(n log n), and prove it
using induction.
4. Linear Search:
- Definition: Linear search scans through each element in a list until the
target element is found.
```python
def linear_search(arr, target):
    """
    Scan each element of the list from left to right.
    Parameters: arr (list to search), target (value to find).
    Returns: index of the target if found, otherwise -1.
    """
    for i in range(len(arr)):
        if arr[i] == target:
            return i
    return -1

# Example usage
my_list = [4, 2, 7, 1, 9, 3, 6]
target_value = 7
result = linear_search(my_list, target_value)
if result != -1:
    print(f"Target found at index {result}")
else:
    print("Target not found")
```
In this example, the target value is 7, and it is found at index 2 in the list
`my_list`. The output will be:
```
Target found at index 2
```
Keep in mind that the linear search algorithm has a time complexity of O(n),
where n is the length of the list. It is suitable for small lists or unsorted data.
For larger datasets or sorted data, more efficient algorithms like binary search
may be preferred.
5. Binary Search:
```python
def binary_search(arr, target):
    """
    Perform binary search to find the target in the given sorted list.
    Parameters: arr (sorted list to search), target (value to find).
    Returns: index of the target if found, otherwise -1.
    """
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

# Example usage
sorted_list = [1, 2, 3, 5, 6, 8, 9]
target_value = 6
result = binary_search(sorted_list, target_value)
if result != -1:
    print(f"Target found at index {result}")
else:
    print("Target not found")
```
In this example, the target value is 6, and it is found at index 4 in the sorted
list `sorted_list`. The output will be:
```
Target found at index 4
```
Binary search has a time complexity of O(log n), making it more efficient
than linear search for large datasets, especially when the data is sorted.
- Example: Searching for 7 in the sorted array [1, 2, 5, 7, 8] would find
the element in the fourth position using binary search.
6. Heap Sort:
Heap Sort is a comparison-based sorting algorithm that uses a binary heap data
structure to build a max-heap (or min-heap), and then sorts the heap. In a max-
heap, the value of each node is greater than or equal to the values of its
children. Conversely, in a min-heap, the value of each node is less than or equal
to the values of its children.
The basic idea of the heap sort algorithm involves two main steps:
1. **Heapify the Array:** Convert the input array into a binary heap. This is
done by repeatedly applying heapify operations to build a max-heap.
2. **Extract Elements:** Repeatedly swap the root (the maximum) with the last
element of the heap, shrink the heap by one, and re-heapify the root.
```python
def heapify(arr, n, i):
    largest = i
    left_child = 2 * i + 1
    right_child = 2 * i + 2
    # Check if left child exists and is greater than the root
    if left_child < n and arr[left_child] > arr[largest]:
        largest = left_child
    # Check if right child exists and is greater than the largest so far
    if right_child < n and arr[right_child] > arr[largest]:
        largest = right_child
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        heapify(arr, n, largest)

def heap_sort(arr):
    n = len(arr)
    # Build a max-heap from the bottom up
    for i in range(n // 2 - 1, -1, -1):
        heapify(arr, n, i)
    # Repeatedly move the maximum to the end and shrink the heap
    for i in range(n - 1, 0, -1):
        arr[i], arr[0] = arr[0], arr[i]  # Swap the root with the last element
        heapify(arr, i, 0)

# Example usage
unsorted_array = [4, 10, 3, 5, 1]
heap_sort(unsorted_array)
print(unsorted_array)
```
The `heapify` function is used to maintain the heap property, and `heap_sort`
orchestrates the overall sorting process. In the example, the output will be:
```
[1, 3, 4, 5, 10]
```
Heap Sort has a time complexity of O(n log n) and is an in-place sorting
algorithm.
- Example: Given an array [4, 10, 3, 5, 1], heap sort would arrange it in
ascending order: [1, 3, 4, 5, 10].
7. Rabin-Karp Algorithm:
The Rabin-Karp algorithm finds a pattern in a text by comparing hash values
instead of comparing every character directly:
1. **Hashing:** Compute a hash value for the pattern and for the first window
of the text (the first substring of the pattern's length).
2. **Comparison:** Compare the hash values. If the hash values match, perform
a character-by-character comparison to ensure it's not a spurious hash collision.
3. **Rolling Hash:** As you slide the window of the pattern over the text,
update the hash value using a rolling hash function. This allows you to efficiently
compute the hash value of the new substring based on the previous substring's
hash value.
```python
def rabin_karp(text, pattern, base=256, prime=101):
    text_len, pattern_len = len(text), len(pattern)
    if pattern_len > text_len:
        return []
    h = pow(base, pattern_len - 1, prime)  # base^(m-1), used by the rolling update
    pattern_hash = text_hash = 0
    for i in range(pattern_len):
        pattern_hash = (base * pattern_hash + ord(pattern[i])) % prime
        text_hash = (base * text_hash + ord(text[i])) % prime
    matches = []
    for i in range(text_len - pattern_len + 1):
        # On a hash match, compare characters to rule out spurious collisions
        if pattern_hash == text_hash and text[i:i + pattern_len] == pattern:
            matches.append(i)
        if i < text_len - pattern_len:  # Roll the hash to the next window
            text_hash = (base * (text_hash - ord(text[i]) * h)
                         + ord(text[i + pattern_len])) % prime
    return matches

# Example usage
text = "ABABCABABCDABCABC"
pattern = "ABC"
result = rabin_karp(text, pattern)
if result:
    print("Pattern found at indices:", result)
else:
    print("Pattern not found")
```
In this example, the pattern "ABC" is found in the text at indices 2, 7, 11, and 14.
The output will be:
```
Pattern found at indices: [2, 7, 11, 14]
```
The Rabin-Karp algorithm is useful when you need to search for multiple
patterns in the same text efficiently. It's important to choose a good rolling
hash function and a prime number to avoid hash collisions.
- Example: Searching for the pattern "ABC" in the text "AABC" using
Rabin-Karp would efficiently identify the match.
8. Knuth-Morris-Pratt:
The KMP algorithm avoids re-examining text characters by preprocessing the pattern:
1. **Preprocessing:** Build the LPS array for the pattern, where LPS[i] is the
length of the longest proper prefix of the pattern that is also a suffix of
pattern[0..i].
2. **Searching:** While searching for the pattern in the text, the KMP
algorithm uses the LPS array to determine the maximum number of characters
to skip when a mismatch occurs.
```python
def compute_lps_array(pattern):
    m = len(pattern)
    lps = [0] * m
    j = 0  # Length of the previous longest prefix that is also a suffix
    i = 1
    while i < m:
        if pattern[i] == pattern[j]:
            j += 1
            lps[i] = j
            i += 1
        elif j != 0:
            j = lps[j - 1]  # Fall back to a shorter prefix-suffix
        else:
            lps[i] = 0
            i += 1
    return lps

def kmp_search(text, pattern):
    n = len(text)
    m = len(pattern)
    lps = compute_lps_array(pattern)
    i = j = 0  # i indexes the text, j indexes the pattern
    while i < n:
        if pattern[j] == text[i]:
            i += 1
            j += 1
            if j == m:
                return i - j  # Match found at this starting index
        elif j != 0:
            j = lps[j - 1]  # Reuse the matched prefix; skip re-comparisons
        else:
            i += 1
    return -1

# Example usage
text = "ABABDABACDABABCABAB"
pattern = "ABABCABAB"
result = kmp_search(text, pattern)
if result != -1:
    print(f"Pattern found at index {result}")
else:
    print("Pattern not found")
```
In this example, the pattern "ABABCABAB" is found in the text at index 10.
The output will be:
```
Pattern found at index 10
```
KMP runs in O(n + m) time, where n is the length of the text and m is the
length of the pattern.
9. Matching Algorithm:
"Matching algorithm" can refer to several different techniques depending on the
context, such as string matching, pattern matching, or graph matching. For
string matching, common approaches include:
- **Brute Force:** The simplest method, where you check each position in the
text for a match.
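The brute-force method above can be sketched in a few lines (a minimal illustration; the function name `brute_force_search` is ours):

```python
def brute_force_search(text, pattern):
    """Check every alignment of the pattern against the text: O(n*m) time."""
    n, m = len(text), len(pattern)
    matches = []
    for i in range(n - m + 1):
        if text[i:i + m] == pattern:  # Character-by-character comparison
            matches.append(i)
    return matches

# Example usage
print(brute_force_search("AABAACAADAABAABA", "AABA"))
```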
10. Native String:
1. **C/C++:**
3. **Python:**
4. **JavaScript:**
5. **C# (C Sharp):**
Unit – 2
Applications:
1. BFS Applications:
2. DFS Applications:
1. Order of Visiting:
3. Completeness:
4. Memory Usage:
- BFS: Uses more memory, as it must store all the nodes at the
current level (the frontier).
- DFS: Typically uses less memory, as it only stores the nodes on the
current path (the recursion stack).
5. Applications Example:
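As a concrete illustration of the two traversals compared above, here is a minimal sketch of both on an adjacency-list graph (the example graph is ours):

```python
from collections import deque

def bfs(graph, start):
    """Visit nodes level by level using a queue."""
    visited, order = {start}, []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return order

def dfs(graph, start, visited=None):
    """Visit nodes by going as deep as possible first (recursive)."""
    if visited is None:
        visited = []
    visited.append(start)
    for neighbor in graph[start]:
        if neighbor not in visited:
            dfs(graph, neighbor, visited)
    return visited

# Example usage on a small undirected graph
graph = {'A': ['B', 'C'], 'B': ['A', 'D'], 'C': ['A', 'D'], 'D': ['B', 'C']}
```

Note how BFS visits `A, B, C, D` (level by level) while DFS visits `A, B, D, C` (depth first) on the same graph.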
Prim's Algorithm:
1. Application:
3. Time Complexity:
4. Example:
Kruskal's Algorithm:
1. Application:
2. Steps:
4. Example:
1. Greedy Approach:
2. Data Structures:
3. Complexity:
4. Connection Strategy:
- Prim's: Grows a single tree outward from a starting vertex.
- Kruskal's: Repeatedly adds the globally cheapest edge that does not form
a cycle, merging forests into one tree.
5. Cycle Checking:
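The comparison above can be made concrete with a minimal Prim's sketch using a priority queue (the example graph and weights are ours):

```python
import heapq

def prims_mst(graph, start):
    """Return the total MST weight; graph maps node -> [(weight, neighbor)]."""
    visited = {start}
    edges = list(graph[start])
    heapq.heapify(edges)
    total = 0
    while edges and len(visited) < len(graph):
        weight, node = heapq.heappop(edges)
        if node in visited:
            continue  # Skip edges that would close a cycle
        visited.add(node)
        total += weight
        for edge in graph[node]:
            if edge[1] not in visited:
                heapq.heappush(edges, edge)
    return total

# Example: 4 nodes; the MST uses edges of weight 1, 2, and 3
graph = {
    'A': [(1, 'B'), (4, 'C')],
    'B': [(1, 'A'), (2, 'C'), (5, 'D')],
    'C': [(4, 'A'), (2, 'B'), (3, 'D')],
    'D': [(5, 'B'), (3, 'C')],
}
```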
1. Application:
2. Definition:
3. Algorithm:
4. Complexity:
- Consider a set of jobs and workers with certain skills. The goal is
to maximize the number of jobs assigned to workers based on
compatibility.
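The job-assignment example above can be sketched with the standard augmenting-path method for maximum bipartite matching (a minimal version; the helper names and example graph are ours):

```python
def max_bipartite_matching(graph, n_jobs):
    """graph[worker] lists the jobs that worker can do; returns the match count."""
    match = [-1] * n_jobs  # match[j] = worker currently assigned to job j

    def try_assign(worker, seen):
        for job in graph[worker]:
            if job not in seen:
                seen.add(job)
                # Assign if the job is free, or its worker can move elsewhere
                if match[job] == -1 or try_assign(match[job], seen):
                    match[job] = worker
                    return True
        return False

    return sum(try_assign(w, set()) for w in range(len(graph)))

# Example: 3 workers with overlapping skills, 3 jobs
graph = [[0, 1], [0], [1, 2]]
```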
Dijkstra's Algorithm:
1. Application:
2. Steps:
- Start with the initial node and set tentative distances to all
other nodes to infinity.
3. Time Complexity:
4. Example:
- In a map with cities and roads, Dijkstra's algorithm can find the
shortest path between two cities.
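The steps above can be sketched with a priority-queue implementation (a minimal version; the example graph is ours):

```python
import heapq

def dijkstra(graph, source):
    """graph[u] = [(weight, v), ...]; returns shortest distances from source."""
    dist = {node: float('inf') for node in graph}
    dist[source] = 0
    pq = [(0, source)]  # (tentative distance, node)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue  # Stale entry; a shorter path was already found
        for weight, v in graph[u]:
            if d + weight < dist[v]:
                dist[v] = d + weight
                heapq.heappush(pq, (dist[v], v))
    return dist

# Example: cities connected by one-way roads with distances
graph = {
    'A': [(4, 'B'), (1, 'C')],
    'B': [(1, 'D')],
    'C': [(2, 'B')],
    'D': [],
}
```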
Floyd-Warshall Algorithm:
1. Application:
2. Steps:
3. Time Complexity:
- O(V^3).
4. Example:
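A minimal sketch of the triple-loop dynamic programming behind the O(V^3) bound above (the example matrix is ours):

```python
def floyd_warshall(dist):
    """dist is an n x n matrix; dist[i][j] = direct edge weight or infinity."""
    n = len(dist)
    dist = [row[:] for row in dist]  # Work on a copy
    for k in range(n):  # Allow node k as an intermediate stop
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

# Example: 3 nodes, some pairs only reachable indirectly
INF = float('inf')
dist = [[0, 5, INF],
        [INF, 0, 2],
        [1, INF, 0]]
```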
Bellman-Ford Algorithm:
1. Application:
2. Steps:
3. Time Complexity:
4. Example:
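A minimal Bellman-Ford sketch, including the extra pass that detects negative cycles (the edge list shown is ours):

```python
def bellman_ford(edges, n, source):
    """edges: list of (u, v, w); returns distances, or None on a negative cycle."""
    dist = [float('inf')] * n
    dist[source] = 0
    # Relax every edge n - 1 times
    for _ in range(n - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # One more pass: any further improvement means a negative cycle
    for u, v, w in edges:
        if dist[u] + w < dist[v]:
            return None
    return dist

# Example with a negative edge but no negative cycle
edges = [(0, 1, 4), (0, 2, 5), (1, 3, 7), (2, 1, -2)]
```

Unlike Dijkstra's algorithm, this handles negative edge weights, at the cost of O(V * E) time.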
1. Application:
2. Algorithm:
3. Termination:
Unit – 3
1. Greedy Algorithms:
Optimal Merge Pattern:
- Example Algorithm:
```python
import heapq

def optimal_merge_pattern(sizes):
    total_cost = 0
    heapq.heapify(sizes)
    # Repeatedly merge the two smallest files
    while len(sizes) > 1:
        merged_size = heapq.heappop(sizes) + heapq.heappop(sizes)
        total_cost += merged_size
        heapq.heappush(sizes, merged_size)
    return total_cost
```
Huffman Coding:
- Example Algorithm:
```python
from heapq import heapify, heappop, heappush

def build_huffman_tree(freq):
    # Each heap entry: [weight, [symbol, code], [symbol, code], ...]
    heap = [[weight, [symbol, ""]] for symbol, weight in freq.items()]
    heapify(heap)
    while len(heap) > 1:
        lo = heappop(heap)
        hi = heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]  # Lighter subtree gets prefix '0'
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]  # Heavier subtree gets prefix '1'
        heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return heap[0]
```
Activity Selection Problem:
- Example Algorithm:
```python
def activity_selection(start, finish):
    n = len(finish)
    activities = list(range(n))
    activities.sort(key=lambda x: finish[x])
    selected_activities = [activities[0]]
    last_finish = finish[activities[0]]
    # Select each activity that starts after the last selected one finishes
    for i in range(1, n):
        if start[activities[i]] >= last_finish:
            selected_activities.append(activities[i])
            last_finish = finish[activities[i]]
    return selected_activities
```
2. Dynamic Programming:
Optimal Binary Search Tree:
- Example Algorithm:
```python
def optimal_bst(keys, freq):
    n = len(keys)
    dp = [[0] * n for _ in range(n)]
    for i in range(n):
        dp[i][i] = freq[i]
    # cl is the number of keys in the subproblem keys[i..j]
    for cl in range(2, n + 1):
        for i in range(n - cl + 1):
            j = i + cl - 1
            dp[i][j] = float('inf')
            for r in range(i, j + 1):
                # Cost with keys[r] as the root of keys[i..j]
                c = (dp[i][r - 1] if r > i else 0) + (dp[r + 1][j] if r < j else 0)
                c += sum(freq[i:j + 1])
                if c < dp[i][j]:
                    dp[i][j] = c
    return dp[0][n - 1]
```
Matrix-Chain Multiplication:
- Example Algorithm:
```python
def matrix_chain_order(p):
    # Matrix i has dimensions p[i] x p[i + 1]
    n = len(p) - 1
    m = [[0] * n for _ in range(n)]
    for l in range(2, n + 1):  # l is the chain length
        for i in range(n - l + 1):
            j = i + l - 1
            m[i][j] = float('inf')
            for k in range(i, j):  # Split point between i..k and k+1..j
                q = m[i][k] + m[k + 1][j] + p[i] * p[k + 1] * p[j + 1]
                if q < m[i][j]:
                    m[i][j] = q
    return m[0][n - 1]
```
Multistage Graph:
- Example Algorithm:
```python
def multistage_graph(cost):
    # cost[i][j] is the edge weight from node i to node j, or None if absent
    n = len(cost)
    dist = [float('inf')] * n
    dist[n - 1] = 0
    # Work backwards from the destination node n - 1
    for i in range(n - 2, -1, -1):
        for j in range(i + 1, n):
            if cost[i][j] is not None:
                dist[i] = min(dist[i], cost[i][j] + dist[j])
    return dist[0]
```
Elements:
Unit – 4
1. Backtracking Problems:
a. N-Queen Problem:
1. Problem Statement:
2. Algorithm Steps:
- Start in the leftmost column.
```python
def solve_n_queens(n):
    """Return one solution as a list of column positions per row, or None."""
    def place(cols):
        row = len(cols)  # Next row to fill
        if row == n:
            return cols
        for col in range(n):
            # Check the column and both diagonals against placed queens
            if all(c != col and abs(c - col) != row - r
                   for r, c in enumerate(cols)):
                solution = place(cols + [col])
                if solution:
                    return solution
        return None
    return place([])
```
b. Hamiltonian Cycle Problem:
1. Problem Statement:
2. Algorithm Steps:
```python
def hamiltonian_cycle(graph):
    """graph is an adjacency matrix; returns a cycle as a vertex list, or None."""
    n = len(graph)

    def extend(path):
        if len(path) == n:
            # A cycle must close back to the starting vertex
            return path if graph[path[-1]][path[0]] else None
        for v in range(n):
            if v not in path and graph[path[-1]][v]:
                result = extend(path + [v])
                if result:
                    return result
        return None

    return extend([0])
```
1. Problem Statement:
2. Algorithm Steps:
```python
return True  # Accept the candidate solution once all constraints hold
```
2. Optimization Problems:
1. Problem Statement:
2. Algorithm Steps:
b. Knapsack Problem:
1. Problem Statement:
2. Algorithm Steps:
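A minimal backtracking sketch of the 0/1 knapsack decision at each item, include it or skip it (the helper name `best` and the example data are ours):

```python
def knapsack(weights, values, capacity):
    """Return the best total value using backtracking over include/exclude."""
    def best(i, remaining):
        if i == len(weights):
            return 0
        # Option 1: skip item i
        result = best(i + 1, remaining)
        # Option 2: take item i if it fits
        if weights[i] <= remaining:
            result = max(result, values[i] + best(i + 1, remaining - weights[i]))
        return result
    return best(0, capacity)

# Example usage
weights = [1, 3, 4, 5]
values = [1, 4, 5, 7]
```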
3. Assignment Problem:
1. Problem Statement:
2. Algorithm Steps:
```python
# A full Hungarian-algorithm implementation is lengthy; as a sketch,
# SciPy's linear_sum_assignment (which implements it) can be wrapped:
from scipy.optimize import linear_sum_assignment

def hungarian_algorithm(cost_matrix):
    rows, cols = linear_sum_assignment(cost_matrix)
    return list(zip(rows, cols))  # Optimal (worker, job) pairs
```
Unit – 5
1. Approximation Algorithm for TSP:
b. Christofides Algorithm:
- Christofides algorithm guarantees a solution within 3/2 times the
optimal for metric TSP instances.
c. Performance Characteristics:
d. Example:
e. Limitations:
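The notes above mention Christofides' 3/2 guarantee; a simpler baseline is the MST preorder-walk heuristic, which gives a 2-approximation on metric instances (a minimal sketch; all names here are ours):

```python
def tsp_two_approx(dist):
    """MST preorder-walk 2-approximation for metric TSP (symmetric matrix)."""
    n = len(dist)
    # Build an MST with a simple Prim's algorithm
    in_tree = [False] * n
    parent = [0] * n
    key = [float('inf')] * n
    key[0] = 0
    for _ in range(n):
        u = min((v for v in range(n) if not in_tree[v]), key=lambda v: key[v])
        in_tree[u] = True
        for v in range(n):
            if not in_tree[v] and dist[u][v] < key[v]:
                key[v], parent[v] = dist[u][v], u
    # Walk the MST in preorder to obtain a tour
    children = {v: [] for v in range(n)}
    for v in range(1, n):
        children[parent[v]].append(v)
    tour = []
    def preorder(v):
        tour.append(v)
        for c in children[v]:
            preorder(c)
    preorder(0)
    return tour + [0]  # Return to the start to close the tour
```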
2. Randomized Algorithms:
v. Example:
- Items are placed in the next bin if they fit, else a new bin is
used.
c. Characteristics:
d. Example:
e. Limitations:
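The next-fit rule described above ("place the item in the current bin if it fits, else open a new bin") can be sketched as follows (the function name and example data are ours):

```python
def next_fit(items, capacity):
    """Place each item in the current bin if it fits, else open a new bin."""
    bins = []
    remaining = 0  # Space left in the current (last) bin
    for item in items:
        if item <= remaining:
            bins[-1].append(item)
            remaining -= item
        else:
            bins.append([item])  # Open a new bin
            remaining = capacity - item
    return bins

# Example usage
items = [4, 8, 1, 4, 2, 1]
```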