Midterm Topic Data Structure and Algo
A Tree is a hierarchical data structure consisting of nodes, with a single node as the root and zero or
more subtrees. Each node contains a value and references to its children. Trees are widely used to
represent hierarchical relationships and are fundamental in various applications such as file systems and
organizational structures.
Key properties:
The depth of a node is the length of the path from the root to that node.
The height of a tree is the length of the longest path from the root to a leaf.
A Binary Search Tree is a binary tree (each node has at most two children) that satisfies the
following properties:
The left subtree of a node contains only nodes with values less than the node’s value.
The right subtree contains only nodes with values greater than the node’s value.
Both left and right subtrees are also binary search trees.
Key operations:
Search, Insert, Delete: On average O(log n) in a balanced tree, but O(n) in the worst case (a skewed tree).
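A minimal Python sketch (not from the notes) of the two core BST operations, search and insert, following the ordering property described above:

```python
class Node:
    """A node in a binary search tree."""
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert(root, value):
    """Insert value, keeping the BST ordering property."""
    if root is None:
        return Node(value)
    if value < root.value:
        root.left = insert(root.left, value)
    elif value > root.value:
        root.right = insert(root.right, value)
    return root  # duplicates are ignored

def search(root, value):
    """Return True if value is in the tree."""
    if root is None:
        return False
    if value == root.value:
        return True
    if value < root.value:
        return search(root.left, value)
    return search(root.right, value)

root = None
for v in [8, 3, 10, 1, 6]:
    root = insert(root, v)
```

Every comparison discards one subtree, which is why search is O(log n) when the tree stays balanced.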
AVL Trees
An AVL Tree is a self-balancing binary search tree. In an AVL tree, the difference in heights between the
left and right subtrees (the balance factor) is at most 1 for every node. This balancing ensures that the
tree remains approximately balanced, leading to efficient operations.
Key operations:
Rotations: When the balance factor of a node becomes greater than 1 or less than -1 after an
insertion or deletion, rotations (single or double) are performed to restore balance.
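A sketch of the balance factor and a single left rotation (the building block of AVL rebalancing); this is an illustration, not a full AVL implementation:

```python
class AVLNode:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def height(node):
    """Height of a subtree; an empty subtree has height -1."""
    if node is None:
        return -1
    return 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    """height(left) - height(right); AVL requires it to be -1, 0, or 1."""
    return height(node.left) - height(node.right)

def rotate_left(x):
    """Single left rotation: promote x's right child to subtree root."""
    y = x.right
    x.right = y.left
    y.left = x
    return y  # new subtree root

# A right-leaning chain 1 -> 2 -> 3 has balance factor -2 at the root...
a = AVLNode(1)
a.right = AVLNode(2)
a.right.right = AVLNode(3)
root = rotate_left(a)  # ...and one left rotation restores balance
```

A right rotation is the mirror image; "double" rotations are just two of these in sequence.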
Heaps
A Heap is a special tree-based data structure that satisfies the heap property. There are two main types:
Max Heap: The value of each node is greater than or equal to the values of its children. The
maximum value is at the root.
Min Heap: The value of each node is less than or equal to the values of its children. The
minimum value is at the root.
Key properties:
Heaps are often implemented as binary trees, but they are usually stored as arrays for efficiency.
They are commonly used to implement priority queues, where the highest (or lowest) priority
element can be accessed quickly.
Key operations:
Extract Max/Min: Remove the root element and maintain heap properties.
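The array-backed min heap described above is exactly what Python's standard `heapq` module provides, so a priority queue can be sketched as:

```python
import heapq

# heapq implements a min heap on top of a plain list.
heap = []
for v in [5, 1, 8, 3]:
    heapq.heappush(heap, v)   # insert, maintaining the heap property

smallest = heapq.heappop(heap)  # extract-min: removes the root
```

After the pop, the next-smallest element (3) has moved to the root, i.e. index 0 of the array.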
Algorithm analysis involves evaluating the performance of an algorithm in terms of time complexity and
space complexity. The goal is to understand how the algorithm's resource usage grows with input size.
Time Complexity:
Describes the amount of time an algorithm takes to complete as a function of the input size n.
Common notations:
o Big O Notation (O): Upper bound of the running time (worst case).
Space Complexity:
Measures the amount of memory space required by the algorithm as a function of input size n.
Algorithm Strategies
1. Divide and Conquer:
o Breaks a problem into independent subproblems, solves each recursively, and
combines their results.
2. Dynamic Programming:
o Breaks problems into overlapping subproblems and stores the results to avoid
redundant calculations.
3. Greedy Algorithms:
o Makes a series of choices, each of which looks best at the moment, with the hope of
finding a global optimum.
4. Backtracking:
o Explores all potential candidates and abandons those that fail to satisfy the constraints
of the problem.
5. Brute Force:
o Tries all possible solutions and selects the best one. Often inefficient but
straightforward.
1. Sorting Algorithms:
o Quick Sort: Average case O(n log n), worst case O(n²); usually fast in practice despite the worst case.
2. Searching Algorithms:
o Breadth-First Search (BFS): Explores all neighbors at the present depth before moving
on.
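The level-by-level exploration of BFS can be sketched with a queue; the small adjacency-list graph here is a made-up example, not from the notes:

```python
from collections import deque

def bfs(graph, start):
    """Visit nodes level by level; return the visit order."""
    visited = [start]
    queue = deque([start])
    while queue:
        node = queue.popleft()          # FIFO: oldest discovery first
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.append(neighbor)
                queue.append(neighbor)
    return visited

# hypothetical graph: A -> B, C; B -> D; C -> D
graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
```

Swapping the queue for a stack turns this into depth-first search.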
5. String Algorithms:
1. Divide and Conquer (Merge Sort)
Concept: Split the list in half, sort each half, and then merge the sorted halves.
How it works:
Visualization:
2. Dynamic Programming (Fibonacci)
Concept: Compute Fibonacci numbers, storing previously computed values, where
o F(0) = 0
o F(1) = 1
How it works:
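The overlapping-subproblem idea behind the Fibonacci example (F(0) = 0, F(1) = 1) can be sketched with memoization:

```python
def fib(n, memo=None):
    """Fibonacci with memoization: each subproblem is solved only once."""
    if memo is None:
        memo = {}
    if n in memo:
        return memo[n]           # reuse a stored result
    if n < 2:                    # base cases F(0) = 0, F(1) = 1
        return n
    memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]
```

Without the memo dictionary the same subproblems are recomputed exponentially many times; with it the running time is linear in n.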
3. Greedy Algorithms
Concept: Given a set of coin denominations, find the minimum number of coins needed to make
a certain amount.
How it works: Always take the largest denomination possible until the amount is met.
Example:
Amount: 30
Steps:
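A sketch of the greedy rule "always take the largest coin possible"; the notes give the amount (30) but not the denominations, so [25, 10, 5, 1] here is an assumption:

```python
def greedy_change(amount, denominations):
    """Repeatedly take the largest coin that still fits."""
    coins = []
    for coin in sorted(denominations, reverse=True):
        while amount >= coin:
            amount -= coin
            coins.append(coin)
    return coins

# Denominations are assumed, not given in the notes.
result = greedy_change(30, [25, 10, 5, 1])  # take 25, then 5
```

Note the greedy choice is only optimal for some denomination systems; for others (e.g. [25, 10, 1] with amount 30) it gives more coins than necessary, which is why greedy algorithms "hope" for a global optimum.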
4. Backtracking
Concept: Place N queens on an N×N chessboard so that no two queens threaten
each other.
How it works:
o Place a queen in a column, then recursively try to place queens in subsequent columns.
o If a placement doesn’t work, backtrack and try the next position.
Simple Approach:
1. Place a queen in the first safe row of the current column.
2. Move to the next column and repeat.
3. If placing a queen leads to a conflict, backtrack to the previous column and move the queen.
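The column-by-column placement with backtracking can be sketched as follows (a minimal version, not an optimized solver):

```python
def solve_n_queens(n):
    """Return one solution as a list of row indices, one per column."""
    placement = []  # placement[col] = row of the queen in that column

    def safe(row, col):
        for c, r in enumerate(placement):
            # conflict: same row, or a shared diagonal
            if r == row or abs(r - row) == abs(c - col):
                return False
        return True

    def place(col):
        if col == n:
            return True              # all columns filled
        for row in range(n):
            if safe(row, col):
                placement.append(row)
                if place(col + 1):
                    return True
                placement.pop()      # backtrack: undo and try next row
        return False

    return placement if place(0) else None
```

Each failed branch is abandoned as soon as a conflict appears, which is the defining trait of backtracking.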
5. Sorting Algorithms
Concept: Repeatedly step through the list, compare adjacent elements, and swap them if they
are in the wrong order.
How it works:
Example:
List: [5, 1, 4, 2, 8]
Steps:
6. Searching Algorithms
Concept: Efficiently find an element in a sorted list by repeatedly dividing the search interval in
half.
How it works:
Example:
Target: 5
Steps:
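The halving of the search interval can be sketched as below; the notes give the target (5) but not the list, so the sorted list here is an assumed example:

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or -1 if absent."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            low = mid + 1    # target can only be in the right half
        else:
            high = mid - 1   # target can only be in the left half
    return -1

# assumed example list (not from the notes)
index = binary_search([1, 3, 5, 7, 9, 11], 5)
```

Each comparison halves the interval, giving O(log n) time, but only because the list is sorted.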
1. Bubble Sort
Concept: Bubble Sort repeatedly steps through the list, compares adjacent elements, and swaps
them if they are in the wrong order. This "bubbles" the largest unsorted element to its correct
position at the end of the list with each pass.
How it works:
1. Start at the beginning of the list.
2. Compare the first two elements.
3. If the first is greater than the second, swap them.
4. Move to the next pair and repeat until the end of the list.
5. Repeat the process for the entire list until no swaps are needed.
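The five steps above can be sketched directly, including the early exit when a full pass makes no swaps:

```python
def bubble_sort(items):
    """Bubble the largest unsorted element to the end on each pass."""
    items = list(items)  # sort a copy
    n = len(items)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):      # last i elements are already placed
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:                 # no swaps: list is sorted, stop early
            break
    return items
```

On the example list from the notes, `bubble_sort([5, 1, 4, 2, 8])` yields `[1, 2, 4, 5, 8]`.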
2. Selection Sort
Concept: Selection Sort divides the list into a sorted and an unsorted region. It repeatedly selects
the smallest (or largest) element from the unsorted region and moves it to the end of the sorted
region.
How it works:
1. Start with the entire list as unsorted.
2. Find the minimum element in the unsorted portion.
3. Swap it with the first unsorted element.
4. Move the boundary between sorted and unsorted one element forward.
5. Repeat until the entire list is sorted.
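A sketch of the steps above: the sorted region grows one element per pass by selecting the minimum of the rest:

```python
def selection_sort(items):
    """Grow a sorted prefix by selecting the minimum of the remainder."""
    items = list(items)  # sort a copy
    n = len(items)
    for i in range(n - 1):
        min_index = i
        for j in range(i + 1, n):       # find the minimum of the unsorted part
            if items[j] < items[min_index]:
                min_index = j
        # swap it with the first unsorted element
        items[i], items[min_index] = items[min_index], items[i]
    return items
```

Selection sort always does O(n²) comparisons but at most n - 1 swaps, which matters when swaps are expensive.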
3. Insertion Sort
Concept: Insertion Sort builds a sorted list one element at a time by repeatedly taking the next
element from the unsorted portion and inserting it into the correct position in the sorted portion.
How it works:
1. Start with the first element, assuming it’s sorted.
2. Take the next element and compare it with the sorted part.
3. Shift all larger elements in the sorted part one position to the right.
4. Insert the new element into its correct position.
5. Repeat until all elements are sorted.
Time Complexity: O(n²) in the worst case, but O(n) if the array is already sorted.
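The shift-and-insert steps above can be sketched as:

```python
def insertion_sort(items):
    """Insert each element into its place within the sorted prefix."""
    items = list(items)  # sort a copy
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        while j >= 0 and items[j] > key:
            items[j + 1] = items[j]  # shift larger elements one step right
            j -= 1
        items[j + 1] = key           # drop key into the gap
    return items
```

On an already-sorted list the inner while loop never runs, which is where the O(n) best case comes from.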
4. Merge Sort
Concept: Merge Sort is a divide-and-conquer algorithm that divides the list into halves,
recursively sorts each half, and then merges the sorted halves back together.
How it works:
1. If the list has one element, it’s already sorted.
2. Split the list into two halves.
3. Recursively sort each half.
4. Merge the two sorted halves into one sorted list.
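The split / recurse / merge steps above can be sketched as:

```python
def merge_sort(items):
    """Divide and conquer: split, sort each half, merge the results."""
    if len(items) <= 1:
        return list(items)            # base case: already sorted
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # merge the two sorted halves
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])           # one of these is already empty
    merged.extend(right[j:])
    return merged
```

Each level of recursion does O(n) merge work across log n levels, giving the O(n log n) guarantee in every case.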
5. Quick Sort
Concept: Choose a pivot, partition the list around it, and recursively sort the partitions.
How it works:
1. Choose a pivot element from the array.
2. Partition the array into two parts: elements less than the pivot and elements
greater than the pivot.
3. Recursively apply the same process to the sub-arrays.
Time Complexity: O(n log n) on average, but O(n²) in the worst case (rare with good pivot
selection).
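A simple (list-building rather than in-place) sketch of the pivot-and-partition steps above:

```python
def quick_sort(items):
    """Partition around a pivot, then recurse on each side."""
    if len(items) <= 1:
        return list(items)
    pivot = items[len(items) // 2]            # middle element as pivot
    less = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)
```

Production versions partition in place for O(log n) extra space; the worst case O(n²) occurs when the pivot repeatedly splits off only one element.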
Summary Table
Input Size: n is the size of the array or list. For example, if you have an array with 10
elements, then n = 10.
Performance Measurement: The time complexity, like O(n) or O(n²),
describes how the running time of the algorithm grows relative to the size of the input.
O(n) means that as the input size increases, the time taken increases linearly.
O(n²) means that if you double the size of the input, the time taken
increases fourfold.
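The linear-versus-quadratic growth above can be made concrete by counting loop steps instead of measuring wall-clock time (a toy illustration, not a benchmark):

```python
def linear_steps(n):
    """One pass over the input: O(n) steps."""
    return sum(1 for _ in range(n))

def quadratic_steps(n):
    """A pass nested inside a pass: O(n^2) steps."""
    return sum(1 for _ in range(n) for _ in range(n))
```

Doubling n doubles the count for the linear function but quadruples it for the quadratic one, exactly as the notation predicts.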