Midterm Topic Data Structure and Algo

Lectures on data structures important key points

Uploaded by Rashi MrBRD

Tree ADT (Abstract Data Type)

A Tree is a hierarchical data structure consisting of nodes, with a single node as the root and zero or
more subtrees. Each node contains a value and references to its children. Trees are widely used to
represent hierarchical relationships and are fundamental in various applications such as file systems and
organizational structures.

Key properties:

 A tree with n nodes has n − 1 edges.

 The depth of a node is the length of the path from the root to that node.

 The height of a tree is the length of the longest path from the root to a leaf.

Binary Search Tree (BST)

A Binary Search Tree is a binary tree (each node has at most two children) in which the node
values are ordered. It satisfies the following properties:

 The left subtree of a node contains only nodes with values less than the node’s value.

 The right subtree contains only nodes with values greater than the node’s value.

 Both left and right subtrees are also binary search trees.

Key operations:

 Insertion: Place a new value in the correct position.

 Deletion: Remove a node while maintaining BST properties.

 Search: Efficiently locate a value in the tree.
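A minimal sketch of insertion and search (the node class and function names are illustrative, not from the lecture):

```python
class Node:
    """A single BST node holding a value and two child links."""
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert(root, value):
    """Insert value, returning the (possibly new) root of this subtree."""
    if root is None:
        return Node(value)
    if value < root.value:
        root.left = insert(root.left, value)
    elif value > root.value:
        root.right = insert(root.right, value)
    return root  # duplicates are ignored

def search(root, value):
    """Return True if value is present in the tree."""
    if root is None:
        return False
    if value == root.value:
        return True
    return search(root.left, value) if value < root.value else search(root.right, value)
```

Because each comparison discards one subtree, search takes time proportional to the tree's height: O(log n) when balanced, O(n) in the worst (degenerate) case.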

AVL Trees

An AVL Tree is a self-balancing binary search tree. In an AVL tree, the difference in heights between the
left and right subtrees (the balance factor) is at most 1 for every node. This balancing ensures that the
tree remains approximately balanced, leading to efficient operations.

Key operations:

 Rotations: When the balance factor of a node becomes greater than 1 or less than -1 after an
insertion or deletion, rotations (single or double) are performed to restore balance.

 Height Maintenance: AVL trees maintain height for efficient balancing.
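One way to sketch the balance factor and a single right rotation, the simplest of the rebalancing moves (class and function names are illustrative):

```python
class AVLNode:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None
        self.height = 1  # height of the subtree rooted at this node

def height(node):
    return node.height if node else 0

def balance_factor(node):
    """Left height minus right height; an AVL tree keeps this in {-1, 0, 1}."""
    return height(node.left) - height(node.right)

def rotate_right(y):
    """Single right rotation: lifts y.left up to be the new subtree root."""
    x = y.left
    y.left = x.right
    x.right = y
    y.height = 1 + max(height(y.left), height(y.right))
    x.height = 1 + max(height(x.left), height(x.right))
    return x
```

A mirror-image left rotation handles the symmetric case, and double rotations combine the two.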

Heaps
A Heap is a special tree-based data structure that satisfies the heap property. There are two main types:

 Max Heap: The value of each node is greater than or equal to the values of its children. The
maximum value is at the root.

 Min Heap: The value of each node is less than or equal to the values of its children. The
minimum value is at the root.

Key properties:

 Heaps are usually shaped as complete binary trees, which lets them be stored compactly in arrays for efficiency.

 They are commonly used to implement priority queues, where the highest (or lowest) priority
element can be accessed quickly.

Key operations:

 Insert: Add a new element and maintain heap properties.

 Extract Max/Min: Remove the root element and maintain heap properties.
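In Python, both operations are available through the standard-library heapq module, which maintains a min heap over a plain list:

```python
import heapq

heap = []
for n in [5, 1, 8, 3]:
    heapq.heappush(heap, n)   # insert: add and restore the heap property

smallest = heapq.heappop(heap)  # extract-min: remove and return the root
```

For a max heap, a common trick is to push negated values and negate again on extraction.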

Basic Algorithm Analysis

Algorithm analysis involves evaluating the performance of an algorithm in terms of time complexity and
space complexity. The goal is to understand how the algorithm's resource usage grows with input size.

Time Complexity:

 Describes the amount of time an algorithm takes to complete as a function of the input size n.

 Common notations:

o Big O Notation (O): Upper bound on the running time; most often used to describe the worst case.

o Big Theta Notation (Θ): Tight bound; the running time grows at exactly this rate (both an upper and a lower bound).

o Big Omega Notation (Ω): Lower bound on the running time; most often used to describe the best case.

Space Complexity:

 Measures the amount of memory space required by the algorithm as a function of input size n.

Algorithm Strategies

Here are some common algorithmic strategies used to solve problems:

1. Divide and Conquer:


o Breaks a problem into smaller subproblems, solves them independently, and combines
their results.

o Example: Merge Sort, Quick Sort.

2. Dynamic Programming:

o Breaks problems into overlapping subproblems and stores the results to avoid
redundant calculations.

o Example: Fibonacci sequence, Knapsack problem.

3. Greedy Algorithms:

o Makes a series of choices, each of which looks best at the moment, with the hope of
finding a global optimum.

o Example: Prim's algorithm for Minimum Spanning Tree, Huffman coding.

4. Backtracking:

o Explores all potential candidates and abandons those that fail to satisfy the constraints
of the problem.

o Example: N-Queens problem, Sudoku solver.

5. Brute Force:

o Tries all possible solutions and selects the best one. Often inefficient but
straightforward.

o Example: Finding the shortest path by checking all paths.

Classic Algorithms for Common Tasks

1. Sorting Algorithms:

o Bubble Sort: Simple comparison sort, O(n²) time complexity.

o Merge Sort: Divide and conquer, O(n log n) time complexity.

o Quick Sort: Average case O(n log n), worst case O(n²) but usually faster in practice.

2. Searching Algorithms:

o Linear Search: O(n) time complexity, scans through the list.

o Binary Search: O(log n) time complexity, works on sorted arrays.


3. Graph Algorithms:

o Dijkstra’s Algorithm: Finds the shortest path in weighted graphs.

o Depth-First Search (DFS): Explores as far as possible along each branch.

o Breadth-First Search (BFS): Explores all neighbors at the present depth before moving
on.

4. Dynamic Programming Examples:

o Fibonacci Sequence: Recursive vs. iterative vs. dynamic programming approach.

o Longest Common Subsequence: Finding the longest subsequence present in both of two sequences.

5. String Algorithms:

o Knuth-Morris-Pratt (KMP): Efficient substring search algorithm.

o Rabin-Karp: Uses hashing to find patterns in a string.

1. Divide and Conquer

Example: Merge Sort

 Concept: Split the list in half, sort each half, and then merge the sorted halves.

 How it works:

1. If the list has one element or is empty, it’s already sorted.

2. Split the list into two halves.

3. Recursively sort each half.

4. Merge the two sorted halves back together.

Visualization:

 List: [38, 27, 43, 3, 9, 82, 10]

 Split: [38, 27, 43] and [3, 9, 82, 10]

 Further split: [38], [27, 43], [3, 9], [82, 10]

 Merge: [27, 38, 43], [3, 9, 10, 82]

 Final merge: [3, 9, 10, 27, 38, 43, 82]
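The split-sort-merge steps above can be sketched as follows (function names are illustrative):

```python
def merge_sort(items):
    """Sort a list by splitting it, recursively sorting each half, and merging."""
    if len(items) <= 1:          # base case: empty or single-element list
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    return merge(left, right)

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i]); i += 1
        else:
            result.append(right[j]); j += 1
    result.extend(left[i:])   # one of these is empty; the other is appended
    result.extend(right[j:])
    return result
```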


2. Dynamic Programming

Example: Fibonacci Sequence

 Concept: The Fibonacci sequence is defined as:

o F(0) = 0

o F(1) = 1

o F(n) = F(n−1) + F(n−2) for n ≥ 2

 How it works:

o Instead of recalculating Fibonacci numbers, store previously computed values.
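A memoized sketch of this idea, where each F(k) is computed only once and then looked up:

```python
def fib(n, memo=None):
    """Fibonacci with memoization: O(n) instead of exponential recursion."""
    if memo is None:
        memo = {}
    if n < 2:                 # base cases F(0) = 0, F(1) = 1
        return n
    if n not in memo:
        memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]
```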

3. Greedy Algorithms

Example: Coin Change Problem

 Concept: Given a set of coin denominations, find the minimum number of coins needed to make
a certain amount.

 How it works: Always take the largest denomination possible until the amount is met.

Example:

 Denominations: [25, 10, 5, 1]

 Amount: 30

 Steps:

1. Take one 25-cent coin.

2. Take one 5-cent coin.

 Total coins used: 2
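A sketch of the greedy rule above. Note the hedge built into it: the greedy choice happens to be optimal for canonical coin systems like [25, 10, 5, 1], but it can fail for arbitrary denominations.

```python
def greedy_coin_change(denominations, amount):
    """Repeatedly take the largest coin that still fits.
    Optimal for canonical systems such as [25, 10, 5, 1],
    but not guaranteed optimal for arbitrary denominations."""
    coins = []
    for coin in sorted(denominations, reverse=True):
        while amount >= coin:
            coins.append(coin)
            amount -= coin
    return coins
```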

4. Backtracking

Example: N-Queens Problem

 Concept: Place N queens on an N × N chessboard so that no two queens threaten
each other.

 How it works:

o Place a queen in a column, then recursively try to place queens in subsequent columns.
o If a placement doesn’t work, backtrack and try the next position.

Simple Approach:

1. Start from the first column.

2. Place a queen and move to the next column.

3. If placing a queen leads to a conflict, backtrack to the previous column and move the queen.
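The place-and-backtrack loop can be sketched as follows; here a solution is a list of column indices, one per row (names are illustrative):

```python
def solve_n_queens(n):
    """Return one solution as a list of column indices (one per row),
    or None if no solution exists."""
    def safe(cols, col):
        row = len(cols)  # the row we are about to fill
        # No shared column and no shared diagonal with any placed queen.
        return all(c != col and abs(c - col) != row - r
                   for r, c in enumerate(cols))

    def place(cols):
        if len(cols) == n:       # all rows filled: solution found
            return cols
        for col in range(n):
            if safe(cols, col):
                result = place(cols + [col])
                if result is not None:
                    return result
        return None              # backtrack: no column works in this row

    return place([])
```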

5. Sorting Algorithms

Example: Bubble Sort

 Concept: Repeatedly step through the list, compare adjacent elements, and swap them if they
are in the wrong order.

 How it works:

o Repeat until no swaps are needed.

Example:

 List: [5, 1, 4, 2, 8]

 Steps:

o Compare 5 and 1, swap: [1, 5, 4, 2, 8]

o Compare 5 and 4, swap: [1, 4, 5, 2, 8]

o Compare 5 and 2, swap: [1, 4, 2, 5, 8]

o Compare 5 and 8, no swap.

 Repeat until sorted: [1, 2, 4, 5, 8]

6. Searching Algorithms

Example: Binary Search

 Concept: Efficiently find an element in a sorted list by repeatedly dividing the search interval in
half.

 How it works:

1. Compare the target value to the middle element.

2. If equal, you’re done.


3. If the target is less, search the left half; if greater, search the right half.

Example:

 Sorted list: [1, 3, 5, 7, 9]

 Target: 5

 Steps:

o Middle is 5 (found it in one step!).
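The halving loop can be sketched as follows (returning the index of the match, or -1 if the target is absent):

```python
def binary_search(sorted_list, target):
    """Return the index of target in sorted_list, or -1 if not found."""
    low, high = 0, len(sorted_list) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_list[mid] == target:
            return mid
        elif sorted_list[mid] < target:
            low = mid + 1     # target must be in the right half
        else:
            high = mid - 1    # target must be in the left half
    return -1
```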

1. Bubble Sort

Concept: Bubble Sort repeatedly steps through the list, compares adjacent elements, and swaps
them if they are in the wrong order. This "bubbles" the largest unsorted element to its correct
position at the end of the list with each pass.

 How it works:
1. Start at the beginning of the list.
2. Compare the first two elements.
3. If the first is greater than the second, swap them.
4. Move to the next pair and repeat until the end of the list.
5. Repeat the process for the entire list until no swaps are needed.

Time Complexity: O(n²) - inefficient for large lists.
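A sketch of the passes described above, including the early-exit check that gives the O(n) best case on already-sorted input:

```python
def bubble_sort(items):
    """In-place bubble sort with early exit when a pass makes no swaps."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):  # the last i elements are already in place
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:             # no swaps: the list is sorted
            break
    return items
```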


2. Selection Sort

Concept: Selection Sort divides the list into a sorted and an unsorted region. It repeatedly selects
the smallest (or largest) element from the unsorted region and moves it to the end of the sorted
region.

 How it works:
1. Start with the entire list as unsorted.
2. Find the minimum element in the unsorted portion.
3. Swap it with the first unsorted element.
4. Move the boundary between sorted and unsorted one element forward.
5. Repeat until the entire list is sorted.

Time Complexity: O(n²) - consistent performance regardless of the initial order.
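The sorted/unsorted boundary described above can be sketched as:

```python
def selection_sort(items):
    """In-place selection sort: grow the sorted prefix one element at a time."""
    n = len(items)
    for i in range(n - 1):
        min_index = i
        for j in range(i + 1, n):        # find the smallest unsorted element
            if items[j] < items[min_index]:
                min_index = j
        items[i], items[min_index] = items[min_index], items[i]
    return items
```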

3. Insertion Sort

Concept: Insertion Sort builds a sorted list one element at a time by repeatedly taking the next
element from the unsorted portion and inserting it into the correct position in the sorted portion.

 How it works:
1. Start with the first element, assuming it’s sorted.
2. Take the next element and compare it with the sorted part.
3. Shift all larger elements in the sorted part one position to the right.
4. Insert the new element into its correct position.
5. Repeat until all elements are sorted.

Time Complexity: O(n²) in the worst case, but O(n) if the array is already sorted.
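The shift-and-insert steps can be sketched as:

```python
def insertion_sort(items):
    """In-place insertion sort: insert each element into the sorted prefix."""
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        while j >= 0 and items[j] > key:  # shift larger elements one slot right
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = key                # drop key into its correct position
    return items
```

When the input is already sorted, the inner while loop never runs, which is where the O(n) best case comes from.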

4. Merge Sort

Concept: Merge Sort is a divide-and-conquer algorithm that divides the list into halves,
recursively sorts each half, and then merges the sorted halves back together.

 How it works:
1. If the list has one element, it’s already sorted.
2. Split the list into two halves.
3. Recursively sort each half.
4. Merge the two sorted halves into one sorted list.

Time Complexity: O(n log n) - efficient for large datasets.

5. Quick Sort

Concept: Quick Sort is also a divide-and-conquer algorithm. It selects a 'pivot' element,
partitions the other elements into two sub-arrays according to whether they are less than or
greater than the pivot, and then recursively sorts the sub-arrays.

 How it works:
1. Choose a pivot element from the array.
2. Partition the array into two parts: elements less than the pivot and elements
greater than the pivot.
3. Recursively apply the same process to the sub-arrays.

Time Complexity: O(n log n) on average, but O(n²) in the worst case (rare with good pivot
selection).
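The pivot-and-partition steps can be sketched as follows. This version builds new lists for clarity rather than partitioning in place, and the middle-element pivot is one common choice among several:

```python
def quick_sort(items):
    """Quick sort via explicit partitions (simple, not in-place)."""
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]             # middle element as pivot
    less    = [x for x in items if x < pivot]
    equal   = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)
```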

Summary Table

Algorithm        Best Case    Average Case   Worst Case   Stable

Bubble Sort      O(n)         O(n²)          O(n²)        Yes
Selection Sort   O(n²)        O(n²)          O(n²)        No
Insertion Sort   O(n)         O(n²)          O(n²)        Yes
Merge Sort       O(n log n)   O(n log n)     O(n log n)   Yes
Quick Sort       O(n log n)   O(n log n)     O(n²)        No

 Input Size: n is the size of the array or list. For example, if you have an array with 10
elements, then n = 10.
 Performance Measurement: The time complexity, like O(n) or O(n²), describes how the
running time of the algorithm grows relative to the size of the input.
 O(n) means that as the input size increases, the time taken increases linearly.
 O(n²) means that if you double the size of the input, the time taken increases
roughly fourfold.
