CS502 Midterm Exam Study Guide (Chapters 1–6)

Chapter 1: Introduction

Definition and Origins of Algorithms

The term 'algorithm' originates from Al-Khwarizmi, a mathematician whose works introduced the concept of systematic problem-solving methods.
An algorithm is a finite sequence of well-defined instructions to solve a problem or perform
a computation.
Example: Recipe steps to bake a cake.

Algorithm Design and Efficiency

Good algorithm design emphasizes simplicity, efficiency, and adaptability to avoid bottlenecks in computational performance.
Contrast: A brute-force approach versus a refined divide-and-conquer strategy.

2-Dimensional Maxima Problem

Problem: Identify the points in a 2D point set that are not dominated by any other point (a point dominates another if it is at least as large in both coordinates).
Algorithms:
- Brute-Force: Θ(n²) complexity.
- Plane Sweep: Θ(n log n) complexity.
Example: Finding the cars for which no other car is both faster and cheaper.
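
A minimal Python sketch of the plane-sweep idea (not taken from the course handouts; the function name and sample points are illustrative): sort by x, sweep from the largest x, and keep a point whenever its y beats the best y seen so far.

def maxima_plane_sweep(points):
    """Return the maximal (non-dominated) points of a 2D point set.

    A point is dominated if some other point is at least as large in
    both coordinates. Sweeping from the largest x to the smallest,
    a point is maximal exactly when its y beats every y seen so far.
    Sorting dominates the cost, so the total is Theta(n log n).
    """
    # Sort by x descending, breaking ties by y descending so that on
    # equal x the taller point is examined first.
    pts = sorted(points, key=lambda p: (p[0], p[1]), reverse=True)
    maxima = []
    best_y = float("-inf")
    for x, y in pts:
        if y > best_y:              # no point to the right is taller
            maxima.append((x, y))
            best_y = y
    return maxima

# Four sample points; the result is the non-dominated "frontier".
print(maxima_plane_sweep([(1, 4), (2, 3), (3, 1), (2, 2)]))
# -> [(3, 1), (2, 3), (1, 4)]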

Chapter 2: Asymptotic Notation

Big-O, Ω, and Θ Notations

- Big-O: Asymptotic upper bound; usually quoted for worst-case complexity. Example: Bubble Sort is O(n²).
- Ω: Asymptotic lower bound; often quoted for best-case behaviour.
- Θ: Tight bound; both an upper and a lower bound.
Example: Merging sorted arrays is Θ(n).

Common Complexities

- Θ(1): Constant time (e.g., array access).
- Θ(log n): Binary search.
- Θ(n): Linear search.
- Θ(n log n): Efficient sorting (e.g., Merge Sort).
- Θ(2ⁿ): Exponential growth, impractical for large n.
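
As a concrete illustration of the Θ(log n) entry above, here is a minimal binary search sketch in Python (illustrative, assuming the input list is already sorted).

def binary_search(arr, target):
    """Return an index of target in sorted list arr, or -1 if absent.

    Each iteration halves the remaining range, so the loop runs
    O(log n) times, matching the Theta(log n) entry above.
    """
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # -> 3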

Chapter 3: Divide and Conquer Strategy

Merge Sort

Steps:
1. Divide: Split array into halves.
2. Conquer: Recursively sort subarrays.
3. Combine: Merge sorted halves.
Example: Sorting [3, 1, 4, 2] -> divide into [3, 1] and [4, 2], sort each half to [1, 3] and [2, 4], then merge into [1, 2, 3, 4].
Complexity: Θ(n log n).
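
A minimal Python sketch of the three steps above (illustrative; not the course's official pseudocode).

def merge_sort(arr):
    """Sort arr by divide and conquer: Theta(n log n)."""
    if len(arr) <= 1:                 # base case: already sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])      # divide + conquer left half
    right = merge_sort(arr[mid:])     # divide + conquer right half
    # Combine: merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([3, 1, 4, 2]))  # -> [1, 2, 3, 4]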

Chapter 4: Sorting Algorithms

Slow Sorting Algorithms

Examples:
- Bubble Sort: Repeatedly swap adjacent elements if they are in the wrong order.
- Selection Sort: Select the smallest unsorted element and place it in order.
Complexity: Θ(n²).
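
For reference, a short selection sort sketch in Python (illustrative); the two nested loops make the Θ(n²) cost visible.

def selection_sort(arr):
    """In-place selection sort: Theta(n^2) comparisons."""
    n = len(arr)
    for i in range(n):
        # Find the smallest element in the unsorted suffix arr[i:].
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # Place it at position i.
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

print(selection_sort([5, 2, 4, 1, 3]))  # -> [1, 2, 3, 4, 5]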

Efficient Sorting

1. Merge Sort: Divide-and-conquer, Θ(n log n).
2. Heap Sort: Uses a heap data structure, Θ(n log n).
3. Quick Sort: Partition-based, average Θ(n log n), worst Θ(n²).
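
A short quicksort sketch in Python (illustrative; it uses the Lomuto partition with the last element as pivot, which is one common convention rather than the only one).

def quick_sort(arr, lo=0, hi=None):
    """In-place quicksort: average Theta(n log n), worst Theta(n^2)."""
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:
        return arr
    pivot = arr[hi]                    # last element as pivot
    i = lo                             # boundary of the "<= pivot" region
    for j in range(lo, hi):
        if arr[j] <= pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]  # put the pivot in its final slot
    quick_sort(arr, lo, i - 1)         # recurse on the two partitions
    quick_sort(arr, i + 1, hi)
    return arr

print(quick_sort([3, 6, 1, 5, 2, 4]))  # -> [1, 2, 3, 4, 5, 6]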

Chapter 5: Linear Time Sorting

Non-Comparison-Based Sorting

1. Counting Sort: Sorts integers by counting occurrences of each key. Complexity: Θ(n + k), where k is the range of key values (linear when k is O(n)).
2. Bucket Sort: Distributes elements into buckets, then sorts each. Best for uniformly
distributed data.
3. Radix Sort: Sorts digit by digit, from least significant to most significant. Complexity: Θ(d(n + k)) for d-digit keys. Example: Sorting phone numbers by digits.
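
A counting sort sketch in Python for small non-negative integer keys (illustrative; this simple version drops stability, which only matters when sorting records rather than bare integers).

def counting_sort(arr, k):
    """Sort non-negative integers in the range 0..k: Theta(n + k)."""
    counts = [0] * (k + 1)
    for x in arr:                 # count occurrences of each key
        counts[x] += 1
    out = []
    for value, c in enumerate(counts):
        out.extend([value] * c)   # emit each key as many times as seen
    return out

print(counting_sort([4, 1, 3, 4, 3], k=4))  # -> [1, 3, 3, 4, 4]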

Chapter 6: Dynamic Programming

Fibonacci Sequence

Demonstrates overlapping subproblems: the naive recursion recomputes the same subproblems many times, while memoization (or a bottom-up table) computes each value only once, avoiding the redundant work.
Example: F(n) = F(n-1) + F(n-2).
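
A memoized Fibonacci sketch in Python (illustrative, using the convention F(0) = 0, F(1) = 1): the cache ensures each value is computed once, turning the exponential recursion into linear time.

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """F(n) = F(n-1) + F(n-2); memoization makes this O(n)."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print([fib(i) for i in range(8)])  # -> [0, 1, 1, 2, 3, 5, 8, 13]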

Edit Distance Problem

Applications: Spell check, DNA sequence comparison.
Algorithm: Uses a matrix to compute the minimum number of edits (insertions, deletions, and substitutions) required to convert one string into another.
Complexity: Θ(mn), where m and n are string lengths.
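
A compact dynamic-programming sketch of edit distance in Python (illustrative; unit cost is assumed for each insertion, deletion, and substitution).

def edit_distance(a, b):
    """Minimum edits to turn string a into string b: Theta(mn)."""
    m, n = len(a), len(b)
    # dp[i][j] = edit distance between a[:i] and b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i                          # delete all of a[:i]
    for j in range(n + 1):
        dp[0][j] = j                          # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # delete
                           dp[i][j - 1] + 1,          # insert
                           dp[i - 1][j - 1] + cost)   # match/substitute
    return dp[m][n]

print(edit_distance("kitten", "sitting"))  # -> 3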

0/1 Knapsack Problem

Given items with weights and values, maximize the total value without exceeding the weight
limit.
Algorithm: A DP table in which the entry for (first i items, weight limit w) stores the maximum achievable value; overall cost Θ(nW) for n items and capacity W.
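
A 0/1 knapsack sketch in Python (illustrative; it uses a one-dimensional table updated right to left, which is equivalent to the usual two-dimensional DP table).

def knapsack(values, weights, W):
    """Max value of a subset of items with total weight <= W: Theta(nW)."""
    n = len(values)
    # dp[w] = best value achievable with capacity w using items seen so far
    dp = [0] * (W + 1)
    for i in range(n):
        # Iterate capacities downward so each item is used at most once.
        for w in range(W, weights[i] - 1, -1):
            dp[w] = max(dp[w], dp[w - weights[i]] + values[i])
    return dp[W]

print(knapsack(values=[60, 100, 120], weights=[10, 20, 30], W=50))  # -> 220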
