Interview Preparation Plan for Freshers

Day 0: Stick to a programming language like C or C++. Make sure that you
are comfortable with pointers/objects.
Day 1: Understand the concept of algorithmic complexity. Skip the deep theory
for now, but for every piece of code you write, you should be able to derive
both its time and space complexity.

Day 2 - 10: Let’s start with some simple data structures,

1. Arrays
2. Linked Lists
3. Strings
4. Stacks
5. Queues
Understand their basic operations (insert, delete, search, and traversal) and
their complexities (see the Big-O Algorithm Complexity Cheat Sheet), and code them all yourself; a minimal linked-list sketch follows below.
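
Since the plan recommends C or C++, here is a minimal singly linked list sketch in C++ (my own illustration, not code from the plan; names like Node, insertFront and removeValue are invented) covering insert, delete, search and traversal:

#include <iostream>

// A node of a singly linked list.
struct Node {
    int value;
    Node* next;
};

// Insert a new value at the head: O(1).
Node* insertFront(Node* head, int value) {
    return new Node{value, head};
}

// Search for a value: O(n).
bool contains(const Node* head, int value) {
    for (const Node* cur = head; cur != nullptr; cur = cur->next)
        if (cur->value == value) return true;
    return false;
}

// Delete the first node holding `value`: O(n).
Node* removeValue(Node* head, int value) {
    if (head == nullptr) return nullptr;
    if (head->value == value) { Node* rest = head->next; delete head; return rest; }
    Node* cur = head;
    while (cur->next != nullptr && cur->next->value != value) cur = cur->next;
    if (cur->next != nullptr) {
        Node* doomed = cur->next;
        cur->next = doomed->next;
        delete doomed;
    }
    return head;
}

// Traverse and print: O(n).
void print(const Node* head) {
    for (const Node* cur = head; cur != nullptr; cur = cur->next)
        std::cout << cur->value << ' ';
    std::cout << '\n';
}

int main() {
    Node* head = nullptr;
    for (int v : {3, 1, 4, 1, 5}) head = insertFront(head, v);
    print(head);                             // 5 1 4 1 3
    std::cout << contains(head, 4) << '\n';  // 1 (true)
    head = removeValue(head, 1);             // removes the first 1 found
    print(head);                             // 5 4 1 3
    // Remaining nodes are leaked here; a real implementation would free them.
    return 0;
}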

Day 11 - 25: Let’s now learn some simple algorithms,

1. Sorting - Insertion sort, Merge sort, Quick sort, Heap sort, Bucket
sort, Counting sort, Radix sort, External sorting
2. Search - Linear search, Binary Search (along with its variants).
3. Prime Numbers - Sieve of Eratosthenes (a short sketch follows this list), Primality testing
4. Strings - String searching, LCS, Palindrome detection
5. Miscellaneous - Euclidean algorithm, Matrix multiplication, Fibonacci
Numbers, Pascal's Triangle, Max Subarray problem
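
As referenced in item 3, here is a short Sieve of Eratosthenes sketch in C++ (illustrative only; the function name sieve and the limit of 50 are my own choices):

#include <iostream>
#include <vector>

// Sieve of Eratosthenes: mark every composite up to `limit`. O(n log log n).
std::vector<bool> sieve(int limit) {
    std::vector<bool> isPrime(limit + 1, true);
    isPrime[0] = isPrime[1] = false;
    for (int p = 2; (long long)p * p <= limit; ++p)
        if (isPrime[p])
            for (int multiple = p * p; multiple <= limit; multiple += p)
                isPrime[multiple] = false;   // p is prime, so its multiples are not
    return isPrime;
}

int main() {
    std::vector<bool> isPrime = sieve(50);
    for (int i = 2; i <= 50; ++i)
        if (isPrime[i]) std::cout << i << ' ';   // 2 3 5 7 11 ... 47
    std::cout << '\n';
}
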
Day 26 - 50: Once you are comfortable with everything above, start doing
problems from,

1. Cracking the Coding Interview
2. Elements of Programming Interviews
3. Programming Interviews Exposed: Secrets to Landing Your Next Job
4. GeeksforGeeks
5. HackerRank
6. InterviewBit
Stick to the chapters on arrays, linked lists, strings, stacks, queues and
complexity.

Day 51 - 60: Let’s learn some non-linear data structures,

1. Trees
   a. Binary Tree, Binary Search Tree - tree traversals, lowest
      common ancestor, depth, height and diameter, finding the k-th
      smallest element (a short traversal sketch follows this list)
   b. Heaps
2. Hash tables - 4-sum problem, checking whether a Sudoku solution is valid
3. Graphs - Breadth-first search, Depth-first search, Topological
   sorting, Minimum spanning tree, Shortest path problems
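
As referenced above, a small binary tree sketch in C++ (my own illustration; TreeNode, inorder and height are invented names) showing an in-order traversal and a height computation:

#include <algorithm>
#include <iostream>

// A node of a binary tree.
struct TreeNode {
    int value;
    TreeNode* left;
    TreeNode* right;
};

// In-order traversal: left subtree, node, right subtree.
// On a binary search tree this visits the values in sorted order.
void inorder(const TreeNode* node) {
    if (node == nullptr) return;
    inorder(node->left);
    std::cout << node->value << ' ';
    inorder(node->right);
}

// Height: number of nodes on the longest root-to-leaf path.
int height(const TreeNode* node) {
    if (node == nullptr) return 0;
    return 1 + std::max(height(node->left), height(node->right));
}

int main() {
    // A small binary search tree:    4
    //                               / \
    //                              2   6
    //                             / \
    //                            1   3
    TreeNode n1{1, nullptr, nullptr}, n3{3, nullptr, nullptr};
    TreeNode n2{2, &n1, &n3}, n6{6, nullptr, nullptr};
    TreeNode root{4, &n2, &n6};
    inorder(&root);                                       // 1 2 3 4 6
    std::cout << "\nheight = " << height(&root) << '\n';  // 3
}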

Day 61 - 90: Refer to the previous resources and start doing problems on
trees, hash tables, heaps and graphs.
Day 91 - 100: Understand computational complexity theory and NP-completeness:
the Knapsack problem, the Travelling Salesman problem, the SAT problem, and so on.

Day 101 - ∞: You are now better prepared than most CS undergraduates. Keep
revising the above topics and start competitive programming! Good luck!

https://www.appliedaicourse.com/blog/a-star-algorithm-in-ai/

https://medium.com/@sachin28/the-algorithms-behind-the-working-of-google-maps-73c379bcc9b9

List of Algorithms Covered:

1. Sorting Algorithms
   • Bubble Sort
   • Merge Sort
   • Quick Sort

2. Search Algorithms
   • Linear Search
   • Binary Search

3. Graph Algorithms
   • Breadth-First Search (BFS)
   • Depth-First Search (DFS)
   • Dijkstra’s Algorithm

4. Dynamic Programming Algorithms
   • Fibonacci Sequence
   • Longest Common Subsequence

5. Greedy Algorithms
   • Activity Selection Problem
   • Huffman Encoding

6. Divide and Conquer Algorithms
   • Binary Search
   • Merge Sort

7. Backtracking Algorithms
   • N-Queens Problem
   • Subset Sum Problem

8. Hashing Algorithms
   • Hash Tables
   • Cryptographic Hash Functions

9. String Matching Algorithms
   • Knuth-Morris-Pratt (KMP) Algorithm
   • Rabin-Karp Algorithm

1. Sorting Algorithms
Sorting is one of the most fundamental tasks in
programming. It involves arranging data in a particular
order, typically ascending or descending. Here are three
key sorting algorithms every developer should understand:

• Bubble Sort
Bubble Sort works by repeatedly stepping through the list, comparing adjacent elements, and swapping them if they are in the wrong order. This process continues until no swaps are needed, signaling that the list is sorted.

Real-life example: Imagine organizing books on a shelf by comparing two at a time and swapping them until they are all in the correct order.

Time complexity: O(n²)
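
A minimal Bubble Sort sketch in C++ (my own illustration, not code taken from this document), with an early exit once a pass makes no swaps:

#include <iostream>
#include <utility>
#include <vector>

// Bubble Sort: repeatedly swap adjacent out-of-order elements. O(n^2) worst case.
void bubbleSort(std::vector<int>& a) {
    for (std::size_t pass = 0; pass + 1 < a.size(); ++pass) {
        bool swapped = false;
        for (std::size_t i = 0; i + 1 < a.size() - pass; ++i) {
            if (a[i] > a[i + 1]) {
                std::swap(a[i], a[i + 1]);
                swapped = true;
            }
        }
        if (!swapped) break;   // no swaps in this pass: the list is already sorted
    }
}

int main() {
    std::vector<int> a = {5, 1, 4, 2, 8};
    bubbleSort(a);
    for (int x : a) std::cout << x << ' ';   // 1 2 4 5 8
    std::cout << '\n';
}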

• Merge Sort
Merge Sort is a divide-and-conquer algorithm. It divides the list into two halves, recursively sorts each half, and then merges them back together in sorted order. This is a more efficient approach than Bubble Sort.

Real-life example: Think of dividing a deck of cards into two piles, sorting each pile, and then merging them into one sorted pile.

Time complexity: O(n log n)
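
A Merge Sort sketch in C++ (illustrative; the helper names merge and mergeSort are my own), working on half-open ranges:

#include <iostream>
#include <vector>

// Merge two already-sorted halves a[lo..mid) and a[mid..hi) into sorted order.
void merge(std::vector<int>& a, std::size_t lo, std::size_t mid, std::size_t hi) {
    std::vector<int> merged;
    merged.reserve(hi - lo);
    std::size_t i = lo, j = mid;
    while (i < mid && j < hi)
        merged.push_back(a[i] <= a[j] ? a[i++] : a[j++]);
    while (i < mid) merged.push_back(a[i++]);
    while (j < hi)  merged.push_back(a[j++]);
    for (std::size_t k = 0; k < merged.size(); ++k) a[lo + k] = merged[k];
}

// Merge Sort on the half-open range a[lo..hi). O(n log n) time, O(n) extra space.
void mergeSort(std::vector<int>& a, std::size_t lo, std::size_t hi) {
    if (hi - lo < 2) return;               // 0 or 1 element: already sorted
    std::size_t mid = lo + (hi - lo) / 2;
    mergeSort(a, lo, mid);
    mergeSort(a, mid, hi);
    merge(a, lo, mid, hi);
}

int main() {
    std::vector<int> a = {38, 27, 43, 3, 9, 82, 10};
    mergeSort(a, 0, a.size());
    for (int x : a) std::cout << x << ' ';   // 3 9 10 27 38 43 82
    std::cout << '\n';
}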

• Quick Sort
Quick Sort is another divide-and-conquer algorithm that picks a “pivot” element and partitions the list around it. The elements smaller than the pivot go to one side, and the larger ones go to the other. This process is repeated recursively for each side.

Real-life example: Organizing a group of people by height by using one person as a reference (pivot), grouping shorter people to one side and taller people to the other.

Time complexity: O(n log n) on average (O(n²) in the worst case)
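
A Quick Sort sketch in C++ (illustrative; it uses the Lomuto partition scheme with the last element as pivot, which is one common choice, not necessarily the document's):

#include <iostream>
#include <utility>
#include <vector>

// Lomuto partition: place the last element (the pivot) into its final position,
// with smaller elements on its left and larger ones on its right.
int partition(std::vector<int>& a, int lo, int hi) {
    int pivot = a[hi];
    int i = lo;
    for (int j = lo; j < hi; ++j)
        if (a[j] < pivot) std::swap(a[i++], a[j]);
    std::swap(a[i], a[hi]);
    return i;
}

// Quick Sort on a[lo..hi] (inclusive). O(n log n) on average.
void quickSort(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return;
    int p = partition(a, lo, hi);
    quickSort(a, lo, p - 1);   // sort the smaller side
    quickSort(a, p + 1, hi);   // sort the larger side
}

int main() {
    std::vector<int> a = {10, 7, 8, 9, 1, 5};
    quickSort(a, 0, static_cast<int>(a.size()) - 1);
    for (int x : a) std::cout << x << ' ';   // 1 5 7 8 9 10
    std::cout << '\n';
}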

2. Search Algorithms
Searching is another essential operation when dealing with
data. Whether you’re searching for a specific value in an
array or traversing a graph, these algorithms will help you
find what you’re looking for efficiently.

• Linear Search
Linear Search involves going through each element in a list one by one until the desired element is found.

Real-life example: Searching for a specific book on a shelf by scanning each book until you find the one you’re looking for.

Time complexity: O(n)
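
A Linear Search sketch in C++ (illustrative only):

#include <iostream>
#include <vector>

// Linear Search: return the index of `target`, or -1 if it is absent. O(n).
int linearSearch(const std::vector<int>& a, int target) {
    for (std::size_t i = 0; i < a.size(); ++i)
        if (a[i] == target) return static_cast<int>(i);
    return -1;
}

int main() {
    std::vector<int> a = {7, 3, 9, 2, 5};
    std::cout << linearSearch(a, 9) << '\n';   // 2
    std::cout << linearSearch(a, 4) << '\n';   // -1
}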

• Binary Search
Binary Search is an efficient searching algorithm that works on sorted lists. It repeatedly divides the search interval in half, comparing the target value to the middle element. If the target is smaller, the search continues in the left half; if it’s larger, the search continues in the right half.

Real-life example: Think of searching for a word in a dictionary. You open the book in the middle, check if the word is on the left or right side, and repeat the process until you find it.

Time complexity: O(log n)
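
A Binary Search sketch in C++ on a sorted vector (my own illustration):

#include <iostream>
#include <vector>

// Binary Search on a sorted vector: return the index of `target`, or -1. O(log n).
int binarySearch(const std::vector<int>& a, int target) {
    int lo = 0, hi = static_cast<int>(a.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;       // avoids overflow of (lo + hi)
        if (a[mid] == target) return mid;
        if (a[mid] < target) lo = mid + 1;  // continue in the right half
        else                 hi = mid - 1;  // continue in the left half
    }
    return -1;
}

int main() {
    std::vector<int> a = {1, 3, 5, 7, 9, 11};   // must be sorted
    std::cout << binarySearch(a, 7) << '\n';    // 3
    std::cout << binarySearch(a, 4) << '\n';    // -1
}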

3. Graph Algorithms
Graphs are widely used in many applications, including
social networks, maps, and recommendation systems.
Understanding how to traverse and find the shortest path
in graphs is crucial for solving real-world problems.

• Breadth-First Search (BFS)
BFS explores all the nodes at the present depth level before moving on to nodes at the next depth level. It’s particularly useful for finding the shortest path in an unweighted graph.

Real-life example: Imagine you’re in a building and want to find the shortest route to an exit. You start at your current location and explore all nearby rooms (nodes) before moving on to further rooms.

Time complexity: O(N + E), where N is the number of vertices and E is the number of edges
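
A BFS sketch in C++ (illustrative; the adjacency-list representation and the bfs/dist names are my own) that records shortest distances in an unweighted graph:

#include <iostream>
#include <queue>
#include <vector>

// BFS from `start` on an adjacency-list graph.
// Returns the number of edges on the shortest path to every node (-1 if unreachable).
std::vector<int> bfs(const std::vector<std::vector<int>>& adj, int start) {
    std::vector<int> dist(adj.size(), -1);
    std::queue<int> q;
    dist[start] = 0;
    q.push(start);
    while (!q.empty()) {
        int u = q.front(); q.pop();
        for (int v : adj[u])
            if (dist[v] == -1) {           // not visited yet
                dist[v] = dist[u] + 1;
                q.push(v);
            }
    }
    return dist;
}

int main() {
    // Undirected graph: edges 0-1, 0-2, 1-2, 1-3.
    std::vector<std::vector<int>> adj = {{1, 2}, {0, 2, 3}, {0, 1}, {1}};
    std::vector<int> dist = bfs(adj, 0);
    for (std::size_t v = 0; v < dist.size(); ++v)
        std::cout << "node " << v << ": distance " << dist[v] << '\n';   // 0, 1, 1, 2
}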

• Depth-First Search (DFS)
DFS explores as far as possible along each branch before backtracking. This can be helpful in situations like finding all possible paths or exploring every node.

Real-life example: Think of exploring a maze, where you go as deep as you can along one path before retracing your steps and trying another route.

Time complexity: O(N + E)
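
A recursive DFS sketch in C++ (my own illustration):

#include <iostream>
#include <vector>

// Recursive DFS: mark and print each node the first time it is reached.
void dfs(const std::vector<std::vector<int>>& adj, int u, std::vector<bool>& visited) {
    visited[u] = true;
    std::cout << u << ' ';
    for (int v : adj[u])
        if (!visited[v])
            dfs(adj, v, visited);   // go as deep as possible before backtracking
}

int main() {
    // Undirected graph: edges 0-1, 0-2, 1-3, 2-3.
    std::vector<std::vector<int>> adj = {{1, 2}, {0, 3}, {0, 3}, {1, 2}};
    std::vector<bool> visited(adj.size(), false);
    dfs(adj, 0, visited);   // visiting order here: 0 1 3 2
    std::cout << '\n';
}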

• Dijkstra’s Algorithm
Dijkstra’s algorithm finds the shortest path between nodes in a graph, particularly in weighted graphs where each edge has a cost. The algorithm iteratively selects the vertex with the smallest known distance and updates the distances of its neighbors.

Real-life example: If you’re navigating through a city using a GPS app, Dijkstra’s algorithm helps find the quickest route from your location to your destination, taking into account road distances or travel time.

Time complexity: O(N²) with a simple implementation, but this can be optimized to O((N + E) log N) with a priority queue.
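
A sketch of Dijkstra’s algorithm in C++ using a min-priority queue (illustrative; the graph and names are invented), matching the O((N + E) log N) variant mentioned above:

#include <iostream>
#include <limits>
#include <queue>
#include <utility>
#include <vector>

// Dijkstra's algorithm with a min-priority queue.
// adj[u] holds (neighbour, edge weight) pairs; weights must be non-negative.
std::vector<long long> dijkstra(const std::vector<std::vector<std::pair<int, int>>>& adj, int src) {
    const long long INF = std::numeric_limits<long long>::max();
    std::vector<long long> dist(adj.size(), INF);
    using State = std::pair<long long, int>;   // (distance so far, node)
    std::priority_queue<State, std::vector<State>, std::greater<State>> pq;
    dist[src] = 0;
    pq.push({0, src});
    while (!pq.empty()) {
        auto [d, u] = pq.top(); pq.pop();
        if (d > dist[u]) continue;             // stale queue entry, skip it
        for (auto [v, w] : adj[u])
            if (dist[u] + w < dist[v]) {       // found a shorter route to v
                dist[v] = dist[u] + w;
                pq.push({dist[v], v});
            }
    }
    return dist;
}

int main() {
    // Edges: 0-1 (weight 4), 1-2 (weight 1), 0-2 (weight 7).
    std::vector<std::vector<std::pair<int, int>>> adj(3);
    adj[0] = {{1, 4}, {2, 7}};
    adj[1] = {{0, 4}, {2, 1}};
    adj[2] = {{0, 7}, {1, 1}};
    std::vector<long long> dist = dijkstra(adj, 0);
    for (std::size_t v = 0; v < dist.size(); ++v)
        std::cout << "shortest distance to " << v << " = " << dist[v] << '\n';   // 0, 4, 5
}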

4. Dynamic Programming Algorithms
Dynamic programming is a method for solving complex problems by breaking them down into simpler subproblems and solving each subproblem just once, storing its result.

Fibonacci Sequence
Dynamic programming can efficiently calculate the
Fibonacci sequence using a bottom-up approach.
Example: Calculating the 6th Fibonacci number:

1. Start with F(0) = 0 and F(1) = 1.

2. Calculate F(2) = F(1) + F(0) = 1.

3. Continue iteratively up to F(6) = 8.

This avoids the redundant calculations of a naive recursive approach.
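
A bottom-up Fibonacci sketch in C++ (illustrative only):

#include <iostream>
#include <vector>

// Bottom-up Fibonacci: each value is computed once and stored. O(n) time.
long long fib(int n) {
    if (n < 2) return n;
    std::vector<long long> f(n + 1);
    f[0] = 0;
    f[1] = 1;
    for (int i = 2; i <= n; ++i)
        f[i] = f[i - 1] + f[i - 2];   // reuse the two stored subproblems
    return f[n];
}

int main() {
    std::cout << fib(6) << '\n';   // 8
}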

Longest Common Subsequence (LCS)
LCS finds the longest sequence of characters that appear left-to-right in both strings, but not necessarily consecutively.

Example: For strings "ABCD" and "ACBAD", the LCS is "ABD".

Dynamic programming builds a matrix to track matches and efficiently compute the result.
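
A sketch of the LCS length computation in C++ (my own illustration of the usual dynamic-programming table; it returns the length, not the subsequence itself):

#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

// Length of the Longest Common Subsequence of a and b.
// dp[i][j] = LCS length of the first i characters of a and the first j of b.
int lcsLength(const std::string& a, const std::string& b) {
    std::vector<std::vector<int>> dp(a.size() + 1, std::vector<int>(b.size() + 1, 0));
    for (std::size_t i = 1; i <= a.size(); ++i)
        for (std::size_t j = 1; j <= b.size(); ++j)
            if (a[i - 1] == b[j - 1])
                dp[i][j] = dp[i - 1][j - 1] + 1;                  // characters match
            else
                dp[i][j] = std::max(dp[i - 1][j], dp[i][j - 1]);  // drop one character
    return dp[a.size()][b.size()];
}

int main() {
    std::cout << lcsLength("ABCD", "ACBAD") << '\n';   // 3 ("ABD" is one such subsequence)
}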

5. Greedy Algorithms
Greedy algorithms build up a solution piece by piece,
always choosing the next piece that offers the most
immediate benefit.

Activity Selection Problem
Given start and end times for activities, select the maximum number of non-overlapping activities.

Example: Activities with intervals (1, 3), (2, 5), (4, 6):

1. Choose (1, 3).

2. Skip (2, 5) as it overlaps.

3. Choose (4, 6).
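
A sketch of the greedy activity-selection strategy in C++ (my own illustration; it sorts by finish time, which is the standard greedy choice):

#include <algorithm>
#include <iostream>
#include <limits>
#include <utility>
#include <vector>

// Greedy activity selection: sort by finish time, then repeatedly take the
// earliest-finishing activity that starts after the last chosen one ends.
int maxActivities(std::vector<std::pair<int, int>> activities) {   // (start, end)
    std::sort(activities.begin(), activities.end(),
              [](const auto& a, const auto& b) { return a.second < b.second; });
    int count = 0;
    int lastEnd = std::numeric_limits<int>::min();
    for (const auto& [start, end] : activities)
        if (start >= lastEnd) {      // does not overlap the previously chosen activity
            ++count;
            lastEnd = end;
        }
    return count;
}

int main() {
    std::cout << maxActivities({{1, 3}, {2, 5}, {4, 6}}) << '\n';   // 2: choose (1,3) and (4,6)
}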

Huffman Encoding
Huffman Encoding is a compression algorithm that uses a greedy approach to assign variable-length codes to characters based on their frequencies.

Example: For the string "AAAABBC", A gets a shorter code than B or C because it appears more frequently.
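
A compact Huffman Encoding sketch in C++ (my own illustration; node memory is deliberately not freed to keep it short):

#include <iostream>
#include <map>
#include <queue>
#include <string>
#include <vector>

// A node of the Huffman tree: a leaf holds a character, an internal node holds
// the combined frequency of its two children.
struct HuffNode {
    char ch;
    long long freq;
    HuffNode* left;
    HuffNode* right;
};

struct ByFrequency {
    bool operator()(const HuffNode* a, const HuffNode* b) const { return a->freq > b->freq; }
};

// Walk the tree: left edges append '0', right edges append '1'.
void collectCodes(const HuffNode* node, const std::string& prefix,
                  std::map<char, std::string>& codes) {
    if (node == nullptr) return;
    if (node->left == nullptr && node->right == nullptr) {
        codes[node->ch] = prefix.empty() ? "0" : prefix;   // single-symbol edge case
        return;
    }
    collectCodes(node->left, prefix + "0", codes);
    collectCodes(node->right, prefix + "1", codes);
}

int main() {
    std::string text = "AAAABBC";
    std::map<char, long long> freq;
    for (char c : text) ++freq[c];

    // Greedy step: repeatedly merge the two least frequent trees.
    std::priority_queue<HuffNode*, std::vector<HuffNode*>, ByFrequency> pq;
    for (const auto& [c, f] : freq) pq.push(new HuffNode{c, f, nullptr, nullptr});
    while (pq.size() > 1) {
        HuffNode* a = pq.top(); pq.pop();
        HuffNode* b = pq.top(); pq.pop();
        pq.push(new HuffNode{'\0', a->freq + b->freq, a, b});
    }

    std::map<char, std::string> codes;
    collectCodes(pq.top(), "", codes);
    for (const auto& [c, code] : codes)
        std::cout << c << " -> " << code << '\n';   // A gets 1 bit, B and C get 2 bits
    // Tree nodes are intentionally leaked to keep the sketch short.
}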

6. Divide and Conquer Algorithms
Divide and conquer algorithms break a problem into smaller parts, solve them independently, and combine the results.

Binary Search
As covered earlier, Binary Search is a classic divide-and-conquer algorithm.

Merge Sort
Merge Sort is also a divide-and-conquer algorithm, as
explained above.

7. Backtracking Algorithms
Backtracking is used to solve problems recursively by
trying partial solutions and abandoning them if they don’t
work.

N-Queens Problem
Place N queens on an N x N chessboard so no two queens threaten each other.

Example: For N = 4, place queens such that no two share the same row, column, or diagonal.
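
A backtracking sketch for N-Queens in C++ (my own illustration; it counts solutions rather than printing boards):

#include <cstdlib>
#include <iostream>
#include <vector>

// Is it safe to put a queen in `row` of column `col`, given the rows already
// chosen for columns 0..col-1?
bool safe(const std::vector<int>& rows, int col, int row) {
    for (int c = 0; c < col; ++c) {
        if (rows[c] == row) return false;                       // same row
        if (std::abs(rows[c] - row) == col - c) return false;   // same diagonal
    }
    return true;
}

// Backtracking: try every row for the current column, recurse, then try the next.
int countSolutions(std::vector<int>& rows, int col) {
    int n = static_cast<int>(rows.size());
    if (col == n) return 1;                    // all queens placed successfully
    int count = 0;
    for (int row = 0; row < n; ++row)
        if (safe(rows, col, row)) {
            rows[col] = row;                   // place a queen
            count += countSolutions(rows, col + 1);
            // the placement is overwritten on the next iteration (implicit undo)
        }
    return count;
}

int main() {
    std::vector<int> rows(4, 0);
    std::cout << countSolutions(rows, 0) << '\n';   // 2 solutions for N = 4
}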

Subset Sum Problem
Determine if there’s a subset of numbers that adds up to a target sum.

Example: For the set {3, 34, 4, 12, 5, 2} and target 9, the
subset {4, 5} works.
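
A backtracking sketch for Subset Sum in C++ (illustrative; this is the simple exponential include/exclude version, not the dynamic-programming table):

#include <iostream>
#include <vector>

// Backtracking subset sum: either include nums[i] in the running subset or skip it.
bool subsetSum(const std::vector<int>& nums, std::size_t i, int remaining) {
    if (remaining == 0) return true;               // found a subset with the target sum
    if (i == nums.size() || remaining < 0) return false;
    return subsetSum(nums, i + 1, remaining - nums[i])   // include nums[i]
        || subsetSum(nums, i + 1, remaining);            // exclude nums[i]
}

int main() {
    std::vector<int> nums = {3, 34, 4, 12, 5, 2};
    std::cout << std::boolalpha << subsetSum(nums, 0, 9) << '\n';   // true ({4, 5})
}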

8. Hashing Algorithms
Hashing algorithms efficiently map data to fixed-size
values for quick lookups.

Hash Tables
Hash tables store key-value pairs and use a hash function to index data.

Example: A hash table can map student IDs to names: 123 -> John Doe.
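
A short C++ example using std::unordered_map, the standard library’s hash table (the student data is invented):

#include <iostream>
#include <string>
#include <unordered_map>

int main() {
    // std::unordered_map is a hash table: average O(1) insert and lookup.
    std::unordered_map<int, std::string> students;
    students[123] = "John Doe";
    students[456] = "Jane Roe";

    auto it = students.find(123);
    if (it != students.end())
        std::cout << it->first << " -> " << it->second << '\n';   // 123 -> John Doe
}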

Cryptographic Hash Functions
These functions produce fixed-size digests that are practically impossible to reverse or to find collisions for, and they are used in security applications.

Example: Storing passwords securely by hashing them before saving.

9. String Matching Algorithms
String matching algorithms find occurrences of a substring within a string.

Knuth-Morris-Pratt (KMP) Algorithm
KMP avoids redundant comparisons by precomputing a prefix table.

Example: Searching for "ABABAC" in "ABABABCABABAC" finds the match efficiently.
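
A KMP sketch in C++ (my own illustration; buildPrefixTable and kmpSearch are invented names) using the example strings above:

#include <iostream>
#include <string>
#include <vector>

// Prefix table: lps[i] = length of the longest proper prefix of pattern[0..i]
// that is also a suffix of it.
std::vector<int> buildPrefixTable(const std::string& pattern) {
    std::vector<int> lps(pattern.size(), 0);
    int len = 0;
    for (std::size_t i = 1; i < pattern.size(); ) {
        if (pattern[i] == pattern[len]) {
            lps[i++] = ++len;
        } else if (len != 0) {
            len = lps[len - 1];      // fall back without moving i
        } else {
            lps[i++] = 0;
        }
    }
    return lps;
}

// KMP search: return the index of the first occurrence of pattern in text, or -1.
int kmpSearch(const std::string& text, const std::string& pattern) {
    if (pattern.empty()) return 0;
    std::vector<int> lps = buildPrefixTable(pattern);
    std::size_t i = 0, j = 0;        // i scans the text, j scans the pattern
    while (i < text.size()) {
        if (text[i] == pattern[j]) {
            ++i; ++j;
            if (j == pattern.size()) return static_cast<int>(i - j);
        } else if (j != 0) {
            j = lps[j - 1];          // reuse the prefix table instead of restarting
        } else {
            ++i;
        }
    }
    return -1;
}

int main() {
    std::cout << kmpSearch("ABABABCABABAC", "ABABAC") << '\n';   // 7
}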

Rabin-Karp Algorithm
Rabin-Karp uses hashing to find a substring: hash the pattern, then compare it against a rolling hash of each window of the text, checking characters directly only when the hashes match.
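
A Rabin-Karp sketch in C++ (my own illustration; the base and modulus are arbitrary choices) that verifies hash matches character by character:

#include <iostream>
#include <string>

// Rabin-Karp: compare a rolling hash of each text window with the pattern hash.
// Returns the index of the first match, or -1. Because hashing is modular,
// equal hashes are re-checked character by character to rule out collisions.
int rabinKarp(const std::string& text, const std::string& pattern) {
    const long long BASE = 256, MOD = 1'000'000'007LL;
    std::size_t n = text.size(), m = pattern.size();
    if (m == 0) return 0;
    if (m > n) return -1;

    long long patHash = 0, winHash = 0, highPow = 1;
    for (std::size_t i = 0; i + 1 < m; ++i) highPow = highPow * BASE % MOD;   // BASE^(m-1)
    for (std::size_t i = 0; i < m; ++i) {
        patHash = (patHash * BASE + pattern[i]) % MOD;
        winHash = (winHash * BASE + text[i]) % MOD;
    }

    for (std::size_t i = 0; i + m <= n; ++i) {
        if (patHash == winHash && text.compare(i, m, pattern) == 0)
            return static_cast<int>(i);
        if (i + m < n) {
            // Slide the window: drop text[i], append text[i + m].
            winHash = (winHash - text[i] * highPow % MOD + MOD) % MOD;
            winHash = (winHash * BASE + text[i + m]) % MOD;
        }
    }
    return -1;
}

int main() {
    std::cout << rabinKarp("ABABABCABABAC", "ABABAC") << '\n';   // 7
}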
