Studying For A Tech Interview Sucks
This list is meant to be both a quick guide and a reference for further research into these topics. It's basically a summary of that comp sci course you
never took or forgot about, so there's no way it can cover everything in depth. It will also be available as a gist on GitHub for everyone to edit and add
to.
Data Structure Basics
Array
Definition:
Arrays are one of the oldest, most commonly used data structures.
Optimal for indexing; bad at searching, inserting, and deleting (except at the end).
Are static in size, meaning that they are declared with a fixed size.
Dynamic arrays are like one-dimensional arrays, but have reserved space for additional elements.
Two dimensional arrays have x and y indices like a grid or nested arrays.
Big O efficiency:
Indexing: O(1)
Search: O(n) (O(log n) with binary search if the array is sorted)
Insertion: n/a for static arrays; O(n) for dynamic arrays
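To make the dynamic-array idea concrete, here is a minimal Python sketch (the class name DynamicArray and the doubling growth factor are illustrative choices, not from the original notes): it keeps reserved space and copies into a larger backing store only when that space runs out.

    class DynamicArray:
        """Fixed-size backing store that doubles when it runs out of room."""
        def __init__(self):
            self._capacity = 2
            self._size = 0
            self._items = [None] * self._capacity  # reserved space

        def get(self, index):              # indexing: O(1)
            if index >= self._size:
                raise IndexError(index)
            return self._items[index]

        def append(self, value):           # amortized O(1), O(n) when a resize happens
            if self._size == self._capacity:
                self._capacity *= 2
                bigger = [None] * self._capacity
                bigger[:self._size] = self._items[:self._size]
                self._items = bigger
            self._items[self._size] = value
            self._size += 1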
Linked List
Definition:
A linked list stores data with nodes; at its most basic, a node has one datum and one reference (to another node).
A linked list chains nodes together by pointing one node's reference towards another node.
A doubly linked list has nodes that also reference the previous node.
A circularly linked list is a simple linked list whose tail, the last node, references the head, the first node.
A stack is commonly implemented with a linked list but can be made from an array too.
Stacks are last in, first out (LIFO) data structures.
Made with a linked list by having the head be the only place for insertion and removal.
A queue can likewise be implemented with a linked list or an array.
Queues are first in, first out (FIFO) data structures.
Made with a doubly linked list that only removes from the head and adds to the tail.
Big O efficiency:
Indexing: O(n)
Search: O(n)
Insertion: O(1) (once you are at the node where the insertion happens)
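A minimal Python sketch of the ideas above (the names Node and LinkedStack are illustrative): each node holds one datum and one reference, and a stack falls out of making the head the only place for insertion and removal.

    class Node:
        def __init__(self, datum, next_node=None):
            self.datum = datum        # one datum
            self.next = next_node     # one reference to another node

    class LinkedStack:
        """Stack built on a singly linked list: the head is the only point of insertion/removal."""
        def __init__(self):
            self.head = None

        def push(self, datum):        # O(1)
            self.head = Node(datum, self.head)

        def pop(self):                # O(1)
            if self.head is None:
                raise IndexError("pop from empty stack")
            datum, self.head = self.head.datum, self.head.next
            return datum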
Hash Table
Definition:
Hash functions accept a key and return an output that is, ideally, unique to that specific key.
This is known as hashing: the idea that an input and an output have a one-to-one correspondence used to map information.
Hash collisions are when a hash function returns the same output for two distinct inputs.
This is often accommodated for by making the hash table very large, so that distinct keys rarely land in the same slot.
Big O efficiency:
Indexing: O(1)
Search: O(1)
Insertion: O(1)
(These are average cases; heavy collisions can degrade all three toward O(n).)
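As a rough illustration, here is a toy Python hash table (the name HashTable, the bucket count, and the use of per-bucket lists, i.e. chaining, are assumptions for the sketch rather than part of the original notes): keys are hashed into buckets, and a large table keeps collisions rare.

    class HashTable:
        """Toy hash table; collisions are handled here with per-bucket lists (chaining)."""
        def __init__(self, num_buckets=64):        # a large table keeps collisions rare
            self.buckets = [[] for _ in range(num_buckets)]

        def _bucket(self, key):
            return self.buckets[hash(key) % len(self.buckets)]

        def set(self, key, value):                  # O(1) on average
            bucket = self._bucket(key)
            for i, (k, _) in enumerate(bucket):
                if k == key:
                    bucket[i] = (key, value)        # overwrite an existing key
                    return
            bucket.append((key, value))

        def get(self, key):                         # O(1) on average
            for k, v in self._bucket(key):
                if k == key:
                    return v
            raise KeyError(key)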
Binary Tree
Definition:
A tree-like data structure where every node has at most two children.
A degenerate tree is an unbalanced tree, which if entirely one-sided is essentially a linked list.
A binary search tree is a binary tree that uses comparable keys to assign which side a child is on: smaller keys go to the left, larger keys to the right.
Because of the above it is more likely to be used as a data structure than a plain binary tree.
Big O efficiency (binary search tree):
Indexing: O(log n)
Search: O(log n)
Insertion: O(log n)
(Assumes a reasonably balanced tree; a degenerate tree degrades these to O(n).)
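A minimal Python sketch of a binary search tree (BSTNode, bst_insert, and bst_search are illustrative names): comparable keys decide whether a child goes left or right.

    class BSTNode:
        def __init__(self, key):
            self.key = key
            self.left = None     # keys smaller than self.key
            self.right = None    # keys larger than self.key

    def bst_insert(root, key):
        """Walk down, going left for smaller keys and right for larger ones."""
        if root is None:
            return BSTNode(key)
        if key < root.key:
            root.left = bst_insert(root.left, key)
        elif key > root.key:
            root.right = bst_insert(root.right, key)
        return root              # duplicate keys are ignored in this sketch

    def bst_search(root, key):   # O(log n) on a balanced tree
        while root is not None and root.key != key:
            root = root.left if key < root.key else root.right
        return root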
Search Basics
Breadth First Search
Definition:
An algorithm that searches a tree (or graph) by searching levels of the tree first, starting at the root.
It finds every node on the same level, most often moving left to right.
While doing this it tracks the children nodes of the nodes on the current level.
When finished examining a level it moves to the left most node on the next level.
The bottom rightmost node is evaluated last (the node that is deepest and is farthest right on its level).
Uses a queue to store information about the tree while it traverses it.
Because it uses a queue it is more memory intensive than depth first search.
Big O efficiency: O(|E| + |V|)
E is number of edges
V is number of vertices
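A sketch of breadth first search in Python, assuming nodes shaped like the BSTNode sketch earlier (key, left, right); the FIFO queue is what makes the traversal go level by level, left to right.

    from collections import deque

    def breadth_first_search(root, target):
        """Visit nodes level by level, left to right, using a FIFO queue."""
        if root is None:
            return None
        queue = deque([root])
        while queue:
            node = queue.popleft()           # next node on the current level
            if node.key == target:
                return node
            if node.left:
                queue.append(node.left)      # remember children for the next level
            if node.right:
                queue.append(node.right)
        return None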
Depth First Search
Definition:
An algorithm that searches a tree (or graph) by searching the depth of the tree first, starting at the root.
Once it reaches the end of a branch it traverses back up trying the right child of nodes on that branch, and if possible left from the
right children.
When finished examining a branch it moves to the node right of the root, then tries to go left on all its children until it reaches the bottom.
The rightmost node is evaluated last (the node that is right of all its ancestors).
Uses a stack to store information about the tree while it traverses it. Because a stack is LIFO it does not need to keep track of the nodes' pointers and is therefore less memory intensive than breadth first search.
Big O efficiency: O(|E| + |V|)
E is number of edges
V is number of vertices
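For comparison, a depth first search sketch in Python using an explicit LIFO stack (again assuming key/left/right nodes as in the earlier sketch); pushing the right child before the left makes the search dive down the left side first.

    def depth_first_search(root, target):
        """Go as deep as possible before backtracking, using a LIFO stack."""
        stack = [root] if root is not None else []
        while stack:
            node = stack.pop()               # most recently discovered node
            if node.key == target:
                return node
            # push right first so the left child is explored first
            if node.right:
                stack.append(node.right)
            if node.left:
                stack.append(node.left)
        return None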
Breadth First Search vs. Depth First Search
The simple answer to the question of which to use is that it depends on the size and shape of the tree.
Nuances:
Because BFS uses queues to store information about the nodes and its children, it could use more memory than is available on your
computer. (But you probably won't have to worry about this.)
If using a DFS on a tree that is very deep you might go unnecessarily deep in the search. See xkcd for more information.
Sorting Basics
Merge Sort
Definition:
Compares each number one at a time, moving the smallest number to the left of the pair.
Once all pairs are sorted it then compares the leftmost elements of the two leftmost pairs, creating a sorted group of four with the smallest numbers on the left and the largest ones on the right.
Know that it divides all the data into the smallest possible sets and then compares them.
Big O efficiency:
Best case: O(n log n)
Average case: O(n log n)
Worst case: O(n log n)
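A short Python sketch of merge sort (merge_sort is an illustrative name): split until groups of one remain, then merge sorted groups back together by repeatedly taking the smaller front element.

    def merge_sort(items):
        """Split until groups of one remain, then merge sorted groups back together."""
        if len(items) <= 1:
            return items
        mid = len(items) // 2
        left = merge_sort(items[:mid])
        right = merge_sort(items[mid:])
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):   # take the smaller front element each time
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]      # append whatever is left over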
Quicksort
Definition:
Divides the entire dataset in half by selecting a pivot element (ideally one close to the average/median value) and putting all smaller elements to its left.
It repeats this process on the left side until it is comparing only two elements at which point the left side is sorted.
When the left side is finished sorting it performs the same operation on the right side.
While it has the same Big O as (or a worse one, in some cases, than) many other sorting algorithms, it is often faster in practice than many of them, such as merge sort.
Know that it continually partitions the data set around a pivot until all the information is sorted.
Big O efficiency:
Best case: O(n log n)
Average case: O(n log n)
Worst case: O(n^2)
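A readable (not in-place) Python sketch of quicksort; the choice of the middle element as the pivot is an assumption for the sketch, and real implementations usually partition in place.

    def quicksort(items):
        """Partition around a pivot, then sort each side; returns a new list."""
        if len(items) <= 1:
            return items
        pivot = items[len(items) // 2]            # middle element as pivot in this sketch
        smaller = [x for x in items if x < pivot]
        equal = [x for x in items if x == pivot]
        larger = [x for x in items if x > pivot]
        return quicksort(smaller) + equal + quicksort(larger)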
Bubble Sort
Definition:
It iterates left to right, comparing every adjacent pair and moving the smaller element to the left.
It repeats this process until it no longer moves an element to the left.
While it is very simple to implement, it is the least efficient of these three sorting methods.
Know that it moves one space to the right, comparing two elements at a time and moving the smaller one to the left.
Big O efficiency:
Best case: O(n)
Average case: O(n^2)
Worst case: O(n^2)
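A Python sketch of bubble sort (bubble_sort is an illustrative name): sweep left to right swapping adjacent out-of-order pairs, and stop once a full pass makes no swaps.

    def bubble_sort(items):
        """Repeatedly sweep left to right, swapping adjacent out-of-order pairs."""
        swapped = True
        while swapped:                 # stop once a full pass makes no swaps
            swapped = False
            for i in range(len(items) - 1):
                if items[i] > items[i + 1]:
                    items[i], items[i + 1] = items[i + 1], items[i]
                    swapped = True
        return items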
Merge Sort vs. Quicksort
Merge sort divides the set into the smallest possible groups immediately, then reconstructs them incrementally as it sorts the groupings.
Quicksort continually divides the set around a pivot until the set is recursively sorted.
Recursive Algorithms
Definition:
An algorithm that calls itself in its definition.
"Stack level too deep" and "stack overflow" errors: if you've seen either of these from a recursive algorithm, you messed up.
It means that your base case was never triggered because it was faulty, or the problem was so massive you ran out of RAM before reaching it.
Knowing whether or not you will reach a base case is integral to correctly using recursion.
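As a small illustration of a base case doing its job (factorial is just a convenient example, not from the original notes): every recursive call moves n closer to the base case, so the recursion is guaranteed to stop.

    def factorial(n):
        """Illustrative recursion: the base case guarantees the calls eventually stop."""
        if n <= 1:                        # base case: reached for every non-negative n
            return 1
        return n * factorial(n - 1)       # recursive case: moves n closer to the base case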
Iterative Algorithms
Definition:
An algorithm that is called repeatedly but for a finite number of times, each time being a single iteration.
Generally you will see iteration as loops: for, while, and until statements.
Recursion vs. Iteration
The differences between recursion and iteration can be confusing to distinguish since both can be used to implement the other. But know that recursion is usually more expressive and easier to implement, while iteration uses less memory.
Pseudo Code of Moving Through an Array (this is why iteration is used for this)
Recursion:
    recursive method (array, n)
        if array[n] is not nil
            print array[n]
            recursive method(array, n+1)
        else
            exit loop

Iteration:
    iterative method (array)
        for n from 0 to size of array
            print(array[n])
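The same traversal written as runnable Python, mirroring the pseudocode above (print_array_recursive and print_array_iterative are illustrative names):

    def print_array_recursive(array, n=0):
        """Recursive traversal: each call handles one element then recurses on the next."""
        if n < len(array):
            print(array[n])
            print_array_recursive(array, n + 1)

    def print_array_iterative(array):
        """Iterative traversal: a single loop walks the indices."""
        for n in range(len(array)):
            print(array[n])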
Greedy Algorithm
Definition:
An algorithm that, while executing, selects only the information that meets a certain criterion.
A greedy algorithm generally has these components:
A selection function, which chooses the best candidate to be added to the solution.
A feasibility function, that is used to determine if a candidate can be used to contribute to a solution.
A solution function, which will indicate when we have discovered a complete solution.
Generally used on sets of data where only a small proportion of the information evaluated meets the desired result.
Pseudo Code of a Greedy Algorithm to Find Largest Difference of any Two Numbers in an Array.
    greedy algorithm (array)
        var largest difference = 0
        var new difference = find next difference (array[n], array[n+1])
        largest difference = new difference if new difference is > largest difference
        repeat above two steps until all differences have been found
        return largest difference
This algorithm never needed to compare all the differences to one another, saving it an entire iteration.
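A runnable Python version of the pseudocode above (largest_adjacent_difference is an illustrative name; like the pseudocode it only compares adjacent pairs, and taking the absolute value of each difference is an assumption): it keeps only the best difference seen so far in a single pass.

    def largest_adjacent_difference(array):
        """Greedy one-pass version of the pseudocode above: keep only the best difference seen."""
        largest_difference = 0
        for n in range(len(array) - 1):
            new_difference = abs(array[n + 1] - array[n])
            if new_difference > largest_difference:
                largest_difference = new_difference
        return largest_difference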