
Data Structure - Algo Expert

The document provides an overview of various data structures, including arrays, linked lists, hash tables, stacks, queues, graphs, and trees, along with their time and space complexities. It covers concepts such as memory management, Big O notation, and the characteristics of different structures like heaps and binary trees. Additionally, it includes notes on operations and complexities associated with these data structures, as well as some questions for further exploration.


Data Structures

- used to organize and manage data
- see the Wikipedia definition

Complexity Analysis
Time Complexity: the time an algorithm takes to run
Space Complexity: the memory an algorithm takes to execute

Memory
- bounded, a limited number of slots; 8 bits = 1 byte
- base 2 system (see Wikipedia) -> read from right to left -> | 2^7 2^6 2^5
2^4 | 2^3 2^2 2^1 2^0 |
- any memory slot can store another memory address / pointer
- endianness (not a topic for interviews)
- memory address
- each memory slot can store only 8 bits = 1 byte, e.g. a 32-bit integer
occupies 4 memory slots; a 64-bit integer occupies 8 slots
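A quick Python sketch of the slot counting above (my own example, not from AlgoExpert): serializing the same value at 32-bit and 64-bit widths shows the 4-slot vs 8-slot difference.

```python
# A 32-bit integer occupies 4 one-byte memory slots; a 64-bit integer, 8.
value = 1_000_000

# Serialize the same value at both widths (big-endian here; endianness
# only changes the byte order, not the number of slots).
as_32_bit = value.to_bytes(4, byteorder="big")
as_64_bit = value.to_bytes(8, byteorder="big")

print(len(as_32_bit))  # 4 slots
print(len(as_64_bit))  # 8 slots
```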

Big O Notation
- time and space complexity are measured the same way
- asymptotic analysis
- the exact number of operations doesn't matter as long as it doesn't change
as the input grows; the bigger the input, the worse an algorithm performs
- O(1) constant time, O(log(n)), O(n) linear time, O(n log(n)), O(n^2),
O(n^3), O(n^4), etc.; O(2^n), O(n!)
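A tiny illustration of why constant factors drop out (a sketch of my own): both functions below are O(n), because doubling the input doubles the work in each, and Big O ignores the fixed 3x difference between them.

```python
# Both functions are O(n): the operation count grows linearly with n,
# and Big O drops the constant factor (1 vs 3 operations per element).
def count_ops_a(n):
    ops = 0
    for _ in range(n):
        ops += 1          # 1 operation per element -> O(n)
    return ops

def count_ops_b(n):
    ops = 0
    for _ in range(n):
        ops += 3          # 3 operations per element -> still O(n)
    return ops

print(count_ops_a(10), count_ops_b(10))  # 10 30
print(count_ops_a(20), count_ops_b(20))  # 20 60
```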

Logarithm
- always base 2 here; log(n) = y iff 2^y = n; "iff" = if and only if
- if you double the number N, the exponent only increases by one; e.g. 2^4
= 16 -> 2^5 = 32; 2^30 is roughly 1 billion
- as N increases, y increases only by a tiny amount; log(n) is good because
when the input grows, the complexity grows only slightly
- for log(n), remember the example of binary tree traversal, where half the
tree is discarded at each step and only the other half remains to be searched
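The halving idea above, sketched as a binary search (my own example): each step discards half the remaining range, so a sorted list of n items is searched in about log2(n) steps.

```python
# Binary search: each iteration halves the search range, giving O(log(n)) time.
def binary_search(items, target):
    lo, hi = 0, len(items) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid, steps
        if items[mid] < target:
            lo = mid + 1      # discard the left half
        else:
            hi = mid - 1      # discard the right half
    return -1, steps

index, steps = binary_search(list(range(1_000_000)), 424_242)
print(index, steps)  # found in at most ~20 steps, since 2^20 > 1,000,000
```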

Arrays
Static Array
- fixed size; insertion requires copying into a new array, which is costly
- array access is constant time
- you specify the size; the system allocates a fixed (specified) amount of
memory in sequence
- stored back to back in memory slots
- the system knows exactly where each piece of information is; that's why
access to the information is so fast
- get, set: O(1) ST - init: O(n) ST - traverse: O(n) T, O(1) S -
copy: O(n) ST - insert: O(n) T, O(1) S

Dynamic Array
- it can change its size
- fast for insertion at the end until it reaches the array's capacity; at
that point it has to copy the array into a bigger one
- get, set: O(1) ST - init: O(n) ST - traverse: O(n) T, O(1) S -
copy: O(n) ST
- insert at the end: O(1) TS (amortized)
    :best case - insertion occurs at the end of the array
    :in interviews, assume insertion doesn't hit the worst case
(copying/doubling the array) unless otherwise specified
    :if insertion occurs in the beginning or middle, or anywhere that
is not the end, the operation is O(n) T
    :pop at the end is O(1); anywhere else it's O(n)
    :in some problems at the beginning of AlgoExpert, pop at index 0
is referred to as O(1)
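A toy dynamic array sketch (mine, not AlgoExpert's): append is O(1) amortized because the backing storage only doubles when capacity runs out, so the O(n) copy is rare.

```python
# Toy dynamic array: the backing list doubles when full, making append
# O(1) amortized even though the occasional copy is O(n).
class DynamicArray:
    def __init__(self):
        self._capacity = 1
        self._size = 0
        self._slots = [None] * self._capacity

    def append(self, value):
        if self._size == self._capacity:      # full: copy into a bigger array, O(n)
            self._capacity *= 2
            new_slots = [None] * self._capacity
            for i in range(self._size):
                new_slots[i] = self._slots[i]
            self._slots = new_slots
        self._slots[self._size] = value       # usual case: O(1)
        self._size += 1

    def get(self, index):                     # O(1)
        return self._slots[index]

arr = DynamicArray()
for n in range(10):
    arr.append(n)
print(arr.get(9), arr._capacity)  # 9 16
```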

Linked Lists
- not stored back to back in memory; a linked list has a head pointing to
the first item in the list; it may or may not have a tail pointing to the last item
- each node takes two memory slots: one with the value, the other with a
pointer to the next node
    :get = O(i) T, O(1) S, where i = the number of items traversed to reach
the information
    :set = O(i) T, O(1) S; :init O(n) ST; :traverse = O(n) T, O(1)
S; :copy = O(n) ST
    :a singly linked list implementation uses node.next; a doubly linked
list uses node.next and node.prev
- with a few caveats, all these actions can be done in constant time
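A minimal singly linked list sketch (my own): get walks i nodes from the head, so it's O(i) time, while prepending at the head is O(1).

```python
# Singly linked list: head pointer plus nodes with value/next slots.
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

class SinglyLinkedList:
    def __init__(self):
        self.head = None

    def prepend(self, value):        # O(1): only the head pointer changes
        node = Node(value)
        node.next = self.head
        self.head = node

    def get(self, i):                # O(i): walk i nodes from the head
        node = self.head
        for _ in range(i):
            node = node.next
        return node.value

lst = SinglyLinkedList()
for v in [3, 2, 1]:
    lst.prepend(v)
print(lst.get(0), lst.get(2))  # 1 3
```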

Hash Tables
- key / value pairs
- keys don't have to be strings, as long as they can be converted into integers
- under the hood it is a list of linked lists, so when collisions occur you
won't lose information; each item in a linked list also points back to its key
- uses a hash function to transform the key into an index
- operations on average:
    :insert - O(1) ST; delete - O(1) ST; search - O(1) ST;
resize - O(1) ST amortized, because resizing happens rarely
- in the worst case these operations are O(n), but that only applies if the
interviewer mentions an edge case where you're dealing with collisions; otherwise
treat them as constant time

Stacks and Queues
- they are usually implemented with linked lists
- Stacks follow the rule LIFO: usually implemented with singly linked lists
(sometimes dynamic arrays)
- Queues follow the rule FIFO: usually implemented w/ doubly linked lists
    :push O(1) ST; :pop O(1) ST; :search O(n) T, O(1) S; :peek
O(1) ST - look at the element on the top of the stack / head of the queue
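A minimal sketch of both structures using Python's built-ins (my own example): a list as a LIFO stack and collections.deque as a FIFO queue, each with O(1) push/pop at the working end.

```python
# Stack (LIFO) via list, queue (FIFO) via deque; push/pop/peek are O(1).
from collections import deque

stack = []
stack.append("a")           # push
stack.append("b")
top = stack[-1]             # peek: "b"
popped = stack.pop()        # LIFO: "b" comes out first

queue = deque()
queue.append("a")           # enqueue
queue.append("b")
front = queue[0]            # peek: "a"
dequeued = queue.popleft()  # FIFO: "a" comes out first

print(popped, dequeued)  # b a
```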

Strings
- operations
    :traverse O(n) T, O(1) S; :copy O(n) ST; :get O(1) ST
    TODO: research how strings work under the hood (e.g. how replace works);
AlgoExpert doesn't explain it anymore; not relevant to interviews. Or was it
hash tables?

Graphs
- a collection of vertices that may or may not be connected
- made of vertices (values) and edges (which connect the vertices to one another)
- cyclic graph: 3 or more nodes pointing to each other, forming a loop a naive
traversal would never leave; something to look out for
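A minimal graph traversal sketch (my own, relevant to the DFS question below): with an adjacency list, a `visited` collection is what protects the traversal from the cyclic graphs mentioned above.

```python
# Iterative DFS over an adjacency list; tracking visited nodes prevents
# a cyclic graph (a -> b -> c -> a) from looping forever.
def dfs(graph, start):
    visited = []
    stack = [start]
    while stack:
        node = stack.pop()
        if node in visited:
            continue              # already seen: skip, so cycles terminate
        visited.append(node)
        stack.extend(graph[node])
    return visited

cyclic = {"a": ["b"], "b": ["c"], "c": ["a"]}  # 3 nodes in a loop
print(dfs(cyclic, "a"))  # ['a', 'b', 'c']
```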

Tree
- main types: Heaps (min/max), Binary Search Trees, Tries (nodes can have many
child nodes); storing these structures takes O(n) space
- Min Heap:
        - completeness: the binary tree has to have all its levels completed,
except possibly the last level, which is filled from left to right;
        - min heap property: every child node's value is higher than its
parent's; the root node is always the smallest number in the tree;
        - max heap property: every child node's value is lower than its
parent's; the root node is always the highest number in the tree;
        - it is not sorted;
        - can be represented as lists
                - current node => i;
                - left child => 2*i + 1;
                - right child => 2*i + 2;
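The list representation above in action (my own example, using Python's heapq, which is a min heap): the children of the node at index i sit at 2*i + 1 and 2*i + 2, and the root at index 0 is always the smallest value.

```python
# Min heap as a list: heapify rearranges the list so every child's value
# is >= its parent's, putting the smallest value at index 0.
import heapq

heap = [9, 4, 7, 1, 8, 3]
heapq.heapify(heap)

root = heap[0]               # smallest value, always at index 0
i = 0
left, right = heap[2 * i + 1], heap[2 * i + 2]

print(root)                            # 1
print(root <= left and root <= right)  # True: children >= parent
```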
- root and nodes; nodes with no child are leaf nodes (leaves); a path
between nodes is called a branch; height = the longest path from the root down
to a leaf; depth = a node's distance from the root
- k-ary tree: k is the maximum number of child nodes any node can have
- a tree can be considered balanced if it maintains roughly a log(n) time
complexity when it's traversed
- leaf/branch/completeness/full/perfect: vocabulary used to validate trees
- math exercise: find how many leaf nodes there are in a binary tree
- Binary Tree: a tree whose nodes have up to two child nodes; many of its
operations run in logarithmic time complexity
- K-ary Tree: a tree whose nodes have up to k child nodes; a binary tree is a
k-ary tree where k == 2
- Perfect Binary Tree: every interior node has two child nodes and all leaf
nodes are at the same level
- Complete Binary Tree: every level is filled except possibly the last, whose
nodes are as far left as possible
- Balanced Binary Tree: the heights of the two subtrees of every node differ
by no more than 1
- Full Binary Tree: every node has either two child nodes or none
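A small sketch to make the vocabulary concrete (my own example): a binary tree node plus a recursive check of the "full" property, i.e. every node has either 0 or 2 children.

```python
# Binary tree node and a recursive "full binary tree" check:
# a tree is full when no node has exactly one child.
class TreeNode:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def is_full(node):
    if node is None:
        return True
    if (node.left is None) != (node.right is None):  # exactly one child
        return False
    return is_full(node.left) and is_full(node.right)

full_tree = TreeNode(1, TreeNode(2, TreeNode(4), TreeNode(5)), TreeNode(3))
not_full = TreeNode(1, TreeNode(2))   # node 1 has only one child

print(is_full(full_tree), is_full(not_full))  # True False
```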

Questions:
- WTF Non Adjacent Sum
- DFS - still lost
