Stack Basics
Applications of Stacks: function-call management (the call stack), undo/redo features, expression evaluation and syntax parsing, and backtracking algorithms.
ArrayList:
● Internally uses: A resizable array; elements are stored contiguously and accessed by index.
● Performance Characteristics:
○ Access Time (get): O(1) – Direct index into the backing array.
○ Insertion (at the end): Amortized O(1) – Occasionally O(n) when the backing array must be resized.
○ Insertion (at the beginning or middle): O(n) – All later elements must shift right.
○ Deletion (from the beginning or middle): O(n) – All later elements must shift left.
● Memory usage: Compact; only the elements themselves are stored, though the backing array may carry some unused spare capacity.
LinkedList:
● Internally uses: A doubly linked list, where each element (node) points to both its previous and next element.
● Performance Characteristics:
○ Access Time (get): O(n) – Slower, as you need to traverse the list from the start.
○ Insertion (at the beginning): O(1) – No shifting required, just adjusting pointers.
○ Insertion (in the middle): O(1) once you already hold a reference to the neighboring node, but O(n) overall, since reaching that position requires traversal.
○ Insertion (at the end): O(1) – Because a reference to the tail node is kept.
○ Deletion (from the beginning, or of a node you already hold): O(1) – Fast removal by updating pointers; deleting by index is O(n) due to traversal.
● Memory usage: Requires more memory per element due to storing pointers for the next and previous elements.
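The pointer structure described above can be sketched with a minimal doubly linked node. This is an illustration, not the real java.util.LinkedList source; the names Node and SimpleDeque are made up for the sketch.

```java
// Minimal doubly linked node: each element carries its data plus
// two references -- this is the per-element memory overhead.
class Node<T> {
    T value;
    Node<T> prev;
    Node<T> next;
    Node(T value) { this.value = value; }
}

class SimpleDeque<T> {
    private Node<T> head;
    private Node<T> tail;

    // O(1): a few pointer updates, no shifting of elements.
    void addFirst(T value) {
        Node<T> node = new Node<>(value);
        node.next = head;
        if (head != null) head.prev = node; else tail = node;
        head = node;
    }

    // O(1) because we keep a reference to the tail node.
    void addLast(T value) {
        Node<T> node = new Node<>(value);
        node.prev = tail;
        if (tail != null) tail.next = node; else head = node;
        tail = node;
    }

    // O(n): must walk from the head to reach index i.
    T get(int i) {
        Node<T> cur = head;
        for (int k = 0; k < i; k++) cur = cur.next;
        return cur.value;
    }
}
```

Note how addFirst and addLast never touch the other elements, which is exactly why both ends are O(1), while get has no choice but to traverse.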
When to Use Which?
Use a LinkedList when:
● You need fast insertions and deletions at both ends or in the middle of the list.
● You are working with large data sets that require frequent insertion or removal of
elements at arbitrary positions.
● You don’t need random access or are working with a queue or stack-like structure.
● Example scenario: Implementing a queue, deque, or scenarios where you frequently
insert or remove elements.
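The queue scenario mentioned above maps directly onto the standard library: LinkedList implements the Queue interface, so enqueue and dequeue only touch the ends. A minimal sketch:

```java
import java.util.LinkedList;
import java.util.Queue;

public class QueueDemo {
    public static void main(String[] args) {
        // LinkedList implements Queue: offer() appends at the tail,
        // poll() removes from the head -- both are O(1) pointer updates.
        Queue<String> queue = new LinkedList<>();
        queue.offer("first");
        queue.offer("second");
        queue.offer("third");

        System.out.println(queue.poll()); // first
        System.out.println(queue.poll()); // second
        System.out.println(queue.peek()); // third (peek does not remove)
    }
}
```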
When deciding between using a LinkedList or an ArrayList in Java, it’s important to consider
the differences in their internal structure and performance characteristics.
Use a LinkedList when:
● You need frequent insertions or deletions at the beginning or in the middle of the list.
● You don’t need fast random access to elements.
● The size of the list changes frequently.
2. What are the time complexities of get, add, and remove in ArrayList and LinkedList?
● ArrayList:
○ get(i): O(1)
○ add(e) at end: O(1) (amortized)
○ remove(i): O(n) (due to shifting)
● LinkedList:
○ get(i): O(n) (since it requires traversal)
○ add(e) at head/tail: O(1)
○ remove(i): O(n) (but O(1) if you're removing the first or last element)
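These costs show up directly when using both classes side by side. A quick sketch (the class name ComplexityDemo is made up):

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

public class ComplexityDemo {
    public static void main(String[] args) {
        List<Integer> arrayList = new ArrayList<>();
        LinkedList<Integer> linkedList = new LinkedList<>();
        for (int i = 0; i < 5; i++) {
            arrayList.add(i);   // amortized O(1): appends at the end
            linkedList.add(i);  // O(1): appends at the tail
        }

        int x = arrayList.get(3);   // O(1): direct index into the backing array
        int y = linkedList.get(3);  // O(n): traverses node by node

        arrayList.remove(0);        // O(n): shifts all remaining elements left
        linkedList.removeFirst();   // O(1): just rewires the head pointer

        System.out.println(x + " " + y);                  // 3 3
        System.out.println(arrayList + " " + linkedList); // [1, 2, 3, 4] [1, 2, 3, 4]
    }
}
```

Both lists end up with the same contents; the difference is how much work each operation had to do internally.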
● ArrayList uses less memory as it only needs to store the elements themselves.
● LinkedList requires more memory because each node stores references (pointers) to
both the next and previous elements.
In short:
● ArrayList is best when you need fast access (retrieval) and occasional modifications.
● LinkedList is best when you frequently insert or delete elements at various positions in
the list.
Comparing stack implementations:

| Aspect | Fixed-Size Array | ArrayList (Dynamic Array) | LinkedList |
|---|---|---|---|
| Memory Usage | Efficient (no extra memory for pointers) | Efficient for small stacks, but overhead for resizing | Higher due to node structure (data + pointers) |
| Push (Insertion) | O(1) (if not full) | Amortized O(1) (resizes when full) | O(1) (constant-time insertion at head) |
| Space Complexity | O(n) (size of the array) | O(n) (size of the array + extra space when resizing) | O(n) (size of the list + memory for pointers) |
| Memory Efficiency | Very efficient, no wasted memory | Uses more memory during resizing | Less memory efficient due to pointer overhead |
| Random Access | O(1) (direct access via index) | O(1) (direct access via index) | O(n) (traverse the list) |
| Use Cases | When stack size is known in advance | For moderate stacks with frequent resizes | For stacks with unpredictable size changes |
| When to Use | Best for small, fixed-size stacks | Suitable for dynamic stacks with fewer operations | Best for stacks with frequent push/pop and unpredictable size |
| Advantages | Simple and fast | Dynamically resizes as needed | No need for resizing |
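The "O(1) if not full" behavior of the fixed-size column comes from a plain array-backed stack like the sketch below. This is an illustrative class (FixedArrayStack is a made-up name), not a library type:

```java
// A fixed-capacity, array-backed stack: push/pop are O(1),
// but the capacity must be chosen up front.
class FixedArrayStack {
    private final int[] data;
    private int top = 0; // index of the next free slot

    FixedArrayStack(int capacity) {
        data = new int[capacity];
    }

    // Returns false instead of growing: a fixed array cannot resize.
    boolean push(int value) {
        if (top == data.length) return false; // stack is full
        data[top++] = value;
        return true;
    }

    int pop() {
        if (top == 0) throw new IllegalStateException("stack is empty");
        return data[--top];
    }

    boolean isFull()  { return top == data.length; }
    boolean isEmpty() { return top == 0; }
}
```

Swapping the int[] for an ArrayList or a linked node chain gives the other two columns of the table, trading the "full" failure mode for resizing cost or pointer overhead.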
Amortized O(1) refers to an average time complexity over a series of operations, where most
operations are O(1) (constant time), but occasionally, one operation might take longer (e.g.,
O(n)).
For example, in the case of ArrayList, adding an element is generally O(1) because you add
the element to the end of the list. However, when the underlying array is full, the array must be
resized to accommodate new elements. This resizing involves creating a new, larger array and
copying all the elements over, which takes O(n) time.
However, this resizing doesn't happen frequently. Most of the time, adding elements is still O(1).
So when we consider the average time complexity over many additions, the total cost of all
operations is spread out, and it averages out to O(1). This is what we call amortized O(1).
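The spreading-out of the resize cost can be made visible by instrumenting a toy dynamic array. This sketch uses a doubling strategy and a copy counter that the real ArrayList does not expose (GrowableArray is a made-up name):

```java
import java.util.Arrays;

// Toy dynamic array that doubles its capacity when full.
// The copies field counts element moves caused by resizing.
class GrowableArray {
    private int[] data = new int[2];
    private int size = 0;
    int copies = 0;

    void add(int value) {
        if (size == data.length) {
            copies += size;                       // the occasional O(n) copy...
            data = Arrays.copyOf(data, size * 2); // ...into a twice-as-large array
        }
        data[size++] = value; // the common case: a plain O(1) write
    }

    int size() { return size; }
}
```

With doubling, n insertions trigger copies of 2 + 4 + 8 + ... elements, which always totals less than 2n; divided across the n insertions, that is a constant amount of copying per add, i.e. amortized O(1).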
Analogy:
Imagine filling up a jar with marbles. Every time you add a marble, it fits, and it takes no extra
time (O(1)). But once the jar is full, you need to get a bigger jar, which takes time to transfer all
the marbles (O(n)). This jar-resizing step doesn't happen often, so most of your marble-adding
actions are still quick, making the overall average O(1).
Yes, you are correct that ArrayList is dynamically sized, meaning it grows automatically when
needed. However, the underlying implementation of an ArrayList in Java uses a resizable
array. This array has a fixed capacity at any given time, which is why the concept of being "full"
still applies temporarily.
● Initially, the ArrayList is backed by a regular array with a certain capacity (e.g., 10
elements by default).
● When you keep adding elements, eventually the capacity of the backing array is
reached.
● At this point, the ArrayList must resize the backing array to make more space.
● When the ArrayList reaches its current capacity, it creates a new array that is larger than the current one (about 1.5 times larger in Java's implementation; other dynamic arrays often double).
● All the elements from the old array are copied into this new, larger array.
● After this resize, new elements can be added to the ArrayList again until it fills up, at
which point the process repeats.
● Even though resizing takes O(n) time because you have to copy all elements, it doesn’t
happen on every insertion.
● Most insertions are done in O(1) (constant time).
● The cost of resizing is spread out over all the insertions, making the average time per
insertion O(1) in the long run.
● It adjusts its size dynamically by creating larger arrays, but this resizing mechanism is
what causes the array to occasionally be "full" before it can grow.
● This is why ArrayList offers amortized O(1) insertion — most insertions are fast, but
resizing can be costly.
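The growth step itself is small. OpenJDK's ArrayList grows by roughly 1.5x, along the lines of this simplified sketch (not the actual source, which also handles overflow and a minimum requested capacity):

```java
import java.util.Arrays;

class GrowthSketch {
    // Simplified 1.5x growth rule: newCapacity = oldCapacity + oldCapacity / 2.
    static int[] grow(int[] elementData) {
        int oldCapacity = elementData.length;
        int newCapacity = oldCapacity + (oldCapacity >> 1);
        // Copies every existing element into the new, larger array:
        // this copy is the O(n) step that makes append "amortized" O(1).
        return Arrays.copyOf(elementData, newCapacity);
    }
}
```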
In the case of an ArrayList, the underlying implementation is the specific data structure and algorithms used to store and manipulate the elements: in Java, a dynamic (resizable) array, together with the methods for adding, removing, and accessing elements.
● Hidden from the user: The user typically interacts with the ArrayList through a
high-level interface, without needing to know the details of the underlying
implementation.
● Essential for functionality: The underlying implementation is crucial for the correct
behavior and performance of the ArrayList.
● Can vary between implementations: Different programming languages or libraries may
have different underlying implementations for ArrayLists, leading to variations in
performance or features.
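Because the implementation is hidden behind the List interface, calling code can swap one list type for the other without changing anything else. A short example of this common Java idiom (InterfaceDemo is a made-up name):

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

public class InterfaceDemo {
    // This method depends only on the List interface,
    // not on which underlying implementation backs it.
    static int sum(List<Integer> numbers) {
        int total = 0;
        for (int n : numbers) total += n;
        return total;
    }

    public static void main(String[] args) {
        List<Integer> a = new ArrayList<>(List.of(1, 2, 3));
        List<Integer> b = new LinkedList<>(List.of(1, 2, 3));
        System.out.println(sum(a)); // 6
        System.out.println(sum(b)); // 6 -- same result, different internals
    }
}
```

The behavior is identical; only the performance characteristics described above differ between the two choices.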