**Stacks and queues**. Stacks and queues are data types that represent a collection of items to which we can add items and from which we can remove items. They differ in the order in which items are removed:

- a stack removes the item that was most recently added (LIFO: last in first out)
- a queue removes the item that was least recently added (FIFO: first in first out)
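The difference in removal order can be seen with a small demo. This sketch uses `java.util.ArrayDeque` (which can act as both a stack and a queue) purely for illustration; the implementations discussed below build these types from scratch.

```java
import java.util.ArrayDeque;

public class OrderDemo {
    public static void main(String[] args) {
        ArrayDeque<Integer> stack = new ArrayDeque<>();
        ArrayDeque<Integer> queue = new ArrayDeque<>();
        for (int i = 1; i <= 3; i++) {
            stack.push(i);   // stack: most recently added item is on top
            queue.add(i);    // queue: items join at the back
        }
        // The stack removes 3, 2, 1 (LIFO); the queue removes 1, 2, 3 (FIFO).
        while (!stack.isEmpty()) System.out.print(stack.pop() + " ");
        System.out.println();
        while (!queue.isEmpty()) System.out.print(queue.remove() + " ");
        System.out.println();
    }
}
```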

**Linked lists**.
Linked lists can be used to implement both stacks and queues, yielding Θ(1) time operations. To achieve Θ(1) time operations for a queue, we require a reference to both the front and back of the list (for `dequeue` and `enqueue` respectively); a stack needs only a reference to the front.
The implementation of queues and stacks using linked lists (from lecture) consumes ~40*n* bytes of memory for a stack/queue containing *n* items (40 bytes per node in the list).
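A minimal linked-list stack along the lines of the lecture implementation might look like the sketch below. The memory accounting in the comment is one way the ~40-byte-per-node figure can break down, assuming a typical 64-bit JVM and a non-static inner `Node` class (a static inner class would shave off the 8-byte reference to the enclosing instance).

```java
public class LinkedStack<Item> {
    // Per node, roughly: 16 bytes object overhead + 8 bytes item reference
    // + 8 bytes next reference + 8 bytes inner-class overhead = 40 bytes.
    private class Node {
        Item item;
        Node next;
    }

    private Node first;   // reference to the front (top) of the list

    public boolean isEmpty() { return first == null; }

    public void push(Item item) {   // Θ(1): link a new node in at the front
        Node old = first;
        first = new Node();
        first.item = item;
        first.next = old;
    }

    public Item pop() {             // Θ(1): unlink the front node
        Item item = first.item;
        first = first.next;
        return item;
    }
}
```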

**Resizing arrays**. Logically, a resizing array is an array
that grows and shrinks as elements are added and removed. A
resizing array is implemented using a reference to a fixed-size
array: growing and shrinking is accomplished by allocating a new
fixed array of appropriate size and copying the contents from old to
new.
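A sketch of a resizing-array stack in the style of the textbook's Algorithm 1.1 is below: the array doubles when full and halves when one-quarter full, by allocating a new fixed-size array and copying the contents across.

```java
public class ResizingArrayStack<Item> {
    private Item[] a = (Item[]) new Object[1];  // underlying fixed-size array
    private int n = 0;                          // number of items on the stack

    public boolean isEmpty() { return n == 0; }

    public void push(Item item) {
        if (n == a.length) resize(2 * a.length);  // full: double the capacity
        a[n++] = item;
    }

    public Item pop() {
        Item item = a[--n];
        a[n] = null;                              // avoid loitering
        if (n > 0 && n == a.length / 4)           // one-quarter full: halve
            resize(a.length / 2);
        return item;
    }

    private void resize(int capacity) {
        Item[] copy = (Item[]) new Object[capacity];  // new fixed array
        for (int i = 0; i < n; i++) copy[i] = a[i];   // copy old to new
        a = copy;
    }
}
```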

The operations of stacks and queues implemented with resizing arrays
run in Θ(n) time in the worst case due to the potential for array
resizing; however, each operation takes only Θ(1) *amortized* time. The
implementation of queues and stacks using resizing arrays (from
lecture) consumes between ~8*n* and ~32*n* bytes of
memory for a stack/queue containing *n* items (~8*n*
in the best case when the array is full, and ~32*n* in the
worst case when the array is 1/4 full).

**Amortized analysis**.
Amortized analysis is a way of understanding the performance of a data
structure or algorithm that is less pessimistic than the worst-case
model. It is often used for data structures in which there are
events that are computationally expensive but uncommon (e.g.,
invocations of `push` or `pop` that trigger resizing
in an array implementation of a stack).

The amortized cost of the data structure is the worst-case cost of a
sequence of operations on that data structure (starting from a
freshly-initialized data structure) divided by the number of
operations, or equivalently the average cost of an operation in a
sequence of operations.
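The doubling strategy illustrates this definition. The hypothetical tally below (an illustration, not lecture code) counts the items copied during resizes over a sequence of pushes, assuming capacity starts at 1 and doubles whenever the array fills: the total copy work stays below twice the number of pushes, so the amortized cost per push is constant even though an individual push can cost Θ(n).

```java
public class AmortizedDemo {
    public static void main(String[] args) {
        int n = 0, capacity = 1;
        long copies = 0;              // total items copied during all resizes
        int pushes = 1_000_000;
        for (int i = 0; i < pushes; i++) {
            if (n == capacity) {      // expensive but rare: copy all n items
                copies += n;
                capacity *= 2;
            }
            n++;                      // the cheap, common case
        }
        // Resize costs are 1 + 2 + 4 + ... < 2 * pushes, so the
        // amortized copy cost per push is a constant.
        System.out.println(copies + " copies for " + pushes + " pushes");
        assert copies < 2L * pushes;
    }
}
```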

**Loitering**. Java is a garbage-collected language, meaning that the Java runtime is responsible for deallocating unused memory (garbage). The Java garbage collector determines that memory is garbage if there are no accessible references to it. *Loitering* is a bug in which there is memory that will never be used, but also cannot be collected because there are still references to it. Loitering can be avoided by removing references to memory that are no longer used (e.g., by writing `null` to a location that contains a reference to that memory).
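A minimal sketch of the fix (the class and sizes here are illustrative, not from lecture): the popped slot still holds a reference, so writing `null` into it is what lets the garbage collector reclaim the item.

```java
public class LoiterFreeStack {
    private Object[] a = new Object[8];
    private int n = 0;

    public void push(Object item) { a[n++] = item; }

    public Object pop() {
        Object item = a[--n];
        a[n] = null;   // without this line the popped object loiters:
                       // a[n] still references it, so it cannot be collected
        return item;
    }
}
```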

- Textbook 1.3.6
- Textbook 1.3.3
- Textbook 1.3.22, then 1.3.23
- Consider the following implementation of `pop` in `ResizingArrayStack` (Algorithm 1.1 from the textbook):

  ```java
  // Remove item from top of stack.
  public Item pop() {
      Item item = a[--N];
      a[N] = null;
      if (N > 0 && N == a.length/4) resize(a.length/2);
      return item;
  }
  ```

  What is the purpose of the `a[N] = null` line?

**Answers**

- Textbook 1.3.5
- Textbook 1.4.35 [note: a pushdown stack is just a stack.]
- Textbook 1.4.36 [note: they recommend using a static inner class to reduce Node overhead, hence the disagreement with the lecture slides!]
- Consider again the `pop` implementation in C-level question 4. If we get rid of the line `a[N] = null`, how does this affect the best- and worst-case memory usage of our resizing-array-based stack?

**Answers**

- What is the *worst-case* time complexity of pushing the numbers 1, 2, ..., n onto a stack and then popping them back off, as a function of n (in Θ notation), when the stack is implemented with a resizing array?

**Answers**

- Textbook 1.4.32
- Suppose we have a resizing array that increases in size by K entries when the array is full, and decreases in size by K entries when the array has K empty entries. Show that `push` and `pop` take Θ(M) amortized time for some worst-case sequence of M operations. Give an example of such a worst-case sequence. Observe that this results in Θ(M²) time for M operations.
- Explain how a queue can be implemented using two stacks, achieving Θ(1) amortized time `push` and `pop` operations.
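One standard approach to this question (a sketch, not necessarily the intended solution) keeps an "in" stack for arrivals and an "out" stack for departures, reversing the "in" stack into "out" only when "out" runs dry. Each item is moved at most once from "in" to "out", so a sequence of M operations does Θ(M) total work: Θ(1) amortized per operation.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class TwoStackQueue<Item> {
    private final Deque<Item> in  = new ArrayDeque<>();  // newly enqueued items
    private final Deque<Item> out = new ArrayDeque<>();  // items in dequeue order

    public void enqueue(Item item) { in.push(item); }

    public Item dequeue() {
        if (out.isEmpty())                         // rare, expensive case:
            while (!in.isEmpty()) out.push(in.pop());  // reverse "in" into "out"
        return out.pop();                          // common Θ(1) case
    }
}
```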

**Answers**