A **dynamic array** automatically grows when you try to make
an insertion and there is no more space left for the new
item. Usually, the array doubles in size.

Additional functionality often comes with a cost. Here, we need to do some tricky things under the hood when we run out of room. When we don't have any space for a new item, we have to allocate a bigger array and copy over all of the elements from the array we've outgrown before we can finally append our item.
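That growth step can be sketched in code. Here's a minimal toy dynamic array in Python — the fixed-size list of `None`s stands in for a raw array, and names like `_grow` are our own illustration, not any particular library's API:

```python
class DynamicArray:
    """Toy dynamic array built on a fixed-size list (a stand-in for a raw array)."""

    def __init__(self):
        self._capacity = 1
        self._size = 0
        self._items = [None] * self._capacity

    def append(self, item):
        if self._size == self._capacity:
            self._grow()  # out of room: allocate a bigger array first
        self._items[self._size] = item
        self._size += 1

    def _grow(self):
        # Allocate an array twice as big and copy over every element
        # from the array we've outgrown.
        self._capacity *= 2
        bigger = [None] * self._capacity
        for i in range(self._size):
            bigger[i] = self._items[i]
        self._items = bigger

    def __len__(self):
        return self._size

    def __getitem__(self, i):
        if not 0 <= i < self._size:
            raise IndexError(i)
        return self._items[i]
```

The copy loop in `_grow` is exactly the "tricky thing under the hood" — it touches every existing element before the new item can go in.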

Doing all that copying takes O(n) time, where n is the number of elements in our array.

Oof. That's an expensive cost for an append. In a fixed-length array, appends only take O(1) time!

But appends take O(n) time only when we insert into a
*full* array. And that's pretty rare, especially if we double the size of
the array every time we run out of space. So in *most* cases
appending is still O(1) time, and
*sometimes* it's O(n) time.

So focusing on a single append seems a bit harsh. Sure, the append
*could* be expensive, but it's far more likely it'll be cheap. A
more fair analysis would look at the cost of a single append
averaged over a large number of appends. This is
called **amortized analysis**.

"Amortize" is a fancy verb used in finance that refers to paying off the cost of something gradually. With dynamic arrays, every expensive append where we have to grow the array "buys" us many cheap appends in the future. Conceptually, we can spread the cost of the expensive append over all those cheap appends.

Here, instead of looking at the worst case for an append individually, let's look at the overall cost of doing many appends—let's say m appends.

The cost of doing m appends is
m (since we're appending every item),
*plus* the cost of any doubling when the dynamic array needs
to grow. How much does the doubling cost?

Say we start off with space for just one item. Then the first doubling costs 1. The second costs 2. The third costs 4. The fourth costs 8.
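A quick loop confirms that sequence of doubling costs (the cutoff of 16 is just an arbitrary stopping point for the illustration):

```python
# Each doubling copies every element currently in the array,
# so a doubling from `capacity` costs `capacity` copies.
capacity = 1
doubling_costs = []
while capacity < 16:
    doubling_costs.append(capacity)
    capacity *= 2

print(doubling_costs)  # → [1, 2, 4, 8]
```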

So, the cost we're looking at is:

1 + 2 + 4 + 8 + ... + \frac{m}{2} + m

It's helpful to look at that from right to left:

m + \frac{m}{2} + \frac{m}{4} + ... + 4 + 2 + 1

We can see what this comes to if we picture it geometrically. Draw a square of size m. Next to it, draw \frac{m}{2}, which is half the size. Then \frac{m}{4}, which is half the size of that. And so on.

All the pieces after the first fit together into another square of size m, making the sum 2m.

So when we do m appends, the appends themselves cost m, and the doubling costs 2m. Put together, we've got a cost of 3m, which is O(m). So on average, each individual append is O(1). m appends cost us O(m).
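We can sanity-check that accounting with a short simulation that counts every element write and every copy made during a doubling. The cost model here — one unit per element written or copied — is our own simplification:

```python
def total_append_cost(m):
    """Count element writes plus doubling copies across m appends,
    starting from an array with space for one item."""
    capacity, size, cost = 1, 0, 0
    for _ in range(m):
        if size == capacity:
            cost += capacity   # copy everything into a bigger array
            capacity *= 2
        cost += 1              # write the new item
        size += 1
    return cost

m = 1024
print(total_append_cost(m))        # → 2047
print(total_append_cost(m) <= 3 * m)  # → True: the total stays under 3m
```

For m = 1024 the copies come to 1 + 2 + 4 + ... + 512 = 1023 and the writes to 1024, so the total of 2047 sits comfortably under the 3m bound.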

Remember, even though the **amortized cost** of an append
is O(1), the **worst case**
cost is still O(n). Some
appends *will* take a long time.
