Going bottom-up is a way to avoid recursion, saving the memory cost that recursion incurs when it builds up the call stack.
Put simply, a bottom-up algorithm "starts from the beginning," while a recursive algorithm often "starts from the end and works backwards."
For example, if we wanted to multiply all the numbers in the range 1...n, we could use this cute, top-down, recursive one-liner:
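In Python, that one-liner might look something like this (the function name `product_1_to_n` is illustrative):

```python
def product_1_to_n(n):
    # Top-down: multiply n by the product of everything below it,
    # recursing all the way down to the base case.
    return 1 if n <= 1 else n * product_1_to_n(n - 1)
```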
This approach has a problem: it builds up a call stack of size O(n), which makes our total memory cost O(n). This makes it vulnerable to a stack overflow error, where the call stack gets too big and runs out of space.
To avoid this, we can instead go bottom-up:
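A bottom-up sketch in Python (same illustrative function name as above): instead of recursing down from n, we start at 1 and multiply our way up.

```python
def product_1_to_n(n):
    # Bottom-up: accumulate the product iteratively,
    # so there's no recursive call stack to grow.
    result = 1
    for num in range(1, n + 1):
        result *= num
    return result
```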
This approach uses O(1) space (and O(n) time).
Going bottom-up is a common strategy for dynamic programming problems, which are problems where the solution is composed of solutions to the same problem with smaller inputs (as with multiplying the numbers 1...n, above). The other common strategy for dynamic programming problems is memoization.
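For contrast, here is one way memoization might look in Python, using the classic Fibonacci example: the algorithm stays top-down and recursive, but each subproblem's answer is cached so it's only computed once.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # Still top-down recursion, but lru_cache memoizes results,
    # so each fib(k) is computed only once.
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```

Note that memoization avoids redundant work but, unlike going bottom-up, still builds an O(n) call stack.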