# Dynamic Programming

Hardly an interview at the tech giants goes by without a question on Dynamic Programming. It is a fancy name for memoization: storing computed values for future reference.

Would it take you long to calculate the Fibonacci of 100?

Would it still take long if you already knew the Fibonacci of 98 and 99?
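The intuition can be checked directly: once the two preceding values are known, the next Fibonacci number is a single addition. A quick sketch in Python:

```python
# With fib(98) and fib(99) already known, fib(100) is one addition away.
fib_98 = 135301852344706746049
fib_99 = 218922995834555169026

fib_100 = fib_98 + fib_99
print(fib_100)  # 354224848179261915075
```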

If you ever come across a problem where, in order to solve for a given input, you have to solve the same problem for sub-problems of that input, then it is a classic case of dynamic programming.

For an integer it could mean solving for smaller integers; for a string it could mean solving for a set of substrings, and so on.

To demonstrate this, I would like to take the classic example of Fibonacci numbers. If you would like a primer on the recursive solution for Fibonacci, check out: Fibonacci Iterative vs. Recursive.

## Time Complexity:

As we have seen in the post Fibonacci Iterative vs. Recursive, the plain recursive solution takes O(2^n) time.
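For reference, here is a minimal sketch of that naive recursive solution (the function name `fib` is chosen here for illustration, not taken from the linked post):

```python
def fib(n):
    """Naive recursion: each call spawns two more calls,
    recomputing the same sub-problems over and over -> O(2^n) time."""
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(10))  # 55
```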

If we look at the recursion tree, the recursive function first walks the longest branch of the tree, calculating all the required Fibonacci numbers and storing them in the array along the way. Further calls for those values therefore do not spawn any new sub-trees; each of them returns in O(1), so the total time taken is O(n).
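A minimal sketch of that memoized recursion, assuming a helper named `fib_memo` (a name introduced here, not from the original post):

```python
def fib_memo(n, memo=None):
    """Top-down DP: store each Fibonacci number in an array the first
    time it is computed, so any repeated lookup returns in O(1)."""
    if memo is None:
        memo = [-1] * (n + 1)   # -1 marks "not yet computed"
    if n <= 1:
        return n
    if memo[n] != -1:           # sub-problem already solved
        return memo[n]
    memo[n] = fib_memo(n - 1, memo) + fib_memo(n - 2, memo)
    return memo[n]

print(fib_memo(100))  # 354224848179261915075
```

The array costs O(n) space, and the recursion stack adds another O(n), matching the analysis below.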

## Space Complexity:

The space complexity for the array is O(n) and for the recursive call stack is also O(n). Hence the total space complexity is O(n).

Why go to such lengths when we can compute Fibonacci numbers iteratively in O(n) time and O(1) space?

The space trade-off of Dynamic Programming lets us retrieve a pre-calculated value in O(1) time. Hence, in real-time systems, this can be very beneficial and even desirable.