
My biggest insight about recursion came in my graduate algorithms course, from a great professor lecturing on dynamic programming. I think most people can agree that DP is one of the hairier subjects. When I was grinding Leetcode, I mostly skipped these problems and just hoped I would never be asked about them in an interview.

My prof cleared everything up, though. He said that DP is what naturally happens when you take a recursive algorithm and refactor it to store the recursive calls in their own data structure. When you compute Fibonacci recursively, you are really just using the call stack as a linked list. So instead of making the recursive call, figure out how value N in the list depends on the previous values, and compute it directly.
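A minimal Python sketch of that idea (my own illustration, not from the lecture): the recursive version leans on the call stack, the DP version replaces it with an explicit list.

```python
def fib_recursive(n):
    # Exponential time: the call stack implicitly revisits the same subproblems.
    if n < 2:
        return n
    return fib_recursive(n - 1) + fib_recursive(n - 2)

def fib_dp(n):
    # Replace the call stack with an explicit list: entry i holds fib(i).
    if n < 2:
        return n
    table = [0, 1]
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])
    return table[n]
```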

For more complicated algorithms, where the recursive call has N arguments, that means you need an N-dimensional array (in the worst case) to store your calls in.

After that lecture I was never scared of dynamic programming because I had a meta-algorithm to produce the DP solution.

1. Write a recursive algorithm (probably exponentially inefficient)

2. Figure out how to store recursive call data in a data structure

3. Figure out how to populate entry (a, b) of the data structure, normally by combining/minimizing/maximizing (a, b-1) and (a-1, b)

4. Figure out how to get the answer out of the data structure once you have filled it in
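The four steps above can be sketched on a concrete problem. Longest common subsequence is my example, not the professor's: the naive recursion takes two arguments, so step 2 gives a 2D table.

```python
def lcs_length(x, y):
    # Step 2: a 2D table, one dimension per recursive argument.
    # table[a][b] = length of the LCS of x[:a] and y[:b].
    table = [[0] * (len(y) + 1) for _ in range(len(x) + 1)]
    # Step 3: fill entry (a, b) from (a-1, b-1), (a-1, b), and (a, b-1).
    for a in range(1, len(x) + 1):
        for b in range(1, len(y) + 1):
            if x[a - 1] == y[b - 1]:
                table[a][b] = table[a - 1][b - 1] + 1
            else:
                table[a][b] = max(table[a - 1][b], table[a][b - 1])
    # Step 4: the answer sits in the last cell.
    return table[len(x)][len(y)]
```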

A similar "data structure" that made recursion easy for me is the set populated by an inductive proof in math. I think it's pretty much what you're talking about.

Wow, this is really helpful. Thanks for sharing.

This is a very helpful rule of thumb

Last week I published this article about recursion in Python, and I wanted to share it to gather feedback.

I think you could've kept the section on "basic recursion" short, keeping the focus on complex cases.

I.e. removing any definition other than normal & tail recursion, and shortening the "tree structure of recursion" bit into one paragraph.

Most people do understand basic recursion and the function call tree, but fail to grasp it in nontrivial scenarios.

A 30-minute read is too much for an article. Since you like explaining things in depth, consider writing a book.

I always question how long an article should be, but then I realise: even if 30 minutes is too much, if it saves you much more time googling around, it is worth it.

I want my articles to serve as future references, to be comprehensive and self-contained. I want them to be useful both to the complete beginner and to the person who already knows most of the stuff.

After gathering some feedback, I am now considering extending it to cover the difference between structural recursion and generative recursion, and the difference between memoization and dynamic programming.
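For what it's worth, the memoization-vs-DP distinction fits in a few lines of Python (my own sketch, not from the article): memoization keeps the recursion and caches results top-down, while tabulation drops the recursion and fills values bottom-up.

```python
from functools import lru_cache

# Top-down memoization: keep the recursive structure, cache the results.
@lru_cache(maxsize=None)
def fib_memo(n):
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

# Bottom-up DP (tabulation): no recursion, compute values in dependency order.
def fib_tab(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a
```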