Bill Wadge ruined my life! I stumbled upon his book on his language Lucid in the library when I was in grad school, and it opened up my eyes about different programming models. It inspired me to try making my own programming language and I’ve been doing it ever since!
Going on 10 years now, no sign of losing interest. Don’t read his book if you have an interest in the thing you’re doing; it’ll turn you into a compiler dev and you’ll never do your interest again.
Lucid was a strong inspiration for my programming language Intonal, particularly the “fby” operator, which took up about five years of my life. I’m probably going to open-source it soon.
In my head the two dimensions are tail vs. non-tail jumps, and explicit vs. implicit scope passing.
The most interesting case is implicit scope plus non-tail recursion; usually this requires you to capture the variable context in mutable objects, monads, effects, or similar.
This explicit capturing is neat because you still have consistent shapes for your state to define invariants over, but it's easier to decompose the logic and ignore parts of the context which are irrelevant. It lets you split problems into domain specific languages, each of which has a set of (maybe overlapping) contexts, and ideally each of which can be proven in isolation.
Also, the control flow of loops is a very restricted case of even tail jumps. Tail recursion has arbitrary jumps between basic blocks and loops are properly nested basic blocks. Even with labeled breaks, efficiently simulating arbitrary tail recursion without goto is tough. Induction proofs/data flow analysis/abstract interpretation don't care, but I'm not sure it makes proofs easier.
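One way to see the gap: arbitrary tail calls between mutually recursive functions can be simulated without goto by a trampoline, at the cost of allocating a thunk per jump. A minimal sketch in Python (the names `is_even`, `is_odd`, and `trampoline` are made up for illustration):

```python
# Each "tail call" returns a zero-argument thunk instead of calling
# directly; a driver loop runs thunks until a non-callable value appears.
# This keeps the Python stack flat even for deep mutual recursion.

def is_even(n):
    return True if n == 0 else (lambda: is_odd(n - 1))

def is_odd(n):
    return False if n == 0 else (lambda: is_even(n - 1))

def trampoline(result):
    while callable(result):
        result = result()
    return result

print(trampoline(is_even(100000)))  # runs without hitting the recursion limit
```

This works, but note the overhead: every jump allocates a closure, which is exactly why compilers prefer real gotos between basic blocks when they can get them.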
For the first few paragraphs I actually thought the article was describing the D language, but apparently it's a new interpreted language, namely PyFL.
It seems to me that to create a powerful meta web framework like Ruby's RoR, a programming language needs to have the right Yin and Yang (recursion and iteration) balance mentioned in the article [1], [2]. Getting that balance in an interpreted language is much easier than in a strongly typed compiled language, which makes the D language truly unique and special [3].
[1] Why Ruby on Rails still matters (458 comments):
https://news.ycombinator.com/item?id=43130546
[2] "Stop Designing Languages. Write Libraries Instead.":
https://lbstanza.org/purpose_of_programming_languages.html
[3] D language forums reaction to the article in [2]:
https://forum.dlang.org/post/mailman.8425.1556827686.29801.d...
I like the idea mentioned in the article of exploring limited higher-order functions: functions that can only take as arguments functions that themselves do not take functions as arguments. But what simplification does this buy (in the implementation of such a language) over a language with unrestricted higher-order functions? The article doesn't explain.
If you return a function, the compiler has to bundle up the values of any variables that the function uses from its outer environment and return the whole package. That's known as a closure. You can't just return a function pointer.
Basically, if you don't return functions, you can avoid dealing with them "escaping" the lexical environment in which their code lives.
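To make the "escaping" concrete, here is a minimal sketch in Python (`make_adder` is a made-up example, not from the article):

```python
def make_adder(n):
    # 'n' lives in make_adder's stack frame. Returning the inner
    # function forces the implementation to capture 'n' in a closure,
    # because the frame is gone by the time add_n is finally called.
    def add_n(x):
        return x + n
    return add_n

add5 = make_adder(5)
print(add5(3))  # 8
```

If functions could only be passed *down* (as arguments), `n` would always still be live on the stack when the inner function runs, and a bare function pointer plus a frame pointer would suffice; no heap-allocated closure is needed.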
"Java and C, for example, have elaborate while and for statements." OK, I'll give you the `for` statement; it's remarkably complicated when you look at it. But I don't see how you can call the `while` loop elaborate.
Don’t forget “do while”, and the various control statements they support.
And continue and break
Iteration is not the Yang; it is bloat. There is no such thing as iteration, only head and tail recursion. May we pray to God for forgiveness from our sins.
"But they treat functions as second-class objects that e.g. can’t be returned as results." Returning a lambda in Java is really no big deal.
I must confess, I have never been clear on the difference between iteration and recursion, or more specifically what differentiates a tail recursive loop from an iteration. Usage of iteration in the context of, say, an iterative solution technique such as gradient descent or an iterated function system feels very much like a (tail) recursion.
Recursion without optimization requires a possibly unbounded stack. Tail-call optimization lets certain recursive functions be compiled so that no nested stack frames are needed.
So from a purely functional point of view, there is essentially a bijection between tail-recursive functions and loops.
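The correspondence is mechanical: the accumulator argument of the tail call becomes the mutable loop variable. A sketch of the same fold both ways (illustrative names; note that CPython does not perform tail-call optimization, so the recursive version will still hit the recursion limit on long inputs):

```python
def sum_rec(xs, acc=0):
    if not xs:
        return acc
    # Tail call: nothing remains to do in this frame after the call,
    # so an optimizing compiler could reuse the frame -- i.e., loop.
    return sum_rec(xs[1:], acc + xs[0])

def sum_iter(xs):
    acc = 0          # the accumulator argument, now a mutable variable
    for x in xs:     # each loop step is one "tail call"
        acc += x
    return acc

print(sum_rec([1, 2, 3, 4]), sum_iter([1, 2, 3, 4]))  # 10 10
```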
> Even Python, however, still limits the use of functions. It does not allow partial application
https://docs.python.org/3/library/functools.html#partial-obj...
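For reference, `functools.partial` from the standard library does exactly this: it freezes some arguments of a function and returns a new callable (`power` and `square` below are made-up names for illustration):

```python
from functools import partial

def power(base, exp):
    return base ** exp

# Freeze exp=2 to get a one-argument squaring function.
square = partial(power, exp=2)
print(square(7))  # 49
```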
> The answer is that bright Yang – iteration – has its own dark side, its own Yin. The presence of side effects (among other things) makes the programs hard to understand, verify, or modify.
There are no "side-effects", only effects. The question is, "Have you planned and implemented their cascades according to the design requirements?"
Change is all that matters in our digital logic manipulation business, and the only bottom-line results of any consequence are the bits that have been changed, plus the where, how, and why they changed.
The overall theme of the article presents an interesting and well-considered perspective, but knowing that every recursive algorithm has an equivalent iterative version makes me more interested in the traversals themselves than in the mechanism used to perform them.
I want my data to flow in precisely specified ways. Choosing the proper tooling is an essential part of a project's earliest stages, and different folks succeed with different tools.
[addendum: learned a cool new word today: eduction := the act of drawing out or bringing into view]