I've been watching this project for a few years, and they're making good progress. For anyone interested in open-source computer algebra systems, there are of course plenty of more mature options: classical ones such as GNU Octave or Maxima, but also "modern" ones such as SageMath, Symbolics.jl, or SymPy. They span a broad range, from symbolic libraries such as GiNaC up to "batteries-included" IDEs like SageMath. The community is vibrant and amazing; for instance, SageMath basically pioneered the web notebook interface, which eventually brought us Jupyter in all its fashions.
I personally love the LISPy style of Mathematica (MMA), but of course it is not (only) the core that makes MMA so powerful; it is the huge library, which includes not only industry-leading solutions for basic topics such as symbolic integration, 2D/3D graphics, and finite element methods, but also a plethora of special-purpose domains such as bioinformatics. I guess Mathics has a good clone of the core but, of course, lacks all the libraries. It is, by the way, the same logic as with Matlab and its many "toolboxes" compared to numpy. However, the Python movement brought many novel codes into the numpy world that no longer work in Matlab.
You're right about the progress they've made: to me, this project's a really wonderful example of quietly chipping away at a project you love. I remember about... five years ago?... when the project was first launched, and I thought "hmm, that's nice - they've really nailed the symbolic evaluation engine, but let's see what happens". I'll have to remember their example every time I feel like starting a new project instead of chipping away at an old one...
Speaking of Lisp: you can jump from Maxima into Common Lisp in a breeze. It's better with SBCL because of its performance.
But that's Maxima, which is horrifyingly buggy.
Hmm, I was using it fairly heavily last year, for maybe a month, while going through a book that required loads of messing with graphs, and I don't remember it acting up... Perhaps I got lucky? Is it known to be buggy in general?
Yes, it's terribly buggy: incorrect answers that sometimes depend on the release version, and solve gets stuck all the time. I tend to use Xcas/GIAC instead where I can. It's also bundled on my calculator (HP Prime).
It's so bad that the university I'm "loosely associated" with has its own patched version to fix a load of problems with it.
Hmm, OK, shoot. I hope someone comes along and cleans it up one day; it's a historically significant piece of software, I would say.
Indeed. The problem is that there is clearly no regression suite. Sometimes things get broken and then have to be fixed again. You end up in a situation where you need to tell a person to use a specific version and hope they know how to get it working.
This makes communication, which is a really big part of mathematics, extremely difficult.
wxMaxima? For sure. Plain Maxima/XMaxima/maxima.el is mega-stable.
Stable yes. Buggy yes. There are all sorts of problems it gets stuck on. solve() is terrible.
Which Common Lisp interpreter/compiler did you use?
SBCL shipped via homebrew.
SBCL is fine. It's Maxima that is full of bugs.
I love how I get downvoted for every negative comment about this rather than an actual rebuttal of the issues.
It's fucking shit. I hate every moment using Maxima. I'd rather pay for Mathematica, but my org is too cheap for that. There you go. Enjoy :)
Even better, I'd rather do it by hand!
FriCAS is another CAS based on Common Lisp.
BTW, the current SBCL versions are much better than the stable/oldstable releases, such as the ones shipped with Debian or Ubuntu LTS.
Try Fricas, you might like it too.
On Maxima, beware: your package manager might ship you a Maxima version built with a generic, unoptimized CL compiler.
GNU CLISP's performance compared to SBCL is abysmal. On an N270 netbook, McCLIM widgets built from Ultralisp (a Quicklisp repo) run in real time with SBCL. ECL is a bit better, closer to SBCL than to CLISP. McCLIM fails to compile under CLISP due to its lack of threading support on 32-bit. And even if CLISP ran McCLIM, it would visibly redraw widgets, like Win32 ones did under a Pentium back in the day.
Thanks for the suggestion - will look into it tomorrow.
Nothing performance-heavy here. What I do require is something less buggy, though. Maxima has little to no test suite, so there are regularly stupid regressions that break everything.
> I love how I get downvoted for every negative comment about this rather than an actual rebuttal of the issues.
It's because you haven't mentioned any concrete issues; you've just said it's terrible and buggy. Nobody can debug that for you based on that description. If they haven't experienced the same "issues", the best they can say is "nuh-uh".
Maybe I'm mistaken, but I don't think of Octave, Matlab, or numpy as operating in the same space as a computer algebra system -- to me, those are all numerics-oriented languages/libraries, used to obtain numerical solutions to problems rather than exact symbolic expressions. The two approaches complement each other and are often used together, and systems like Mathematica and Mathics support both paradigms, but they aren't the same.
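To make the numeric-vs-symbolic distinction concrete, here is a minimal, hypothetical sketch in plain Python (no CAS involved, names are my own): a numerics-oriented tool approximates the integral of x^2 over [0, 1] in floating point, while a CAS-style computation applies the antiderivative x^3/3 and returns the exact rational 1/3.

```python
from fractions import Fraction

def trapezoid(f, a, b, n=1000):
    """Numerical route: approximate the integral with the trapezoid rule."""
    h = (b - a) / n
    total = (f(a) + f(b)) / 2
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Numerics-style answer: a float close to, but not exactly, 1/3.
numeric = trapezoid(lambda x: x * x, 0.0, 1.0)

# CAS-style answer: apply the antiderivative x^3/3 symbolically (done by
# hand here) and evaluate it with exact rational arithmetic -- no rounding.
exact = (Fraction(1) ** 3 - Fraction(0) ** 3) / 3
```

The float answer is good to several digits but carries discretization error; the exact answer is the rational number 1/3 itself, which is the kind of output a CAS is built to produce.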
When you go into battle to solve a computational mathematics problem, the problem sometimes doesn't care about these boundaries. You might think you're solving a problem in "computer algebra", but a sufficiently fast solution might end up involving numerical linear algebra in surprising ways. (E.g., enumerating elliptic curves over the rational numbers reduces to exact linear algebra because of Wiles' work on Fermat's Last Theorem, and exact linear algebra can often be done much more efficiently via clever tricks using floating-point matrices, which happen to be insanely fast to work with thanks to GPUs...) It's thus valuable, at least for research, to have large mathematical software systems that span these boundaries.
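The real techniques here go well beyond a comment, but the flavor of "exact linear algebra via floating point" can be sketched with a toy, hypothetical Python example of my own: do the arithmetic in fast hardware floats, then reconstruct the exact rational answer from the float result.

```python
from fractions import Fraction

def solve2x2(a, b, c, d, e, f):
    """Solve the system [[a, b], [c, d]] @ (x, y) = (e, f) by Cramer's rule."""
    det = a * d - b * c
    return (e * d - b * f) / det, (a * f - e * c) / det

# Step 1: do the arithmetic in fast hardware floating point.
xf, yf = solve2x2(1.0, 2.0, 3.0, 4.0, 5.0, 6.0)

# Step 2: reconstruct the exact rational answers from the floats.
# (Real systems bound the denominator rigorously, e.g. via Hadamard-style
# estimates; a generous cap suffices for this toy case.)
x = Fraction(xf).limit_denominator(10**6)
y = Fraction(yf).limit_denominator(10**6)
```

Here the float computation is exact enough that rational reconstruction recovers x = -4 and y = 9/2 precisely; at scale, the float pass is what makes the exact answer affordable.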
You're correct. The engines are actually quite orthogonal (symbolic vs numerical). There's the rare cross-over, but for the most part they are used to obtain different types of solution.
Mathematica can do both, but it's much stronger in symbolic than numerics. I wouldn't try to implement large-scale numerical models in Mathematica.
It isn't that simple. Yes, Mathematica has emphasized symbolic computation since day one, but it has also had numerical capabilities like NIntegrate since day one. There's no good reason why Mathematica's symbolic capabilities should necessarily hurt its numerical performance. In particular, Mathematica and Matlab often use the exact same libraries for many numerical operations, e.g. Intel's MKL for many matrix operations or SuiteSparse for sparse matrices. Any differences in how efficiently Matlab and Mathematica call out to those libraries usually come down to a few tweaks.
In the vast majority (90% or more) of cases, the performance of Matlab and Mathematica is identical or within the margin of error. And where performance does differ substantially, Matlab isn't always the one coming out on top. So the question then becomes whether or not your work depends on those particular use cases. In my eyes, the Wolfram Language is fundamentally superior to Matlab. And even if Mathematica/WL lags behind Matlab in some numerical aspects, Wolfram Research will have a far easier time fixing that deficiency than MathWorks will have improving the poverty of symbolic computation in Matlab.
Of course, the factors in practical use cases are quite a bit more complex than what I've said above. If you're working at an engineering firm that's already knee-deep in Simulink and various Matlab Toolboxes, then the choice has already been made for you. And sometimes it's not a question of performance but of feature parity, e.g. Compile in the Wolfram Language vs. the Matlab Compiler/Runtime. Also, the complexity of the Wolfram Language is not a benefit for users who simply want to write code and get a result ASAP.
These were my thoughts too. The use case for Mathematica is extremely different from Octave's.
I generally agree, but Matlab does have algebraic capabilities.
The numeric capabilities of Mathematica are far, far closer to Matlab's than the symbolic capabilities of Matlab are to Mathematica's.
Matlab's Symbolic Math Toolbox is actually a wrapper around another proprietary CAS, MuPAD.
If I recall correctly, the symbolic toolbox data structures were not first-class citizens in the MATLAB world (where the fundamental data structure was the matrix).
I used the Symbolic Toolbox in grad school to try to simplify equation systems, but once I got the result, I had to rewrite it in real MATLAB code.
It seems to be based on sympy: https://www.sympy.org/en/index.html
If you only want it for individual use, Wolfram Cloud is free to use. I think files are deleted after 30 days or something. Also, the Wolfram Engine is a free way to use Mathematica via the command line. Hey, it's something...
You may also buy a Raspberry Pi which comes with a Mathematica license.
With WLJS on top of the Wolfram Engine, it's a blast.
I didn't know about this. Would you know how close it gets to the Mathematica desktop experience? I could not find much about it online.
Pretty decent, though the notebooks are a bit buggy.
> I could not find much on this online
https://jerryi.github.io/wljs-docs/
A simpler intro to Mathics is here: https://mathics.org/
Why do I think this will be integrated into SageMath? :D
I'm not aware of any real push to include Mathics in SageMath. I speculate that's because SageMath is mainly developed by research mathematicians and cryptographers, so performance is often a primary concern when evaluating components for inclusion. One of the reasons SageMath is the largest Cython project is that Cython makes it possible to call fast C/C++ libraries from Sage.
I have the impression that Mathics is currently not seriously concerned with performance. E.g., try this little microbenchmark in Mathics: "AbsoluteTiming[Sum[i, {i, 1, 100000}]]", or read their roadmap. This is fine -- there are many interesting applications of the Mathematica programming language where performance isn't important, e.g., carefully stepping through some symbolic operations on expressions. However, for Sage the motivation of most developers is cutting-edge research math, and for that, performance is almost always very important. Performance is also why Sage implements a lot of functionality similar to SymPy's rather than just using SymPy -- SymPy can be relatively slow, partly because of its priority of being easy to install, which is definitely not a priority for SageMath.
The mission statement for SageMath is to be a viable alternative to Mathematica, Matlab, Magma, and Maple, but I never meant that to mean being a clone (e.g., directly running Mathematica code) -- just an alternative in the sense that it can support research, built on open-source mathematical software, that might otherwise be done with those closed-source programs.
It's an interesting exercise to think about why the performance of Sum[i, {i, 1, 100000}] differs between Mathics and MMA. Mathics just calls down to SymPy, which I think does the sum in Python [1]. Mathematica (likely) identifies your sum as the 100000th triangular number and computes it directly in native code, since I know Mathematica relies heavily on standard tables of summations/integrals/etc. [2]
[1] https://github.com/sympy/sympy/blob/master/sympy/concrete/su....
[2] See Sum's Method option.
Pure Python is on the order of 1000x faster at computing that sum by brute force than Mathics is. This suggests that some of the basic optimizations one does when implementing a language, to get even a minimal level of performance, haven't yet happened in Mathics. For example, when we added asymptotically fast arbitrary-precision integers to SageMath (by wrapping GMP, then later MPIR), we had to implement an "integer pool", since otherwise a lot of everyday microbenchmarks were far too slow.
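As a hypothetical illustration of the gap being described (function names are mine, not Mathics internals): a naive evaluator walks the sum term by term, while an optimizing one can recognize Sum[i, {i, 1, n}] as the n-th triangular number and use the closed form n(n+1)/2.

```python
def sum_brute_force(n):
    """Roughly what a naive evaluator does: iterate the range, adding terms."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_closed_form(n):
    """What an optimizing system can do instead: Sum[i, {i, 1, n}] is the
    n-th triangular number, n(n+1)/2, computed in O(1)."""
    return n * (n + 1) // 2

# Both agree on the benchmark's value.
assert sum_brute_force(100_000) == sum_closed_form(100_000) == 5_000_050_000
```

The closed form is constant-time regardless of n, which is one reason a table-driven system like MMA can be arbitrarily faster on such sums than any loop, whatever language the loop is written in.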
Note: Mathics is an optional Sage package - https://doc.sagemath.org/html/en/reference/spkg/mathics.html...
Software engineers will do anything to not pay for software.
I have a Mathematica license and at the same time find this project quite cool (and I'm a SW engineer). I would be surprised if mathics developers are not Mathematica users.
It’s not about the price, but about the freedom.
Freedom of or from what?
Freedom to study and modify the software you are using.
To make and redistribute fixes or changes or forks
And to read the source code.
Some go as far as writing software for themselves, and even open sourcing it
I paid $20 back in the day for a 3 DVD case of Debian Sarge plus a magazine sized handbook.
Mathematica is freely available on the Raspberry Pi[1] and most universities have site-wide licenses. The "Home & Hobby" licenses aren't even that expensive, only $195/year for a subscription or $390 for a perpetual license (and only $175 for renewals)[2]. And honestly, for those who are interested in tinkering but can't afford those prices, cracked copies are neither hard to find nor hard to install.
Personally, I am quite fond of Mathematica (well, the "Wolfram Language"), and I'm happy to pay the hobby license price. Not only do I think I'm getting my money's worth, but I think supporting mathematical software is a "good cause" to put money behind. Moreover, I've never understood why people like amateur photographers will often spend more on tools like Adobe CC than many programmers (amateur or otherwise) spend on their tools, period. Or why someone would spend $20-40+/month on various subscription services only to balk at $200-$400 licensing fees. Then again, I spend more time in Mathematica than I do in almost any other program installed on my computer.
However, there is still an important place for (F)OSS mathematical software. As comprehensive as Mathematica usually is, it still has some significant shortcomings in advanced mathematics. In particular, I have a hard time believing it will ever fill those more "niche" areas of mathematics, for two reasons. First and foremost, the ROI drops off a cliff for more advanced and/or esoteric fields. Second, the Wolfram Language already has 6000+ built-in functions, so adding hundreds more to comprehensively support something like group theory just doesn't make sense. Sure, they could be supported via packages, but that comes at the cost of performance (no first-class support in the kernel) and usability (users have to go out of their way to use them).
Therefore, (F)OSS software like GAP, M2, and PARI/GP serves an important role in rounding out the gaps in the Wolfram Language. In my case, I contribute to FOSS projects an amount equal to whatever I spend on my Mathematica license. Or, when monetary contributions to a project are not so straightforward, I try to contribute my time and skills to improving them.
To be frank, I don't care much for projects trying to replicate Mathematica's functionality. Of course, people are still going to develop and improve them, which at the very least puts some pressure on Wolfram Research to keep improving basic functionality, but it will probably take a decade or two for such projects to replicate what Mathematica/WL is today.
[1]: https://www.wolfram.com/raspberry-pi/ [2]: https://www.wolfram.com/mathematica/pricing/home-hobby/
I agree.
Stephen Wolfram seems to get a lot of hate on HN, but for doing experimental math, puzzle solving, quick data viz, and much more, Wolfram is a deep and lovely language. The integrated notebook, mouse hover docs, and (shockingly) single massive namespace with thousands of functions somehow add up to a simple and productive experience unlike any other I've used. And I am not generally a fan of "thick" IDEs.
Adding the notebook GUI to a CAS (with a flexible programming language like Lisp or the Mathematica language) was a huge innovation, and we can probably thank Mathematica (along with MathCAD and Maple) for providing the inspiration for other notebook systems such as Jupyter.
A notebook GUI for a CAS is just a fancy REPL, which any Lisp has had since the '70s-'80s ;)
Think Macsyma.
The GUI is a big part of what made Mathematica better (for many users) than Macsyma. But it wasn't just the GUI - it was also the document-oriented design where you could save the graphs and computational results, as well as the code to generate them, in a single document.
Python had a perfectly functional REPL, but Jupyter notebooks were still a huge advancement.
"I'm happy to pay for the hobby license price." ... "Then again, I spend more time in Mathematica than I do in almost any other program installed on my computer."
That is odd.
?
I’m not going to pay thousands for a commercial license
Thousands for the commercial license sounds like incentive to work on a free alternative.
Not sure why this is getting downvoted. I mean, what is your job if you spend most of your time with Mathematica but still consider it a hobby?
“My computer” is not “my work computer”.
One of the annoying things about Mathematica is that all functions are crammed into the same namespace and that there is no overloading with different parameterization options...
What do you mean by overloading? Functions can easily have different behavior with different argument counts (e.g., 2- vs. 3-argument Fold), and they can have any number of options (e.g., Graphics, Graphics3D, Solve, Import/Export, etc.). The only big redundancy I can think of is the various Plot functions.