sheepscreek 3 days ago

> Visual Basic was the pinnacle of graphics programming

I am still shocked how no tool since has managed to come even close to VB. You could easily develop a moderately complex GUI application that felt snappy in an afternoon. C# with WinForms is the second closest to that. All other iterations since have not been designed with individual developers in mind, sadly.

A powerful emerging alternative to this paradigm could be what I’m calling speech/voice-driven development (SDD or VDD). It takes away some of the pain of typing so much and makes interactions with AI feel a bit more natural. Like talking to a colleague. But for it to really work well, AI models will need to become even faster.

  • thw_9a83c 2 days ago

    The simplicity and efficiency of Visual Basic UI programming relied on several assumptions that became obsolete in 2000 when LCD and high-DPI displays became widely available. Everything in the VB UI editor was based on a fixed layout. There was a fixed font size, fixed DPI-density display expectation, and fixed button sizes. Every dialog assumed fixed sizes and was non-resizable. Adding a translation to such an application was pure hell. Typically, you needed to add a lot of buffer space for English strings to ensure that German strings, for example, would not break the layout too much. UI frameworks that solve these problems with a proper layout management system are obviously more complicated, but they address issues that VB6 never touched.

    • zozbot234 2 days ago

      > The simplicity and efficiency of Visual Basic UI programming relied on several assumptions that became obsolete in 2000 when LCD and high-DPI displays became widely available. Everything in the VB UI editor was based on a fixed layout. There was a fixed font size, fixed DPI-density display expectation, and fixed button sizes. Every dialog assumed fixed sizes and was non-resizable.

      There are visual UI editors that provide for dynamic layouts. (The old Glade editor for GTK+ was one example; modern GTK+4 still does not come with a standard UI editor.) The underlying issue is pointless churn in modern UI frameworks, not the assumption of a fixed-DPI display.

      • bartread a day ago

        Even Java's Swing, introduced in 1999 (or maybe 98 in beta?), dealt with the problem of dynamic layouts fairly well. Of course, it performed like an absolute dog to begin with, but most of the performance kinks were dealt with during the Java 1.3 era (2000), and by 1.4 in 2002/3 or so, performance was absolutely fine, with most of the rendering quirks of the native look and feels having been dealt with as well.

        Moving to C#/WinForms in late 2004 after working with Swing for 4 years felt absolutely jarring by comparison. No concept of layout managers and very limited options for dynamic layout in general (docking was about it, IIRC), really hard to customise components (if it was possible at all - for many it wasn't because they were just thin wrappers around Win32). It was terrible by comparison and I seriously considered quitting the job I was in, even though I liked the company, and finding a role in a Java shop again instead because I hated WinForms so very much.

        Swing is obviously ancient history now. Is it still a thing? Do people still use it? Are there still Java desktop apps? I've no idea. But even 20-odd years later Microsoft have never managed to surpass it for desktop app development. WPF is OKish, and certainly addresses a lot of the problems of WinForms, but it's always felt over-complicated and the syntax of XAML has never really been a plus point in my opinion. Silverlight was... a debacle.

        And, really, since 2013 I've only paid attention to distributed systems, web, and back end development (all of which have their own problems and complexities) so I've really lost touch with Microsoft's desktop development paradigm.

        But I do find myself often yearning for the "simplicity" of desktop app development. Of course, that simplicity comes at a cost: you're trading it for complexity in licensing, distribution, deployment, and support because you have no control over your target environment, and you have to support almost unlimited combinations of machine configuration (which is not necessarily a picnic when you have thousands, tens of thousands, or hundreds of thousands of customers for each product). And, let's be real: nobody misses InstallShield and MSIs.

        • wduquette a day ago

          I've been doing development using JavaFX, Swing's successor, for many years. It's got its quirks, and places where the dev team stopped iterating on the API a little too soon; but it works and does some really nice stuff.

    • electroly 2 days ago

      > fixed DPI-density display

      This part is maybe technically true but not in spirit. High-DPI didn't exist in Windows as a literal scaling percentage, but it had "Large Fonts" as an option which acted as a system-wide scale. VB GUIs were scalable based on twips (device-independent unit). The entire form and all controls would scale according to the Large Fonts setting.
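
      For concreteness (numbers mine, not the parent's): a twip is 1/1440 of an inch, so a 9000-twip-wide form is 6.25 inches. That works out to 600 pixels at the standard 96 DPI and 750 pixels at the 120 DPI "Large Fonts" setting, which is roughly how twip-based layouts followed the system scale.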

      Many of my VB6 forms were resizable. There was a simple library that let you have anchors in the Tag property so you didn't have to write any sizing logic yourself.

    • FredPret 2 days ago

      This really shows the value of saying “no” to use-cases you consider peripheral.

      If your customers live in one country, and you don’t care about cloud scalability, things can be much simpler.

      Of course, some applications really do need all the complexity.

    • tga 2 days ago

      Even today many apps still fit perfectly within those constraints. I'd gladly accept a fixed layout and no internationalization if that would mean sitting down and writing a rich app with one single dependency (!), zero boilerplate setup, and easy deployment.

    • project2501 4 hours ago

      > Everything in the VB UI editor was based on a fixed layout. There was a fixed font size, fixed DPI-density display expectation, and fixed button sizes. Every dialog assumed fixed sizes and was non-resizable

      Yes, everything in the layout was fixed, unless you chose to resize it through code

      I've developed many applications in VB that dynamically resized every control to maintain a consistent UI regardless of the window size

      I mean, simple things like this

      textarea.width = form.width - ( textarea.left * 2 )

      to keep it centered and preserve the margins

      Granted, for complex layouts, it quickly evolved into a dedicated procedure to resize all controls in the correct order and handle edge cases. But it was absolutely doable, even by an amateur teenage programmer

      I think other IDEs (Delphi maybe?) had anchors and other helpful features to assist with resizing

    • account42 a day ago

      Qt Designer can do visual design for flexible layouts quite well.

    • mexicocitinluez 2 days ago

      > The simplicity and efficiency of Visual Basic UI programming relied on several assumptions that became obsolete in 2000

      This is a great point that extends beyond rebutting the rose-colored glasses for the good ole days of programming. Yes, things were simpler, but that simplicity came at a cost. For instance, how accessible were VB apps to people who had sight-related disabilities?

      • thw_9a83c 2 days ago

        > how accessible were VB apps to people who had sight-related disabilities?

        Coincidentally, they were very accessible because they used corresponding Win32 components for all logical UI entities, despite being fixed in layout. Therefore, they worked really well with screen readers.

        Sadly, this is not the case with UI frameworks that render the screen as a Vulkan/DirectX/OpenGL canvas. By the way, for today's Electron-based apps, you can use WAI-ARIA (or similar) standards to achieve a similar level of accessibility.

        • Grumbledour a day ago

          And to add to that, because some people might not know or have forgotten, colors were easily adjustable in WinForms, so dark mode, high contrast mode, green, blue, hot pink, etc. were all available for these apps, and back in the day that was pretty standard to do for visually impaired people. No extra work from programmers was necessary, so vastly superior to today where you have to beg for good dark mode support.

          • account42 a day ago

            "Dark mode" today is the biggest scam - selling us a neutered form of color schemes that used to be a standard feature of any UI toolkit as something new and exciting.

      • DiskoHexyl 2 days ago

        More accessible by default than modern electron-based apps

  • int_19h 3 days ago

    Delphi was superior to VB even back in the day, and is still around, as is Lazarus. C# WinForms is actually closest to that, not to VB - and is also still around.

    • throw-the-towel 2 days ago

      It's not surprising: the designs of Delphi and C# were both led by the same person, Anders Hejlsberg. (He then went on to design TypeScript.)

  • ranger_danger 3 days ago

    I find Qt Creator to be quite comparable to the experience I remember with e.g. VB6. Have you tried it?

    • EasyMark 3 days ago

      Yeah if you know c++ decently, you can whip up a basic app pretty quickly.

      • ChromaticPanic 2 days ago

        Works with python too, you can build a desktop UI for your app in an hour

  • tiku 2 days ago

    It was also very easy to build the visual aspect; tying it together was step 3 for me. At the beginning of a project it felt more like a mix of design work and PowerPoint than programming. Very fast visual results to keep your momentum.

  • raydev 2 days ago

    > I am still shocked how no tool since has managed to come even close to VB

    Over on OS X/macOS, Xcode's Interface Builder carried this torch a long way until Apple introduced SwiftUI. Windows developers just didn't notice because it was a different platform.

  • pjmlp a day ago

    Delphi, C++ Builder, and there are still other ones around like Xojo.

    Problem is the amount of Electron crap nowadays.

  • geon 2 days ago

    I did some iPhone development around 2010. The interface builder in Xcode was pretty powerful. You could set up multiple screens and hook them up to each other with drag and drop.

dimitar 3 days ago

I think Emacs still does all of this; the argument the author makes is that it is "arcane", but it just uses conventions he is not used to. It is, however, fully self-documented and interactive.

For me the best textual interface I've ever used remains Magit in Emacs: https://magit.vc/ I wish more of Emacs was like it.

I actually use emacs as my git client even when I'm using a different IDE for whatever reason.

  • brucehoult 3 days ago

    The thing is that emacs predates Apple developing cmd-z/x/c/v and Microsoft copying Apple in Windows. Before that, the most commonly copied keystrokes in programmer's editors were the freaking Wordstar ones e.g. in all the Borland products.

    Also OP apparently has no knowledge of the far better IDEs we had 30-40 years ago including but not limited to:

    - Apple MPW, 1986. GUI editor where every window is (potentially) a Unix-like shell, running commands if you hit Enter (or cmd-Return) instead of Return. Also the shell scripting has commands for manipulating windows, running editing actions inside them etc. Kind of like elisp but with shell syntax. There's an integrated source code management system called Projector. If you type a command name, with or without arguments and switches, and then hit option-Return then it pops up a "Commando" window with a GUI with checkboxes and menus etc for all options for that command, with anything you'd already typed already filled out. It was easy to set up Commando for your own programs too.

    - Apple Dylan, 1992-1995. Incredible Lisp/Smalltalk-like IDE for Apple's Dylan language

    - THINK Pascal and C, 1986. The Pascal version was originally an interpreter, I think written for Apple, but then became a lightning-fast compiler, similar to Borland on CP/M and MS-DOS but better (and GUI). The C IDE later became a Symantec product.

    - Metrowerks Codewarrior, 1993. Ex THINK/Symantec people starting a Mac IDE from scratch, incorporating first Metrowerks' M68000 compilers for the Amiga, then a new PowerPC back end. Great IDE, great compilers -- the first anywhere to compile Stepanov's STL with zero overhead -- and with a groundbreaking application framework called PowerPlant that heavily leaned on new C++ features. It was THE PowerPC development environment, especially after Symantec's buggy PoS version 6.

    - Macintosh Allegro Common Lisp (later dropped the "Allegro"), 1987. A great Mac IDE. A great Lisp compiler and environment. Combined in one place. It was expensive but allowed amazing productivity in custom native Mac application development, far ahead of the Pascal / C / C++ environments. Absolutely perfect for consultants.

    Really, it is absolutely incredible how slick and sophisticated a lot of these were, developed on 8 MHz to 33 or 40 MHz M68000s with from 2-4 MB RAM up to maybe 16-32 MB. (A lot of the Mac II line (and SE/30) theoretically supported 128 MB RAM, but no one could afford that much even once big enough SIMMs were available.)

    • Narishma 3 days ago

      All your examples are in the Apple ecosystem. Depending on where the author is from, it may not be that surprising that they wouldn't know about them. In my corner of the world, Apple was basically non-existent until the iPod and iPhone.

      • brucehoult 20 hours ago

        Fortunately my little isolated island chain [1] in the South Pacific Ocean was one of the places where there was someone who brought in early Commodore, Tandy, and Apple machines only a couple of years after the USA and by 1979 or 1980 there were stores in my nearest 40,000 pop provincial city that stocked and repaired them. My school got an Apple ][+ right at the end of 1980, just as I graduated, but the Math HoD [2] asked me to take it home for some of the holidays before I started university and figure it out and come and explain it to him.

        A few months after I got my first job after university I persuaded my boss to get the new Apple LaserWriter (in fact the demo unit from the launch), which I connected to the company's Data General "mainframe" (MV10000) and programmed in raw PostScript. In 1987, when the Mac got good enough (e.g. Mac II) we got onto those too as it was just sooo much better than a PC/AT.

        [1] NZ

        [2] who died just over two years ago, aged 99 1/2, we'd kept in touch the intervening 40+ years

      • oblio 3 days ago

        In most corners of the world, actually. Apple was reasonably popular in the US, the UK, and a few developed countries (think a market share of 5% in 1999) and basically non-existent everywhere else.

    • int_19h 3 days ago

      > Before that, the most commonly copied keystrokes in programmer's editors were the freaking Wordstar ones e.g. in all the Borland products.

      Borland switched to https://en.wikipedia.org/wiki/IBM_Common_User_Access shortcuts in the last few versions of their TUI - Ctrl+Ins, Shift+Ins, Shift+Del for clipboard, for example. Since Windows also supported them (and still does!) this actually made for a nice common system between Turbo Vision TUI apps and actual GUI in Windows.

      • kccqzy 3 days ago

        And I only recently discovered that Ctrl+Ins and Shift+Ins worked in my Emacs despite me never configuring it; and it even works in the minibuffer. It also worked in a couple of terminal emulators I have tried on Linux. It's really more universal than one might think.

        • int_19h 2 days ago

          It's unfortunate that the accidental standard also happened to be the one requiring the use of both hands. I think that's why Ctrl+X/C/V won in the end.

          • GuB-42 2 days ago

            It doesn't. Shift+Del, Ctrl+Ins,... can be done using just the right hand, while Ctrl+X/C/V uses the left hand. And sometimes you may want to use two hands, which can be done in both cases by using the other Ctrl and Shift key. I actually use both kinds of shortcuts, depending on the situation.

            Del and Ins are also conveniently close to the cursor navigation keys (arrows, home, end, PgUp, PgDn).

            The reason I think Ctrl+X/C/V won out is that it is more convenient when using the mouse; it is also better with nonstandard keyboards where the "Ins" key is awkwardly placed, if not missing entirely.

        • neutronicus 3 days ago

          I use Shift+Insert constantly in Spacemacs since I can't use evil-mode "p" in the minibuffer

          • kccqzy 2 days ago

            I think the usual way of doing paste in Spacemacs is yank, C-y.

            • neutronicus a day ago

              Yeah for whatever reason Shift+Insert is the more "sticky" memory for me. I think because I also use it in Windows terminals.

    • jsrcout 3 days ago

      From my own experience, Think Pascal was fantastic for the time. MPW was serious and high-powered stuff for pro development on a late 80s/early 90s microcomputer; used it professionally for 2 or 3 years and loved it. Never tried the other ones you mentioned.

      As you say, it's incredible how slick and sophisticated those systems were, given the hardware of the day. I mean, sure, current IDEs may be preferable, but we're also using 4 GHz systems these days instead of 40 MHz or whatever.

    • versteegen 2 days ago

      (Edit:) Let me start by thanking you, I love these stories about the glory days of computing generations past! I really must try some of these; obviously there are so many fantastic ideas to take.

      Irrelevant aside: I've been using emacs 20 years but I very recently gave up and mapped ctrl-V to paste because I still made the mistake of hitting that too often. (I don't hit ctrl-C accidentally, because to select text I've already switched to emacs for at least a few seconds.)

  • bigstrat2003 3 days ago

    > It is however fully self-documented and interactive.

    Unfortunately not true. I've fired up emacs once or twice, and couldn't even figure out how to save a document because it didn't show me how to do that. It might be more documented than vi (but that bar is *on the floor*: vi has one of the most discovery-hostile user interfaces ever made), but it's not self-documented enough to just pick up and use with no instruction.

    • SoftTalker 3 days ago

      I'm pretty sure that if you have an unmodified install and no .emacs that is configured otherwise, when you start emacs you are prompted with a help screen that includes instructions on using the built-in tutorial. If you do that, you'll learn the basics in about 10-15 minutes. If you skip that, yeah it's pretty different from most other software conventions.

      • pkal 3 days ago

        And if you don't have anything configured, graphical Emacs will have a tool bar with a button to save and a menu bar that also gives the binding for the command.

        The TUI is different because there is no tool bar, but in Emacs 31 `xterm-mouse-mode' will be enabled by default so you can use the menu bar with the mouse in the terminal too.

      • Narishma 3 days ago

        Yup. Vim is similar, except its tutorial takes more like 30 minutes.

        • SoftTalker 3 days ago

          Yeah the entire emacs tutorial might take a long time. I don't think I ever went through it to "the end" but learning cursor movement, opening and saving files, etc. is right up front.

      • ksherlock 3 days ago

        That's true (you can see it yourself with emacs -nw -q) and the picture is shown in the article. With a completely useless menubar.

    • prmoustache 3 days ago

      vi is documented

      Problem is most people start it the first time by providing a text file instead of firing it up on its own and being greeted by the tutorial. I guess that is because they are blindly following another tutorial instead of trying to understand what they are doing.

      My opinion is that "self documentation", "Getting started" pages and "tutorials" are a disease. People would actually get up to speed quicker by reading real manuals instead. They are just lured into thinking they will learn faster with tutorials because they get their first concrete results quicker, but at this stage the harsh reality is that they still usually don't know anything.

      First time I used vi, I just had my operating system manual on my desk and I quickly learned to open man pages in a separate tty.

      • trinix912 2 days ago

        VI or VIM? VI has no welcome screen. It drops you straight into a blank screen of '~' from where you just have to know to enter the insert mode, which is also not indicated anywhere unlike in VIM.

        • skydhash 2 days ago

          There was a time when manuals were kinda big physical books. I think I have a WordPerfect one that is the size of a textbook. It was not about getting a result (any result) in 5 seconds. It was more about having a goal and browsing through relevant sections to learn how to do it.

    • krs_ 3 days ago

      That's a fair criticism, although once you learn how to access the documentation and where to look for/expect it, I find that most things, including add-on packages and whatnot, can be learned from within Emacs itself just fine. But it does take some knowledge to get to that point in the first place for sure.

      • int_19h 3 days ago

        One thing that greatly helped this in the DOS / early Windows era was standardizing on F1 being the key for "online help" (meaning, usually, a hyperlinked document that shipped with the product). That was basically the only thing you had to know to start learning any app.

      • f1shy 3 days ago

        I think it is not fair at all, as a default installation has a menu bar, and you can save a file via File->Save. While doing so it will tell you the shortcut.

    • lproven 2 days ago

      > couldn't even figure out how to save a document

      100% this.

      It is (counts on fingers) 43 years since I got my first computer of my own, and I've been using Unix-like OSes since just 6 years later in 1988... So, 37 years?

      Still, even now, Emacs is this bizarre thing that teleported in from 1962 or something. Older than Unix, older than -- well, anything else still used by almost anyone except Cobol and Fortran.

      I am old, starting to think about retirement, and Emacs is weird and clunky and ugly. It uses weird nonstandard names for things like "files" and "windows" and even the keys on your keyboard.

      I know they are native and natural for the fans. I'm not one. I'm a fan of the great era of UI standardisation that happened at the end of the 1980s and start of the 1990s.

      I wish someone would do a distro of Emacs with ErgoEmacs built in, on by default, and which could pick up the keyboard layout from the OS.

      ErgoEmacs is a brave attempt to yank Emacs into the 1990s but you need to know Emacs to use it, so it's not enough.

      https://ergoemacs.github.io/

    • jjav 2 days ago

      >it's not self-documented enough to just pick up and use with no instruction.

      If they just plop you in front of a 3-d printer never having seen one and having no documentation, it'll probably take you a good while to produce something useful with it.

      All good tools require training & experience.

      • viraptor 2 days ago

        That's really not the case. I learned programming in the DOS Borland apps and did it with next to no instructions. I don't think I've ever seen the manual. Just read about the language itself. They were that easy to understand and use. Emacs is not in the same class as far as discoverable interface is concerned.

      • trinix912 2 days ago

        The difference here is that this person has likely used text editors before and still couldn't figure this one out.

    • someNameIG 3 days ago

      I'm pretty sure the built in tutorial shows you how to save a document.

    • tiberious726 a day ago

      In unconfigured emacs, you can literally just go Buffer>Save in the toolbar. If you didn't know to look in the buffer menu, then you didn't read even a little bit of the tutorial that appears when you open it

  • dapperdrake 3 days ago

    Magit is mind blowing.

    How did the magit guy or people even come up with the data model? Always had the feeling that it went beyond the git data model. And git porcelain is just a pile of shards.

    • BeetleB 3 days ago

      > How did the magit guy or people even come up with the data model?

      It's not all that different from a typical TUI interface.

      Magit isn't great because of the interface. It's great because the alternative (plain git) has such a crappy interface. Contrast principle and all.

      • Ferret7446 3 days ago

        Magit is pretty great because of transient, the model of showing what all the commands are. It's a very natural and easy UI affordance

        • globular-toast 2 days ago

          Transient was factored out much later. It's not just transient that makes magit great, though. It's the only alternative porcelain for git that I'm aware of and one that makes git both easier to use and understand. I'm the "git guy" at every place I've worked but I owe it all to magit. Other git frontends just do the CLI stuff with point and click, they don't help you understand what's going on at all.

        • BeetleB 2 days ago

          > Magit is pretty great because of transient, the model of showing what all the commands are.

          And that's different from many TUIs how?

      • emmelaich 3 days ago

        Is magit much better than tig? I've never used magit.

        • mbirth 3 days ago

          If you’re still using tig, have a look at lazygit. Needs some getting used to (coming from tig) but supports way more git features.

          • makapuf 2 days ago

            Same, love lazygit. My only issue is that I find it too centered around qwerty keyboard... which is small.

    • nobleach 3 days ago

      For reference, I did use Magit for my short stint with Emacs (and then Spacemacs/Doom Emacs). I've always been more into Vim. I tried the Atom editor several years ago with lots of Vim emulation and quite a bit of customization - one of those being a Magit clone.

      I moved to NeoVim many years ago and have been using NeoGit (a supposed Magit clone) the entire time. It's good but I'm missing the "mind blowing" part. I'd love to learn more though! What features are you using that you consider amazing?

      • thdhhghgbhy 3 days ago

        I found Neogit quite buggy. Not even in the same league as Magit.

      • SoftTalker 3 days ago

        It's mind-blowing because it makes git actually usable.

        • tombert 3 days ago

          Maybe it's Stockholm syndrome for me, but I never really understood what was so unusable about the vanilla command line git interface.

          If you want to do some really advanced stuff, sure it's a little arcane, but the vast majority of stuff that people use in git is easy enough. Branching and committing and merging never seemed that hard to me.

          • jjav 2 days ago

            > Maybe it's Stockholm syndrome for me, but I never really understood what was so unusable about the vanilla command line git interface.

            I'm as hardcore a CLI user as it gets; I've only lived in the CLI since the mid 80s and am still firmly there.

            git is the absolute worst CLI ever in the history of humanity.

          • SoftTalker 3 days ago

            When I do anything more than commit/push/pull at the command line I will quickly get myself so confused that I end up deleting the directory and cloning it again. That doesn't happen to me (much) with magit.

            • tombert 3 days ago

              Fair enough. I feel like I do a fair amount of the more advanced features (interactive add and rebase, bisect, worktrees) without any fancy tooling and I don't have a problem much anymore, but admittedly they did confuse me at first.

              • em-bee 3 days ago

                i don't remember confusion. i find it's mostly understanding the data model and in particular the branches and references/reflog. when i am worried i might break something then i tag the checkout where i am at and i know i can always revert to that. i also compare the original with the new state. i usually know what that diff should look like, and even if the operations in between are confusing, if the diff looks like what i expect then i know it went all right. trust the process but verify the results.

                the big thing i am missing from it is a branch history. a record for every commit to which branch it once belonged to. no improved interface can fix that. that would have to be added to the core of git.

          • davemp 2 days ago

            imo git does a terrible job of showing its state, so when doing anything more complicated than committing changes you really have to have things internalized.

    • kelvinjps10 3 days ago

      I don't know, but I didn't find it as intuitive as lazygit

  • layer8 3 days ago

    To navigate a Turbo-Vision-style IDE and explore its functionality, you basically only need to know how the Alt and Tab keys work (okay, and Return and Esc and arrow keys), as alluded to in TFA. Emacs doesn’t quite have that base level of operating uniformity I think.

    • JoelMcCracken 3 days ago

      To navigate emacs, you really only need to know ctrl, alt, and the basic norms of keyboard usage (return for newline/accept, shift for capitals)

      Really, compared to what I see here, the chief difficulty with emacs is the sheer volume of possible commands, and the heterogeneity of their names and patterns, which I believe is all a result of its development history. But the basics are just as you describe.

      • layer8 3 days ago

        It’s a good question to what complexity (volume) the approach scales, but dialog boxes can get you quite far, and menus are fundamentally “just” a tree like keyboard shortcuts are.

        Emacs has Elisp commands first, then keyboard shortcuts for them, then maybe (not as a rule) menu items, and rarely dialog boxes. The Turbo Vision approach, from its design philosophy, has menus and dialogs first, then keyboard shortcuts for them.

        One approach isn’t strictly better than the other, nor are they mutually exclusive. Ideally you’d always have both. My disagreement is with the “I think Emacs still does all of this” above. Emacs is substantially different in its emphasis, presentation, and its use of dialogs.

        • JoelMcCracken 3 days ago

          Yeah that’s fair. In many ways the spacemacs/doom model is more akin to what you describe, with a lot of caveats; it’s not a total rework of all key bindings. In emacs novice affordances are usually an afterthought, not part of the core design and community norms.

          Of course, I must say there is a trade off here: you can design for novices or for advanced users, but very often not both.

    • skydhash 3 days ago

      The base input of emacs is ‘M-x’. From there, any command is accessible. And you have ‘M-:’ for evaluating any bit of elisp code. There’s a few UI concepts to learn (frame, window, buffers, point, mark, region,…), but that would fit in a single sheet of paper.

      • layer8 3 days ago

        The keys I enumerated are sufficient to discover and execute all available operations in that style of TUI. You don’t have to type commands or command-specific keyboard shortcuts, like you have to in Emacs. It’s analogous to how in a traditional GUI you can discover and execute everything just using the mouse.

        Like in the GUI analogy, you can then choose to remember and use the displayed keyboard shortcuts for frequently used operations, but you don’t have to.

        • skydhash 3 days ago

          It’s easy when you have a small number of commands. And distributions like Doom Emacs are fairly discoverable too. But emacs has a lot of commands, and while it offers menus, it’s only the surface level of what’s available.

        • JoelMcCracken 3 days ago

          For most fundamental operations there are menus available. Most heavy emacs users opt to turn them off however.

          You can even see the menu atop the screen shot in the article, with the familiar names etc.

      • marssaxman 3 days ago

        It's possible I might once have given emacs a try, if the way people talk about it did not sound like such baffling moon-language: when I encounter stuff like "so I C-x C-f’d into my init.el, M-x eval-buffer’d, then C-c C-c’d an org-babel block before C-x k’ing the scratch buffer" I just want to back away slowly and leave them to it, whatever it is they're doing. Y'all have fun with your C-r X-wing mork-butterfly porg fluffers, I'm going to edit some code over here, using a text editor, that edits text files.

        • skydhash 3 days ago

          So you don’t ‘git clone’ and ‘git commit’, or ‘mkdir’ and ‘grep’?

          • marssaxman 3 days ago

            Not following you; how does that question relate?

            • skydhash 2 days ago

              All the keybindings you mentioned are commands accessible with M-x. The thing with Emacs is that a chord is always attached to a keymap. The global keymap is always accessible while all the others are accessible through a keybind in another keymap, recursively.

              So the only thing you need to know are those commands. And that's the main appeal of Emacs, to have commands that augment text editing. One of the most powerful examples is org mode, which is just another markup language, but there's a lot of commands that makes it an organizer, a time tracker, an authoring platform, a code notebook.

              Each mode is a layer of productivity you put on the bare editing experience.

              • marssaxman 2 days ago

                Thank you for your helpful reply. I'm afraid I don't have enough familiarity with emacs to quite follow your explanation - I didn't actually write the bit of emacs-speak in my previous message, and I don't know what its terms mean. I just asked ChatGPT to invent something one emacs user might plausibly say to another!

                It has always sounded like emacs is extraordinarily powerful and configurable, and that must be great for people who want to do extraordinary things with their text editor. There was a time when I enjoyed tinkering with my environment more, but these days I prefer simple, ordinary tools I can easily understand. I don't really want to think about the tools at all, but focus on the task I'm doing with them. I'm content to let emacs be something other people appreciate.

                • skydhash 2 days ago

                  I get you. Emacs is one piece of software that you must invest some time in to get the famous ROI. The promise is one unified interface for all the tools you may need (editor, task runner, shell, spellchecking, file manager,...). But the learning curve is there, although not as steep as some would make it appear.

                  One of my major motivations for putting in the time is that Emacs is very stable. You can coast for decades on a configuration. I don't mind learning new stuff, but it's grating for it to be taken away, especially if there's no recourse (proprietary software).

                  • marssaxman 2 days ago

                    Well, that's ironic. I actually do spend all day, every work day, in a piece of software which unifies most of the tools I need - editor, file manager, make console, find/grep frontend, etc. It's as stable as can be, since I'm the only person who maintains or even uses it, and it's as simple as can be, since I don't bother to write features in unless I really want them.

                    I've always supposed that emacs was for people with inscrutably complex text-editing needs, far beyond the bounds of my "nano is plenty" imagination, but if my cozy little coding environment is the kind of thing people are doing with emacs, I can understand why they would like that.

  • jmmv 3 days ago

    > it just uses conventions he is not used to

    I think that after 25+ years of usage, I'm "used to it" by now.

  • 1313ed01 3 days ago

    Agree about Emacs, and I used it already in MS-DOS back in the day. You could launch the compiler (or make, more likely) using M-x compile or use C-z to open command.com to run commands on the prompt and then exit from that back to Emacs. Almost like multitasking!

    I never really liked any of the typical late-MS-DOS era TUI applications and have no nostalgia for those. I think a small TUI like an OS installer is fine, but I realised it is the command line I like. Launching into a TUI is not much different from opening a GUI, and both break out of the flow of just typing commands at a prompt. I use DOSbox and FreeDOS all the time, but I almost never spend time in any of the TUI applications.

    Despite that, I am currently working on a DOS application running in 40x25 CGA text mode. I guess technically it is a TUI, but at least it does not look much like a typical TUI.

  • cmrdporcupine 3 days ago

    So, I have been using emacs on and off for 32 years at this point, and I have my emacs all set up with eglot and rustic and magit and the like and it's great... but I still find I just fall back to RustRover when doing development because (unlike some of the classic TUI IDEs mentioned in TFA) it just never feels like it's fully glued together, it's always a bit fragile, and I never remember how to do certain things -- even though I'm the one who set it up.

    That and lack of a decent visual debugger situation.

    So I have this weird thing where I use emacs for interactive git rebasing, writing commit messages, editing text files and munging text... and then RustRover for everything else.

    It's sorta like the saying, "I wish I was the person my dogs think I am"... "I wish emacs was actually the thing that I think it is" ?

    • frou_dh 3 days ago

      There is a good IDE-style debugger available for Emacs these days: https://github.com/svaante/dape

      Since it has no dependencies, I wouldn't be surprised if it gets merged into Emacs core at some point.

      • cmrdporcupine 3 days ago

        Thanks, I'll check it out. dap made me angrier and angrier the more I tried to configure and use it.

  • creddit 3 days ago

    The Magit experience is due to the use of the transient package for its UI.

    Some other packages also use it. Most notably for my personal usage is the gptel package.

    • tarsius 3 days ago

      > The Magit experience is due to the use of the transient package for its UI.

      (I'm the author of Magit and Transient. (Though not the original author of Magit.))

      The transient menus certainly play an important role but I think other characteristics are equally important.

      A few years ago I tried to provide an abstract overview of Magit's "interface concepts": https://emacsair.me/2017/09/01/the-magical-git-interface/. (If it sounds a bit like a sales pitch, that's because it is; I wrote it for the Kickstarter campaign.)

      • f1shy 3 days ago

        This is my experience. While transient mode helped at the beginning for discovery, I quickly learned the 10 things I use constantly and never look at the transient buffer anymore. When I want to do something, I check the documentation; for me it is often easier than guessing and searching. Things like spin-off are absolutely gold.

    • dimitar 3 days ago

      Indeed! I went back just to mention that it owes its incredible UX to the transient package; I am going to look up more uses for it. Do recommend more if you can, please!

    • pkal 3 days ago

      Transient is the worst part about Magit IMO (the best parts are how you can prepare a commit to just include the right changes, or the functionality bound inside the transient menus that make complex operations such as fixups or rebases trivial). Transient UIs are consistently uncomfortable to work with, and could usually be replaced by just using a regular special-mode keymap in a custom buffer. The fact that Transient hooks into the MVC and breaks elementary navigation such as using isearch or switching around buffers has irritated me ever since Magit adopted the new interface.

      The real neat thing about Emacs' text interface is that it is just text that you can consistently manipulate and interact with. It is precisely the fact that I can isearch, use Occur, write out a region to a file, diff two buffers, use find-file-at-point, etc. that makes it so interesting to me at least.

      A far more interesting example than Magit is the compile buffer (from M-x compile): This is just a regular text buffer with a specific major mode that highlights compiler errors so that you can follow them to the referenced files (thereby relegating line-numbers to an implementation detail that you don't have to show the user at all times). But you can also save the buffer, with the output from whatever the command was, onto disk. If you then decide to re-open the buffer again at whatever point, it still all looks just as highlighted as before (where the point is not that it just uses color for its own sake, but to semantically highlight what different parts of the buffer signify) and you can even just press "g" -- the conventional "revert" key -- to run the compile job again, with the same command as you ran the last time. This works because all the state is syntactically present in the file (from the file local variable that indicates the major mode to the error messages that Emacs can recognize), and doesn't have to be stored outside of the file in in-memory data structures that are lost when you close Emacs/reboot your system. The same applies to grepping btw, as M-x grep uses a major mode that inherits from the compile mode.

      • tarsius 3 days ago

        > Transient UIs [...] could usually be replaced by just using a regular special-mode keymap in a custom buffer.

        For people who can look at a list of key bindings once and have them memorized, maybe. Turns out most people are not like that, and appreciate an interface that accounts for that.

        You also completely ignore that the menus are used to set arguments to be used by the command subsequently invoked, and that the enabled/disabled arguments and their values can be remembered for future invocations.

        > The fact that Transient hooks into the MVC and breaks elementary navigation such as using isearch

        Not true. (Try it.) This was true for very early versions; it hasn't been true for years.

        > or switching around buffers

        Since you earlier said that transient menus could be replaced with regular prefix keys, it seems appropriate to point out that transient menus share this "defect" with regular prefix keys, see https://github.com/magit/transient/issues/17#issuecomment-46.... (Except that in the case of transient you actually can enable such buffer switching, it's just strongly discouraged because you are going to shoot yourself in the foot if you do that, but if you really want to you can, see https://github.com/magit/transient/issues/114#issuecomment-8....)

        > has irritated me ever since Magit adopted the new interface.

        I usually do not respond to posts like this (anymore), but sometimes the urge is just too strong.

        I have grown increasingly irritated by your behavior over the last few weeks. Your suggestion to add my cond-let* to Emacs had a list of things "you are doing wrong" attached. You followed that up on Mastodon with (paraphrasing) "I'm gonna stop using Magit because it's got a sick new dependency". Not satisfied with throwing out my unconventional syntax suggestion, you are now actively working on making cond-let* as bad as possible. And now you are recycling some old misconceptions about Transient, which can at best be described as half-truths.

        • pkal 3 days ago

          > For people who can look at a list of key bindings once and have them memorized, maybe. Turns out most people are not like that, and appreciate an interface that accounts for that.

          To clarify, the "custom buffer" can list the bindings. Think of Ediff and the control buffer at the bottom of the frame.

          I am not saying that transient offers nothing over regular prefix keys, there is a common design pattern that has some definitive and useful value. My objection is that the implementation is more complex than it should be and this complexity affects UX issues.

          > Not true. (Try it.) This was true for very early versions; it hasn't been true for years.

          Then I was mistaken about the implementation, but C-s breaks transient buffers for me on master and I cannot use C-h k as usual to find out what a key press executes. These are the annoyances I constantly run into that break what I tried to describe in my previous comment.

          > Except that in the case of transient you actually can enable such buffer switching, it's just strongly discouraged because you are going to shoot yourself in the foot if you do that

          I did not know about this, so thank you for the link. I will probably have to take a closer look, but from a quick glance over the issue, I believe that the problem that you are describing indicates that the fear I mentioned above w.r.t. the complexity of transient might be true.

          > I usually do not respond to posts like this (anymore), but sometimes the urge is just too strong.

          I understand your irritation and don't want to deny its validity. We do not have to discuss this publicly in a subthread about DOS IDEs, but I am ready to chat any time. I just want you to know that I am not saying anything to personally insult you. Comments I make on cond-let and Magit sound the way they do because I am also genuinely irritated and concerned about developments in the Emacs package space. To be honest, it often doesn't occur to me that you would read my remarks, and I say this without any malicious or ulterior motives: in my eyes you are still a much more influential big-shot in the Emacs space, while I see myself as just a junior janitor, whose opinions nobody cares about. But these self-image and articulation problems are mine, as are their consequences, so I will do better to try to remember that the internet is a public space where anyone can see anything.

      • internet_points 3 days ago

        Odd, I can `C-s` just fine in transient buffers. It works exactly like in other buffers.

        The `C-h` override is pretty cool there too, e.g. if from magit-status I do `C-h -D` (because I'm wondering what "-D Simplify by decoration" means), then it drops me straight into Man git-log with point at

               --simplify-by-decoration
                   Commits that are referred by some branch or tag are selected.
        
        (Ooh, I learnt a new trick from writing a comment, who says social media is a waste of time)
        • pkal 3 days ago

          OK, try the following in a Transient buffer:

          - Search for something using C-s
          - Exit isearch by moving the point (e.g. C-n)
          - Is the transient buffer still usable for you?

          In my case it becomes just a text buffer and all the shortcuts just got mapped to self-insert-command.

          • sexyman48 3 days ago

            Dayum, given Transient's prickliness (I always feel like I'm walking on eggshells when I'm in it) I've never dared to C-s. But I tried this, and yeah, the transient reverts to a plain text buffer, and you're left in the lurch.

          • tarsius 2 days ago

            Thanks for the bug report. Fixed. The next release will come out in about two weeks.

      • bowsamic 3 days ago

        Yeah I agree. I think transient is one of the less appealing things about magit and isn't really very emacs-y. Also, you still have to memorise them anyway

      • sexyman48 3 days ago

        You say a lot of dumb ____ (but to be fair, I said a lot more when I was your age), but your disdain for transient is on the money. I'm a satisfied magit user, but transient is a blatant UX error and a confounded implementation. Some guy spends his 20% time hawking an entire suite around transient. No one cares.

        • pkal 2 days ago

          I would love to hear what you disagree with :)

  • rbanffy 3 days ago

    > it just uses conventions he is not used to.

    It just came up with conventions that few others adopted when they later reinvented the wheel.

  • pjmlp 3 days ago

    Back in the day I had to use XEmacs as it was more advanced than plain Emacs.

    After IDEs finally started being a common thing in UNIX systems, I left Emacs behind back to IDEs.

    Still, I had almost a decade where Emacs variants and vi were the only options, ignoring even more limited stuff like joe, nano, and ed.

  • speed_spread 3 days ago

    It's not about being arcane, it's about the lack of discoverability. Emacs and vi don't have (by default) affordances like a menu that enables a user to discover and learn the interface at their own pace. With such affordances the learning curve is much smoother and allows for casual use without having to pull out a book each time.

    • tmtvl 2 days ago

      Emacs does have a menu by default. It's one of the things I appreciate most about it (especially the Help menu) and I'm always befuddled to see frameworks like Doom Emacs or Spacemacs disable the menubar.

    • saint_yossarian 2 days ago

      Vim does have a menu system, and AFAIK it's still enabled by default in GUIs like GVim.

  • thdhhghgbhy 3 days ago

    Love Magit, it is a work of art. I moved to vim a few years back and miss magit dearly. The most feature complete Neovim magit clone is buggy.

  • bowsamic 3 days ago

    Magit is really great, however, it can definitely be quite slow and buggy sometimes

    • Syntonicles 3 days ago

      I've been using Magit for years, and have never noticed any bugs.

      The interface is unique and takes a lot of getting used to. I did need to leverage my extensive experience with Git and Emacs to understand unexpected behaviour but the fault always lay with me.

      Given the implications of bugs in such a critical part of a developer's workflow, can you be more specific?

      • bowsamic 2 days ago

        Mainly random lisp errors being thrown in certain cases, likely just unimplemented functionality. I didn't record them since there are so many. I probably see one every other day, but usually it's somewhat outside of the normal operation of magit. It still feels like a bug though, and very likely is.

        I mean, magit is not some perfect piece of software. Of course it has bugs, I just hit them quite a lot. The slowness is more annoying though. Sometimes it takes seconds to open magit after hitting C-x g

        I've also had magit get stuck in a 100% CPU usage loop a couple times

        • Syntonicles a day ago

          That sounds frustrating! It hasn't been my experience at all.

          I'm an enthusiast when it comes to [Vanilla] Emacs. I enjoy customizing my editor, jumping into the Lisp when I find an error, and contributing - though my own config is usually the problem.

          This sounds like an opportunity to improve the day-to-day for developers!

          Please report any verified bugs to Github. There are only 12 open issues, most of them enhancement requests. The maintainers are celebrated in the community for their diligence and attentive engagement, and I'm sure they'd love to help.

  • badsectoracula 3 days ago

    > it just uses conventions he is not used to

    ...and everyone else, including everyone who is also using a GUI on Linux - even if they use the GUI version of Emacs.

    • jama211 3 days ago

      Yeah, basically when they said that, it should’ve begged the question “why is he not used to those conventions?” And the answer would be because the conventions it uses aren’t used by anything else (which means they can barely be called conventions), and it makes no effort to adopt any conventions of the platform it’s running on, even just to get you started.

      Also, another user said it has a tutorial when opened which should teach the basics in “10 to 15 min” but I have a feeling I would need 0 minutes to learn the basics of turbo c++.

      I get that there are diehard Emacs and vim fans and honestly I’m happy for them. But at the end of the day, scientifically speaking, ease of use is not JUST down to familiarity alone. You can objectively measure this stuff and some things are just harder to use than others even with preloaded info.

      • badsectoracula 3 days ago

        > I have a feeling I would need 0 minutes to learn the basics of turbo c++.

        Well, Turbo C++ (at least the one in the article) does use common conventions but those were the conventions of 1992 :-P. So Copy is Ctrl+Ins, Paste is Shift+Ins, save is F2, open is F3, etc. Some things are similar to modern editing, like Shift+motion to select, F1 for help, F10 to activate the menu bar, etc. And all shortcut keys are displayed next to the menu bar commands so it is easy to learn them (some of the more intricate editor shortcut keys are not displayed in the menus, but are mentioned in the help you get if you press F1 with an editor window active).

        • jama211 2 days ago

          Yes some unfamiliar commands for sure

      • 1718627440 2 days ago

        These conventions work in a lot of Linux GUI tools, in the built-in Git GUI, the shell and every text-box in MacOS.

        • jama211 a day ago

          emacs conventions do?

          • 1718627440 a day ago

            Eh yes? They are mostly readline conventions. What's the question?

            • jama211 an hour ago

              Which conventions do you mean specifically? Like copy/paste shortcuts? The buttons you use for navigating UI?

    • cmrdporcupine 3 days ago

      and frankly including other emacs users, too.

      Any non-trivial use of emacs ends up involving a pile of customizations.

geophile 3 days ago

Turbo Pascal was completely amazing. I remember resisting it for a long time, because IIRC it implemented non-standard Pascal. But the competing tools were less powerful and far more expensive (e.g. the Microsoft tools). And then I tried it, and was completely blown away. I no longer cared about the non-standard stuff. I had a fast intuitive IDE running on my original IBM PC.

As for modern IDEs, Intellij has been orders of magnitude better than any competition for more than 25 years (I think). I have stayed away from Microsoft products for a very long time, so I can't comment on VSCode and its predecessors. The main competition I remember was Eclipse, which I always found to be sluggish, unintuitive, and buggy. The fact that it wasn't even mentioned in this article is telling.

JetBrains, the company that created Intellij (and then PyCharm, CLion and many others) is one of those extremely rare companies that defined a mission, has stuck to it, and excelled at it for many years, and has not strayed from the path, or compromised, or sold out. It is so impressive to me that they maintain this high level of excellence as they support a vast and ever-growing collection of languages, coding standards and styles, and tools.

  • imron 2 days ago

    > so I can't comment on VSCode and its predecessors

    Vscode is a pale imitation of its predecessors.

    Visual C++ was amazing and remains my favorite IDE ever.

    It was also the spiritual successor of the Borland TUI IDEs because MS stole all of Borland’s top compiler engineers.

  • prmoustache 3 days ago

    Except for resource usage.

    I chose it because I don't have access to neovim on my cloud desktop and ideavim is a superior solution to any vim-like plugins for vscode. It is struggling with 4 cores and 16GB of RAM with only a few projects open at a time. Some of it is due to being win11 with the amount of security malware installed by my company, but still, vscode doesn't seem to make it suffer that much.

  • niutech a day ago

    Have you tried NetBeans? It's more intuitive and snappier than e.g. Eclipse.

    • geophile 3 hours ago

      I did, very briefly. It did not compare to Intellij.

  • int_19h 3 days ago

    Visual Studio still supports WinForms including the graphical form designer, which is very close to the OG Delphi experience in the late 90s (esp. since WinForms is such a blatant rip-off of VCL).

    • pjmlp a day ago

      You are missing a step there: before Windows Forms there was WFC, Windows Foundation Classes (not to be confused with the other WFC from .NET), used in J++, one of the reasons for Sun's lawsuit.

      Alongside events, and J/Direct, the precursor to P/Invoke.

      https://news.microsoft.com/source/1998/03/10/microsoft-visua...

      It was WFC that was the rip-off; WinForms is basically WFC redone in C#.

PaulHoule 3 days ago

In the golden age of DOS you had an array of bytes representing characters and an array representing attributes (background and foreground colors), and the hardware drew out of that. If you wanted to write an ‘A’ to a certain spot you wrote 0x41 to a certain memory address and that was that -- there were some wait states involved but it was way faster than drawing on a 9600 baud terminal with ANSI terminal commands that use up even more bytes.

I first used emacs on terminals that were hooked to Sun workstations and you were either going to use a serial terminal which was very slow, or the terminal emulator on the Sun which was a GUI program that had to do a lot of work to draw the characters into the bitmap. So that’s your reason TUIs went away.

  • floam 3 days ago

    Damn that actually sounds superior. How did changing the size work?

    • madmountaingoat 3 days ago

      The standard screen was 80 by 25. There were two addresses you needed to know: 0xb000 for monochrome displays and 0xb800 for color. For monochrome you could just blast characters/attributes to the address and everything looked great. For color you had to add a little bit of assembly so writes didn't happen when the monitor was doing certain things (or else you would get some flickering). The little hacks were all well known. Then you could build your own 'windowing' system by just maintaining separate screen buffers and having a little bit of code to combine the buffers when writing to the actual hardware.

      In the early days everyone's code was synchronous: code would start listening for keyboard events and react and repaint in a very ad hoc fashion. Mice made things a bit more complicated, as you needed to maintain a persistent model of the UI to process their events. So the UI code was simple and easy to work on, but you had to squeeze these programs into tiny memory footprints, so you would spend a lot of time trying to find more memory. One of the bigger projects I worked on had a memory manager that relocated blocks to create contiguous space, but since there was no OS support for things like that, the code was actually updating pointers in the heap and stack - which was a great source of complicated bugs. Woe unto anyone who tried to use linked lists in such an environment. But yeah, it was a fun time.
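
      A minimal sketch of that kind of direct write, assuming a Borland-style DOS compiler (MK_FP and far pointers are Borland-isms) and the 80x25 color text mode described above:

          #include <dos.h>   /* MK_FP (Borland-style DOS compilers) */

          /* Write one character plus a color attribute straight into text-mode video RAM.
             0xB800 is the color text segment (0xB000 for monochrome); each cell is two
             bytes: the character, then its attribute. */
          void put_cell(int row, int col, char ch, unsigned char attr)
          {
              unsigned char far *vram = (unsigned char far *) MK_FP(0xB800, 0);
              unsigned offset = (row * 80 + col) * 2;

              vram[offset]     = ch;    /* the character, e.g. 0x41 for 'A' */
              vram[offset + 1] = attr;  /* e.g. 0x1E = yellow on blue */
          }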

    • layer8 3 days ago

      Unless the program specifically allowed for it, you couldn’t change the size (video mode, really) without exiting and restarting the program after changing modes on the DOS prompt.

      Remember, the video hardware rendered text mode full-screen, and it had to be reconfigured to change to a different number of lines and columns. Only specific sizes were supported.

    • torgoguys 3 days ago

      You called an "interrupt," which was basically a system call. That changed a bunch of timing registers within the video hardware. For a long time you basically could only do 40 or 80 columns of text and 25, 43, or 50 lines. With some trickery you could get the video hardware to output 90 columns, and with even more trickery you could get 60 rows.

      If you made a custom font you could also get other row counts, but this was rarely done.

      Eventually different text modes became available with higher resolution video cards and monitors. 132 columns of text were common but there were others.
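
      As a rough sketch of what calling that interrupt looked like from C, assuming a Borland-style DOS compiler (int86() and union REGS); AX=0x1112 asks the VGA BIOS to load its 8x8 font, which is what gave you 50 rows (43 on EGA):

          #include <dos.h>   /* int86, union REGS (Borland-style DOS compilers) */

          /* Switch to 80x25 color text, then ask the VGA BIOS for the 8x8 font,
             which yields 50 text rows. */
          void text_50_lines(void)
          {
              union REGS r;

              r.x.ax = 0x0003;   /* INT 10h, AH=00h: set video mode, AL=03h: 80x25 color text */
              int86(0x10, &r, &r);

              r.x.ax = 0x1112;   /* INT 10h, AH=11h AL=12h: load and activate the ROM 8x8 font */
              r.h.bl = 0;        /* font block 0 */
              int86(0x10, &r, &r);
          }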

    • danparsonson 3 days ago

      There were a bunch of predefined modes in the video BIOS, and with a little bit of assembler you'd issue an interrupt (a system call really) which would change the video mode. Then as the parent comment said, you could write to video memory directly and your writes would either be interpreted as ASCII character/attribute pairs in a text mode, or colour palette indices in a graphical mode.

      Most games at that time used mode 13h, which was 320x200 with 8 bits per pixel, which therefore indexed into a 256-colour palette (which could itself be redefined on the fly by reading from and writing to a couple of special registers - allowing for the easy colour-cycling effects that were popular at that time). Here's a list of the modes: https://www.minuszerodegrees.net/video/bios_video_modes.htm
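
      A small sketch of that pattern, again assuming a Borland-style DOS compiler; mode 13h exposes a 320x200, one-byte-per-pixel framebuffer at segment 0xA000 (palette reprogramming left out):

          #include <dos.h>   /* int86, union REGS, MK_FP */

          int main(void)
          {
              union REGS r;
              unsigned char far *screen = (unsigned char far *) MK_FP(0xA000, 0);

              r.x.ax = 0x0013;              /* INT 10h: set video mode 13h (320x200x256) */
              int86(0x10, &r, &r);

              screen[100 * 320 + 160] = 4;  /* pixel at (160,100), default palette index 4 (red) */

              r.h.ah = 0x00;                /* INT 16h: wait for a keypress */
              int86(0x16, &r, &r);

              r.x.ax = 0x0003;              /* back to 80x25 text mode */
              int86(0x10, &r, &r);
              return 0;
          }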

      • lproven 2 days ago

        There were also software-defined modes. A very handy tool for awful early-1990s LCDs was Laptop Ultravision:

        https://www.atarimagazines.com/compute///issue138/124_Laptop...

        It ran the DOS text screen in VGA graphics mode, with soft-loaded fonts, but this also permitted all kinds of extra modes -- iff your DOS apps supported them. Some just read the screen dimensions and worked, some didn't.

carra 2 days ago

Things like Borland C and VB/WinForms really do take me back to a simpler time. There was joy in being able to write simple programs very fast, in a more intuitive way, without needing to use browsers or frameworks or write shaders to do the simplest things. Current systems are more powerful and versatile for sure, but for a teenager curious about coding they are a much less welcoming environment in a lot of ways. The ever-growing number of technologies you need to learn now does not help either.

  • supportengineer 2 days ago

    Another aspect of the sheer number of technologies that you have to use is that you can’t do very much as a solo developer. It introduces a social aspect. You must work as part of a team, especially when you have a public-facing website or operate at large scale. So the field is less appealing for someone who is a typical introvert. The types of personalities are completely different compared to those from the mid-90s or before.

    • aadhavans 2 days ago

      I was talking about this with a friend earlier this week. The people who work in software these days seem much more extroverted and outgoing than the 'introverted nerd' stereotype from the 90s.

      • 4gotunameagain a day ago

        This is also due to the popularisation of computers, the internet and internet culture.

        Everyone and their aunts now are into computers, and one in x is a software "engineer". Back in the day, it was only the hardcore nerds that were attracted to these things :)

        • supportengineer a day ago

          What are the introverts and "hardcore nerds" supposed to do now?

          Where is the new refuge for us?

  • niutech 2 days ago

    You can still run Visual Basic 6 in Windows 11 and compile programs like in '99. Windows is incredibly backwards-compatible.

  • christophilus 2 days ago

    VB6 was a revelation to me, coming from C and C++ at the time. It was so much easier and still plenty performant for the kinds of little business applications I was building at the time.

  • never_inline 2 days ago

    Try flutter.

    • niutech a day ago

      Flutter is not native on any platform, it's just a canvas with painted custom controls. Nothing compared to lightweight native Win32 apps in VB6.

      • never_inline a day ago

        It has a much better language and works across platforms. These things have real value.

        • NetMageSCW a day ago

          It uses a language that few know and that has little local knowledge and support. It uses its own widgets and conventions to be cross platform instead of using native controls. These things are real downsides.

CGamesPlay 3 days ago

(Article is from 2023, so the title should be updated to say "32 years ago", or something)

The biggest loss in TUIs is the latest wave of asynchronous frameworks, which bring the joy of dropped keypresses to the terminal.

In any TUI released before the year 2000, if you press a key when the system wasn't ready, the key would just wait until the system was ready. Many TUIs today still do this, but increasingly frequently (with the modern "web-inspired" TUI frameworks), the system will be ready to take your keypress, and discard it because the async dialog box hasn't registered its event listener yet.

Other than that antipattern, TUIs are doing great these days. As for terminal IDEs, Neovim has never been more featureful, with LSPs and other plugins giving all the features this article discusses. I guess it isn't a mouse-driven TUI, so the author wouldn't be interested, but still.

  • cameldrv 3 days ago

    Yes. Back in the DOS days, and even before, when people used actual terminals, there was a keystroke buffer. You'd see people who really knew the interface fly through tasks, staying multiple keystrokes ahead of the UI. Stuff would just flash onto the screen and disappear as it processed the input that was already in its buffer. It should be possible to implement this with modern frameworks, but it requires thought.
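
    A toy sketch of the idea in C, using the conio kbhit()/getch() calls from Borland-style DOS compilers: keys arriving while the program is busy get drained into a local queue instead of being dropped, then replayed once the screen is "ready" again.

        #include <conio.h>   /* kbhit, getch (Borland-style DOS compilers) */
        #include <stdio.h>

        #define QMAX 64

        static int queue[QMAX];
        static int qlen = 0;

        /* Drain the keyboard into our own queue so type-ahead is never lost. */
        static void pump_keys(void)
        {
            while (kbhit() && qlen < QMAX)
                queue[qlen++] = getch();
        }

        int main(void)
        {
            long busy;
            int i;

            for (;;) {
                /* Pretend to be busy redrawing a screen, but keep polling the keyboard. */
                for (busy = 0; busy < 1000000L; busy++)
                    if ((busy & 0xFFF) == 0)
                        pump_keys();

                /* Now that we are "ready" again, replay the queued keystrokes in order. */
                for (i = 0; i < qlen; i++) {
                    if (queue[i] == 27)   /* Esc quits */
                        return 0;
                    printf("key %d\n", queue[i]);
                }
                qlen = 0;
            }
        }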

    • markus_zhang 3 days ago

      Yeah. I used to work as a phone surveyor, the one you hate. Our software was a terminal connected to a mainframe. I got used to it after a few weeks and was very productive.

      Costco Canada vision shops still use a terminal connected to an AS/400 machine, as I noticed when I snooped around last month.

      • Twirrim 3 days ago

        In the late 90s I was required to slowly replace dumb terminals with PCs. One of the older ladies taking phone orders was most put out by this, understandably. She was lightning fast on that terminal. She'd never used a PC (I hit on the idea of using solitaire to teach her to use a mouse, which worked amazingly well), and she was never able to get to the same speed with one as she'd had on her dumb terminal. It's hard to beat the performance of dedicated devices.

        • roelschroeven 3 days ago

          While I agree that dedicated devices can be more efficient than Windows-style user interfaces, and even more so than browser-based user interfaces, many people don't use those modern interfaces in efficient ways.

          I have observed countless times how many people fill in a field, then move their hand to the mouse to move the focus to the next field or button, then move their hand back to the keyboard, instead of just pressing Tab to move the focus. It's painful to watch. Knowing just a few keyboard shortcuts makes filling in forms so much faster.

          Things are getting worse, unfortunately. Modern user interfaces, especially in web interfaces, are made by people who have no idea about those efficient ways of using them, and are starting to make it more and more difficult to use any other method than keyboard -> mouse -> keyboard -> mouse -> ... . Tab and shift-tab often don't work, or don't work right. You can't expand comboboxes with F4, only the mouse. You can't type dates, but have to painstakingly select all the parts in inefficient pickers. You can't toggle options with the spacebar. You can't commit with enter or cancel with esc.

          • titzer 2 days ago

            It's for this reason that I dream of us going back to keyboard-first HCI. I wish the underlying BIOS could easily boot and run multiple operating systems simultaneously and there were keys that were hardwired to the BIOS to switch out of whatever GUI crap you were in to the underlying "master control mode".

            I wish we'd made a better correspondence between the GUI and the keys on the keyboard. For example, the Esc key should always be top-left of the keyboard, and every dialog box should have an escape that basically always does the same thing (go back/cancel) and is wired to that hardware key. Instead of drop-down menus at the top of the screen, we could have had pop-up menus at the bottom of the screen that positionally correspond to the F1-F12 keys.

        • thequux 3 days ago

          I recall reading somewhere that the entire point of solitaire (at least the original implementation that came with windows 3) was to teach users how to click and drag, so I'm not surprised that it was good for teaching your colleague how to use a mouse

        • int_19h 3 days ago

          An inventory management app was one of my first paid software engineering projects. Sometime in the early 00s I had to rewrite it for Windows because the ancient DOS codebase had a bunch of issues running on then-modern Windows versions. I sat down with the users and watched how they were using the DOS version, including the common patterns of keyboard navigation, and then meticulously recreated them in the WinForms version.

          For example, much of the time would be spent in a search dialog where you had a textbox on top and a grid with items right below. In the TUI version, all navigation was with arrow keys, and pressing down arrow in the textbox would move the focus to the first item on the grid. Similarly, if you used up arrow to scroll through the items in the grid all the way to the top, another press would move the cursor to the textbox. This was not the standard focus behavior for Windows apps, but it was very simple to wire up, and the users were quite happy with the new WinForms version in the end.

          • kccqzy 3 days ago

            The world needs more of this. It is nowadays rare for programmers to sit down with users and observe what they are doing. Instead we have UX designers designing the experience and programmers implementing that.

            • markus_zhang 2 days ago

              It is so frustrating that I’m not good enough to create software for myself. Maybe I should just buckle up and start working on that.

              I use an iPhone and have found a lot of usability issues. Some apps such as Stocks are perhaps not too difficult to recreate in Obj-C. I’m kind of an old-timer so I prefer Obj-C, even though I don't know anything about it.

          • markus_zhang 2 days ago

            The sitting down with users part seemed to be the most crucial one. Sadly, nowadays the developers of such software are often not even on the same continent, and a Zoom call can’t do this easily.

        • ssl-3 3 days ago

          In my own little world, I saw this first with mail and news readers. It was fast and simple to read mail and news with pine and tin: The same keystroke patterns, over and over, to peruse and reply to emails and usenet threads.

          As the network ebbed and flowed, email too often became unreadable without a GUI, and what was once a good time of learning things on usenet became browsing web forums instead. It sucked. (It still sucks.)

          In the greater world, I saw it happen first at auto parts stores.

          One day, the person behind the counter would key in make/model/year/engine and requested part in a blur of familiar keystrokes on a dumb terminal. It was very, very fast for someone who was skilled -- and still pretty quick for those who hadn't yet gotten the rhythm of it.

          But then, seemingly the next day: The terminals were replaced by PCs with a web browser and a mouse. Rather than a predictable (repeatable!) series of keystrokes to enter to get things done, it was all tedious pointing, clicking, and scrolling.

          It was slow. (And it's still slow today.)

        • bionsystem 3 days ago

          I saw this at an airport. I took the same plane twice, one year apart; in between they had replaced the terminal with a web UI. On the first trip it took the hostess (well into her 50s) 15 seconds to find my booking and print my pass. On the second trip (on the web UI), it took four hostesses teaming up for what felt like a good 5 minutes to do the same thing.

      • snovymgodym 3 days ago

        Costco still uses AS/400 company-wide for their inventory system I think

        • markus_zhang 3 days ago

          Interesting. Looks like it suits them perfectly. I wonder if the AS/400 is running in an emulator or on a real machine.

          • snovymgodym 3 days ago

            I doubt it, probably just running on a regular Power ISA rack mount server from IBM. Though I guess technically all IBM i aka AS/400 is running on an emulator.

            https://en.wikipedia.org/wiki/IBM_i#Technology_Independent_M...

            • snuxoll 3 days ago

                Nope, we still have an IBM i deployment kicking around at $DAYJOB; it's running natively on POWER hardware. Way back in the days of the original OS/400 running on AS/400 hardware, IBM had the foresight to have applications compile to MI (Machine Interface) code, which is a bytecode format closer to something like LLVM IR than to something like JVM or CLR bytecode. When a PGM object is copied or created on an IBM i system, TIMI (Technology Independent Machine Interface) takes the MI code and translates it to a native executable for the underlying platform.

              We probably still have a couple of PGM objects kicking around on our modern POWER hardware that were originally compiled on an old AS/400 system, but they run as native 64-bit POWER code like everything else on the machine.

                The IBM midrange line gets a lot of undue disdain these days. It's not sexy by any means, sure, but just like anything running on modern-day z/OS, you know that anything you write for it is going to continue to run decades down the line. Well, as long as you limit the amount of stuff you have running in 'modern' languages, because Java, Node, Python, Ruby, etc. are all going to need upgrades, while anything written in 'native' languages (RPG, COBOL, C/C++, CL) compiles right down to MI and will keep working forever without changes.

              • hylaride 2 days ago

                In some ways the IBM mainframe line is an amazing piece of engineering. My understanding is that the emulation layers can even emulate hardware bugs/issues from specific lines of long-dead equipment so that ancient code (that was written to take these issues in mind) will still function as expected.

              • sillywalk 3 days ago

                Nitpick:

                The Machine Interface dates back to AS/400's predecessor, the System/38.

                • snuxoll 3 days ago

                  Thanks, I was desperately trying to remember because I swore there was something beforehand, but it's been a very long time since I did the reading.

          • sillywalk 3 days ago

            As far as I know, there are no AS/400 emulators.

            It's still updated by IBM and runs on POWER. It's just called "i" now.

            I believe the naming went something like AS/400->iSeries->System i->System i5->i

            • cwbriscoe 2 days ago

              It's funny that they keep renaming it and everyone still calls it AS/400. I remember when they wanted people to call it iSeries but everyone just still used AS/400. I didn't even know about the others you posted and I still use the AS/400 occasionally.

    • misnome 3 days ago

      Fun story: When I worked at Blockbuster I had my computer access revoked and was summoned to explain myself because a colleague told management I was “hacking” when they saw me doing this on the computer system.

      • teeray 3 days ago

        Makes me wonder if that’s where the TV trope of a hacker flying through screens faster than you can see came from

      • p_l 3 days ago

        Was that still on the VMS-based blockbuster video system?

        Weird question, but I accidentally ended up with one of those in my hands that ran in (probably) a non-Blockbuster place from 1996 to 2000 :)

    • acuozzo 3 days ago

      > You'd see people who really knew the interface fly through tasks being multiple keystrokes ahead of the UI.

      I remember.

      This, unfortunately, killed people: Therac-25. Granted, the underlying cause was a race condition, but the trigger was the flying fingers of experts typing ahead, unknowingly having been trained to rely on the hardware interlock present in older models.

      • eviks 3 days ago

        > This, unfortunately, killed people: Therac-25. Granted, the underlying cause was a race condition

        So it didn't kill people, something else was that cause

        • acuozzo 2 days ago

          I'm not trying to shift blame to the operators here, but in the absence of flying fingers, nobody would have died. Many, many people received the right treatment from the Therac-25 machine.

          Also, the author of the buggy software had no idea it would be used to operate a machine without a hardware interlock as, AFAIR, it was not modified prior to being used with the Therac-25 model.

    • Matumio 3 days ago

      Remember the venomous, desperate BEEP! when the keystroke buffer was full. (Or was it when pressing too many keys at once?) Like a tortured waveform generator constantly interrupted by some higher-priority IRQ. Good times.

    • Theodores 2 days ago

      The keyboard buffer size was something like sixteen keystrokes. This was bad news if you noticed your input wasn't working and needed to press CTRL + whatever to quit the program, since the buffer was full and unable to accept the CTRL + whatever. Instead it had to be CTRL + ALT + DEL.

      Three decades later I learn that there were utilities to make the keyboard buffer bigger. But, in those days before search engines, how was I to know?
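
      For the curious: the default buffer lived in the BIOS data area as 16 two-byte slots (one always kept empty, so 15 keystrokes), with head/tail pointers at 0040:001A/001C and start/end offsets at 0040:0080/0082; the enlarging utilities basically repointed those at a bigger buffer. A peek at it, assuming Borland's peek() from dos.h:

          #include <dos.h>     /* peek() (Borland-style DOS compilers) */
          #include <stdio.h>

          /* Report the BIOS type-ahead buffer size and how many keystrokes are waiting. */
          int main(void)
          {
              unsigned head  = peek(0x0040, 0x001A);  /* offset of next key to read */
              unsigned tail  = peek(0x0040, 0x001C);  /* offset where the next key is stored */
              unsigned start = peek(0x0040, 0x0080);  /* buffer start, normally 0x001E */
              unsigned end   = peek(0x0040, 0x0082);  /* buffer end,   normally 0x003E */
              unsigned size  = end - start;           /* bytes; 2 per keystroke */

              printf("buffer: %u slots, %u keystrokes pending\n",
                     size / 2, ((tail + size - head) % size) / 2);
              return 0;
          }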

  • rkagerer 3 days ago

    Yes! That phenomenon drives me crazy. I used to be able to use a computer at warp speed by staying ahead of its responses with chains of rapid keyboard shortcuts etc. Now it's like I'm trying to stride through molasses.

  • nly 2 days ago

    Windows maintains both a synchronous and an asynchronous key state. The async one gives the state of keys in a polled fashion, and the other gives the state as applied by messages as you pump the Win32 message queue (it's in sync with respect to the messages you have observed from the message queue).

    https://devblogs.microsoft.com/oldnewthing/20041130-00/?p=37...
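
    A minimal illustration of the difference, in plain Win32 C (checking Shift inside a keyboard message handler):

        #include <windows.h>

        /* Inside a window procedure: the two calls can disagree when you are behind on
           the message queue, because GetKeyState reflects the state as of the message
           being processed, while GetAsyncKeyState reflects the physical state right now. */
        LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
        {
            if (msg == WM_KEYDOWN) {
                BOOL shiftWhenTyped = (GetKeyState(VK_SHIFT) & 0x8000) != 0;      /* synchronous */
                BOOL shiftRightNow  = (GetAsyncKeyState(VK_SHIFT) & 0x8000) != 0; /* asynchronous */
                /* For predictable type-ahead behaviour, decide based on shiftWhenTyped. */
                (void) shiftWhenTyped;
                (void) shiftRightNow;
            }
            return DefWindowProc(hwnd, msg, wParam, lParam);
        }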

  • a3w 3 days ago

    Rough quote: "in 1984 we had at my house",

    so even 41 years seems to be in the scope.

    I was expecting

    - early projects that ended in Visual Studio 1.0 or NetBeans soon after (2 to 9 years too early for them)

    not

    - "vim (1991) was not out yet" (not-a-quote, but my feeling upon looking at ncurses instead of floating windows)

    • projektfu 3 days ago

      I snickered a little because I know Visual Studio didn't have a version 1.0. Wikipedia identifies the first version as Visual Studio 97, which was at version 5.0. I remember before that there was "Microsoft Developer Studio 4.0" which came out around Windows 95, and could run on 95 or on NT 3.51. There was a Visual C++ 1.0 and a Visual Basic 1.0 released at different times. Meanwhile there were also the workhorses, Microsoft C and MASM. In those days, Borland and Watcom were real competitors to Microsoft for C and C++.

    • roryirvine 3 days ago

      Yeah, by 1995, Visual Basic / C++, Delphi / Borland C++, and Symantec C++ were all-conquering.

      A few years before, it was very different - VisualAge and Rational Application Developer were the big names in the early 90s in "professional" IDEs. Interface Builder for university spin-outs or funky startups (and SunWorks / Forte Studio for the less-funky ones). CodeWarrior on the Mac (perhaps with THINK! hanging on too). I think Softbench was popular for scientific software, but I never actually saw it myself.

      And then just a few years later, the rise of Java turned things upside down again and we got Jbuilder, Visual Cafe, & NetBeans as the beginning of yet another new wave. The Visual Studio suite really began to take off around then, too.

      In short, the 90s were a time of huge change and the author seems to have missed most of it!

      • calenti 3 days ago

        An all-in-one like Rational Rose may be making a comeback in terms of these agentic AI projects, because now you actually can turn a spec into code without layers of tagging and UML.

  • karmakaze 3 days ago

    I wasn't paying attention to when 30 years ago actually was...

    So disappointing to expect a GUI Smalltalk System Browser and seeing DOS TUIs.

    And then delight recalling Turbo C/Pascal and MS C 4.0 with CodeView that even worked in 43 or 50 line modes.

    • jll29 3 days ago

      Yes, me too, I was expecting either Smalltalk or LISP machine GUIs.

      Having said that, some old TUIs were clearer and faster even on weaker hardware. This should be a lesson for us today. Color transitions and animated icons flying over the desktop are NOT what I need, but speed, clarity, and discoverability of more rarely used functionality are vital.

      • igouy 3 days ago

        May 1988 -- Smalltalk/V 286 -- on IBM-PC, PS/2 or compatible, with an 80286 or 80386

        "INTRODUCTION TO THE SMALLTALK/V 286 ENVIRONMENT"

        http://stephane.ducasse.free.fr/FreeBooks/SmalltalkVTutorial...

        • pjmlp 3 days ago

          That was my introduction to Smalltalk.

          • igouy 3 days ago

            ditto

            So much better than the TUI Smalltalk/V

            • NetMageSCW a day ago

              Methods.

              • igouy a day ago

                Yes. I'd seen Methods a year earlier and chose to make a prototype with Lotus 123 instead. Then Smalltalk/V 286 became available.

    • bryantnyc 3 days ago

      Anyone else on here recall IBM VisualAge for Smalltalk -> VAST, or Cincom Smalltalk?

  • heresie-dabord 3 days ago

    When people love an IDE product so much that they can't work without it, they have overspecialised to their detriment. And possibly to the detriment of the code itself.

    > As for terminal IDEs

    The GNU/Linux terminal is the killer app. Multiple terminals in a tiling window manager is peak productivity for me. (Browser in a separate virtual workspace.)

    And modern scaling for a big display is unbeatable for developer ergonomics.

    • EbEsacAig 3 days ago

      > When people love an IDE product so much that they can't work without it, they have overspecialised to their detriment.

      I think you are wrong.

      https://en.wikipedia.org/wiki/Muscle_memory

      Being extremely good at something increases the gap between said something and everything else. That doesn't mean being extremely good at the first thing is "over-specialization to detriment". If someone is equally mediocre at everything, they have no such gap, so no "over-specialization to detriment"; but is that really worth desiring? I think not.

      • jancsika 3 days ago

        > Being extremely good at something increases the gap between said something and everything else.

        You're also potentially over-specializing at one level while at the same time neglecting other levels.

        Musicians run into this problem when, for example, they rely solely on muscle memory to make it through a performance. Throw enough stress and complicated music at them and they quickly buckle.

        Meanwhile, a more seasoned performer remembers the exact fingers they used when drilling the measure after their mistake, what pitch is in the bass, what chord they are playing, what inversion that chord is in, the context of that chord in the greater harmonic progression, what section of the piece that harmonic progression is in, and so forth.

        A friend of mine was able to improvise a different chord progression after a small mistake. He could do this because he knew where he was in the piece/section/chord progression and where he needed to go in the next measure.

        In short, I'm fairly certain OP is talking about these levels of comprehension in computer programming. It's fine if someone is immensely comfortable in one IDE and grumpy in another. But it's not so fine if changing a shortcut reveals that they don't understand what a header file is.

      • mxkopy 3 days ago

        What if the IDE is a LeapFrog 2-in-1 Educational Laptop

        • johnebgd 3 days ago

          If you make usable products that solve problems for others from that then it’s a great IDE…

    • wat10000 3 days ago

      Why is it to their detriment? It's not like they're stuck with it forever. "Can't work without it" is really "won't work without it because they prefer installing it over going without."

    • pjmlp 3 days ago

      As someone that started when only rich people could afford GUIs, I don't understand what is killer app about it.

      We used text terminals because that is what we could afford, and I gladly only start a terminal window when I have to.

      • eikenberry 3 days ago

        The killer thing about it is that it is a gateway to the shell, all the command line tooling and the best cross-platform UI.

        • pjmlp 3 days ago

          Xerox PARC, Atari, Amiga and many others had shells, without needing to live on a teletype world.

          It is only cross platform as long as it pretends to be a VT100.

          • eikenberry 3 days ago

            It's not about needing to live in a teletype world; it is about how language/text is just a better interface for a general-use computer. Computers' primary feature is that they are programmable, and an interface that allows you to take advantage of that is superior to one that doesn't. The programmable GUIs all failed to gain traction (Smalltalk and the like), which left the shell (and maybe spreadsheets) as the best UI for this. Though as AIs mature we might see a shift here, as they could provide a programmable interface that could rival shell scripting.

            • int_19h 3 days ago

              The reason why GUIs became so popular so quickly after they were introduced is because text is not "just a better interface for a general use computer".

              Like OP, I remember the days when command line was all you had, and even then we used stuff like TUI file managers to mitigate the pain of it.

              • eikenberry 20 hours ago

                But GUIs never took off as a UI for a general-purpose computer; they became the UI for applications on a general-purpose computer. For them to be the former requires them to be programmable. Smalltalk is the best/most-famous example of a graphical UI for a general-purpose computer I can think of...

                The main point is that for a general purpose computer the UI needs to integrate programming. Programming is how you use a computer. The shell (text) is currently the primary UI that inherently allows programming.

                • igouy 18 hours ago

                  Is a modern phone a general purpose computer?

                  What kind-of UI does a modern phone present?

            • pjmlp 2 days ago

              Great that Microsoft, Apple and Google are on the right path then, with AI voice controlled and gestures OSes.

      • lproven 2 days ago

        > I gladly only start a terminal window when I have to.

        Exactly so. I am perfectly able to work entirely in a text/CLI world, and did for years. I don't because I don't have to. I have better, richer alternative tools available to me now.

        It was very odd to join Red Hat briefly in 2014 and meet passionate Vi advocates who were born after I tried Vi and discarded it as a horrible primitive editor.

    • rpodraza 3 days ago

      Good luck writing Java with notepad.

      • anthk 3 days ago

        Tons of people did that but with nvi/vim and calling javac by hand.

      • pjmlp 3 days ago

        We did that back in 1996, however the sentiment applies to most languages.

        For example, Notepad versus the Turbo C++ described in the article.

      • int_19h 3 days ago

        Was literally a thing in some colleges.

  • constantcrying 3 days ago

    [flagged]

    • invader 3 days ago

      People should also stop using terminal emulators. It is pretty silly to base software around ancient printing terminals. Everyone knows for a fact that only tech illiterates use a console instead of a GUI. Since all great devs use a GUI. Just a fact.

      Also, people should stop playing 2D games. It is pretty silly to base your entertainment on ancient technology when modern GPUs can render super-complex 3D scenes.

      And don't make me start on people who still buy vinyl...

      • anthk 3 days ago

        Current GPU's can't compete with my brain 'rendering' a Slash'em/Nethack scene with my pet cat while I kick ass some foes with my Doppleganger Monk full of Wuxia/Dragon Ball/Magical Kung Fu techniques.

      • rkomorn 3 days ago

        Honestly hard to disagree with your first point even though it's sarcasm.

        It's still quite easy to end up with a terminal you need to reset your way out of (e.g. with a misguided cat), not to mention annoying term mismatches when using tmux/screen over SSH, across OSes, or (and this is self-inflicted) in containers.

      • constantcrying 3 days ago

        Completely disingenuous. Stop the snark.

        For UI there exists a straight up superior alternative, which keeps all of the benefits of the old solution. Neovim is just straight up better when used outside of a terminal emulator.

        What is true for TUI vs. GUI is not true for CLI vs. GUI (or TUI for that matter); pretending the argument I made applies to the latter is just dishonest. You cannot adequately replace CLI interfaces with GUI or TUI interfaces, but you can totally replace TUI interfaces with a GUI. See Neovim as an example. It is superior software when used outside of the terminal.

        • JSR_FDED 2 days ago

          Maybe on paper. But the snappy low-latency feel of TUI apps in the terminal is a joy, and unequaled in GUIs.

          • constantcrying 2 days ago

            >Maybe on paper. But the snappy low-latency feel of TUI apps in the terminal is a joy, and unequaled in GUIs.

            This is not true at all. Terminal emulators are GUIs; the TUI is just another layer on top of that GUI. Using a TUI will always introduce additional latency, depending on the quality of the terminal emulator.

            I do not know what GUIs or TUIs you are using, but my KDE Apps are all extremely snappy.

    • eikenberry 3 days ago

      TUIs are the best cross-platform apps. They run on all the major and minor platforms in general use. GUIs cannot compete, with browsers being the next closest thing. TUIs can be integrated with the shell and also work perfectly well remotely without issues. They are superior in many ways to GUIs and have a place in the ecosystem.

      • lproven 2 days ago

        > TUIs are superior in many ways to GUIs and have a place in the ecosystem.

        There's another reason you don't mention.

        Consistent UI.

        TUI apps can (and in the Windows world usually do) use the same keyboard controls, derived from IBM CUA, as their GUI equivalents do.

        This is why I use Tilde in the Linux shell: the same commands work in it as in Pluma or Leafpad or Mousepad or whatever: Ctrl+O opens a file, Ctrl-X/C/V to cut/copy/paste, Ctrl+N for new, etc.

      • constantcrying 3 days ago

        TUIs do not even run the same across terminal emulators.

        It is a total joke to call something which depends on how the underlying terminal emulator interprets specific ANSI escape sequences "multi platform".

    • yoz-y 3 days ago

      Most of my work is done on remote machines. Nothing beats tmux+tuis in this paradigm.

      • pjmlp 3 days ago

        I'd rather stick with RDP, or browser-based workflows.

        • yoz-y 3 days ago

          They are fine, however RDP requires more bandwidth and most of the stuff I run is terminal commands anyway.

          Company I work for has a great browser based IDE but that’s something I would never setup and maintain for a personal project.

    • mlyle 3 days ago

      Modern terminals do color just fine-- 24 bit color support has existed since 2010-ish, and been mainstream since 2015.

      There's nothing wrong with graphical IDEs... or text user interfaces. Great developers use both. Low effort troll is low effort.

      • calenti 3 days ago

        +1 - crap code can come out of notepad / emacs / vi or IDE-flavor-of-the-day or even the AI code sausage maker. Testing, specification, knowing what you are building and why still matters.

    • pjmlp 3 days ago

      Agreed, we used TUIs because we couldn't afford anything better on MS-DOS, CP/M, 8 bit home computers.

      People on better systems like the Amiga and Atari were already past that.

      • anthk 3 days ago

        Vim was born on the Amiga, and AmigaOS came with some Emacs clone.

        • pjmlp 2 days ago

          I surely don't remember such clone.

          As for where Vim was born, it hardly matters; it was someone with a UNIX culture background who happened to own an Amiga.

          • tralarpa 2 days ago

            > I surely don't remember such clone.

            I think they mean MicroEmacs. Despite its name, it was not Emacs, but it had Emacs-like keyboard shortcuts, multiple buffers, and macros, which was quite neat for a free 1986 application on a home computer.

            • pjmlp 2 days ago

              I guess that is it, thanks for the memory refresher, and to be more precise, MEmacs.

          • anthk 2 days ago

            Amiga OS 3.1 has it under the Workbench floppy sets. You get it by default.

            • pjmlp 2 days ago

              Amiga 500 was shipped with AmigaOS 1.2 in 1987, Amiga OS 3.1 was released in 1994, almost at the end of the commercial life of Amiga.

              As the sibling comment points out, MicroEmacs isn't really Emacs.

              Also Emacs history is older than UNIX, and overlaps with Lisp Machines.

    • uniclaude 3 days ago

      SSH comes to mind.

      • constantcrying 2 days ago

        How so? I use remote machines all the time; why would I need a TUI for that? VSCode and Zed support editing on remote machines, and the remote drives can also be mounted on the local machine. What purpose would any TUI have? What even are the potential benefits?

        Right now I can use the exact same software I use on my local machine. Can you give me any reason why I should consider anything else?

mkovach 3 days ago

Ah, Borland’s IDE! An absolute delight. I’ve yet to find anything modern that matches it. Sure, nostalgia turns everything syrupy, but I actively hunt for excuses to use Free Pascal just to fire up that interface. Okay, fine—I like Pascal too. You caught me.

I also use Sam and Acme from Plan 9 (technically from the excellent plan9port), but let’s be honest: those aren’t IDEs. They’re editors. Tools that let me think instead of wrestle.

There’s a lot we could (and probably should) learn from the old TUIs. For example, it’s perfectly acceptable, even heroic, to spawn a shell from the File menu and run something before returning. Seems people are afraid of losing style points with such grievous actions.

And the keybindings! So many of those classic TUIs adopted WordStar’s sacred keystrokes. They’re burned into my muscle memory so thoroughly that using EMACS feels like trying to type with oven mitts. For years, joe (with the blessed jstar alias) was my editor of choice.

Anyway! Time to boot the Dr. DOS VM, spin the wheel of Advent of Code, and be nostalgically inefficient on purpose.

  • bombcar 3 days ago

    One thing about the "professional" DOS software (and you can see it in things like Emacs - eight modes and constantly shifting) was you were basically expected to live in it - it had the full attention of the computer and the user.

    You were also expected to learn it; which meant you became "one with the machine" in a way similar to an organ player.

    I remember watching Fry's Electronics employees fly through their TUI, so fast that they'd walk away while it was still loading screens, and eventually a printout would come out for the cage.

    • Mountain_Skies 3 days ago

      About twenty years ago I did a consulting gig for a government agency that wanted to create a web interface for their CSRs to replace the green screens they had been using. The long time employees hated it because they had deep muscle memory for most tasks on the green screens and could get far ahead of the screen refresh. With the web UI, not only could they not type ahead, but many of the workflows now required use of the mouse.

      The agency was happy to have something new and modern but more important to them was that new employees could be trained on the system far faster. Even though there were a small number of long term employees, they had high turnover with the frontline CSRs, which made training a major issue for them.

    • Aurornis 3 days ago

      > it had the full attention of the computer and the user.

      This is why I like to use the full-screen mode of my editors and IDEs.

      It surprises a lot of people who see my screen. Full screen features are everywhere but rarely used.

      • vunderba 3 days ago

        Agreed. I do a lot of my writing in Typora which, in addition to a full-screen mode, also has other "Focus" style features which get rid of distracting UI/UX elements, etc. so you can concentrate on the task at hand.

    • exe34 3 days ago

      Even normal Windows applications used to be like this (outside of crashing). I could alt-tab, type stuff and click where I knew a button would show before I even saw the application window. It never missed a keystroke or typed into the wrong window. Nowadays you load a webpage and start typing, and half your text appears and then the other half just never shows up.

    • chiph 3 days ago

      Paying at Best Buy was torture - watching the cashier move their mouse around (on the slanted mousing surface they were given so they couldn't just let go) and click the buttons, going through 3 or 4 screens and waiting for them to load vs. using the keyboard. They would have been done with me and on to the next customer in half the time.

    • esafak 3 days ago

      The old TUIs were faster yet I still prefer IntelliJ; it's fast enough and much more powerful.

  • robenkleene 3 days ago

    > So many of those classic TUIs adopted WordStar’s sacred keystrokes.

    What are the WordStar bindings and what do you like about them?

    I have a general interest in the history of how these patterns emerge and what the benefits of them are relative to each other.

    • ninalanyon 3 days ago

      They are control key sequences that are arranged so that a typist need never remove their fingers from the keyboard. The control key was to the left of the A, so it was easily pressed with your left little finger.

      You had full control of the cursor without the need for dedicated arrow keys or page up and down keys. It worked on a normal terminal keyboard. I first used it on an Apple ][ with a Z80 add-on that ran CP/M.

      • robenkleene 3 days ago

        Thanks for sharing! I'd consider all those things true for Emacs/Vim bindings as well? (Just curious if you'd disagree with that assessment.)

        • ninalanyon 7 hours ago

          That's true-ish. But the thing about Wordstar is that it is a word processor not a text editor. Other word processors don't make this so easy. Also the standard keybindings for cursor control in Emacs are much less ergonomic.

          ^ = control

          In Wordstar: ^S/^D moves left/right; ^E/^X moves up/down; ^A/^F word left/right; ^R/^C moves page Up/Down

          Notice that all of those use only the left hand. In Wordstar almost everything to do with cursor control uses only the left hand.

          Emacs is mnemonic ^b for left (back), ^f for right (forward), ^n for next line, ^p for previous line, etc. You need both hands and the keys are all over the keyboard.

    • disqard 3 days ago

      Sci-fi author Robert Sawyer (who has won Hugo and Nebula awards) is a big fan of Wordstar -- he uses it to write his books.

      I highly recommend reading this:

      https://www.sfwriter.com/wordstar.htm

      • robenkleene 3 days ago

        This is useful, but it also seems like a very comparable feature set to editors like Emacs and Vim. So I'd still love to hear from someone who has the background to do a direct comparison, especially if they prefer WordStar.

        • mkovach 3 days ago

          Vim was never a steep learning curve for me; more of a gentle slope. But then again, I cut my teeth on ed, and when I met sed, it felt like a revelation. On DOS, I even used edlin, a kind of ed junior with training wheels and a sadistic sense of "functional."

          You have to understand: my first DOS machine was a Tandy 1000, acquired before I had a driver’s license. It was upgraded over the years and not retired until the grunge was well underway and I had already been married and divorced.

          MS-DOS’s edit had WordStar keybindings; Ctrl-S to move back, Ctrl-E to move up, and so on. My dad "brought" home a copy of WordStar from work, and oh, the things that trio, WordStar, me, and a dot matrix printer conspired to create.

          Borland carried those keybindings into Turbo Pascal, which I learned in college, having finally escaped the Fortran 77 gulag that was my high school’s TRS-80 Model III/IV lab. The investment into the Apple II lab didn't happen until AFTER they gave me my exit papers at a spring awards ceremony.

          Why do I still prefer these tools?

          Because they’re what I know. They don’t get in my way. We have history, a better and longer history than I have with my first wife. Those keybinds helped me write my first sorting algorithms, my first papers on circuit design, and the cover letters that got me my first jobs. They’re not just efficient. They’re familiar. They’re home.

          • robenkleene 3 days ago

            Thanks for sharing! (And to be clear, that's totally a great reason!) I wasn't familiar with these bindings and was curious to hear more about them, both the history and the subjective preference for them are both interesting to me.

        • lproven 2 days ago

          > So I'd still love to hear from someone who has the background to do a direct comparison

          Can do.

          > especially if they prefer WordStar.

          I don't. I dislike both WS and Vi.

          Vi (and its variants, I am covering all of them here) is a Unix tool. In the 1980s, Unix was big and expensive, and usually ran on boxes so expensive they had to be shared. So, mainly found in academia and research.

          WordStar is a CP/M tool which became dominant there and for a while carried that dominance over to DOS.

          It ran on affordable standalone microcomputers you owned and didn't have to share with anyone else.

          What they share is that they are designed for keyboards before things like cursor keys, Insert/Overtype, PageUp/PageDown, Home/End were added. They do not assume these; they expect just letters, a few shift/meta/ctrl type keys, and nothing else.

          So, they redefine all these functions with letters and letter combinations.

          So, cryptic, idiosyncratic, hard to learn, but once you learned, fast and powerful. Touch-typists loved them, because your fingers stayed on the home row and you never needed to reach off the alpha-block and into the extra keys. (The ones that are a different colour on the classic IBM Model F and Model M keyboard.)

          Vi is the Unix flavour of touch-typist's UI, for those from universities and research and maybe big rich corporations.

          WordStar is the DOS flavour of touch-typist's UI, for those who bought or built their own computers and installed and ran their own software on inexpensive machines.

          In its time, WS keys were everywhere in DOS. The cracks started showing when WordPerfect took the DOS wordprocessing crown, with its even weirder function-key driven UI, which really favoured the Model F layout (f-keys down the side) and contained built-in copy protection in the form of colourful keyboard templates.

          Then IBM CUA came along and mostly swept that away. I was there and using DOS then and I much prefer CUA.

          Same functional role, but different commercial markets.

        • Narishma 3 days ago

          I've used all three and I think it's just a matter of what you're used to. I mostly use vi but have no problem switching to the other two schemes when needed. But maybe that's just me not having strong preferences. I know some people who have trouble switching from Chrome to Firefox and those are practically identical.

      • mbreese 3 days ago
        • manquer 3 days ago

          Not an example we want to cite for the productivity prowess of WordStar, given Martin's throughput as a writer in the last couple of decades.

          • mbreese 2 days ago

            I look at him more as an example of someone who is committed to his process and tools.

  • citbl 3 days ago

    What a wonderful write up and I feel the same.

    I've been working in my free time on a TUI code editor in the same vein, eventually with make and lldb built in.

  • chuckadams 3 days ago

    There's a lot from Plan 9 I love, but I couldn't find Acme's mouse-dependent UI acceptable in the least. I can't deal with any UI that requires precise aim when I have to use it hour after hour, and I'd hate to imagine using it if I had an actual disability.

    • mkovach 3 days ago

      Most days, you’ll find me in sam, regexing my way to bliss like some monastic scribe with a terminal fetish. When I feel the urge to let AI stroke my curiosity or scaffold a long template like magic, I cut, paste, and drop it into a local or remote model like a well-trained familiar.

      But I’ve also written larger applications and, frankly, a ridiculous amount of documentation in Acme. That 9P protocol was my backstage pass: every window, every label, was accessible and programmable. I could, for example, hook into a save event and automatically format, lint, and compile ten or fifteen years before most IDEs figured out how to fake that kind of integration.

      Sure, the system demands precision. It doesn't coddle. But for me, that was the feature, not the bug. The rigor sharpened my thinking. It taught me to be exact or be silent, forcing me to pause when I usually would not.

  • skopje 3 days ago

    djgpp + vi for dos in 1991 ftw!

    • rahen 2 days ago

      Did you really use vi on DOS in 1991? I don’t remember Elvis being easy to find back then, and I don’t think it was a TSR either, so the compiler couldn’t be spawned in the background like it was with the Borland IDEs.

      Almost every C bedroom programmer I knew had a cracked copy of Turbo C / Turbo C++ because they were so modern and convenient. DJGPP was a nightmare in comparison: it filled up the small HDDs of the time, created large executables, and the process of opening edit.com, leaving the editor, running gcc, and then going back to edit.com was tedious.

      The few brave souls using DJGPP would usually end up running Slackware from around 1993. This was a step up from bolting an awkward POSIX runtime onto a monotasking system, as DJGPP did on DOS.

      DJGPP was a stellar idea, basically the WSL / MinGW of its day, but the limitations of DOS prevented it from shining compared to the Borland IDEs.

MomsAVoxell 3 days ago

I used to use a Java-oriented IDE called “Visix Vibe”, at first as an experiment in application development with Java and then as an alternative to Delphi, which was my bread and butter tooling environment for custom application development.

Both of these IDEs gave me a huge productivity boost, and it used to be a no-brainer to give customers a realistic estimate for getting the UI done, wiring up the logic, and getting things ready to ship, etc.

I really miss these IDEs, because of the integration factor. It was fun to wire up a database table and generate a form and immediately have something that could be used for data input and validation on the project - then spend a few weeks refining the UI to be more robust and cater to the application use case in focus.

These days, it feels like a lot more careful planning is needed to make sure the UI/API/backend realms can play properly together.

It would be nice to see some more progress on this level of tooling. It’s one thing to have an AI generate UI code - but I still think there is room for painting the code and using a UI to build a UI.

(The moment someone produces a properly WYSIWYG tool for JUCE, the glory days will begin again ..)

  • spacechild1 a day ago

    > The moment someone produces a properly WYSIWYG tool for JUCE,

    Side note: Steinberg's vstgui framework not only has a WYSIWYG editor, you can even build/edit the UI while the plugin is running in a DAW. I usually give Steinberg a lot of shit for their arrogance and ignorance, but this I found extremely cool! I have only toyed around with it, so I don't know how viable it is for complex plugin UIs.

  • dapperdrake 3 days ago

    You may laugh, but that is how I use html forms today. Simple. And effective.

    • MomsAVoxell 3 days ago

      I wouldn’t laugh at that, but my context is native applications and will be, for a while. Sure, the web is great and all. But native applications still have a part to play - especially in realms requiring custom applications be built, i.e. not for mass-market.

whartung 3 days ago

I think these modern TUIs are a testament to the general failure of modern GUIs.

It's not like these are particularly easier to write.

But since there's no remote GUI option, much less a portable remote GUI option, particularly one that's not just a video of an entire desktop, we're stuck with these.

Who wants to fire up an entire desktop just to open a simple utility app?

Obviously the Web satisfies much of the demand for this, but clearly not all.

Remote X is, essentially, dead. It's obviously "really hard", since "no one does that any more". Or, folks just don't miss having rootless windows peering into their remote server enclaves.

It's just too bad, full circle, here we are again. "Progress."

  • pjmlp 3 days ago

    Failure of Linux Desktop you mean.

    RDP works great and GUI tooling for Windows and macOS is quite comparable to using VB, Delphi, Smalltalk like experiences.

    • eikenberry 3 days ago

      RDP works great on Linux as well. The problem isn't remote access, it is lack of good cross platform GUIs. There is a reason browsers are dominating the UI space and TUIs are popular.

      • pjmlp 3 days ago

        Yeah, laziness; we had plenty of cross-platform GUIs in the 1990s.

      • int_19h 3 days ago

        Qt is still around. And there's stuff like Avalonia.

        The problem is that people don't use that and reach for Electron instead, and then you get that "bad on any platform (but good enough to ship)" effect.

    • f1shy 3 days ago

      X could do that before RDP was even a project. I think OP is meaning something different.

      • pjmlp 3 days ago

        Of course it could, it is essentially dead, because hardly anyone still does X remoting, and Wayland doesn't support it.

        • int_19h 3 days ago

          Ironically these days the built-in "remote desktop" feature in Gnome is literally RDP.

          • lproven 2 days ago

            Not really ironic. The GNOME team are mostly paid by the same company as the Wayland team (and the Flatpak team and the OStree team and the systemd team) and that team wants to replace X.org with its own new tool.

    • vpShane 3 days ago

      There are no failures for Linux Desktop; this can never be the meaning. I say this with humor in mind.

      Requiring me to have a cloud account to format my machine (mac) and requiring me to have a cloud account on only pre-authorized hardware (Windows 11), only to open up Notepad and see they slapped AI inside of it; now that is quite comparable to me slapping Linux on it.

      Just sayin'

      • wpm 3 days ago

        You don’t need any kind of account to erase a Mac.

  • do_not_redeem 3 days ago

    > But since there's no remote GUI option

    ssh -X and waypipe both work perfectly fine.

    And to your point about portability, if you're stuck on an OS other than Linux, VNC/RDP aren't pretty but they'll get the job done.

    • general1465 3 days ago

      > And to your point about portability, if you're stuck on a non-linux OS, VNC/RDP aren't pretty but they'll get the job done.

      If you can make them work. Sorry, you can't connect if a user is logged on to this computer. Whoops, an RDP session is active, so I will show you this black screen after you type your username and password until the other user disconnects (why not kick out the user?). VNC is an even bigger pain when you need to boot up the server from SSH and sometimes restart it when it gets stuck.

      While on Windows you can just install TightVNC and it works. No screwing with screens. On macOS you can just tick Remote Screen Sharing, set your VNC password, and it just works. Even Android can do that with droidVNC-NG. But Linux is such a PITA for getting VNC or RDP working.

      And RDP also assumes that you are running X11 and not Wayland.

      • do_not_redeem 3 days ago

        > I will show you this black screen

        Right, that's why I said RDP isn't pretty if you aren't on Linux. Windows insists on creating a separate desktop for each session. IIRC it has something to do with licensing, they don't like simultaneous users using one Windows license.

        > While on Windows you can just install TightVNC and it works.

        If you're resorting to installing third-party apps, you can install TightVNC on Linux too, and it just works. Though I found krfb performs better on my network, ymmv.

        > And RDP also assumes that you are running X11 and not Wayland.

        RDP is just a protocol that describes the bytes going over the network. Why would it care about your display server? There are VNC and RDP servers for both X11 and Wayland. Just install one that's supported by your system.

        Though if you're on linux, you don't have to deal with the VNC/RDP jank at all. Just use ssh -X or waypipe and it's way snappier.

        • Tor3 2 days ago

          I use ssh -X all the time (actually I have -X permanently enabled so it's always there - for the targets where I want to use it). However, if there's any combination of latency and heavy X traffic (which can be found in surprising places), ssh doesn't cut it. Then VNC is necessary. The difference can be as much as ten minutes of waiting for the GUI to come up versus nearly instant if I have a VNC session running.

Lapel2742 3 days ago

> “In my house”, we used something called SideKick Plus (1984), which wasn’t really a code editor: it was more of a Personal Information Management (PIM) system with a built-in notepad.

Finally! Someone who still remembers the best software ever written. I looooved Sidekick and we used it throughout our small company. It's so long ago. I remember only parts of it now but it was such a useful tool.

  • SoftTalker 3 days ago

    IDEs blossomed on DOS because there was no multitasking. On unix/linux, even on a "dumb" tty with no GUI, you could hit CTRL-Z and your editor would go into the background and you'd be at a shell where you could run make or gdb or manage files. Then type 'fg' and your editor would be back exactly as you left it.

    IDEs do all that in one huge program because if you exited your editor to run the compiler or run your program, when you went back to the editor it was starting up cold again.

    TSR programs like Sidekick avoided some of this but were a poor substitute for real multitasking.

    • burnt-resistor 3 days ago

      There are multitasking options using DESQview(/X) or Windows >=3.1. A friend of mine in high school ran a 4 line BBS using DESQview and 4 Courier 28.8K modems.

      In real mode, it's possible to have a TSR that swaps the entire contents of RAM to and from disk. As long as such a hypothetical TSR is always loaded into a fixed location, it's possible to save and restore the entire DOS, program, and/or EMS/XMS session.

  • 3x35r22m4u 3 days ago

    SideKick had the ability to take "screenshots" of the text shown in other applications. Being a TSR was cool, but stealing text from another program interface was mind blowing!

  • NetMageSCW 3 days ago

    The best software ever written was ThinkTank (and next to it, Memory Mate). Sidekick was great for popularizing the TSR, though.

coldcode 3 days ago

I used Borland Turbo Pascal in 1984. It was amazing to work with something so fast on a PC that was really so slow. No IDE/compiler since then has matched that speed. Today's code is massively more sophisticated and complex, so there is no way to match that performance despite how fast today's computers are.

  • elevation 3 days ago

    I got a Windows 95 66MHz Pentium machine with 16MB of RAM at a yard sale in 2002. Visual Studio 5 Enterprise worked plenty fast on that, with features I still miss (or which don't perform as well) in modern environments.

  • chadcmulligan 3 days ago

    Have you tried Delphi lately? Very fast; it compiles in less than a second.

  • 1313ed01 3 days ago

    Came here to share the mandatory link on this subject, dadgum's Things That Turbo Pascal is Smaller Than: https://prog21.dadgum.com/116.html

    I used Turbo Pascal 2 as late as 1991, if not later, because that was the version we had. It was really fast on a 386 40 MHz or whatever exact type of PC we had then. A bit limiting perhaps that it only came with a library for CGA graphics, but on the other hand it made everything simpler and it was good for learning.

    A few years ago I wanted to run my old Turbo Pascal games and decided to port to Free Pascal. Sadly Free Pascal turned out to only ship with the graphics library introduced in Turbo Pascal 4, but on the other hand I got a few hours of fun figuring out how to implement the Turbo Pascal 1-3 graphics API using inline assembler to draw CGA graphics, and then my games worked (not very fun games to be honest; more fun to implement that API).

  • burnt-resistor 3 days ago

    Neat. I have an original copy of Numerical Recipes in Pascal.

    I used to have a copy of a Turbo Pascal graphics book with a blue-purple Porsche (not pg's hah) on the cover that included code for a raytracer. It would take about a minute to render one line at 320x200x256 colors, depending on the number of scene objects and light sources.

  • marstall 3 days ago

    totally. so strange that that was the fastest, most flowy dev experience i've ever had. been chasing that ever since. though i must say rails, vite, react fast refresh, etc. are pretty nice in that regard.

albertzeyer 3 days ago

Ok, this post is mostly about text-based IDEs, but I think the point mostly stands as well for IDEs in general. I'm thinking about Visual Basic or Delphi.

I think such an IDE for Python would really be helpful for beginners. Not text-based, but more like Visual Basic. But everything integrated, everything easily discoverable (that's very important!). Maybe also with such a GUI builder as in VB. And a built-in debugger. I think for the code editor, as long as it has some syntax highlighting and tab auto-complete, that would already be enough. But some easy code navigation would also be good. E.g. when you place some button on your window and double-click that button in the GUI editor, you get to the handler code of that button.

Some time ago, a small group of people (me included) (I think from some similar HN post?) got together and we brainstormed a bit on how to implement this. But we got lost in the discussion around what GUI framework to base this on. I already forgot the options. I think we discussed about PySide, Dear PyGui, or similar. Or maybe also web-based. We couldn't really decide. And then we got distracted, and the discussion died off.

Note, Free Pascal was mentioned here. But then you should also mention Lazarus (https://www.lazarus-ide.org/), which builds on Free Pascal and clones Delphi. Lazarus is really great. And it is actively developed. But Object Pascal is too little known nowadays, and maybe also a bit outdated.

  • mkl 3 days ago

    > I think such an IDE for Python would really be helpful for beginners. Not text-based, but more like Visual Basic. But everything integrated, everything easily discoverable (that's very important!). Maybe also with such a GUI builder as in VB. And a built-in debugger.

    That was Boa Constructor, starting in 2000: https://boa-constructor.sourceforge.net/. It seemed good at the time, but never caught on.

  • mamcx 3 days ago

    There was FoxPro too, which was amazing and in some ways better than Delphi, especially for the integrated "command window" and the ability to show "widgets" at will (like BROWSE).

  • NetMageSCW a day ago

    If you are willing to forego the GUI development, LINQPad is an amazing interactive environment for creating libraries to help your scripting and then speed up the scripting of small tasks. It is fast and easy enough that I've used it for complicated processing of small amounts (a few screens) of text instead of emacs, but powerful enough to run queries against phone systems or import GBs of CSV and produce an Excel summary.

  • jmmv 3 days ago

    > Ok, this post is mostly about text-based IDEs, but I think the point mostly stands as well for IDEs in general. I'm thinking about Visual Basic or Delphi.

    Exactly. I recently recorded a video of me creating a toy app with VB3 on Windows 3.11 and the corresponding tweet went “viral” for similar reasons as this article.

    It’s not really about the TUI: it’s about the integrated experience as you say!

  • cookiengineer 3 days ago

    To me, Lazarus is the gold standard of IDEs, because Lazarus is built in Lazarus and the authors really know how to dogfood their own product. So many examples and tutorials; it's really awesome for teaching kids how to make programs.

    I mean, it's really an IDE similar to what we were used to later with Microsoft Visual Studio, when it had integrated tagging, search, help, API docs, widget examples and libraries etc.

    Every time someone tries to tell me what an IDE is, I say nope, that's not an IDE because nothing is integrated there. Xcode could arguably still be called an IDE, but it's a painful one in comparison with Lazarus.

    But what do I know, I am so stubborn that I have been building my own UI frontend framework in Go for the last 6 months because I refuse to go back to the JS world. Hopefully at some point I can make an IDE with that, but who knows how much time I need for that...

anarticle 3 days ago

"the IDE had to be discoverable right away (which it was) and self-contained to offer you a complete development experience"

This right here was the key to super flow state. Lightning fast help (F1), very terse and straightforward manuals. I have tried to replicate this with things like Dash (https://kapeli.com/dash), to some degree of success.

The closest thing I had to this on Windows was probably Visual Studio 6, before the MSDN added everything that wasn't C/C++ to the help docs. After that, the docs got much harder to use because they were no longer single-purpose. The IDE was a little more complex, but you at least felt like you got something for it. After that, too many languages, too many features, overall not a great experience.

The keybindings were so simple and fast; the Borland IDE on DOS was a very nice tool. Yes, easier than vim and emacs. The reason is the mouse support in the TUI: things like complex selection/block/text manipulation don't need keybindings in the same way, so the key combos are more "programming meta" (build, debug, etc.) rather than "text meta".

EDIT: also, I feel like this needs to be mentioned: compilers were not free (as in beer) at that time!

In order to develop on my own machine as a teen, I had to sneakily copy the floppy disks the teacher used to install this on the school computers so I could have more than 1h using it at home! COPY THAT FLOPPY

4b11b4 3 days ago

Recent install of Emacs 30 and Doom 3.0 via https://github.com/jimeh/emacs-builds (MacOS) is feeling very nice.

Emacs actually is friendly! apropos and all of the describe commands make it /discoverable/.

Literate configs and tangling?! I finally feel the end game.

Yes, you probably should read a book at the same time on the side to give you a higher perspective on fundamentals. Sure, some other tools are simpler to get started.

If I could drop everything I'd make a simple emacs config for kids with like a turtles mode and maybe a sound sequencer, then teach them functional programming first. Hah

JeremyHerrman 3 days ago

I'm interested in how these old IDEs were used during the transition from assembly to high level languages. It seems especially topical given the LLM integration into today's IDEs.

Back then was it common to have a split or interleaved view of high level and assembly at the same time?

I'm aware that you could do something like the following, but did IDEs help visualize in a unified UI?:

    $ cc -S program.c
    $ cat program.s    # look at the assembly
    $ vi program.c     # edit the C code

A quick search shows that Borland Turbo C (1987) had in-line assembly:

    myfunc ()
    {
        int i;
        int x;
        if (i > 0)
            asm mov x,4
        else
            i = 7;
    }

From the 1987 Borland Turbo C User's Guide [0] "This construct is a valid C if statement. Note that no semicolon was needed after the mov x, 4 instruction. asm statements are the only statements in C which depend upon the occurrence of a newline. OK, so this is not in keeping with the rest of the C language, but this is the convention adopted by several UNIX-based compilers."

[0]: http://bitsavers.informatik.uni-stuttgart.de/pdf/borland/tur...

  • cameldrv 3 days ago

    The Turbo Pascal version of this was even better, because you could do a whole block of asm instead of just a single line at a time. As I remember there were annoying limitations in the C version around labels and such. It was incredibly useful when writing performance oriented code at that time because it was very easy to write code that would outperform the compiler.

    • badsectoracula 3 days ago

      I'm pretty sure you could do asm blocks in Turbo C using brackets, e.g. asm { ... }, though it might have been a Turbo C++ thing (I never really used plain TC much).

  • miohtama 3 days ago

    It was not very common to interleave assembly in MS-DOS IDEs. Assembler and its IDE were separate tools you paid for. But not unheard of.

    You could "dump" your OBJ file for assembly.

    Later C compilers got some better inline assembler support but this was towards the 32-bit era already.

    Also Borland had its own compiler, linker, and such as separate binaries you could run from a Makefile, but you really never had to; why would you, when you could do it in the IDE with a single keypress?

    • int_19h 3 days ago

      It was quite common actually, most DOS C compilers supported asm{} blocks and Turbo Pascal also supported inline assembly. Paid assemblers like MASM were high-end tools.

      On Unix though it was more common to have .s files separately.

  • qingcharles 3 days ago

    I was writing game engines in Turbo C and assembler in 1988. I don't remember using inline assembler until the 90s. I just had all the graphics routines in a separate .asm file which was part of the build process and then linked in.

api 3 days ago

Speaking of bloat: why are binaries from Rust or (much worse) Go so damn huge? This is in release mode with debug off.

It’s weird because memory use for the same sorts of programs is not much worse than other languages. In Rust memory use seems comparable to C++. In Go there’s a bit more overhead but it’s still smaller than the binary. So all this is not being loaded.

I get the sense devs just don’t put a lot of effort into stripping dead code and data since “storage is cheap” but it shows next to C or even C++ programs that are a fraction of the size.

I see nothing about Rust’s safety or type system that should result in chonky binaries. All that gets turned into LLVM IR just like C or C++.

Go ships a runtime so that explains some, but not all, of its bloat.

  • AlexeyBrin 3 days ago

    Mostly because of static linking. C and C++ don't put every library they need in the binary by default. The advantage is that a pure Go or Rust binary just works (most of the time) when copied from one machine to another, you don't have to care about installing other libraries.

    • api 3 days ago

      That’s a great point.

      Another advantage is that at least for Rust you can do whole program optimization. The entire program tree is run through the optimizer resulting in all kinds of optimizations that are otherwise impossible.

      The only other kinds of systems that can optimize this way are higher level JIT runtimes like the JVM and CLR. These can treat all code in the VM as a unit and optimize across everything.

      • TinkersW 3 days ago

        C++ has had whole program optimization since forever. And you can use static linking if you want, the same as Rust.
      • aleph_minus_one 3 days ago

        > Another advantage is that at least for Rust you can do whole program optimization. The entire program tree is run through the optimizer resulting in all kinds of optimizations that are otherwise impossible.

        I get why this might lead to big intermediate files, but why do the final binaries get so big?

        • 3836293648 3 days ago

          Rust binaries + all their dynamic libraries are the same size as C++ binaries + their linked libraries (when stripped; stripping isn't the default in Rust).

          The main issue is that Rust binaries typically only link to libc, whereas C++ binaries link to everything under the sun, making the actual executable look tiny because that's not where most of the code lives.

        • uecker 3 days ago

          Both C++ and Rust are based on monomorphization, which means generic programming is based on an expansion of code for each combination of types. This makes compilation slow and causes code bloat. One then needs whole program optimization to get this under control to some degree.
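
          A minimal sketch of that expansion in C++ (made-up function; Rust generics behave the same way): each concrete type a template is used with gets its own copy of the machine code, unless the optimizer happens to inline it away.

              #include <iostream>
              #include <string>

              // One copy of the machine code is emitted per concrete T the
              // template is instantiated with (monomorphization).
              template <typename T>
              T largest(const T &a, const T &b) {
                  return a < b ? b : a;
              }

              int main() {
                  // Three instantiations -> roughly three copies of `largest`
                  // in the object code: int, double, std::string.
                  std::cout << largest(1, 5) << '\n';
                  std::cout << largest(1.5, 0.25) << '\n';
                  std::cout << largest(std::string("a"), std::string("b")) << '\n';
              }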

    • Onavo 3 days ago

      Go especially, on some platforms they go straight to syscalls and bypass libc entirely. They even bring their own network stack. It's the maximalist plan 9 philosophy in action.

      • self_awareness 3 days ago

        I don't really like Go as a language, but this decision to skip libc and go directly with syscalls is genius. I wish Rust could do the same. More languages should skip libc. Glibc is the main reason Linux software is binary non-portable between distros (of course not the only reason, but most of the problems come from glibc).
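
        For anyone wondering what "go directly with syscalls" looks like in practice, here is a rough sketch (x86-64 Linux only, GCC/Clang extended asm) of a write(2) that never goes through libc; roughly what the Go runtime does internally:

            #include <cstddef>

            // write(2) invoked directly: syscall number in rax, args in
            // rdi/rsi/rdx; the kernel clobbers rcx and r11.
            static long raw_write(int fd, const void *buf, std::size_t len) {
                long ret;
                __asm__ volatile("syscall"
                                 : "=a"(ret)
                                 : "a"(1L /* __NR_write */), "D"(static_cast<long>(fd)),
                                   "S"(buf), "d"(len)
                                 : "rcx", "r11", "memory");
                return ret;
            }

            int main() {
                const char msg[] = "written without touching libc\n";
                raw_write(1, msg, sizeof(msg) - 1);
            }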

        • badsectoracula 3 days ago

          > Glibc is the main reason Linux software is binary non-portable between distros

          Linux software is binary portable between distros as long as the binary was compiled against a glibc version that is the same as or older than the one on the distros you are trying to target. The lack of "portability" comes from symbol versioning, which lets the library expose different versions of the same symbol precisely so it can preserve backwards compatibility without breaking working programs.

          And this is not unique to Glibc, other libraries do the same thing too.

          The solution is to build your software against the minimum versions of the libraries you intend to support. Nowadays with Docker you can set this up in a matter of minutes (and automate it with a Dockerfile) - e.g. you can use, say, Ubuntu 22 to build your program and it'll work on most modern Linux OSes (or at least glibc won't be the problem if it doesn't).
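
          A related hack, for completeness (not a replacement for building on an old distro, and only a sketch): GNU toolchains let you pin an individual call to an older glibc symbol version with a .symver directive, so the resulting binary also loads where only that older version exists:

              // Pin realpath to the old x86-64 baseline version instead of the
              // current default; the exact version strings are glibc-specific.
              __asm__(".symver realpath, realpath@GLIBC_2.2.5");

              #include <climits>
              #include <cstdio>
              #include <cstdlib>

              int main() {
                  char resolved[PATH_MAX];
                  if (realpath(".", resolved))  // resolves against the pinned version
                      std::puts(resolved);
              }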

          • self_awareness 2 days ago

            > Linux software is binary portable between distros as long as the binary was compiled using a Glibc version that is either the same or older than the distros you are trying to target.

            Well, duh? "Property A is possible if we match all requirements of property A".

            Yes, using an older distro is the de facto method of resolving this problem. Sometimes it's easy, sometimes it's hard, especially when we want to support older distros while using a new compiler version and fairly fresh large libraries (e.g. Qt). Compiling everything on an older distro is possible, but sometimes it's hell.

            > And this is not unique to Glibc, other libraries do the same thing too.

            This only means that it is a very good idea to drop dependency on glibc if it's feasible.

            macOS has a "minimum macos required" option in the compiler. Windows controls this with manifests. It's easy on other systems.

            • badsectoracula 2 days ago

              > Yes, using older distro is the de facto method of resolving this problem.

              What I describe is different from what you wrote, which is that Linux is not binary compatible between distros. This is wrong because Linux is binary compatible with other Linux distributions just fine. What is not compatible is using a binary compiled using a newer version of some shared libraries (glibc included, but not the only one) on a system that has older versions - but it is fine to use a binary compiled with an older version on a system with newer versions, at least as long as the library developers have not broken their ABI (this is a different topic altogether).

              The compatibility is not between different distros but between different versions of the same library, and what is imposed by the system (assuming the developers keep their ABIs compatible) is that a binary can use shared libraries of the same or newer version as the one it was linked against - or more precisely, it can use shared libraries that still expose the versions of the symbols that the binary uses.

              Framing this as software not being binary portable between different distros is wildly mischaracterizing the situation. I have compiled a binary that links against X11 and OpenGL on a Slackware VM that works on both my openSUSE Tumbleweed and my friend's Debian system without issues - that is a binary that is binary portable against different distros just fine.

              Also if you want to use a compiler more recent than the one available in the distro you'll need to install it yourself, just like under Windows - it is not like Windows comes with a compiler out of the box.

        • int_19h 3 days ago

          The consequences of this genius decision were stuff like this:

          https://github.com/golang/go/issues/16570

          Which is why they have already backpedalled on this decision on most platforms. Linux is pretty much the only OS where the syscall ABI can be considered stable.

          • self_awareness 2 days ago

            Yes, Linux is reversed in this aspect -- glibc is not really binary friendly, but kernel syscalls are. On other systems, kernel syscalls are not binary friendly at all, but libc is friendly.

            I'm fine with using libc on other systems than Linux, because toolchains on other systems actually support backward compatibility. Not on Linux.

        • steveklabnik 3 days ago

          You can only skip libc on Linux. Other unices and Windows don’t let you.

          • immibis 3 days ago

            You can skip libc on Windows - you can't skip the system DLLs like kernel32. (In fact, Microsoft provided several mutually incompatible libcs in the past.)

            Well, you can non-portably skip kernel32 and use ntdll, but then your program won't work in the next Windows version (same as on any platform, really - you can include the topmost API layers in your code, but they won't match the layers underneath in the next version).

            But system DLLs are DLLs, so also don't cause your .exe to get bloated.

            • steveklabnik 3 days ago

              Yes, it's not literally libc on windows, but the point is that directly calling syscalls is not supported, you have to call through the platform's library for doing so.

              On some systems, this is just not a supported configuration (like what you're talking about with Windows), and on some they go further and actually try to prevent you from doing so, even in assembly.

              • immibis 3 days ago

                There's still something on the platform that you can call without extra indirection in the way on your side of the handoff. That is true on all platforms; whether it's an INT or SYSCALL instruction or a CALL or JMP instruction is irrelevant.

                • int_19h 3 days ago

                  If it's a CALL instruction into a user-space DLL, that's still an extra indirection.

                  • immibis 2 days ago

                    Kind of like the syscall dispatch table on the Linux kernel side, right? After you issue the handoff instruction and it becomes the operating system's problem, there's still more code before you get to the code that does the thing you wanted.

  • kranke155 3 days ago

    What binaries are a good example of this?

  • maccard 3 days ago

    Rust doesn’t strip debug info by default.

    • jeroenhd 3 days ago

      The vast amount of debug info does make the problem worse, but it doesn't take long before a moderately complex Rust program grows to 100MB even after stripping.

      When I tried to compare Rust programs to their C(++) equivalents by adding the sizes of linked libraries recursively (at least on Linux, that's impossible for Windows), I still found Rust programs to have a rather large footprint. Especially considering Rust still links to glibc which is a significant chunk of any other program as well.

      I believe many of Rust's statically linked libraries do more than their equivalents in other languages, so I think some more optimisation in stripping unused code paths could significantly reduce the size of some Rust applications.

tern 3 days ago

I think what people miss about bloat, and about what's changed with software over the years, is that a vast variety of niche use-cases are now supported. Software runs on dozens of different systems, every aspect is customizable and programmable, and thousands of different programming languages and approaches are supported.

To give a random example, I use Neovim with SuperCollider, a music programming language. This involves launching a runtime, sending text to the runtime, which in turn sends commands to a server. The server generates a log, which is piped back into a Neovim buffer. There are all sorts of quirks to getting this functional, and it's a somewhat different workflow from any traditional programming model.

I'm not sure there's an easy solution to keeping things simple while also supporting the unimaginable variety of personalities, skill-levels, environments, and tasks people get up to. I do, however, think it's worth continued imagination and effort.

jfengel 3 days ago

I used most of these and they weren't worth it. They didn't do enough to justify being locked in to the tool. The debuggers were too, uh, buggy. Command line tools were more flexible.

They finally got good enough in the late 90s. I think it helped that computers finally had enough memory to run both the editor and the program itself.

burnt-resistor 3 days ago

I learned to code with Turbo Pascal 6 without the internet by trial-and-error and the debugger. When in real mode, a program crash would often reboot the system or occasionally lead to some other unexpected behavior/semi-silent corruption.

Borland C++ 3.1 & Application Frameworks for DOS and Windows 3.1 came with an entire library of paper books. It was probably the heaviest and largest boxed retail software package ever because 4.0 skimped on paper books and didn't include real mode versions of the IDE for DOS.

The Pascal almost equivalent was Borland Pascal 7.0 with Objects.

It was possible to link assembly, C++, and Pascal in the same executable assuming the memory model and function calling convention were set correctly.

nhatcher 3 days ago

Worth mentioning that a version of MS Edit is now open source. Not only that, it is extraordinary code to learn from:

[1]: https://github.com/microsoft/edit
[2]: https://news.ycombinator.com/item?id=44031529

  • self_awareness 3 days ago

    I wouldn't learn from it, since it uses unsafe code even for basic stuff, like hash calculation.

    It's also a complete reimplementation; it shares only the name with the original edit.com.

    • nhatcher 3 days ago

      Well, you wouldn't learn from it as in "let's learn Rust from it". But I think it has many interesting bits not found in most open source projects out there. And true that, there is a lot of unsafe code, nightly builds, its own allocator, crazy stuff. I did learn a lot from it :)

      And yeah, it is a reimplementation. But it is a TUI and very minimal. Keeping it minimal, with no dependencies and no software bloat, seems to be one of the guidelines. So it very much adds to the article's case.

      But yeah, you are right on both counts.

duanhjlt 3 days ago

Nostalgia aside, those classic TUIs nailed responsiveness and cohesion. Modern setups can match features, but rarely that instant, synchronous feel. Emacs + Magit shows the power of text-first integration, yet JetBrains-style debuggers and glue still win for many. It’d be great to see a modern, fast, Borland‑like TUI with solid LSP and LLDB integration.

  • dualogy 3 days ago

    TextAdept, the most criminally-underhyped high-quality text editor that I know of, has supported LSP for quite some time now IIRC, and has (not just GUI versions but also) a first-class TUI version.

    • jasperry 3 days ago

      TextAdept and its possibilities have intrigued me for a while now; a fast Qt GUI and Lua-based configuration sounds like a sweet spot for me. But I've never fully taken the plunge to configure it for development. I'm open to reading more propaganda for it if you can point me to any :)

guerrilla 3 days ago

I'm going to get punished for saying this, but I don't really see the point of IDEs when you have things like vim, Makefiles and bash. It just seems like more things to go wrong. I used Eclipse while I was doing Java development for a while and it had some conveniences but for the most part I just see it as one more thing that can go wrong and get in my way.

Anyway, does anyone remember Metrowerks CodeWarrior? I see it still exists, but I mean back from the 90s. I got a T-shirt from them at MacWorld '99 and still had it until not too long ago. High quality merch.

  • raw_anon_1111 3 days ago

    As someone who has been doing this either professionally (since 1996) or as a hobbyist programming in assembly and a little Basic (1986-1992), I'm always amazed at the feigned Slashdot-style "I haven't owned a TV in 40 years, why do people still watch them".

    Are you really saying that you don’t see any utility in modern IDEs? Even back in 1999 I thought Visual Studio was a breath of fresh air let alone R# with all of the built in refactors in 2008.

    But going further back, to the Turbo days in college and my first few years working, breakpoints, conditional breakpoints, watches etc were a godsend

    • zahlman 2 days ago

      I won't speak for OP, but generally I don't think the attitude is feigned.

      I've tried IDEs and didn't like them. I don't even really like having close parentheses auto-typed for me. It breaks my flow.

    • guerrilla 3 days ago

      What I'm saying is that they can't do anything I can't do in a terminal. Another way of putting it is why would I need an IDE other than UNIX (GNU) itself?

      > But going further back, to the Turbo days in college and my first few years working, breakpoints, conditional breakpoints, watches etc were a godsend

      gdb does all of that.
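
      For example (file and variable names made up):

          $ gdb ./myprog
          (gdb) break parser.c:120                # breakpoint
          (gdb) break parser.c:120 if depth > 3   # conditional breakpoint
          (gdb) watch total_bytes                 # stop whenever the variable changes
          (gdb) run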

      • raw_anon_1111 3 days ago

        I can also walk 13 miles or get in my car and drive. So why do I need a car?

        GDB does guaranteed safe refactors over large code bases?

      • nec4b 3 days ago

        >> What I'm saying is that they can't do anything I can't do in a terminal.

        Why do you need a terminal for if you can do all that with flipping switches and looking at LEDs?

      • miohtama 3 days ago

        Because working in an IDE is an order of magnitude easier than with UNIX tools, especially for novices, significantly increasing productivity. The author also covers this a bit.

  • a-dub 3 days ago

    > Anyway, does anyone remember Metrowerks CodeWarrior?

    didn't it have a cute little re-distributable header file that had a bunch of useful containers in it? (linked lists, hash tables, etc)

    i didn't work with it much but once worked with a mac guy who added it to our project. sometimes i'd have to build his stuff, i remember lots of yellow road construction icons!

  • fragmede 3 days ago

    The convenience is the point. Instead of having to go and find the file that lists a class's functions, an IDE can list them and you can just click on the one you want. As the author points out, LSPs do that function in the modern era, but the point is, it's useful. Doesn't have to be your cup of tea, but you should at least be able to see the point.

  • tjpnz 3 days ago

    I do like the "quietness" of vim. With a minimal configuration I've replaced most of the conveniences I enjoyed with PyCharm and VSCode, but without the constant notification spam and weird virtualenv configuration issues I previously had switching between projects.

  • ivankelly 3 days ago

    I recall CodeWarrior being the official IDE for SymbianOS when I started there. And it sucked, but likely more due to the integration. I think sucky, custom, rarely-working IDEs are what pushed me to full-time emacs.

  • tpm 3 days ago

    I can't really imagine navigating huge Java codebases with vim or bash. OTOH I used vim to work with Perl (that was not a small codebase either but had a very different structure).

cmrdporcupine 3 days ago

One I used to love back in the 80s/90s was GFA Basic on the Atari ST. In a similar category of TUI (mostly, it did have mouse control and menu bars, but you didn't have to reach for them) with great auto-indentation and code folding (features not common in mainstream editors at the time) and instant compilation and error checking.

It took many decades for me to get that kind of flow back for mainstream programming languages on modern computers. And modern IDEs still have higher latency than they should.

Surac 3 days ago

I agree with emacs. It is a fantastic operating system but it lacks a good text editor. I remember using the Borland IDE and miss its clear design language. Fighting with the IDE, GUI, and language is no fun. I miss the IDE I could just start up over an RS-232 connection and use.

markus_zhang 3 days ago

Good article.

I'm more of a GUI guy who is content with VSCode. I'm intrigued to learn Emacs but don't have the time for it.

Back in the 90s, however, the Borland TUI was indeed the pinnacle. I remember playing with Turbo C for a while without really learning anything, but it was fun just to use the IDE.

  • bigstrat2003 3 days ago

    I think that GUI editors are just plain superior for doing serious work. I'm a Sublime guy myself, but really any GUI editor blows any text based option out of the water. The only good use case for text based editors these days is to quickly edit and save config files while ssh'ed into a server.

    • markus_zhang 3 days ago

      I heard people can be pretty productive in Emacs/Vim. My issue is that I’m not a great programmer, so 99% of the time is spent on reading code and figuring out the steps on paper. I’m sure TUI can be great for that purpose too, but GUI IDEs on multiple screens are on par at least.

mrbonner 3 days ago

The Turbo Pascal “IDE” was my very first dive into programming, and it’s still the best experience I’ve ever had! Maybe it’s just because it was my first, but wow, it already had tons of cool features back then, like syntax highlighting, step debugging with breakpoints, and a quick peek at variable values.

Do you think there’s anything like that out there today? The only ones I can think of that are closed are nano and micro editors, but I wouldn’t really call them IDEs.

buescher 3 days ago

Thirty years ago was 1995. TUI programming environments were closer to obsolete than obsolescent. We had 32-bit versions of Visual C++. We had Delphi and the Borland C++ stuff for Windows. We had Codewarrior. On the lighter end of RAD, Visual Basic was four years old, HyperCard was eight years old, and LabView was nine years old. The future was very unevenly distributed back then. I see now the article is from 2023, well, adjust appropriately.

username223 3 days ago

When I use an editor, I don't want eight extra KILOBYTES of worthless help screens and cursor positioning code! I just want an EDitor!! Not a "viitor". Not a "emacsitor". Those aren't even WORDS!!!! ED! ED! ED IS THE STANDARD!!!

TEXT EDITOR.

-- https://www.gnu.org/fun/jokes/ed-msg.txt

greatgib 3 days ago

A little bit later, there were visual editors for GUI apps, like Delphi and Visual Basic and co.

Despite VB being a little bit shitty, I think a big loss happened in the GUI software development world when web apps became the norm.

Not many remember this world, where you could easily create your UIs graphically by placing components and where reactive interfaces were a given without effort.

I really miss the original Delphi before things went DotNet shitty...

  • 1313ed01 3 days ago

    The nicest thing about programming for DOS (or probably any old home computer or console?) is that you are in full control of inputs and timing. If you only need to update the screen when the user hits a key for instance you can just call a function to wait for the next key-press, handle that, ask for the next... There is no async, no callbacks, no events, no threads. Nothing is easier than just imperative code doing things one line then the next, then the next. You can still have a main loop, but you do not need to, and if some function you call somewhere to handle something wants to wait for a key-press before returning it can do that and you do not have to yield or anything.

    I'd love to see some modern environment replicate that somehow. Let us pretend everything is simple and synchronous even if it very much isn't.

shermantanktop 3 days ago

I struggle to understand what the author actually wants, aside from nostalgia about a specific look and feel that he imprinted on. And perhaps the simplicity of having few features.

I would have appreciated a breakdown of what specific individual features those crummy old IDEs offered.

I suspect the one the author wants most is a time machine to go be 12yo again, but software can’t do that. Yet.

geenat 3 days ago

Learned to code with Borland Turbo C++

Moved to Dev-C++

Nowadays just any editor and using GCC directly

Eternally grateful for open source; Microsoft charged thousands for Visual C++ back then.

James_K 3 days ago

Of course, Emacs can do everything listed in this article, even edit remote files over SSH while in graphical mode.

cess11 3 days ago

I don't know, all of those are pretty similar to me.

I'd like to be able to develop in other languages the way I do when I dabble in Pharo, i.e. mostly windows and widgets and dialogs that abstract away boilerplate and file and directory management, and that let me relatively easily extend the environment when I feel like it.

Instead I tend to complement the editor or IDE with a rather large set of Linux and Unix programs in terminal emulators. It's not nice or easy to teach, but nicer than trying to figure out whatever module protocol the editor uses. Perhaps I could have stayed with Emacs and been content, but when I arrived at this methodology Emacs was still single-threaded and quite sluggish in comparison.

I'm hoping Glamorous Toolkit might be the thing that eventually grows into what I'd like to have.

LaGrange 2 days ago

The amazing thing about Turbo Pascal was that my memory was better, my focus was better, I could stay up much longer and my right knee didn’t hurt.

Edit: I think there is a lot of “do it because we can do it not because we benefit from it” in modern software. But fetishizing the past gets a bit silly. I do still remember getting plenty of useless documentation in old TP. And blasting through an interface that lags behind your typing is how you end up doing something completely unpredictable because one keystroke was a bit off.

inetknght 3 days ago

> Each program was its own island because its interface was unique to the program. However, they were all so similar in how they looked like—80x25 characters didn’t leave much room for uniqueness—and how they worked that the differences didn’t really get in the way of usability and discoverability. Once you learned that the Alt key opened the menus and that Tab moved across input fields and buttons, you could navigate almost any program with ease.

This is the biggest thing I miss in modern GUIs, especially Windows, macOS, or mobile.

Tabbing across every single possible input, with alt or control keys for quick access, is insanely powerful compared to "click here, scroll this, click click".

  • oblio 3 days ago

    I haven't checked in Win 11, but all standard Windows apps used to be fully navigable by keyboard only. Notepad, Wordpad, etc.

G_o_D 3 days ago

Settled on https://scintilla.org/SciTE.html It's simple yet extensible for any language; you can create your own linter and highlighter.

It's nothing bloated; it's not an embedded IDE like VS Code and all.

Rather, SciTE is just a GUI frontend for CLI-based binaries; it uses the programming environments you have installed on your OS.

Just like the old days on cmd or sh, where we used compilers like javac or cpp directly.

Now SciTE just makes it easy. It's exactly the same as Borland Turbo: you add the path to your compiler binaries in SciTE, and done; you click compile or run.

Plus it's lightweight and portable: carry it on a USB stick and run it on any computer by just setting the paths to the compiler and executor binaries.

skopje 3 days ago

1. coders will use every available resource, and

2. there is no limit on resources.

The consequences can be left to the reader (re: the article in this thread), but these two postulates are the source of all ills in commodity and open-source software today.

  • andai 3 days ago

    Coders considered harmful.

tangotaylor 3 days ago

Having TUIs available for remote administration is an excellent point. I frequently spin up nmtui on machines with NetworkManager because I’m used to Ubuntu’s network settings GUI and I haven’t bothered to learn enough nmcli.

(“real” deployments would use systemd-networkd and config files but for simple things…who cares)

No matter how good computers and networking get, text-based tools always seem to win for remote administration. I’ve tried forwarding X servers, mounting remote file systems with sshfs, vscode’s remote features, VNC, RDP, but I always seem to revert back to just tmux and TUI tools.

Dwedit 3 days ago

The very first image in the article is not edit.com, it is the Windows 95 edit.exe which replaced it.

The actual "edit.com" is a tiny stub that launches QBasic in edit mode, equivalent to "qbasic /edit".

jmyeet 3 days ago

I learned on Turbo Pascal many years ago. It was amazing. There's another aspect beyond the TUI though: compiler speed. Turbo Pascal was designed for compilation speed. It was significantly faster than, say, Turbo C++ for an equivalent program.

But this brings up something I think about every now and again: resource bloat.

When Turbo Pascal was current it'd be common for PCs to have 1MB of RAM. In fact, with the DOS memory model you had to do weird stuff to use more memory than that (IIRC it was called "large mode").

Obviously running in a graphical environment is going to use more memory but we had pretty capable Windows environments with Win 95/98/SE/NT3.5/NT4/XP with not much RAM (256MB to 1GB depending on year).

Now with modern windowing systems we have capabilities that didn't exist in early windowing OSs like scalable rather than bitmapped fonts, UI scaling, etc. But we've had those things for 20+ years now and the resource requirements still keep going up.

Yes we have Javascript UIs running in a browser now and that will never be as cheap as native apps but we've also had those for ~20 years now (GMail is ~20 years old).

In the 90s we had graphical X Windows systems on Linux with 4-16MB of RAM. I know. I ran them.

Why do resource requirements keep going up? Is there demand for a low resource OS that could be user-facing? I know hardware is particularly cheap with Raspberry Pis and similar. We have ARM CPUs for a few dollars now that would've cost millions in the 1990s. So maybe that's why there's no demand.

But this is really something I expected to top out at some point and it just hasn't.

random29ah a day ago

It's sad that no one has commented about Ultimate++ ( https://www.ultimatepp.org ).

I believe it's the easiest way (at least for me) to quickly create GUI programs.

But of course, nothing beats Borland's interface from the DOS era.

kristianp 2 days ago

It's funny that students in the early 90s would use a pirated copy of Turbo C or Pascal, but in the early 2000s, when Java took over curriculums, students couldn't access a decent IDE. Many never saw a debugger in action, and executed javac in a DOS window.

  • elric 2 days ago

    That sounds unlikely. AFAIK both Eclipse and Netbeans have had pretty good debuggers since the early days, probably around 2002ish.

    • kristianp 2 days ago

      The 2nd year students I interacted with weren't aware of those and they weren't available in the PC labs at the university. I think there was something called BlueJ IDE as a learning tool.

      https://en.wikipedia.org/wiki/BlueJ

      • elric a day ago

        That sounds like a teaching problem instead of a tooling problem.

        Snark aside, I think there's value in teaching the basics first (e.g. compiling stuff with javac) before moving on to using magic to hide the complexity.

  • lmz 2 days ago

    I recall using Eclipse 2.x in the mid 2000s and it was good, although not fast.

  • titzer 2 days ago

    IntelliJ was already awesome in the early 2000s.

reaperducer 3 days ago

The author notes that he programmed back in the 1980's, but the article only focuses on the mid-90's IDEs.

I'd like to see a companion article about the IDEs from the 80's.

I remember 64FORTH had a multi-pane IDE, but I could only find this low-res picture of it: https://www.c64-wiki.de/images/thumb/2/24/Forth64-audiogenic...

There were others, though, including one I remember that was all text at the bottom half of the screen, and then graphic output at the top.

And, of course, the most famous one of all: the Atari 2600 BASIC Programming IDE which fit in just 4K.

Today's ragebait bloggers like to say how awful it was, but if you're patient and thoughtful, the way people were when it came out, you can do quite a lot.

An entire Pong game in six lines, from Wikipedia:

  1 Hor2←2+Key
  2 IfVer1>90ThenVer1←88
  3 IfHitThenVer1←9
  4 Ver1←Ver1+IfVer1Mod2Then8Else92
  5 Hor1←Hor1+7
  6 Goto1
  • somat 3 days ago

    I do want to point out that the 2600 was at its heart a slightly generalized Pong machine, to the degree that I don't find it surprising that you can make Pong in 6 lines of 2600 BASIC.

    The 2600 graphics were centered around 5 sprites dedicated to two players, two missiles, and a ball. Completely understandable: they were trying to make a toy computer affordable enough for everyone in 1975, but their design process was basically "what is the bare minimum video hardware required to make the games Combat and Pong?". Every single game found on the 2600 that is not a Combat or Pong clone is probably a masterwork example of making the hardware do something it was not intended for.

    https://en.wikipedia.org/wiki/Television_Interface_Adaptor

    Footnote: yes I know it was released in 1977, but it was designed in 1975.

chuckadams 3 days ago

I learned the C language from K&R, and the C API by right-clicking on all the things in Turbo C on a 286. Learned a fair bit of Lisp from emacs using C-h f. I do love the navigation capabilities modern IDEs have, but unless a library's author has gone off the deep end writing doc comments, they don't have the same discoverability.

ataru 3 days ago

The interfaces look like they wanted to be graphical, as they have windows and drop down menus, and they wanted to have multi-tasking, as they implemented the overlay tools in Borland Sidekick. They wanted those things but they were limited to staying in text mode because widespread adoption of any real graphical interface was slow.

It seems a little humorous now that professionals were stuck for several years doing their day to day word processing, spreadsheets and databases in text mode, where getting different sized text or different fonts was almost impossible. This also wasn't just in the 80s, it was still somewhat true in the early 90s, not very long before the beginning of the internet as we know it.

Still, I wonder if things are really any better now, as we're all using software interfaces built on something else that's not really appropriate for the job. HTML.

bibiver 3 days ago

I saw this and was like, what?

> have we advanced much in 30 years?

IDEs have changed a lot, especially with AI-assisted ones. The author kind of acknowledges it, but imho it's a paradigm shift, not just "a major difference".

> The only major difference that we are starting to see might be AI-assisted coding, but this is a feature mostly provided by a remote service, not even by the installed code!

Then I realized it’s a post from 2023. IDEs have changed a lot since then. Autocompletion has evolved from merely suggesting function names to completing 20 lines of code in the blink of an eye. It's great for productivity, but it also makes you lazy, to the point where you can't live without it.

In my opinion, software engineers should “disable the autopilot” from time to time, just like airline pilots must occasionally land without it. Otherwise, you end up becoming too dependent on it.

  • nxor 3 days ago

    Is this not an overstatement? How does a person understand code if they write so much of it with AI?

    • FpUser 3 days ago

      Because I tell the AI exactly what to write and very often how to write it, to avoid the sub-optimal solutions AI is so keen to propose if not properly directed.

      As for autocompletion, not sure about every tool, but CLion and the other IDEs I have from JetBrains are genius. Yes, they can autocomplete multiple lines of code with a single keystroke, and no, I do not really want to write it myself, as it's mostly boilerplate code I've written many times and autocompletion just predicts it.

      • nxor 3 days ago

        How can you recognize optimal / suboptimal solutions if you need to use AI in the first place? As for boilerplate, I thought there were ways to automate this without AI, but I guess that makes sense to me. Not trying to sound accusatory, just jarred by the AI hype generally

        • FpUser 3 days ago

          >"How can you recognize optimal / suboptimal solutions"

          Maybe because I have 40+ years of programming under my belt starting with machine codes and every type of software one can imagine.

      • zahlman 2 days ago

        > and no I do not really want to write it myself as it mostly boilerplate code I've written many times and autocompletion just predicts it.

        See, I don't want my code to require boilerplate in the first place.

    • pohl a day ago

      Speaking only for myself, I do it by reading

    • tpmoney 3 days ago

      The same way a tailor understands stitches even if a sewing machine creates most of the stitches a tailor will ever stitch. Because understanding code is tangential to writing it. Which isn’t to say that writing it doesn’t help solidify knowledge and certainly plenty of people learn by doing and there may even be skills that atrophy as a result of not writing code in the same way that skills for writing assembly have atrophied with the use of higher level languages. But ultimately it is possible to understand some code without having written most of it by hand.

qingcharles 3 days ago

What amazes me every day is that I'm using literally the exact same GUI to build apps that I was over 30 years ago, with Visual Studio.

You could sit someone down from 1991 (Visual Basic 1.0) in front of Visual Studio 2026 and they would immediately know where everything is. (it still has BASIC in there too)

anonymousiam 3 days ago

I read this earlier today. I agree 100% with the author's opinion that Turbo Pascal had the best IDE ever. I had been using it on the CP/M platform (Z-80 only), and it was so much better than everything else in that era that it was like magic.

rednafi 3 days ago

I love reading about and occasionally tinkering with older text editors like vim, emacs, ed, acme, ex, nedit, and so on.

For actual work, though, I’ve been using VS Code exclusively since its inception. Electron might be a bloated mess, but spending time on alternatives doesn’t feel worth it. Maybe that’s because I didn’t grow up in the golden era of computing and can’t make the vim workflow stick no matter how hard I try.

I’m pretty sure twenty years from now, this generation of developers will get blurry-eyed reminiscing about how fast and feature-packed VS Code was, and how Microsoft built the best GUI text editor of its time.

As for TUI editors, I love micro because it has mouse support and doesn’t make you memorize a spellbook just to move around.

SomeHacker44 3 days ago

We had Symbolics Genera, not mentioned, and way better than anything else mentioned here, IMO.

constantcrying 3 days ago

The arguments for using TUI IDEs are just very poor. Developers should not be relying on something as loaded with legacy bloat as the terminal to do development.

Zed has remote editing support and is open source. Resource consumption is a bizarre proposition, considering what abstractions the terminal has to be forced into to behave something like a normal window.

Really, TUIs are not very good. I get it, I use the terminal all the time and I will edit files with vim in it, but it is a pointless exercise to try to turn the terminal into something it was never meant to be and try to have it emulate something which would be trivial on a normal OS window. To be honest it makes me cringe when people talk about how much they perform tasks in the terminal, which would be much easier done in a graphical environment with proper tools.

  • jeroenhd 3 days ago

    One thing that's nearly impossible to replicate on modern systems is the extremely tight feedback loop these TUIs had. Keyboard latency was near non-existent while basic calculators these days will happily take a hundred milliseconds to process a key press.

    We don't need to go back to the 66MHz era, but it's embarrassing that programs running on a dozen computer cores all executing at several gigahertz feel less responsive than software written half a century ago. Sure, compiling half a gigabyte of source code now finishes before the end of the year, but I rarely compile more than a hundred or so new lines at a time, and the process of kickstarting the compiler takes much longer than the actual compilation.

    A terminal is no more than a rendering environment. With some workarounds (a custom renderer and input loop most likely), you can probably compile Zed to run in a FreeDOS in the same environment you use to run Turbo Pascal. I doubt you'll get the same responsiveness, though.

    • badsectoracula 3 days ago

      AFAIK Borland C++ (even on Windows) used to read the source from whatever editor buffers you had already in the IDE and since the compiler was part of the IDE, it cached various states in memory, which is why it was so fast (for a C/C++ compiler anyway - Delphi was much faster) even on slow hardware. Meanwhile Visual C++ (and modern IDEs) had you autosave the file to disk so the compiler, that was launched as a separate program (often for each file), could read it (and rebuild its internal state from scratch for every single file).

      • dapperdrake 3 days ago

        From what I remember researching, it really is this.

        Today, Python, Rlang, PHP, Java, and Lisp bring these features. But not C. Oh the irony.

        • somat 3 days ago

          C does as well; that is half the point of make. When building a large C project it will first create a bunch of object files from the source files, then link them into an executable. make then keeps track of which source files have changed and rebuilds only those object files. The first build is slow; subsequent builds are much faster, often only needing to recompile one file and then relink.

          At least that's the theory; in reality make has a lot of warts and writing a good, solid makefile is an art. Don't even get me started on the horrors of automake. Perhaps I just need to use it in one of my own projects, but as someone who primarily ports other people's code, I hate it with a passion. It is so much easier when a project just sticks with a hand-crafted makefile.

          For completeness: The other half of make is to implement the rest of the build process.

          • uecker 3 days ago

            I would say autoconf/automake is not really useful anymore and probably somebody should just establish a new and simplified standardized setup and Makefile for C projects.

            And yes, efficient separate and incremental compilation is a major advantage of C. I do not understand why people criticize this. It works beautifully. I also think it is good that the language and the build system are separate.

          • badsectoracula 3 days ago

            I think you are probably referring to something other than what I meant in my post above.

            Borland C++ had the compiler as part of the IDE (there was also a separate command-line version, but the compiler was also compiled into the IDE). This allowed the IDE to not spawn separate processes for each file nor even need to hit the disk - the compiler (which was already in RAM as part of the IDE's process) would read the source code from the editor's buffer (instead of a file, so again, no hitting the disk) and would also keep a bunch of other stuff in memory between builds instead of re-reading it.

            This approach allows the compiler to reuse data not only between builds but also between files of the same build. Meanwhile make is just a program launcher; the program - the compiler - needs to run for each file and load and parse everything it needs for every single source file it compiles, thus rebuilding and destroying its entire universe for each file separately. There is no reuse here - even when you use precompiled headers to speed up some things (which is something Borland C++ also supported, and it did speed things up even more on an already fast system), the compiler still needs to build and destroy that universe.

            It is not a coincidence that one of the ways to speed up compilation of large codebases nowadays is unity builds[0], which essentially combine multiple C/C++ files (the files need to be aware of it to avoid one file "polluting" the contents of another) to let multiple compilation units reuse/share compilation state (such as common header files) within a single compiler instance. E.g. it is a core feature of FASTbuild[1], which combines distributed builds, caching and unity builds.
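
            As a rough sketch of the idea (file names made up), a unity build is just one translation unit that includes the others, so the shared headers get parsed once per batch instead of once per file:

                /* unity.c - hypothetical "unity"/jumbo translation unit */
                #include "common.h"   /* big shared header, parsed a single time */

                #include "parser.c"   /* each .c must be written to tolerate      */
                #include "lexer.c"    /* sharing a translation unit with the rest */
                #include "codegen.c"

                /* built with a single compiler invocation, e.g.: cc -c unity.c -o unity.o */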

            Of course Borland C++'s approach wasn't perfect, as it had to run with limited memory too (so it still had to hit the disk at some point - note though that the Pascal compilers could do everything in memory, including even the final linking, so even the resulting program could remain in memory). Also, bugs in the compiler could linger: e.g. I remember having to restart Borland C++ Builder every few hours of use because the compiler had got confused about something and cached it in memory between builds. Also, Free Pascal's text mode IDE (shown in the article) has the Free Pascal compiler as part of the IDE itself, but in the last release (I think) there is a memory leak and the IDE's memory use keeps increasing little by little every time you build - something that wouldn't matter with a separate program (and most people use FPC as a separate program via Lazarus these days, which is most likely why nobody noticed the leak).

            [0] https://en.wikipedia.org/wiki/Unity_build

            [1] https://fastbuild.org/

    • constantcrying 3 days ago

      >One thing that's nearly impossible to replicate on modern systems is the extremely tight feedback loop these TUIs

      Why? Yes, VSCode is slow. But Zed and many neovim GUIs are extremely responsive. Why would achieving that be impossible, or even that hard? You "just" need software which is fast enough to render the correct output the frame after the input. In an age where gaming is already extremely latency sensitive, why would having a text editor with similar latency performance be so hard?

      Do you have any actual evidence that zed or neovide are suffering from latency problems? And why would putting a terminal in the middle help in any way in reducing that latency?

      • jeroenhd 3 days ago

        I'm not sure if you know what "terminal" means. I'm not talking about terminal emulators (the "terminal" program on macOS/Linux/Android/etc.) but actual, real terminals. The "terminal" is a text mode rendering mechanism built into computers of the terminal era. The closest modern operating systems come to it is the terminal-like environment you can get on Linux or the *BSDs by disabling the GUI, but even those merely emulate text mode; they still contain the stacks upon stacks of timers and layers necessary to process input from peripherals.

        The problem is the entire software stack between the keyboard and the display. From USB polling to driver loops and GPU callbacks, the whole stack has become incredibly asynchronous, making it trivial for computers to miss a frame boundary. Compared to DOS or similar environments, where applications basically took control of the entire CPU and whatever peripherals they knew how to access, there are millions of small points where inefficiencies can creep in. Compare that to the hardware interrupts and basic processor I/O earlier generations of computers used, where entered keys were in a CPU buffer before the operating system even knew what was happening.

        VSCode isn't even that slow, really. I don't find it to be any slower than Zed, for instance. Given the technology stack underneath VSCode, that's an impressive feat by the Microsoft programmers. But the kind of performance TUI programs of yore got for free just isn't available to user space applications anymore without digging into low-level input APIs and writing custom GPU shaders.

        In small part, CRTs running at 70Hz or 85Hz back in the mid-80s, as well as the much smoother display output of CRTs versus even modern LCDs, made for a much better typing experience.

        • anthk 3 days ago

          PS/2 keyboards and mice had direct interrupts through IRQs.

  • gldrk 3 days ago

    Terminals are full of legacy bloat, but TUIs don’t have to be. I don’t think Borland IDEs used ANSI.SYS.

    How is graphical vim even different from TUI vim? At least Emacs can render images.

    • bitwize 3 days ago

      Even 68k-based systems running at single-digit megahertz could run full-featured terminal emulation and have a lot of other stuff going on too. There's legacy stuff in terminals, but compared to all the other stuff you've got going (Wayland, GTK, frickin' browser engine) it isn't bloated.

  • dardeaup 3 days ago

    I'd bet you never seriously used Borland's Turbo Pascal for DOS versions 5.5 or 6.0. That IDE was extremely FAST. A lot of really good software was written in it back in the day (pre-internet).

  • bitwize 3 days ago

    I know, man.

    Unless your window full of text is GPU-accelerated, tear-free and composited, with raytraced syntax highlighting and AI-powered antialiasing, what is even the point?

    TUIs are great if you structure them around keyboard input. There's more of a learning curve, but people develop a muscle memory for them that lets them fly through operations. I think the utility of this is sorely underestimated, and it makes me think of my poor mom, whose career came to an end as she struggled with the new mouse-driven, web-enabled customer service software that replaced the old mainframe stuff.

    The late 80s/early 90s trend of building GUI-like TUIs was really more to get users on board with the standard conventions of GUIs at a time when they weren't yet ubiquitous (among PC users). Unifying the UI paradigms across traditional DOS and Windows apps, with standard mouse interactions, standard pull-down menus, and standard keyboard shortcuts was a good thing at the time. Today it's less useful. Things like Free Pascal have UIs like this mainly for nostalgia and consistency with the thing they're substituting for (Turbo Pascal).

    • constantcrying 3 days ago

      You are conflating a method of interaction with a method of drawing things to the screen. These are totally different things. Whether or not you have a keyboard-focused interface like vim has absolutely nothing to do with whether you are drawing graphics by sending escape codes to a terminal emulator to render the interface.

      Neovim and its frontends prove that if you remove terminal emulators the applications become better. The terminal emulator is just in the way.

      There is absolutely no reason to build that keyboard focused interface around the terminal. Just drop the terminal and keep the interface, just like neovim did.

      • bitwize 3 days ago

        Thats_just_like_your_opinion_man.gif

        • constantcrying 3 days ago

          This isn't reddit. Please do not do this; it is not only totally dishonest, it makes discussion impossible.

          What I said about the separation of user interaction from graphics is also not an opinion.

          • JSR_FDED 2 days ago

            You must be fun at parties.

  • tcoff91 3 days ago

    Neovim is my favorite editor and is a brilliant TUI.

    I think what TUIs get right is that they are optimized for use by the keyboard.

    I don’t care if they are a pain for devs to write vs OS APIs, they have the best keyboard control so I use them. I despise the mouse due to RSI issues in the past.

    • constantcrying 3 days ago

      Neovim instantly becomes a superior piece of software if you use any of the GUI frontends. If you use neovim inside a terminal you are just straight up using an inferior product, with fewer features and more problems. The terminal version is most likely slower as well, since you now also have the entire legacy terminal overhead.

      >I think what TUIs get right is that they are optimized for use by the keyboard.

      Neovim is just as much a GUI as a TUI. You can even use it as a backend for VSCode. Nothing about the keyboard controls has anything to do with this.

      • alfalfasprout 3 days ago

        What do you get using a GUI frontend? I'm genuinely curious. I have a pretty modern neovim setup and have never missed having a GUI.

        Heck, on modern terminals there's even pretty great mouse integration if you want.

      • prinny_ 3 days ago

        > If you use neovim inside a terminal you are just straight up using an inferior product, with less features and more problems

        I use neovim like that, and the selling point for me is that it's one less program that I have to install and learn, with the added (crucial) benefit that it doesn't update on its own, changing the UI and settings that I was used to.

        • constantcrying 3 days ago

          >benefit that it doesn't update on its own, changing UI and setting that I was used to.

          This exact thing remains true though: you are using the exact same neovim, but instead of it being wrapped inside a totally bizarre piece of legacy software, it is rendered inside a modern graphical frontend. It looks mostly the same, except it handles fonts better, it is independent of weird terminal quirks, and it is likely faster. There is no downside.

          And again, your point about using TUI stuff because of the input method or whatever is just false. Neovide has the exact same input method, yet has a complete GUI. Using the terminal makes no sense at all; it is the worst neovim experience there is.

        • rkomorn 3 days ago

          > it's 1 less program that I have to install

          It ships with your OS?

      • tcoff91 3 days ago

        Yeah but I also use a bunch of other stuff inside of Kitty so by using it in Kitty it composes well with the rest of my tools. Kitty windows and neovim splits integrate perfectly with smart splits. I even get images in the terminal and in Neovim.

  • marstall 3 days ago

    and yet zed is straight up slower than turbo c++

    • constantcrying 3 days ago

      What? "Slower" how? And why would dev experience not matter more?

      TUIs are bizarre legacy technology, full of dirty hacks to somewhat emulate features every other desktop has. Why would any developer use them, when superior alternatives, not based on this legacy technology, exist and are freely available?

      • jagged-chisel 3 days ago

        Lots of opinions in the thread without any substance to back them up. If you don't like TUIs and terminals, that's ok. But if you actually want to argue against them, let's hear a substantive argument. What specifically is so bad about the TUI?

        • constantcrying 3 days ago

          They are built on ancient technology and need an enormous array of hacks to emulate basic features, which are trivial to do in any modern GUI.

          The user experience is inconsistent, with features varying wildly between terminals, which is frustrating. It also makes customization difficult. E.g. in a TUI IDE you cannot have font settings. Shortcuts are also terminal-dependent; an IDE can only use the shortcuts the terminal isn't using itself.

          Something as basic as color is extremely hard to do right on a terminal. Where in a normal GUI you can give any element a simple RGB color, you cannot replicate that across TUIs. The same goes for text styling; the terminal decides what italic font it wants to use and the IDE cannot modify this.

          They are also very limited in graphical ability. Many features users expect in a GUI cannot be replicated, or can only be replicated poorly. E.g. modern data science IDEs feature inline graphics, such as plots. This is (almost) not replicable on a terminal. If you are using a profiler you might want plots, preferably with live data. Why arbitrarily limit what an IDE can do to some character grid?

          The terminal is just a very poor graphical abstraction. It arbitrarily limits what an IDE can do. Can you tell me why anybody would seriously try to use a terminal as an IDE? Terminal UIs are more complex, because they need to handle the bizarre underlying terminal, and they are often less responsive, since they rely on the terminal to be responsive. There might be some very marginal improvement in resource usage; do you think that is even relevant compared to the much better dev experience of a normal GUI?

          There absolutely is no real advantage of TUIs. And generally I have found people obsessing over them to be mostly less tech literate and wanting to "show off" how cool their computer skills are. All serious developers I have ever known used graphical dev tools.

          • nec4b 3 days ago

            As someone already mentioned before, I don't think you are talking about the same terminal as others are.

            >> need an enormous array of hacks to emulate basic features

            What are those hacks? As far as I can remember, TUIs ran faster on ancient hardware than anything else does on today's modern computers.

            • constantcrying 3 days ago

              >As someone already mentioned before, I don't think you are talking about the same terminal as others are.

              People know perfectly well that I am talking about the way in which a terminal emulator can be used to display 2D graphics. By utilizing specific escape sequences to draw arbitrary glyphs on the terminal grid.

              >What are those hacks.

              Everything is a hack. TUIs work by sending escape sequences, which the terminal emulator then interprets in some way, and if everything goes right you get 2D glyph-based graphics. Literally everything is a hack to turn something which functions like a character printer into arbitrary 2D glyphs. Actually look at how bad this whole thing is. Look at the ANSI escape sequences you need to make any of this work; does that look like a sane graphics API to you? Obviously not.
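
              For a flavour of what that looks like in practice, here is a tiny sketch in C (assuming an xterm-style emulator that understands these common sequences):

                  #include <stdio.h>

                  int main(void) {
                      printf("\x1b[2J");             /* clear the screen */
                      printf("\x1b[5;10H");          /* move the cursor to row 5, column 10 */
                      printf("\x1b[1;38;2;255;0;0m"  /* bold + 24-bit red foreground (truecolor, not universally supported) */
                             "hello\x1b[0m\n");      /* print, then reset all attributes */
                      return 0;
                  }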

              >As far as I can remember, TUIs ran faster on ancient hardware then anything else on today's modern computers.

              This is just delusional. Modern 2D graphics are extremely capable and deliver better performance in every metric.

              • zahlman 2 days ago

                > Look at the ANSI escape sequence you need to make any of this work, does that look like a sane graphics API to you? Obviously not.

                Of course it doesn't, because it isn't a graphics API. It's a styled text API.

                > Modern 2D graphics are extremely capable and deliver better performance in every metric.

                A big part of the complaint is https://danluu.com/keyboard-latency .

              • nec4b 3 days ago

                There are no escape sequences when running TUI apps in DOS. They have direct memory access to the video card.
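
                For reference, the DOS-era direct-to-video-memory approach looked roughly like this in 16-bit real-mode C (a sketch for a Borland/Turbo-style compiler; 'far' is not standard C):

                    /* colour text mode buffer lives at segment B800 */
                    unsigned char far *vram = (unsigned char far *)0xB8000000L;

                    void put_cell(int row, int col, char ch, unsigned char attr) {
                        unsigned int off = (row * 80 + col) * 2;  /* 80x25 grid, 2 bytes per cell */
                        vram[off]     = ch;                       /* character byte */
                        vram[off + 1] = attr;                     /* attribute byte: colours, blink */
                    }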

                >> This is just delusional.

                That is a bit uncalled for.

                • constantcrying 3 days ago

                  Did you just not read the rest of my post?

                  We are not talking about DOS, we are talking about "modern" TUIs you would use on a modern Linux/Windows/MacOS system.

                  I even made that explicit in my first paragraph.

                  • nec4b 3 days ago

                    I don't think others are talking about what you are angry about. I said that with the first reply and I'm not the only one saying it. Nobody is trying to take Zed or Neovim away from you.

                    By the way one of the most frequent modern TUI apps that I use is Midnight Commander. It's a very nice app, which I use mostly when I SSH into a remote machine to manage it. Is there a 2D accelerated GUI that can help me do the same?

                    • constantcrying 2 days ago

                      >Is there a 2D accelerated GUI that can help me do the same?

                      Of course. Just mount it through ssh and use whatever file manager you already have. It is very silly to switch tools just because the machine is somewhere else.

                      Switching tools just to accommodate the machine being remote is just bizarre to me. You even said that you used mc mostly for remote machines. What is the point of that? Now you have to use at least two tools which do the exact same thing, except you only use one when the system is remote? Does that not seem like a total waste? It would be one thing if you said that mc is what you always used, but that is not the case, you actively switch tools just to accommodate the machine being remote. Why? Do you think that is reasonable at all, when something as simple as just mounting over ssh exists?
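
                      For what it's worth, mounting over ssh is typically just sshfs (a sketch; it assumes sshfs/FUSE is installed, and the host and paths here are made up):

                          sshfs user@remote-host:/srv/data /mnt/remote   # mount the remote path locally
                          # ...browse /mnt/remote with any local file manager or editor...
                          fusermount -u /mnt/remote                      # unmount when done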

                      • nec4b 2 days ago

                        You seem to have very strong feelings when other people have different preferences than you. Why would you use words like bizarre, delusional and total waste when discussing such trivial matters?

                        >> Why? Do you think that is reasonable at all, when something as simple as just mounting over ssh exists?

                        In short, yes. I use it mostly on remote machines and on my desktop Linux machine. Before that I used Norton Commander on DOS. I don't remote only from Linux machines but also from a Windows laptop. It is much quicker and easier to simply run "mc" in an ssh session when I need it than to mount the drive and then run another application on the local machine.

                        • jagged-chisel 2 days ago

                          And a remote command over SSH isn’t going to incur the network delay multiplied by every file you want to touch.

      • marstall 2 days ago

        i'm talking about turbo c++ of the 90s. it was a completely integrated experience. no screen switching was key. everything contained in one app. worth all the nostalgia but you kind of had to be there! and yep it was faster than zed.

  • lproven 2 days ago

    This is just a bundle of your opinions -- to which you're entitled, even if they're totally wrong -- but downthread I see you attempt to defend them and claim they are objective fact, which is simply ridiculous.

    TUIs are a superb tool. They were when they were first standardised in late-era DOS apps in the late 1980s and early 1990s, and they still have a place today.

    Here are some primary reasons you have not considered in your rant:

    * UI standards and design

    TUIs bring the sensible, designed-by-experts model of UI construction and human-computer interface from the world of GUIs into text-only environments such as the terminal, remote SSH connections, and so on.

    For example, they let one set options using a form represented in a dialog box, by Tabbing back and forth and selecting with Space or entering values, without trying to compose vast cryptic command lines.

    This is not just me; this is the stuff of jokes. This is objective and repeatable.

    https://xkcd.com/1168/

    https://xkcd.com/1597/

    * Harmonious design

    A well-done TUI lets users use the same familiar UI both in a GUI and at the console. This is the actively beneficial flipside of the trivial cosmetics you are advocating: you praise a text-mode app implemented in a GUI because it can do more. That is a poor deal; a ground-up native GUI app can do much more still.

    But TUIs bring the advantages of familiarity with GUIs to situations where a GUI is unavailable.

    * Common UI

    The apps you cite as positive examples are markedly poor at following industry-standard UI conventions, which suggests to me that you are ignorant that there are industry standard UI conventions. Perhaps you are too young. That is no crime, but it does not mean I must forgive ignorance.

    Nonetheless, they exist, and hundreds of millions of people use them.

    https://en.wikipedia.org/wiki/IBM_Common_User_Access

    TUIs allow familiar UIs to be used even when a GUI or graphics at all are unavailable.

    TUIs are not just about menus; they also define a whole set of hotkeys and so on which allow skilled users to navigate without a pointing device.

    * Disabilities and inaccessibility

    Presumably you are young and able-bodied. Many are not.

    GUIs with good keyboard controls are entirely navigable by blind or partially-sighted users who cannot use pointing devices. They are also useful for those with motor disabilities that preclude pointing and clicking.

    Millions use these, not from choice, from need.

    But because those tools are there, that means that they can also use TUI apps which share the UI.

    And the fact that this common UI exists for keyboard warriors like myself, who actively prefer a keyboard-centric UI, means that the benefits of a11y carry across and remain benefits for people who do not need a11y assistance.

    =====

    That's 4 reasons, intertwined, that you showed no sign of having considered. IMHO any 1 of the 4 is compelling on its own but combined any 2 would be inescapable and all of them together, for me, completely rebut and refute your argument.

trashface 3 days ago

By this point in college 30 years ago I had switched to mostly emacs, and was struggling with it - our program was unix based (solaris) with gcc. But a few years before that I was using turbo pascal, which was indeed a very fast ide, partially by virtue of how low latency hardware was back then.

I like these programs, mostly for that sweet low latency which is just gone today, but I wouldn't romanticize them as dev experiences. To experience it you can download Free Pascal today and use their IDE, which is just like Turbo Pascal (it may even be based on Turbo Pascal?). It's pretty clunky compared to what you get today, although the debugger works, which is more than you can say for the majority of languages today.

never_inline 2 days ago

I learned programming in turbo C++ - not because I grew up in 1990s, but because Indian education system is stuck in 1990s.

> So the question I want to part with is: have we advanced much in 30 years? Modern IDEs have some better refactoring tools, better features, and support more languages, but fundamentally… they haven’t changed much.

I can't give up any of: intellisense, type checking as I type, semantically meaningful navigation.

paradox460 16 hours ago

I wish NeWS had taken off. Don Hopkins' videos on it and psiber are amazing.

andsoitis 3 days ago

Delphi - fantastic, modern RAD IDE. Borland heritage.

Can build native apps for Windows, Linux, macOS, iOS, and Android.

https://www.embarcadero.com/products/delphi

  • dardeaup 3 days ago

    Delphi is still very impressive. However, they missed out on a much greater opportunity. Part of Delphi's crown jewels is VCL which can only be used on Windows. If you use Delphi for an OS other than Windows you have to use FireMonkey/FMX. Lazarus has LCL which is VERY similar to VCL, but LCL on Lazarus is not limited to Windows. One can write a LCL application and it works the same on Windows, macOS, and Linux. If Delphi had extended VCL to macOS and Linux it would have become much more valuable. Just my $0.02.

    • andsoitis 3 days ago

      Don’t disagree that VCL across all the platforms would be a game changer.

      However, the quality and reliability of the Delphi experience together with mobile support overcome the VCL/FMX trade off in my books.

      • nobleach 3 days ago

        I had high hopes for Kylix back around the turn of the millennium. At that time my company was looking at moving an organization with field agents to a full Linux-based system. Our options were: 1. Keep the existing CA Clipper accounts receivable/accounts payable apps and run them on emulated DOS. 2. Attempt to leverage the Harbour language (a CA Clipper-compatible web-based thing). 3. Rewrite the system in Delphi/Kylix. We actually got fairly far with Kylix and I'll always be a fan of Delphi. In the end a pure web-based rewrite won out over all those original options. I feel bad for whoever took over that old PHP4 stuff!

    • spwa4 3 days ago

      VCL was ported to linux in the "Kylix" product, for both Pascal and C++. It was non-free and didn't see any uptake really.

      • dardeaup 3 days ago

        Granted, I never used Kylix, but it seems that it had all sorts of problems when it was first released. I don't remember, was Kylix available for Mac?

        • microtonal 3 days ago

          As far as I recall VCL was ported, but the IDE itself ran under WINE (as it was written back then) and it was not very stable.

          I just googled and Wikipedia seems to confirm my memory: https://en.wikipedia.org/wiki/Borland_Kylix#Features

          • int_19h 3 days ago

            Kylix apps in general looked like Windows apps, which is to say, rather out of place on a Linux desktop.

      • badsectoracula 3 days ago

        IIRC it wasn't VCL but another framework like VCL that was built on Qt.

        LCL (Lazarus' equivalent of VCL) took another approach where the base stuff is very Windows-y (due to the VCL heritage) but the backends have to essentially implement not only the backend-specific (Gtk, Qt, etc) widget functionality but also a small subset of the Windows API.

        While this makes porting harder for the Lazarus developers, it makes it easier to port stuff between OSes and even port stuff from Delphi to Lazarus (some developers can also use both Delphi and Lazarus - e.g. AFAIK Total Commander uses Delphi for the 32bit builds and Lazarus for the 64bit builds).

        • OCTAGRAM a day ago

          It was called CLX, forms were stored in *.xfm, and it was based on Qt

RcouF1uZ4gsC 3 days ago

To me VB 6 was the height of RAD IDEs

You could throw together a CRUD app in under an hour interactively.

  • rbanffy 3 days ago

    VB was the GUI equivalent of Dataflex - you could design the screen and it would automagically create the data structures under it. I also remember, from the same period, Mantis (from Cincom Systems) that did the same for 3270 terminals and IBM mainframes.

    I often say Dataflex is Ruby on Rails for the VT100.

ta12653421 2 days ago

ha, he even lists RHIDE!! :-D

The only free solution back then that had free 32-bit protected mode (DPMI) built in. And a VESA linear frame buffer!!!

Good ol days :-D

garganzol 3 days ago

Far Manager on Windows, and Midnight Commander on Unix, work wonders for terminal-based development nowadays. Not only do they put OS commands at the tips of your fingers, they also let you navigate freely through the file system structure of a project while viewing/editing files with built-in or external editors.

The UI/UX of those tools is pretty close to the Borland IDEs. They have a steep learning curve, though at least 10x easier than vi/emacs.

alexshendi 3 days ago

My favourite is Texas Instruments PC-Scheme. Complete with Emacs-like editor. You could compile and evaluate regions in the editor. It is amazing what you can do in 2MB or even 640K.

matt7340 3 days ago

Great nostalgia! I fondly remember QuickBasic, and how excited I was to compile my BASIC code. And the rarely mentioned gem I thought was amazing at the time: Visual Basic for DOS!

justinhj 3 days ago

These DOS-style TUIs are alive and well in commercial and industrial settings. Most often you see them in fast food order trackers, where their simple clarity stands out.

dardeaup 3 days ago

A few others not mentioned:

DOS: FoxPro 2.x, dBASE III Plus, dBASE IV; Turbo Pascal 5.5/6.0 was probably the pinnacle for me

OS/2: Watcom VX-REXX - extremely powerful and productive

Windows: Delphi before .NET

  • niutech 2 days ago

    There is also Open Watcom for FreeDOS.

rimmontrieu 3 days ago

Thanks for the article, this brings back so many good memories and so much nostalgia. Turbo Pascal was my first programming language a couple of decades ago; at that time it felt like a superpower to be able to tell the computer what to do just by learning the language and hitting the Compile button.

The IDE was also so clean and intuitive, which was perfect for new programmers.

biql 2 days ago

I had to use Borland when studying but never really liked it. Later I tried Notepad+ with console for the first time and everything clicked. There is a console for compiling and running and there is a text editor for editing. It was the flow that made sense.

submeta 3 days ago

Ahh, memories. I started hacking in Emacs on my Amiga 2000 in 1988. And later in Turbo Pascal in 1991ish.

When I saw Visual Studio years later, or Visual Basic, these IDEs were doing so much more, but I‘d lose the ability to fully control the bare text. These MS tools wouldn’t allow me to write my code in my favourite text editor and version it. So they were nice and a curse at the same time.

astatine 3 days ago

Nostalgic! Turbo C was my preferred IDE over many years in the late 80s to mid 90s. What an amazing tool! Those key bindings, used in so many other IDEs since, are burned into muscle memory. Even after decades of not using them, they bring a smile back. CodeWarrior, the debugger, helped me understand what happens when you run a program more than literally anything else I read or was taught.

manithree 3 days ago

The Borland Turbos had a much bigger market impact, but I was blown away by the Zortech C++ IDE in 1988. I don't remember why any more, but I was very dismissive of other TUI IDEs of the day after using Zortech. Even in the early 1990's when I was professionally using PWB (Programmer's Waste Basket) I still felt Zortech's IDE was superior.

zzo38computer 3 days ago

MegaZeux was originally a DOS program, but now has emulated PC text mode (with some enhancements, such as SMZX and unbound sprites) and continues to use a TUI, although it is not as good as Microsoft and Borland (e.g. it does not have ALT+letters in a menu to select with different colour letters, etc).

mickeyp 3 days ago

The knocks against Emacs feel unwarranted. It has plenty of colour; it has mouse support, even in the terminal, but not all terminals support it, so it's optional. It also runs in a GUI with, you know, image support and whatnot.

You can rail against its defaults, but do not make misleading claims.

  • internet_points 3 days ago

    Yeah, the menu bar thing just makes no sense. Here's what a completely uncustomized emacs looks like: https://i.imgur.com/0vFsd3p.png

    If you for whatever reason absolutely need to run it in the terminal, then you'll have to either learn that F10 toggles the menu bar, but then it still looks like a real menu bar that you can navigate with the arrows and enter: https://i.imgur.com/ETA2Qhs.png (or you can `M-x xterm-mouse-mode` to use the mouse in the terminal).

    (That said, I'm sure the out of the box experience with Borland was quite a bit better back in the day, if you only needed Pascal or C++ support. And emacs really could do with a better default-theme; e.g. simply changing to the built-in modus-vivendi-tinted and it looks like https://i.imgur.com/lRAWzJK.png instead. Doesn't help with the tool-bar icons from 1999 or whatever though)

    • lproven 2 days ago

      > Here's what a completely uncustomized emacs looks like

      I think that the key thing you're missing here is that the contents of the menu matter as well as the visual presentation.

      Emacs's menus, in my (very) limited experience, expose a very strange hodgepodge of Emacs concepts and terms in a very odd grouping that presumably makes some kind of sense for Emacs folks.

      I am not an Emacs person. I use CUA interfaces everywhere. This determines and specifies the names of the menus, which ones have (...) meaning that they lead to a dialog box, which ones have (->) which means they lead to a submenu, and they have standard options in standard places.

      The Emacs ones are just... weird random noise, in a random layout, that makes no sense to me, and the few parts that are vaguely recognisable make little to no sense.

      It's not just the presentation. Users of menu-driven tools need the presentation and the content and the organisation of the content.

  • frou_dh 3 days ago

    Also, the set of top-level things in the menu bar is not static. So even if you cannot directly interact with it for some reason, it gives you a hint that new things are possible in particular contexts. (Same goes for the 'tool-bar' that's distinct from the menu-bar)

  • rbanffy 3 days ago

    I wonder if it's still possible to run Guy Steele's era EMACS.

sys_64738 3 days ago

I first used an IDE back in 1989 with MicroFocus COBOL from 1983. 30 years seems relatively new.

immibis 3 days ago

There's a recurring pattern that everything is worse now. There's a happy medium - the really old stuff is too basic, while the really new stuff is also basic. But why is the new stuff worse than the moderately old stuff?

shadowgovt 3 days ago

Oh, this reminds me: I should turn off the menu bar in emacs. Not like I ever use it.

majormajor 3 days ago

Lotta focus on TurboPascal vs Emacs or whatnot at the console level, but you couldn't give TurboPascal to a complete newbie any more than you could give them IntelliJ. The mouse is an advantage here, not a disadvantage.

  • Narishma 2 days ago

    The mouse worked fine in the Turbo IDEs.

KronisLV 2 days ago

> Both Free Pascal and QB64 are maintained and under relatively-active development, with their most recent releases in 2021… but they are mostly ignored because they expose arcane languages that most people have no interest in these days.

Touché. Personally I think Pascal (the FPC/Lazarus variety) was pretty cool, straight up one of the best ways to easily do cross platform GUI apps, something of that old RAD fame: https://www.lazarus-ide.org/index.php

I wish someone would prove me wrong, what are the best modern cross-platform options for native GUI?

At the same time, for everything else in similar circumstances (statically compiled executables, relatively safe to code and use), Go has replaced it for me, in great part due both to the ergonomics of the language and to how batteries-included the standard library is.

prmoustache 3 days ago

The author seems to be conflating text editors and IDEs. I wouldn't put Emacs in either of these categories; it is like an operating system without the kernel part, but I am not exactly sure what to call it.

OCTAGRAM a day ago

Such recollections are incomplete without fake TUIs: TUIs that are seemingly text, but in graphics mode. Norton Utilities ran in text mode with altered fonts and had nice UI elements. Acronis OS Selector 5 could run in a multitude of different modes: ordinary text mode with pseudographics; Norton Utilities-like text mode with altered fonts; VGA graphics mode; SVGA graphics mode. But whatever the mode, it drew its UI on a grid. The mouse cursor was the only thing not bound to the grid, but when windows were dragged by mouse, they jumped in discrete steps, staying grid-aligned at all times. I dug into it. The binaries are Watcom C++, and Watcom can remote debug them over a COM port. No sources, of course, but I can trace the CPU and watch memory. So far I know that it indeed has a concept of a text buffer, and each cell is 4 bytes. These bytes encode background glyphs and colors. Text can sit over a window title, so the window title skin is encoded in a background glyph. Text can sit over a tab title. Text can sit over a button. And there are icons of operating systems in the list, and icons on toolbar buttons.

Acronis OS Selector 5 doesn't feature much UI. There is FDISK, advanced enough for its time. It can resize FAT partitions, and there is UI for adjusting the size and other settings. There is an HTML-based help system with hyperlinks. And there is an editor for AUTOEXEC.BAT and CONFIG.SYS; Acronis OS Selector 5 is about switching OSes, and specific DOS or Windows 9x boot configurations are also "OSes", so editing the boot files is helpful. But apart from that there is almost nothing to do in this OS Selector.

I wish something more everyday had been coded in such a UI, but nope. And Acronis dropped that style. I don't know what happened with Selector 6 and 7, but it's as if 8 came right after 5, and the 8 UI is not grid-aligned anymore.

fred_is_fred 3 days ago

I wrote my first program using Borland Turbo Pascal probably around 1993 at my high school. I think those systems ran DOS 3.0 or maybe 5.0? It was all trivial stuff but I found it to be very helpful to debug issues.

jbverschoor 3 days ago

30 years ago we had Visual C++, with an amazing debugger. Visual J++, Visual Basic. Normal Basic. All with amazing debuggers and/or "REPL"s and compile times of zero or next to nothing.

haolez 3 days ago

The fun thing is that, if something like Claude Code can be competitive with modern IDEs, these ancient IDEs could become competitive as well with some kind of integration with AI workflows :)

mikewarot 3 days ago

I was expecting to see the most productive IDEs of all time, Visual Basic 6 and/or Borland Delphi, but we're just ahead of those in this article.

While it was truly amazing that Borland managed to stuff a full text editor into a TSR under MS-DOS, and every new version of Turbo Pascal was faster and had more features, it all culminated somewhere around Delphi for me, and Visual Basic 6 for almost everyone else.

Then the world ended... Anders Hejlsberg was lost to Microsoft, and everyone went collectively crazy in at least two orthogonal ways.

First there was the obsession with C++ as "higher level" than Pascal and the view that it was for "adults", which was delusional. C++ generated a f*ckton more boilerplate and was more brittle for the same functionality, at least when generating a GUI program.

Then there was Microsoft's obsession with .NET, which they never recovered from. They crammed all the bloat of an interpreter into everything imaginable, even the operating system. You were always having to get the latest .NET libraries to make things work. They destroyed Visual Basic over this, and it never recovered.

qiller 3 days ago

Loved Borland IDE at the time. I still miss Ctrl-KB/KK (IIRC?) style selections from time to time.

These days Far Manager (via far2l) or MC kind of scratch the itch for quick TUI edits.

  • lproven 2 days ago

    As other commenters have noted, the name for the Ctrl+Kx UI is the WordStar keystrokes.

p0w3n3d 3 days ago

We haven't lost. The best functionality of an IDE is navigation (find implementations, calls, etc.) and refactoring. Other functions are merely enhanced notepad.exe functionality.

rajkhare05 3 days ago

I remember using Borland Turbo for C/C++ classes in my school and college days. In fact, it is still being used in most of the colleges in India even now. What nostalgia!

exceldrawing 3 days ago

Visual Cafe (I know they got a bad rep after Symantec bought them, but I felt the bugs weren't major enough to make me stop using it back then).

auggierose 3 days ago

Oh wow. Slightly surprised about the emotions these pictures evoke! Not that I would want to program with that today, but I sure had fun with it back then.

newswasboring 3 days ago

I grew up on the borland Turbo series. Learned C then C++ on it. Such nostalgia.

I was wondering, is there a way to get VS code to look like this? Maybe neoVim?

timpera 3 days ago

Substack needs to stop blocking VPN and data center IP addresses. I can't read the article because I'm on a train's WiFi…

bvan 3 days ago

Back when you could focus on the code you wrote and not worry about the overly cluttered distractions of the likes of VSCode.

fragmede 3 days ago

No mention of Visual Studio, as distinct from Visual Studio Code. No mention of JetBrains, PyCharm. The author didn't mention that Borland cost $99.95 in 1987, back in the day. Now, Visual Studio costs $499.92/mo. Yeah, the free versions aren't as fully featured or as well integrated as the thing you paid for, what else is new?

shevy-java 3 days ago

I honestly don't understand why we lose IDEs and editors.

I understand that software requires people maintaining it, but my point is more that I still don't understand the why.

We have some functionality, a lot of which could be re-used in editors and IDEs. But people rarely share stuff. They like to re-implement things. Again and again and again. This is not logical to me.

I'd like to have one editor that does EVERYTHING, but in a modular way so people decide what that editor can do. People could then just maintain one cohesive, feature-rich code base rather than each one duplicating what is already available in another IDE/editor.

msephton 2 days ago

Needs Macintosh System 7 IDE representation.

joe91 3 days ago

No love for MultiEdit? :(

  • alexott 2 days ago

    I was also wondering why it wasn’t mentioned. At some point I did all my programming in it, compiling via the command line and switching to the Borland IDEs only for debugging….

a-dub 3 days ago

good memories of the borland stuff! (especially super duper high resolution 43 line mode!), it was where i actually cut my teeth on c/c++ before moving to coherent then linux!

foofoo12 3 days ago

Could we or do we have anything like that for the modern web?

fnord77 3 days ago

30 years ago I was using XEmacs version 19 something

andai 3 days ago

I enjoyed this. Make sure to read the comments too!

dusted 3 days ago

The issues mentioned in the article are generally applicable. While modern software does manage to make life marginally easier and simpler in some very specific ways, it generally manages to do so while also looking like absolute garbage, consuming hundreds of megabytes of memory for tasks requiring kilobytes, and burning through multiple orders of magnitude more instructions than are actually needed to do the work.

Yes, Visual Basic was indeed the pinnacle, and today it is Qt, for what it's worth. But no, let's go write HTML and CSS, and when the Stockholm syndrome gets us bad enough, why not some React or Angular to get the party of pain going again?

gcanyon 3 days ago

Looking at Windows UIs from the '80s -- did Microsoft just not know the phone numbers of any graphic designers? Or did they make it ugly on purpose?

  • layer8 3 days ago

    There are no Windows UI screenshots in the article.

    • gcanyon 3 days ago

      Fair —- but are you claiming either:

      1. The DOS screenshots in the article are in any way reflective of a designer’s input

      2. That Windows was a visually pleasing design?

      • layer8 3 days ago

        The DOS screenshots are reflective of the PC video hardware of the time. Text mode had a fixed 16-color palette [0] at best, the IBM font including graphics characters was preset, and the aspect ratio of the characters wasn't fixed (the screenshots in the article are 80x25, but I used 80x40 or 80x50, with correspondingly more square text cells). However, the screenshots aren't quite representative of how things looked on a CRT monitor; it looked more vibrant and organic, if that makes sense.

        Personally I didn’t find Windows visually pleasing before Windows 95, but much of that can again be attributed to the PC video hardware limitations of the time.

        [0] https://en.wikipedia.org/wiki/Color_Graphics_Adapter#Color_p...

        • gcanyon 3 days ago

          Thanks for the trip down memory lane! I worked with DOS in… WordPerfect? I don’t remember for sure which word processing application it was. But I honestly don’t remember ever seeing anything remotely “graphic“ in my DOS days.

          • lproven 2 days ago

            Oh, there was.

            WordPerfect 6 had a full GUI mode with a very vaguely Win3-like GUI implemented in DOS.

            Borland Quattro Pro had one too.

            Microsoft Word could be flipped in and out of it: in it, you got WYSIWYG bold, italic, underline etc, and more lines on screen, but otherwise the UI remained much the same.

            PowerQuest imitated Win95 so well in PartitionMagic it was pixel-perfect.

            It was entirely a thing in the late DOS era. It let DOS apps look competitive, and yet demand far lower system requirements and run on much older machines than one needed for Windows.

            • gcanyon 2 days ago

              My time with WP ended with version 3? maaaybe 4

              • lproven a day ago

                Are you sure? Your earlier comments expressed uncertainty if it was WordPerfect at all.

                WordPerfect on the PC was a very niche app before version 4.2 which was the big hit. I knew a tiny handful of places that had copies of the older version but they weren't running it.

                It is on the edge of before my time -- I started my first job in late 1988 -- but before WP 4.2, WordStar still dominated, with some specialist users running DisplayWrite or MultiMate. I'm not American and I think the US market was different with shareware taken slightly more seriously, so more presence of PC Write, and maybe XyWrite or other tools little seen in the British Isles.

      • lproven 2 days ago

        > That Windows was a visually pleasing design?

        For a while, it was.

        Windows 2 was kinda ugly.

        https://guidebookgallery.org/screenshots/win203

        Windows 3/3.1/3.11 were fine.

        https://guidebookgallery.org/screenshots/win30

        Muted, boring, but you could look at it all day. And we did.

        95 improved it.

        https://guidebookgallery.org/screenshots/win95osr2

        Tasteful greys, spot colour.

        NT 4 improved that a bit more.

        https://guidebookgallery.org/screenshots/winnt40

        Categorised Start menu, for instance. But nearly identical.

        95/NT4 were visibly inspired by NeXTstep, IMHO the most beautiful GUI ever written.

        Then it all started to go a bit wrong. The first pebbles bouncing down the mountainside presaging a vast avalanche.

        Windows 98.

        https://guidebookgallery.org/screenshots/win98

        IE4 built in so Microsoft didn't get broken up by the US DOJ. Explorer rendered local content via HTML. Ugly extra toolbars. Some floating, some embedded in the task bar. Ugly gradients and blends in window title bars.

        Cheap and plastic and tacky.

        But that is around the time that media and gaming PCs went mainstream, home internet use (often over dialup) went mainstream, and the alternatives died out (Amiga, ST & GEM, Arm & RISC OS) or very nearly died (classic MacOS, NeXT merger, Rhapsody).

        So it's what many saw first and loved and remembered.

        Result, people write entire new OSes designed in affectionate homage:

        https://serenityos.org/

        Look at the toolbars. Look at the textures in the title bars. This isn't Win9x, this is specifically Win98.

        https://www.digibarn.com/collections/screenshots/KDE%201-x/i...

        Specifically:

        https://www.digibarn.com/collections/screenshots/KDE%201-x/h...

        <- textured title bars

        https://www.digibarn.com/collections/screenshots/KDE%201-x/t...

        <- gradients in title bars

        https://www.digibarn.com/collections/screenshots/KDE%201-x/m...

        <- Windows-style colour schemes

        KDE started out as a reproduction of Windows 98/98SE by a team who didn't realise that what they were looking at was WordPerfect 5.x instead of WordPerfect 4.x -- as the late great Guy Kewney put it:

        "WordPerfect 4.2 was a bicycle. A great bicycle. Everyone agreed it was a great bicycle, just about the best. So what Wordperfect did was, they put together a committee, looked at the market, and said: 'what we'll do is, we'll put 11 more wheels on it'."

        Win98 is Win95 festooned with pointless needless Internet widgetry because the DOJ was about to split MS into separate apps and OS companies, because MS drove Netscape into bankruptcy by bundling IE free of charge with Windows.

        Strip all that junk off and what's left underneath is a better UI. But the German kids writing their "Kool Desktop Environment" didn't realise.

        After that came WinME and Windows 2000, which turned down the bling a bit as the lawsuit was over, but it was only a blip.

        Then came XP with its "Fischer-Price" themes.

        Then Vista with gratuitous transparency everywhere because GDI.EXE had been ripped out and replaced with a compositor and that's no fun if you don't use some 3D features like see-through stuff.

        Then 7 toned that down a bit and everyone loved it.

        Then the universally detested Win8, and then that was toned down and the Start menu put back for Win10, which is roughly what UKUI and Deepin copied in China, or Wubuntu in the West.

        Then Win11, as copied by AnduinOS and a few others, which for this long-term Windows user is the worst release ever. I can't even have a vertical taskbar any more. It's abhorrent.

ttul 3 days ago

My view of the PC dev era was through the lens of a kid growing up in the 1980s with a dad who programmed for a living. My dad was a big fan of the Borland IDEs starting with Turbo Pascal and then moving on to the world of C and C++ by the late-1980s. As a kid, my friend and I spent hundreds of hours in Quick Basic’s TUI - always trying to remake Super Mario Bros but never coming close to succeeding.

These early IDEs were fantastic at their job and worked so well given the constraints of the DOS environment of the time. It’s a shame that Borland the company eventually faded to black in 2015, but that’s how these things go. I wonder where all the geniuses behind the Borland IDEs ended up.

fithisux 3 days ago

I liked RHide a lot 23 years ago

Chinjut 2 days ago

The title needs to be updated. These are IDEs we had 32 years ago.

  • Chinjut 2 days ago

    This was titled "The IDEs we had 30 years ago..." in 2023. It is now 2025.

  • ej_campbell 2 days ago

    32 years ago was 1993. Those IDE's were from the 80's.

    • Narishma 2 days ago

      They were still very popular in 1993.

29athrowaway 3 days ago

The experiences so far that have been left behind:

- A RAD TUI like Microsoft Visual Basic 1.0 for MS-DOS

- DolDoc from TempleOS, with diagrams and sprites

- Clipper, DBase, FoxPro, etc

Sophira 3 days ago

The article itself is a nice look at the state of editors and IDEs compared to those of yesteryear... but the site itself restyles my scroll bar to be thinner and less usable, which kind of spoils the experience.

I hate it when sites do this. I don't want my window decorations to be restyled in the name of aesthetics. I need them to be usable.

csmpltn 3 days ago

Crimson Editor.

  • cwnyth 3 days ago

    A name I haven't heard in nearly 20 years. This was my go-to editor on Windows. Crimson Editor and Dev-C++ were great free tools.

sph 3 days ago

Now that CLI tools are in fashion again... has nobody thought to recreate a modern version of Turbo C++/Pascal?

I know there's Emacs and vim, but they're far too programmable and bloated compared to the elegance of TC++, which did one job, and one job only, very well. Also, despite being an Emacs power user at this point, it's never going to be as ergonomic and well thought out with its arcane chords, while TC++ conveniently shows all possible keybinds throughout its UI.

  • coolcoder613 3 days ago

    Have you seen tvision[0] and turbo[1]?

    [0] https://github.com/magiblot/tvision [1] https://github.com/magiblot/turbo

  • AlexeyBrin 3 days ago

    FreePascal has a text mode IDE similar to the old Turbo Pascal 7.0 that you can use in a Terminal. So you can use a modern Pascal compiler from it.

  • Brian_K_White 3 days ago

    A hundred years ago I found something called XWPE and managed to build it for sco osr5, and then pretty much never used it for real.

    (That doesn't imply I went with VS or similar fat ide, just that I didn't end up using xwpe for real. I tried code::blocks for a while but mostly just use geany or a plain editor.)

  • Dwedit 3 days ago

    DOS programs ran in Text Mode, and directly interacted with the keyboard hardware.

    Linux Terminal programs are running in an emulated terminal, and are bound by keyboard input restrictions that DOS programs did not have.
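
    To illustrate the gap: the closest a Unix terminal program gets is "raw mode" via termios, which still only delivers a byte stream - no scan codes, no key-up events, no bare modifier state. A minimal sketch in C:

        #include <stdio.h>
        #include <termios.h>
        #include <unistd.h>

        int main(void) {
            struct termios old, raw;
            tcgetattr(STDIN_FILENO, &old);          /* save current terminal settings */
            raw = old;
            cfmakeraw(&raw);                        /* disable line buffering, echo, signal keys */
            tcsetattr(STDIN_FILENO, TCSANOW, &raw);

            char c;
            if (read(STDIN_FILENO, &c, 1) == 1)     /* still just bytes: no key-up, no scan codes */
                printf("got byte 0x%02x\r\n", (unsigned char)c);

            tcsetattr(STDIN_FILENO, TCSANOW, &old); /* restore the terminal */
            return 0;
        }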

salvesefu 3 days ago

School-aged 1995 coder self with an FPU-less Mac would like a word about what we lost/gained (no C compilers available at the time).

The need-for-TUIs argument is vague outside of muscle memory. Lots of beautiful poetry though.

That age of computing the author is romanticizing was expensive and corporate fed stupid (RIP Mr Bollenbach my hs cs teacher who gave us weekly insider tech reports).

I feel like tui folk need their stack/os/integrated environment...oh wait. Nevermind.

"Is FreeDos the Moderate Libertarian TempleOS?"

PeterStuer 2 days ago

The Think Pascal IDE was on another level for its time.

geldedus 2 days ago

"Old man yelling at modern IDEs"

codezero 3 days ago

Fun fact not mentioned in the article is that nano descends from pico which emerged from being the default editor in the email client pine.

pragmatic 3 days ago

TUIs sucked and they still suck.

Programmers are trying to bring them back bc nostalgia I guess?

I floated the idea of TUIs to our data engineering team and got very negative responses. (My nostalgia for undergrad turbo pascal TUI I guess lol)

  • 1313ed01 3 days ago

    They are undeniably programmer-friendly though, no matter how hated by users. Much easier to do things when you are limited to just a grid of fixed size characters rather than the bizarre complexities of modern GUIs.

  • loloquwowndueo 3 days ago

    Care to elaborate as to why they suck?

    • bob1029 3 days ago

      It's not that TUIs suck in terms of their inherent capabilities. It's that they're generally a miserable tool for the job, especially if it's a big one.

      TUIs are like shovels. A perfectly rational tool for doing a little bit of digging. Visual Studio 2022 is like Bagger 293.

  • tcoff91 3 days ago

    TUIs are great! So fast and efficient to use and accomplish tasks in.

  • dardeaup 3 days ago

    Some do and some don't. Have you ever used any to develop an application?

    I suppose a lot of it is also relative. When I started with TUIs decades ago, we didn't have too many options. Turbo Pascal 5.5 or 6.0 was extremely nice to use back in the day.

  • realharo 3 days ago

    I think TUIs mostly suck for IDEs, but some tools like k9s or htop are nice.

    • tstenner 3 days ago

      Even k9s would profit enormously from detachable dialogs. Just let me do something without losing my current log view.

  • nec4b 3 days ago

    >> TUIs sucked

    Compared to what was available at that time?

neuroelectron 3 days ago

I feel like we should go back to ASCII for programming languages. Does your IDE really need emojis? Parsing Unicode is intractable.

  • zzo38computer 3 days ago

    I agree, and I still do use ASCII for (most) programming languages. I do not use emojis and most programs I wrote do not use Unicode (although often there is a use to go beyond ASCII, but even then, I will use better character sets rather than Unicode).