The Death of Utilitarian Programming

30 points by pyeri 2 days ago

Utilitarian coding is defined as follows: the code you write should be *directly* useful or serve the interest of at least one actual human being. That might appear somewhat abstract or vague, so an example might help: I don't consider frameworks utilitarian code. What you create is like the frame of a picture; someone else (the user) will take it and draw the actual picture. Though you did help with part of the process, it's indirect at best. You're part of the supply chain here, not part of the team.

A clever and witty bash script running on a Unix server somewhere is also not utilitarian coding; no human ever directly benefited from it.

Libraries can be somewhat utilitarian, at least more than frameworks. They at least provide some reusable functionality to the user out of the box, like logging, scanning a barcode, or fetching data from a URL. But again, there's a lot of indirection and little lasting value; what did *you* learn about implementation and life in that process, my friend?

It's my strong belief that our life's purpose isn't just about learning technology but also about other non-technical things (such as life itself). By compartmentalizing themselves into libraries, frameworks, specifications, package managers, build systems and tooling, etc., many coders over the last decade have sort of divorced themselves from the intricacies of, and interaction with, life itself.

A decade ago (i.e. circa 2014-15) is when I'd say utilitarian coding came to an end. The kind of programming that prevailed until then (mostly desktop programming) was highly utilitarian in nature. You used to develop a WinForms app for the client, with actual textboxes, dropdowns and buttons, tailored to their specific requirements and domain knowledge. What could be more utilitarian than that! You used to gain domain expertise and not just technology expertise.

As things started moving to the cloud, the interaction between the end user and the programmer became less and less; that's when utilitarian coding started dying too. As a new breed of specialists called "Agile Experts", "Scrum Masters", "Tech Advocates", "Thought Leaders", etc. started inserting themselves between the coder and the end user, the coder's role morphed into an ostrich policy of dealing only with technology and nothing else. We started losing touch with domain expertise and became branded as "python coder", "PHP scripter", "web developer", "AI developer", etc. That's how folks started churning out more frameworks, libraries, packages, stencils, helper scripts, etc. instead of worrying about actual problem solving with the stakeholders.

This is how things stand right now, for the most part: desktop development and other forms of utilitarian coding have maintained a small niche somewhere, but they're just that, a niche. It's not a healthy development, nor is it sustainable long term. I strongly feel that this bubble is waiting to burst one day soon, and that there will be a reversion towards utilitarian coding. Even the cloud itself needs to be more utilitarian; there's a lot of needless clutter out there which could be simplified.

What do you think? Let me know in comments.

tacostakohashi 2 days ago

I think of it like this: in, say, the 1990s, "computers" / "tech" was fundamentally about doing _real-world_ things more efficiently. Word processing is a step up from a typewriter, desktop publishing is more efficient than typesetting, email is faster than fax or postal mail, spreadsheets allow for more efficient calculations and accounting, and obviously databases and industrial systems allow for efficient operations of businesses, warehouses, airlines, etc.

A few decades later, all the obvious, real-world, low-hanging-fruit applications of technology had been filled, and "tech" turned into the weird, more self-contained world of "the internet", social media, advertising, the attention economy, bitcoin, high-frequency trading, and AI, where it's really just your computer/algorithm fighting someone else's computer/algorithm, rather detached from the offline world.

constantcrying 2 days ago

What became incredibly obvious to me, after working in software and then as a mechanical engineer, was that software has absolutely no engineering culture.

Software has a deeply ingrained craftsman culture, which emphasizes personal flavor, unique approaches, and stylistic debates over engineering. Surprisingly, this gets worse in large organizations, where adherence to stylistic choices, specific tools, and workflows supersedes engineering at every point. Software is still full of fads: every couple of years a new or old flavor is found, which is now the "right" thing to do, and which will be defended and attacked until a new fad is found.

  • hollerith 2 days ago

    Not all software is consumer software or web dev. The software used to control the space shuttle for example was created by an organization with a real engineering culture.

    • constantcrying 2 days ago

      Indeed. Embedded software has a very big EE influence, which comes with a real engineering culture.

      The stark difference between embedded development, especially in aerospace, and "normal" software development is really my point.

    • red_rech 2 days ago

      I’d wager it was created by scientists/physicists too and not “just” developers.

xnx 2 days ago

The microservices fad/wave was where people seemed to lose their minds. The "Solving Imaginary Scaling Issues (at scale)" meme encapsulated it for me. Most programmers of the time seemed far more interested in being architecture astronauts than making something useful. The overengineered hosting setups were also a major impediment to anyone who just wanted to make something useful.

Fortunately, AI-assisted coding seems to be wresting coding back from developers and re-empowering domain experts and systems analysts. A full recession will probably shake out a lot more of the waste in software development.

  • surgical_fire 2 days ago

    It was always fun to work at companies doing microservice architecture for applications that would scale to thousands of users. Thousands. You could probably run it from a laptop, DB and all.

  • dartharva 2 days ago

    +1, came here to say this

  • red_rech 2 days ago

    > Most programmers of the time seemed far more interested in being architecture astronauts than making something useful.

    Of course they were; being an architecture astronaut got you hired.

    You have to get past the resume filters and the architecture interview to even have the chance to work on the internal enterprise tools no one uses anyway. People just respond to what will grow their career.

    • mountainriver 2 days ago

      Yep, the reality is it works… and that’s why things are often a mess

AnEro 2 days ago

Largely agree, given your definitions and clarifications, but I see some of these things as correlated issues, not directly the death of that programming approach. The gap between programmers and end users, the scope of 'users' expanding to include other programmers, and the increased complexity that creates more abstract, soft-skill code delivery/management roles are all co-existing issues. They didn't cause the death directly; it's more a comorbidity situation: they didn't help, but they didn't cause the death. I'd say the primary cause is the cost and complexity of operations, which forced the perspective shift from 'help at least one actual human being' to 'help at least <MINIMUM VIABLE MARKET SHARE> of users/developers'. As an aside, I'd also argue that well-designed frameworks and other items directed at devs are still abstractly utilitarian: if they didn't exist, a human would have to do the programming (or do the work manually), so they directly help at least one human.

cjs_ac 2 days ago

I think what you're describing is just a consequence of software companies becoming very large. Work for a small business and you'll be back writing utilitarian code.

jfrisby 2 days ago

Since this is HN, I'm gonna pick a nit.

> A clever and witty bash script running on a Unix server somewhere is also not utilitarian coding; no human ever directly benefited from it.

Back around 2010, my friend Mat was doing cloud consulting. He wrote some code to screen-scrape the AWS billing and usage page for an account to determine how much had been spent day-over-day. This was, of course, all orchestrated via a bash script that iterated through clients and emailed the results to them (triggered by cron, of course).
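
A rough sketch of what that orchestration might have looked like (the scraper script, client list format, and addresses here are all invented for illustration; the real code was Mat's):

    #!/usr/bin/env bash
    # Hypothetical reconstruction: for each client, run the (Ruby) billing
    # scraper and mail them their day-over-day AWS spend.
    # clients.txt holds one "account_id email" pair per line.
    set -euo pipefail

    while read -r account email; do
      report=$(ruby scrape_aws_billing.rb "$account")  # fictional scraper script
      printf '%s\n' "$report" | mail -s "AWS spend for account $account" "$email"
    done < clients.txt

    # Triggered by cron, e.g. daily at 06:00:
    # 0 6 * * * /usr/local/bin/billing-report.sh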

He realized he had a startup on his hands when something broke and clients started emailing him asking where their email was. Cloudability was born out of that.

I'd say that both the Ruby and bash code involved count as pretty utilitarian despite running on a server and not having a direct user interface.

  • boricj 2 days ago

    I'm gonna up the nit.

    Several years ago, I was the sysadmin/devops of an on-premises lab whose uplink to the rest of the company (and the proxy by extension) was melting under the CICD load.

    When that became so unbearable that it escalated all the way to the top priority of my oversaturated backlog, I took thirty minutes from my hectic day to whip up a Git proxy/cache written in a hundred lines of Bash.

    That single-handedly brought the uplink back from being pegged at the redline, cut the time spent cloning/pulling repositories in the CICD pipelines by over two-thirds, and improved the workday of over 40 software developers.
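
    For flavor, here's a minimal sketch of the general approach (this is a guess at the shape of it, not the actual script; the paths, repo list, and serving mechanism are assumptions):

        #!/usr/bin/env bash
        # Hypothetical sketch of a Git mirror cache: keep bare mirrors of the
        # upstream repos and refresh them periodically; CI clones/pulls from
        # this host on the LAN instead of hammering the uplink.
        set -euo pipefail
        CACHE_DIR=/srv/git-cache

        while read -r url; do
          name=$(basename "$url" .git)
          if [ -d "$CACHE_DIR/$name.git" ]; then
            git -C "$CACHE_DIR/$name.git" fetch --prune --quiet
          else
            git clone --mirror --quiet "$url" "$CACHE_DIR/$name.git"
          fi
        done < /etc/git-cache/repos.txt

        # Serve the mirrors read-only over the LAN, e.g.:
        # git daemon --base-path="$CACHE_DIR" --export-all --reuse-addr --detach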

    That hackjob is still in production right now, years after I left that position. They tried to decommission it at some point thinking that the newly installed fiber uplink was up to the task, only to instantly run into GitHub rate limiting.

    It's still load-bearing and strangely enough is the most reliable piece of software I've ever written. It's clever and witty, because it's both easy to understand and hard to come up with. The team would strongly disagree with the statement that they didn't directly benefit from it.

BlackFly a day ago

> If you wish to make an apple pie from scratch you must first invent the universe.

This take just pushes the notion of utilitarian to an extreme. In this framing, primary industry isn't utilitarian and only some subset of secondary industry is (chiefly B2C products that aren't used as tools or as parts of a final product by the consumer: a framemaker isn't utilitarian). The only utilitarian software under this definition is games and other entertainment software, since otherwise the "utility" is a means to an end, a tool; but that is exactly what a library is.

Making transistors is utilitarian. Making a library is utilitarian.

I am otherwise sympathetic to the pain points but I don't think there is an easy label for the layers of non-essential complexity that have built up and cannot easily be stripped away.

al_borland 2 days ago

I still write utilitarian programs all the time. If the code I’m writing isn’t addressing a need and making someone’s life a little easier, what am I doing?

The world doesn’t need another generic JavaScript framework, but a lot of people have little annoyances every day that can be made better with code. This is my favorite code to write. Nothing is that impressive, technically speaking, but it changes how people work and makes their jobs suck a little less. I find this type of work much more fulfilling than working on some silly integration between 2 systems that will be gone in 3 years.

jmogly 2 days ago

Absolutely agree, for the most part. Luckily I think the tide is going out and developers are going to be forced to start actually solving problems in their domain out of necessity.

No more easy money = no more engineering for engineering’s sake. Companies are increasingly waking up to the fact that architecture astronauts are a liability, and that the cloud is 95% distraction, meant to absorb time, energy, and money from IT orgs within large companies.

I’ve personally switched out of devops to a domain-aligned software team within my company. I am absolutely fed up with how wasteful and distracting the state of devops is to the actual business.

  • anonyfox an hour ago

    I don't know; my own DevOps practices revolve around straightforward shell/go/js snippets that try to enforce the absolute bare reasonable minimum of infrastructure, as cheap as possible, but with scaling paths laid bare if ever needed. Sometimes a few lambdas/workers + CDN solve problems for essentially zero cost; sometimes people are amazed how far a single VPS can carry them nowadays, or how fast SQLite can be. DevOps is about the art of shipping value to customers as fast as possible, at scale, with minimum cost (while usually not being the developer yourself), with continuous improvement loops. But I can count on a single hand how often I actually needed to "scale up" something so hard that I needed full-on data migrations and all.

    Small, sharp tools, my friend.

  • throwaway31131 2 days ago

    > no more engineering for engineering’s sake

    I'm not sure many successful engineering orgs did much of that, but also the environment our creations live in is much different now.

    It was a huge task just to get our programs to run at all not very long ago. When I first started coding I was working on some code for a VLSI problem, and I spent about a month just figuring out how to get the problem to fit on disk (not in main memory, but on disk!). Now similar things run on a laptop [0]. And it's not like I'm unique in having had to come up with crazy solutions to problems tangential to the primary problem in this space. [1]

    The infrastructure for our code is now amazing, and the machines that execute it abound in resources, especially in the cloud. Now that the yoke has been cast off, it makes sense that more time is spent on solving the actual problem you set out to solve in the first place.

    [0] https://github.com/The-OpenROAD-Project/OpenLane

    [1] How Prince of Persia Defeated Apple II's Memory Limitations: https://www.youtube.com/watch?v=sw0VfmXKq54

andyjohnson0 2 days ago

<meta>

"Let me know in comments"

I seem to be seeing a lot more submissions to Ask HN that are basically blog posts. I'm not trying to police anyone, and if the mods are happy with this then OK. But I'm curious whether there's actually a trend here, and whether HN users lack alternative places to post their thoughts. Something to do with Twitter going down the drain?

gdulli 2 days ago

Why isn't a bash script running on a server utilitarian? I have a dozen cron jobs on my server doing different things for me. Why am I not "benefiting" from them in this definition?

Somewhere a bash script on a server might be calculating the interest my bank owes me, I'm directly benefiting from that too.

  • al_borland 2 days ago

    I have some scheduled jobs that generate reports that go out to people weekly. Without these they would be stuck manually trying to look things up and tracking it in Excel on a weekly basis. I could give it to them to run themselves, but it’s much better for it to just show up in their inbox.

    In other cases I have code that fixes error conditions when they arise. No one has to run it manually, but if it didn’t exist, they would end up with a ticket to fix the problem by hand. Even if they forget it’s there, it is giving them time back with each run.
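
    To make it concrete, the two patterns might look something like this in cron (the script names, schedule, and recipient are invented for illustration):

        # Hypothetical crontab entries for the jobs described above.
        # Weekly report, mailed out every Monday morning:
        0 7 * * MON /opt/reports/weekly_usage.sh | mail -s "Weekly usage report" team@example.com
        # Automated remediation sweep every 15 minutes:
        */15 * * * * /opt/remediation/fix_error_conditions.sh >> /var/log/remediation.log 2>&1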

pickledonions49 2 days ago

Interesting take. Most code I see floating around on the web is either useless or a copy of something else just written in a different way. I don't think this is going to end soon; if anything, "vibe coding" will continue to make it worse.

  • pyeri 2 days ago

    At some point, someone will start calling it out. Gen Z may not, as they've taken these things for granted as a "way of life", but Gen A might, if they ever start thinking critically and out of the box.

nextworddev 2 days ago

It’s best to think of software as just “content” inside the VC-funded software ecosystem.

MattPalmer1086 2 days ago

I think you are mourning a world that never existed.

Before agile we had waterfall. Developers didn't interact with users; they got handed requirements by people who didn't know what was even possible.

It's true that software has become more abstract over time, as the need to standardise how things work overrides a bespoke approach most of the time.

AfterHIA 2 days ago

I'm with you, pal, but the underlying problem is a decline in ethics as technology development has more intimately paralleled the neoliberal expansion. In 60-some-odd years we've gone from the government giving money to institutions and inventors to create novel technologies to privatizing most parts of the development of our technical infrastructure. Now, instead of SRI, Xerox PARC, the University of Utah (Sutherland), and many others, we have concentrated capital and what are effectively oligarchic trusts at the core of our development strategy. This happens as ethics is whittled down bit by bit through the efforts of groups like the Heritage Foundation and other "libertarian" special interest groups, the unresolved social conflicts that created poor whites out of Reconstruction, and the disdain the superstitious Christian right has had for popular culture since even before the post-war turn.

Computer ethics will not improve en-masse in the United States in years to come. We will get more privatization of the public good. We will get more protections for the monopolists. Social media manipulation and mass surveillance are just the beginning.

"How I Learned To Stop Worrying And Love The Palantir."

spankalee 2 days ago

Have you missed the entirety of the evolution of software development?

What is WinForms itself if not "non-utilitarian"? Most of an OS is non-utilitarian. Compilers, libc, databases, web servers, browser APIs, ffmpeg, OpenGL, Unity, etc., etc., etc...

2014 is a wild year to pin the end of "utilitarian" programming on, since all of the things you appear to complain about already existed by then. If anything, the beginning of making programs for other programs and programmers was 1951/52, with the invention of the compiler. It's been downhill from there.

kjkjadksj 2 days ago

I think it comes from separating domain experts from writing software themselves. There was a lot of great software for different niche things written by people who were somewhat expert in those niches and happened to write code as well. They knew what calculations people in the field would be doing, and simply wrote the function and wrapped it in some tool. Immediately useful software.

A lot of "academic" code that is pilloried by "real" software engineers is actually a great example of this. Is it the most performant? No. Could a random person off the street make use of this? No. But for those in the field, they know exactly what this tool is conceptually doing even if they don't know how it is made exactly. It is like a very specialized tool for a very specialized tradesman.

What is interesting about the shift away from utilitarian programming is that these "thought leaders" and other middlemen now find themselves in positions to set the narrative, essentially thanks to monopolizing the space, and to change the very meaning of work to suit the software they happen to peddle, rather than the other way around. We saw this with enterprise software, and now we see it with AI tooling shoehorned into that enterprise software.

xg15 2 days ago

I mean, an example of truly utilitarian software that solves a nontrivial problem would be good. Abstraction has gotten a bad name, and you can certainly go overboard with it, but it's also a tool that you often need to solve more complicated problems.

Also, I think this misses a bit of where programming came from. The idea of general computation was an abstract mathematical plaything long before it had concrete use cases.

  • al_borland 2 days ago

    We should also remember that most utilitarian software is built on top of those abstractions.

    If I make something utilitarian in Apple Shortcuts, my “code” is sitting on top of countless layers of abstractions and frameworks which make it all possible, which are also abstracted away behind a drag and drop interface.

    • xg15 2 days ago

      I think that's an interesting point, though, that is often neglected - not all abstractions are created equal.

      In a browser, the DOM, web APIs, etc. are abstractions - and so are frameworks like React, etc. However, there is usually a lot less anger and attention directed at the former than the latter.

      My theory for why this is the case is that the former is a "hard" abstraction boundary while the latter is "soft": for a web site, it's intentionally not possible to peek below the abstraction layer of the web APIs. In exchange, though, the browser puts in a lot of effort to make sure web devs don't have to go below that layer, by providing watertight error handling for everything beneath it and by giving rich debugging tools that work with the concepts the abstraction provides.

      In contrast, any frameworks and libraries that build on top of that are "soft", because developers can - and have to - look into the code of the framework itself if some problem requires it.