aidenn0 17 hours ago

When interactive compilers and debuggers started to become more available, my father complained that junior devs would just make random changes to the code until it worked rather than taking the time to understand it, as they'd had to back when you waited for your punch cards to go through an overnight batch process.

It seems that lowering friction will always lower understanding, because industrious people will always try to get the most done, and inexperienced people will conflate getting the most done now with getting the most done in general.

  • YeGoblynQueenne 16 hours ago

    To be fair, I would guess that 90% of the programmers I know would never have learned to program if they had been forced to do it on ancient mainframes with punched card readers. Personal computers made programming something that any schmuck could learn. That sure includes me.

    But I think all that does is point to the fact that the average programmer's knowledge about computers and programming becomes poorer and poorer as time goes by. We are continuously pessimising for knowledge and skills.

    • cc101 14 hours ago

      I'm sympathetic to what you are saying, but you should also factor in the increase in the number of things a programmer needs to know and consider today. I've been programming for 57 years. There is a huge amount to know, and the tasks are huge compared to what I knew and did back in 1968.

      Over that time I grew as a programmer as the work became more difficult, but I couldn't keep up with the technology. Despite my attempts to keep up, it seemed that every few years I had to further limit the scope of my work in order to cope. Judging by my experience, today's competent programmers will fall further and further behind, their knowledge becoming more obsolete while they restrict their scope so they can learn the new work on the job. Young programmers won't need to know much of what today's competent programmers know. At the same time, the increasing complexity of their assignments will require them to go deeper into new matters, and they in turn will become overwhelmed. And so on it will go.

      On the other hand, perhaps I don't know what I'm talking about. : )

    • BobbyTables2 13 hours ago

      I don’t think the issue is even a lack of knowledge.

      I don’t fault a C developer for not knowing or appreciating the intricacies of a superscalar pipelined processor using out-of-order execution.

      But when a C developer writes a multithreaded program, they need to really understand why multiple threads are necessary, how to use locking properly, and how threading will impact the overall application. They need to look beyond their tiny bit of code and the assigned task.

      Unfortunately a number of developers fully expect to drive on a busy street blindfolded and successfully reach their destination.

      • aidenn0 2 hours ago

        > But when a C developer writes a multithreaded program, they need to really understand why multiple threads are necessary, how to use locking properly, and how threading will impact the overall application. They need to look beyond their tiny bit of code and the assigned task.

        Not really, just add the "volatile" keyword to global variables at random until the bugs go away!
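
        (To spell out the joke: volatile only forces the compiler to re-read the variable from memory; it guarantees neither atomicity nor ordering between threads. A minimal sketch of the locking discipline the parent comment means, assuming POSIX threads; the shared counter and worker function here are hypothetical:)

          /* build: cc -pthread counter.c */
          #include <pthread.h>
          #include <stdio.h>

          /* Shared state. "volatile" alone would give neither atomicity
             nor ordering here; the mutex does. */
          static long counter = 0;
          static pthread_mutex_t counter_lock = PTHREAD_MUTEX_INITIALIZER;

          static void *worker(void *arg)
          {
              (void)arg;
              for (int i = 0; i < 1000000; i++) {
                  pthread_mutex_lock(&counter_lock);  /* serialize the read-modify-write */
                  counter++;
                  pthread_mutex_unlock(&counter_lock);
              }
              return NULL;
          }

          int main(void)
          {
              pthread_t t1, t2;
              pthread_create(&t1, NULL, worker, NULL);
              pthread_create(&t2, NULL, worker, NULL);
              pthread_join(t1, NULL);
              pthread_join(t2, NULL);
              printf("%ld\n", counter);  /* 2000000 with the lock; usually less without */
              return 0;
          }

        Drop the lock/unlock pair and the two workers race on the increment: the final count comes up short, and sprinkling volatile on the variable does not fix that.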

    • euroderf 5 hours ago

      > if they had been forced to do it on ancient mainframes with punched card readers

      Interesting point. I for one started out with punched cards, and your point is valid: you had to think deeply when preparing your next program submission, because turnaround was slo-o-ow and often kept you in the computing center waiting, waiting, waiting.

      So one might argue that when we got terminals that submitted "virtual punch cards" and got back "virtual printouts" (this was the late 70s), it debased the practice of programming. /s

      OTOH in this day and age, the quick compile times of Go are - to put it bluntly - absolutely wonderful.

  • BobbyTables2 13 hours ago

    In the past 20 years, I’ve seen some people make random changes until the code “worked”.

    Fortunately, I’ve had the privilege to be on teams where such people were relatively few.

    Your father was certainly right.

  • senko 8 hours ago

    Linus echoing your father's thoughts in 2000:

    > I happen to believe that not having a kernel debugger forces people to think about their problem on a different level than with a debugger. I think that without a debugger, you don't get into that mindset where you know how it behaves, and then you fix it from there. Without a debugger, you tend to think about problems another way. You want to understand things on a different _level_.

    Full email at https://lkml.org/lkml/2000/9/6/65

acuozzo 21 hours ago

Where did all of this trust come from?

When I first started hacking I had the expectation that every chunk of code I came across was broken in some way.

All of the software I relied upon was broken in some visible way. My Windows 95 installation would blue-screen multiple times per day. My Usenet reader would fail catastrophically when encountering non-ASCII text. My CD-ROM copies of games would freeze until I kicked the side of the computer, a fix that worked consistently.

I still see bugs everywhere nowadays, but they're more hidden and, honestly, more frustrating, since they're so opaque.

  • rchaud 16 hours ago

    > Where did all of this trust come from?

    A concerted PR operation from OpenAI and Microsoft pushing the belief that LLMs can 'reason' and can thus be trusted with things beyond formulaic high school and college papers.

    • senko 8 hours ago

      I hadn't realized LLMs were a thing before Windows 95 and Usenet.

      Please try to understand the comment you're replying to instead of going with a knee-jerk reaction.

  • specificanxiety 20 hours ago

    When we decoupled results from capital. It doesn't matter how buggy your software is if people are forced to use it anyway, especially when you haven't turned a profit in ten years but still get VC money.

    Remember when "running a business" meant "making a good product and making some money in the process"? Yeah, me neither.

    • charlieyu1 19 hours ago

      Why are we blaming VC for bad products? There have always been very profitable companies consistently churning out bad products. Sometimes it feels like quality and profit are inversely correlated.

      • happymellon 14 hours ago

        Microsoft isn't getting punished for its bad experience.

        Where I work there is now a retention policy for all Microsoft products, including OneNote.

        Yep, if you manage to fight through the interface, you are greeted with an application that will lose your long-term notes.

  • riehwvfbk 20 hours ago

    If you try to do this in a work context, you'll be told you are wasting time. Even if you aren't fired, you will not be considered for promotion: the way to do that is to have "a lot of impact", which means shipping a lot of half-baked stuff. The other piece of the puzzle is a good "work ethic", best demonstrated via late-night debugging heroics where you patch up the crud you shipped earlier while collecting more "impact" points. For whatever reason, people who run companies believe that their customers want "lots of crud quickly" instead of quality products.

    • JohnFen 20 hours ago

      How true that is depends entirely on what sort of company you're working for. It may be common with SV-style companies (and it shows), but it's not nearly as common in the rest of the software world.

danjl 17 hours ago

Writing code is easy compared to supporting, debugging, and enhancing it. AI is much better at "greenfield" coding, where you start from scratch on an entire app or a new feature. For anything non-trivial, it is terrible at debugging. At best it is a super rubber duck that is nice to talk to and that might have, buried inside screenfuls of text, a few words that help a human realize what might be going wrong. We're still in the honeymoon phase, where AI has only written new stuff and hasn't been around long enough to have to support codebases of tens or hundreds of thousands of LoC.

nuancebydefault 20 hours ago

Personally I fail to see how this worry can be 'new'.

A 'new' transportation worry: many car drivers don't know where to turn or even where they are heading without GPS.

  • Gigachad 12 hours ago

    Or more closely related, when I import a library or utilise an external tool, I don’t know how that works either.

    When I store a record in the database, I have no idea what Postgres does to do that, it just happens.

  • specificanxiety 20 hours ago

    I mean, LLMs are new. And if you can't see the difference between an entire profession using broken, hallucinatory tooling to write buggy code, and drivers using more convenient maps, then I'm not sure how to help.

    • nuancebydefault 4 hours ago

      I see the difference. Similarity of course does not mean being identical. I've read about the GPS/LLM similarities on HN, and when you think about it, an LLM behaves a bit like a GPS that invents new locations and routes 5 percent of the time. It is best to do some checks before hitting the gas.

  • joemazerino 20 hours ago

    When GPS coordinates are handled by an LLM, the fear won't seem as novel.

specificanxiety 20 hours ago

Turns out, knowing stuff is important when you try to do stuff that you claim to be an expert in, instead of outsourcing it to a crappy incorrect tutorial generator. Competitive edge for computer programmers going forward: knowing how computers work.

  • blooalien 13 hours ago

    > Competitive edge for computer programmers going forward: knowing how computers work.

    Only if they're young or famous. Otherwise, they're automatically kicked to the curb because the common thinking these days is that anyone with even a single gray hair is "out of touch with technology" even if they've wasted their entire lives staying on the "bleeding edge". Apparently only the young understand technology, and the people who built and maintained it all their lives are just "clueless old farts" like the morons we've got in Congress passing laws about technology.

anonzzzies 18 hours ago

A lot of companies were already outsourcing to companies with humans who don't know how anything works.

mdlarson 19 hours ago

Did young coders ever know how their code worked?

  • 0x5f3759df-i 18 hours ago

    Back in my day we copied and pasted random Stack Overflow answers until something worked, like real junior devs.

    • tdeck 15 hours ago

      You had it easy! I'm old enough to remember copy-pasting things from random VBulletin forums and the comment section of PHP documentation. And sometimes from old mailing lists that showed up in search results :).

      • blooalien 13 hours ago

        You had copy and paste? Lucky!! In my day we actually had to type code by hand from books or magazines (or worse yet from memory).

    • dopidopHN 15 hours ago

      That's my take on the whole AI coding industry: it's on par with Stack Overflow responses.

  • hn92726819 5 hours ago

    Senior devs generally know how code works, and all senior devs started as juniors. I think the difference is that every annoying/stupid issue used to turn into a lesson. Now it's one message to ChatGPT, and the lesson is forgotten.

roxolotl 20 hours ago

This is true of all levels of abstraction. The next question is: does this level of abstraction cost more than it's worth?

kennethologist 18 hours ago

Just ask the LLM to walk you through each line of code and create and explain the dependency graphs, and in a relatively short period of time they'll know exactly how their code works. Claude Code is quite useful for this - I use it on GitHub repositories I'm curious about.

  • NBJack 17 hours ago

    I think the problem is "but why?" What incentive would they have when they just need a quick answer or solution, i.e. make the code do this, get the boss their answer on that, etc.?

    It's going to be fun when LLM agents do all the communication for us.

protocolture 14 hours ago

And newspapers will have people walking into traffic, and cars will lead to the extinction of horses, and if horse manure keeps piling up at this rate, London will be buried in a decade.

Ancalagon 20 hours ago

Won't matter once the agents code-review the agent-generated code.

  • lxgr 19 hours ago

    If that happens, it'll presumably matter somewhat when it comes to continued employment of said coders.

  • tdeck 15 hours ago

    IMO the primary purpose of a code review is to check that what's written is understandable to at least one other developer besides the author. Having a machine be the primary reviewer kind of misses the point.

jxjnskkzxxhx 16 hours ago

Meh. Many coders don't know how transistors work, and they can still be productive.

If many "young coders" don't know how their code works but can solve more problems faster, is it really a problem?

  • blooalien 13 hours ago

    The problem isn't knowing how transistors work (although in the old days, you often did learn something about that in the process of learning computers), but rather not even knowing how basic code structure works at all, or simple required math, or logic. Not knowing the absolute basics of code and then thinking you're a "coder" because you can blindly copy/paste code without understanding it on any level is straight-up dangerous.

  • tdeck 15 hours ago

    Sure it's a problem if that code ever needs to be fixed or maintained. Or if it irreversibly alters data in a way that the "coder" didn't understand or intend. If it's a prototype or some kind of one-off with limited side effects, I guess there's not much risk.

  • Rury 13 hours ago

    Well, it's hard to solve a problem you don't understand, right? When a problem fundamentally lies in a domain no one understands, how will it ever be fixed or solved? The best you can do is paper over it in some way, or somehow get randomly lucky.