jdw64 1 hour ago

The real issue, in my view, is not AI itself.

The problem is a management pattern: removing people and organizational slack because they don’t generate immediate profit, and then expecting the knowledge to still be there when it’s needed.

Short-term cost cutting leads to less junior hiring, and removes the slack that experienced engineers need in order to teach. As a result, tacit knowledge stops being transferred.

What remains is documentation and automation.

But documentation is not the same as field experience. Automation is not the same as judgment. Without people who have actually worked with the system, you end up with a loss of tacit knowledge—and eventually, declining productivity.

AI is following the same pattern.

What AI is being sold as right now is not really productivity. In many domains, productivity is already sufficient. What’s being sold is workforce reduction.

The West has seen this before, especially in the case of General Electric.

GE pursued aggressive short-term financial optimization, cutting costs, focusing on quarterly results, and maximizing shareholder returns. In the process, it hollowed out its own long-term capabilities. It effectively traded its future for short-term gains.

The same mindset is visible today.

The core problem is that decision-makers—often far removed from actual engineering work—believe that tacit knowledge can be replaced with documentation, tools, and processes. It cannot.

Tacit knowledge comes from direct experience with real systems over time. If you remove the people and the learning pipeline, that knowledge does not stay in the organization. It disappears.

  • palmotea 32 minutes ago

    > The problem is a management pattern: removing people and organizational slack because they don’t generate immediate profit, and then expecting the knowledge to still be there when it’s needed.

    I think that's still a symptom. The real problem is ideology: the monomaniacal focus on profit-making business, which infects our political leaders, down to capitalists and business leaders, down to the indoctrinated rank-and-file. Towards the end of the Cold War, the last constraints on it were abolished, and the victory over the Soviet Union made it unquestioned.

    The Chinese don't have that ideological problem. Their government appears not to give a shit about how much profit individual businesses make; they care about building out supply chains and capabilities. They will bury the West, so long as the West remains in the thrall of libertarian business ideology.

  • stingraycharles 30 minutes ago

    Seems to me that - optimistically - this would shift the job of a software engineer into a more formal engineering role, with the actual implementation done by AI. In the same way, in other areas, engineering and implementation differ, and implementation can be (and is) automated.

    No idea how this should take form, though, and if it’s even realistic. But it seems like due to AI, formal specs and all kinds of “old school” techniques are having a renaissance while we figure out how to distribute load between people and AI.

    • ted_dunning 16 minutes ago

      That sounds right, but it can go badly wrong, because it presupposes that you can debug what the AI gets very confidently wrong.

      There are three legs to the stool: specification, implementation, and verification. Implementation and verification both take low-level knowledge and sophisticated knowledge of how things break.

  • vishnugupta 25 minutes ago

    > removing people and organizational slack

    You are spot on w.r.t. every assertion you've made. When bean-counters took over the ecosystem, they optimised immediate profitability over everything else. Which in turn means, in their minds, every part of the system needs to be firing at 100% all the time. There's no room for experimentation, repair, or anything else.

    I've commented on the lack of slack several times here on HN, because when I notice a broken system nowadays, 90% of the time it's due to a lack of slack in the system to absorb short-term shocks.

    • acomjean 4 minutes ago

      I’ll note that at the end of the last century I worked at IBM Research, which had a budget of $6 billion. Management was trying very hard to get a better return on that investment. Even today, IBM, though often ridiculed in the tech space (sometimes deservedly), spends a lot on R&D.

cladopa 32 minutes ago

People are not perfect. I went to Ukraine just days before the invasion. Travel and hotels in Kiev had become extremely cheap. If you asked Ukrainians about a possible invasion, "Not going to happen," everybody said. "Russia always talks aggressively, but never does anything."

They did not properly prepare, and as a result lost 20% of their territory in days.

Days after that I was back in Austria and could not stop thinking that some of the people I spoke with might be dead.

Since then I have also been in Dubai and Saudi Arabia as an entrepreneur and engineer. "What are you going to do when drones are used against your infrastructure?" If you followed the Russian war and the first Iranian strike, it was obvious that drones were going to be used against them. "Not going to happen," again.

They have lost tens of billions for lack of proper preparation. They could have been protected by spending just hundreds of millions of dollars over the years.

It is about humans, not AI.

  • teiferer 13 minutes ago

    In hindsight, it's easy to be smart. You picked two examples where somebody said "never gonna happen" and then it happened. How about the countless examples where somebody said the same and then the thing actually didn't happen?

    Take the millions playing the lottery. To each of them, I can confidently say "you won't win, not gonna happen". For almost all of them, I'll be right. There will be one who wins, where I was wrong, and they will say "see, told you so". That doesn't mean my prediction was wrong. It means you have a reporting bias.

    • hnfong 4 minutes ago

      GP also probably had a sampling bias. The ones who were actually concerned about the impending Russian invasion presumably fled out of the country (or at least, away from the major cities to rural areas that probably see less fighting)

  • the-smug-one 3 minutes ago

    I'd say that Ukraine was quite well prepared for the invasion, though? They managed to survive the first two weeks, which led to a long-term war. The Donbas war had already been going on for eight years, and I don't think Ukrainians were under any illusion that those weren't Russians.

Animats 1 hour ago

> They can’t tell you what the AI got wrong.

AI code generators are trolls. They confidently produce plausible content which is partly wrong. Then humans try to find their errors.

This is not fun. It has no flow.

whycombinetor 1 hour ago

>I read the Fogbank story and recognized it immediately. Not the nuclear material. The pattern. Build capability over decades. Find a cheaper substitute. Let the human pipeline atrophy. Enjoy the savings. Then watch it all collapse when a crisis demands what you optimized away.

>In defense, the substitute was the peace dividend. In software, it’s AI.

Before it was AI, the cheaper alternative was remote contract dev teams in Eastern Europe, right?

  • Nux 1 hour ago

    India for the most part.

    • gitowiec 10 minutes ago

      No thank you; their food is the only thing I like. The rest can stay in India.

netfortius 5 minutes ago

This is why a comprehensive computer science degree is necessary. Seeing and working only with the trees leads to destroying some forests, eventually.

allending 48 minutes ago

There's a certain irony in that the article itself is quite clearly assisted by AI. Not a criticism as I don't have a problem with AI assistance, but food for thought given the material being commented on.

  • rezonant 15 minutes ago

    The tropes that AI introduces into articles are very noticeable, quite annoying, and very unnatural -- these models unfortunately don't write well. It seems people use them to "polish" up their writing, but in reality it would have read better if they hadn't.

    My current pet peeve is using periods instead of commas, as in:

    > My people lived the other side of this equation. Not the factory floor. The receiving end.

    Ostensibly this is supposed to add gravitas, but it's very often done in places where that gravitas isn't needed, and it comes off as if I'm reading the script for an action movie trailer.

  • morningsam 5 minutes ago

    Made me stop reading a few paragraphs in. I don't have a "problem" in the ethical sense either, but as the sibling comment notes, the way LLMs write is rather grating. To make matters worse, a) people seem to use them to add pointless volume / "filler" to their texts, so now I have to wade through pages and pages of this stuff, and b) I have no easy way to distinguish between an article at least based on novel human insights vs entirely LLM-generated from a "write me something about X topic" prompt. I don't think it's a stretch to say that the latter just isn't worth reading given the state of the art.

dev_l1x_be 5 minutes ago

As anecdotal evidence: I code way more now with agents, because I have an entity with a vast amount of knowledge about pretty much everything, and I have the creativity to use it well.

RossBencina 58 minutes ago

Excellent post. Two stand-out points are deskilling through the abolition of apprenticeship (or equivalent progression through rank and responsibility), and loss of institutional knowledge, especially tacit knowledge stored in individual people. These are people problems more than they are technology problems. Without continuity of process and practice, stuff gets lost. Sometimes change really is progress, for example software safety and security practices have progressed over the past 50 years, but other times change is just churn, or choices driven by misaligned incentives which will bite later, as the article describes.

  • RangerScience 30 minutes ago

    What comes to mind is how the cure for scurvy was simply… forgotten, causing it to come back.

raincole 9 minutes ago

First of all, this is clearly AI-assisted writing (being charitable here).

And the premise makes no sense anyway. The only risk in forgetting how to make shells is that other countries might make shells more efficiently. Non-western countries are not going to reject AI coding, nor are they going to make software more efficiently by hand.

  • 0xpgm 2 minutes ago

    Programmers in non-western countries may not be able to afford $100 per month on vibe coding.

    They may keep taking the longer and harder route of a mixture of AI and hand coding.

anonzzzies 17 minutes ago

I saw academic rigor fall off a cliff in exchange for 'better job alignment', between the end of the 80s, when the first class I had after finishing high school was 'Formal Verification in Software', and the beginning of the 2000s, when I left after giving new students their first class, 'Programming in Java'. All the 'teaching how to think' was replaced with 'how to get a well-paying job'.

bit1993 36 minutes ago

Yes. Just like globalization created companies like TSMC, AI will do the same with software engineers who don't rely on LLM code generators, because they can do it cheaply and sustainably.

Another reason is that LLMs train on the existing code we already know, so don't expect new programming languages or frameworks. This means that the software engineering skills that exist today will be relevant for a long time.

tjwebbnorfolk 1 hour ago

You could say COBOL has had this "problem" for 40 years also. That's why we need to constantly be inventing new ways of making things. The old ways are always forgotten over time.

If you REALLY need something long-forgotten, then you have to lazy-load it back into being at significant cost. That's the price of constant progress.

  • LeCompteSftware 1 hour ago

    The point of the article is that sometimes the "old ways" really means "not particularly profitable or necessary in the short term" but the bill comes due in a crisis. The reason US/EU manufacturing was "the old ways" is that people could make easier money with financial engineering, an insight that extended all the way to Raytheon.

    COBOL is a bad example, but higher-level languages vs. assembly is not. If you write a lot of C you really don't need to know assembly... until you stumble across a weird gcc bug and have no clue where to look. If you write a lot of C# you don't really need to know anything about C... until your app is unusably slow because you were fuzzy on the whole stack / heap concept. Likewise with high-level SSGs and design frameworks when you don't know HTML/CSS fundamentals.
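The same leaky-abstraction failure mode can be sketched in any high-level language. A hypothetical Python analogue (not from the comment above, and nothing to do with C# specifically): two containers that answer the same question, where fuzziness about the data structure underneath is the difference between a fast app and an unusably slow one.

```python
import timeit

# 100k items; membership checks for the worst-case (last) element.
n = 100_000
as_list = list(range(n))
as_set = set(as_list)

# Both containers give the same answer...
assert ((n - 1) in as_list) == ((n - 1) in as_set)

# ...but the list is scanned element by element (O(n)) while the
# set hashes straight to the right bucket (O(1) on average). Code
# that "works" in small tests can crawl in production if you're
# fuzzy on this.
t_list = timeit.timeit(lambda: (n - 1) in as_list, number=200)
t_set = timeit.timeit(lambda: (n - 1) in as_set, number=200)
print(f"list: {t_list:.4f}s  set: {t_set:.6f}s")
assert t_set < t_list
```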

    As the author says maybe AI is different. But with manufacturing we were absolutely confusing "comfortable development" with "progress." In Ukraine the bill came due, and the EU was not actually able to manufacture weapons on schedule. So people really should have read to the end of "building a C compiler with a team of Claudes":

      The resulting compiler has nearly reached the limits of Opus’s abilities. I tried (hard!) to fix several of the above limitations but wasn’t fully successful. New features and bugfixes frequently broke existing functionality.
    

    At least with Opus 4.6, a human cannot give up "the old ways" and embrace agentic development. The bill comes due. https://www.anthropic.com/engineering/building-c-compiler

    • anonzzzies 21 minutes ago

      But these are hard IT problems that a human programmer really struggles with as well. What % of software written is like that? Very, very little. Most software is dull and requires business vagueness to be translated into deterministic logic and interfaces; LLMs are pretty great at that as it is. If humans use their old ways to fix the complex problems and LLMs do the rest, we still only need a handful of those humans. For now.

AHTERIX5000 23 minutes ago

Is this written by a real person though?

imrozim 46 minutes ago

How do you become a senior engineer if no one hires you as a junior anymore?

  • hkt 15 minutes ago

    Talk confidently in your interview with non-technical managers when the last senior has left and there's nobody there to check your work.

skybrian 1 hour ago

There was a time when companies had terrible development practices and could forget how to build, test, and deploy software, but is anyone seeing that now? We have much better development practices nowadays.

It doesn’t seem much like defense industry problems.

  • disgruntledphd2 26 minutes ago

    This still happens. Lots of my career has been figuring out what code is actually running in prod, and determining if it even works.

alecco 42 minutes ago

Speak for yourself. I now dare to tackle much harder problems, and learning is bliss. No more sitting down to dig, needle-in-haystack, through horrible documentation or random Stack Overflow posts.

LLMs are a magnificent tool if you use them correctly. They enable deep work like nothing before.

The problem is an education system focused on passivity (obedience), memorization, and standardized testing. And worst of all, aiming for the lowest common denominator. So most people are mentally lazy and go for the easy win, almost cheating. You get school and interview cheating, and vibe coders.

But it's not the only way to use LLMs.

Similarly, in Wikipedia you can spend hours reading banal pop-slop content or instead spend that time reading amazing articles about history, literature, arts, and science.

efitz 46 minutes ago

I disagree with the premise - interesting but I interpret the same fact pattern differently.

The history of technology is the replacement of manual processes with automated ones.

Consider a very basic process: checkout at a restaurant.

Writing the price of each item on a sheet of paper, manually adding them, and writing the total was replaced with typing in the prices, and eventually with just pushing the button for the item. Paper still exists for jotting down your order, but within seconds of leaving the table it’s transferred to a computer.

This has enabled lots of desirable advances- speed, accuracy, new payment rails, and increasingly, elimination of the server in checkout- you tap a credit card on a tabletop device.

Did we “forget” how to do checkout? No. We purposely changed it.

But if the internet connection goes down, or the backend server powering the cash-register app goes down, there is an atrophied and not-regularly-exercised skill set (maybe not even trained, IDK) that has to be implemented on the fly, and it’s slow and frustrating for everyone.

Businesses don’t exercise (or perhaps even train) this process because it’s just not needed enough to warrant the cost.

Military procurement of weapons systems is hardly the place to point to as representative of technological tradition. There are lots of cases where no one pays the money to keep a production process in place; the reasons are all related to shortsighted “cost savings” or a failure to anticipate changing needs.

With coding today, we are seeing the same kind of shift in priorities as in my restaurant example. Having humans write code in the 2020 (pre-GPT) tradition was extremely inefficient in terms of time from idea to implementation.

We’ve found a new way to do the mundane part of that task (the mechanics of translating spec to implementation).

We are figuring out how to do that while preserving quality (and a lot of it is learning how to specify appropriately).

Will we “forget” how to “build” code?

No, but the skills to generate source code by hand will atrophy just as the skills to draw blueprints by hand atrophied with the advent of CAD.

Will we find examples where someone prematurely optimized away knowledge of a skill or process, incorrectly thinking it was no longer needed? Of course.

But the productivity gains we get will be so great on average that no one will go back to doing things the old way.

There will be old-timers and hobbyists who will preserve some of that knowledge; for most it will just be a curiosity.

  • drawfloat 32 minutes ago

    Everyone is taught at a young age how to do basic addition and multiplication. That's all checkout requires. People are not taught at a young age how Rust lifetimes work, or how to write human-maintainable code.

    I agree that, as with everything in 2026, the reality lands somewhere in the middle of the online discourse. But pretending this is in practice anything like the checkout example is wrong.

  • latexr 17 minutes ago

    Those are terrible, truly awful and inadequate comparisons (though I do believe you are making them in good faith).

    CAD still requires you to know what to do, and without CAD you can still draw blueprints by hand because you know what the result should be. Checkout is basic arithmetic you can do on paper or even on your phone. In both cases it is clear what the process is and what the output should be, and neither replaces knowledge, training, and certification.

    With coding, none of that is true. By and large, there is a trend of people who don’t know what they’re doing shitting out software, or people who should know better not verifying the very flawed output they get. That is already having negative consequences in people’s lives.

wg0 1 hour ago

>The combination of technical skill and the judgment to know when the AI is wrong barely exists in the market anymore.

I see a talent-pipeline collapse in the next 5 years. "Software engineering is over, coding is a solved problem," as chanted by the semi-literate media and the AI grifters' marketing departments, will further scare human capital away from software engineering, which will soon easily command a 3x rise in salaries due to the resource shortage.

roenxi 49 minutes ago

> Leadership qualities. Our last hiring round tells you how rare that is: 2,253 candidates, 2,069 disqualified, 4 hired. A 0.18% conversion rate.

It's minor, but this is just wrong. If you're going to hire 4 candidates, there could be 2,253 perfectly qualified candidates even if only 0.18% get hired. The conversion rate is meaningless; it mostly tells us how many jobs were on offer. There is no way that the skills this fellow wanted were so rare and difficult that only 1 in 500 candidates could possibly handle the job. Humans even at the 1-in-20 mark are pretty competent if you're willing to train them, and legitimate geniuses crop up at around 1 in 200.
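The arithmetic bears this out. A quick sketch using the numbers quoted from the article (the 50%-qualified figure is hypothetical, purely to illustrate):

```python
# Numbers quoted in the article: 2,253 applicants, 4 hires.
applicants = 2253
openings = 4

# The "conversion rate" is fixed by the number of openings alone.
conversion_rate = openings / applicants
print(f"{conversion_rate:.2%}")  # -> 0.18%

# Hypothetically, even if half the pool were perfectly qualified,
# the conversion rate would be identical: you still hire only 4.
qualified = applicants // 2        # 1,126 qualified candidates...
hired = min(openings, qualified)   # ...but still only 4 offers to make
assert hired / applicants == conversion_rate
```

So the 0.18% figure measures scarcity of openings, not scarcity of skill.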

bsder 1 hour ago

> Optimized for minimum cost with zero margin for surge. On paper, efficient. In practice, one bad day away from collapse.

I'm going to steal that one and add it to Stross's: "Efficiency is the reciprocal of resilience."

  • californical 1 hour ago

    Yes, that is one key point that resonated with me. The author did a great job of putting these recurring concepts into their own words.

    The other thing that really resonated was something I read before, along the lines of: we think that once humanity learns something, that knowledge stays and we build on it. But it’s not true; knowledge is lost all the time. We need to actively work to keep knowledge alive.

    That’s why libraries and the internet archive are so important. Wikipedia, too

Meirambek_VIDI 1 hour ago

Do you think this is a tooling problem or more about incentives and how engineers are trained now?

  • great_psy 1 hour ago

    I think the article is making the point that it is a cultural problem about cost cutting and short term thinking.

    • Meirambek_VIDI 16 minutes ago

      Yeah, agreed - short-term incentives seem to drive a lot of this. Do you think tools can help, or is it mostly cultural?

Scroll_Swe 47 minutes ago

"the west" ?

You mean the world?

DeepSeek was being glazed here; I'm sure Chinese programmers use it like CC.

dsign 41 minutes ago

This is some convoluted BS built on the premise that wars need to make sense, economically or otherwise. No, wars do not need to make sense. If a person, a dictator or a president, unilaterally starts a war that forfeits the lives of both the dictator's (possibly fabricated) enemies and their own people, that person is knowingly committing murder. Logically, such a person should be handled with at least as much prejudice as a lone wolf who opens fire on a crowd. So we need to fix our legal systems to be better at preventing wars, not our economic systems to be better at fighting them.

arjunthazhath 41 minutes ago

Hope we don't forget humanity one day!

rvz 1 hour ago

This will end the way COBOL did, with a few people who still have the expert-level understanding needed to refactor old code without causing outages or service disruption.

We’ll see, but right now I see developers hooked onto their agents 24/7, and in the future we will experience a de-skilling problem in which clean code, best practices, security, and avoiding NIH syndrome are all flushed down the toilet.

wewxjfq 39 minutes ago

While the Fogbank story is a funny anecdote, I don't see it as a fitting example of atrophied skills. It's more like writing a clean implementation of some software that just doesn't match the legacy version, until you realize the legacy version had an unnoticed bug that made it behave the way it does.

heinternets 54 minutes ago

When you've run out of ideas, just portray "the west" monolithically in some decline-porn fan fiction as clickbait.

ekianjo 21 minutes ago

> The defense industry thought peace would last forever, too.

Not really, since they are always pushing for more wars.

ktallett 1 hour ago

We have both forgotten how to make things and decided we can make more profit by letting someone else make everything, for every market. We have moved to a generation fixated on maximizing profit. However, there is logic there, as the cost to access the ability to make things is prohibitively expensive. As someone who makes open hardware with a nod to the environment and reusability, you cannot justify, or even find, more locally sourced options than China.

Coding is different though; coding doesn't have a cost barrier, it has an ability barrier. I think we will lose a lot of people who were never passionate about programming and perhaps go back to a happy equilibrium. AI is only production-ready if you have someone who understands software development. AI will improve speed to market if you have the right team; it doesn't remove the need for someone to learn to code. You will of course end up with startups using exclusively AI, but they will be the ones who end up with major security breaches, or simply cannot scale as the AI goes in the wrong direction for the future. Tbh that's probably a positive, as it weeds out the startups that are focused on buzzwords for funding and not product.

  • latexr 45 minutes ago

    > I think we will lose a lot of people who were never passionate about programming

    Anecdotally, what I’m seeing right now is the opposite. People who don’t care about programming are joining, while those who do care are getting tired of the bullshit and leaving. The good programmers are the ones leaving, the hacks are extremely happy to use LLMs.

    When shit hits the fan, there won’t be many people left to clean it.

throw4523ds 24 minutes ago

Exactly. As they say, everyone has to learn to code.

light_hue_1 9 minutes ago

> The West Forgot How to Make Things. Now It's Forgetting How to Code

Can we stop repeating this nonsense headline, please? We did not stop manufacturing things.

Manufacturing is a huge industry in the West. https://en.wikipedia.org/wiki/Manufacturing_in_the_United_St...

The US manufacturing sector is the biggest it has ever been. Exports are at all time record highs. The only thing that declined about manufacturing is the jobs. We build way more than we ever did but with far fewer people.

What we did do is decide that basic items aren't worth it. Our capacity is limited, our labor pool is limited, expenses are high, it doesn't make sense to make trinkets when we can make complex high precision parts and devices.

But no, we did not forget how to make things. We chose to use our capacity in a smarter way.

trhway 50 minutes ago

Isn't that the point of technological civilization's development? People, for example, forgot how to weave on a handloom, or how to produce parts for and maintain watermills. And wooden sailing ships: top mastery of handling and engineering developed over millennia, gone.

As it was said, the future is already here, it's just distributed non-uniformly; so somebody is still, and for some time will be, sailing, manufacturing things, and writing code.

BrenBarn 59 minutes ago

> After spending an additional $69 million and years of reverse engineering, they finally produced viable Fogbank. Then discovered the new batch was too pure. The original had contained an unintentional impurity that was critical to its function.

Same thing that happened to the unfortunate Dr. Jekyll!

immanuwell 40 minutes ago

When you offshore or automate away the hands-on knowledge, you don't just lose the workers, you lose the entire institutional memory, and no amount of money can buy that back overnight.

locallost 56 minutes ago

I can't not write the tired comment of how ridiculous it is to criticize AI and then use AI to write your article. It's tired, but so is this writing style.

For the actual problem, I fear this can't be solved by warning people; the pain will need to be felt. The system we live in, basically free-market capitalism, cannot do anything except local optimization. Maybe it's for the best, I don't know. The alternative of top-down planning wouldn't have this problem, but it would have others. I work for a mid-size, somewhat luxury brand, and the major goal right now is cost cutting and AI for efficiency everywhere, instead of using it to create better products or better ways to reach our customers. When I think about who will buy our luxury products if all jobs are optimized out of existence, I don't have an answer; but again, I think the pain will need to be felt to change course.

shevy-java 1 hour ago

> I run engineering teams in Ukraine. My people lived the other side of this equation. Not the factory floor. The receiving end.

With all due respect, many European taxpayers help pay for Ukraine. I am not disagreeing with the premise of the West killing itself via systematic recessions - Trump invading Iran leading to inflation, as an example - a lot of things are going on that show a ton of incompetence in both the USA and the EU. But at the same time, I get question marks in my eyes when this criticism comes from a country that receives money from others. That money could instead go to making EU countries more competitive, for instance. I am not saying this should necessarily be the case, mind you; I fully understand the nature of Putin's imperialism. But we really need to consider all factors when it comes to strategic mistakes in production - and that includes taking on debt all the time. There are always a few who benefit in war, just as they benefit from taxpayer subsidies (inside and outside as well).

  • skhr0680 1 hour ago

    Ukraine is "receiving money from others"? We are beneficiaries of the Ukrainians' bravery and sacrifices. How much money could we have saved if Hitler had been stopped in Czechoslovakia?

    • gib444 1 hour ago

      > Ukraine is "receiving money from others"?

      Yes. https://www.eeas.europa.eu/delegations/united-states-america...

      • latexr 35 minutes ago

        You are completely ignoring the argument of your parent comment. They are saying that money is being spent to the benefit and best interest of the spenders, that it’s not a handout.

        You are, of course, free to disagree and make your point, but ignoring the argument does not advance the discussion.

    • crotobloste 56 minutes ago

      > Ukraine is "receiving money from others"?

      Factually correct.

      > We are benefactors of the Ukrainians' bravery and sacrifices.

      Who's we?

      > How much money could we have not spent if Hitler had been stopped in Czechoslovakia?

      Very different situation, in all aspects.

      • collinfunk 49 minutes ago

        You see zero similarities between Hitler invading Poland and Putin invading Ukraine?

        • roenxi 33 minutes ago

          There are some pretty substantial differences. Russia is on the strategic back foot here trying to figure out a way to stop NATO's advance. They've only turned to violence after long attempts at resolving the tension diplomatically and the US has been implacable. Putin's actually been pretty hesitant in his escalations so far; he's 70 and has a long history of trying to avoid war.

          Hitler was more about wanting more land and resources for Germany, and he saw war as being a legitimate tool for achieving his aims that he deployed early and enthusiastically.

lava_pidgeon 1 hour ago

Rather bad premise in the article. 1.) Germany, Italy and Eastern Europe are very industrial regions; the author forgets that defence is not the only industry. 2.) The author doesn't cite any source showing that Chinese developers don't use AI.

whatever1 1 hour ago

I don’t know, but the evidence shows that software engineering is not that deep an art.

People come and go at rates that would not be sustainable in any manufacturing business.