lacker 12 days ago

For a long time, IBM's AI strategy has been to do normal business and claim it's AI-related to make themselves sound cooler.

This is the same thing. Any layoffs happening today don't really have anything to do with AI. The company just needs to do layoffs, and saying "we have layoffs because of AI" sounds better than "we have layoffs because revenues are worse than we expected".

  • sammywater 12 days ago

    Same thing at the company I recently worked at-- a maker of software for car dealerships & manufacturers.

    The product marketing mentions AI.

    I asked a staff data scientist (has been with company for several years) if AI is used in our products.


    It's quite amazing how pervasive fraud & pseudo-fraud are in the American economy. Regulators seem to turn a blind eye to so much of it. A recent example I saw was food adulteration, with things such as sawdust [0]

    [0] "31 Foods You're Eating That Contain Sawdust"

    • verall 12 days ago

      I don't think adding cellulose to processed food in order to prevent caking is food adulteration. It's clearly listed on the ingredients and is perfectly safe.

      I do think it's fun to point out in the ingredients list, sometimes it's a little unexpected, like on a bag of chips. "Mmm, sawdust" is kind of funny because it sounds bad but is completely harmless.

  • mdgrech23 12 days ago

    Honestly, IBM is a joke.

    • BSEdlMMldESB 12 days ago

      They own Red Hat, which gives them enormous influence over systemd and the Linux kernel. That all seems like quite serious stuff to me.

  • scarby2 12 days ago

    >"we have layoffs because revenues are worse than we expected"

    These days a lot of companies seem to be "we have layoffs because everyone else is doing it and shareholders want short-term gains."

    • tivert 12 days ago

      > These days a lot of companies seem to be we have layoffs because everyone else is doing it and shareholders want short term gain.

      This is why we need to pay CEOs so much: it's important to have top talent to ape the decisions that "everyone else" is making.

      • irvingprime 12 days ago

        +100 for the most effective use of sarcasm I've seen today!

  • fwungy 12 days ago

    IBM is a consulting company that is trusted to build enterprise solutions. McKinsey and the like can't do what IBM can.

  • idlephysicist 12 days ago

    Yeah them (and many others) just get Alan and Imelda to do the work.

mm007emko 12 days ago

The company I work for just jumped on the bandwagon and is actually searching for people with ML/AI experience. If you can use Spark, TensorFlow, scikit-learn and Keras, you actually have a better chance of getting the job than a Ph.D. who knows only one of the frameworks. (That's the way many corporations work, sadly.)

The only place "intelligence" appears in AI is in the name. These are mathematical or logical models that resemble the behaviour of an intelligent being, and if you throw ML into the mix, they can actually learn on their own. But they are not creative. They can do amazing things, but they have no comprehension of WHY they do them and (usually) no notion of truthfulness. They just repeat what they were trained on, or extrapolate from it (often wrongly, because there is no critical thinking or fact-checking in contemporary models).

I see a lot of tell-tale signs of another bubble which is going to burst in a couple of years like it did in the 1980s.

Nothing to worry about. If someone's job security is endangered, they can either switch employer or do something else.

But if you can claim experience with these frameworks, enjoy the ride. Companies are going to pay you whatever you ask.

  • hn_20591249 12 days ago

    If you are using TensorFlow, Spark and Keras in your hiring decisions, I'd say you are already behind the curve on technologies.

    • mm007emko 12 days ago

      Sure. But I'm a software engineer who is finishing a Ph.D. in applied informatics (coincidentally in the area of time series prediction using ML). I'm not a manager.

      When I mentioned "AI winter" in front of them, they didn't know what I was talking about. But they created a nice corporate ladder which anybody can climb, based on years of experience with the aforementioned frameworks. Python experience required; Scala + Spark an advantage.

      I don't know what you are planning to do. But ... I'm buying a huge load of popcorn and I will laugh my ass off when this bubble bursts in a couple of years.

      • juujian 12 days ago

        Never underestimate the ability of corporations to make a multi-decade business venture out of selling AI-bullshit to other corporations.

    • fnord77 12 days ago

      what are the current relevant technologies?

      • james-revisoai 11 days ago

        (not parent but -) It depends on the area. Generally: provider APIs (ChatGPT via the OpenAI API, etc.), LangChain, Hugging Face transformers, and Pinecone/other vector DBs are absolutely taking off.

        Lots of specialist ML models, which required specific data collected carefully for a business task, are no longer needed. Until now, most time in ML roles (and most research time) was spent collecting data and training specialist models.

        General models like ChatGPT or pretrained image models now do better than a fresh model trained from scratch, or even a finetuned small/medium model (e.g. BERT/T5), ever will.

        The specialist ML pipeline (data -> train -> deploy -> manage/MLOps/drift) used tech like PyTorch, TensorFlow and MLflow, and, at the more applied levels (e.g. deployment), transformers, scikit-learn and Keras. However, these are being replaced wholesale at many companies by LangChain, the Hugging Face inference API (for vision tasks), and Pinecone/other vector DBs.

        LangChain is really just a smart way to wrap and order API calls to OpenAI/ChatGPT/other providers, with some prebuilt use cases. Right now there is less on the metrics/output side than with, let's say, "bring-your-own-data" ML models, where you could measure things like precision.
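
        To make the "wrap and order API calls" point concrete, here's a minimal sketch of the chaining pattern. Note this is not LangChain itself; `fake_llm` is a made-up stand-in for a real provider call (e.g. to the OpenAI API):

```python
# A toy "chain": each step formats a prompt from the previous step's
# output and sends it to the model, the way frameworks sequence calls.

def fake_llm(prompt: str) -> str:
    # Stand-in for a real provider call; a real one would hit an API.
    return f"[model answer to: {prompt}]"

def run_chain(question: str) -> str:
    # Step 1: ask the model to rephrase the user's question.
    rephrased = fake_llm(f"Rephrase clearly: {question}")
    # Step 2: feed that output into a second prompt and answer it.
    return fake_llm(f"Answer concisely: {rephrased}")

print(run_chain("why is the sky blue?"))
```

        Frameworks like LangChain add prompt templates, retries, memory and prebuilt chains on top of essentially this kind of sequencing.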

        Now, the old guard of ML (PyTorch, TensorFlow) is still used for training new models, open-source replication attempts, etc. But newer frameworks like JAX have not really taken off: they arrived just as the community switched to using providers rather than training its own models.

        There remains a subset still powerful for communicating with the C-suite: using simple models like k-means to show clusters with readable axes. These tend to use scikit-learn or R. But that is more classic data science than ML.

        There are also areas of AI so far relatively unaffected by ChatGPT etc.: time series prediction (like OP's, so it's less surprising they are using the old-guard technologies), game engine AIs, non-discrete data, recommendation algorithms, and some computer vision algorithms (especially active learning). Some, like Hugging Face (a commercial company behind the transformers Python module), sit somewhere in between, given they serve both data-trained and the newer models.
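
        As for the vector-DB part: the core retrieval operation is just nearest-neighbour search over embedding vectors; Pinecone and friends add indexing, persistence and scale on top. A toy sketch with made-up 3-dimensional "embeddings" (real ones come from a model and have hundreds of dimensions):

```python
import math

def cosine(a, b):
    # Cosine similarity between two vectors of equal length.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy document store: text -> made-up embedding vector.
docs = {
    "refund policy":  [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.2],
    "api reference":  [0.0, 0.2, 0.9],
}

def search(query_vec, k=1):
    # Rank stored documents by similarity to the query embedding.
    ranked = sorted(docs, key=lambda d: cosine(docs[d], query_vec),
                    reverse=True)
    return ranked[:k]

print(search([0.85, 0.15, 0.05]))  # nearest to "refund policy"
```

        In the real workflow you'd embed the query with the same model that embedded the documents, then feed the top-k hits into the LLM prompt as context.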

  • jstx1 12 days ago

    > The company I work for just jumped on the bandwagon and is actually searching for people with ML/AI experience. If you can use Spark, TensorFlow, Scikit and Keras

    This has nothing to do with the question, though. No one is hiring people who know Spark and TensorFlow to replace jobs. The kind of job replacement OP is asking about will potentially come from having your company sign a huge contract with Azure or whatever, hook up a bunch of LLM agents and APIs to it, and lay off 90% of your client support department. It won't be "we heard about AI on the news so we hired someone who knows Keras". Companies do this, but they have been doing it for many years - it isn't new or interesting (and many companies figured out how to do it well along the way too).

    > But if you can claim experience with these frameworks, enjoy the ride. Companies are going to pay you whatever you ask.

    No, they won't. The pay is similar to other software development. If at some companies it's higher, it might be something like 10% higher, definitely not "pay you whatever you ask".

    You're extrapolating too much from your company and your comment seems to be based on things that were relevant 5+ years ago, not on what's been happening in the field in the past 12 months.

slashdev 12 days ago

People buying into the hype too much? We've been early adopters of AI at my work, but we're still hiring. AI just improved our efficiency a bit.

I think if you can replace employees with ChatGPT you probably didn't need them to begin with. They weren't doing valuable work anyway.

  • johnny99k 12 days ago

    Our marketing department has already started replacing content creators with ChatGPT. While I don't think it will replace that many jobs right away, it will start replacing more junior roles, which will make it even more difficult to break into many industries.

    • leros 11 days ago

      I have stopped hiring content writers too. ChatGPT does as good a job, especially with technical content that some writers struggle with.

  • MontyCarloHall 12 days ago

    >I think if you can replace employees with ChatGPT you probably didn't need them to begin with. They weren't doing valuable work anyway.

    That's a large number of employees. Labor markets are way less rational and efficient than we think.

    Snark aside, I disagree somewhat. There are plenty of valuable but menial tasks (e.g. cut-and-dried CRUD app development) where >90% of the work can be done with LLMs. The question is, will this cause companies to ...

    1. ... keep their current workforce, but produce 10 times the number of CRUD apps? Is there (induced) demand for that?

    2. ... keep their current workforce, continue to produce the same number of CRUD apps as today, but with far more features and sophistication than the current product?

    3. ... cut 90% of their workforce and continue to deliver the status quo?

    • siva7 12 days ago

      Doing 3) will result in losing your competitive edge when everybody else is using AI to improve the product. That's similar to not using enterprise software at all because pen and paper or Office 365 will do the trick for now - except then it's too late.

    • fifilura 12 days ago

      I wonder if those jobs would disappear/evolve anyway, AI or not.

      Frameworks are constantly evolving and setting up a CRUD app is probably easier today than 10 years ago.

      • vkazanov 12 days ago

        This might sound old but... did you ever try to set up a modern JS stack? :-)

        • slashdev 11 days ago

          This is too close to home; we just set that up at work. The developer who spent all week on it looked like he was questioning his career choices.

        • fifilura 12 days ago

          I know :) I am just hoping it would be true. It should be true dammit.

  • fhd2 12 days ago

    I believe that's an overgeneralization - a lot of jobs, when done badly, don't immediately show up in quarterly revenues (the usual definition of "value" in my experience). Companies can make their software unmaintainable, cause a major security issue, ruin their brand or lose customer trust without noticing - until it's too late.

    But if management is up for having a job done badly, they usually achieve that regardless of whether an LLM is involved.

    Edit: I guess I actually agree with you, it's just that companies can have peculiar ideas of what they consider valuable.

23B1 12 days ago

Anecdotally, I know quite a few copywriters who've seen their clientele dry up, especially those deep in SEO and content marketing, thought leadership, that sort of thing.

I have no data to support this, but I have seen plenty of LinkedIn headlines switch to "Prompt Engineer".

willcipriano 12 days ago

Wouldn't you have to first automate the job? Speculative layoffs because something is going to be automated "any day now" seems unwise to me, how will the task be completed tomorrow?

Or is the idea that businesses already automated stuff and the management is so incompetent that IBM has to send around a pamphlet to remind them to lay off the people now sitting on their hands?

  • whstl 12 days ago

    The idea in those cases is not so much about automating jobs, but rather retaining fewer people while still getting the same productivity, because the tools are helping.

    As an avid user of Copilot and ChatGPT, those tools definitely make me faster. Sure, I gotta review everything they write. But it's already easier to review Copilot code directly in my VSCode window than that of a junior or mid-level developer on GitHub.

    I definitely wouldn't replace an employee with AI tools, though, and I don't see that changing...

    Also: whether we can replace less experienced devs without driving the remaining ones crazy remains to be seen.

    • willcipriano 12 days ago

      If you are being effective at managing a business, you'd be keeping track of employee utilization at some level. What I'm saying is, if your employee utilization suddenly drops for reasons that it won't recover from, you should consider layoffs. If it hasn't dropped, you haven't really replaced a job with AI yet and are putting the cart before the horse.

  • vsareto 12 days ago

    Companies have shown they'll do speculative layoffs or hiring. It's possible it's just an excuse in this case, but they might also genuinely believe the AI hype.

  • stuckinhell 12 days ago

    The idea seems to be boosted productivity = fewer people needed in all departments.

    The only real blocker is waiting on Azure secured GPT endpoints with some contract legalese about protecting our data from our Microsoft reps.

sokoloff 12 days ago

I have a friend (really) who is no longer replacing front-end devs who quit at their mid-stage startup. Says that he’s getting enough lift from ChatGPT-4 that, so far, it’s not worth replacing them; the team can pick up the slack.

iknownothow 12 days ago

Question for OP - How big is the company you work for?

I work for a company that has around 50 technical employees, and I'd say use of LLMs is putting pressure on the company against hiring more employees rather than actually laying people off.

My armchair estimate is LLMs make the employees ~5% more efficient "on average" (not everyone is using them, or using them effectively), which is 15-30 minutes more per day. That would mean you'd start thinking about laying off 5 people if you're a company with 100 technical employees. If you're a smaller company, it would be premature to lay off employees based solely on the impact of LLMs.
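
The arithmetic behind that estimate, spelled out (all numbers are the armchair figures above, not measurements):

```python
team_size = 100
efficiency_gain = 0.05  # ~5% productivity boost "on average"

# Time saved per person in an 8-hour day, in minutes: about 24,
# inside the 15-30 minute range quoted above.
minutes_saved = 8 * 60 * efficiency_gain

# Headcount whose output the gain notionally covers: ~5 of 100.
redundant_headcount = team_size * efficiency_gain
```

At 50 technical employees, the same math gives ~2.5 people, which is where "premature to lay anyone off" comes from.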

  • stuckinhell 12 days ago

    It's a big company, everyone on hackernews knows it.

dpflan 12 days ago

Great question, and one to be skeptical of. IBM doing layoffs and saying they will be an AI company is, well, the slumbering giant trying to keep itself relevant and squeeze costs to pump the stock.

  • chasd00 12 days ago

    The ironic thing is IBM is already an AI company. I doubt anyone there has a full grasp of all the dusty corners of that operation overall.

giobox 12 days ago

> The IBM layoffs articles have been passed around executive management recently.

That IBM interview was highly speculative, and was arguably as much a smokescreen for layoffs they wanted to do anyway as a prediction of their future AI plans. No one at IBM has been laid off due to AI yet either; they simply "expect" they can do it in years to come, which may well be true.

I don't think this is common yet anywhere serious, as realistically you aren't going to be replacing an IC with an LLM yet, despite the hype, with very few exceptions.

Arvind Krishna is as much trying to associate IBM with the current AI investor craze as he is making a sensible statement about the future of work in that interview, and it should be seen as the investor marketing it is. IBM have done this in the past too - remember the Watson AI ads with Bob Dylan? Now no one remembers the Watson brand.

Planning to reduce headcount by 7,800 people because you have awesome AI technology coming down the pipeline sounds a lot better to some investors' ears than firing 7,800 people because the company isn't performing well, and investors have been rewarding AI news handsomely in the stock market recently.

I can't even remember the last time I saw Arvind or senior IBM staff being interviewed in the mainstream media at all before he uttered the word AI.

kevinventullo 12 days ago

I’m a front-line manager, and the idea of replacing any of the ICs on my team with “AI” is absolutely ludicrous.

tivert 12 days ago

I think you'd have to be specific about what kind of staff. Developers? Users?

ChatGPT has only been out for about six months. Even if there are big layoffs coming, going from released technology, to implementation in a customer domain, to being comfortable laying off significant staff in that time frame seems extremely aggressive. I would guess layoffs in that time frame would actually be for other reasons, though "AI" could possibly be used to obscure the true ones.

  • stuckinhell 12 days ago

    Preliminary, but it looks like everyone? AI-boosted productivity = fewer employees needed.

    The only blocker for mass company adoption right now is waiting for Azure secured GPT endpoints (up to our standards anyways), but we are talking to our Microsoft rep.

swader999 12 days ago

We need to hire more QA and requirements/analysis people because we are more productive. I think only half our devs are really using AI regularly too.

achrono 12 days ago

Skeptical of layoffs happening at non-"tech" companies due to AI, but lower HC for next year is happening in at least some orgs for sure, especially at entry levels. Cf. Bill Gates' comment about having a white-collar worker made available.

DANmode 12 days ago
  • marifjeren 12 days ago

    I'm skeptical. It doesn't even sound like they've actually started on whatever AI/ML projects he's talking about. They're having profitability problems and needed to cut costs. I assume the AI mention here is just fluff so the layoff news doesn't sound so bad.

    • JoelMcCracken 12 days ago

      What he is saying is that they're reevaluating strategic priorities; this is a statement akin to Meta saying they're laying off VR/metaverse engineers and hiring AI engineers.

      • tomtheelder 12 days ago

        I don’t think that’s right. He is pretty transparent about the layoffs being a cost cutting measure to try to increase profitability.

        • JoelMcCracken 12 days ago

          Yeah, true; I should have been more specific, that the mention of AI there is "we're changing priorities to develop AI-based offerings", not "we are replacing staff with AI".

    • DANmode 12 days ago

      It may or may not be the whole truth, but the article is an example of what OP is asking for.

      • marifjeren 12 days ago

        > the article is an example of what OP is asking for

        I don't dispute that. It just might also be a lie.

ruuda 12 days ago

No, but we did create an open position for an engineer to build internal tooling around LLMs.

aborsy 12 days ago

The tech has the potential to displace workers, or cause unemployment.

However, unemployment will never happen, if you know humans. There is going to be a lot of outrage, and regulators will regulate it like drugs!

MoSattler 12 days ago

AI won't directly lead to large-scale layoffs. It's more of a trend where companies begin to hire fewer people. They might not replace those who retire or leave.

red_admiral 12 days ago

Chegg certainly seems to be in a tight spot.

nashashmi 12 days ago

First should come the training to use AI; then should come the layoffs of those who don't use AI.

But it sounds like your company is not interested in training, and would rather hire from outside first. So: fire now, and hire AI-enhanced staff next.

I have an aversion to such companies. But the other kind of companies are not firing staff because of AI. Instead, they are increasing staff workload, a fallout of lots of staff finding better jobs.

chasd00 12 days ago

Kind of the opposite: I've been told from on high not to use it at all, and to tell everyone below the same.

  • stuckinhell 12 days ago

    That's our official policy right now as well . . .

i2cmaster 12 days ago

Middle management is bad enough when it's done by trained competent people. I can't imagine working under the direction of one of these LLMs.

spaceman_2020 12 days ago

If GPT-4’s diminishing abilities are any indication, employees have nothing to fear.

The tool has somehow become less impressive over time.