WarOnPrivacy 12 days ago

Leading contender for actual quote:

     "... we expected to get orders for five [IBM 701] machines, we came home with orders for 18."
  • hinkley 12 days ago

    Out of the 20 potential customers they pitched to.

    So they were counting on a 25% success rate and got 90%.

    • thmsths 12 days ago

      I wonder if it caused any issues. Getting three times the expected number of orders can be great, but it can also be a Pyrrhic victory, depending on your ability to deliver.

      • t0mas88 12 days ago

        With orders of this size there is no public pricing information. So if you can't deliver fast enough, you would adjust prices or specifically charge more for the first few delivery slots.

        Compare it to ordering very high-priced items with long lead times, like airliners: you pay for a specific delivery slot, not just the item at some random moment. And you can buy options for more deliveries in a specific timeframe, which influences the price of your order.

      • SoftTalker 12 days ago

        At the time (if TFA is correct) the 701 had already existed for a year. So it was only a question of building more of them, not that they had sold 5 vs 18 of something that didn't exist yet. But, they were also most likely on the hook for installing and running them -- at that time a computer like that would have been leased with installation services and on-site operations staff included.

      • bityard 12 days ago

        From a salescritter's perspective, that is frankly not their problem.

  • graymatters 11 days ago

    The premise of the article (if it can be called that) is incorrect. He made the statement about the 5 computers in a TV interview (and it was not related to the business trip referred to in the article). I saw the recording of the interview some 10-15 years ago. Don’t remember which network. Perhaps NBC?

    • anamexis 11 days ago

      Well, the article is certainly better sourced than your recollection of a counterexample you think you saw on TV 10-15 years ago.

    • ivan_gammel 11 days ago

      What you say doesn’t make sense at all. The article describes a shareholder meeting in 1953, when IBM was already aware that the market was much bigger. The interview would have had to happen significantly earlier, but no earlier than the 1940s, because the concept of a programmable electronic machine (a computer, not a calculator) was only established then. And they were selling a lot of calculators by then. At that time there was no such thing as TV broadcasters casually interviewing various people.

ecshafer 12 days ago

I haven't thought about this much before, but I think it must be a myth. Going from the Wikipedia article on the history of IBM (https://en.wikipedia.org/wiki/History_of_IBM), there were "computing machines" in the 30s, referring to their calculators and tabulating machines. IBM was already selling more than 5 of these devices, so if the 1943 date were true, the quote makes no sense. If it referred to a single machine having a market of 5 devices, that might be true.

  • mixmastamyk 11 days ago

    There were already more than five electro-mechanical computers built by the end of WWII. And several electronic ones, shortly after that. So the paraphrase is indeed nonsense.

  • potato3732842 12 days ago

    Yeah, there were so many specialist analog computing machines out there in the 1940s and earlier that the pop culture interpretation of the quote as being "for all digital computers of all types the world over" just doesn't pass the sniff test.

tbrownaw 12 days ago

It may be apocryphal, but it's not all that wrong.

Those "about five" computers even have names: AWS, Azure, GCP, ...

  • ghaff 12 days ago

    Sun’s CTO repurposed the quote to make exactly that point in the 2000s, likely before any of those existed. It’s very much an oversimplification but if you squint it’s not totally wrong either.

    • lysace 12 days ago

      Sounds like something Jonathan Schwartz (the ponytailed COO @ Sun at the time, I believe) could have said, did you mean him?

      His blog was strangely addictive at the time. Great writer.

      • ghaff 12 days ago

        No, I'm sure it was Greg though I don't think you can get to the Sun blogs any longer. But that's not to say that Jonathan didn't reuse the line himself. (I was an IT industry analyst at the time--and am again.)

        Ah, but here's a reference to it: https://www.cnet.com/tech/tech-industry/the-world-needs-only...

        I'm guessing it was when Sun started talking up Sun Grid, though that part I'm not sure of; the timeframe of Stephen's article pretty much matches.

        • lysace 12 days ago

          Ah!

          Oh, Stephen Shankland. He did a number of high-quality reviews of a software product I was the tech lead on, about 10-15 years ago.

          High-quality software reviews were already rare back then.

    • tw04 12 days ago

      I believe his was actually: the network is the computer.

      And he was right, he just didn’t anticipate greedy US ISPs would set progress back 2 decades.

      • ghaff 12 days ago

        Oh, please. Pray tell, inform us about how ISPs held back progress for 2 decades. Good broadband access could perhaps have come earlier and cheaper but it basically came soon enough once web-based services were available. I'm not going to argue that US ISPs are universally great but saying that they held back progress by "2 decades" is pretty much ignorant. Especially given that Sun was presumably mostly talking about the context of business computing at the time.

        • dredmorbius 12 days ago

          You could go back a couple of decades earlier (1960s--1980s) and point at AT&T who specifically prohibited third-party devices on their network (answering machines and even neck-rests or phone-book covers were among the prohibited items),[1] and had flatly rejected packet-switched routing as an obvious threat to their monopoly.[2]

          Unix itself (and Linux, Android, and MacOS) wouldn't have existed save for a 1956 consent decree which prohibited AT&T from entering the software business.[3] When the company found itself with an accidental operating system, the only thing it could do was give it away for free. "From Ken with love".[4]

          ________________________________

          Notes:

          1. Partially supported here: <https://www.promarket.org/2023/02/20/when-considering-breaki...>. Phone book covers was AT&T v. Winback & Conserve Program, Inc. Hush-a-Phone was an earlier case in 1956 involving a cup-like device, physical only, with no electrical or electronic components: <https://en.wikipedia.org/wiki/Hush-A-Phone_Corp._v._United_S...>.

          2. <https://en.wikipedia.org/wiki/Protocol_Wars#Early_computer_n...>

          3. <https://arstechnica.com/tech-policy/2011/07/should-we-thank-...> <https://www.aeaweb.org/articles?id=10.1257/pol.20190086> and <https://pubs.aeaweb.org/doi/pdfplus/10.1257/pol.20190086> (PDF)

          4. <https://sanctum.geek.nz/presentations/a-brief-history-of-uni...>

          • ghaff 12 days ago

            I could go back into the history of the telecommunications monopoly in the US but that hardly seems relevant to the modern Internet.

            • dredmorbius 11 days ago

              More than fair point.

              That said, there's a long history of entrenched incumbents exerting anticompetitive pressures against new entrants. AT&T itself was founded in part out of spite against the telegraph monopolies, who opposed it. I've pointed to Bernard Stern's 1937 article "Resistances to the Adoption of Technological Innovations" several times on HN: <https://news.ycombinator.com/item?id=20532443>. The markdown may now be found at <https://rentry.co/szi3g>.

              The ISP situation is somewhat different, though if you look at early dial-up providers (Prodigy, AOL, CompuServe, and MCI Mail), those were generally and quite unambiguously aimed at creating captive markets. They differed from the more open general ISPs such as Earthlink, Mindspring, The World, SonicNet, etc., who largely offered protocols-based access (straight TCP/IP, SMTP/email, FTP/HTTP file/content access, IRC, and the like). Those have largely been subsumed by telco oligopoly providers, e.g., the new AT&T, Comcast, and Verizon in the US.

              I don't know to what extent there was or wasn't overt resistance to expanded Internet services, though the US's general lagging on widespread and high-speed broadband provision relative to global peers remains an issue. It was a significant talking point during the COVID-19 lockdowns when much in-person interaction switched to videoconferencing, a functionality still poorly supported by much of the population's Internet service.

        • PaulDavisThe1st 12 days ago

          1994: a UWashington CS PhD student quits their PhD and goes to help @home (IP-over-cable) get started, on the basis that they would provide symmetric up/down bandwidth at full capacity.

          How long was it until this became a reality?

        • tw04 11 days ago

          >Oh, please. Pray tell, inform us about how ISPs held back progress for 2 decades.

          In most of the country, until Google Fiber started rolling out, we were stuck with cable companies offering a max of 5Mbps upload, and starting to push caps. In wide swaths of the US, that is still the case.

          Meanwhile in many of the Nordic countries, Germany and parts of southeast Asia, people had 100Mb fiber to their homes in the early 2000s.

          But yes, I'm ignorant and have no idea what I'm talking about...

Analemma_ 12 days ago

People really love apocryphal quotes that portray famous or disliked figures as morons. Bill Gates never said "640k ought to be enough for anybody" either, yet that circulates to this day.

  • mywittyname 12 days ago

    And they never pass the smell test.

    Clearly the chairman of IBM in the 50s didn't believe they would only ever sell 5 computers. What would be the point of investing all of those resources into building anything with that sort of limit?

    It's obvious to anyone who understands business and takes a few seconds to consider the quote that he's likely talking about a specific product that is currently priced out of the market. The 5 in that statement is probably sourced from looking at their current clients and seeing A) who could afford such a machine and B) for which of those clients does buying the 701 make clear economic sense? The 20 companies they pitched to probably fell into category A, but they just miscalculated how many of those fell into category B.

    The goal was to drive down costs through economies of scale.

    • II2II 11 days ago

      I don't like the smell test, because it usually depends upon hindsight. Granted, I also don't like how most of these (supposed) quotes are interpreted: as a foolish lack of insight from industry leaders.

      Take the actual IBM quote in context. Not only was the 701 a tube-based machine, memory was a huge problem at the time, and computers were still being designed for either business or scientific applications. The 701 was a scientific machine. Its use was pretty much limited to universities and military applications. It doesn't much matter whether the number was 5 or 18 or 500, the potential market was limited. On the other hand, I doubt that IBM held similar views on machines targeted at the business market. Given how early in the history of programmable digital computers this was, it wasn't entirely clear the situation would change. That took the development of integrated circuits for both logic and memory. (The transistor may have been a prerequisite for the development of ICs, but transistors alone were insufficient for computers accessible to anyone outside of major institutions. They were simply too expensive to reliably manufacture at scale.)

  • nonrandomstring 12 days ago

    Having a single person to identify with an idea via a quote or famous formula is a very human kind of shorthand we love. People still say "Edison invented the light bulb", even though we know that's not strictly true at all.

    People love revisionism in general. Equally we like it when doubt is cast on revered icons to take them down a peg, or when grand villains are found to be not as wicked as we once thought. We love treasured theories being overturned. We live in an age on the last frontier of truth, where any controversial claim that throws mud on a cherished belief is popular, just for its iconoclasm.

    Meanwhile I think smart people say "I don't care if it's actually true or not." Did Jesus or Plato or Cicero or Machiavelli actually say that? Who cares? It doesn't matter. If it's a good story that makes a clear point or illustrates an idea, then it's useful. So long as it's not something deeply offensive or unfair to attribute, whether Watson actually said it is irrelevant. It speaks to a more abstract truth about scale and underestimates. It's the kind of fallible thing he would have said... might have said.... maybe should have said! As a famous businessman Watson would no doubt be proud to own that and have it associated with his name, even if slightly erroneously.

  • bitwize 12 days ago

    Today, 4 GiB, which was enough to run an entire university back in the 90s, is what the most rinky-dink Wal-Mart special laptops come with and Windows barely runs at all on that much.

    If Bill Gates had said "640k ought to be enough for everybody", at the time he could hardly be blamed for doing so, as single-user desktop machines of the day still typically shipped with 1/10 or 1/5 that much.

  • ciberado 12 days ago

    On the other hand, sometimes se non è vero, è ben trovato ("even if it's not true, it's well conceived"). We need stories. Models. The Gandhi we know was not the real one, same for Churchill or any other person, and the same thing happens with some villains. My personal point of view is that apocryphal quotes are just an extension of that mechanism, and can be useful in the construction of our thoughts.

    • shermantanktop 12 days ago

      Sure, but the formula is so clunky. Just take:

      A) a person famous for their brilliance or other quality

      B) a humdrum everyday failing experienced by regular people, such as hubris, poor ability to predict the future, problems in school, difficulty in relationships, etc.

      Mix them together and you get "Einstein flunked math in primary school" and Freud saying "who knows what women want" and other stuff.

      I'm all for stories, but these aren't very good ones.

      • lynguist 11 days ago

        Nah the formula is different:

        1. Take some conditional and true statement: “The original IBM PC can address 1024KB memory of which 640KB are in user space.”

        2. Remove all conditionalities: “The PC has 640KB RAM as that’s all one needs.”

        3. Attribute it to a famous figure: “Bill Gates said that 640K is all anyone ought to need.”

  • bombcar 12 days ago

    Even if Gates never said it, someone involved in the design of DOS decided that 640k would be the demarcation between normal memory and "reserved".

    So somebody, somewhere, decided that 640k was "good enough" vs 700k or whatever.

    • layer8 12 days ago

      It was decided by IBM for the IBM PC. The 8088 CPU had 1 MB of addressable memory, and some of it had to be reserved for the BIOS ROM, video hardware, and other expansion cards. So the exact limit was a trade-off between application RAM and hardware expansions. You could also phrase it as “384 KB is enough for BIOS and expansion cards”.

      Moreover, 640K is kind of a natural division point in hexadecimal notation. Address segments in the 8088 memory model were 64K, hence segments 00000–90000 were for RAM, and segments A0000–F0000 were for ROM and hardware.
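      The arithmetic in this comment checks out; here is a quick Python sketch of the 8088 real-mode split (illustrative only, not from the thread):

```python
# 8088 real mode: 1 MB address space, viewed as sixteen 64 KB segments.
SEGMENT = 64 * 1024  # 0x10000 bytes per segment

# Segment start addresses 0x00000-0x90000 (ten segments) held conventional RAM.
conventional = 10 * SEGMENT
# Start addresses 0xA0000-0xF0000 (six segments) were reserved for video
# memory, the BIOS ROM, and expansion cards.
reserved = 6 * SEGMENT

assert conventional == 640 * 1024          # the famous 640 KB
assert hex(conventional) == '0xa0000'      # a round number in hexadecimal
assert reserved == 384 * 1024              # the 384 KB left for BIOS and cards
assert conventional + reserved == 1 << 20  # the 8088's full 1 MB
```

      So 640K falls out of the segment granularity: ten 64K segments for RAM, six for everything else.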

      • ghaff 12 days ago

        There were so many gymnastics to get a bit more memory to work with. Multiple autoexec.bat and config.sys files for many games and the like.

        • netRebel 12 days ago

          Exactly that got me into my current career. I, too, juggled drivers as a kid to get Wing Commander/Doom etc. to actually start.

    • wrs 12 days ago

      This happens all the time. Early MacOS put flags in the upper byte of heap pointers, because somebody thought “16MB is enough for anybody”. Physical address extensions had to be added to 32-bit Intel because 4GB turned out not to be “enough for anybody”. Now “64-bit” processors have 48-bit virtual address spaces (or less), but we’ll see…
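      A minimal sketch of the kind of pointer tagging being described (hypothetical names and flag values, not actual Heap Manager code):

```python
# The 68000 drove only 24 address lines, so the top byte of a 32-bit pointer
# was ignored by the hardware -- early MacOS stashed per-block flags there.
ADDR_MASK = 0x00FFFFFF  # low 24 bits: the usable address
FLAG_LOCKED = 0x80      # hypothetical flag bits kept in the high byte
FLAG_PURGEABLE = 0x40

def tag(ptr: int, flags: int) -> int:
    """Pack flag bits into the (hardware-ignored) top byte of a pointer."""
    return (flags << 24) | (ptr & ADDR_MASK)

def untag(tagged: int) -> int:
    """Recover the address by masking off the flag byte."""
    return tagged & ADDR_MASK

p = tag(0x0001F400, FLAG_LOCKED)
assert untag(p) == 0x0001F400  # fine while all real addresses fit in 24 bits

# On a machine with a full 32-bit bus, addresses above 16 MB collide with the
# flag byte -- masking now destroys real address bits:
q = 0x02234567                 # an address past the 16 MB line
assert untag(q) != q           # information lost: this is what broke
```

      The same trade-off resurfaces today whenever software hides tags in the currently unused bits of 64-bit pointers.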

      • rvense 11 days ago

        Well, as you probably know, the first Mac shipped with a 68000 that didn't have the full 32 bit physical address space either, so that upper byte was basically free real estate.

        (The original Mac hardware is in many ways a masterpiece if you like tricks like this.)

        • wrs 11 days ago

          Indeed, but I didn't count Motorola's decision there as "24 bits is enough for anybody", because they didn't break anything in the 68000 when they added more bits to the address space (in other words, it was forward-compatible). It was the Heap Manager that broke!

    • bitwize 12 days ago

      More like someone at IBM. The memory map mapped BIOS, video, and PCjr cartridge memory into the upper memory area. Other contemporaneous, non-IBM-compatible x86 systems of the era could relax this restriction. The Tandy 2000 loaded its BIOS from disk (instead of having it in ROM) and ran MS-DOS (indeed, its BIOS API was IBM-compatible even though hardware wise it was not), and could access up to 768K of user memory flat out (896K with aftermarket expansions). This briefly gave it an advantage handling large spreadsheets and the like.

      • kmeisthax 11 days ago

        When will processor manufacturers realize that finite pointer sizes are always inevitably insufficient?

  • rqtwteye 12 days ago

    A lot of Einstein quotes are either misquoted and/or not by him. Same for Churchill.

    Even the current news likes to pick quotes out of context. Trump says a lot of dumb things but often when I hear the full context of something people are mad about, he didn't say that. I am sure the right wingers do the same.

thayne 12 days ago

> Some people question how much of the internet is a place that documents history, and how much of the internet is a place that writes and recreates history.

So, basically the same as things written before the internet existed. It's not like people didn't write down myths and legends on paper, or stone tablets for that matter.

  • PaulDavisThe1st 12 days ago

    > It's not like people didn't write down myths and legends on paper, or stone tablets for that matter.

    Across broad swathes of the planet (i.e. all of the Americas), they did not (until very very recently in the overall scheme of human history)

    • samatman 12 days ago

      *most of the Americas.

PeterStuer 12 days ago

A little bit more cloud consolidation and you could argue we're nearly there.

codeulike 11 days ago

There are similar statements from other people in the computing field around that time

https://www.sciencebase.com/science-blog/predictive-text-dar...

Sir Charles Darwin (grandson of the naturalist) who was head of the UK’s computer research centre, the NPL (National Physical Laboratory) said in 1946:

“it is very possible that … one machine would suffice to solve all the problems that are demanded of it from the whole country”

Douglas Hartree, Mathematician and early UK computer pioneer said in 1950: "We have a computer here in Cambridge, one in Manchester and one at the [NPL]. I suppose there ought to be one in Scotland, but that's about all."

This 1969 talk by Lord Bowden about the 1950s explains the thinking behind that statement:

https://www.chilton-computing.org.uk/acl/literature/reports/...

I went to see Professor Douglas Hartree, who had built the first differential analysers in England and had more experience in using these very specialised computers than anyone else. He told me that, in his opinion, all the calculations that would ever be needed in this country could be done on the three digital computers which were then being built - one in Cambridge, one in Teddington and one in Manchester. No one else, he said, would ever need machines of their own, or would be able to afford to buy them. He added that machines were exceedingly difficult to use, and could not be trusted to anyone who was not a professional mathematician, and he advised Ferranti to get out of the business and abandon the idea of selling any more. It is amazing how completely wrong a great man can be.

  • rvense 11 days ago

    It's easy to laugh at quotes like this, but they really had a different understanding of the word computer than we do. These machines were in fact huge and required specialists to use and didn't really do much except calculate. It's fairer to say that they were misjudging what computers would become.

    I'm not sure I can say that I would have been able to look at an early 1950's computer and imagine a 1960's DEC or IBM machine under every bank, or whatever, much less desktop publishing or Defender of the Crown...

    • codeulike 11 days ago

      Oh yes, definitely. But it's fascinating because it's looking back at the dawn of a completely new thing in the world, and how hard it was for people to see where it was going to go.

      Interestingly if you look back at Alan Turing's writing around that time, he seemed to have a much better grasp of what a _big_deal_ this was all going to be. It would be amazing to go and fetch Turing with a time machine and bring him to our time. Show him an iPhone, his face on the UK £50 note, and Wikipedia's list of https://en.wikipedia.org/wiki/List_of_openly_LGBT_heads_of_s...

    • philipwhiuk 11 days ago

      > It's easy to laugh at quotes like this, but they really had a different understanding of the word computer than we do. These machines were in fact huge and required specialists to use and didn't really do much except calculate. It's fairer to say that they were misjudging what computers would become.

      Or were they predicting what was yet to come?

      "These machines were in fact huge and required specialists to use"

      Sounds like Azure to me.

bitwize 12 days ago

As I understand it... one of the reasons why the Soviets fell behind in computer technology was because back in the 60s, while Soviet engineers had good designs that were state-of-the-art for the era, the communist economic planners estimated the requirements for computer manufacture to be one per university or government department for a total of maybe a few thousand, while Western manufacturers were getting orders into the tens or hundreds of thousands... and they had to come up with new technologies to produce the machines faster and cheaper in order to keep up, let alone compete with other manufacturers. So the market in the west grew explosively, requiring concomitant growth in innovation, and that put the Soviets on the back foot, requiring them to smuggle in and reverse engineer System/370s, PDPs, etc. in order to stay current.

  • logicalfails 12 days ago

    Any good books or sources on this? I would be interested to read more

    • bschne 12 days ago

      +1, wasn’t aware of this, curious to learn more

  • InvisibleUp 12 days ago

    It also didn’t help matters that Stalin was greatly opposed to cybernetics, resulting in no research done on the topic until 1954, the year after he died. And even then, things didn’t really kick off until 1958.

  • vegabook 12 days ago

    See: intel / iphone

dr_dshiv 11 days ago

I had thought, but perhaps incorrectly, that it was J.C.R. Licklider, when discussing time-sharing computers.

The idea that you could have massive computing centers that were serving world needs…

Sounds ridiculous, but then AWS, Google, Microsoft…

I can’t find it right now, so I’ll have to leave it to others, but his “Man Computer Symbiosis” paper from 1960 is amazing and you probably ought to read it in light of recent tech… https://worrydream.com/refs/Licklider_1960_-_Man-Computer_Sy...

jvandonsel 11 days ago

"I predict that within 10 years, computers will be twice as powerful, ten thousand times larger, and so expensive that only the 5 richest kings of Europe will own them."

     - Professor Frink
metalman 12 days ago

The way things are going this might be an overestimate; what with the possibility of space-based, completely stable, billion-qubit QPUs beaming all our data around with lasers, 3 might do it.

HeyLaughingBoy 12 days ago

I can't imagine any salesperson making a statement like that!

  • bluGill 12 days ago

    Sometimes they will, when they are predicting who might buy and thus how much they need to make to hit their numbers. No salesman wants to have unmeetable sales goals.

    Still rare though.

unyttigfjelltol 12 days ago

IBM confirmed they went into a sales cycle in 1953 expecting to sell five units of their first machine, the IBM 701 Electronic Data Processing Machine. We don't know precisely how or to whom this estimate of five units was conveyed beforehand, but the gist of the statement appears to be accurate.

amelius 12 days ago

"is", not "will only ever be"

  • TheCoelacanth 12 days ago

    The worldwide part is also made up. He wasn't talking about the entire world market, just the companies they tried to sell to on one specific marketing campaign.

ChrisMarshallNY 12 days ago

"Don't believe everything you read on the Internet." -George Washington

js98 12 days ago

This website is unreadable on mobile.