The reason I ask is that they weren't just making "components" like 3dfx or Nvidia at the time; they were making end-to-end workstations. Why is it such a stretch to think that they could have taken on Apple, especially since Apple in the 90s was kind of languishing?
> There are no new machines being produced, and high-end machines are in short supply. Less expensive machines can be had. We do not recommend using eBay to search, as the prices are usually extremely inflated.
Less expensive machines can be had on the non-eBay used market, or have we hit the point where the cheapest way to have such a machine is dedicated emulation like https://dmitry.gr/?r=05.Projects&proj=33.%20LinuxCard ?
There's a parts vendor who charges pretty high prices, and a small, insular community that thinks his prices are just fine and bids everything up to that level as they try to acquire (many) duplicates of systems they already have, which results in a high barrier to entry for people interested in doing things with SGI systems.
Half right. Are you talking about Ian? That happened on one transaction.
Considering how tight knit our community is... I prefer to err on the side of respect, even if I disagree with some people's practices.
There's an SGUG admin I disagree with on this, but we get along anyway because we're both adults. I won't pretend I've been perfect, but quitting the booze in the quantity I was doing helped a bunch.
If we can get an FPGA core of the Indy's XL/24 board and helper chips, you could buy a cheap pack of R5Ks on Alibaba (last I checked) and probably make a smol SGI board.
The main box says "Tech-Pubs.net, or TechPubs, is a public wiki cataloging the hardware of the former Silicon Graphics Corporation." but you have so much more content than a catalog of hardware.
Do you plan to expand to other things that SGI produced? Publications? Swag?
* If you can find ways to collect and preserve copies of the software, that can be big value. Every version and variation of a title has sometimes been important after the fact. Maybe use archive.org for historical archiving of software with unclear licensing status, and link to it from your wiki, with your wiki providing the background text and organizing that isn't archive.org's strong suit. (I'm not talking about piracy, but just trying to ensure that any copy at all survives, which had been a real problem on some other platforms of this era. Also, it's easier to get a company to say that such-and-such software from a company three acquisitions ago is OK for people to run on vintage boxes and in emulators and museums, than to ask them to find and provide a working copy, which they usually cannot.)
* You might want to capture provenance/copyright info for uploaded photos, like Wikipedia kinda does. Easier to capture it at upload/linking time, than to try to reconstruct it later.
I obtain copyright permission for all original images before uploading and give people clear contacts to do it. I refuse to install extensions that could clobber the setup (many aren't compatible with Postgres). I also make an attempt to contact rightsholders. The only stuff I don't have permission for is a few system image placeholders and the logos of companies. The SGI trademarks are basically gone though; HPE ain't using them.
> If you can find ways to collect and preserve copies of the software, that can be big value. [...]
Over at IRIXNet we do have some files and such, preserving what's not likely to get us in trouble, but I choose not to deal with copyrighted works on tech-pubs (which, unlike IRIXNet, is my pet project, not a part of IRIXNet).
Well, for me, it's just not feasible to take the gambles archive.org and others do. I am a professional locksmith IRL, and "piracy" could lead to me losing my livelihood, as my license could be revoked. I also have only so many hours in a day to keep stuff running. You think I need more mailboxes to be checked, etc.?
Plus, nowadays, DMCA rightsholders will bypass you, going straight to the host provider, who will lock you down for god knows how long until someone gets back to you.
I am not like Peter Plank, who ran nekochan.net. I have professional infrastructure, paid developers and a few sysadmins who help me run it all. It costs me nearly a grand a year to keep the lights on.
Is it worth the risk? Honestly, it's not! The only things I host are IRIX install media, and HPE more or less said informally/noncommittally, "don't sell it and we're not gonna care!"
How about adding information about IDO, like the static recompilation project being used in N64 game decompilation projects? It enables compiling code in a Linux environment and having it match 1:1, as if it were compiled by the original compiler. There's even significant progress in decompiling the entire codebase as well.
If you don't tell people where, how are we gonna fix it? I found ONE broken image. One. It took me hours to find it. Please don't just vaguely tell me one image somewhere is broken. Please be specific next time!
That language isn't necessary. For me the site has always loaded with no images, so I thought it was obvious; apologies if that's not what you or others have seen.
Well, that's a "let's get details to replicate the issue" level of problem. Clearly, images are supposed to load. Give me details to chew on. I mean, if you're on HN, you know better than to report at an "it's broken" level of detail.
I guess I sort of get it, but I always kind of wondered why Silicon Graphics never made a "prosumer" computer. With the (kind of) exception of the Nintendo 64, it seems like most of the Silicon Graphics machines were tens of thousands of dollars in the 90's.
I am curious what the world would be like now if Silicon Graphics had made a model that was less than $2,000, but still ran Irix and had some amount of the 3D processing stuff with it. For that matter, I sometimes wonder what it would be like if Nintendo had released kits for the N64 that let you use it more as a computer.
I'm just thinking that it's all about timing; if they had released something in 1997 for "prosumers", before OS X came out, would Apple have its same market position now? Would we all be using SGiPhones? Would every tech startup get all their engineers little SGI laptops?
The SGI Indy was their low-end line, with a price starting around the $5K range, which was much cheaper than the Indigo2 line. That was about the same price range as a Dell Dimension Pro150 at the time.
They went on an acquisition binge in the mid 90's, buying Cray, Alias, Wavefront and Intergraph, just when PC industry buses were starting to catch up, and then they announced that they were dumping MIPS and moving to Itanium....
But the sub-$2K price wasn't really possible. Remember, even the 256K-cache version of the Pentium Pro was more than $1,000 for the CPU alone in the mid 90's.
The UNIX Wars, PC improvements, NT optimism, the Itanium mess and directionless M&A killed a lot of companies: DEC, Novell, SGI, etc.
Dell targeted that high-end consumer market, focusing on enthusiasts, to avoid the higher support costs and low profits of the home market.
As someone who was running SGIs at the turn of the century in the movie industry... nothing about SGI really promoted brand loyalty; it was a tool to run software that you needed. Once Maya was ported to Linux, you were far better off running a cluster than buying more expensive SGI units.
The good old days almost never were. I still remember having to re-insert the same CD what felt like dozens of times for even a simple IRIX install. I also remember that the remote install required bootp, tftp, and password-less root rsh(!) on the server... so you ended up using the CDs to install.
That said, I do miss CXFS and CXVM, their clustered filesystem and volume manager.
Yeah that's fair, as I mentioned, I wasn't really doing anything with Silicon Graphics in the 90s, so I'm only able to look back at this stuff from a kind of historical perspective, and it's easy for these things to kind of feel more legendary than they actually were as a result.
$2,000 might have been a bit ambitious, but sub-$3,000 does seem like it was achievable. Still, it does make sense that Dell kind of swooped in and took that market.
You know, it's weird, because I always kind of considered Linux a bit of a niche geeky thing. I run it, I like it, but I always sort of felt like I was the weird one, and I think for consumers I am. But reading about this stuff, it looks like Linux caught on in the workstation space pretty quickly. I noticed that the expensive professional video editing software has been available on Linux for a while; it's just the consumer and prosumer side that struggled, and as you mentioned, Maya has had a Linux port for quite a while.
While way too complex to accurately flesh out here, Linux took off mainly because the GNU userland had been in progress for almost a decade before and was ready, while the legal issues and the rift around 386BSD caused the BSDs to miss the window of common CD-ROM drives and the commercial internet.
But yes, I remember being forced to do some silly radio show in ~1997-8 where I was brought in to talk about Linux, as I had saved the guy who brought me in a lot of money setting up DNS/Sendmail/POP on PCs vs the DECstations he was using.
Harley Hahn was the other person on the show, and despite NASA and others using it, he was insistent that Linux would never replace commercial Unix.
As I was maintaining OSF/1, SunOS, Irix, Interactive, SCO, AIX, AT&T, and Coherent UNIXes at that point, along with systems like MPE/iX, OS/400, OS/2, NT, NetWare, etc. day to day, it was quite obvious how much time and money it saved.
IIRC the SCO single-user desktop license w/networking was ~$800 and you had to pay another $600 to get the development kit... without updates.
Obviously the huge bump in popularity came when the LAMP stack took off.
> it looks like Linux caught on in workstation space pretty quickly.
At least for film and visual effects, this is because the industry was previously dominated by SGI and most major studios had pipelines built on IRIX.
When NT systems became cost competitive, those studios were reluctant to switch even though the hardware was compelling, due to major challenges with the OS switch. However once Linux came along, it was easy since the OS was so similar. Cheaper hardware, same-ish OS. Done deal.
Even a sub-$3k SGI would have been extremely difficult to pull off. PC manufacturers, even ones making higher end workstations, benefitted from Intel's economies of scale. Even if they were ordering custom motherboards they were picking a lot of components out of a catalog.
SGI machines had a lot more bespoke components. Even if they had chased higher volumes, they would have had far smaller scale than the PC space. And how would SGI have chased volume even with a "cheap" option? They had no experience with retail sales. None of their reps would have tried to sell machines with lower price tags and sacrificed a commission.
The Indy (at $5k to start) was derided as an "Indigo without the go". I don't think SGI would have had any success building their own take on the PowerMac 4400. If an Indy couldn't make SGI customers happy there would be no way for a lower spec machine to do so.
Also, keep in mind that an Intel-based system *competitive with an SGI system* was still *much* more than $2K to $3K.
You cannot simply compare the price of a no-name corner shop parts bin PC with something designed to the same standards as an actual workstation and declare that the latter "should" cost the same as the former.
I had an Indy on my desk when it came out, with a 21" CRT monitor. I think they thought that's what the Indy was. I'd still go downstairs to another lab to play with the NeXT cube that was there.
It was the culture of the time -- they didn't want to move downmarket and seem like PCs, so they (and their competitors) kept the workstation class a bit more premium and hoped to get enough business/tech purchases from the likes of academia and entertainment. They weren't really trying to sell to devs so much.
I am sure there was a whole bunch of factors, but, at least from my lay perspective, it seems like a recurring theme was that these companies kind of underestimated how good even relatively cheap Intel CPUs were going to get in the 90's. That made it so even consumers could afford a pretty powerful computer, or at least one powerful enough to be "useful", and then it becomes less obvious why you'd spend $5,000-15,000+ on a fancy workstation. If I can get 80% of the value of an SGI with just a decent Pentium and a bit of extra RAM, for 1/4 the cost, most people are going to go with that.
Of course, I don't know what I'm talking about; I'm confident a lot of people on HN know more about this than I do. This is just what a Wikipedia-level understanding seems to indicate to me.
Still, I do like to think about it. A part of me thinks, and I have no way to confirm this, that there might have been bigger ambitions for the N64, to convert it into a "real" computer, so you could have a "real" SGI machine at home (though obviously less powerful than an Indy or something).
I got a lot of flak from leadership at the lab where I worked in '97 and '98 when I bought a dual 200MHz Pentium Pro and later a dual 400MHz Pentium II machine that ran Linux. They asked why I didn't buy HP or SGI desktops. I pointed out that the grant I had - for $5K - wasn't really enough to buy a single machine from HP or SGI (our lab had the cheapest SGI Indy, which IIRC was $10K), while for that amount of money I could get dual-processor experience and more RAM/disk than the lowest-end HP or SGI. I ported many codes over from SGI/Tru64 to Linux on those machines.
Later I got a $15K budget to build a cluster, and again was asked why I didn't just buy a single SGI machine. I pointed out I was able to get six PCs, each of which was 50% the speed of a single SGI machine, and I could cluster them to get experience running MPI jobs. 10 years later, all the labs had retired their non-Linux machines and were running various white box PCs or HP servers running Linux, and I had 10+ years of experience running HPC jobs on Linux clusters.
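For readers who never touched those clusters, a minimal MPI program in C gives the flavor of the jobs involved (a sketch only, assuming an MPI implementation such as MPICH or Open MPI is installed; the partial-sum computation is just a placeholder workload):

    /* Minimal MPI sketch: each rank computes a partial value, rank 0 gathers the total. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);

        int rank, size;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* which process am I */
        MPI_Comm_size(MPI_COMM_WORLD, &size);   /* how many processes in the job */

        long local = rank + 1, total = 0;
        MPI_Reduce(&local, &total, 1, MPI_LONG, MPI_SUM, 0, MPI_COMM_WORLD);

        if (rank == 0)
            printf("%d ranks, sum of (rank+1) = %ld\n", size, total);

        MPI_Finalize();
        return 0;
    }

You'd compile it with mpicc and launch it across the boxes with something like "mpirun -np 6 ./a.out", which is essentially what running jobs on a six-PC cluster looked like.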
If you had a very large budget you could get high-end SGIs that had custom capabilities that no PC at the time could match but even then I specifically chose to not need those capabilities until they were commodity.
One thing about this period that kinda annoyed me in the tech press is that it always felt like these companies were making new/better computers for "their existing customers", as if they were only ever competing against their own older products.
Another thing, which perhaps "grinds my gears" a lot more, is that this late-90's/early-00's shift to PCs happened before Linux was sufficiently taken seriously. So lots of high-end applications that started on UNIX migrated over to Windows NT. And once you're firmly on Windows, it's much harder to go to Linux. (Whereas commercial UNIX to Linux is easy.)
So now there are whole markets (that used to support commercial UNIX) where Linux users get the middle finger, and as someone who hates Windows, this really ticks me off.
Don't forget that Microsoft did an awful lot of non-technical work to "facilitate" the porting of large workstation applications to Windows NT.
If you had a large-scale workstation application, Microsoft would assign you a relationship manager whose job was to convince you to port your software to Windows NT. In addition to wining & dining your execs, they would provide lots of engineering resources: free Windows, developer tools, and documentation licenses; direct access to Windows engineering teams to help with issues you ran into doing your ports (for which they'd strongly push you to use native APIs); assistance with choosing and deploying hardware; and sometimes even free hardware and on-site "sales engineers" for a time to work through initial bring-up. This type of love-bombing never lasted forever, but by the point Microsoft started guiding a vendor towards a more "normal" relationship, the vendor usually had the revenue to justify that normalization.
Microsoft used the same strategy to get people to port their DOS and then Macintosh applications to Windows, to port their games to Microsoft's platforms (whether from DOS to Windows or from other consoles to Xbox and PC), to port their client/server applications and then web applications to Windows Server, and so on, and it has *always* been extremely effective.
Wonder why other companies didn’t do the same.
I recall the snobbery of the other companies who thought you should go to them hat in hand for the pleasure of giving them gobs of money.
I hate Windows now because I'm comparing it to modern Linux and macOS, but in the 90's wasn't Windows NT pretty competitive with some of the commercial Unix offerings? I thought it had some pretty cool stuff with regard to non-blocking I/O, and I think NTFS was well ahead of most of the Unix filesystems of the time?
I wasn't really writing code in the 90s, so it's tough for me to say with confidence, but I thought I read somewhere that Windows NT was, in some regards, objectively better than most of its competition in the 90's.
The NT kernel was designed around non-blocking async I/O; all I/O in NT is async and leverages completion ports. I believe Solaris is the only other OS with IOCP.
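For anyone who hasn't used the API, here is a minimal sketch of the completion-port pattern in C against the Win32 API (error handling trimmed; "input.dat" is just a hypothetical file name):

    #include <windows.h>
    #include <stdio.h>

    int main(void) {
        /* Open the file for overlapped (asynchronous) I/O. */
        HANDLE file = CreateFileA("input.dat", GENERIC_READ, FILE_SHARE_READ, NULL,
                                  OPEN_EXISTING, FILE_FLAG_OVERLAPPED, NULL);

        /* Create a completion port and associate the file handle with it (key = 1). */
        HANDLE iocp = CreateIoCompletionPort(file, NULL, 1, 0);

        char buf[4096];
        OVERLAPPED ov = {0};                         /* read from offset 0 */
        ReadFile(file, buf, sizeof buf, NULL, &ov);  /* returns right away (usually ERROR_IO_PENDING); the I/O runs in the background */

        /* Block until the kernel posts the completion to the port. */
        DWORD bytes; ULONG_PTR key; OVERLAPPED *done;
        GetQueuedCompletionStatus(iocp, &bytes, &key, &done, INFINITE);
        printf("read %lu bytes\n", (unsigned long)bytes);

        CloseHandle(iocp);
        CloseHandle(file);
        return 0;
    }

In a real server a pool of worker threads would sit in GetQueuedCompletionStatus, which is what made the model scale so well.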
Back then, Linux had either a primitive or an O(n) scheduler until 2003; NT shipped with an O(1) scheduler.
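The O(1) trick is roughly: keep one ready queue per priority level plus a bitmap of non-empty queues, so picking the next task is a constant-time bit scan instead of a walk over every runnable task. A toy sketch of that idea in C (purely illustrative, not NT's or Linux's actual code):

    #include <stdint.h>
    #include <stddef.h>

    #define NPRIO 64

    struct task { struct task *next; int prio; };   /* prio 0 = highest priority */

    static struct task *queues[NPRIO];   /* one ready list per priority        */
    static uint64_t ready_bitmap;        /* bit i set => queues[i] is non-empty */

    void enqueue(struct task *t) {
        t->next = queues[t->prio];
        queues[t->prio] = t;
        ready_bitmap |= 1ULL << t->prio;
    }

    struct task *pick_next(void) {
        if (!ready_bitmap) return NULL;            /* nothing runnable */
        int prio = __builtin_ctzll(ready_bitmap);  /* lowest set bit = best priority (GCC/Clang builtin) */
        struct task *t = queues[prio];
        queues[prio] = t->next;
        if (!queues[prio]) ready_bitmap &= ~(1ULL << prio);
        return t;
    }

pick_next does the same amount of work whether there are 3 runnable tasks or 3,000, which is the whole point.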
I personally feel NT still does high-pressure memory management better than Linux. No opinion about other OSes, although macOS will ask the user what to force quit -- then again, given that people have been seeing memory ballooning in random apps, including OOTB apps on macOS 15... Apple has other issues.
NT is great. It's just plagued by Win32. And those designers...
https://users.soe.ucsc.edu/~scott/courses/Spring01/111/slide...
> In 1996, more NT server licenses were sold than UNIX licenses
This is pretty much correct. Windows NT was far better than DOS and feature competitive with the Unices of the time, along with being available on more modest hardware.
The Unix wars were also raging, and compatibility between Unices still hadn't been sorted out (and arguably never was) so a company with workstation class software had to port their code between mostly compatible operating systems and wildly incompatible GUI frameworks. So shipping an NT product wasn't the big deal that it seemed.
That makes sense; even within the Linux world in the year of our lord 2025, binary compatibility is still kind of an issue. I have had issues getting regular Linux binaries working in NixOS [1], and even getting stuff working between Ubuntu and OpenSUSE and Fedora can be a pain.
It totally makes sense why developers would see Windows NT as the future here; it gave you most of the features you'd want from Unix-land (and I think some new stuff too that wasn't available in Unix?), and having one "platform to rule them all" is appealing to most developers for obvious reasons. It doesn't hurt that Win32 isn't too hard to code against (at least it wasn't when I played with it 15 years ago).
I do find it a bit strange that there wasn't really a "de facto" Unix that people coded against, like a clear winner that people liked, but I guess if the hardware was too expensive to run it, that's going to cut down on usage; we didn't really get "standard Unix" until OS X.
[1] https://news.ycombinator.com/item?id=42703720
The open source Unixes were tied up in license wars right at the moment Linux became stable.
It was fairly competitive, but it also took quite a while to build up a full set of applications. Real NT didn’t run a lot of older Windows apps. And the hardware it ran on didn’t scale vertically in the server market like the UNIX systems. So it took some time for the software to fully arrive and some time for the hardware to get fast and big. And the UNIX vendors failed to see that the initial disadvantages were just that: initial. With the price of PC hardware behind them, Microsoft overcame those disadvantages quickly. Had a real UNIX been available on the same hardware and had a similarly aggressive and well-financed vendor behind it, there might have been a chance. But none of the UNIX vendors took that market seriously. We shouldn’t forget, though, that MS hedged their bets on PC hardware and ported NT to DEC Alpha. So even they did not see the inevitability of the rise of PC hardware.
I feel there was a real mental block (that Linux partially helped destroy, mind you) that "real computing" couldn't be done on "commodity chips".
And then the Internet blew everything up and out, and Linux machines were cheap and capable of web serving just as well as any workstation.
They sort of tried with a machine running Windows.
The SGI Visual Workstation was meant to be that. It was quite a bit more expensive than $2K though.
https://en.wikipedia.org/wiki/SGI_Visual_Workstation
Where I worked in 1999 we got one. They had a neat unified memory architecture. For some video tasks they were fantastic.
But the issue was that Intergraph had better machines for PC 3D.
https://en.wikipedia.org/wiki/Intergraph
Then Intergraph got eaten by Nvidia, which leveraged the huge demand for gaming hardware to improve their GPUs again and again.
Also Linux on x86 with Nvidia became an attractive proposition for CAD and some 3D modelling.
> I am curious what the world would be like now if Silicon Graphics had made a model that was less than $2,000, but still ran Irix and had some amount of the 3D processing stuff with it.
I'm not sure it would have made much of a difference as the low end of the market was driven by gaming and multimedia (i.e. movies on PC screen), and the high end of the graphics market was driven by rendering for expensive video and film production. In short, what you needed to play Fallout (1997 release) at home was an order of magnitude less than you needed to render CGI for Men in Black (1997 release).
> if they had released something in 1997 for "prosumers", before OS X came out, would Apple have its same market position now
I believe Apple would have been just as successful as they are today. Software availability (e.g. Adobe and Microsoft Office) and ease of use were very good on the Mac, and not so good on workstations like SGI. There was a very small, very demanding market for SGI's workstations (rendering super high quality graphics), and there was a huge market for what Apple supplied (business, print, web graphics, music production, video production). In short, the software ecosystem wouldn't have happened. Apple did a great job when OS X came out of making it easy-ish for app developers to port to OS X, and allowed users to run their old Mac software on their new OS X powered Mac.
> Would we all be using SGiPhones?
Probably not. System V Unix was very expensive to license (hence Android being Linux based and iOS being based on BSD and Mach) and would have added considerable cost to each mobile device based on licensing at the time. A lot of what made it possible to package up a modern smartphone was open source software + low cost components with ridiculous capability (for their cost). None of this was of interest to SGI, where the focus was on high-end equipment with little commodity appeal.
I agree about the software availability, but I'll just note that Adobe Photoshop and Illustrator were ported from classic Mac OS to Irix with a Unix porting toolkit. Adobe FrameMaker was also available.
I spent years thinking about this at the time, and wrote up a crystallized, super-long "why?" in an SGI retrospective comment on an earlier thread, https://news.ycombinator.com/item?id=39960660 , which you might find of interest.
Feedback from SGI insiders or pointers to HBR case studies welcome.
Application compatibility mattered a lot back then. There was no Microsoft Word or Excel for IRIX and Google Docs / Google Sheets didn't exist, for example. Many of the prosumers wouldn't have been able to use it as a daily driver. I recall that Adobe ported Photoshop to IRIX at one point, but I think it was gone by '97.
I wonder how much of this is a chicken-egg problem though; IRIX didn't have enough of a user base to justify porting over a lot of these applications, and because they didn't have these applications it couldn't develop a user base big enough, etc.
I think what you're saying is totally correct: if you're going to be spending a lot of money on a computer, you need to make sure it can actually do the stuff you need it to, and in the 90's that does more or less imply Microsoft Office compatibility.
Still, I do kind of wonder if SGI had actually tried to penetrate this market, that maybe they could have made this work. They could have licensed and ported some of these applications over themselves, and if they had gotten big enough then maybe some of these companies would have ported these things over.
It's tough to say.
Well, they tried producing high-end Windows NT machines at one point, which would have run Office, had OpenGL support, etc. But no one bought them. So it's not clear that doing the same but with Irix would have gone any better.
The last-gasp part of SGI I would have liked to have seen develop was Cellular IRIX. If I recall, it was a sort of distributed OS where the scheduler was node-aware and could parcel out tasks within a single process to other nodes. What NFS did for the local/remote filesystem split, Irix would do for compute. It never came to fruition.
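To make "node-aware" concrete, a toy placement function in C might look like the following (purely illustrative, not Cellular IRIX code; the penalty weight is arbitrary): prefer the least-loaded node, with a penalty for nodes that don't already hold the task's memory.

    #include <stdio.h>

    struct node { int id; int load; int has_pages; };

    /* Pick the node with the lowest cost, treating remote memory as extra latency. */
    int place(const struct node *nodes, int n) {
        int best = 0;
        for (int i = 1; i < n; i++) {
            int cost_i    = nodes[i].load    + (nodes[i].has_pages    ? 0 : 4);
            int cost_best = nodes[best].load + (nodes[best].has_pages ? 0 : 4);
            if (cost_i < cost_best) best = i;
        }
        return nodes[best].id;
    }

    int main(void) {
        struct node cluster[] = { {0, 5, 1}, {1, 2, 0}, {2, 3, 1} };
        printf("run thread on node %d\n", place(cluster, 3));
        return 0;
    }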
Archive.org has a relatively detailed Tech Report from SGI on Cellular Irix.
https://web.archive.org/web/19970706222424/http://www.sgi.co...
That sounds pretty cool, I guess the idea was that the scheduler could figure out smartly where to put applications and minimize latency in the process?
Yes, much like you would in an HPC system (much of Cellular IRIX was supposed to have been influenced by what Cray and SGI learned from each other when they merged).
More details (PDF) here. Remember this is from 1998. Reading it again in 2024, there is of course quite a bit of foreshadowing of what we ended up with in the cloud. And you won't be surprised to learn that the author ended up spending significant time working at both Microsoft and AWS after their SGI days.
https://cug.org/5-publications/proceedings_attendee_lists/19...
Closest thing I can think of is this: https://en.wikipedia.org/wiki/SGI_Visual_Workstation It had some custom hardware choices different from typical PCs of the day (IIRC it had a crossbar instead of a system bus), and it needed a special HAL to boot NT; Windows 2000 was the last version that supported the machine. It was extremely expensive for its capabilities, and everybody in my lab looked at it and concluded "SGI is dead."
The Innovator's Dilemma explains this, but it was not yet well known in the late 90s. In short, companies were unable to give up high profit margins, which led to their doom.
That's kind of what they were trying to do with the 320 and 540.
I worked on their e-commerce store, you could configure and buy and ship one from the web site in 1999-2000.
The prices were still really high.
They did. It was called Moosehead (the O2). It replaced the Indy, and I think it was the last of the MIPS boxen before they became a PC integrator.
> why Silicon Graphics never made a "prosumer" computer
They were stuck in a hopeless corporate mindset where their customers were all deep-pocketed mega-corps. You had to go through a sales process to buy one; there was no demo machine at the local computer shop. They also had a brain-damaged sales team that would do things like under-spec hardware with too little RAM to make a sale, leaving customers with costly hardware that struggled under load and damaged their reputation (e.g. imagine a classroom full of under-specced machines leaving students thinking "this POS made Jurassic Park?").
> if Silicon Graphics had made a model that was less than $2,000, but still ran Irix and had some amount of the 3D processing stuff with it.
They would look exactly like the Indy or the later O2, which were their entry-level machines.
> I sometimes wonder what it would be like if Nintendo had released kits for the N64 that let you use it more as a computer.
I think SD TVs of the time would have made rendering text a bit difficult, forcing you to use large fonts and making any substantial text work impractical (e.g. word processing or code editing). However, a better N64 fantasy IMO would be to include an optical drive in addition to the cart slot, double the RAM, increase the pitiful 4 KB texture cache, then for good measure add an N64-themed keyboard, mouse and Ethernet adapter ;-).
> I'm just thinking that it's all about timing; if they had released something in 1997 for "prosumers",
That line would be the SGI Visual Workstations. The first two, the 320 and 540, had custom motherboards with an SGI chipset, RAM and GPU. The GPU used unified memory, and you needed to buy expensive custom SGI RAM DIMMs. They only ever ran Windows NT 4.0 and 2000 because of the GPU drivers. Then there were the 230, 330, and 550 - straight-up ATX Wintel boxes with Nvidia cards. My main desktop is a 550 case with a Threadripper in it. But again, IMO, they had no clue how to sell to small shops and individuals, and by then it was too little, too late.
> I think SD TVs of the time would have made rendering text a bit difficult, forcing you to use large fonts and making any substantial text work impractical
Yeah, in this hypothetical conversion kit they'd have to give you some kind of better connection to plug into a monitor, obviously, maybe a double-expansion-pak, and maybe some kind of external optical drive, something similar to the N64 DD but with a CD drive.
I dunno, I don't think it would necessarily have failed, but we'll obviously never really know. Sony did that with the OtherOS stuff for a while on the PS3, and I think there was a subset of people who really liked it for scientific computing.
They could have been NVidia (at best, more realistically 3dfx), but certainly not Apple.
Back then, when SGI quit the graphics business and went to supercomputers only (having NUMA as IP, and going with Linux), the GPU devs each ended up at Nvidia.
Apple had to pick Be, Inc. or NeXT (who had Jobs). They went with the latter. Be, later on, went to Palm, but that was too little, too late.
SGI was one of the many Unix mastodons that fell victim to PC hardware (x86-32, then x86-64) and Windows NT, with Linux being good enough, and with Apple taking the piece for graphic designers and audio engineers (many of whom came from Amiga and SGI). Oh, and I forgot to mention: they bet on the Intel Itanic.
I wouldn't pay a dime for these nowadays. Way too slow and inefficient in performance per watt. Which is really sad, but with current energy prices I just wouldn't be able to afford my Octane 2 using 1 kW anymore. I owned many SGI machines and really loved them (the sound card in the Indy was insanely good! And the IndyCam, from a time when hardly anyone had a webcam): Indy (various, including the Challenge S), Indigo (including the purple remake), Indigo 2, and Octane 2. But never an O2 :) nor a Fuel or Tezro ;) The cases, although plastic and sensitive to scratches, were aesthetically great, too. While I would not want such a machine anymore, I will cherish the memories! Plus, there are some people reusing the cases. Cool beans! IMNSHO, SGI machines like the Indy belong in any half-decent Unix museum.
At a previous apartment, I managed to convince my wife we needed a giant server rack, so I started looking on Craigslist and eBay for one.
I really wanted an SGI server rack because I thought it looked cool, and because I found them historically very interesting, but sadly those appear to have a bit of a collectors' market and are too expensive. Instead, I ended up buying a Sun Microsystems rack, which to be fair is also pretty cool historically.
I would love to buy a broken SGI machine and put some modern hardware in there, just for the aesthetic. Maybe someday.
The old Sun Ultra 24s make cool-looking PC cases as well.
> I would love to buy a broken SGI machine and put some modern hardware in there, just for the aesthetic. Maybe someday.
The closest you can easily get is the 230/330/550 cases, which are straight-up ATX. I have a 550 case with a Threadripper in it. The only issue was I had to modify the 750W PSU to fit in the case by replacing the IEC inlet with a short pigtail cord.
Would you mind sharing a picture?
Unfortunately I can't find the pics of the build at the moment. It's a stock SGI 550 case, so nothing special was done to the case. I replaced the dual Pentium III Xeon board with an Asus Threadripper board with a 1920X CPU, 32GB ECC RAM, Radeon Pro W5700, 1TB NVMe, and a 10Gb NIC. It's my daily driver running Void Linux (musl).
The mod to the PSU was needed because the IEC inlet was practically at the edge of the PSU housing, which is partly obstructed by the edge of the SGI case. I could have modded the case by cutting the steel, but I didn't want to cut up a unique case when the PSU is easier to replace. So I opened the supply, unscrewed the IEC inlet, de-soldered the leads from the inlet to the board, then soldered in a length of 18 AWG cord. I covered the IEC inlet opening with a piece of 16 gauge (~1.2mm) aluminum with a 16mm hole for a cable gland and two screw holes for mounting with machine screws and nuts. The cord is secured by a metal cable gland and terminates in an in-line IEC socket. NOTE: I did replace the X and Y caps across the factory inlet with new ones soldered to the PSU board along with the cord.
Case swapping is wrong on these.
Genuine question...why not?
The reason I ask: it's not like they were just making "components" like 3dfx or Nvidia at the time; they were making end-to-end workstations. Why is it such a stretch to think that they could have taken on Apple, especially since Apple in the 90s was kind of languishing?
It was hard enough for Apple to take on Apple.
NeXT basically tried, and only survived by being acquired.
If they had wanted to survive, they needed to buy Dell at the right time.
From https://www.tech-pubs.net/wiki/IRIX_101 -
> Why are machines that run IRIX expensive?
> There are no new machines being produced, and high-end machines are in short supply. Less expensive machines can be had. We do not recommend using eBay to search, as the prices are usually extremely inflated.
Less expensive machines can be had on the non-eBay used market, or have we hit the point where the cheapest way to have such a machine is dedicated emulation like https://dmitry.gr/?r=05.Projects&proj=33.%20LinuxCard ?
There's a parts vendor who charges pretty high prices, and a small, insular community that thinks his prices are just fine and bids everything up to that level as they try to acquire (many) duplicates of systems they already have, which results in a high barrier to entry for people interested in doing things with SGI systems.
Half right. Are you talking about Ian? That happened on one transaction.
Considering how tight knit our community is... I prefer to err on the side of respect, even if I disagree with some people's practices.
There's an SGUG admin I disagree with on this, but we get along anyway because we're both adults. I won't pretend I've been perfect, but quitting the booze in the quantity I was doing helped a bunch.
If we can get an FPGA core of the Indy's XL/24 board and helper chips, you can buy a cheap pack of R5Ks on Alibaba (last I checked) and probably make a smol SGI board.
Thank you very much for this.
The main box says "Tech-Pubs.net, or TechPubs, is a public wiki cataloging the hardware of the former Silicon Graphics Corporation," but you have so much more content than a catalog of hardware.
Do you plan to expand to other things that SGI produced? Publications? Swag?
Nice. Couple comments:
* If you can find ways to collect and preserve copies of the software, that can be of big value. Every version and variation of a title has sometimes turned out to be important after the fact. Maybe use archive.org for historical archiving of software with unclear licensing status, and link to it from your wiki, with your wiki providing the background text and organization that isn't archive.org's strong suit. (I'm not talking about piracy, but just trying to ensure that any copy at all survives, which has been a real problem on some other platforms of this era. Also, it's easier to get a company to say that such-and-such software from a company three acquisitions ago is OK for people to run on vintage boxes and in emulators and museums, than to ask them to find and provide a working copy, which they usually cannot.)
* You might want to capture provenance/copyright info for uploaded photos, like Wikipedia kinda does. It's easier to capture it at upload/linking time than to try to reconstruct it later.
I obtain copyright permission for all original images before uploading and give people clear contacts to do it. I refuse to install extensions that could clobber the setup (many aren't compatible with Postgres). I also make an attempt to contact rightsholders. The only stuff I don't have permission for is a few system image placeholders and the logos of companies. The SGI trademarks are basically gone, though; HPE ain't using them.
> If you can find ways to collect and preserve copies of the software, that can be of big value. Every version and variation of a title has sometimes turned out to be important after the fact. Maybe use archive.org for historical archiving of software with unclear licensing status, and link to it from your wiki, with your wiki providing the background text and organization that isn't archive.org's strong suit. (I'm not talking about piracy, but just trying to ensure that any copy at all survives, which has been a real problem on some other platforms of this era. Also, it's easier to get a company to say that such-and-such software from a company three acquisitions ago is OK for people to run on vintage boxes and in emulators and museums, than to ask them to find and provide a working copy, which they usually cannot.)
Over at IRIXNet we do have some files and such, preserving what's not likely to get us in trouble, but I choose not to deal with copyrighted works on tech-pubs (which, unlike IRIXNet, is my pet project, not a part of IRIXNet).
Thank you for your work. If you happen to have any thoughts on when it makes sense to collaborate with archive.org, and when not, I'd be curious.
Well, it's just not feasible for me to take the gambles archive.org and others do. I am a professional locksmith IRL, and "piracy" could cost me my livelihood, as my license could be revoked. I also have only so many hours in a day to keep stuff running. Do you think I need more mailboxes to check, etc.?
Plus, nowadays, DMCA rightsholders will bypass you, going straight to the hosting provider, who will lock you down for god knows how long until someone gets back to you.
I am not like Peter Plank, who ran nekochan.net. I have professional infrastructure, paid developers, and a few sysadmins who help me run it all. It costs me nearly a grand a year to keep the lights on.
Is it worth the risk? Honestly, it's not! The only things I host are IRIX install media, and HPE more or less said informally/noncommittally, "don't sell it and we're not gonna care!"
Thank you so much for all your work.
How about adding information about IDO, like the static recompilation project being used in N64 game decompilation projects? It enables compiling code in a Linux environment and having it match 1:1, as if it had been built by the original compiler. There's even significant progress on decompiling IDO's entire codebase as well.
https://github.com/decompals/ido-static-recomp
https://github.com/decompals/ido-matching-decomp
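To make the "matching" idea concrete, here's a minimal sketch of my own (not taken from either repo) of how a match check could be driven once a recompiled IDO toolchain is built: compile a translation unit with the recompiled cc, then byte-compare the emitted object against a known-good reference. The compiler path, flags, and file names are illustrative assumptions, not the projects' actual build scripts:

    #!/usr/bin/env python3
    # Illustrative sketch only: drive a statically recompiled IDO cc on Linux and
    # check whether the emitted object matches a known-good reference byte-for-byte.
    # IDO_CC, CFLAGS, and file names are assumptions for the example.
    import filecmp
    import subprocess
    import sys

    IDO_CC = "./ido/7.1/cc"                       # assumed path to the recompiled IDO cc
    CFLAGS = ["-O2", "-mips2", "-G", "0", "-c"]   # flags in the style used by N64 matching builds

    def compile_and_compare(src: str, out_obj: str, ref_obj: str) -> bool:
        # Run the recompiled compiler natively on Linux.
        subprocess.run([IDO_CC, *CFLAGS, src, "-o", out_obj], check=True)
        # A build "matches" when the produced object is byte-identical to the reference.
        return filecmp.cmp(out_obj, ref_obj, shallow=False)

    if __name__ == "__main__":
        src, out_obj, ref_obj = sys.argv[1:4]
        matched = compile_and_compare(src, out_obj, ref_obj)
        print("MATCH" if matched else "DIFFERS")
        sys.exit(0 if matched else 1)

Real matching builds are driven by per-project makefiles and asset pipelines; this just shows the core compile-and-diff loop.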
As time permits! I'm the only current article writer at this time
Nice job!
Very Important: Are you gonna be adding an entry for the Japan-exclusive set-top box that had its own official port of DOOM done by Jonathan Blow? :p
Link?
CyberKnife ran on these, I heard.
Images broken?
If you don't tell people where, how are we gonna fix it? I found ONE broken image. one. It took me hours to find it. Please don't just vaguely tell me one image somewhere is broken. Please be specific next time!
That language isn't necessary. For me, the site has always loaded with no images, so I thought it was obvious; apologies if that's not what you or others have seen.
Well, that's a "let's get details to replicate the issue" level of problem. Clearly, images are supposed to load. Give me details to chew on. I mean, if you're on HN, you know better than to give an "it's broken" level of detail.
The images are broken for me as well.
How am I gonna fix it if you don't tell me WHERE?
I'm not trying to be a dick. I wanna fix it. I need to know how and where it's a problem!
By the way, the images still don't load for me now, none of them.
Give. Me. Details. To. Replicate. Please.
Browser. Screenshot. URL. OS. I'm on Brave on GNU/Linux and it works.
Where?
I can't believe your site does not include the notorious SGI screwdriver, the mighty SGI #9980915.
http://industrialarithmetic.blogspot.com/2011/04/sgis-finest...
https://www.reddit.com/r/retrobattlestations/comments/fele3v...
My trusted SGI #9980915 sits behind me right now -- it has served me remarkably well all these decades...