Sohcahtoa82 12 days ago

The linked Subpixel Zoo article taught me that Pentile is still actually incredibly popular.

My first Pentile screen was on my Motorola Droid 4 phone, and it was awful. Small text was often impossible to read depending on the color of the text and its background. The massive gaps between colors made solid red, green, or blue areas of the screen look like a checkerboard. It basically had a screen-door effect before VR became mainstream and made "screen-door effect" a household term.

So it came as a surprise to learn that Pentile is still popular and used today. I guess it's just gotten better? Smaller gaps between the subpixels? Maybe higher resolutions and pixel densities hide the weaknesses shown in my Droid 4?

  • mdasen 12 days ago

    Early PenTile displays often had a different arrangement: https://en.wikipedia.org/wiki/PenTile_matrix_family#/media/F...

    You can see that it's Blue, Green, Red, Green along the horizontal/vertical axis, so each red subpixel is separated from the next by two green subpixels and one blue subpixel.

    Modern PenTile displays usually use a triangular layout: https://static1.xdaimages.com/wordpress/wp-content/uploads/w...

    I'm not an expert at text rendering, but it seems like you'd be able to get an RGB subpixel combination a lot closer together with this triangular layout than with the linear one.

    But also, the Droid 4 was simply lower resolution. Apple moved to 326 pixels per inch in 2010 and the Droid 4 was 275 PPI in 2012. So the Droid 4 had poor resolution even for its time, and the PenTile layout would make it even worse by removing a third of the subpixels.

    Today, the Galaxy S25 has 416 PPI and the iPhone 16 has 460 PPI, so modern screens pack dramatically more pixels.
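    The PPI figures above follow directly from panel resolution and diagonal size. A quick sketch (the resolutions are published specs; the exact diagonals, especially the iPhone 4's ~3.54", are approximations for this illustration):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Droid 4: 540x960 on a 4.0" panel -> ~275 PPI
print(round(ppi(540, 960, 4.0)))
# iPhone 4: 640x960 on a ~3.54" panel -> ~326 PPI
print(round(ppi(640, 960, 3.54)))
```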

    Pixel density would have the largest impact, but I think that the triangular layout that modern displays use probably also helps. You talk about the screen door effect and I feel like the triangular layout wouldn't have as much of that issue (but I'm not an expert on sub pixel rendering).

  • wffurr 12 days ago

    >> Maybe higher resolutions and pixel densities hide the weaknesses shown in my Droid 4?

    That's exactly it. Droid 4 resolution was low enough that the subpixel arrangement was clearly visible. Newer displays are dense enough that subpixels aren't visible at all.
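    A back-of-the-envelope check of when subpixel structure stops being visible, assuming the common ~1 arcminute acuity figure for 20/20 vision and an arbitrary 12-inch viewing distance (both figures are assumptions for this sketch):

```python
import math

def pixel_arcmin(ppi, viewing_distance_in):
    """Angle subtended by one pixel, in arcminutes."""
    pitch_in = 1.0 / ppi
    return math.degrees(math.atan(pitch_in / viewing_distance_in)) * 60

# Droid 4 (275 PPI): ~1.04 arcmin at 12" -- right at the acuity limit,
# so pixel structure (and PenTile gaps) can still be resolved.
# iPhone 16 (460 PPI): ~0.62 arcmin -- below the limit.
print(pixel_arcmin(275, 12), pixel_arcmin(460, 12))
```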

tantalor 12 days ago

The snake moves weird because the subpixels aren't square.

I would increase the snake's horizontal speed (per subpixel) relative to vertical, so speed in either direction is the same from the user's perspective in the actual viewport.
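That compensation could be as simple as scaling the per-axis step count by the subpixel aspect ratio. A sketch, assuming a hypothetical RGB-stripe grid where each subpixel is three times taller than it is wide (the ratio and function names are made up for illustration):

```python
# Physical aspect ratio of one subpixel: width / height.
# RGB stripes slice each square pixel into three vertical strips (assumption).
SUBPIXEL_ASPECT = 1 / 3

def steps_per_tick(direction, base_steps=1):
    """Subpixel steps per tick so perceived speed matches on both axes."""
    if direction in ("left", "right"):
        # A horizontal step covers 1/3 the distance, so take 3x as many.
        return round(base_steps / SUBPIXEL_ASPECT)
    return base_steps
```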

mrandish 12 days ago

As an obsessive arcade retro-gamer who custom built a high-end emulation cabinet with a 27-inch quad-sync analog RGB CRT - I approve of this video! As soon as he described running into the green pixels problem, I knew he was going to learn something interesting. Sub-pixel structure, phosphor chromaticity, etc are such a fun rabbit hole to dive down. And it's still highly relevant in today's OLED, QLED, etc displays.

Also, a tip for when you play classic arcade or console retro games from the 80s and 90s: they will look much better (and be more authentic) if you play on a CRT - or, if playing via emulation, just turn on a CRT emulation pixel shader (CRT Royale is good). These pixels are art which the original devs and artists intentionally hand-crafted to use the way CRTs blend colors and scanlines anti-alias naturally. You deserve to see them as they were intended to be seen. Just look at what you've been missing: https://i.redd.it/9fmozdvt6vya1.jpg

  • playa1 12 days ago

    I’m a fan of authentic retro hardware. That sounds like an awesome cabinet, I would love to spend hours and pockets full of quarters playing on it.

    This post caused me to go down a rabbit hole about CRT simulation.

    Looks like it is a thing.

    https://github.com/blurbusters/crt-beam-simulator

    • mrandish 12 days ago

      > That sounds like an awesome cabinet

      Why, yes! Yes it is. :-)

      If it sounds good, you should consider one of your own. My goal was not the retro nostalgia of recreating my parents' shitty 1970s living room TV but instead creating a cabinet that would allow me to explore these games each in their authentically correct resolution and frame rate and at the maximum quality and fidelity possible. That's why I chose an analog RGB monitor and the last, fastest GPU ever made with a native analog RGB signal path (R9 380X). I run a special version of the MAME emulator called GroovyMAME made just to enable technically correct native display via analog signals on a CRT. http://forum.arcadecontrols.com/index.php/board,52.0.html

      I created this cabinet over ten years ago, before emulation pixel shaders were a thing. If you don't want to go to the effort, expense and time of acquiring and calibrating a high-end CRT, good pixel shaders can now get you >98% of the way there much easier and cheaper.

  • azinman2 11 days ago

    I’d love to hear more about your setup. Do you have a blog or anything documenting it?

    • mrandish 11 days ago

      I probably should put a permanent post up somewhere but I'll just give you a recap of info that may matter if you're interested in having your own emulation arcade cabinet. I highly recommend this forum to deep dive because it has sub-forums for controls, monitors, cabinets, etc. http://forum.arcadecontrols.com/

      First, you need to understand your goal in creating a cabinet. Unlike some people, my goal was NOT recreating long-ago nostalgia like playing console games on my parents' living room TV. There's nothing wrong with that but as someone who wrote games professionally back in the 80s and later became a video engineer, I wanted to play these games in their original native resolutions, aspect ratios and frame rates. But my goal went beyond original authenticity to ideal authenticity. Even back in the day, an arcade cabinet's monitor would be left on 100 hours a week in an arcade and after five years be pretty thrashed. The joystick mechanisms would be worn and imprecise. Sometimes arcade manufacturers would cut corners by sourcing an inferior button instead of the top of the line. I had no interest in recreating that. I wanted to create the experience of playing the originals in their most ideal form. How they looked (or would have looked) with a brand new top of the line, period correct monitor perfectly calibrated, and pristine high-quality controls. What the manufacturer would have made in the 80s or 90s with no cost corners cut.

      My cab is based around a 27" Wells Gardner D9200 quad-sync analog RGB CRT. Wells Gardner made high-quality industrial CRTs specifically to go in arcade cabinets (Atari, Sega, Namco, etc). The D9200 is one of the last and best monitors they made and I bought it new from them shortly before they went out of business. It's very flexible as it scans four frequency ranges: 15 kHz, 24 kHz, 31.5 kHz and 38 kHz. This covers very nearly all of the resolutions and frequencies of any CRT raster-based arcade machine ever made by global arcade manufacturers. 38 kHz supports resolutions up to 800x600 non-interlaced, which is what I run my game selection interface in. Scanning to higher frequencies is also nice for running games from some later consoles like Dreamcast which were capable of displaying 480p natively. This lets me run all the classic arcade games in their native resolutions, frame rates and frequencies. No scaling, averaging or interpolation.

      For my CRT to switch between all these frequencies on the fly it must be sent a properly formatted signal. Doing this natively is tricky and requires using a GPU with native analog RGB output. I use the last, fastest native analog RGB GPU - the Radeon R9 380X. The real magic, however, is using special display drivers and a special version of the MAME emulator called GroovyMAME to generate precisely correct horizontal and vertical frequencies matching the game code written for each arcade cabinet's original display hardware. GroovyMAME and the community around it have done remarkable work crucial to accurate historical preservation through precise emulation. Much of their work has now been mainstreamed into MAME, making it more accurate than ever. Dive into that rabbit hole here: http://forum.arcadecontrols.com/index.php/board,52.0.html

      To be clear, my high-end monitor and highly-tuned signal chain probably allow most of these games to look better than the original monitor in the original cabinet. While perfectly authentic, they aren't exactly 'historically accurate' because an average cabinet in an average arcade in the 1980s probably looked worse due to age, use and abuse. However, intentionally degrading original content to look worse to match some historical average jank seems wrong to me. It's true some of the original monitors were connected with composite video, not component. Some of the cabinets had cheap, poorly shielded cables while mine has a double shielded broadcast studio cable with ferrite cores at both ends to eliminate cross-talk and noise. So I'm playing the original game code but presented as the people who made these games would have wanted their own personal cabinet - if they could take one home. However, I draw the line at modern revisionism like AI upscaling or frame gen. Because that's no longer the original pixels and frames in their most ideal form.

      Next is choosing your controls. Fortunately, many of the manufacturers of original arcade cabinet controls are still around like Happ (buttons), Wico (joysticks), etc. My cabinet has controls for two players as well as a trackball for games like Marble Madness and a counter-weighted spinner for games like Tempest. These are all interfaced to the emulation PC in the cabinet through USB control boards made by companies like Ultimarc. Each of the buttons is also backlit by an RGB LED and the colors of each button change to the button color that was on the original cabinet, for example, when playing Joust player 1 is yellow and player 2 is blue. This also indicates which controls are active in each game.

      Selecting games is done via a joystick driven menu. Software to do this is called a frontend and there are a variety ranging from open source to commercial. I use a commercial one called Launchbox because it handles calling various emulators, interfacing with control boards, organizing and maintaining the game library of thousands of titles across a dozen platforms very well. I actually use the BigBox mode of Launchbox which is made for dedicated emulation cabinets. Another nice touch is integrating various databases arcade historians have created. While browsing the game library it's fascinating to read the history of how the game was made, see the original arcade cabinet and the launch advertisement along with the usual game logo, title screen and gameplay video. Linked data like this allows you to follow the evolution of various game types, companies and franchises over time from their origin to their end point.

      CONCLUSION: All of the above is, admittedly, pretty obsessive. If you want a terrific arcade/console emulation cabinet you DO NOT need to do what I did (or even half of it). However, I recommend not just buying a cheap mini cabinet from Costco. To be fair, while the worst cheapies are awful, the best of that class isn't that bad. But you can do much better with just a little more money, thought and care. Things like authentic arcade controls, and rolling your own cheap, used PC will allow you to run a frontend you can add other platforms and games to - and - MOST IMPORTANTLY, run a CRT emulation pixel shader on the output.

      I recently upgraded the PC in my cabinet and bought a used corporate PC on eBay for less than $100 delivered. It's more than fast enough to emulate everything up to PS2 perfectly, and I have no interest in emulating later consoles on a CRT cabinet because that's when games started being written for flat screens.

      I love my CRT but I'm not a purist. CRTs are expensive, hard to maintain and finicky analog gear. As a video engineer I have to admit recent versions of the best CRT emulation shaders like CRT Royale running on a high-end flat screen are very impressive. If I was building my cabinet today, I might go with a very carefully selected, high-end flat screen instead of a CRT. Frankly, the kind of flat screen I'd want might cost more than a very good used CRT, but it would provide some flexibility to do things a CRT can't. And there would be some trade-offs vs my best-ever-made CRT, but engineering is all about trade-offs and there's nothing that's ever going to be perfectly ideal on every dimension someone like me cares about.

      • azinman2 11 days ago

        Thank you for all this. Quite the dedication! How often do you play it?

        • mrandish 10 days ago

          I'll be the first to admit that I may have gone a little overboard in creating my arcade emulation cabinet. But given I started in the industry creating games in the 1980s, hung out in the arcades, owned and repaired arcade machines and, later, went into video engineering - maybe it's not that crazy. Plus, at the time I made this cabinet I still had several original arcade cabinets but needed to make room in my basement arcade for more pinball machines. So I decided to see if I could make "One cabinet to replace them all." And I got sufficiently close.

          When my cabinet was new I played it almost daily for the first year. Now I play it at least once or twice a week but I've had the cabinet for nearly 15 years (and have upgraded the PC and front-end a couple times). However, there are still periods where I play it almost daily. These tend to happen either when I get into deep diving a genre (for example, Japanese shmups) or when there's a significant new title enabled (or fixed) in MAME or another emulator that I find especially interesting.

          An example would be when the unreleased Atari game Marble Madness II was added to MAME. This was an extremely rare unreleased ROM which was unfortunately hoarded by a couple of collectors for many years and considered 'at-risk' from a historical preservation perspective. Once the game ROM finally found its way to being safely archived online, MAME added support for it. To be clear, Marble Madness II was unreleased for good reason (the reason being it sucked). But I love the first Marble Madness and it's a historically significant, influential title (with an awesome soundtrack) so diving into its (wisely) aborted descendant was fascinating. It is, indeed, not at all a good game. What was interesting was exploring the ways it's not good, and most importantly - why. After all, it's based on a now-legendary mega-hit game with an innovative play style and distinctive visuals. That should be pretty hard to screw up. As you might guess, none of the original Marble Madness team were involved in MMII. But, clearly the MMII team played a ton of the original, yet they somehow managed to misunderstand what made it so great.

          Other times, I'll boot up the cabinet because some classic game I never really got into will be featured on a retro-gaming blog or YT channel and that'll pique my interest in exploring the title as well as its precursors and descendants. That's why it's handy to have the full game libraries of every arcade cabinet title, a couple dozen 80s home computers and every game from every home console from the first generation (Atari VCS - 1977) to the sixth generation (Sony PS2/Gamecube - 2001) all browsable by platform, year, manufacturer, play style, genre and rating. When I come across some reference to a game on the Japanese Sharp X68000 computer being a derivative of an earlier game on the Amiga 1000, and both being inspired by a 1982 arcade title - I can play them all back-to-back and compare. I doubt I'll ever not love being able to conveniently play any classic retro-game in a full cabinet with top notch controls, pixel/frame accuracy and maximum fidelity.

  • starfezzy 11 days ago

    Not a fan of the blurry LCD look - it's like someone hacked a .ini to set antialiasing 4x higher than its limit then placed a screen door in front of the monitor.

    I gamed during the transition from CRTs to LCDs. Nobody preferred the "graphics" of a CRT.

    The real downgrade was when gaming shifted from PC to console and killed dedicated servers. We used to pick servers with 10ms latency. Now people think 60-100ms+ is fine.

  • crazygringo 12 days ago

    > You deserve to see them as they were intended to be seen.

    I've never bought that argument, and I grew up playing games on CRTs.

    The reality is that different CRTs had wildly different characteristics in terms of sharpness, color, and other artifacts -- and it also varied tremendously depending on the type of video output/input. Were you hooked up to a TV? A cheap monitor? An expensive monitor?

    About the only thing you can say for sure is that CRTs were blurrier. And the comparison image you provide is completely misleading because the brightnesses are totally different, which suggests that the LCD/LED version isn't using correct gamma. If you used a random CRT her skin tone also had a good chance of turning greenish or whatever, because that's just what CRTs did -- the color artifacts could be atrocious.

    I definitely appreciate wanting to blur the image in an emulator to remove the jaggies, and the retro CRT effects are a cool novelty. But I just don't buy the argument that it's "how the games were intended to be seen", because there was just way too much variation and the screen quality was so low. It's like only listening to music from the 90s on cheap speakers over the roar of the road because it's how the music was "intended to be heard". No it wasn't. It's just the best you had at the time.

    • amiga386 12 days ago

      > I just don't buy the argument that it's "how the games were intended to be seen"

      I do, though.

      At its most extreme, compare how this CGA imagery looks on an NTSC television with the crisp, clean signals that generated it. The demo makers here absolutely intend you to see this via NTSC; it will look like complete trash if you don't.

      https://int10h.org/blog/img/1k16_flowergirl_cga_1024_colors....

      (from https://int10h.org/blog/2015/04/cga-in-1024-colors-new-mode-...)

      This article gives you more examples: https://www.datagubbe.se/crt/

      And it links to this video with yet more examples: https://www.tiktok.com/@mylifeisanrpg/video/7164193246401383...

      There's no mistaking it. The artists on those 80s/90s games worked with the expectation of how it looked on display hardware of the time. The actual pixels, in complete precision on a vastly higher resolution display, look like shit. Or they look "retro".

      • crazygringo 12 days ago

        Your first link is using weird color hacks that maybe could have worked on some specific hardware, but nothing like that was used on any average popular video game of the time as far as I know.

        So that's not an example of how regular video game artists were working, it's an example of how some (current-day?) demoscene people are trying to push specific vintage hardware to its limits.

        And like I said -- you can apply a blur filter (or basic interpolation) to get rid of jaggies, that's totally understandable. The pixels weren't meant to be sharp square blocks, just blobs of color. But a lot of these pages showing how CRTs supposedly looked so much better are doing a lot of cherry-picking -- the reality is that they looked like blurry, color-distorted, wavy, jittery messes just as often. There just wasn't any kind of consistency between dramatically different displays. Artists couldn't plan for some highly specific amount of horizontal smear to look "just right" because there was gigantic variance.

        • amiga386 12 days ago

          > a lot of these pages showing how CRTs supposedly looked so much better

          That's shifting the goalposts. The question is whether old games were intended to be seen on CRTs, or intended to be seen on LCD screens created years later. There's no question, they were intended to be seen on CRTs.

          The pixels were placed by artists who looked at how they rendered on a CRT, and they'd change pixels here and there, do hand dithering, and play with the colour palette until they got what they wanted on the CRT. That was the canvas they painted with. The artists didn't have high-resolution LCD screens.

          And the thesis of all the things I linked was "the artists intended you to see this on a CRT". And yet, people playing games in emulators on modern high-res LCD screens have picked up this unintended visual style and dubbed it "retro", and modern artists have created new art that was intended to be seen on LCD and look "retro" while doing so. They didn't even get a CRT to check how it looks on it.

          Two different sets of artists, with two different intents, separated by time and fashion.

          • weinzierl 12 days ago

            "The question is whether old games were intended to be seen on CRTs, or intended to be seen on LCD screens created years later."

            I think the counter-argument is that they were intended to be seen on CRTs but the differences between CRTs were bigger than the difference between CRT and LCD.

            I don't know if this holds water but I think this is the point some try to make.

            • crazygringo 12 days ago

              Right, it's about the differences between CRTs.

              > The pixels were placed by artists who looked at how they rendered on a CRT, and they'd change pixels here and there, do hand dithering, and play with the colour palette until they got what they wanted on the CRT.

              The issue is, sure they could do that for their CRT. But plug in a different one and the colors are different, the hand-dithering effect looks totally different, etc.

              This stuff was drawn using zoom levels where the pixels really were squares. Obviously the artists looked at the preview to get a rough idea of how it would look blurry, but they also couldn't optimize too much for their particular display. It was more important for the design to be robust across a wide variety of displays, some of which would just look like crap no matter what.

              So I'm saying, if you just apply some blur it's fine. Nobody needs to be emulating the exact characteristics of a CRT to chase down some "artistic intent" that only vintage CRTs provide. Just blurring out the jaggies is really the only thing that was ever consistent across displays, and even that varied greatly.

              • mrandish 11 days ago

                > Nobody needs to be emulating the exact characteristics of a CRT to chase down some "artistic intent" that only vintage CRTs provide.

                While it's true that blurring is one aspect of CRTs, there are multiple different things we're talking about here. Let's get specific. This is an image of an Apple II's composite video output as seen on a naive RGB LCD.

                https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj...

                This is the exact same video output displayed on a CRT (or via a CRT shader properly decoding it).

                https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh...

                This is not just blurring. The colors aren't even there on the naive RGB image. This is because modern displays don't properly decode the pixel pattern data as specified in the RS-170A analog video standard. A CRT shader can do many things including add blur, cross-talk, noise, scanlines, etc. But it ALSO does something else - properly decode the bit patterns in the first image to add the colors in the second image. The bit pattern was put there on purpose by the original artist/dev. Not decoding it properly means the colors are wrong or missing.

                Admittedly, this is an extreme example. Most games shown undecoded in naive RGB still have roughly the right number of pixels, in about the right colors, and in about the right places. So people accept it. But without composite decoding, some colors will be incorrect and some shades will be missing. It's as objectively wrong as decoding surround sound improperly. I don't care if you use a shader to "Make it look more like a fuzzy-ass old screen." In fact, I'd prefer you didn't. Adding excess blurring or noise just degrades pixels I worked hard on. But please, when you play games I wrote almost 40 years ago, I ask that you properly decode the color data I painstakingly encoded by hand and tested on a variety of different displays from Amdek monitors to cheap ass old TVs. If you don't, you're not playing the games I wrote. You don't have to buy a CRT. Shaders are free - and just a few clicks away. Use a high-quality one that just properly decodes composite and doesn't add any degradation bullshit.

                Note: those images are borrowed from this blog, which is a good discussion of composite video color encoding on the Apple II but the same principles apply to all analog composite video sources and displays. http://nerdlypleasures.blogspot.com/2021/10/apple-ii-composi...
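                The decoding step can be sketched in a few lines. Artifact color exists because the pixel clock is a multiple of the 3.58 MHz color subcarrier, so a repeating on/off pixel pattern within one subcarrier period *is* a chroma signal: the average level is luma, and the pattern's correlation with the subcarrier gives saturation and hue. A toy decoder (the function name and the 4-pixels-per-cycle figure, as in Apple II double hi-res, are illustrative assumptions, not anyone's actual implementation):

```python
import math

def decode_artifact_color(bits):
    """Decode one subcarrier period of on/off pixels as (luma, saturation, hue).

    bits: e.g. 4 pixels per 3.58 MHz cycle, as in Apple II double hi-res.
    hue is in degrees, or None for unsaturated (all-on/all-off) patterns.
    """
    n = len(bits)
    luma = sum(bits) / n  # average level over the cycle
    # Correlate the pattern with the subcarrier's cosine/sine components.
    i = sum(b * math.cos(2 * math.pi * k / n) for k, b in enumerate(bits)) / n
    q = sum(b * math.sin(2 * math.pi * k / n) for k, b in enumerate(bits)) / n
    sat = math.hypot(i, q)
    hue = math.degrees(math.atan2(q, i)) % 360 if sat > 1e-9 else None
    return luma, sat, hue

# Same number of lit pixels, different positions -> same luma, different hues:
print(decode_artifact_color([1, 0, 0, 0]))
print(decode_artifact_color([0, 1, 0, 0]))
print(decode_artifact_color([1, 1, 1, 1]))  # solid white: no chroma at all
```

                A naive RGB display skips the i/q correlation entirely, which is why the colors in the first screenshot simply aren't there.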

          • grumbel 11 days ago

            > whether old games were intended to be seen on CRTs, or intended to be seen on LCD screens created years later.

            A lot of older games were designed on grid paper or workstations, not on the consoles or home computers that would run them later on. Just look at all the NES and SNES games with broken aspect ratios (i.e. circles not being round); those aren't rare outliers, but more like half of the library.

            Also, the CRT vs LCD comparisons are extremely disingenuous to begin with, since you are not supposed to be so f'n close to the TV. If you watch a game at its intended viewing distance and screen size, your eyeballs will smooth out the LCD picture just the same as they would a CRT. If you sit close enough to see the shadow mask of your CRT, you are using it wrong.

            While I agree that the pixel-art look is drastically overdone in modern retro games, it's not like it didn't exist back in the day. Most old hardware had sprite or layer scaling that allowed you to enlarge the image. The Pokemon in the battle screen on GameBoy for example are all heavily pixelated, so are many SNES games that make use of Mode7 or the enemy sprites in games like AstroBot on GBA. Meanwhile most of the C64 library uses a mode that requires pixels be twice as wide as tall, which also makes everything look blocky.

            In PC gaming most of this didn't matter to begin with, since the monitors were capable of far sharper and higher resolution images than your average TV, much like LCDs, while most of the early games were still doing 320x200. So things did end up looking blocky even on original hardware.

            That's not to say that CRTs don't have benefits: the motion clarity is much better than sample-and-hold/full-persistence LCDs, LCD scaling gets incredibly ugly when it's not an integer multiple of the original resolution, and the colors/vibrancy of early LCDs were also horrible. But most of those issues are slowly going away with black-frame insertion, 4K resolution and HDR.

            And yes, sometimes you come across a Sonic waterfall that is designed to specifically take advantage of CRT TVs, but those effects are pretty rare.

        • egypturnash 11 days ago

          Sonic the Hedgehog, 1991. First level of the game. Gorgeous translucent waterfalls on the family TV of the time. Weird solid vertical bars on a high-end monitor or in a modern emulation.

          https://www.youtube.com/watch?v=x0weL5XDpPs&t=45s

          Artists couldn't plan for a very specific amount of smear but they sure could plan for a general amount of smear.

      • wang_li 12 days ago

        I don't buy it either. Showing something that came out 30 years after the time in question is not supportive of the argument. People wrote games and made game art on CRTs. They just developed on what they had. No one was sitting down and factoring in blur, scanlines, phosphor persistence, etc.

        • mrandish 12 days ago

          > They just developed on what they had

          Correct.

          > No one was sitting down and factoring in blur, scanlines, phosphor persistence

          I get why you'd assume that from today's digital context. But analog video was different. I created video games in the 1980s and I know a lot of other people who did too. We still get together and hang at places like the Hacker's Conference and reminisce about hand-coding composite video pixels on 8-bit processors in assembly language. Good times.

          Back in the day, we not only considered how the pixel data we put in memory would be displayed differently on screens, we had to iteratively test it because analog video output circuits weren't always consistent between platforms (Woz made things a lot trickier by saving a nickel on the video output of the Apple II). Take a look at this http://nerdlypleasures.blogspot.com/2021/10/apple-ii-composi...

          Part way down the page you'll see a clear example of how we could put a specific pattern of black and white pixel data in memory that would cause the monitor to display 15 different colors. What we put in memory was not what the monitor displayed. And we wrote whole games this way. It also wasn't just the Apple II. Every arcade board and computer system could have its own quirks. So, thinking about the blur, scanlines, and phosphors you mentioned was actually the easy part. The hard part was manipulating the display circuit in unnatural ways by hand-counting CPU cycles to display pixels in the "off-limits" overscan area or to switch display modes in the middle of a raster. There's even a book about programming the Atari video system that's literally called "Racing the Beam" (as in racing the electron beam scanning the CRT raster 59.94 times every second with the CPU). Most 80s games programmers had to know a lot about analog video signals. In fact, that's how I eventually crossed over from computer programming to video engineering.

          • crazygringo 12 days ago

            But these are different things -- color hacks and overscan or switching display modes are about circumventing known hardware limitations in predictable and clever ways.

            The topic under discussion here is the pixel art -- the idea that the artist would be relying on a fixed amount of horizontal blur to get the amount of glint in the eye "just right". And that's what you couldn't do, because that blur would be dramatically different on different CRT's.

            The art was designed to be robust under blurry conditions that had extreme variation. It wasn't designed for some kind of ideal CRT so it would look "just right".

            • mrandish 11 days ago

              You're assuming that in the analog era content creators wouldn't bother because different analog TVs and monitors had differing fidelity and quality (or could be mis-adjusted). But we did bother. Most of us cared a lot about the pixels we made - maybe too much. We worked our asses off to make them as good as we could. It's no different than when I worked in an audio mixing studio. We had four different sets of stereo speakers sitting on the console and tied to a switchbox. When we were getting close to the final mix, and certainly during all of the mastering process, we'd switch from the audiophile-grade speakers to the K-Mart speakers to the shitty car speakers. Of course, it sounded better on the better speakers and there was much less clarity in the cheap speakers. But we needed to make sure it sounded good enough on the bad speakers. This was just the normal way content creators in the analog distribution era worked.

              When making games I'd check the graphics on a good composite monitor but also on a TV through an RF switchbox. In the Amiga/Atari ST era we checked on analog RGB too. Commodore 64s had optional S-Video output which looked very good compared to composite video and light years better than RF. We checked it all and created content with it in mind. In the analog era I worked in games, video production and audio production. And in all three I can recall specific instances where we worked on aspects we knew would probably only ever be appreciated by those with better gear. This was especially true with visual effects work we did for broadcast television. We added detail that we saw on the studio master tape but which a lot of people never saw at home (at least until home DVD re-issues became a thing). We hated the limitations of the current distribution standards and of the gear we authored on (even though it was the best money could buy at the time). And we struggled mightily to overcome those limitations and preserve every shred of quality we could.

              Also, keep in mind that arcade cabinets weren't variable like consumer TVs. They used very specific monitors which had specific, sometimes non-standard, refresh rates. I never worked at an arcade company but I knew people who did and they often had the bare monitor tube that would be in the cabinet right on their desk during development. And in that era we only ever saw our game graphics on composite displays. All our monitors were composite video, unless you were senior enough to have an 80x25 serial terminal (which was amber or green text only). On the quad-sync analog RGB display I have now in my arcade cabinet, I've installed around 40 specific modelines so that they exactly match the original vertical and horizontal frequency of the monitor in, for example, a Williams Joust cabinet when I'm playing Joust. The CRT I have was made by Wells Gardner, a company that specialized in making CRTs for arcade cabinet manufacturers like Atari, Sega, Namco, etc.

              • crazygringo 10 days ago

                First, I just want to thank you for all your extensive comments. It's really cool to get to hear from someone involved in all of it. It sounds like I probably played a bunch of stuff you were involved with! :)

                And I don't think we're really disagreeing -- what you're describing is exactly what I meant when I said "the art was designed to be robust". Just like the sound that still works on bad speakers.

                I never meant to imply there was any kind of lack of care or attention to quality. It's more that I see a kind of certain fetishization for "one true image" that never existed in the first place. Rather, the art was intentionally (and carefully) designed to be robust -- and of course more detail would come through on better displays.

                You make a great point about the arcade cabinets though, where they did have that level of control, where maybe it really was "one true image" -- I was definitely thinking only about the consumer systems I grew up with. I can definitely appreciate that the art was specifically fine-tuned for that one display. I am curious if there are CRT emulators that try to replicate the individual monitor models used in arcades, as opposed to more generic TVs and monitors...

                Thanks again for your comments and for engaging! This is the stuff I love HN for.

                • mrandish 10 days ago

                  Yes, I think we broadly agree. There was quite a bit of variability in home CRT games and much less variability in arcade cabinet games. However, there was a clear specification establishing what these games should look like on a CRT which was set by the RS-170A composite video standard - even if some home TVs fell short of this goal due to age or maladjustment. Our goal in the 80s and 90s was to create game graphics that were as high-quality as we could and ensure the graphics we shipped would look correct on any CRT set to the RS-170A standard. To accomplish this we actually calibrated the composite video monitors on our desks to match the broadcast video standard. I recall one time when a new artist joined the team and his monitor wasn't set up right. The first floppy disk of image tile data he gave me had some odd color choices that were puzzling until I went and looked at his screen - where the same images looked fine. Of course, he had to redo the bitmaps but he did learn a valuable lesson about always checking the calibration on a new monitor. His monitor was literally out of phase with the rest of the universe. :-)

                  > It's more that I see a kind of certain fetishization for "one true image" that never existed in the first place.

                  Well, it's a matter of degree. The nature of analog composite video is that it can never be as precise as 16 or 24 bit digital color. But it also wasn't 'horseshoes and hand-grenades' approximation. Just because analog video is old doesn't mean these standards aren't capable of being very precise. It's possible to adjust a decent composite video monitor very close to objectively "correct" per the specification in a few seconds with just standard color bars. Many people assume the standard color bar test signal only allows calibrating correct color with the tint knob. However, it also allows calibrating correct brightness and contrast if you know what you're doing. So, we were creating our game content targeting a precise objective standard.

                  As for fetishization of vintage or retro... I hate it. Hopefully I've made clear I have no interest in arbitrarily injecting the limitations or shortcomings of the analog past if there's any way to avoid it. I love today's 4K 10-bit HDR+ video sources and have a perfectly calibrated, ultra high-end home theater with 150-inch screen, 3-laser projector and 7.4.2 THX surround sound that can damn near make your eyes and ears bleed. It's about as good as it's possible to do in 2025 - and most days I wish it was possible to achieve even better quality. I really want 1,000 nit video projection and 12-bit sources. So, those people degrading video quality to match some nostalgic memory of the past are misguided in my view. 40 years ago those of us making the content hated that the tech wasn't better. The tech today has improved 10x and I still hate that it isn't even better. :-)

                  That said, when we're playing old analog era content, whether a retro-game, laserdisc or whatever, we should make sure we're putting all the quality that was in the original up on the screen and that our replay environment correctly matches the standards the content was originally created to match. Back in the day, doing that used to be really hard. Today it's damn near trivial. Which is why it makes me maybe a little extra crazy that some people who profess to love "retro" don't even bother to do it.

                  > You make a great point about the arcade cabinets though...

                  I posted a response elsewhere in this thread that discusses more about just how precise arcade CRTs can be. https://news.ycombinator.com/item?id=42824445

                  > I am curious if there are CRT emulators that try to replicate the individual monitor models used in arcades, as opposed to more generic TV's and monitors

                  Oh yes, indeed there are! Hundreds in fact. And it's a glorious rabbit hole to dive down. I'll just point you to this forum to get started: https://forums.libretro.com/c/retroarch-additions/retroarch-... First, there are shaders, shader packs and shader presets. The lovely thing is that it's easy for anyone to examine, adjust and remix components between various shaders. While the RetroArch emulator system has its pros and cons, it's undoubtedly excellent for auditioning, adjusting and remixing shaders and presets.

                  In general, I recommend CRT Royale as a good baseline for shader newbies as it's good and not too complicated. However, I'm personally quite impressed by CyberLab's recent work on the Death to Pixels shaders. https://forums.libretro.com/t/cyberlab-death-to-pixels-shade.... Download and install the latest shader sets into RetroArch. Find and follow an online guide if it's confusing. There are tons. Advanced shader authors like CyberLab and a handful of others are doing some incredible work in the last year. Crazy stuff like researching the phosphors used in certain CRTs and doing physically based modeling on the data. There are shaders specifically for emulating CRTs with dot masks, slot masks and aperture grilles (used in Sony Trinitron CRTs). There are also shaders that target specific classes of legendary CRTs like Sony WEGA, PVM (professional grade) and BVM (broadcast grade). Others target emulating different kinds of cable connections: RF, composite, S-Video, YUV, and RGB. One of the latest trends is creating shaders which rely on what kind of flat screen technology you have. So a shader that more correctly emulates a certain Trinitron CRT by leveraging the uniquely wide contrast range of an OLED monitor but doesn't look as good on a non-OLED monitor. The same is happening around both HDR monitors and high FPS monitors, as each enables better kinds of fidelity by using those traits.

                  Personally, I prefer maximum quality, fidelity and authenticity (and zero nostalgic degradation). So, I avoid the entire mega-bezel series as that takes up precious screen space for rendered monitor bezels with reflected screen glow. It's cute but simply a waste of space and the bounceback reflections wash out the original image. I focus on RGB shaders and set them to minimal blurring and minimal scanlines. Have fun exploring and trying different things.

        • dahart 11 days ago

          > No one was sitting down and factoring in blur, scanlines, phosphor persistence, or etc.

          Sure they did, implicitly, as a byproduct of using what they had and developing game art on CRTs. That’s the whole point; using the CRT affects your choices, and things would come out different if they’d used LCDs. We know that for a fact, because things are coming out differently now that people develop game art on LCDs. :P

    • mrandish 12 days ago

      GP here. I don't want to repeat the lengthy technical explanation I already posted in another response downthread, so please refer to that: https://news.ycombinator.com/item?id=42817006

      > The reality is that different CRT's had wildly different characteristics in terms of sharpness, color, and other artifacts -- and it also varied tremendously depending on the type of video output/input.

      As a video engineer old enough to have started my career in the analog era, I fully agree composite video displayed on consumer TVs could vary wildly. I already explained the technical point about decoding the signal information properly in my other post but you're making a different point about variability, so I'll add that just because TVs could be mis-adjusted (or even broken) doesn't mean there's not a technically correct way to display the encoded image data. This is why we used color bars to calibrate TVs.

      > I definitely appreciate wanting to blur the image in an emulator to remove the jaggies

      But that's not my point; blur was an undesirable artifact of the composite video standard. 80s and 90s 2D pixel art was hand-crafted knowing that the blur would blend some colors together, minimize interlace artifacts and soften hard edges. However, I only use shaders that model a small amount of blur and I run my CRT in analog RGB instead of composite, which can be quite sharp. My goal is not nostalgia for the analog past or to degrade a game's output as much as my parents' shitty 1970s living room TV did. I had to engineer analog video in that past - and I hated its shortcomings every day. When I play 80s and 90s video games, whether via shaders or on my analog RGB CRT, it probably looks quite a bit sharper and clearer than the original artists ever saw it - but that's not due to some subjective up-res or up-scaling - it's due to accurately decoding and displaying the original content to the technical standard it was created to comply with (even if many consumer TVs didn't live up to that standard).

      In the 90s I worked at a TV station and after hours we'd bring in consoles just to play them on the $3,000 Sony BVM broadcast reference monitor. And they looked great! That's what I'm after. Accurately reflecting the original artist's intent in the maximum possible quality - without slipping over the line into editorializing colors or pixels that were never in the original data in the first place. I want to play the game as it would have looked back in the day on the best video output available, through the best cable available and on the best screen money could buy. And via emulation and shaders, now everyone can have that experience!

    • dahart 11 days ago

      You have very valid points, variation in CRTs was very high back in the day, and the example image does have a gamma/brightness discrepancy, I agree with that. Back when CRTs were dominant, gamma and brightness were all over the map, almost nobody knew what those were. You couldn’t even count on where the visible edges of the screen were. And you’re right that saying “the way it was intended” is perhaps slightly hyperbolic or maybe isn’t quite meant the way you’re taking it. It’s not that using CRTs or not was a choice, but it is fair to say artists used CRTs when creating game art and intended for it to look as good as it could on CRTs, and they did not intend for the pixels to turn into multi-pixel solid color blocks.

      Yes exactly CRTs were blurrier, and that alone affects artistic choices. It is fair to say that CRT art looks different than LCD art because CRTs are blurrier. Games developed on CRTs with low resolutions don’t look as good when displayed on high res LCDs with up-resing and nearest neighbor sampling. The problem with using a solid 2x2, 3x3, 4x4 block of LCD pixels to represent a low res CRT pixel is that it’s a poor reconstruction, introduces unwanted high frequencies, and looks very different from the original. It’s true from a signal processing perspective that 4-up LCD reconstruction of old CRT art is quite wrong and bad.
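      A toy, one-dimensional sketch of that signal-processing point (illustrative numbers only, not a real display model): nearest-neighbor upsampling reproduces every edge at full amplitude, while even a simple linear (triangle) filter spreads the transition out, which is closer to what CRT blur did.

```python
# Compare two reconstructions of a low-res "scanline" at 4x resolution.
def nearest(samples, factor=4):
    """Nearest-neighbor: repeat each sample, producing hard edges."""
    return [s for s in samples for _ in range(factor)]

def linear(samples, factor=4):
    """Linear interpolation: ramp between samples, softening edges."""
    out = []
    for i in range(len(samples) - 1):
        a, b = samples[i], samples[i + 1]
        out += [a + (b - a) * k / factor for k in range(factor)]
    return out + [samples[-1]]

def max_jump(xs):
    """Largest step between adjacent output samples (a proxy for the
    unwanted high-frequency content the reconstruction introduces)."""
    return max(abs(b - a) for a, b in zip(xs, xs[1:]))

line = [0.0, 1.0, 0.0]  # a single bright low-res pixel
print(max_jump(nearest(line)))  # full-amplitude edges
print(max_jump(linear(line)))   # much gentler transitions
```

The nearest-neighbor version jumps a full 1.0 between samples, while the filtered version never steps more than 0.25 - the same image data, but very different frequency content.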

      This does extend into music, kinda. We can look at music from the 30s and 50s for an even stronger example - early recorded music was both technically limited to, and also artistically designed for, a frequency range of, I don’t know, like 500-3k Hz. Some audiophiles do argue that using an old record player to play old vinyl is a superior experience to listening to a digitized copy on modern hardware, and often with the same argument - that the old stuff is the way it was intended to be heard.

      However, the analogy to music is slightly broken since today’s digital music - unlike LCD up-resing of old games - never tried to reconstruct old music using nearest neighbor sampling. When you do that with audio, you can instantly hear it’s all wrong. If you were actually comparing nearest-neighbor audio reconstruction to blurry reconstruction, you would 100% agree that the blurry reconstruction was the ‘way it was intended to be heard’. The biggest problem with this whole argument that neither you nor the parent addressed is that LCD nearest-neighbor reconstruction is crappy, and as long as we try to blur when using LCDs, most of this discussion is moot.

      So anyway, in many ways I think your argument already does agree with the idea that games designed on CRTs look better on CRTs than, e.g., 4-up reconstructions on LCDs. The entire sticking point in your argument might hinge on how you interpret the word “intended”. I’m saying the original argument isn’t necessarily claiming that the intent was conscious or explicit, it’s merely saying that the intent was a byproduct of having used CRTs during the creation process. In that sense, it’s a valid argument.

      • mrandish 11 days ago

        I largely agree with your points, especially about 4-up reconstruction.

        > variation in CRTs was very high back in the day

        I wanted to add some more info around this point. In cases of home consoles this is true (because they hooked up to whatever TV you had) but there's one very large case where it's not true - and it's a case that matters quite a bit, especially from a historical preservation perspective.

        Most arcade cabinets were made on factory assembly lines and used bare industrial CRTs. These CRTs were made by a handful of companies and arcade manufacturers selected the CRT model for a game by its specifications, which often differed from CRTs designed for use in consumer TVs. We know exactly which CRT (or CRTs) were used in most arcade cabinets and the detailed manufacturer specifications and schematics for those CRTs are preserved and online. When researching the proper modeline frequencies to set my quad-sync monitor to (because it's a chameleon), I look up the specifications of the original CRT in the original cabinet. The game developers usually had one of these industrial CRTs on their desk, so that they were developing for the exact CRT that would be in their game's arcade cabinet.

        But it's even more precise than that. Many game ROMs have a set of hidden factory calibration screens with alignment grids and color bars. On the manufacturer's assembly line, after installing and connecting the CRT, workers fired up the game, went into these screens and adjusted the internal controls of the CRT so the horizontal & vertical positions and sizes of the grids were correct as well as the color bars via the tint control. I use these calibration screens to this day to properly set up my CRT to match the adjustments of the CRTs in the original cabinets (which the game ROM was written and tested against). Because my monitor handles so many ranges of frequencies, it stores and recalls these horiz/vert/tint adjustments for each unique scanning frequency (along with other adjustments like pincushion, skew, bow, etc). Historians have even managed to preserve some of the instruction sheets written for the factory floor workers to use when adjusting the CRTs to the intended spec.

        Fun photo of the Ms. Pacman assembly line: https://thedoteaters.com/tde/wp-content/uploads/2013/03/pac-...

  • gwbas1c 12 days ago

    I'm pretty sure those are similar, but different, images.

    Having grown up with CRTs, very few games look "better" on them; mostly games that used interlacing to create flashing effects. (Edit: Forgot that light guns need CRTs due to timing.)

    Otherwise, CRTs are like vinyl: Some people will invent all kinds of reasons why they are better, but in practice, they aren't.

    • pdpi 12 days ago

      The argument isn’t “CRTs are better”. You’re right — they’re not. The argument is that pixel art from that era was designed around the specific limitations of CRTs, and takes advantage of the specific way that CRTs would mess with the pixels.

      This is similar to what happened with electric guitars — you can make cheap amps with barely any distortion these days, but that sucks horribly for playing music composed around the presence of distortion. E.g. amp distortion tends to add a bunch of harmonic content that makes major/minor triads sound pretty bad, which is why power chords are popular. On the other hand, power chords sound pretty terrible without distortion, because they need that extra harmonic content to sound good!

      • sim7c00 11 days ago

        nice points. the parallel with guitar really made it click for me thx =) makes total sense!

    • cubefox 11 days ago

      CRTs are definitely much better than OLED or LCD in one major aspect: Motion clarity. OLED and LCD are sample-and-hold screens, meaning they will display a frame for the entirety of a frame time, like 1/60th of a second at 60 FPS. A CRT displays frames just for a fraction of the frame time (the rest of the time they are dark), which prevents perceptible blur when our eyes track moving objects, as they constantly do. More details here:

      https://news.ycombinator.com/item?id=42604613

      • paulbgd 11 days ago

        As a user of a crt pc monitor and a 240hz oled, the motion clarity of the oled is pretty darn close now. I’d bet 480hz is the point where the smoothness of modern panels finally catches up to the crts.

        • cubefox 11 days ago

          Of course the question is how to leverage those monitors. Either games have to render 480 frames per second (which is impossible on average hardware in most cases other than Subpixel Snake), or the monitor just displays 7 black frames after every rendered frame, which would cut down the games to 60 (rendered) frames per second. But the latter would of course greatly reduce the maximum screen brightness to 1/8, possibly below CRT level, because OLEDs aren't very bright in the first place.
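          The brightness trade-off above is just duty-cycle arithmetic. A quick sketch (the nit figures are illustrative assumptions, not measured panel specs):

```python
# Black-frame insertion: the panel is lit for only a fraction of the time,
# so perceived brightness scales with the duty cycle.
def effective_nits(peak_nits, refresh_hz, rendered_fps):
    duty_cycle = rendered_fps / refresh_hz  # fraction of time a real frame is shown
    return peak_nits * duty_cycle

# 60 rendered frames on a 480 Hz panel -> lit 1/8th of the time
print(effective_nits(1000, 480, 60))  # 125.0 nits from a 1000-nit peak
```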

    • sim7c00 11 days ago

      the vinyl comparison doesn't hold because music isn't composed on vinyl. saying vinyl is better is like saying jpeg images are better than png or something.. it's the storage format/medium. it does impact the sound, but not the way people composed afaik.

      the crts were used as the medium to compose the thing on for these artists. they saw their art come to life on them and found ways to optimise for that.

      • stavros 11 days ago

        There was a post on here a few weeks ago that claimed that this isn't true, and that artists created the images on much better displays, that didn't have the limitations that the average CRT of the time had. Unfortunately, I can't find the post.

    • mrandish 12 days ago

      I agree with you about vinyl but I think you're misunderstanding my point about CRTs. I'm not claiming CRTs are inherently "better" either technically or aesthetically. In general, they're not. I'm not like some audiophiles who argue vinyl, tube amplification and "magic copper" cables are better - denying both signal theory (Nyquist et al) and objective data (double blind A/B/X tests). Modern digital formats and devices are almost always better overall. The cases where they aren't are rare, very specific and, even then, 'better-ness' is only in certain ways and not others.

      My background is in video engineering and the point I'm making here is very specific. It only applies to hand-crafted retro game pixel art created in the CRT era. And my point isn't even about CRTs! It's about how the RS-170A composite video standard that CRTs used encodes color. The "A" in RS-170A added color to the existing black and white television standard while remaining compatible with old B&W TVs. It was sort of a clever analog-era compression hack. I'll skip the technical details and history here (but both are fascinating) and simplify the takeaway. Broadly speaking, in digital representations of composite video color encoding, the correct color of a pixel can only be known relative to the current phase of the pixel clock and the pixels adjacent to it. Sometimes it's a fairly subtle difference but other times it can change a pixel from blue to red.

      To be clear, this wasn't "better" in any way (other than allowing optional color). The "hack" of encoding chroma information at half the frequency of luma and only in relation to a sub-carrier frequency came with trade-offs like chroma fringing on high frequency edges, ringing and other spurious artifacts. However, it was the only video we had and video game creators of the 80s & 90s used the tech they had to create the best images they could. For example, we would put a pixel of a certain color next to a pixel of another color to intentionally change the color of the second pixel (and NOT just due to blurring the two together, it literally decoded to a different third color). I did this myself in the 1980s, intentionally positioning a white pixel next to a black pixel on an even numbered column so they would show as a single red pixel on the screen. Using this encoding technique, I could display 9 different colors from a computer that only had two bits per pixel. That's why just displaying a naive RGB output of the game isn't properly decoding a signal that was hand-encoded to have more and different data than what's in the naive RGB.
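      A minimal, illustrative model of why adjacent pixels change each other's decoded color (this is not any specific machine's timing - it just assumes a hypothetical pixel clock at 4x the 3.58 MHz color subcarrier): an NTSC decoder demodulates chroma by multiplying the signal against sine and cosine at the subcarrier frequency, so any luma pattern repeating near that frequency reads back as color, and shifting the same pattern by one pixel changes its phase - and therefore its hue.

```python
import math

SAMPLES_PER_CYCLE = 4  # assumed pixel clock: 4 pixels per subcarrier cycle

def demodulate(bits, cycles=8):
    """Quadrature-demodulate a repeating 1-bit pixel pattern the way an
    NTSC chroma decoder would, returning (saturation, hue_degrees)."""
    signal = [bits[n % len(bits)] for n in range(cycles * SAMPLES_PER_CYCLE)]
    i_sum = q_sum = 0.0
    for n, s in enumerate(signal):
        phase = 2 * math.pi * n / SAMPLES_PER_CYCLE
        i_sum += s * math.cos(phase)
        q_sum += s * math.sin(phase)
    i_avg, q_avg = i_sum / len(signal), q_sum / len(signal)
    saturation = math.hypot(i_avg, q_avg)         # ~0 means it decodes as gray
    hue = math.degrees(math.atan2(q_avg, i_avg))  # which color it decodes to
    return saturation, hue

# Two-on/two-off pixels repeat at the subcarrier frequency -> solid color
print(demodulate([1, 1, 0, 0]))  # nonzero saturation
# Alternating pixels repeat at 2x the subcarrier -> decodes as gray
print(demodulate([1, 0, 1, 0]))  # ~zero saturation
# The same on/off pattern shifted one pixel -> different phase, different hue
print(demodulate([0, 1, 1, 0]))
```

Note how the second and third calls use the same count of lit pixels as the first, yet decode to gray and to a different color respectively - exactly the "pixel position changes the color" behavior described above.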

      So I recommend using a CRT shader not because it emulates a glass tube with phosphors but because it includes the decoding logic to correctly display the original encoded content. Personally, I never use the shaders that add noise, blurring, cross-talk or other degradation. That would be as dumb as adding the pops and clicks of a dirty vinyl LP to a pristine signal. That would make it less accurate. My goal as an engineer is to be more accurate. It's fine if adding spurious crap tickles somebody's nostalgia bone from when they were a kid - but I'd never do that. I want to see the original pixels and colors the artists saw and intended their audiences to see. That requires properly decoding the signal. And the example I posted demonstrates just how different an image can appear when properly decoded.

      • dylan604 12 days ago

        >It was sort of a clever analog-era compression hack.

        Also known as technical debt around these parts. The repercussions of that clever hack are still being dealt with to this day. I've spent a good deal of my career specializing in proper handling of video sources that are painful to deal with all because of this clever hack.

        color burst, front porch, 1.001, ugh!!!!

        • mrandish 11 days ago

          I feel your pain. :-)

          I've gone back and read through some of the committee reports when they were deliberating about this and... all I can say is, the more I understand about how composite color video really works - the more amazed I am that it works at all.

          And to be fair, it's not like they didn't know there were better ways. There were lots of proposals to do YUV, RGB and other kinds of encoding but backward compatibility with B&W TVs and staying within a 6 MHz channel were political mandates from on high.

          • dylan604 11 days ago

            Yeah, it's one of those very clever things that fit the bill for exactly what they needed right then and there. How could they have ever expected HD, 4K, progressive, internet streaming, or any of the things that their very clever hack would wreak havoc on forevermore?

lukevp 12 days ago

Very interesting! Learned a lot about color space and how it applies to subpixels, glad I watched this!

Gameplay wise, I think it should be a bigger game board and there should be accounting for the speed of the snake through each subpixel (when traveling left to right, going from R to G is less horizontal movement than going from B to R, and traveling vertically, each step is massive compared to the horizontal movement.) This should be pretty easy to do by ratio’ing and changing the speed of each animation step based on where in the spectrum it is. That would make it feel a lot more polished I think.
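A sketch of that pacing idea (the subpixel x-offsets here are illustrative guesses, not measured panel geometry): scale each frame's delay by the horizontal distance actually covered, so the snake's apparent speed stays constant across R→G, G→B, and the wider B→R hop into the next pixel.

```python
# Assumed subpixel x-offsets within one pixel: R, G, B (with a gap before
# the next pixel's R, so the B -> R wrap covers more distance).
OFFSETS = [0.0, 0.3, 0.6]

def horizontal_distance(sub_from, sub_to):
    """Distance moved stepping right from one subpixel to the next;
    stepping off B wraps to R of the neighboring pixel."""
    if sub_to > sub_from:
        return OFFSETS[sub_to] - OFFSETS[sub_from]
    return (1.0 + OFFSETS[sub_to]) - OFFSETS[sub_from]  # wrap: B -> next R

def frame_delay(base_ms, sub_from, sub_to):
    """Delay for this step, normalized so an average 1/3-pixel step
    takes base_ms."""
    return base_ms * horizontal_distance(sub_from, sub_to) / (1 / 3)

print(frame_delay(30, 0, 1))  # R -> G: 0.3 px, slightly under base
print(frame_delay(30, 2, 0))  # B -> R: 0.4 px, slightly over base
```

Vertical steps would get the same treatment with a whole-pixel distance, which is why they'd need a proportionally longer delay.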

blibble 12 days ago

QBasic Nibbles did the same using ANSI box-drawing characters

there was a text character with half the vertical "cell" in use

this, along with clever use of foreground/background colours allowed double the vertical resolution (in text mode!)
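A modern-terminal sketch of the same trick (using Unicode's upper-half-block and 24-bit ANSI color codes rather than the text-mode attributes Nibbles had): the foreground color paints the top half of each character cell and the background color paints the bottom half, doubling the vertical resolution.

```python
def render(grid):
    """grid: rows of (r, g, b) tuples; each text row shows two pixel rows."""
    lines = []
    for top, bottom in zip(grid[::2], grid[1::2]):
        cells = []
        for (r1, g1, b1), (r2, g2, b2) in zip(top, bottom):
            cells.append(f"\x1b[38;2;{r1};{g1};{b1}m"    # foreground = top pixel
                         f"\x1b[48;2;{r2};{g2};{b2}m\u2580")  # background = bottom pixel
        lines.append("".join(cells) + "\x1b[0m")         # reset at end of row
    return "\n".join(lines)

red, blue = (255, 0, 0), (0, 0, 255)
# a 4x4 "pixel" image drawn in just 2 text rows
print(render([[red] * 4, [blue] * 4, [blue] * 4, [red] * 4]))
```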

FriedPickles 12 days ago

If you're as dumb as me and try to actually play this, note that the "Snake speed" value is inverted.

  • hatthew 12 days ago

    the speed is ms per frame

    • grayhatter 12 days ago

      ms/frame isn't a speed... it's a delay? maybe a rate?

      • kaoD 12 days ago

        > it's a delay? maybe a rate?

        I think period?

      • hatthew 12 days ago

        Yeah frames per second probably would have made more sense. That being said, I think it's fine to colloquially refer to time/distance as speed, e.g. my walking speed is 15 minutes per mile, but it should probably be specified that that's the unit in use. But also this isn't a carefully designed game, it's a small tech demo, so ¯\_(ツ)_/¯

    • kbelder 11 days ago

      Ah, "Speed of Time"

htk 12 days ago

Who's old enough to remember the joys of tweaking ClearType on Windows XP?

It was a great workaround for rendering smoother text on low dpi LCD displays.

  • shmeeed 12 days ago

    You can still do that on Windows 10! It's just not that much fun anymore.

leeoniya 11 days ago

the easiest way to see a subpixel is to put a drop of water on the display. you probably get 100x magnification this way :)

bawolff 12 days ago

For the zooming out to make the css pixel = real pixel, i wonder if you could just use units like 0.25px instead. Or maybe divide by window.devicePixelRatio in js to make it dynamic.

yuvalr1 12 days ago

This great video goes into a bit more detail about pixels. It also shows that there is an interesting difference with the color green not only in monitors, but also in camera sensors that detect the color:

https://youtu.be/PMIPC78Ybts?list=PLplnkTzzqsZTfYh4UbhLGpI5k...

I can recommend it, and all the other videos of Cem Yuksel. He is really great at presenting!

kbelder 11 days ago

People who did it: We did it this way.

People who didn't do it: You couldn't have done it that way.

kiru_io 12 days ago

Thank you for this video. Love the style, introduction to color space and the practical use.

mempko 12 days ago

The color space portion was interesting. Learned something new!