londons_explore 2 years ago

Gnome (Ubuntu, Linux) allows a custom shader to be applied to the whole desktop, including fullscreen apps.

Here is an up to date fork with some example shaders: https://github.com/Hello1024/gse-shader

I use it to make sure a 'sensitive' pixel on my screen never turns on (it's a column of pixels which, if the difference in brightness between it and the pixels to the side exceeds a certain amount, makes the whole screen fail - presumably due to a power supply fault in the column driver circuitry).

  • EZ-Cheeze 2 years ago

    I want to be able to edit anything on my screen in ways defined by me: "hide any faces" "put a red circle around any animal" "hide any word under three letters"

    http://zeroprecedent.com/lore/flipside.html

    • londons_explore 2 years ago

      Well now you have all the building blocks to implement it...

      • EZ-Cheeze 2 years ago

        I'll do it when I can just tell the computer to do it - maybe 2024

        It will be interesting to rewatch movies with all the faces blanked out - they're eye-magnets that prevent you from noticing other details, for example in body language

        • franky47 2 years ago

          Funny how recent developments in AI have warped our comprehension of what is easy to ask of a computer vs what is hard.

          Obligatory related XKCD: https://xkcd.com/1425

          • lesuorac 2 years ago

            Besides the fact that GPS already exists, it seems like detecting birds would take longer (~20 years) than the 5 years claimed.

            > The GPS project was started by the U.S. Department of Defense in 1973. The first prototype spacecraft was launched in 1978 and the full constellation of 24 satellites became operational in 1993

            https://en.wikipedia.org/wiki/Global_Positioning_System

            • squeaky-clean 2 years ago

              Never forget to read the alt-text joke on an XKCD comic. The punchline on this one is based in some truth.

              > In the 60s, Marvin Minsky assigned a couple of undergrads to spend the summer programming a computer to use a camera to identify objects in a scene. He figured they'd have the problem solved by the end of the summer. Half a century later, we're still working on it.

              https://dspace.mit.edu/bitstream/handle/1721.1/6125/AIM-100....

              • GauntletWizard 2 years ago

                What's really incredible to me is how right the prediction was, and that it actually came true. In 2014, classifying images as "containing a bird" was a nigh-on-impossible task. Not an impossible one - image classification was already in production use in limited forms with mapping agencies and the like - but beyond almost anyone's capability at the time. In 2017, Not Hotdog was a novelty app: image classification was real, but limited, and didn't have a great reputation yet. By 2019, papers[1] were being written on image classification as a service and what its pitfalls were, but the idea was solid and sound. Today in 2022, it's something you'd have to research and test before buying for your startup, but not a hard product to find.

                [1]https://arxiv.org/abs/1906.07997

          • xattt 2 years ago

            AI is turning out to be the singularity that science fiction writers talked about. You just have no freaking idea what’s next at this point.

            • somat 2 years ago

              I am halfway convinced that the singularity has already occurred and has been ongoing since the 1700s.* However, being carried along with it, you can't see it directly; you can only dimly sense its tidal pull.

              It is like a black hole. How long does it take to fall into a black hole? The answer is a surprising "just about forever", due to time itself dilating as you approach the center.

              * Think about it: mankind had been trundling along with effectively the same economy for many thousands of years; then, at some point about three hundred years ago, it went exponential and has not slowed down.

  • speedgoose 2 years ago

    How did you manage to find the issue?

    • londons_explore 2 years ago

      I noticed the screen only died when playing videos and always died at the same point if I rewatched a video. Notably, greyscale videos never caused the issue. Then I went through a video frame by frame to get to the frame it died on. Then I erased parts of the frame to find which part caused it. Eventually I found that if the red or blue pixel in the 481st column and the green pixel in the 482nd column differ in brightness by too much, and neither is at 255 or 0, the screen dies.

      I'm pretty sure the problem is the column drivers (which put data onto the column lines). They take in serial data, and my 1920 screen has 4 column drivers, each responsible for 480 columns, so the 481st pixel is the first column that the 2nd column driver deals with.

      It uses more power during the row sync pulse (because it has to drive all the column lines to the correct voltages for whatever is being displayed). It uses more power for grey values (because 255 or 0 are solid on or off, while mid values are typically dithered, wasting energy in the column capacitance). I would guess all these worst-case events for power consumption within the column driver, combined with probably 'barely passing QA' silicon, mean that in edge cases the power sags, something gets reset, and the whole screen fails.

      So my fix is a shader to make sure the worst case conditions can never happen all at once. Visually, it isn't really noticeable. And with more work it could probably be turned into something that could be shipped to customers (within the GPU driver) without any customer complaining (for example if you are a laptop manufacturer who has purchased millions of screens with this fault).
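
      For the curious, the shape of that fix is simple enough to sketch on the CPU. This is a hypothetical Python/numpy recreation - the real fix is a GPU shader, and the column index handling, clamp rule, and "safe" difference value here are illustrative guesses, not the actual measured thresholds:

```python
import numpy as np

MAX_DIFF = 64  # assumed safe brightness difference (not the real threshold)

def defuse_frame(frame, col=480, max_diff=MAX_DIFF):
    """Clamp the R/B subpixels of the 481st column (index 480) so they
    never differ from the 482nd column's G subpixel by more than
    max_diff. frame is an H x W x 3 uint8 array."""
    out = frame.astype(np.int16)                  # room for signed math
    neighbor_g = out[:, col + 1, 1:2]             # G of the 482nd column
    lo, hi = neighbor_g - max_diff, neighbor_g + max_diff
    out[:, col, [0, 2]] = np.clip(out[:, col, [0, 2]], lo, hi)
    return out.clip(0, 255).astype(np.uint8)
```

      A real shader would apply the same clamp per-fragment; here it's one vectorized pass over the frame.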

      • jcims 2 years ago

        What are the odds that such an obscure problem happens to someone with the patience and wherewithal to figure it out *and* fix it? Pretty impressive.

        • londons_explore 2 years ago

          I did consider replacing the screen, but the only source I could find was China with 60 day delivery by boat, and I decided to find a workaround till the screen arrived... The screen has now arrived, but the workaround is so good I can't be bothered to fit it.

          • bombcar 2 years ago

            It’d be wonderful if we were in a world where working things like this out was worthwhile. I shudder to think how many perfectly working monitors are dumped each year, let alone ones with problems like this.

        • fuckstick 2 years ago

          It’s probably not all that obscure. There’s an awful lot of ewaste.

      • moffkalast 2 years ago

        That is some top tier investigating and persistence, hats off. I would've used it as an excuse to upgrade the monitor lol.

        • CamperBob2 2 years ago

          Or replace the 30-cent capacitor that's likely causing it.

          • londons_explore 2 years ago

            It's likely one of the power bond wires from the glass screen to the silicon IC. Without equipment stretching into the millions of dollars, you won't be fixing that.

            • CamperBob2 2 years ago

              Why would that cause a current-dependent symptom? Do they rely on a lot of parallel bond wires? If so, breaking one isn't realistically going to do anything.

  • e-_pusher 2 years ago

    This is really cool. Is there a way to do a global shader like this in Android?

drumdude 2 years ago

I do this for a living. Sony digital cinema projectors use a type of LCD panel (SXRD) where the uniformity drifts over time. A special camera takes about 35 minutes to create a LUT to restore the projected image to a uniform white.

  • jagged-chisel 2 years ago

    Is that 35 minutes spent collecting input, or calculating the LUT?

    • drumdude 2 years ago

      The camera handles everything automatically. It generates dynamic patterns on the screen and adjusts several times until it is satisfied with the result. It does this for red, green, blue, and white across 10 IRE, 20 IRE, etc., up to 100 IRE. The LUT is human readable and can be uploaded to and downloaded from the FPGA.
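
      A toy sketch of that capture/adjust loop (the real camera, patterns, and LUT format are proprietary; here the panel is simulated as a fixed unknown per-pixel gain and the "camera" just measures gain times drive):

```python
import numpy as np

rng = np.random.default_rng(0)
panel_gain = 0.8 + 0.4 * rng.random((8, 8))   # unknown per-pixel response

def capture(drive):
    """Camera stand-in: measure what the panel emits for a drive level."""
    return panel_gain * drive

target = 0.8                   # the uniform white level we want
lut = np.ones((8, 8))          # start from an identity correction
for _ in range(5):             # capture > analyze > adjust > repeat
    measured = capture(lut * target)
    lut *= target / np.maximum(measured, 1e-6)

final = capture(lut * target)  # uniform to within floating-point error
```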

    • washadjeffmad 2 years ago

      I imagine the process is Capture > Analyze > Generate > Apply > Repeat until all test images are within bounds and color matched, not stage 1: capture, stage 2: calculate, done. That's how it worked for our projection, at least.

      • MayeulC 2 years ago

        Well, you could capture once over a range of inputs to get a much better first guess too!

        • GauntletWizard 2 years ago

          I'd bet that they do some of that too, but if you've got 100 images you want to get "perfect", it's probably faster to capture 10 of them uncalibrated, guess at the curve, capture those same 10 plus 10 more calibrated with guess #1, make a second guess, and so on for a couple of iterations, than it is to capture all 100 images before any calibration and then all 100 again afterwards to confirm, especially if there's any slop in the calibration curve.

  • aidenn0 2 years ago

    Is SXRD just Sony's name for LCOS, or is it something distinct?

ludwigvan 2 years ago

> I haven't yet watched a whole movie with the new color

Love the hacker mindset. Once the problem is solved, the underlying issue loses its appeal :)

Rygian 2 years ago

This is almost identical to a problem I'm trying to solve, which is to turn a potato-quality picture of a sheet of paper into a clean scan, turning whatever levels of gray make up the paper background into a uniform #ffffff white. The obvious solutions (equalizing, converting to bitmap, …) don't work because what's white in the top left (say #ccc) is wildly different from what's white in the bottom right (say #888), and the shift is non-uniform due to potato-quality lighting.

Glad I caught this post, I hope the solution can contribute to my problem (although I do not have a way to obtain a fixed ground truth — lighting will change for each picture.)

  • sorenjan 2 years ago

    Sounds like local contrast adjustment. There are several different ways of solving it, here's a couple that look like they work pretty well:

    https://stackoverflow.com/questions/63251089/how-to-do-a-loc...

    https://stackoverflow.com/questions/65666507/local-contrast-...

    One possible preprocessing step could be to do a high pass filter on it, if the shadows vary slowly over the image.
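
    The high-pass idea can be sketched in a few lines of numpy: estimate the slowly varying illumination with a large local mean, then divide it out. This is deliberately brute force and slow for large images - a real version would use a Gaussian blur from scipy or OpenCV:

```python
import numpy as np

def local_mean(img, r):
    """Mean over a (2r+1) x (2r+1) window, edges padded."""
    p = np.pad(img.astype(float), r, mode='edge')
    h, w = img.shape
    acc = np.zeros((h, w))
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            acc += p[dy:dy + h, dx:dx + w]
    return acc / (2 * r + 1) ** 2

def flatten_illumination(img, r=8):
    """Divide out the low-frequency lighting; paper background -> ~1.0."""
    illum = local_mean(img, r)
    return img / np.maximum(illum, 1e-6)
```

    Anything near 1.0 in the result is paper and anything well below it is ink, regardless of how the lighting drifts across the page.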

    There are also more specialized techniques specifically for removing shadows from documents, like these:

    http://civc.ucsb.edu/graphics/Papers/ACCV2016_DocShadow/

    https://faculty.iiit.ac.in/~vgandhi/papers/shadow_removal_ca...

    I also found this, an image editor based approach if you just want to do a few images manually:

    https://janithl.github.io/2021/12/remove-shadows-and-uneven-...

    • Rygian 2 years ago

      Thanks! These pointers will be very helpful.

  • crazygringo 2 years ago

    If your content is black and white, just use a local contrast filter and then threshold it. It's easy to do, but it does result in monochrome so you lose antialiasing. If you're at 300+ dpi though that doesn't usually matter. This is commonly done with PDF scans where monochrome output is desired for high compression. Easy to do with ImageMagick.

    If you want to preserve antialiasing and also color generally, I'm sadly not aware of any open source solution for that. Various scanner apps seem to do it with varying degrees of success; I'd be curious if there's a standard algorithm for it. It feels related to the de-curving algorithms that take a book page and make it flat. So you'd be modeling both the page curvature and black/white values simultaneously. Seems possible for general lighting/shadow, but wouldn't work for reflectivity from camera flash.

  • comboy 2 years ago

    Just so you know, there are many scanner apps which solve this problem already, not sure how many of them are open source though.

  • daedbe 2 years ago

    A common alternative solution in this case would be to use an adaptive thresholding technique such as Otsu’s method.

    • komatsu 2 years ago

      Otsu's method finds a single threshold value in an adaptive way. This can't solve a scanned document thresholding problem.
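
      For reference, Otsu fits in a few lines of numpy, which makes the "single global threshold" limitation easy to see - one cut for the entire image, chosen to maximize between-class variance:

```python
import numpy as np

def otsu_threshold(img, bins=256):
    """Return the single global threshold (for values in [0, 1]) that
    maximizes between-class variance."""
    hist, edges = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    w0 = np.cumsum(p)                    # probability of the "dark" class
    mu = np.cumsum(p * np.arange(bins))  # cumulative mean (bin-index units)
    with np.errstate(divide='ignore', invalid='ignore'):
        sigma_b = (mu[-1] * w0 - mu) ** 2 / (w0 * (1.0 - w0))
    return edges[np.nanargmax(sigma_b) + 1]
```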

      btw: The iOS Notes app has quite a capable document scanning tool. It's cleverly hidden though.

      • Rygian 2 years ago

        I'm pretty happy with the scanning feature of Evernote, and I think there are some other nice apps in the Android app store, but my goal is to have a solution that is not captive (either to a specific vendor or to a SaaS solution).

    • Rygian 2 years ago

      Thanks, I was not aware of Otsu's method.

      From the Wikipedia article, "Otsu's method performs badly in case of heavy noise, small objects size, *inhomogeneous lighting* and larger intra-class than inter-class variance." (Emphasis mine.)

      Right now my solution is at the stage of local thresholds with a configurable block size.

      Thanks to your pointer, I now know that my next steps will be to review the Niblack or Bernsen algorithms. (Or just integrate ImageJ.)
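
      For what it's worth, Niblack is only a few lines once you have local means and variances: each pixel is compared against mean + k*stddev over its own window, with k typically around -0.2. A brute-force numpy sketch:

```python
import numpy as np

def niblack(img, r=7, k=-0.2):
    """Local threshold: a pixel is background (True) if it exceeds the
    window mean plus k times the window standard deviation."""
    p = np.pad(img.astype(float), r, mode='edge')
    h, w = img.shape
    n = (2 * r + 1) ** 2
    s = np.zeros((h, w))
    s2 = np.zeros((h, w))
    for dy in range(2 * r + 1):
        for dx in range(2 * r + 1):
            win = p[dy:dy + h, dx:dx + w]
            s += win
            s2 += win * win
    mean = s / n
    std = np.sqrt(np.maximum(s2 / n - mean * mean, 0.0))
    return img > mean + k * std
```

      Because the threshold follows the local statistics, a gradient across the page doesn't fool it the way a single global cut does.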

Retr0id 2 years ago

Wow, I'm very glad to see this!

This exact idea has been floating around in my head for ages, and I always wondered how well it actually worked - now I don't need to wonder. I started thinking about it as a potential solution to OLED burn-in. Thankfully, my OLED TV doesn't have any burn-in yet, so I never needed to investigate further.

  • xvector 2 years ago

    OLED TVs already compensate for burn in. A lot of your pixels are probably already somewhat "burnt" but you can't perceive it due to the corrective measures your TV takes. You'll only really notice it when it's irredeemably bad.

    • Candas1 2 years ago

      OLED TVs try to prevent burn-in; I am not sure they can compensate for it

      • empiricus 2 years ago

        I don't see how the TV can compensate for the burn-in without having an external picture of the screen. I have a couple-of-years-old OLED TV, and it has burn-in... The Netflix logo, the Netflix animation, and subtitles are quite visible all the time now. This is despite periodically running the TV's pixel refresh, or whatever it is called...

        • orbital-decay 2 years ago

          > I don't see how the TV can compensate for the burnin without having an external picture of the screen.

          Build a comprehensive degradation profile of your LEDs. Keep a burn-in accumulation buffer that tracks the intensity and usage of each subpixel. Use it in your EOTF to correct the picture.

          Some color-accurate monitors like Eizo are even profiled for temperature (and have a grid of temperature sensors)
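
          A toy version of that accumulation-buffer idea, to make it concrete (the wear model and the numbers here are invented; a real panel would use measured degradation curves per emitter type):

```python
import numpy as np

WEAR_RATE = 1e-6   # made-up efficiency loss per frame at full drive

class BurnInCompensator:
    def __init__(self, shape):
        self.wear = np.zeros(shape)          # accumulated per-subpixel usage

    def display(self, frame):
        """frame in [0, 1]; returns the light the panel actually emits."""
        efficiency = 1.0 - self.wear         # predicted remaining brightness
        drive = np.clip(frame / np.maximum(efficiency, 0.5), 0.0, 1.0)
        self.wear += WEAR_RATE * drive       # harder drive ages pixels faster
        return efficiency * drive
```

          As long as the model matches reality and there is drive headroom left, the emitted image stays correct as the panel ages; once a subpixel would need more than full drive, the correction has run out of road.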

          • empiricus 2 years ago

            sounds good in theory, but what about gradual errors accumulating after thousands of hours/millions of frames?

        • ahartmetz 2 years ago

          Burn-in is probably somewhat computable as a function of brightness, time, and temperature or so. So it can be compensated physically (burn out the other pixels... you don't want to be in the room, it's going to be annoyingly bright) or digitally by adjusting the signal.

        • astrange 2 years ago

          It has a memory of the history of what’s displayed on screen and uses that to predict what’s burned in. This can obviously be a privacy issue, so there’s a trade off in making it too exact.

        • serd 2 years ago

          Could you share a photo? I'd really like to see that.

          • empiricus 2 years ago

            https://i.postimg.cc/1Xy6V5wb/oledburn.jpg

            Not the greatest picture (some reflections). I should have used a gray background, but I used red because this color is the most affected.

            The burn-in is mostly harmless, but the middle blob is very annoying; yellow parts of the image become greenish when they get to the middle of the screen.

            • ornornor 2 years ago

              Ah that sucks. How old is the tv? And what’s the brightness in it?

              My 3 year old Panasonic gets used a lot but we keep the brightness down to the 45–55% range (it’s plenty bright) to avoid burn in. We also don’t display static content on it. And when we do, it’s mostly from Kodi which dims itself after 10 min or so.

              All that to say for those who are fearful of OLED because of burn in: don’t be. With some precautions, it’s fine. And having true blacks is absolutely glorious. I enjoy it every time I use the screen.

            • zepolen 2 years ago

              Which tv is this?

          • sss111 2 years ago

            second this!

      • xvector 2 years ago

        OLED TVs keep track of how long each pixel is on, and perform periodic uniform burns to bring all pixels down to the same remaining lifespan.

unglaublich 2 years ago

I wouldn't call this over-engineering. It's a reasonable solution to an actual problem.

  • axiolite 2 years ago

    I think the reasonable solution is: buy a new TV. We're not talking about a jumbotron display here. A new TV with proper color is likely very inexpensive.

    • adrianN 2 years ago

      Why create waste when it's not necessary?

      • tgsovlerkhgsel 2 years ago

        If you're not doing it as a hobby, the one-off effort put into repairing an item (or in this case, engineering the correction) is also "waste".

        If you want to dedicate that time to improving the planet, avoiding the waste of one TV is likely not a better use of your time than e.g. fixing some bug in some popular open source software that causes it to be less efficient.

        Let's say you make a change to Firefox that makes it use 0.1 Watt less on average, and let's be conservative and assume the ~350 million Firefox users use it for one hour a day on average. That's 35 MWh per day saved. Assuming 0.1 kg CO2e/kWh, that's 3.5 tons of CO2e saved each day.
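
        The arithmetic checks out, taking those numbers (user count, usage, and CO2 intensity are all the commenter's assumptions) at face value:

```python
users = 350e6        # assumed number of Firefox users
hours_per_day = 1.0  # assumed average daily usage
watts_saved = 0.1    # hypothetical average power saving
kwh_per_day = users * hours_per_day * watts_saved / 1000.0  # 35,000 kWh = 35 MWh
tonnes_co2 = kwh_per_day * 0.1 / 1000.0                     # 3.5 t CO2e per day
```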

        • adrianN 2 years ago

          Fair point, but I bet that making Firefox take 0.1 W less on average is vastly more work than fixing a TV image. And if you write a blog post about it, maybe other people can fix their TVs with less work.

          • RajT88 2 years ago

            I don't know if I am going to ever fix a TV this way.

            But the knowledge of how to create a custom shader is going to come in handy one day. More and more, I find that most knowledge comes in handy at some point. You just have to remember at the right time what is possible and go refresh your memory on it.

          • anon4584 2 years ago

            There are other ways to improve the planet as well.

            If everyone spent 2 hours a year removing washed-up waste from beaches, it would make a big difference.

            • smileysteve 2 years ago

              Washed-up waste is the tail end; the only "big" difference removing it makes is to that localized beach, until the next tide/current arrives from the source of the pollution.

              Saving the monitor and releasing the DIY fix addresses a (minor) head end of the problem, one that otherwise leads not only to long-lived waste in a landfill, but also to metals and plastics in the water.

        • nmilo 2 years ago

            What a ridiculous premise. The value of your time does not come in some interchangeable unit where 1 hour of TV repair is comparable to 1 hour of Firefox bug-fixing. Realistically, if the author had decided to get a new TV, he would not have spent that newly saved time trying to make up for the environmental damage of throwing the old TV out.

          (And who is even to say your bugfix would save power? It's not like Firefox has a power-usage detector in their CI pipeline.)

          • batch12 2 years ago

            Not to mention the time spent shopping, fuel spent delivering, time spent setting up and configuring...

    • agent008t 2 years ago

      Where is the fun in that? We are alienated enough from the stuff we consume as it is.

rixrax 2 years ago

I wonder if this is so 'generic' that e.g. AppleTV could add support for this? Take a photo of your TV with an iPhone when AppleTV is showing test image. And then the AppleTV output is calibrated appropriately to compensate for uneven backlight.

  • bzzzt 2 years ago

    Calibrating color with an iPhone is already a feature in tvOS, so correcting for local errors looks like a nice improvement in that direction.

    • dontlaugh 2 years ago

      Sadly it only works with iPhones that have FaceID.

      • bzzzt 2 years ago

        Probably related to the quality of the camera.

  • millimeterman 2 years ago

    There are already some companies that let you calibrate your TV's colors with your phone camera. Apple TVs do it and adjust the output signal while some new Samsung TVs can do it and actually apply hardware calibration. Adjusting for unevenness in the same way seems potentially harder but doable.

    • jimnotgym 2 years ago

      Isn't this done with an ICC profile and a lookup table, in which case varying it across the screen does not sound feasible?

nagonago 2 years ago

Clever solution! It kind of reminds me of when I sometimes run sound through customized EQ and compression to compensate for crappy speakers. My approach is not quite as scientific as this though, just "tweak until it sounds good."

I didn't know there was a version of MPC with a live shader editor, that is also very cool. This is actually a pretty good use case for such a feature.

  • klodolph 2 years ago

    My home receiver came with a microphone. You plug it in and the receiver blasts noise through the speakers, and comes up with EQ to compensate automatically. This wasn’t an expensive receiver, either.

    If you want a more scientific approach, you can use a tool like Room EQ Wizard, which is free—although it works best if you have some kind of flat response microphone, or calibrated microphone with a known response curve.

    (I’ll also add that you’re compensating for crappy acoustics in your room as much as you are compensating for crappy speakers.)

  • atahanacar 2 years ago

    I use AutoEQ (https://github.com/jaakkopasanen/AutoEq) for my headphones. It works by "parsing frequency response measurements and producing equalization settings which correct the headphone to a neutral sound". They also have a huge database of already measured and equalized data, which is what I use.

  • HPsquared 2 years ago

    I use Equalizer APO for this in Windows. Add some parametric EQ peaks (negative peaks) to cancel out resonant modes of my cheap speakers/headphones.

robomartin 2 years ago

I have over a decade of experience in the design and manufacturing of advanced display systems and ran into precisely this problem around twenty years ago. At that time we experimented with and developed pretty much exactly this type of compensation; implemented on custom FPGA-based real time image processing boards.

Just looking at the pictures, this does not look like a backlight problem but rather degradation of the liquid crystal layer. Yes, sure, there's interaction between the two. The purple shift, however, is very much something that was happening twenty years ago with some liquid crystal chemistries. Back then you could definitely tell which panels were not using high quality LC fluid.

Apple had this issue with their second generation HD Cinema Display product (the first aluminum enclosure, 24 in, 1920 x 1200 model). Some percentage of them would turn purple. I don't have Apple's stats on this. From my own experience the number fluctuated between 15% to as much of 50% of the panels in a batch going bad after moderate burn-in.

Having said all that, this type of compensation or fix might be OK for a TV at home or the computer monitor on the desk of a doctor or even a coder. Not good --at all-- for someone doing critical color work, such as a graphic artist. The reason is that you introduce spatial nonlinearities and differential errors.

The simplest way to put it is that you no longer have the full 256 (or 1024) steps per R, G, B channel between 0 and 100%. Hypothetically, you might have 256 for green and, say, 200 for red and 175 for blue. This means that the path from black to white is no longer monotonic. You can have serious color rendering errors through the color space. For example, it might be impossible to make an accurate 50% gray because you just don't have the RGB values needed to accomplish that. Worse yet, everything between 47% and 53% gray might look exactly the same.
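
To put a number on the lost steps: a quick check of how many distinct levels survive scaling a channel down (the 0.78 factor is just an example, not a real panel's correction):

```python
import numpy as np

codes = np.arange(256)                          # all 8-bit input levels
corrected = np.round(codes * 0.78).astype(int)  # channel scaled to 78%
levels = len(np.unique(corrected))              # 200 distinct output levels left
```

Roughly a fifth of the channel's resolution is gone, and neighboring input codes now collapse onto the same output value.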

You can also introduce serious gamma distortion. If, on top of that, you add a temporal element (video), well, it can be a real mess.

The real solution (for critical workflows) is to replace the panel.

BTW, this can apply to RGB OLED as well.

Tade0 2 years ago

I damaged my laptop's screen by leaving it running at 100% CPU with the lid closed for too long (not my intent - that was "sleep mode"). The adhesive keeping the LCD layers came off, creating a diagonal striped clouding pattern.

I was meaning to do something similar to the author[0], but couldn't make time and just used this opportunity to buy an external screen.

I'm glad someone put in the work so that now I may be able to use my original screen again.

[0] My most desperate idea was to run a RDP session locally and process the displayed image. Seemed simpler than trying to modify the content of the screen directly.

  • dylan604 2 years ago

    >The adhesive keeping the LCD layers came off, creating a diagonal striped clouding pattern.

    now this sounds like something my younger self would be interested in seeing. what kind of special effect look can you get from that by running different colors/patterns through that stripe? obviously, the only way to make it usable would be to record the screen externally, but it would not be the first time someone (ahem, me) pointed a camera at a screen for sfx/vfx. back in the old analog days, i would buy crts specifically because of their "issues".

  • jerf 2 years ago

    "The adhesive keeping the LCD layers came off, creating a diagonal striped clouding pattern."

    Oh, is THAT what that is. I have an ancient laptop where the screen is failing that way. Was trying to figure out what on Earth could produce perfectly diagonal streaks in an LCD! Though I'm still not quite sure how that connects; is the tape oriented diagonally?

    • Tade0 2 years ago

      > Oh, is THAT what that is.

      That is at least the explanation I got on some obscure forum where someone else had a similar problem and it was caused by heat.

      > is the tape oriented diagonally?

      Worse - it's a multi-layer sandwich of filters etc. which need to be perfectly aligned for a clear image:

      https://www.azom.com/images/Article_Images/ImageForArticle_2...

      Funny thing is in my case the hottest part actually remained ok - it's the surroundings that got, for lack of a better word, ruffled.

      I've seen videos of Indian repairmen disassembling a panel and putting it back together to fix such issues, but IIRC only the backlight ever came off.

justusw 2 years ago

This is such a cool idea and execution. I also like the DIY approach to patching MPC BE yourself, which shows how far OSS can take you.

I wonder, when applying a linear transformation like in the shader described, will the total available color space decrease? Simply put, if a one-dimensional color value on an arbitrary scale between 1 and 100 needs to be decreased by 20 for correction, the resulting maximum will be 80. Does that mean the total number of available color values will be less?

  • actionfromafar 2 years ago

    Mostly yes, but it also depends. What happens is exactly that: there will be clipping in the color band(s) you are correcting for.

    This can easily be verified with a simple thought experiment: imagine an area is almost completely red. This area will have to be complemented with full blast of green and blue to even achieve white, or partial blast of green and blue to achieve gray.

    It cannot achieve any color without a red component, hence reducing the area of the color triangle for that part of the screen.

  • klodolph 2 years ago

    No. Color perception is relative.

    Basically, what you are doing is adding a cast to the image. This cast cancels out with the cast that the backlights give. When you add two complementary color casts to each other, you end up with neutral gray.

    This results in an image which is darker, but still has the full color range that the TV is designed for.

causi 2 years ago

I wish you could do this with phones. I know more than one person stuck with a crappy pOLED screen with that terrible green tint to the bottom third of the panel.

  • jeroenhd 2 years ago

    Back in the day, before operating systems copied the feature, there were tons of Android apps that added a red shift on top of your screen by drawing over other applications (i.e. https://play.google.com/store/apps/details?id=com.csk.app.sh..., https://github.com/LibreShift/red-moon). This usually breaks or doesn't work with banking apps and other such secure contexts, but for most apps that's probably fine.

    Sounds like you should be able to do this by just modifying Red Moon to draw a more complex overlay. If the entire screen is affected equally, the stock app may just be all you need!

sorenjan 2 years ago

This reminds me of the World Cup in South Africa, where the spectators used vuvuzelas to make a unique sound that I found hard to endure. I figured it would be easy to use a notch filter to remove it, but I could never find a way to implement it in real or near-real time, only as post-processing.
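
For the record, such a notch filter is just a biquad, which is very much real-time friendly (a handful of multiply-adds per sample). Here's a pure-Python sketch using the standard RBJ audio-EQ-cookbook notch coefficients; the sample rate and Q below are arbitrary choices, and for vuvuzelas (fundamental around 233 Hz) you'd cascade extra notches for the harmonics:

```python
import math

def notch_coefficients(f0, fs, q):
    """RBJ cookbook notch biquad, normalized so a0 == 1."""
    w0 = 2.0 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2.0 * q)
    a0 = 1.0 + alpha
    b = [1.0 / a0, -2.0 * math.cos(w0) / a0, 1.0 / a0]
    a = [1.0, -2.0 * math.cos(w0) / a0, (1.0 - alpha) / a0]
    return b, a

def biquad(samples, b, a):
    """Direct-form I filter, one sample at a time - streamable."""
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b[0] * x + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x2, x1 = x1, x
        y2, y1 = y1, y
        out.append(y)
    return out
```

Because the filter only needs the current sample and four state values, it can run sample-by-sample on a live stream.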

  • ermir 2 years ago

    I remember the broadcasters also implemented this solution straight into the stream, the TV spectators did not have to do anything.

1-6 2 years ago

This technique is cool but it seems like it would cause poorly performing pixels to get worse over time.

Would it be possible to create an inverted image that would correct the backlight by burning in the bright areas? It would seem like a more difficult task to accomplish because pixel burn-in time is a variable that’s hard to measure.

infomax 2 years ago

I have the same idea each time I read something about backlights;

How easy would it be to place a cheap LCD at the back of the main screen and mirror the same output (in horizontally inverted mode)?

Technically it might require synchronizing the frame latency differences between the two devices, but would such a hack improve the perceived quality?

  • t4h4 2 years ago

    Hisense Dual Cell might be what you're talking about.

  • rzzzt 2 years ago

    The cheap LCD will probably not be able to produce enough light to brighten up the main display, and if the two panels are not of equal size, you will need to add some depth to the unit and come up with a projection system to "blow up" the light map to cover the entire image.

pferdone 2 years ago

When I installed my DIY backlight solution (a la Ambilight) on my TV, I also had a red shift on some of my LEDs. But that was mainly because not enough voltage reached the later LEDs on the strip to power green/blue, which need a higher voltage than red.

mordae 2 years ago

If only my screen's issues were with the picture. Instead, its internal clock lags behind the source, causing audio to become ever more delayed. It resets after a standby/wake-up cycle, but it's annoying to restart the TV every 3 hours or so.

  • TaylorAlexander 2 years ago

    I have a similar issue when using my raspberry pi as a Steam Link client for gaming. After a while the sound starts to get choppy, and this problem gets worse and worse until sound breaks entirely. But video still works. If I kill the game and re-start the steam link client the issue resets. Feels like some kind of clock drift, but in this case it seems to be affecting the ability to decode a digital stream. It’s a very strange issue!

Arrath 2 years ago

I wonder if something similar could be used to keep my TV from ratcheting up the overall brightness/backlighting whenever a subtitle is on screen.

The constant ramping of brightness is rather distracting.

  • kasabali 2 years ago

    There should be an option to disable automatic brightness.

    • Arrath 2 years ago

      I swear I've dug through the menus time and again! I guess I'll have a look again tonight.

      • kasabali 2 years ago

        Might be disguised as an eco mode or something.

tim_hutton 2 years ago

I need this for my Android phone to compensate for a burnt-in image.

  • jeroenhd 2 years ago

    Like I said in another comment: https://github.com/LibreShift/red-moon with a more complicated pattern than just a single color may be all you need for many use cases.

    I don't think Android has an easy to use global shader system, so you'll be stuck with overlay windows and the incompatibility they have with banking apps/DRM crapware that locks you out of your own screen without root access.

    • e-_pusher 2 years ago

      What do you mean by incompatibility with banking apps? I am also interested in what it would take to do a global shader on Android.

      • jeroenhd 2 years ago

        Android malware has previously used overlays to hijack taps and do other shady stuff.

        Banking apps generally set a flag on their apps to prevent overlays and screenshots to prevent malware from reading the screen and tricking the user.

        Depending on your phone, this can have two effects: either the overlay is disabled automatically or the banking app detects the overlay and blocks access (ie to the PIN keypad).

        Not all banking apps do this, but the ones I've used do.

dirtyid 2 years ago

How much does backlight drift over time? I have a few calibrated monitors with "uniform compensation" that has gotten decidedly less uniform over the years.

dmos62 2 years ago

Pretty cool, but you could instead replace the LEDs.

  • 1-6 2 years ago

    Sourcing the LEDs, disassembling the panel, and re-soldering each LED is not scalable. The solution mentioned in this article can be open-sourced and distributed to the masses.

    • kasabali 2 years ago

LEDs are sold as plug-in bars; there's no soldering involved. Taking out the panel safely is tricky, though.

  • MrBuddyCasino 2 years ago

I did this once, was pretty easy. The backlight color had turned completely blue. The strips can be bought as spare parts and swapped easily without any soldering. The hard part is finding out the part number of the LED strip for the particular TV.

ifqwz 2 years ago

The quality of computer monitors is appalling - after buying and returning several monitors because of quality issues, I briefly considered doing this system-wide to fix colour uniformity problems. Eventually I chose to keep my old monitor instead. It has uniformity problems as well but it doesn't cost me $1000 extra to keep.

But it's a damn shame that you can't throw enough money at manufacturers to make them make monitors without glaring QA problems. No matter how much you spend they sell you shit.

  • nottorp 2 years ago

    You talking about mainstream "gaming" monitors? I'm just a programmer and gave up on those long ago. Getting entry level monitors targeted at graphic designers now. Namely Asus ProArt and I think Dell has similar stuff?

    They don't have 240 Hz and sub 0.01 ms response times though, so if you're buying your hardware based on bigger numbers in specs they won't do.

    They're probably not that great for actual designers either, but they're good enough for me.

    • deergomoo 2 years ago

      > Asus ProArt

      I’m using one of these and I’m very happy with it. Reasonable price, 75Hz, supports USB-PD + has its own USB ports so I can one-cable it with my work laptop.

      Most importantly they come factory calibrated. I consider reasonable colour reproduction important even though I only use it for programming. I stare at this thing for 8 hours a day, it needs to look good.

      In fairness I also have a 165Hz LG UltraGear gaming monitor, and the image quality is almost as good. My only complaint is the black levels and grey uniformity suck, but for someone who wants performance and quality it’s a decent option.

      • nottorp 2 years ago

        > I consider reasonable colour reproduction important

When I got my first, I wasted 2 hours rewatching a movie I had seen recently, just because I didn't know it could look that good on a monitor :)

    • kevingadd 2 years ago

      The first graphic design monitor I bought from Benq had a busted image processor so that you couldn't turn off the sharpening - only set it to -5% (blurry) or +5%. Eventually I complained enough that they sent me a different monitor that didn't have the problem.

      I've had bad experiences with some of Dell's pro-grade monitors too. It feels like modern displays are so complex firmware and hardware wise that it's just very hard to find one that isn't defective in some way. This replacement Benq works for basic uses but its freesync is broken and it's already developed burn-in around the edges after about 1.5 years.

      • nottorp 2 years ago

        Funny, when I bought my first designer-ish monitor I threw a (gaming) Benq into the trash.

  • BoorishBears 2 years ago

    My XDR has been pretty flawless, and on the less exorbitantly priced end, I've had good experiences with a few business oriented IPS models (most recently the Samsung UR55).

    My main gripe is gaming monitors seem to be consistently the worst panels they can get their hands on.

It seems like they realize gamers will put up with a lot of garbage in exchange for raw "power" and take full advantage of the fact. I'm 99% sure that's why brands like Wasabi Mango (who used to take B-grade panels and sell them on the cheap) disappeared... the manufacturers just started shipping those panels as gaming models.

    • jabroni_salad 2 years ago

      Gamers hit up blurbusters to see how the motion is and it either looks okay or you get a headache. The OEMs optimize for that (hence the focus on g2g and adaptive sync) and then just jack up the saturation slider to compensate for everything else.

And hey, those Wasabis and Catleaps got you a 1440p IPS panel that did 90% of what you want for 50% of the price, at a time when 1440p and IPS were still kind of rare to own. Most people who got one were upgrading from a typical TN, so even a crappy IPS looked good in comparison. I was playing EVE Online at the time and caused at least 10 people in my corp to buy them when they saw how much screen real estate you got at the higher res.

      • BoorishBears 2 years ago

        You're misunderstanding the comment: the point is Wasabi Mango and co were good because they were charging significantly less for the lower grade panels.

        Now manufacturers are possibly prioritizing the highest grade panels for non-gaming use and using extremely expensive gaming monitors as a dumping ground for everything else.

For example, the 28" UR55 has few complaints about backlight bleed and, in my experience having bought several, is a reliable choice. Meanwhile the oddly similar 28" Odyssey G8 is known as a "buy and return until you get one that's ok" type of monitor, as are many other gaming monitors these days.

Gamers seem conditioned to just accept inferior panel quality as long as the other specs work, while business and casual customers would probably just buy another monitor if they saw weird issues. They might not know the term backlight bleed, but they'll still see it just fine.

    • autoexec 2 years ago

      If you fill your entire screen with nothing but a single solid color (try #FF6400) does it show up correctly? That is, without any gradient or areas of the screen where the color appears darker or lighter (especially around the edges or in the corners?).

      I've yet to find a modern monitor that doesn't have a problem with that basic test, which is pretty disappointing considering accurately representing a single color should be easy and I've had several CRTs that could do it.
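If you want to run this test without trusting a browser's color management, you can generate the test image yourself. A minimal stdlib-only sketch that writes the suggested #FF6400 as a binary PPM (a format most image viewers can display full screen; the filename is just an example):

```python
# Write a full-screen solid-color uniformity test image as a binary PPM.
def solid_ppm(path, width, height, rgb=(0xFF, 0x64, 0x00)):
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (width, height))
        f.write(bytes(rgb) * (width * height))

solid_ppm("uniformity_test.ppm", 1920, 1080)
```

View it at 100% scale, full screen, and look for gradients toward the edges and corners.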

      • adrian_b 2 years ago

        I use a pair of Dell monitors with IPS screens (U2720Q and UP2414Q).

        I always use as background a solid grey (#808080) and there is no noticeable non-uniformity.

I have now tried your color (#FF6400) on the U2720Q. Because this color is much brighter, if you look carefully you can see small areas at the corners, especially the two lower corners, with lower brightness. The two lateral edges also have slightly lower brightness, but the difference from the center is less visible than at the lower corners.

        However the areas affected are small (maybe a width of about 1/30 or 1/40 of the screen width) and you really have to look with the intention to find non-uniformities. When looking casually at the screen there is no obvious non-uniformity.

        For emissive displays like CRT or OLED it is easier to achieve uniform brightness over the screen.

      • Sohcahtoa82 2 years ago

        On my main monitor, a test like that fails spectacularly.

        If I put a solid purple, then if my eyes are directly perpendicular to the very center of the screen, it works fine. As soon as I move up or down, either the top or bottom of the screen becomes very noticeably blue.

        But in daily use, I never notice it. If I lean way back in my chair, then yeah, I'll need to adjust my screen to be able to see it.

        But this is a 144 hz 1440p monitor I got for $400 brand new in 2015. Pixel response times are great. The monitor works exceptionally well on all the Blur busters tests. It is an amazing monitor for gaming...

        ...except in dark scenes. It's a TN panel, which by default kind of lacks in contrast and brightness, and so to make it look good, I had to tweak contrast, gamma, and brightness settings, and it results in some clipping. #020202 and #010101 look like they get rounded down to #000000, and #050505 and #040404 look like they're getting rounded down to #030303.

        If I draw a pure black-to-white gradient, then there's noticeable banding. Like colors are only being represented in 7 bits per channel, and the darkest colors lose even more.

        But again, in daily usage, especially in games (as long as it's not a dark scene) and videos, it's not even noticeable.
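The banding described above is what you'd expect if the effective bit depth drops below 8 bits per channel: quantize a smooth ramp to fewer levels and adjacent shades collapse into the same output value. A toy illustration of that effect (a simplification; a real monitor's gamma/contrast pipeline is more complicated):

```python
# Quantize an 8-bit level down to `bits` bits, illustrating how a
# smooth gradient develops visible bands at reduced bit depth.
def quantize(level_0_255, bits):
    step = 256 // (1 << bits)
    return (level_0_255 // step) * step

gradient = list(range(0, 16))
seven_bit = [quantize(v, 7) for v in gradient]
# Adjacent dark shades collapse in pairs: 0, 0, 2, 2, 4, 4, ...
```

At 7 bits every pair of adjacent shades merges; crush the dark end further with contrast/gamma tweaks and the near-black shades merge even more aggressively, exactly as described.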

      • jiggawatts 2 years ago

        Monitor? No.

        But I just recently purchased a Sony A95K QD-OLED television, and holy cow the uniformity is just breathtaking. You start noticing the deficiencies in your own vision.

        There's a similar panel available as a computer monitor, but unfortunately only curved and 1440p.

      • BoorishBears 2 years ago

        Yes, I know what backlight bleed is...

        I've probably owned something like 15 monitors in the last 5 years, the XDR may not live up to the 25k reference monitor dreams, but no mere mortal would be able to drive one anyways.

      • ifqwz 2 years ago

        Most people will tell you that their monitor is flawless, then you do simple tests like that one and the monitor shows that it has severe issues and they respond "uh I never noticed, well I don't care". Which is precisely why manufacturers can get away with the shit that they sell.

        • BoorishBears 2 years ago

          I've owned pretty much every "notable" monitor in the formats I care about in the last few years, I'm sure I'm pickier than you.

The fact is you can pay for a monitor good enough to truly be flawless, it just costs more than people are envisioning. For example, my late-revision 5K Ultrafine is nearly as flawless as the XDR. I didn't list it because people who don't know better latch onto the wifi teething issues the first revisions had, but the panel has about as little backlight bleed as the technology allows (and that limit is not as poor as people are making out).

          -

          Honestly I've seen the opposite though, people who don't realize that any piece of screen large enough, photographed with exposure cranked way below normal will show _some_ sort of pattern and confuse _that_ with "terrible backlight bleed".

          But that's the panel equivalent of people who only watch Star Wars space sequences with brightness cranked to 11 in a pitch black room to judge HDR bloom...