crazygringo a year ago

This is amazing. It's truly bizarre that 5.1 surround sound, or HDR, or spatial audio, would be proprietary paid formats -- I mean, what if someone told you there was a license for stereo audio?

And sure, it's in Google's self-interest, so that they can bring these technologies to YouTube without paying anyone else. But it benefits everybody, so this is really fantastic news if it's something that takes off.

  • SllX a year ago

    They’re paying to use Dolby’s encoding schemes. It’s not the idea of “stereo audio” that’s patent encumbered, it’s some of the encoding schemes that can output stereo audio (much less so these days; patents are expiring, and all of MP3’s have). Same with surround audio and spatial audio and HDR: Dolby has an encoding scheme that works well and gets the desired result, is high quality (usually higher quality than the alternatives), and they market it well.

    I’ll welcome anyone that wants to enter the space that thinks they can do better, but Dolby is good at what they do, and Google often has massive commitment issues for new projects (although notably, not usually when it comes to codecs). I suspect what will happen is YouTube will develop their own HDR and audio codecs, and it’ll just be used on YouTube and almost nowhere else. That’ll be enough to drive client support, but it’ll be one more HDR format in addition to HDR10+ and Dolby Vision, and it’ll be one more set of audio codecs in addition to like the half dozen to a dozen they already decode, and ultimately this will be to increase the quality of YouTube while minimizing their licensing costs. That’s fine.

    • suzumer a year ago

      YouTube already has its own HDR-capable video codec in VP9. Also, HDR10+ and Dolby Vision aren't HDR formats; they're formats for storing dynamic metadata that can help TVs better display HDR video. The article seems to misinterpret what their purpose is. HDR video can be presented just fine without HDR10+ or Dolby Vision.

      • pa7ch a year ago

        So if VP9 and AV1 already store HDR data, how is Dolby Vision used? Is it HDR metadata shipped alongside video formats that don't already encode this data?

        Like if I stream Netflix to a TV supporting Dolby Vision, what format is the video being streamed in, and is the TV manufacturer just paying Dolby for the right to correctly decode this HDR info?

        • suzumer a year ago

          Dolby Vision supplements the HDR data already present in a video. For example, when you buy a Blu-ray disc that supports Dolby Vision, the disc contains several m2ts files containing HEVC-encoded video. Also present within the HEVC stream is the Dolby Vision metadata, supplied per frame. To see what this data is, I used [1], got the info for frame 1000, and saved it as JSON in this pastebin [2]. As you can see, it contains info regarding the minimum and maximum PQ-encoded values, the coefficients of the YCC-to-RGB matrix, among other things. This allows TVs to better display the HDR data: video encoded using Rec. 2020 color primaries with a max light level of 10,000 nits is far outside what current TVs are capable of displaying, so metadata giving the max PQ of a frame or scene allows these devices to make better decisions.

          [1] [2]

        • izacus a year ago

          Dolby Vision is a format for that metadata - it's hiding either at the beginning of, or as part of, frames in proprietary metadata extensions. When the video is played, this metadata is extracted by the video decoder (or demuxer) and then sent together with the video frames to the display, where the display then applies the (color, etc.) metadata to correctly show the frames based on its capabilities.

          Since the format of this metadata is proprietary, the demuxers, decoders and displays need to understand it and properly apply it when rendering. That's the part that needs to be implemented and paid for.

          But that's really not all of the technology - Dolby Vision isn't just the metadata format, it's also a definition of how the videos are mastered and under which limitations (e.g. DV allows video to be stored in a 12-bit-per-pixel format, allows mastering with up to 10,000 nits of brightness for white pixels, and defines a wider color range, so better, brighter colors can be displayed by a panel capable of doing that).

          is actually a pretty good article that overviews this topic (although you do need a basic understanding of how digital video encoding works).

          • account42 a year ago

            That we restrict who can properly decode data is even more absurd than other software patents.

      • CharlesW a year ago

        > HDR video can be presented just fine without HDR10+ or Dolby Vision.

        To see HDR content at its full dynamic range, you'll need an HDR-capable device or display. Viewers watching on non-HDR devices/displays will see an SDR¹ video derived from the HDR source.

        ¹ "Standard dynamic range" or "smooshed dynamic range"
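        That SDR derivation is essentially a tone-mapping step. As a rough illustration (not any particular service's actual algorithm - just a common extended-Reinhard-style curve), compressing HDR luminance into the 0..1 SDR range can look like:

```python
def tonemap_sdr(nits: float, peak: float = 1000.0, sdr_white: float = 100.0) -> float:
    """Map an HDR luminance in nits to a 0..1 SDR signal.

    Extended Reinhard curve: preserves midtones and rolls off
    highlights so that `peak` nits lands exactly at 1.0.
    """
    x = nits / sdr_white          # luminance relative to SDR reference white
    x_max = peak / sdr_white      # brightest input we expect to see
    return x * (1 + x / (x_max * x_max)) / (1 + x)
```

        Real pipelines also handle color (not just luminance) and per-scene metadata, which is exactly where formats like HDR10+ and Dolby Vision come in.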

        • jmole a year ago

          > To see HDR content at its full dynamic range, you'll need an HDR-capable device or display.

          Not exactly - you need an HDR mastering display to see HDR content at its full dynamic range. Essentially no high-volume consumer-level devices, with the possible exception of Apple's XDR lineup (MBP, iPad, iPhone, Pro Display), are capable of displaying non-windowed HDR content at full brightness.

          Everything else relies on tone mapping, even the latest 2022 OLED & QDOLED TVs.

          • suzumer a year ago

            Even HDR mastering monitors can only reach 2000 to 4000 nits [1], whereas PQ gamma goes up to 10,000 nits. This is why most HDR streams contain metadata giving the mastering display's peak luminance.
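            For reference, the PQ curve (standardized as SMPTE ST 2084) is public math; a sketch of the encode/decode transfer functions, mapping absolute luminance in nits to a 0..1 code value and back, looks like:

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Absolute luminance (0..10000 nits) -> PQ code value (0..1)."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def pq_decode(code: float) -> float:
    """PQ code value (0..1) -> absolute luminance in nits."""
    e = code ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)
```

            A 4000-nit mastering peak encodes to only about 0.9 on the PQ scale, which is why signaling the actual mastering peak (rather than assuming the full 10,000 nits) helps a display tone-map sensibly.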


          • CharlesW a year ago

            Good info, thanks! Can you elaborate on why you qualified this with "non-windowed"?

      • trissylegs a year ago

        Linus Tech tips talked about it. HDR on YouTube is a nightmare. There's no way to know if it'll come out in the right colours or get crushed into an ugly mess.

      • mlindner a year ago

        That’s kind of false.

        The entire point of Dolby Vision is that it should give you the same image no matter what display you show it on, if the device is capable of displaying it. It's an absolute format of light and color values, rather than a format based on RGB values.

    • PaulDavisThe1st a year ago

      > I’ll welcome anyone that wants to enter the space that thinks they can do better,

      "Do better" is tricky to define. By some metrics, Ambisonics, a decades-old, license free technology, "does better" than Atmos does. But by others, it does worse. Which metrics are important?

      • SllX a year ago

        Content is King here. Do you have source material and do you have playback devices or software out in the world?

        Google has both with YouTube, Chromecasts and Android phones and TVs. They’re one of the few power players that can unilaterally change up their codec and metadata suite, but only as far as YouTube goes.

        So “do better” means getting enough content behind a tech stack and still delivering a satisfying experience to the customer. If they can meet or exceed what Dolby delivers, I think that would be great! Even if they only match Dolby, that’s still pretty good.

        • PaulDavisThe1st a year ago

          Ambisonics hardware was on sale in the late 1970s and early 80s, but few people at that time had multi-speaker setups, which meant that the companies making it went defunct.

          Libre and gratis ambisonics encoding and decoding software has existed for more than a decade. It is the domain of enthusiasts rather than regular consumers.

          • SllX a year ago

            Makes sense, and this is actually the first I've heard of ambisonics. From your description, it doesn't sound like the kind of thing there will ever be tons of content for, so not tons of demand either. But a small demo-scene-like thing can be neat too.

        • justinclift a year ago

          That's a good point.

          If support for playing Ambisonics was added to (say) VLC, and easy-to-use conversion tools from Atmos to Ambisonics also existed, that'd probably go a long way toward increasing its adoption.

      • dTal a year ago

        I think the most important figure of merit is sadly how much money it will make the people in charge of implementing it in consumer devices.

    • kasabali a year ago

      > I suspect what will happen is YouTube will develop their own HDR and audio codecs

      It is stated in the article they're backing HDR10+.

      Not sure about the audio.

      • SllX a year ago

        Yeah there’s some supposition in the article so I’m not clear if it’s HDR10+ or an extension of it, just that Google wants to do something to break Dolby’s bank and they’ve got a heist crew with something to prove (i.e. Netflix, Samsung, Meta, et al.) and a placeholder name for the audio stuff (“Immersive Audio Container”). The distinction hardly matters since it’ll just be one more format (or set of formats) for manufacturers to support.

    • Melatonic a year ago

      This could put pressure on Dolby to release their older stuff for free (no licensing), which would be a huge win for everyone. But I agree about Google - the whole point of Dolby is to have a high-quality standard that is the industry choice for consistency. That does cost money (mainly in licensing their stuff, or chips that use their encoder), but the way I see it, they have to make a profit somehow. Is it overpriced? Probably.

      Is older standard dolby digital (and dolby digital plus) 5.1 surround sound still pretty damn good? Yep - and it should be free. They have 20 years of newer, superior stuff to make money from!

      • SllX a year ago

        AC-3’s (Dolby Digital) patents all expired in 2017. I’m not sure about E-AC-3 (Dolby Digital Plus), but my understanding is producers just move onto new sound technology. It’s not that DD and DD+ don’t both sound great, it’s that people move with advances in sound production to stay on or near the state of the art. If you want to write an AC-3 encoder/decoder, go for it, but that’s not much help for folks that want to use AC-4, TrueHD and Atmos.

        • Melatonic a year ago

          Well, the reason I mention those two older formats is that those are the encoders you need to use a TON of older but still very usable hi-fi equipment - as far as I know, anything that takes digital audio over TOSLINK (a very common connector, not just on older equipment) or over the single orange coaxial digital audio cable (that looks like an orange RCA port). I specifically bought an older Roku 4K model that has the Dolby Digital Plus encoder chip built in (and a TOSLINK output) so that I can keep using my still very nice-sounding Yamaha receiver from the early 2000s. They only had the TOSLINK and DD chip for a brief period in the Roku 4K (the current one does not have it, and the oldest ones do not either). Basically what this means is that Netflix can have the latest and greatest audio on their stream (whatever is the best of the best in 5.1 currently) and this little chip will nicely re-encode it on the fly down to the best-quality format my receiver can handle. If you do HDMI passthrough to TOSLINK or similar it will not work, as there's no chip to re-encode.

          Even a ton of modern motherboards have TOSLINK, and lots of new equipment as well, so it's a worthwhile way to get baseline-quality 5.1 audio that still holds up very, very well for home systems today. DD+ with a good receiver and large speakers will blow away most of the cheaper Atmos systems.

          You could also theoretically do this with most computers or a laptop with the right hardware and a little software digital encoder, but the issue is that most of the time the way they have instituted DRM means the browser is not going to even have access to 5.1 in the first place, or your device (like any Nvidia graphics card, for example - even if the HDMI output can carry high-quality audio) is going to just get 2.1 audio.

    • daveslash a year ago

      Speaking of stereo audio vs. encoding schemes, there was a great article posted here a while back diving into audio formats on 35mm movie film [0]. There's a photo showing the analog stereo waveform alongside two digital tracks: Dolby and Sony [1]. So the audio is physically printed onto the film in 3 different formats (1 analog, 2 digital), and it's up to the projector to decide which one it needs/has the hardware for.


    • m1nes a year ago

      To complement what you say, you also have to consider that there is an absolute myriad of processes, people, and companies within the audiovisual industry for which the Dolby workflow is almost inseparable from what they do and what they work with - people and companies that have made investments, sometimes of many zeros and many hours, to be able to work this way.

      Dolby is more than a standard for spitting six channels of raw audio out of speakers. That's just the end product; it's not just the standard itself.

      It is, for example, the hardware that is in cinemas and home theaters (encoders, decoders, Dolby RMU...), the certification processes that Dolby does in cinemas and recording and mixing studios, the mixing technicians who work with all that and send the final mixes with the netflix/hbo/whatever specifications, vendors, integration partners, speaker manufacturers...

      There are also plugins that work in DAWs like Pro Tools, the ecosystem (Dolby Atmos Renderer, Dolby Atmos Production Suite), just to scratch the surface.

      One thing is to publish a standard, and another is the ecosystem around that standard. It is interesting that there are new standards, but given Google's history with its long-term attention span, I have my doubts that this will materialize into anything more than an internal asset for Google.

    • klabb3 a year ago

      > and ultimately this will be to increase the quality of YouTube while minimizing their licensing costs.

      Yes, I can also imagine they have specific requirements on the file format like quickly skipping to timestamps, highly variable bitrate, handling text and graphics well etc. I imagine their requirements to be so general that it'll benefit anyone, especially those that do streaming.

      In either case, "just one more standard" (or the relevant xkcd) is an unavoidable obstacle for every new standard, and does not mean the project will fail. I have lots of critiques of Google, but this is one thing they are positioned to do well, and they have a decent track record. And given how the competition operates, it's frankly refreshing.

  • __david__ a year ago

    I was involved with digital cinema in the mid 2000s and attended standards meetings. Dolby was constantly trying to push their proprietary format for 5.1 audio into the standard but luckily everyone else at the table pushed back, correctly pointing out that just raw PCM 5.1 audio was perfectly adequate (and didn't require a bunch of licensing fees!). Dolby had to actually innovate (with Atmos) to get anyone to actually listen to them.

    Though I'm quite disappointed with ATSC 3.0 which appears to have given in to them and used their proprietary audio codec which no one supports yet. I'm extremely skeptical that it provides a tangible benefit over more widely supported formats. Yay, regulatory capture.

    • ryandamm a year ago

      For the pedantic record, "regulatory capture" typically refers to a situation where an entity responsible for regulating another entity is controlled or influenced by the body it is meant to regulate. Like if the SEC was influenced by Goldman Sachs, or something, or the FDA was influenced by the pharmaceutical companies it was meant to regulate.

      Standards bodies are composed of their member companies, who negotiate (oftentimes quite hard!) with one another over what IP ends up in the standards. They're non-governmental trade organizations (though I suppose with ISO it gets semi-governmental).

      Disclosure: I work at Dolby but do not have anything to do with standards. I dabbled in standards bodies at a previous startup but am really not qualified to weigh in on any of this, besides the stray pedantry around language.

    • phh a year ago

      Well, the French DVB standard requires E-AC-3, so not all standardizers got the memo... (and it's being used mostly in stereo).

      FWIW, Dolby does bring something compared to PCM: metadata to dynamically adjust the dynamic range on the final device, allowing a higher range on a proper home cinema setup and a smaller range in a noisy environment.

    • splitstud a year ago

      I suppose one man's regulatory capture is balanced by another man's monopolistic behemoth pushing out a competitor that doesn't have an alternate revenue stream.

  • JohnFen a year ago

    > what if someone told you there was a license for stereo audio

    Interestingly enough, stereo was under patent in the 1930s -- so you did need a license then.

  • pdntspa a year ago

    Thank god. Do you know how much of a pain in the ass it is to cleanly sample most modern movies because of this proprietary codec bullshit?

    The best results come from ripping voiceovers out of the center channel... but busting through encryption and figuring out the right proprietary codec to open the audio is a pain.

    • suzumer a year ago

      I've found that the easiest way to extract audio data from a Blu-ray movie is to rip the disc using MakeMKV and then use FFmpeg to convert the audio data to my codec of choice, be it WAV, AAC, Opus, etc. FFmpeg takes care of identifying the right codec to decode.
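      For anyone curious, that second step is basically a one-liner (filenames here are made up; assumes ffmpeg is installed):

```shell
# After ripping with MakeMKV, let FFmpeg probe the source codec and
# transcode the first audio stream; swap -c:a for aac, flac,
# pcm_s16le, etc. depending on the output format you want.
ffmpeg -i movie.mkv -map 0:a:0 -vn -c:a libopus -b:a 256k audio.opus
```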

      • pdntspa a year ago

        Does it work with the proprietary stream formats like DTS? I believe I am using some plugin for Audacity for that, but I hate doing the actual sampling in Audacity so it then gets loaded into another editor.

  • 52-6F-62 a year ago

    There _was_ a license for stereo audio. See: Blumlein and EMI, Fantasound, etc

    • crazygringo a year ago

      It made a lot more sense back in the analog days, when figuring out how to encode and decode stereo audio on an LP was non-trivial.

      But things like 5.1 audio or HDR or spatial audio aren't that much more than adding a bunch of extra channels/bits to a stream, defining relative power levels, and the signal strength follows a curve, and oh there's some positional metadata.

      The heavy lifting is done by compression algorithms which deserve to be patented because they do genuinely non-obvious stuff. Just like the way Dolby got digital audio onto a filmstrip was similarly clever.

      But stuff like 5.1 surround sound... it's just channels, man. In the digital world, it seems like it should be awfully easy to design an open standard.

      • mschuster91 a year ago

        > The heavy lifting is done by compression algorithms which deserve to be patented because they do genuinely non-obvious stuff. Just like the way Dolby got digital audio onto a filmstrip was similarly clever.

        Or... we could have governments begin funding universities like they did in the past, and the research would be available for all?

        Seriously, we have to re-think patents. The amount of money all that rent-seeking crap is costing societies each year is absurd, and not just in payments to trolls, but also stifled progress - think of stuff like e-Ink that's barely affordable.

        • robinsoh a year ago

          > Seriously, we have to re-think patents. The amount of money all that rent-seeking crap is costing societies each year is absurd, and not just in payments to trolls, but also stifled progress - think of stuff like e-Ink that's barely affordable.

          You think patents are why e-Ink is 'barely affordable'? Could you elaborate on what data you used to form that conclusion? A simple question: let's say, price of a panel versus cost of the raw materials to make it? What margin do you think they're making? Do you know?

          • justinclift a year ago

            Previous discussion of the E-ink patent situation:


            • robinsoh a year ago

              That looks like a non-answer to me. That post doesn't give any verifiable facts. Repeating it just shows a lack of desire for accurate information.

              • mschuster91 a year ago

                AFAIK, that was the situation at least in 2016, per media articles back at the time [1]:

                > And E Ink, the company, has such a patent moat that it has acted as a monopoly, which Behzadi says has kept prices perhaps too high. But E Ink lost a big patent fight in 2015, and the market could expand soon.

                The fact that almost everyone wants e-Ink technology, but e-Ink prices are still really high [2][3], leads me to believe that either there are still patents at play that prevent competition from rising up, or the competition hasn't managed to catch up for some other reason. It might also be the case that all of this is simply due to the aftershock of COVID supply chain disruptions; in any case, I haven't found a better explanation yet.




                • robinsoh a year ago

                  > The fact that almost everyone wants e-Ink technology, but e-Ink prices still are still really high [2][3] leads me to believe that either there are still patents at play that prevent competition from rising up, or that the competition hasn't managed to catch up for some other reason.

                  I disagree. It leads me to believe the underlying technology, electrophoretics isn't capable of achieving the volume scaling and update speeds that would make it achieve mass market pricing. Also the links you provided, do not substantiate the main thesis that's being asserted, ie "there are still patents at play that prevent competition from rising up". Further, the quotation you provided was from Behzadi who was a kickstarter guy who failed to deliver the product he promised and then proceeded to blame everyone else except himself.

                  There's a simple question you can ask to prove this to yourself. Ask everyone who is making this claim which specific patent is blocking them and how exactly it blocks their idea. You'll instantly realize the people making these claims are not actual display engineers with knowledge of electrophoretics. Typically, at best, they're bullshitters trying to sound clever, or at worst, like Behzadi, scammers trying to hide having overpromised and then misspent other people's money.

      • PaulDavisThe1st a year ago

        Ambisonics has existed since the 1970s. It is license-free, and doesn't define a speaker layout (which is one of the reasons why it is not widely used).

      • layer8 a year ago

        It’s not just channels. If you want just channels, you can use 5.1 PCM no problem.
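        To that point: with nothing but the Python standard library you can write a 6-channel (5.1) PCM WAV file, no licensed codec involved. A minimal sketch (the filename is made up):

```python
import struct
import wave

def write_51_wav(path: str, seconds: int = 1, rate: int = 48000) -> None:
    """Write a silent 5.1 WAV: six interleaved 16-bit PCM channels
    (L, R, C, LFE, Ls, Rs in WAV channel order)."""
    silent_frame = struct.pack("<6h", 0, 0, 0, 0, 0, 0)
    with wave.open(path, "wb") as w:
        w.setnchannels(6)    # 5.1 = six discrete channels
        w.setsampwidth(2)    # 16-bit samples
        w.setframerate(rate)
        w.writeframes(silent_frame * (seconds * rate))
```

        The compression and joint-channel coding are where the patented cleverness lives; the channel layout itself is trivial.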

      • 52-6F-62 a year ago

        "Seems like", and the reality are often different. That's some of the ingenious nature of these inventions—they seem like they should be obvious and easy. And yet they aren't. Not at first. It took a heavy amount of investment, organization, and talent to get to the point of stereophonic sound alone.

        • Dylan16807 a year ago

          > It took a heavy amount of investment, organization, and talent to get to the point of stereophonic sound alone.

          Sure, because that adds a ton of new complexity!

          Going from 2 to 3+ in a digital format does not add complexity.

          • bradstewart a year ago

            It certainly can add complexity when you consider bandwidth and/or processing constraints.

            • Dylan16807 a year ago

              If you figure out a particularly clever way to save on those, sure.

              But the baseline of "okay, compressed audio isn't very demanding, throw 3x as much bandwidth and processing at it" does not add meaningful complexity.

              • crazygringo a year ago

                Yup. Joint encoding [1] is really the main thing, but that's something that codecs already do, ever since MP3 with stereo music.

                The overall point remains: multichannel open container formats exist, and open audio codecs exist. An open standard for 5.1 surround sound, for example, seems like a relatively straightforward combination of the two. I'm not saying you can do it overnight, but compared to other open-source efforts, it's tiny.
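                For illustration, the simplest joint-encoding scheme is mid/side coding: store the sum and difference of a channel pair, since the difference between correlated channels is usually small and cheap to encode. A toy sketch:

```python
def ms_encode(left, right):
    """Mid/side transform: mid carries the shared (mono) content,
    side carries the (usually small) stereo difference."""
    mid = [(l + r) / 2 for l, r in zip(left, right)]
    side = [(l - r) / 2 for l, r in zip(left, right)]
    return mid, side

def ms_decode(mid, side):
    """Exact inverse of ms_encode."""
    left = [m + s for m, s in zip(mid, side)]
    right = [m - s for m, s in zip(mid, side)]
    return left, right
```

                Multichannel codecs generalize this idea across channel pairs and frequency bands; the transform itself is lossless and decades old.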


              • KerrAvon a year ago

                That's like saying "we should all drive hydrogen cars because you can just replace all the existing petroleum infrastructure with hydrogen infrastructure." Yes, but you have to execute on that. Any practical use of digital multichannel audio must consider bandwidth and decoding power as constraints.

                • Dylan16807 a year ago

                  When we're looking at video content, the audio processing for stereo is a tiny fraction of the bandwidth and power. Tripling that effort barely makes a difference.

                  In terms of petroleum infrastructure, it's like slightly adjusting the ratios at the refinery.

          • worik a year ago

            > Going from 2 to 3+ in a digital format does not add complexity.

            What adds complexity is determining how many channels to use and what to put through them.

            That is the important part now.

          • 52-6F-62 a year ago

            Have you worked on multi channel encoding/decoding or mixed a multi-channel project?

    • deltarholamda a year ago

      Imagine my shock when I found out I have been paying royalties to Doug Stereo for decades.

      Stupid Doug.

      • worik a year ago

        Clever Doug!

        Stupid deltarholamda!

        Na. Just kidding I'm sure deltarholamda is fucking smart!

  • m463 a year ago

    You might want to read the book "Schiit Happened", about the Schiit Audio startup.

    We don't see a lot of real innovation from small companies because of proprietary audio and video formats.

    In Schiit's case they do most things via RCA audio or USB/PCM + a DAC.

    In other cases, there are lots of wired headphones/IEMs via a 1/4" or 3.5mm audio jack, but basically none via Lightning.

  • Bombthecat a year ago

    Dolby Vision is patented. There is already a free alternative, HDR10+, which Samsung supports.

    But no one is using it.

    • izacus a year ago

      And you can read in this article why - Dolby aggressively made deals with streaming services to push their technology to profit from royalties on end-user devices.

    • BonoboIO a year ago

      What exactly is patented in Dolby Vision? There has to be some innovation for it to be patentable. HDR10 and HDR10+ exist.

      • jiggawatts a year ago

        Dolby Vision is poorly documented but I did find a long PDF explaining how it works.

        It does provide significant value worthy of patent protection.

        The main thing they did was develop a nonlinear color space designed so that each “bit” of information provides equal value. This way no bits are wasted, making compression more efficient and producing fewer artefacts.

        The color space is also set up so that the “lightness” channel is accurate and brightness can be rescaled without introducing color shifts.

        They also came up with a way of encoding the HDR brightness range efficiently so that about 12 bits worth of data fits into 10 bits.

        The format also allows a mode where there is an 8-bit SDR base stream with a 2-bit HDR extension stream. This allows the same file to be decoded as either HDR or SDR by devices with minimal overhead.

        Last but not least they work with device manufacturers to make optimal mapping tables that squeeze the enormous HDR range into whatever the device can physically show. This is hard because it has to be done in real time to compensate for maximum brightness limits for different sized patches and to compensate for brightness falloff due to overheating. Early model HDR TVs had to have FPGAs in them to do this fast enough!

        • suzumer a year ago

          While Dolby did design the perceptual quantizer gamma (PQ) [1] that almost every HDR device today uses, they waived patent claims on it [2] when it was standardized in SMPTE ST 2084 [3]. Everything that is proprietary about Dolby Vision (everything except PQ gamma) is relatively mundane - just dynamic metadata.




          • jiggawatts a year ago

            People assume that some RGB matrix transform and a nonlinear PQ gamma is all Dolby Vision is.

            In practice it's an entire ecosystem of certifications, professional calibration of panels, efficient encoding formats, etc...

            To reproduce the end effect of Dolby Vision you'd have to have a team of people liaising with television manufacturers, and makers of production software like DaVinci Resolve.

            It's not a trivial task that can be done through open source. It's real work that costs money.

            We can all hope and wish for open standards, but it's a bit like trying to come up with an open architecture for a railway bridge. It'll still take real work to customise it to any particular valley, the local geography, and the specific requirements. Dolby Vision is similar. The mapping from the full signal range to each specific panel is a complicated thing that requires quite a bit of work to determine.

            • suzumer a year ago

              You obviously know more about this than I do, but when I watch a well-encoded Blu-ray that doesn't use Dolby Vision, like Planet Earth 2, and compare that to a Dolby Vision-encoded disc, I fail to see a noticeable difference. And when I rip the RPU of a disc that uses Dolby Vision and look at some of the data [1], I again don't see enough to warrant this being a proprietary system, especially when we have HDR10+. So Dolby may have teams of calibrators and created certifications, but at the end of the day I fail to see why this warrants being proprietary. Please let me know what Dolby Vision does better than HDR10+ or regular HDR, because at the end of the day I'm just a hobbyist and want to learn more about this space.


              • jiggawatts a year ago

                Roughly speaking, the differences are:

                Dolby Vision is effectively 12 bit while using only 10 bits for encoding the actual signal. HDR10 is effectively... 10 bit. To achieve the same 12-bits of dynamic range they'd have to come up with a HDR12 format or something.

                You can think of 8-bit SDR brightness as something like 0 nits to 255 nits of brightness. This is technically wrong because it's a nonlinear curve, but ignore that for a minute. Increasing this to 10 bits like with HDR10 gives you 0..1,023 nits with the same "steps". Going further to 12 bits lets you take this to 4,096 nits while continuing to preserve the same level of gradation.

                The hiccup with this is that some displays have 600 nits of peak brightness and some have 2,500 nits. There are rare displays that can go to 4,000, and prototype displays that go to 10K nits.

                HDR10 only goes to 1,000 nits.

                Dolby Vision goes to 4,000 nits.

                Does this matter now on some cheap LCD TV that only goes to 600 nits? Probably not.

                Does it matter on an OLED panel that only goes to 1,000 nits? Maybe, because Dolby Vision has more true-to-life mapping from the maximum range to the display panel range.

                Will it matter more with next-generation panels capable of 3,000+ nits? Almost certainly.

                Then again, HDR10+ has dynamic metadata, which compensates a lot for its lower bit depth. Additionally, most smart televisions smooth the "steps" in smooth areas, largely eliminating the artefacts caused by HDR10.

                At the end of the day, they're both significantly better than SDR, but Dolby Vision is a touch better, especially on high-end panels.

                • suzumer a year ago

                  Based on my own research, I don't believe this information is correct. HDR10 video is encoded according to Rec 2100 [1], which states that the video is 10 bits and goes up to 10,000 nits. I know this to be true as I have written my own programs to encode HDR10 video. You are correct, however, that most HDR10 content is only encoded with a mastering luminance of 1000 nits [2]. According to this forum post [3], all DV content (besides profile 9) is encoded as a 10-bit HEVC stream. When people talk about 12-bit Dolby Vision, they are referring to Profile 7, which contains a full enhancement layer stored at 1/4th the resolution, holding the residual between the source and the Base Layer and allowing a player to decode the data to 12 bits. It appears [4], however, that most DV content only uses the minimal enhancement layer, which is only the dynamic metadata.

                  [1] -

                  [2] -

                  [3] -

                  [4] -
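                  For reference, the transfer function Rec 2100 specifies for HDR10 is SMPTE ST 2084, the PQ curve, whose 10-bit code values already span 0 to 10,000 nits; extra bit depth only buys finer steps, not more range. A minimal sketch of the EOTF (constants straight from ST 2084; this is illustrative, not production code):

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10 per Rec 2100.
# Maps a normalized code value in [0, 1] to absolute luminance in nits.
m1 = 2610 / 16384          # PQ constants defined in ST 2084
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(code: float) -> float:
    """Normalized PQ code value -> luminance in cd/m^2 (nits)."""
    e = code ** (1 / m2)
    y = max(e - c1, 0.0) / (c2 - c3 * e)
    return 10_000.0 * y ** (1 / m1)

# The top code value maps to 10,000 nits, so 10-bit PQ already covers
# the full range; a 12-bit signal just shrinks the gap between
# adjacent code values.
print(pq_eotf(1.0))   # 10000.0
print(pq_eotf(0.5))   # ~92 nits: PQ allocates most codes to the shadows
```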

                  • jiggawatts a year ago

                    I was oversimplifying on purpose. The standards are complex and have multiple modes. These interact in weird and wonderful ways with hardware capabilities.

              • mvanbaak a year ago

                This all depends on the system used to watch the content: from the settings used on the TV, AVR, and player, to the actual hardware.

                When I watch See and compare the DV version vs. the plain HDR version on my LG C1, the difference is big.

                • brokenmachine a year ago

                  Which particular content have you noticed a big difference on?

                  I have an (older) LG OLED, and haven't seen anything in DV that I didn't think would be just as good in HDR10, although I haven't compared the same content in both DV and HDR10 directly.

            • expensive_news a year ago

              I was under the impression that there were no Dolby Vision reference monitors, and that different players, like Sony and Oppo, output Dolby Vision encoded 4K Blu-rays differently, and thus there is no "correct" interpretation of the metadata.

              This is opposed to HDR, which has reference monitors and should look exactly the same on different, calibrated systems.

    • dopa42365 a year ago

      >But no one is using it.

      Aren't both formats just some metadata on top of the same video file? Netflix etc. serve both, depending on what your device supports.

      Pretty sure that both blurays and "4k TV" channels use HDR10(+) as well.

      Not like the viewer has to know or care about it (from what I can tell, they're basically identical anyway).

  • UltraViolence a year ago

    All of this is basically easy to make an alternative implementation of. Just use a slightly different audio codec and rearrange the fields a bit in the format.

    Probably most important for good uptake is a fancy name. HDR10+ just doesn't sound snazzy enough.

    • dylan604 a year ago

      Right, HDR11 sounds so much better.

      --But it goes to 11.

  • ksimukka a year ago

    The audio format (mono, stereo, 2.1, 5.1, etc…) isn't proprietary. The encoding schemes and codecs are, e.g. Dolby's. However, this is a bit more complicated than what may be apparent.

    The post production sound department for a film will create and mix the sound to a specific format (usually 5.1) and depending on how it will be distributed, Dolby (or other) is used for encoding the distribution.

    Dolby has been popular because it solves for cases when someone only has stereo or when a theatre doesn’t have a specific capability (Atmos) by mixing down or up the film’s delivery format.

    It is preferred that audio sources are uncompressed PCM in the WAV format. Most audio in a film is actually mono, but is mixed to a surround format with various effects and processing applied to achieve a desired effect.

    When the mixing process is complete, it is then rendered to a delivery format, which is usually a lossless format and/or a proprietary format.

    It is worth mentioning that the sound mix for cinema distribution is not the same for digital distribution.

  • babypuncher a year ago

    You don't need a license from Dolby to encode or play 5.1 audio unless you are using a format that Dolby owns (like AC-3, TrueHD, etc). Plenty of free and non-Dolby proprietary audio formats support arbitrary numbers of audio channels.

    • aesh2Xa1 a year ago

      From the fine article:

      > Google has a lot of influence on hardware manufacturers

  • adolph a year ago

    It may be in someone’s self-interest to make anything that any of us do zero-cost, at which point we do not have a business model for living (other than some form of the dole, contingent on agreeing to whatever conditions are put thereupon). To the extent that Dolby creates value and reasonably licenses their development, I think the employees of Dolby have a better claim to a fair living than a pure patent troll or an ad farm.

    • account42 a year ago

      > at which point we do not have a business model for living

      You know we could make it so that living does not require "business model".

      • adolph a year ago

        I also know that we could put pigs on airplanes. Google is still an ad farm that isn't creating new value for anyone other than Google in this case.

  • EGreg a year ago

    Yep. Once again, open standards smashing capitalism's rentseeking proprietary schemes helps a wide array of people around the world including new entrants who would have previously been priced out.

    • dimitrios1 a year ago

      > capitalism's rentseeking proprietary schemes

      Correction: greedy assholes rentseeking proprietary schemes. There is nothing in capitalism that says you must obtain your money through unethical means.

      • EGreg a year ago

        It is also true that there is nothing in capitalism that precludes this rentseeking scheme. Everything about it is quintessentially capitalist: a top-down organization employs people (job creation) and pays them money, which it gets by restricting other people and organizations from using something (private ownership) unless they pay (rentseeking), and it takes down its rivals (competition) to take over an entire industry. The only aspect that's missing is abusing publicly available resources (extraction) and dumping its waste (pollution).

        It is actually fine for trailblazers to charge large amounts for new tech. But open source gift economies can eventually break their stranglehold.

        Unless they use the power of government to enforce their rentseeking, which can be especially egregious with “intellectual property”.

        (Yes it is possible to be a libertarian who criticizes capitalism as using government force.)

      • js8 a year ago

        That's a lack of understanding of the origins of the term "capitalism". Capitalism is typically characterized by (a) a free market for labor and (b) private ownership of the means of production. Both of these concepts have been considered unethical. The fact that liberals also consider "rent seeking" unethical doesn't change that.

        • dimitrios1 a year ago

          > Both of these concepts have been considered unethical.

          Considered unethical by whom? I wouldn't classify disagreeing with those who consider those two concepts unethical as a "lack of understanding".

          • js8 a year ago

            By many intellectuals and philosophers, like Karl Marx. How do you know you would disagree with their arguments if you didn't even realize such people exist? That's the "lack of understanding" I am talking about.

            In fact, there are several different explanations of why these are unethical. One (Marx's) comes from the labor theory of value (which I don't believe in, as I think it is self-contradictory, but even classical liberals believed in it; that's why they hated rent seeking).

            Every ethical argument is based on some ethical principles (like axioms). The question is, do you disagree with the argument (the derivation of the conclusion) or with the principles?

            I'm tired of this sophistry where people claim "this is not a real capitalism". Why cling to the term then? Capitalism has always been a designation of the actual, real system, not some liberal utopia with no rent seeking. (The actual reason is that you can, somehow hypocritically, acknowledge the flaws of the system, while claiming TINA, anyway.)

            • dimitrios1 a year ago

              Knew it was going to be Marx.

      • dTal a year ago

        > There is nothing in capitalism that says you must obtain your money through unethical means.

        There is. Ethics constitute a voluntary constraint on behavior. Businesses with no ethics are less constrained, and therefore can outcompete businesses so encumbered.

  • lowdose a year ago

    This is unbundling of all royalty taxes, it is the same for fonts. It’s cheaper to purchase a made-to-order font than pay per use.

  • shmerl a year ago

    None of that seems at all recent. Shouldn't all these patents have expired already?

  • rodgerd a year ago

    > This is amazing.

    A company with an online video and advertising monopoly wanting to use that monopoly to destroy competitors isn't "amazing", it's "business as usual".

solarkraft a year ago

We are seeing a deeper and deeper split between Google (webm/VP9, webp, AV1, strongly pushing their formats) and Apple (HEVC, HEIC, Atmos, completely boycotting Google's formats), with Microsoft caught in the middle, supporting neither that well.

Apple's stance is especially interesting because it's unclear to me what they gain by pushing license fee encumbered formats.

  • CharlesW a year ago

    By supporting de jure standards (vs. Google projects which hope to become de facto standards), Apple gets a 3rd-party ecosystem for use cases where it wants one. Examples include USB, Wi-Fi, MPEG-4, Thunderbolt/USB4, web standards, etc. In many cases, Apple is an active participant in the standards process of de jure standards which are important to their business objectives.

    When de facto standards develop enough momentum to have customer value on their own or as part of other de jure standards, Apple will support them at the OS or app level. Examples include MP3, VP9, Opus, VST3, etc.

    • account42 a year ago

      How are MPEG-LA/Dolby/etc's standards more "de jure" than Google's? Both are standards only in the sense that a number of corporations agree on them. Neither have any kind of legal basis for being the standard.

      • midislack a year ago

        MPEG's an industry group, Google's not. Pretty easy to understand.

  • shp0ngle a year ago

    Apple is a governing member of Alliance for Open Media that develops AV1 standards.

    They joined quite late but are there.

    • account42 a year ago

      Apple is also a member of Khronos - the highest level of membership in fact. Doesn't mean that they are at all interested in furthering and promoting the standards Khronos oversees. Microsoft is also a member of Khronos btw and they have been actively hostile towards open graphics standards for much longer. It simply makes sense for corporations to get access to relevant standards bodies in their fields, whether they are interested in adopting those standards or interested in competing with them.

  • jiggawatts a year ago

    > unclear to me what they gain

    It's clear to me! I have an iPhone 13 Pro, and I tell you: the Dolby and HEIC formats are a key reason I use the Apple ecosystem instead of Android. The pictures and video I take have a huge dynamic range, accurate true-to-life colour, and a surprisingly small encoded size. The 4K Dolby Vision HDR video the iPhone takes looks like it has been professionally graded and rarely needs any further touching up. To reproduce it with any other device would require a significant setup of something like DaVinci Resolve and a RAW video workflow.

    I don't know if everyone but me is colorblind or what, but it's night & day to me. The Apple + Dolby Vision videos are mindblowingly good, whereas everything I've seen taken by or displayed on an Android device is always incorrect in some way. Blown highlights, oversaturated, or whatever.

    Google has little clue about colour, HDR, standards, quality, or anything at all along those lines that photographers or videographers care about. They're still releasing SDKs and entire operating systems with the baked-in assumption that images are always 8-bit sRGB SDR. Then they don't bother color-managing that at all, leading to the inevitable end result of either desaturated or garish color depending on the display panel used.

    PS: Microsoft used to be the best of the bunch back in the Vista days, and is now regressing to be the worst of the bunch. In part, this is driven by the need to be "cross platform" and compatible with Google's Android. Only Windows and Mac use 16-bit linear-light blending for the desktop window manager, whereas Android uses 8-bit unmanaged God-only-knows-what-color-space.

    • deanCommie a year ago

      > I don't know if everyone but me is colorblind or what

      Possibly. In most blind tests, for shots straight out of the camera, people prefer the Google Pixel photos to iPhone photos. Professionals complain that google over-amps the HDR, but consumers prefer that. Apple may have better codecs, but Google has better software post-processing.

      iPhone still wins in video though.

      • least a year ago

        > Possibly. In most blind tests, for shots straight out of the camera, people prefer the Google Pixel photos to iPhone photos.

        People tend to favor shots that are punchy and high contrast, as far as I can tell from the many blind tests various youtubers have performed over the years.

        From my viewpoint, iPhones have almost always had better color processing straight out of the camera, as it is much closer to true to life than Samsung's or Google's. Google's image processing in other areas seems to be better. Low light continues to not be that great on iPhone, which is a big use case, while the Pixel excels at it. Denoising isn't great. The iPhone can also tend to lose some fine details that are present in Google's photos.

        Overall, I think Apple's photos are more accurate and serve better as a baseline to edit photos, but Pixel wins out in many common scenarios that the iPhone just doesn't do as well along with having a more pleasing image straight out of the camera (because most people want to look better than real life, not like real life).

        • deanCommie a year ago

          Yeah, because "True to life" is irrelevant for people's personal photos.

          "True to the memory of life" is what's more important, and Pixel photos absolutely feel like they better match what I feel like I see.

      • Jaxan a year ago

        I think this is right. To me the Google pixel photos look unnatural. But everything (dark areas to bright areas) is very clearly visible and it has a certain professional look to it. I can see why people like it. But to me, it looks unnatural.

        Maybe it’s a matter of getting used to.

    • BeFlatXIII a year ago

      …yet for all that focus on proper color reproduction, Apple *still* can't be bothered to gamma-correct when blurring colors.

      Fun experiment that will have you filing radars until the end of time: find a red notification bubble on an iPhone home screen. Now, slightly pull down the screen (as if to open the search screen) slowly. Note how the blurred bubble suddenly got darker instead of retaining the same visual brightness only redder.
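          The darkening is easy to reproduce in code: averaging sRGB code values directly underestimates luminance, because sRGB is a roughly gamma-2.2 encoding, and a correct blur must average in linear light. A minimal sketch (the transfer-function constants come from the sRGB standard; "blur" here is reduced to averaging two pixels):

```python
# Sketch: a "blur" (averaging two pixels) done naively on sRGB code
# values vs. correctly in linear light. Values are normalized to [0, 1].

def srgb_to_linear(s: float) -> float:
    # sRGB decoding per IEC 61966-2-1 (piecewise linear/power curve)
    return s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l: float) -> float:
    return l * 12.92 if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

black, white = 0.0, 1.0

naive = (black + white) / 2                  # average the code values: 0.5
correct = linear_to_srgb(
    (srgb_to_linear(black) + srgb_to_linear(white)) / 2
)                                            # ~0.735

# The naive result is visibly darker than the linear-light result,
# which is exactly the kind of darkening described above.
print(naive, correct)
```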

    • solardev a year ago

      Does this really work..? I just switched to an iPhone 14 from a Pixel 5 and the pictures are so much worse. Maybe if I bothered with a raw workflow, but in automatic mode, the pixel photos are way better...

      • riversflow a year ago

        At the time of writing this comment, all sibling comments and their children are about comparing stills, whereas GP is specifically talking about video.

        Keying/Grading video is a significantly bigger chore than color correcting a still.

      • expensive_news a year ago

        Not that I’ve had either phone but I would say you’re likely in the minority. I have the iPhone 12 and have always been happy with the iPhone cameras. It’s not like it even comes close to my DSLR but they’re decent for phones. I haven’t had a lot of Androids either but all of the ones I owned had terrible cameras.

        You don’t need to shoot in RAW to get good results. I’m curious what your complaints are?

        • solardev a year ago

          The Pixels are a standout among the usual Android crapfest. Google has really good computational photography. The images are much better balanced, not oversaturated, much sharper, night mode is awesome, etc.

          Even with the 48 MP sensor I'm getting subpar results. Apple might have better hardware but their computational photography hasn't quite caught up yet.

          Here's a sample gallery (not mine)

          I was expecting a major upgrade going from the Pixel 5 to the 14 Pro. It's actually a slight downgrade.

          • jiggawatts a year ago

            Keep in mind that it is borderline impossible to show iPhone photos in full fidelity in a web gallery. The benefit you get out of iDevices is the wide gamut and HDR photo format. Converted to sRGB SDR JPG images, there's not much difference compared to every other phone manufacturer.

            This is my point: Apple is far above the lowest common denominator.

            • solardev a year ago

              If I'm hearing you right, it's possible to capture higher-fidelity photos on iPhone, but then you require either special hardware to view it correctly and/or a special workflow to make a better end product for web viewing by tweaking the values yourself?

              Whereas the Pixels just make better web-ready photos from the get-go, not because their hardware is better, but because their algorithms are. That seems a lot more useful to me because you can actually capture good photos nearly every time, with zero effort, and send them to anyone.

              If you have a specialized professional workflow, I entirely believe that you can -- with effort -- generate a better end product. But that's not what most people use their phones for. "Dynamic range" doesn't mean anything to the average person, and if they use HDR it's to create that 2000s-like super amplified gimmicky lighting effect. There's not really even a way to get real HDR to properly/intercompatibly show on most consumer computing devices anyway... same with gamut and color spaces.

              IMO (only) making it look good on the average device should be a more important goal than making a workflow that only professionals with specialized knowledge and software can utilize.

              • jiggawatts a year ago

                You are misunderstanding.

                > special hardware

                Any iPhone, or any Mac device. The other vendors refusing to support HEIC and/or HDR image formats is their prerogative. Keep in mind that the Apple formats are just H.265 still frames, not some bizarre proprietary standard!

                > special workflow

                I press the shutter button and send the result via iMessage. Is that some esoteric professional workflow?

                Again, “web ready” means precisely: the lowest common denominator supported by browsers as far back as NetScape Navigator 4.

                Apple decided not to limit themselves to 1990s era standards.

                Google is content to remain hobbled.

                • expensive_news a year ago

                  Yes, this is also what I most appreciate about Apple’s approach. They’re the only company that cares about calibrated displays on consumer products. Even top-end TVs like LG OLEDs ship with terrible default settings. The only displays that are as color-accurate out of the box as Apple’s are the ones on my professional camera.

                  If I edit a photo and I send it to someone with an iPhone I know the colors will look exactly like how I intended it to look. You can’t say the same about Androids because the screens often aren’t even trying to be color accurate.

                  I’m never sure if some of the photo file types are an extension of Apple’s “walled garden” approach or if other companies just make lazy products. But the rarity of consumer displays that aren’t tuned for maximum blue-light emission is maddening and other companies really need to step up their game in this area.

                  • brokenmachine a year ago

                    >Even top-end TVs like LG OLEDs ship with terrible default settings.

                    I'm sure LG and all the other manufacturers know what they're doing.

                    Consumers must prefer the garish vivid mode, aka "torch mode". Implying that they shouldn't seems to disregard their preferences.

                • brokenmachine a year ago

                  My Samsung phone can take HEIF pictures and HDR10+ video, but both of those are disabled by default.

      • 4ggr0 a year ago

        I think we're dealing with the classic "Professional vs. Regular/Consumer opinion" here.

        The person you responded to seems to work with media or at least knows some things about video formats and editing. To them, true, raw colors and representations are what they want. You on the other hand may like the slightly unrealistic, but appealing look of the Pixel phones, which tend to look oversaturated to professionals, but better to consumers.

        Similar to how audiophiles like headphones which sound flat and boring to regular people, yet audiophiles think regular peoples headphones have way too much bass and treble.

        Different products and needs for different use-cases, I guess.

        • solardev a year ago

          These things aren't mutually exclusive, FWIW. I'm not a professional, but I was a passionate amateur back when SLRs were still a thing. I hate the gimmicky Instagram-style photos myself and overwhelmingly prefer "neutral" pictures -- it's one of the reasons I chose the Pixel.

          IMO the Pixel phones aren't more saturated than the iPhones (and certainly less than default Samsungs), they just have better dynamic range and color balance. Here's a sample gallery from a sibling post (not my gallery):

          The pool picture with the red roof and teal/blue water, for example... the Pixel has a less saturated roof but better color correction for the pool. It's way sharper in the tree photo. In the last two photos with the yellow car by the fence, the iPhone completely blew out the sky but the Pixel captured it fine.

          That's just one gallery, but that's been my experience the overwhelming amount of time. The Pixels are unique among Android cameras in their ability to process natural-looking photos at very high quality, very quickly, at a very reasonable price point. Other manufacturers ramp up hardware specs or added thirty lenses, but Google's approach is to tackle it all computationally, stacking and analyzing multiple exposures and using algorithms to produce a single output (all automatically). The results are really quite amazing -- to my eyes.

          At the end of the day digital photography is always a series of judgment calls -- by the sensors, by the firmware, by the software, by the format, by the compressor, by the photographer, doing signals processing on detected light in various ways. Apple optimized the beginning and end stages of that, whereas Google focused on the middle.

          It might be possible to get better output out of the iPhones if you run it through special software and workflows, but out of the box the Pixels generate better outputs, and I think that's the way most users (on either side) are going to use them...

        • jiggawatts a year ago

          You're guessing at my point of view... correctly.

          I'm not exactly a professional, but I've done enough photography that I twitch slightly whenever I see color balance that hasn't been set correctly to neutral grey.

          It's like a typesetter that can't handle bad kerning:

          • 4ggr0 a year ago

            I'm glad that I was able to guess correctly! :D

            You're maybe not a professional per se, but if I read "color balance that hasn't been set correctly to neutral grey", I have no clue what that's supposed to mean. (I know a bit more now because I fell into a color balance rabbit-hole after reading your comment).

  • theandrewbailey a year ago

    I figured that it was because Apple is more heavily involved with Hollywood than Google is, from production to distribution, and Apple would rather side with standards formulated by trusted experts, professionals, and academics with long track records like MPEG than ones that seemingly came from nowhere.

    • ajross a year ago

      > standards formulated by trusted experts, professionals, and academics with long track records like MPEG

      I genuinely can't tell if this is humor or not. If it is, bravo.

      • tpush a year ago

        Perhaps formulate a point instead of pretending to be incredulous.

    • kmeisthax a year ago

      Fortunately, MPEG got lobotomized by ISO a few years back[0]. So there's room for a new standards org (e.g. AOM) to overtake them. In fact, Apple joined AOM recently, so presumably they will be providing their own expertise to that organization too.

      [0] Specifically, Leonardo Chiariglione got fired and the MPEG working group cut up into a bunch of pieces.

  • sparrc a year ago

    > Apple...completely boycotting Google's formats

    This is not exactly true, they are a "founding member" of AOM:

    > Apple's stance is especially interesting because it's unclear to me what they gain by pushing license fee encumbered formats.

    My guess is cheaper hardware. AV1 is simply behind HEVC in terms of hardware (ie, ASIC encoder/decoders) support.

    • account42 a year ago

      > This is not exactly true, they are a "founding member" of AOM:

      Doesn't mean that they are there to adopt those standards rather than to be informed on how to best compete with them.

      > My guess is cheaper hardware. AV1 is simply behind HEVC in terms of hardware (ie, ASIC encoder/decoders) support.

      This might be a reason for anyone else but Apple makes their own hardware.

  • jacooper a year ago

    It was the main reason dealing with Safari was a PITA; luckily, they added WebP support recently.

    • account42 a year ago

      Hopefully they won't take this long for JXL.

      • jacooper a year ago

        Looks like they are supporting AVIF.

        It's in iOS 16 and in the Ventura beta.

        And almost nobody supports JPEG XL; is it much better than AVIF?

        • account42 a year ago

          Comparing lossless compression formats is tricky, but it is at least competitive with AVIF. The big advantage of JXL is that it can provide additional compression for JPEG files without any (further) quality loss.

          Both Chrome and Firefox have implemented JXL support (to some extent; FF's implementation did not work for me when I tried it), but it is still hidden behind flags for now.

  • kyriakos a year ago

    Apple prefers to pay Dolby rather than help Google in any way, it seems.

    • thornjm a year ago

      The article said Dolby prefers to charge on the playback side. Does anyone know if Apple pays Dolby anything? iPhones produce a lot of content so it makes sense for Dolby to provide Apple with free licensing if it popularises the formats.

  • asciimov a year ago

    Apple is just supporting the technology people likely already have in their homes.

    • mminer237 a year ago

      I think it's actually the opposite. Historically, nothing but Apple devices have been able to easily view HEIC images. I think half the reason Apple does it is to make life as hard as possible for people not using Apple devices, so they will give up and switch.

      • account42 a year ago

        You can see this strategy also in their refusal to provide first-class chat interopability.

  • ksec a year ago

    Apple typically picks the better standard, one that is widely used in the industry and where licensing cost isn't (as much of) an issue. At least in Steve Jobs's era, he did appreciate people doing innovative work and being paid for it. And costs weren't as ridiculously expensive as they are today; when they were, Steve would demand they show why they were worth that much.

    Now I think that is mostly gone in Apple.

duped a year ago

There is already an open media format (edit: for object-based immersive audio): it's called SMPTE 2098. Granted, it's basically the mutant stepchild of DTS and Dolby Atmos, but it does exist.

The real problem isn't the hardware manufacturers but the content producers. Dolby engages in blatant anticompetitive behavior that basically requires hardware manufacturers to support their codecs and makes it impossible to innovate on the actual media formats in a way that might compete. For example: paying for content to be released in Atmos, or giving away the tools to author it for free.

  • Aissen a year ago

    If only it was the only thing... Did you know Dolby is actively fighting open source software like VLC behind the scenes? Anyone holding a Dolby license is pressured to stay away from VLC, even when it's a superior technical solution.

    The reason is simple: VLC/OSS developers have been implementing Dolby technologies without paying a dime or using proprietary blobs. How dare they!

  • zeroimpl a year ago

    I'm familiar with several Dolby tools for processing ATMOS and Dolby Vision, none of which are free.

  • scarface74 a year ago

    So it’s anticompetitive to give away tools for free or paying content owners to use it?

    Would you say the same about a search engine company that gives its browser away for free and pays its competitor in mobile a reported $18 billion a year to be the default search engine for its platform?

    I would much rather tie my horse to Dolby than a company that has the attention span of a toddler.

    • monocasa a year ago

      Google is anticompetitive and so is Dolby. Just like in Google v. Oracle, there are no true good guys here, only outcomes that are better or worse for the general public that various corporate entities have aligned themselves with for some perceived short-term benefit.

      Open standards are good for the general public, as are allowing re-implementations of APIs. Taking a look at Google's anticompetitive use of search combined with ads would be absolutely fantastic too, but I'm not going to gate other actions on it unless there's some semblance of a chance that the connections between the two actions are anything other than theoretical.

      • scarface74 a year ago

        So it’s open source? What happens as soon as Google abandons it and stops supporting in Android? YouTube? Do you think Apple will ever support it?

        • izacus a year ago

          This is a standard, not a piece of software. Nothing "happens" just like nothing "happened" to mp3 or jpeg when they stopped being actively changed.

          • TheRealPomax a year ago

            MP3 was not "an open standard": it was a patented technology owned by the Fraunhofer Institute, and it led to hilarious lawsuits against people putting MP3 capabilities in both hardware and software without even knowing they had to pay license fees, until Fraunhofer gave up and gave MP3 to the world. And not "way back when"; this is a story that didn't have a happy ending until 2017, and then only because technology had passed it by, with the industry having mostly moved on to newer, better codecs.

            • worik a year ago

              > until Fraunhofer gave up and gave MP3 to the world

              I thought the patent ran out.

              • TheRealPomax a year ago

                In the US, patents don't just "run out" cleanly: the patent owner can tweak something trivial, file a new patent on top of their own while the old one is still active, and then argue that even though the old patent has expired, folks who act on it are now in violation of the new one.

          • nightski a year ago

            Standards are only as open as their implementations. The fact that Google isn’t really involving anyone in this, and first introduced it at a closed-door hush-hush event, makes me extremely skeptical.

            • ipaddr a year ago

              A standard created by one body requires adoption by others before it can be a standard, unless the first party owns the market. Implementations don't have to be open.

        • monocasa a year ago

          It's an open standard, so better than open source per se since you're not reliant on any particular source tree.

          And in those cases you've listed, you're left with strictly more options than if it's not an open standard.

          • scarface74 a year ago

            You’re not left with an “option” of an audio standard that none of your customers’ hardware or browsers support.

            • monocasa a year ago

              If the open solution doesn't take off in favor of the proprietary solution, then you're in essentially the same end state as if there was no open solution in the first place.

              • scarface74 a year ago

                Not after you’ve already encoded your audio to support it. From history we know that Google isn’t going to do the leg work it takes to make the standard ubiquitous.

                • monocasa a year ago

                  Why would you only encode your media in a format that none of your customers have support for in the first place?

                  Additionally, being an open standard, you can probably rely on ffmpeg supporting it. This allows you to transcode into something that your proprietary encoder will support for ingest if it comes to that.

                  • scarface74 a year ago

                    Yes because transcoding audio couldn’t possibly lead to lesser quality.

                    • monocasa a year ago

                      Neither of these (HDR10+/DV, open spatial audio/Atmos) is a compression scheme, so yeah, kind of. They're both metadata schemes on top of other, pluggable compression schemes, so I wouldn't expect a conversion to necessarily end up with a loss of quality.

                      This is also ignoring the first sentence: your whole supposition is based on a scenario where you as the content producer for some reason encoded in a format that your customers don't have, and don't have the masters for some reason. Which is basically absurd for anything that would need dynamic HDR or spatial audio.

            • ipaddr a year ago

              The browsers will support this. Customer hardware from large vendors will take more time, or may never support it if they are getting paid not to. But other hardware can take its place.

              • scarface74 a year ago

                You think Apple will support it or even Microsoft? The large hardware vendors not supporting it is really a big deal don’t you think?

                • slac a year ago

                  They both support the AOM.

                  • account42 a year ago

                    They both also "support" Khronos - doesn't mean anything.

    • galdosdi a year ago

      Sometimes, yes. Selling something for below cost with the intent to drive your competitors out of business is called "dumping" and has been illegal since the 1800s.

      Because selling a product below cost is fundamentally unsustainable, there is no logical reason to sell a product for less than cost besides doing so temporarily with the hopes of being able to later recoup the loss with higher, above cost prices. This is anticompetitive because an inferior product can win out if it is backed by bigger pockets that can afford to stay unprofitable longer than the company making the superior product.

      This is basic economics, not really something that needs to be thought out and debated from scratch by HN over and over every time it comes up. It really would be helpful if everyone who is thinking about commenting on economic issues like this tried at some point to spend a couple hours reading an AP Microeconomics text. If a high school or college kid can do it in a semester, an intelligent adult can cover the high points in a weekend.

      • monocasa a year ago

        Like most calls to just understand econ 101, the real life applications are significantly more complex.

        > Because selling a product below cost is fundamentally unsustainable, there is no logical reason to sell a product for less than cost besides doing so temporarily with the hopes of being able to later recoup the loss with higher, above cost prices.

        There are plenty of other reasons. The one applicable here is "commoditize your complement". Zero-cost-to-consumer codecs mean more eyeballs on YouTube videos, which means more ad revenue for Google. That thought process doesn't lead to later ramping up consumer costs. And if it's truly an open standard, how are they going to increase costs when anyone can simply release a free implementation?

        • scarface74 a year ago

          Anyone can release a “free version” of a phone that runs Android. How successful of an effort will that be without proprietary Google Play Services?

          • monocasa a year ago

            Which wouldn't be an issue if play services was an open standard.

    • water-your-self a year ago

      >So it’s anticompetitive to give away tools for free or paying content owners to use it?

      Absolutely. Go look at how Carnegie won the steel market by starving his competition.

      • asveikau a year ago

        More recently, Microsoft and Internet Explorer.

        • scarface74 a year ago

          And absolutely nothing came of it in the US. Microsoft was not forced to unbundle IE, no browser choice, nothing.

          Now all platforms are bundled with browsers and plenty of other software.

          • asveikau a year ago

            I think that has more to do with who was running the DOJ in the 90s vs. the early 2000s.

            > Now all platforms are bundled with browsers and plenty of other software.

            This is an interesting question. You could take it as Microsoft's argument before the DOJ being correct, that browsers become an inextricable part of an OS. Whether or not they would have been had they not included it, it seems like we can say in hindsight, of course it would have. But surely Microsoft's decision to do so influenced the way the market went.

      • scarface74 a year ago

        Does that also apply to every other service or piece of software that is given away for free to gain traction? Does it apply to money-losing VC-backed companies?

        • duped a year ago

          Impact and outcomes matter more than explicit behavior when you look at this, it isn't a binary "do x and get banned" kind of program.

    • mattnewton a year ago

      > So it’s anticompetitive to give away tools for free or paying content owners to use it?

      If the intent is to drive others out of the market, it could be right?

      • scarface74 a year ago

        So what is the intent of any company that gives away software or any startup that operates at loss funded by VC money?

        Or if the same company gave its mobile operating system away for free to undercut a rival and then as soon as it became ubiquitous, started making much of it closed source and forcing companies to bundle its closed source software?

        • mattnewton a year ago

          If you ask Peter Thiel, it’s certainly not to engage in competition for the ultimate benefit of consumers. Not that he somehow speaks for everyone in VC funded startups at all. I just don’t find many other voices willing to say “competition is for losers” out loud where regulators might hear, even if the structure of their investments looks like it needs to find a monopoly to me.

    • diob a year ago

      Yes and yes.

      Although honestly, it's always a nuanced thing.

      But typically the idea is to use money and undercutting to force out competition, then when the competition dies quality goes to crap.

      • scarface74 a year ago

        So it would be like if a big tech company used its money from search to fund an audio standard and give away the software for free to undercut a rival…

    • duped a year ago

      Yes, loss-leading in general can be anti-competitive. And I'm not the only one who thinks so! It obviously depends on the scale, but having been in the space in the past I can tell you the "we are literally paid to use this" hurdle is next to impossible to clear.

gjsman-1000 a year ago

It will be interesting to see how this plays out. Dolby Vision is actually a mess of a standard, with several different not-quite-compatible "profiles." Streaming video is Profile 5, UHD Blu-ray Discs are Profile 7[1], and iPhone Dolby Vision is Profile 8. Profile 7 cannot be converted into Profile 5 [completely incompatible and different algorithms!], devices that implement Profile 5 can't necessarily play Profile 7, but Profile 7 can with difficulty be theoretically converted into Profile 8 which is basically stripped-down Profile 7 with quirks[2]. Basically, Dolby Vision is fragmented within itself. Fun stuff.

[1] And within Profile 7, there is the difference between the MEL (Minimum Enhancement Layer) which just adds HDR data, versus the FEL (Full Enhancement Layer) which actually adapts a 10-bit core video stream into a 12-bit one for FEL compatible players. Not all Profile 7 implementations can handle FEL, but can handle MEL. So even the profiles themselves have fragmentation. FEL and MEL are, within Profile 7, actually HEVC video streams that are 1920x1080 that the player reads simultaneously with the 4K content. So a FEL/MEL player is actually processing 2 HEVC streams simultaneously, so it's not a huge surprise why it isn't used for streaming DV.

[2] Profile 8 comes in 3 different versions, Profiles 8.1 through 8.4. 8.3 is not used. Profile 8.1 is backwards compatible with an HDR10 stream, Profile 8.2 a SDR stream, and Profile 8.4 an HLG stream. Big surprise that iPhone uses 8.4 because HLG can be seamlessly converted into SDR or some other HDR formats when necessary.

anigbrowl a year ago

As a consumer, I don't care. Dolby makes money from licensing, but they don't ask for that much, they innovate constantly, and they do a lot of public education.

This seems like one corporation flexing on another rather than a great sense of mission; it's not like Google doesn't have IP of its own that it prefers to keep locked up. I suspect this signals a strategic desire to move into the A/V production space, where customers have big demands for storage and computing resources.

  • Seattle3503 a year ago

    I care because ATSC 3.0 includes AC-4, a proprietary Dolby format. None of my software will play the audio from ATSC 3.0 over the air broadcasts for this reason.

    ATSC 3.0 is a government standard for how public airwaves should be used. It strikes me as wrong that the government has basically mandated Dolby licensing for hardware manufacturers and software libraries.

    • anigbrowl a year ago

      That's a good counter-argument which I agree with. I was looking at the issue in terms of voluntary consumer behavior (visiting theaters, buying TV or hifi equipment) and industrial supply rather than public good considerations. Where limited resources are allocated by government as (ideally) neutral broker we should certainly prefer openness.

      There is an argument for patent protection as an innovation motivator, but lockup periods are more likely to lead to runaway market dominance due to preferential attachment. Where there's a monopsony (like government as owner of spectrum), that's probably going to lead to negative outcomes.

      Thanks for widening my perspective on that issue.

      • ajdude a year ago

        I’m kind of directly affected by this as a consumer, as I am a big user of Plex, but can no longer receive sound with my antenna television if I run it through Plex.

        This is because ATSC3.0 requires Plex to secure licensing for AC-4 audio in order to process the sound, which they legally haven’t been able to do yet.

        Unfortunately, my reception is significantly better via ATSC3.0, yet I’m left waiting and unable to utilize it.

        • d110af5ccf a year ago

          It looks like there are some ffmpeg forks that include patches supporting AC-4 but they haven't been merged upstream yet.

          Not that it changes anything. Proprietary formats have no place in such government specifications. But possibly a relevant ffmpeg build could help you do what you wanted to.

    • layer8 a year ago

      ATSC 3.0 doesn’t mandate Dolby AC-4, it also supports MPEG-H. Both require licensing. The thing is, there is no equivalent patent-free technology available (object-based 3D audio).
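      To make "object-based 3D audio" concrete: instead of fixed channels, each sound is an object with a position, and the playback device renders it to whatever speakers are actually present. A toy sketch of that rendering step (the function and numbers are my own illustration, not from the AC-4 or MPEG-H specs):

```python
import math

def render_object(azimuth_deg, left_deg=-30.0, right_deg=30.0):
    """Toy constant-power pan of one audio "object" onto a stereo pair.

    Object-based formats carry a position per sound; the decoder picks
    gains for whatever speaker layout actually exists at playback time.
    """
    # Normalize the object's azimuth to a 0..1 position between the speakers.
    t = (azimuth_deg - left_deg) / (right_deg - left_deg)
    t = min(max(t, 0.0), 1.0)
    theta = t * math.pi / 2
    # Constant-power law: left_gain**2 + right_gain**2 == 1 everywhere.
    return math.cos(theta), math.sin(theta)

left, right = render_object(0.0)  # an object dead center splits power equally
```

      A real renderer does this in 3D across arbitrary layouts (ceiling speakers included), but the principle is the same.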

      • izacus a year ago

        Exactly. Which is why "I don't care" is a very shortsighted and terrible look on a standard that's at least royalty free.

        • layer8 a year ago

          Yeah, my point is you can’t blame ATSC 3.0 for making use of the existing standards. And it’s not too different from how a lot of mobile/wireless technology depends on patents.

    • toast0 a year ago

      Well of course ATSC 3 includes a new proprietary Dolby format, ATSC 1 got AC-3 as a mandatory audio format, too.

      I'm not sure if ac-4 is mandatory, but it seems like it is? Kind of a big pain indeed.

      • kieranl a year ago

        ATSC mandates AC-3. AC-3 audio is actually Dolby Digital; they just couldn't call it Dolby in the standard, so they renamed it to AC-3 instead.

    • scarface74 a year ago

      Do you feel the same way about the EU forcing everyone to use USB-C?

  • anotherman554 a year ago

    Dolby Vision playback effectively doesn't work on PCs, so if you are a consumer who uses PCs, you have a reason to care.

    • Avamander a year ago

      Windows Movies & TV can play Dolby Vision if you install the HEVC and Dolby Vision extension from Microsoft Store.

    • solardev a year ago

      Last I tried you couldn't even play 4k Blu Rays on PC. Has that changed?

      • account42 a year ago

        MakeMKV can read them with LibreDrive firmware. Now if you are asking for a media-conglomerate approved way then no idea...

        • solardev a year ago

          ...and this is why I just stopped trying, heh. The industry really shot themselves in the foot there.

  • hparadiz a year ago

    Dolby Vision is basically broken on Linux and not great on Windows.

    • Melatonic a year ago

      From what I remember it basically does not work in any browser based setup - have to use Netflix apps or dedicated software?

    • anigbrowl a year ago

      I can't argue with that as I only know about their audio side.

    • brokenmachine a year ago

      HDR in general is non-existent on Linux.

  • izacus a year ago

    Competition is always good for a consumer, especially here where right now you simply don't have a choice but to pay the rent to Dolby on every TV audio device.

    C'mon, this is market capitalism 101

    • anigbrowl a year ago

      Luckily I'm not a capitalist and am not convinced my life will be improved by shaving costs down to nothing until there is only one supplier left standing. For that matter, I'm not much of a consumer; I'm still using a 10 year old TV XD

      What's weird to me is how google is selling this as a win for the public, when the marginal costs added by Dolby are so low. Even in the audio production space, Dolby stuff is a little expensive for an individual (surround sound plugins costing hundreds of dollars) but it's not a big overhead for a recording studio. Their product is quality and consistency at industrial prices and imho they deliver on this.

      There isn't an underground of frustrated audio engineers dreaming of how theatrical sound could be so much better if it weren't for big D. Spatial audio rebels build quadrophonic sound systems for raves, but you didn't hear it from me.

    • rodgerd a year ago

      How is one of the richest monopolistic companies in the world deciding to destroy a market segment "competition" that is "good for the consumer"?

      • BiometricAndy a year ago

        So like Apple with their ATT? HN users simping for Apple.

pier25 a year ago

Anyone remembers the open format HDR10+ pushed by Fox, Panasonic, and Samsung?

Me neither.

The world at large has settled on Dolby Vision and Atmos, and it will be very difficult to change this. Not only on the consumer end, but especially on the pro audio/video end.

Google would need first to offer plugins for DAWs, video software, etc, to work with these formats before there's enough content that manufacturers and streamers consider it.

  • kllrnohj a year ago

    Of course I remember HDR10+, along with HDR10 and HLG. All of which are quite common and broadly used.

    Hollywood movies primarily standardized on Dolby Vision, but the entire HDR ecosystem very much did not. Sony cameras for example primarily only shoot in HLG, even for their cinema cameras.

    Similarly, games regularly opt for HDR10/HDR10+ for their HDR output instead of Dolby Vision. Why? Because it's cheaper, and dynamic metadata is largely pointless in an environment where the lighting content of the next dozen frames isn't known.

    • pier25 a year ago

      > Hollywood movies primarily standardized on Dolby Vision

      No, pretty much the entire video/streaming industry did. Apple, Netflix, Disney, HBO, etc, either stream in DV or HDR10 (non plus).

      Physical Bluray is slowly dying (I own a bunch of those) so streaming is really where most of the HDR video content lives.

      > Similarly games regularly opt for HDR10/HDR10+ for their HDR output instead of Dolby Vision

      Fair point, but consumers keep complaining the PS5 doesn't have DV which is an indicator of what people want. DV is actually a big selling point for the Xbox Series X.

      On PC, I don't know. I've been playing HDR in consoles for years but support on Windows has been pretty bad until recently. My impression is HDR is so much more popular on consoles vs PC. Same with Atmos and surround.

      • izacus a year ago

        That's funny, since PS5 doesn't support Atmos at all and on Windows you need to buy a paid plugin to make it work for anything that's not a Home Theatre system.

        (And even if you have a home theatre system, Windows games will still prefer outputting 5.1 / 7.1 PCM and mixing 3D effects by themselves).

        I'd also be interested to hear where those Dolby Vision complaints for PS5 are coming from, I haven't heard anyone really say that despite HDR being debated quite a lot :)

        • Avamander a year ago

          > you need to buy a paid plugin to make it work for anything that's not a Home Theatre system.

          You need to buy a license if you want things Atmos-ified (so HRTF) for your stereo headphones. It's basically worthless.

          You don't need to buy anything if your media player can decode and downmix Atmos to surround (like Windows Movies & TV).

          Home Theatre systems just get Atmos passed through to them if compatible, so they can then decode and downmix the positional audio according to your configuration.

          > Windows games will still prefer outputting 5.1 / 7.1 PCM and mixing 3D effects by themselves).

          I wish. If games have surround at all, it's usually only analog 5.1/7.1. You need Dolby Digital Live for a digital surround output in most cases (and that can be a PITA to arrange, e.g. patched Realtek drivers). DDL basically provides a fake analog output for software and then sends a compressed digital signal to your decoder.

      • 0x457 a year ago

        HDR on PCs has more important issues than the lack of Dolby Vision. Well, I guess you can say Dolby Vision would indirectly benefit, because it requires a minimum of 1000 nits brightness for certified displays, while VESA will gladly certify a display with 400 nits peak brightness as HDR capable.

        > Fair point, but consumers keep complaining the PS5 doesn't have DV which is an indicator of what people want. DV is actually a big selling point for the Xbox Series X.

        They want it because there are realistically two options right now: HDR10 (not HDR10+) and Dolby Vision. DV being superior in every aspect from viewer perspective. I didn't even know about HDR10+ until today. In other words, what people actually want is HDR dynamic metadata because it looks a lot better than static metadata.

        Since, like you said, every streaming service is either DV or static HDR10, it means people say that they want DV.

        • kllrnohj a year ago

          > Well, I guess you can say Dolby Vision indirectly would benefit because it requires minimum 1000 nits brightness for certified displays

          It unfortunately does not. That's what a Dolby Vision mastering display requires, but to get the DV logo on your display all you really have to do is pay Dolby money and use their tonemapper. Unlike VESA, they don't actually have a display certification system at all.

          • 0x457 a year ago

            Oh, I misunderstood the requirements then. I thought it's required for certification as well.

            Well, then PC HDR is doomed.

            There is also an issue with HDR calibration on PC for some reason. I have no issues on a console connected to my TV, but the same game on the same TV running on PC gets all weird looking.

    • TheTon a year ago

      For games, another reason they don’t need dynamic metadata is that they produce their content on the fly, are already doing tone mapping themselves, and can tailor it to the display’s characteristics.

    • mvanbaak a year ago

      HDR (HDR10) is the high dynamic range data. HDR10+ and DolbyVision are additional metadata on top of that.

      • kllrnohj a year ago

        Not quite. HDR10 is static metadata on top of the BT.2020 PQ data, with PQ being what gives it high dynamic range. HLG is another way to encode high dynamic range data. The Dolby Vision content captured by an iPhone is actually HLG, for example, not PQ. Other Dolby Vision profiles, like profile 5, are similarly not BT.2020 PQ but instead IPTPQc2.

        So HDR10+ is dynamic metadata on top of HDR10's static metadata on top of BT2020 PQ which is what makes it "HDR" in the first place. That's easy.

        Dolby Vision is then a profile clusterfuck. Sometimes it's just dynamic metadata on top of HDR10. Sometimes it's dynamic metadata on top of HLG. Sometimes it is its own thing entirely.
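        To spell out that layering, here's a rough model of my own (the class and field names are mine, not any spec's actual data structures):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class StaticMetadata:            # roughly what HDR10 adds
    max_cll: int                 # brightest pixel in the stream, nits
    max_fall: int                # brightest average frame, nits

@dataclass
class DynamicMetadata:           # roughly what HDR10+ / DV profile 8.1 add
    per_scene_hints: List[dict]  # e.g. per-scene tone-mapping targets

@dataclass
class VideoStream:
    transfer: str                # "PQ", "HLG", or "SDR gamma"
    static: Optional[StaticMetadata] = None
    dynamic: Optional[DynamicMetadata] = None

    def is_hdr(self) -> bool:
        # The transfer function, not the metadata, makes a stream HDR.
        return self.transfer in ("PQ", "HLG")

hdr10 = VideoStream("PQ", StaticMetadata(4000, 1000))
hdr10_plus = VideoStream("PQ", StaticMetadata(4000, 1000),
                         DynamicMetadata([{"scene": 1, "target_nits": 600}]))
hlg = VideoStream("HLG")         # broadcast-style HDR, no metadata at all
```

        The point of the model: strip the dynamic layer and you still have HDR10; strip the static layer too and a PQ or HLG signal is still HDR.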

    • izacus a year ago

      If I remember correctly, broadcasting is on non-Dolby standards as well. UK uses HLG right?

      • kllrnohj a year ago

        Broadcast TV is HLG because it's backwards compatible with non-HDR TVs. And yes used by UK (BBC is the one that came up with HLG even)
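        The backwards compatibility falls out of the curve itself: the bottom of HLG's opto-electrical transfer function is a plain square root, so an SDR set treating the signal as ordinary gamma still shows something reasonable, and only the top of the range is log-encoded for highlights. A sketch using the published constants from ARIB STD-B67 / ITU-R BT.2100:

```python
import math

# HLG OETF constants as published in ARIB STD-B67 / ITU-R BT.2100.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073...

def hlg_oetf(e: float) -> float:
    """Map scene-linear light e in [0, 1] to an HLG signal value in [0, 1]."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)              # gamma-like toe: SDR sets look OK
    return A * math.log(12 * e - B) + C      # log shoulder for HDR highlights

# The two branches meet exactly at e = 1/12, where both give 0.5.
```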

foghorp a year ago

What does Dolby actually do on a day to day basis?

Do they have researchers working on new audio and video formats?

Or is it now all just a self-perpetuating machine for generating licensing revenue, based on existing patents?

Sorry for the ignorant question but I'm clueless about their ongoing contributions to the industry.

  • kllrnohj a year ago

    Yes. Dolby Laboratories is who came up with the PQ transfer function, which is what's responsible for HDR10, HDR10+, and Dolby Vision, for example. Dolby Atmos is similarly a genuinely new way to handle spatial audio. They've also created things like ICtCp.

    So they do actually contribute some worthwhile stuff. And some of it is open standards (like PQ & ICtCp).
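    For the curious, PQ (standardized as SMPTE ST 2084) is just a fixed transfer curve mapping a 0-1 signal to absolute luminance up to 10,000 nits. A sketch of the EOTF with the constants from the standard:

```python
# PQ (SMPTE ST 2084) EOTF: decode a nonlinear signal value in [0, 1]
# to absolute display luminance in nits. Constants are from the standard.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(v: float) -> float:
    """Luminance in cd/m^2 (nits) for a PQ-encoded value v in [0, 1]."""
    p = v ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

# pq_eotf(1.0) is exactly 10,000 nits; pq_eotf(0.0) is 0.
```

    That curve is the open part; the Dolby Vision license fees sit on top of it.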

    They also absolutely troll licenses though, Dolby Vision being a perfect example: check out the list of Dolby Vision "profiles" and you'll see some truly dumb ones that exist obviously just to slap a Dolby Vision license & branding on an otherwise boring, generic video format. Dolby Vision profile 9 is a perfect example: it's not even HDR at all. It's literally the same stuff we've all been watching for a decade+, but with marketing wank shoved onto it.

    • cosmic7dice a year ago

      Oh god, THAT'S Dolby Vision? I thought it was a proprietary codec or something. This is a joke the proportions of which have never been seen.

      • jiggawatts a year ago

        Dolby Vision is significantly more than just the colour space and gamma curve.

  • Anechoic a year ago

    > Do they have researchers working on new audio and video formats?

    Short answer: yes. Search for "Dolby" under "Author Affiliation" in the AES paper search [0] and you can see the research they publish. (The papers themselves are unfortunately behind the AES paywall, but authors will usually send you the paper if you ask nicely.)


  • Mindwipe a year ago

    Most of the industry uses HDR grading tools made by Dolby.

  • izacus a year ago

    Sue everyone violating their patents probably :P

CobrastanJorji a year ago

I love it when a powerful corporation's self interest happens to align with the public's interests.

  • Bombthecat a year ago

    Do they?

    Imagine Android TV or Netflix disabling Atmos because of licensing and switching to the new format, which needs a new receiver.

    I spent $2k on mine for Atmos with 4 ceiling speakers...

    • izacus a year ago

      Why would they disable Atmos?

    • kyriakos a year ago

      By the time that happens, Dolby will have the next iteration of Atmos and you'll have to get a new receiver. This is the sad truth about modern consumer electronics: very short life spans.

ugjka a year ago

Whatever the pirates adopt will be the de facto codec

  • 0x457 a year ago

    Ehh, that's not the case anymore. It used to play a role because source material used some ancient, inefficient codecs. WebRips are, well, ripped as-is. BDRips are generally ripped as-is as well.

    • account42 a year ago

      Ripping Blu-Rays as is makes little sense except for preservation as most of them have hardcoded black bars in the video stream instead of letting playback devices letterbox/pillarbox as needed. Apparently the standard only supports a few resolutions, not even including 24:10 and other common cinema aspect ratios.

      There are also still plenty of Blu-Rays around using VC-1, which is garbage compared to current codecs. Perhaps all new ones have migrated to newer codecs - all the UHD ones have, since UHD Blu-ray uses HEVC.

  • IshKebab a year ago

    Nah, piracy is much less popular than it used to be. And there are plenty of formats that have been quite popular with pirates but had zero use commercially. Matroska for example.

    • account42 a year ago

      webm is matroska (with limits on what you can use) so it actually has seen quite a bit of commercial use.

      • IshKebab a year ago

        It's based on Matroska. But it's a different thing. Matroska itself still isn't used commercially.

        • account42 a year ago

          It isn't just "based on" matroska, it literally is matroska. Call it a subset if you want to be pedantic.

          • anjbe a year ago

            An incredibly limited subset for sure. WebM with VP8/VP9/AV1 is a far cry from how Matroska is typically used, with supported codecs like h.264/h.265 and chapters, subtitles, multiple audio tracks, and linked external files.
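            At the byte level the relationship is plain: a .webm and a .mkv share the same EBML magic and outer structure, differing at the container level mainly in the DocType string and in what WebM allows inside. A crude sniffer to illustrate (not a real EBML parser; the scan-for-the-string shortcut is mine):

```python
from typing import Optional

EBML_MAGIC = b"\x1a\x45\xdf\xa3"  # shared by Matroska and WebM

def sniff_ebml_doctype(header: bytes) -> Optional[str]:
    """Crudely identify an EBML container from its opening bytes.

    Real code would parse the EBML header properly; here we just look
    for the ASCII DocType, which appears within the first few dozen
    bytes of well-formed files.
    """
    if not header.startswith(EBML_MAGIC):
        return None
    head = header[:128]
    if b"matroska" in head:
        return "matroska"
    if b"webm" in head:
        return "webm"
    return None
```

            A real demuxer obviously checks far more than this, but it's why tools like ffmpeg treat the two as one format family.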

  • TacticalCoder a year ago

    Honest question: what would currently be the most common audio format used in pirated movies? I've got a home cinema projector but never paid much attention to the audio (just using my stereo: it's a good stereo, but it's not 5.1 or anything).

    • izacus a year ago

      You get all of them, since they tend to be ripped as-is from the source media. I commonly see Dolby Digital+ (because Netflix/Prime/etc. use it), Dolby TrueHD (because BluRays use it), and nowadays more and more Atmos.

      Occasionally you see an odd DTS/DTS-MA now and then, but not a lot.

    • 0x457 a year ago

      Well, pirates don't care about the licensing of audio or video formats, but most do care about fidelity. For audio, generally, whatever the source used is kept: Dolby Digital+, Dolby TrueHD, Dolby Atmos, or AAC/AAC+ if the source is wack.

    • pixelatedindex a year ago

      Personally I see a lot of AAC/AAC+ and some Dolby Digital.

  • kyriakos a year ago

    And porn

    • dopa42365 a year ago

      Porn uses old and free formats for maximum compatibility. h264 and aac, works on every budget device from the last decade (and older). Even the highest quality 4k porn is just 20 mbit/s h264. These companies aren't youtube, they don't encode every video in 10 different formats for every use case. If pornhub statistics are to be believed, like 85% of visitors are mobile users. There's no point in optimizing anything for expensive big screen great audio home theater experience when your target audience is 5" low res screen 1% volume ashamed at night phone fappers.

      The very opposite of cutting edge technology (with the exception of tiny niches like VR).

mikeyouse a year ago

The EU is randomly investigating the Alliance for Open Media on antitrust concerns.

Can anyone understand how it's in anyone's best interest to investigate / potentially stop an open source standard / royalty-free format that has buy-in from tons of big orgs?

  • 2OEH8eoCRo0 a year ago

    > randomly

    > The Commission has information that AOM and its members may be imposing licensing terms (mandatory royalty-free cross licensing) on innovators that were not a part of AOM at the time of the creation of the AV1 technical specifications, but whose patents are deemed essential to (its) technical specifications

    Sounds worth looking into.

    • account42 a year ago

      Only if you believe that software patents are in the public's best interest.

    • ksec a year ago

      On HN, it is Rust, RISC-V and then AOM that can do no wrong.

      And then it is Facebook and Oracle that can do no right.

  • rodgerd a year ago

    > Can anyone understand how it's in anyone's best interest to investigate / potentially stop an open source standard / royalty-free format that has buy-in from tons of big orgs?

    Because "using my monopolistic profits in one area to destroy your business in another area" is textbook anticompetitive behaviour.

    • mikeyouse a year ago

      I guess I get it but is anyone aside from the rent-seeking royalty holders worse off if open standards win out?

rektide a year ago

It's super unclear to me why Dolby keeps being the one to do these basic things. What is underneath the marketing gloss? It feels like we are all paying a lot for high bit depth and a lot for multi-channel audio.

I have never understood how or why expensive proprietary codecs keep taking over. Maybe there is more value added somewhere, but it's very unclear, especially under the gloss of (usually deeply non-technical) marketing fluff.

  • anigbrowl a year ago

    Partly marketing and licensing deals with studios - they do market very heavily. But they did pretty much invent the whole surround sound thing as we know it today, as well as a host of other realtime audio processing technologies. They're kinda the luxury mattress company of theatrical audio - it's expensive, but once you get it installed it's really nice to have.

    When I did audio production in the film industry, Dolby stuff was a post production expense but not a very big one. Their license fees aren't staggeringly expensive, and the quality and reliability of the playback system was its own argument - if the Dolby 5.1 sounds right in one theater it's going to sound right in another, and that's a big deal because bad sound can really kill a movie, even if the audience can't articulate why (most people don't think too much about sound).

    Digidesign (the manufacturers of Pro Tools, later acquired by Avid) are a far more aggressive company that has maintained a virtual lock on its market with a combination of very expensive hardware and moat-building strategies.

  • izacus a year ago

    Don't underestimate existing business relations and contracts. The format you're using is the format your TV/soundbar/tablet supports. There are only a few manufacturers of those, and they have decades of business relationships with Dolby.

    Breaking those proprietary relationships with open source has always been a losing battle - look at the HEVC vs. VP9 vs. AV1 battles, or aptX vs. AAC vs. Opus.

    The media industry is a surprisingly tight-knit and very conservative club that doesn't accept outsiders easily.

  • 52-6F-62 a year ago

    They invented a lot of this stuff first, some of the preliminary work going back decades. It's not a secret. It predates digital audio.

    There's nothing stopping open source digital codecs from ruling, but they need people working for them.

    Personally, I'd rather pay dollars than data.

  • UltraViolence a year ago

    Like DTS HD Master Audio (which is basically HiRes PCM) and Dolby TrueHD (which too is basically HiRes PCM)? We're basically paying for an alternative WAV or FLAC format.

bredren a year ago

Worth noting that Apple's vice president in charge of the AR/VR team, Mike Rockwell, is a former senior executive at Dolby Laboratories.

cutler a year ago

For all the evil Google may be guilty of they have done a lot of good work supporting open standards and releasing proprietary formats as open source. Let's not forget how Chrome liberated the web from Microsoft's won't-fix attitude whilst IE remained the dominant browser.

  • account42 a year ago

    > For all the evil Google may be guilty of they have done a lot of good work supporting open standards and releasing proprietary formats as open source.

    Sure, there are definitely things Google has done that have benefited the public.

    > Let's not forget how Chrome liberated the web from Microsoft's won't-fix attitude whilst IE remained the dominant browser.

    But I am unconvinced that Chrome is one of them. Firefox was doing fine displacing IE on its own, and the main reason Chrome managed to pull ahead so fast was the huge marketing drive, including ads on the home page of the world's #1 website. Meanwhile, Chrome-driven extensions have made the web so complicated that new browsers are almost impossible, while Chrome's dominance holds back any real effort to put users in control of their web browsing experience, since few websites care to support anything other than Chrome.

hot_gril a year ago

I look at the current reality of VP8/9 with dissatisfaction. Google went all-in with it and made Meet/Hangouts use it. But encoding, decoding, or both end up being done on the CPU usually, since hardware support is way behind. Zoom and FaceTime just used H.264 (and 5?), and it's way more efficient as a result. I don't normally care a ton about efficiency, but it actually matters when your laptop's fans are overpowering the audio in a meeting and draining your battery to 0 within a short time span.

Also, ironically, even Google Chat didn't seem to support WebP images until recently. I appreciate the idea of open standards, but compatibility matters way more to the end user.

  • acdha a year ago

    The other thing I noticed was that Google had some very … optimistic … claims about quality which were hard to reproduce. Every time I tested VP8/9, it took more tweaking to get either quality or size competitive with H.264/5 (VP9 was often playing catch up with the previous generation) and paying twice as much for storage just didn’t make any sense when everything supported the MPEG standards and you couldn’t version-detect which things would technically run but drain your users’ batteries.

    It felt very Google-y: lots of attention to the cool CS problems, less so the boring ones like tool support and sensible defaults for non-experts.

sportstuff a year ago

There is a difference between hearing vs listening vs feeling. I hope more creative stuff comes out of this. My first experience with 5.1 was Top Gun. The next one to top that was in Audium, with sound and vibrations from everywhere. Nothing tops the sound of silence.

cosmic7dice a year ago

We already have AV1 and Opus. Google backed AV1, and they already use both on YouTube.

I mean, they could make a better open video codec, give me AV2 any day. But why not push the pre-existing standards as "premium offerings"?

And btw [Opinion Incoming!] I believe Opus is as good as lossy audio formats will ever get. I'd love to be proven wrong...

justinclift a year ago

Wonder how well this will compare to Ambisonics?

That's supposed to be a "full sphere" surround sound format (developed ~50 years ago), but it hasn't been picked up widely.

  • TD-Linux a year ago

    It is, in fact, Ambisonics (among other features).

    Though you don't actually need any of the fancy new stuff being worked on to use Ambisonics - you can already use Opus with Ambisonics today in MP4.
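    A minimal sketch of that workflow with ffmpeg, assuming a build linked against libopus 1.3+; the `-mapping_family 2` ambisonics option, the 4-channel first-order input, and the filenames are assumptions to verify against your own build:

```shell
# Sketch: pack a 4-channel first-order ambisonics track (ACN/SN3D order)
# into an Opus stream inside MP4. Older ffmpeg builds need "-strict -2"
# to permit Opus in MP4; mapping family 2 marks the stream as ambisonics.
ffmpeg -y -i foa_input.wav \
    -c:a libopus -b:a 256k \
    -mapping_family 2 \
    -strict -2 \
    ambisonic_out.mp4
```

    Whether this plays back correctly then depends on the client knowing how to render the ambisonic stream to your speaker layout.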

    • justinclift a year ago

      Interesting. Any idea if there are existing file converters for Atmos (etc) to Ambisonics?

      For example, let's say I have some-atmos-movie.mkv on my local computer that I'd like to play back through my 7.1 speakers (attached to the computer).

      In my head, I'm thinking:

      1. Needs to be converted to Ambisonics format

      2. Player software (eg vlc) needs to understand the resulting format, and send an appropriate bitstream to each output device

      Guessing it's not that simple?

      • TD-Linux a year ago

        In general you can convert any speaker layout etc to Ambisonics. For Atmos specifically, no, because there is no software decoder available for it (you have to bitstream it to a receiver).

shmerl a year ago

Good idea. Perpetual patents on audio and video are ridiculous. Shouldn't they all expire?

phyzix5761 a year ago

Given Google's history of killing projects I'm not jumping on this ship just yet.

robertheadley a year ago

Subtext: Google doesn't want to pay licensing fees.

  • limeblack a year ago

    Or wait for the patents to expire. H.264 won't take too much longer to expire (around 12 more years, I believe), and yet Google has already released VP8.

mirkodrummer a year ago

Any suggested readings about codecs/encodings/formats/algorithms used or whatnot? I’m afraid it’s the thing I lack most as a dev

daoist_shaman a year ago

I’d like to think that “don’t be evil” applies here, but that might be wishful thinking.

What are some of the risks of media formats being centralized by a mega corp like Google who works with nation states?

Can we truly expect something free… or can we expect all of the content we create to be steganographically watermarked in surveillance states that appear to be fully cracking down on encryption?

agilob a year ago

Would be nice if they started by open-sourcing Chromecast

_HMCB_ a year ago

Through Google Fonts, they can track page views under the guise of beautiful fonts for your site. I wonder if they could do the same with these media formats.

  • account42 a year ago

    Which is why you should self-host all the fonts you use, which is incidentally also what you need to do to be GDPR-compliant. It's not the fonts that are the problem here, it's the font CDN.

foxbee a year ago

'Google wants to take on...'

My immediate reaction to reading these few words is - "another tool for the Google graveyard"

2OEH8eoCRo0 a year ago

What's in it for Google? Less friction allows users to consume more which gives Google more data?

  • izacus a year ago

    Dolby is earning money by jumping into bed with streaming services and then making every purchased device pay them for the privilege of decoding it.

    Making hardware devices like Android TV cheaper helps adoption of Google's platforms and services.

    • olyjohn a year ago

      It's not about making their stuff cheaper. They just want to take the royalty money they pay and put it in their pockets. They're not going to pass the savings on to the customers.

      Nobody ever passes the savings on to customers.

      The royalty is probably a couple of cents at the most per device. It makes more sense for them to just pocket the millions in savings and show extra profit to their shareholders. It would be dumb to give up millions to drop the price a few cents, which won't even be noticed by consumers anyways.

      • izacus a year ago

        The Google formats are explicitly royalty free, so I'm not sure what you're talking about here.

        Is Google evil because it wants the royalty money, or is the royalty so low that it doesn't matter? You can't have it both ways. :P

  • sudosysgen a year ago

    Integration on YouTube and Play Video for free.

  • digdugdirk a year ago

    Perhaps just a computationally inefficient process they see some potential in throwing AI at?

    Maybe it would be easier for them to tie in to their speech recognition/translation if they controlled the whole stack?

    Does anyone have any experience with Dolby and could shed some light?

  • TuringNYC a year ago

    Being able to sell TPU compute instances via algos fine-tuned to run cost-effectively on TPU instances?

RubberShoes a year ago

(Someone who works in streaming)

While I see both sides, I don't agree with this strategy. In fact, I think the public should be more aware of just how damaging Google/YouTube is to the streaming ecosystem and if you really stretch this argument, the planet.

It is true - HEVC's original licensing structure was a nightmare, but it seems to have been resolved and we now have hardware decoders in nearly all modern consumer devices.

This is also becoming true of Dolby's formats. Maybe I am biased or not as informed as I could be, but they did the R&D, worked with some of the brightest (pun intended) in the industry, and created a production-to-distribution pipeline. Of course there are fees, but vendors are on board and content creators know how to work with these standards.

Now here comes one of the largest companies in the world. HEVC? Nope - they don't want to pay anyone any fees, so instead they develop the VP9 codec. Should they use HLS or DASH? Nope, they spin DASH off into their own proprietary HTTP deliverable and only deliver AVC HLS for compatibility reasons. Apple customers complain, and after years Apple caves and supports VP9 as a software decoder starting with iOS 14. This means millions of users eat significant battery cycles just to watch anything, including HDR video.

Then we get to Chrome. HEVC? Nope. Dolby? Nope. HLS? Nope. The most popular browser in the world doesn't support any of the broadcast standards. It's their way or fallback to SDR and the less efficient AVC codec.

So now anyone else in the streaming industry trying to deliver the best streaming experience has to encode/transcode everything three times. AVC for compatibility (and spec) reasons, HEVC for set-top boxes and iOS, and VP9 for Google's ecosystem. If it wasn't for CMAF the world would also have to store all of this twice.
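The triple-encode tax described above can be sketched with ffmpeg in one pass; the encoder names are real, but the bitrates and filenames are illustrative, and real ABR ladders involve many more renditions per codec:

```shell
# Sketch: one mezzanine file in, three codec renditions out.
# libx264 -> AVC for broad compatibility, libx265 -> HEVC for
# set-top boxes/iOS, libvpx-vp9 -> VP9 for Chrome/YouTube-style clients.
ffmpeg -y -i mezzanine.mov \
    -map 0:v -c:v libx264    -b:v 5M avc_compat.mp4 \
    -map 0:v -c:v libx265    -b:v 3M hevc_stb.mp4 \
    -map 0:v -c:v libvpx-vp9 -b:v 3M vp9_web.webm
```

Every rendition here is a full encode of the same source, which is exactly where the 2-3x compute multiplier comes from.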

In the end, to save YouTube licensing and bandwidth costs, the rest of the industry has to consume 2-3x more compute to generate video and hundreds of millions of devices now consume an order of magnitude more power to software decode VP9.

If and when Project Caviar becomes reality, it'll be another fragmented HDR deliverable. Dolby isn't going away and Chrome won't support it, so the rest of the industry will have to add even more compute and storage to accommodate. In the name of 'open' and saving manufacturers a couple dollars, the rest of the industry is now fragmented and consumers are hurt the most.

YouTube has weirdly admitted this fragmentation is becoming a problem. They can't keep up with compute and had to create custom hardware to solve it. Of course, these chips are not available to anyone else, which gives them a competitive edge.

  • izacus a year ago

    As someone who worked in streaming, I hope new open-source formats burn down the incestuous cesspool of rent-seeking codecs and bury it under tons of concrete.

    You're literally commenting here on an article where Dolby's CEO gleefully explains how he made a profit by using streaming services to make users pay for their own patents and royalties. And we didn't even get to the DRM which lies deeply integrated into every part of those formats. Or the insane complexity of the HEVC and Dolby Vision profiles, which somehow doesn't bother you at all.

    So, AVC, HEVC, Dolby anything, DTS anything, burn the rentseekers to the ground. I'm sorry if you need to transcode an additional video format for that.

    • ksec a year ago

      So ultimately, who is paying for the work on Standards in Video and Audio Codec?

      • account42 a year ago

        To a large degree, the public via universities.

        • ksec a year ago

          Most of the work on leading-edge, state-of-the-art video encoding is not done by universities or researchers at all. And even where universities do contribute, most of them are sponsored by certain business entities.

  • ksec a year ago

    Exactly. Bandwidth cost is still dropping with no end in sight, while reductions in computation and storage costs are flat. And Google is finally seeing problems with storage and encoding.

    Now that Google is in the phone business, they have to somehow support HEVC on their phones.

  • oittaa a year ago

    I can't decipher if this is a satire or not. It reads like it was written by a character from Silicon Valley series.

smm11 a year ago

Pono Player checking in.

GreenPlastic a year ago

The last thing I want to do is upgrade all my TVs and audio equipment for new standards

kache_ a year ago

kill em, google :)

ck2 a year ago
  • crazygringo a year ago

    The irony of GCemetary being dead is kind of amazing. Maybe keeping things going is a little harder than they thought ;)

    • ck2 a year ago

      When I see a website that stopped updating in 2020 I kinda get sad.

      We lost a lot of people.

debacle a year ago

I have an entire speaker setup that runs on the chromecast protocol(s?)

They've been repeatedly bricked (features rolled back, support changed, can't set up complete groups, etc) by Google in the last few years, to the point where I don't even think I have them connected right now.

I don't trust consumer products from Google at all.

fnordpiglet a year ago

AKA: we will crush your smaller company focused on high-quality standards with our half-assed support that lasts until you're dead, and maybe people will confuse this for "open"

dmitrygr a year ago

Month M + 0: Google to take on $INDUSTRY_STANDARD with $GOOGLE_THING standard

Month M + 4: Google shows off $GOOGLE_THING and announces $PARTNER devices

Month M + 9: $PARTNER releases first devices with $GOOGLE_THING support (also supports $INDUSTRY_STANDARD, of course)

Month M + 18: Google disappointed with lack of adoption of $GOOGLE_THING announces first-party products with $GOOGLE_THING support

Month M + 24: Google's internal team working on first-party $GOOGLE_THING products dissolved

Month M + 36: $PARTNER announces future products will no longer support $GOOGLE_THING due to lack of demand

Month M + 48: Google removes all mentions of $GOOGLE_THING from their websites, docs, etc.

  • mmastrac a year ago

    I think Google's commitment to core technologies bucks this trend. Go has been long-lived and received lots of love. AV1/VP8/etc have been evolved and they've continued putting money into them.

    • bzxcvbn a year ago

      AV1 wasn't created by Google. Neither was VP8, although they did release it to the public after acquiring the company that created it.

    • taylodl a year ago

      That's because Google uses Go internally. It's not consumer-facing technology.

ilamont a year ago

While I don't have sympathy for proprietary formats that come with an added use charge, alternative Google formats forced upon the world in the name of a "healthier, broader ecosystem" tend to create friction and unwanted overhead. Thinking of AMP and WebP in particular.

And uncertainty ... how long will such efforts last before Google loses interest or is forced to abandon them?

  • bmicraft a year ago

    WebP wasn't bad, and back then there really weren't any alternatives that performed significantly better than JPEG

  • izacus a year ago

    AMP is not a format.