jzebedee 15 hours ago

Project description:

  dav2d is the fastest AV2 decoder on all platforms :)
  Targeted to be small, portable and very fast.

If you're out of the loop like me:

  AV2 is the next-generation video coding specification from the Alliance for Open Media (AOMedia). Building on the foundation of AV1, AV2 is engineered to provide superior compression efficiency, enabling high-quality video delivery at significantly lower bitrates. It is optimized for the evolving demands of streaming, broadcasting, and real-time video conferencing. 

- from https://av2.aomedia.org/

  • delfinom 14 hours ago
    • walrus01 14 hours ago

      Sisvel is a patent troll. Take a look at the combined list of all the companies that are in the AOM and tell me with a straight face that all of their corporate in-house counsel specializing in intellectual property law are wrong.

      • asveikau 13 hours ago

        I don't know this stuff super well but I imagine it's not necessarily about the lawyers being right or wrong so much as what they can convince people of. The ideal scenario for the patent troll is they can intimidate you into licensing with them. Another good outcome for them (though more costly) is they can convince some non-expert in court. In either case the big players behind the codec can defend themselves but a small one just picking it up downstream as OSS can't.

        • walrus01 13 hours ago

          I don't doubt for a minute that they are going to attempt to intimidate companies using av1 which are much smaller than the AOM founders.

    • Telaneo 14 hours ago

      They've done the same thing with AV1, and I can't see that having prevented adoption, nor can I imagine Sisvel wanting to poke the bear that is AOMedia unless they're certain their case is absolutely watertight.

      • walrus01 14 hours ago

        I see zero public evidence that they've filed any lawsuits against the members of AOM in any jurisdiction. I'm sure there's been a lot of threatening letters sent...

        • Telaneo 14 hours ago

          Same, which is what makes it seem to me that the case is absolutely not watertight. Those patents are probably all about esoteric minutiae (to be fair, that's what it takes to make a better video codec these days) and anything and everything that can seemingly be connected to AV2 (or AV1, for that matter), many of which only got granted because the examiner barely understood what they were saying.

        • jorvi 13 hours ago

          Yup. The Dolby/Disney vs Snapchat lawsuit is going to be the first one. So far it's only been filed.

          The big question is if AOMedia is going to make good on their Mutually Assured Destruction promise of using their patent and financial war chest to countersue into oblivion anyone trying to go after AV1 adopters.

        • snvzz 7 minutes ago

          The illusion breaks once tested in court.

          Which is why they'd never sue, only threaten and try to settle.

    • ronsor 13 hours ago

      This is a thinly veiled extortion racket and any competent system would fine them into bankruptcy.

      • mort96 12 hours ago

        We need a more efficient way to eliminate bullshit patents or bullshit patent infringement claims than "violate them then spend millions on lawyers to fight them in court".

        • brookst 12 hours ago

          Sure, and at the same time we need a more efficient way to ensure big companies can’t just take what they want and bury anyone who complains.

          It’s not an easy problem.

          • voakbasda 4 hours ago

            Stop big companies from ever forming. They are not a natural force that cannot be reckoned with. We allow them to exist. Revoke the charters of any business over 500 employees.

            • pmontra 2 hours ago

              I can see a number of ways to work around that limitation, without even lobbying and bribing. And I'm not even a lawyer or an accountant.

              Eventually all the money and power will converge in a few sub 500, or sub 50, companies and nothing will change.

    • BLKNSLVR 11 hours ago

      You can tell Sisvel are a bunch of grifters by the fact they use slight grey text on a slightly less grey background.

      Aesthetics over function; style over substance. If that's their web design policy it's likely their policy in all other aspects.

      I'm also not sure that they're aware that intellectual property rights no longer exist in the US. If AV2 was vibe coded, there would be no case.

      • astrange 11 hours ago

        > If AV2 was vibe coded, there would be no case.

        …for copyright. Not for anything else. Patents would still apply.

    • shmerl 7 hours ago

      Trolls will always be trolls. The need to fight them just shows the need to reform the garbage patent system to make sure no one can ever patent software.

  • hulitu 54 minutes ago

    > AV2 is the next-generation video coding specification from the Alliance for Open Media

    Oh no. Not another one. I presume this one makes lossy better, or faster or both.

tensor 15 hours ago

Not on topic, but wow the internet has very quickly devolved into: click -> "making sure you're not a bot", click -> "making sure you're a human", click -> "COOKIES COOKIES COOKIES", click -> "cloudflare something something"

  • port11 15 hours ago

    The internet is such a Tragedy of the Commons… its citizens that act selfishly and in bad faith will slowly make it unusable.

    • honktime 15 hours ago

      It's pretty explicitly not a tragedy of the commons. It's a tragedy of the ruling class abusing the resources of the 'commons' to extract value. There is nothing 'commons' about trillion-dollar companies extracting all available value from the labor of the working class. That's just the tragedy that'll bring around the death of society, the same tragedy that brings all other tragedies.

      • throw-the-towel 14 hours ago

        The commons in question is the internet itself.

      • dyauspitr 14 hours ago

        There’s definitely lots of problems with the ruling class and wealth disparity. Perhaps the defining problems of our current age.

        That being said, so many of the plebs suck. Like 2% will ruin everything for everyone.

        • throw-the-towel 14 hours ago

          While a lot of the plebs do suck, a pleb who sucks causes way fewer problems than a big corp that sucks, simply by virtue of not having as many resources.

          • dyauspitr 12 hours ago

            I agree.

            But whether you agree with me or not, most paradigm-shifting changes come from billionaires/corps because they are the only ones with the money to pull off massive shifts. Most innovation is not grassroots and is heavily funded by the "elites". This is how most successful countries have been for at least the last 100 years. So billionaires add a lot of value even as they cause a lot of pain.

            The solution in my mind is we absolutely need uncapped billionaires but they need to be effectively taxed (not like 90% but closer to 50%) and they have to have absolutely no influence on the government.

    • codedokode 14 hours ago

      No, it is because citizens allow themselves to be treated like this.

    • esseph 14 hours ago

      > its citizens that act selfishly and in bad faith will slowly make it unusable

      It's rarely been the citizens that have been the problem, but the governments and companies that seek to use the network connection for their overwhelming benefit.

      • fastball 13 hours ago

        wat. The protections in place that the OP is talking about are almost entirely due to (not government and company) bad actors.

  • tosti 15 hours ago

    I get exactly none of that. Is your adblocker still working?

  • thresh 14 hours ago

    We had to set it up on the parts of VideoLAN infra so the service would remain usable.

    Otherwise it was under a constant DDoS by the AI bots.

    • nerdralph 13 hours ago

      I highly doubt there is no other technically feasible option to block the AI bots. You end up blocking not just bots, but many humans too. When I clicked on the link and the bot block came up, I just clicked back. I think HN posts should have warnings when the site blocks you from seeing it until you somehow, maybe, prove you are human.

      • thresh 13 hours ago

        I'm all ears on how we can fix it otherwise.

        Keep in mind that those kinds of services:

        - should not be MITMed by CDNs
        - are generally run by volunteers with zero budget, money- and time-wise

        • nerdralph 11 hours ago

          First off, don't block the first connection of the day from a given IP. Rate limit/block from there, for example how sshguard does it.

          I've seen several posts on HN and elsewhere showing many bots can be fingerprinted and blocked based on HTTP headers and TLS.

          For the bots that perfectly match the fingerprint of an interactive browser and don't trigger rate limits, use hidden links to tarpits and zip bombs. Many of these have been discussed on HN. Here's the first one that came to memory: https://news.ycombinator.com/item?id=42725147
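          The sshguard-style approach described above (serve the first request, then rate-limit per IP) can be sketched as a token bucket. This is a minimal illustration, not code from any project mentioned in the thread; the class and parameter names are mine, and a real deployment would also need eviction of stale entries and escalating block durations.

```python
import time

class IPRateLimiter:
    """Per-IP token bucket: the first request from an address is always
    served, and blocking only kicks in once a burst exceeds the allowed
    rate. Names and defaults are illustrative."""

    def __init__(self, rate=1.0, burst=10):
        self.rate = rate      # tokens refilled per second
        self.burst = burst    # bucket capacity (max burst size)
        self.buckets = {}     # ip -> (tokens, last_seen_timestamp)

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        # Unknown IPs start with a full bucket, so their first hit passes.
        tokens, last = self.buckets.get(ip, (self.burst, now))
        # Refill proportionally to elapsed time, capped at burst size.
        tokens = min(self.burst, tokens + (now - last) * self.rate)
        if tokens >= 1.0:
            self.buckets[ip] = (tokens - 1.0, now)
            return True
        self.buckets[ip] = (tokens, now)
        return False
```

          A crawler hammering one address drains its bucket and gets refused, while a human clicking one link a few seconds apart never notices the limiter.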

      • goobatrooba 13 hours ago

        I'm sure there are many solutions for many problems, but expecting a small FOSS development team to know or implement them all is rather unreasonable.

        I think the world gains more if the VideoLAN team focuses on their amazing, free contribution to the world than if they spend the same time trying to figure out how to save you two clicks.

        We all hate that this is happening, but you don't need to attack everyone that is unfortunately caught up in it.

      • overfeed 13 hours ago

        > I highly doubt there is no other technically feasible option to block the AI bots.

        If you have discovered such an option, you could get very wealthy: minimizing friction for humans in e-commerce is valuable. If you're a drive-by critic not vested in the project, then yours is an instance of talk being cheap.

    • hectormalot 13 hours ago

      Maybe I'm naive about this, but I didn't expect AI scrapers to be that big of a load? I mean, it's not as if they need to scrape the same site at 1000+ QPS, and even then I wouldn't expect them to download all the media and images either?

      What am I missing that explains the gap between this and “constant DDoS” of the site?

      • Y-bar 13 hours ago

        They are a scourge: they never rate-limit themselves, there are a hundred of them, and a significant number don't respect robots.txt. Many of them also end up on our meta:no-index,no-follow search pages, leading to cost overruns on our Algolia usage. We spend way more time adjusting WAF and other bot controls than we should.

      • nijave 13 hours ago

        I think there are a few things at play here:

        - AI scrapers will pull a bunch of docs from many sites in parallel (so instead of a human request where someone picks a single Google result, it hits a bunch of sites)

        - AI will crawl the site looking for the correct answer which may hit a handful of pages

        - AI sends requests in quick succession (big bursts instead of small trickle over longer time)

        - Personal assistants may crawl the site repeatedly scraping everything (we saw a fair bit of this at work, they announced themselves with user agents)

        - At work (b2b SaaS webapp) we also found that the personal assistant variety tended to hammer really computationally expensive data export and reporting endpoints generally without filters. While our app technically supported it, it was very inorganic traffic

        That said, I don't think the solution is blanket blocks. Really it's exposing that sites are poorly optimized for emerging technology.

        • Sesse__ 2 hours ago

          Also, relevant for forges: AI doesn't understand what it's clicking on. Git forges tend to e.g. have a lot of links like “download a tarball at this revision” which are super-expensive as far as resources go, and AI crawlers will click on those because they click on every link that looks shiny. (And there are a lot of revisions in a project like VLC!) Much, much more often than humans do.

      • thresh 12 hours ago

        You can't really cache the dynamic content produced by forges like GitLab and, say, web forums like phpBB. So every request goes through the slow path. Media/JS is of course cached on the edge, so that's not an issue.

        Even when the amount of AI requests isn't that high - generally it's hundreds per second tops for our services combined - that's still a load that causes issues for legitimate users/developers. We've seen it grow from somewhat reasonable to pretty much 99% of the responses we serve.

        Can it be solved by throwing more hardware at the problem? Sure. But it's not sustainable, and the reasonable approach in our case is to filter off the parasitic traffic.

        • fragmede 11 hours ago

          You kind of can though. You serve cached assets and then use JavaScript to modify it for the individual user. The specific user actions can't be cached, but the rest of it can.
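            The split described here (a cacheable shared page shell, plus a small uncached per-user endpoint that client-side JavaScript fetches and splices in) can be modeled with a toy edge cache. Everything below is hypothetical: the route names, responses, and cache behavior are illustrative, not anyone's actual infrastructure.

```python
class EdgeCache:
    """Toy CDN edge: caches responses marked public, passes everything
    else through to the origin on every request."""

    def __init__(self, origin):
        self.origin = origin      # function: path -> (headers, body)
        self.store = {}
        self.origin_hits = 0      # how often the slow backend was hit

    def get(self, path):
        if path in self.store:
            return self.store[path]        # served from edge, no origin load
        self.origin_hits += 1
        headers, body = self.origin(path)
        if "public" in headers.get("Cache-Control", ""):
            self.store[path] = (headers, body)
        return headers, body

def origin(path):
    """Hypothetical backend: one shared shell, one per-user JSON route."""
    if path == "/api/me":
        return {"Cache-Control": "private, no-store"}, '{"user": "alice"}'
    return {"Cache-Control": "public, max-age=300"}, "<html>shared shell</html>"
```

            Repeated hits on the shell cost the origin nothing; only the small per-user JSON call reaches the backend, which is the whole point of the pattern.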

          • davidron 8 hours ago

            Totally. Remember, Slashdot in the 1990s served a dynamic page, with a user base capable of bringing major properties down, from a handful of servers whose horsepower is dwarfed by a Nintendo Switch.

          • Avamander 8 hours ago

            The "can't" comes from the fact that VLC is not going to rewrite their forum software or software forge.

            Software written in PHP is in most cases frankly still abysmally slow and inefficient. WordPress runs something like 70% of the web and you can really feel it from the 1500ms+ TTFB most sites have. phpBB is not much better. Pathetic throughput at best, and it has not gotten better in decades.

            I don't know how GitLab became so disgustingly slow. But yeah, I'm not surprised bots can easily bring it to its knees.

        • hectormalot 8 hours ago

          Thanks, appreciate the details. 99% is far above the amount I expected, and if it specifically hits hard to cache data then I can see how that brings a system to its knees.

    • nijave 13 hours ago

      While I do sympathize with the AI DDoS situation, it'd be nice if there were a solution that allows the bots to work so they can pull official docs.

      For instance: MCP, static sites that are easy to scale, or a cache in front of a dynamic site engine.

      • thresh 12 hours ago

        Of course, static websites are the best solution to that problem.

        Our documentation and main website are not fronted by this protection, so they're still accessible to the scrapers.

  • oybng 14 hours ago

    renders your gigabit connection pointless

  • rayiner 14 hours ago

    Wow I’m glad it’s not just me. I thought my IP block had gotten caught up in some known spamming or something.

  • notenlish 13 hours ago

    Nearly every single website I'm not logged into these days wants me to "confirm I'm not a bot".

    It is incredibly annoying, but what can you do? AI scrapers ruined the web.

  • pixelpoet 13 hours ago

    No one's even clicking anymore, everything implores me to tap or swipe these days, and everything is optimised for humans with one eye above the other.

    Then I press the X to close the all-caps banner commanding me to install the app, upon which I get sent to the app store. Users of the website refer to it as an app.

  • tomwheeler 13 hours ago

    At least this one was significantly faster than Cloudflare and required no action on my part.

  • croes 1 hour ago

    AI is a gift that keeps on giving.

    High hardware prices, locked information sources, plenty of AI slop etc.

snvzz 8 minutes ago

With the first RVA23 boards shipping this month, I find it a mistake to still focus on legacy ISAs like x86 or ARM rather than what will be dominant by the time AV2 is deployed.

Telaneo 14 hours ago

Glorious. Really looking forward to seeing how much better than AV1 it actually turns out to be. It's a shame it'll take a while before we'll have a decent encoder (it took an annoyingly long time until SVT-AV1 was usable).

amitbidlan 6 hours ago

Mostly ASM for performance-critical paths is a pattern that never gets old. The VideoLAN team did the same with dav1d and it paid off. Curious how much of dav2d ends up staying C as AV2 matures.

risho 12 hours ago

is there any understanding of how big of an improvement av2 will be over av1?

  • ChadNauseam 7 hours ago

    About 30% better compression than AV1 at equivalent quality. But it'll be a while before it's a good idea to use AV2 in your home media server. (AV1 is still not that broadly supported)

pkos98 13 hours ago

off topic, but related to the recent github alternative discussion:

Wow, this gitlab instance looked so much cleaner/simpler and less clunky than my past experiences! Also loaded really fast on first page load as well as subsequent actions

shmerl 7 hours ago

Nice.

What's the current state of Dolby trying to attack the AV1 ecosystem (Snapchat more specifically)? I hope there is an organized fightback by AOM against these trolls.

sylware 14 hours ago

I would even remove the C code and lower the usage of the assembler pre-processor to a basic C pre-processor.

Happy, AV2 decoding already here.

:)

arkensaw 10 hours ago

maybe not great naming. Sounds very similar to the rapper D4vd, who was just arrested for murdering a 14-year-old girl

  • jofzar 10 hours ago

    Actually it's closer to https://youtube.com/@dave2d who is a popular tech YouTuber

    • bl4ckneon 8 hours ago

      That is what I thought of too. Almost like David is a popular name or something... /s

kylec 14 hours ago

I wonder if the author is a Dave2D fan?

https://www.youtube.com/@Dave2D

dcsommer 14 hours ago

We must not continue to develop media codecs in memory unsafe languages. Small, auditable sections can opt-out perhaps, but choosing default-unsafe for this type of software is close to professional negligence.

  • fguerraz 14 hours ago

    Cryptography and video codecs are notable exceptions; they put a lot of effort into making the code provably memory safe: no recursion, limited use of stack variables, no dynamic allocations, etc. As a result, memory-safe languages bring nothing but trouble by making the code non-deterministic; that's especially true for crypto, where compiler "optimisations" guarantee you side-channel attacks.

    • astrange 11 hours ago

      Video codecs just don't need to do dynamic allocations because it's not relevant to the problem. There's still certainly plenty of opportunities for memory bugs because there's a lot of pointer math.

    • simonask 10 hours ago

      What in the world do you mean by “non-deterministic”?

      C compilers, Rust compilers, and assemblers are all deterministic.

      • adgjlsfhk1 6 hours ago

        > C compilers, Rust compilers, and assemblers are all deterministic.

        Within a version, yes, but not across versions. Different versions of GCC/Clang etc. can give you completely different code.

      • fguerraz 3 hours ago

        In cryptography, you want operations to run in constant time, even if it’s wasteful, otherwise an attacker could guess information about the key or plaintext by measuring execution times.

        Modern compilers are extremely clever and will produce machine code that takes full advantage of modern CPU branch predictors, and reorder instructions to better take advantage of pipelining. This in itself will make the same code run at different speeds depending on the input data.

        Then there is the whole issue of compiler version roulette. As a developer you have no idea which compiler version your users and distros will use, and what new and wonderful optimisations it will bring.
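        The source-level idea behind constant-time code can be sketched with a string comparison. This only illustrates the intent: as the comment points out, a compiler or interpreter is free to undo it, which is exactly why real crypto libraries implement this in carefully reviewed native code (Python exposes such a primitive as hmac.compare_digest).

```python
import hmac  # stdlib home of compare_digest, the production-grade version

def naive_equal(a: bytes, b: bytes) -> bool:
    """Early-exit comparison: returns as soon as a byte differs, so the
    running time leaks how long the matching prefix is."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False
    return True

def constant_time_equal(a: bytes, b: bytes) -> bool:
    """Accumulate differences over every byte so the time taken does not
    depend on where the first mismatch occurs."""
    if len(a) != len(b):
        return False
    diff = 0
    for x, y in zip(a, b):
        diff |= x ^ y
    return diff == 0
```

        Against naive_equal, an attacker can guess a MAC byte by byte by timing how long the check takes; constant_time_equal removes that signal at the source level.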

  • fishgoesblub 14 hours ago

    Of the 3 software AV1 encoders, the only one that is fully dead is the Rust encoder (rav1e). If people truly wanted memory safe encoders/decoders, they would fund and develop them.

    • esseph 14 hours ago

      > If people truly wanted memory safe encoders/decoders

      Really? How many codecs have your neighbors contributed money for the development of, just curious.

      • Telaneo 14 hours ago

        Given Netflix's involvement with SVT-AV1 (not even that indirectly), at least 1.

      • computerbuster 13 hours ago

        I think these conversations are directed by the parties funding the efforts. Example: "we (large company) want a fast AV2 decoder" -> they pay a specialized team to do it -> this team works in C for the most part, so it is done in C. If there were financial incentives to do it in Rust, they'd pay more for a Rust decoder.

    • vlovich123 13 hours ago

      Fully dead in what sense? Seems like it still has active development to me.

      • fishgoesblub 13 hours ago

        It hasn't had any proper quality/speed improvements in years. The only things that have changed are updated deps and some bug fixes.

    • simonask 10 hours ago

      Encoding is a way, way less risky thing to be doing compared to decoding.

  • kllrnohj 13 hours ago

    For the codec itself, the majority of it is performance sensitive and often has a significant amount of assembly even, so a memory safe language doesn't change much.

    However for the container/extractor... those should absolutely be in a memory-safe language, and that is where a lot of the exploits/crashes are, too, as metadata is more fuzzy.

    As a practical example of this, see something like CrabbyAVIF: all the parser code is Rust, but it delegates to dav1d for the actual codec portion.
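    The safe-parser/native-codec split can be sketched as follows: all the fiddly, attacker-controlled container metadata is parsed with explicit bounds checks in safe code, and only validated payloads would ever be handed to the native decoder. The box layout below is a simplified ISO-BMFF-style stand-in ([4-byte big-endian size][4-byte type][payload]), not the actual AVIF or CrabbyAVIF format.

```python
def parse_boxes(data: bytes):
    """Walk a buffer of length-prefixed boxes, rejecting any malformed
    size field before the payload could reach a (hypothetical) native
    decoder. Layout: [4-byte big-endian size][4-byte type][payload],
    where size covers the 8-byte header too."""
    boxes, offset = [], 0
    while offset < len(data):
        if len(data) - offset < 8:
            raise ValueError("truncated box header")
        size = int.from_bytes(data[offset:offset + 4], "big")
        if size < 8 or offset + size > len(data):
            # The bounds bug is caught here, in safe code, not in the codec.
            raise ValueError("box size out of bounds")
        box_type = data[offset + 4:offset + 8].decode("ascii", "replace")
        boxes.append((box_type, data[offset + 8:offset + size]))
        offset += size
    return boxes
```

    The point is architectural: a size field lying about the buffer length becomes a clean ValueError in the parser instead of an out-of-bounds read in native code.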

  • maxloh 12 hours ago

    Decoders written in Rust will be a lot slower than the equivalents in assembly.