nequo a year ago

  One cannot demand a service-level agreement from a volunteer maintainer. Few maintainers, [Ted Ts'o] said, do that work as a full-time job. Good maintainers are a public good. Why doesn’t the Linux Foundation raise funding to hire maintainers as its employees?
  • PragmaticPulp a year ago

    The maintainers list is huge and many of them contribute sporadically. It’s not a good match for full-time employment at all.

    Hiring people as employees also completely changes the dynamic to something much more complex. Would this change require everyone who contributes to be an employee? Are contributions now limited by budget? Can someone still contribute without being an employee? Who decides who gets to be an employee and who has to remain an unpaid contributor? Who gets cut when budgets shrink in down years?

    Linux Foundation does fund a lot of projects, but the idea of “just” raising enough money to make everyone employees isn’t feasible or realistic.

    • nequo a year ago

      > the idea of “just” raising enough money to make everyone employees isn’t feasible or realistic

      I didn't mean that everyone should be an employee. But some subsystems don't have enough maintainer capacity, while there are firms that want to push patches into those parts of the code base. None of them individually find it worthwhile to pay for a full-time maintainer, but maybe they could share the cost and hire one through the Linux Foundation?

      • sitkack a year ago

        I would hope that at some point subsystem maintainers would be paid, and that they would manage the volunteers who send in the patches.

  • thx-2718 a year ago

    What about public grants and partnerships with universities as well?

    I really think that open source software deployed and actively used by a university can give students real experience in improving and fixing software.

    Surely if we can have tenured professors, we can have something similar to tenure for software maintainers.

  • walterbell a year ago

    130 projects funded by Linux Foundation: https://www.linuxfoundation.org/projects

    Still needed: a list of the Linux-specific projects under the Linux Foundation, which would include:

      Automotive Grade Linux
      LinuxBoot (Firmware)
      Linux Kernel
      Linux Vendor Firmware Service (LVFS)
      LF Edge (Virtualization)
      Long-Term Support Initiative
      OpenBMC
      Real-Time Linux (RTL)
      SONiC Foundation (Networking)
      Tizen (Mobile/TV)
      Yocto Linux (Embedded)
mschuster91 a year ago

Part of the problem, I think, is caused by a bit of ... let's call it "old" technology.

Developers who entered the fray in the last ten-ish years are used to projects having extensive test suites, CI pipelines that run tests (especially regression tests), and web-based workflows (e.g. GitLab, GitHub) with rich comment features, annotations, and the like.

In contrast, Linux kernel development has none of these. Some of the large companies working for/on Linux (Intel, AMD, the enterprise distributions) do have test automation and the resources for it in private [3], but Joe Random has none of that. Say I want to modify something in the kernel: the workload is on me (and the maintainers, and all too often the users of -next branches) to check whether stuff goes wrong and to bisect which commit broke it. This is an insanely high barrier to entry, and IMHO the outdated (or simply absent) tooling is the biggest reason the maintainers tend to be nit-pickers.

On top of that... I do get that Linux wants to keep building on older systems. But a six-year-old compiler [1]? A C standard over a decade old [2]? No C++ at all? Not many people are happy to work under such constraints. I mean, okay, some subsystems now have Rust support, which alleviates some of these problems, but it's still marked experimental.
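
For a concrete sense of what the standards constraint feels like: until v5.18 the kernel built as gnu89, which rejects even this everyday C99 idiom. A standalone sketch, not kernel code; all names are made up:

  /* sketch.c - illustrative only, not from the kernel tree */
  #include <stdio.h>

  static int sum(const int *vals, int n)
  {
          int total = 0;

          /* Declaring the loop counter inside the for statement is routine
           * C99/C11, but gnu89 (the kernel's dialect until v5.18) rejects
           * declarations in the for-init clause outright. */
          for (int i = 0; i < n; i++)
                  total += vals[i];

          return total;
  }

  int main(void)
  {
          int vals[] = { 1, 2, 3 };

          printf("%d\n", sum(vals, 3));
          return 0;
  }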

The biggest organizational issue is the Linux doctrine of never breaking userspace, at all costs and with very few exceptions. That model doesn't lend itself to iterative development, because every user-visible interface has to be right from the start, and the result is that projects like Asahi Linux have to maintain out-of-tree changes while they are still discovering the best way to implement entirely new paradigms. Bikeshedding and yak shaving galore.
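
To make the "right from the start" constraint concrete, here is a rough sketch of the defensive pattern new kernel UAPI tends to follow so an interface can still grow later without breaking existing userspace. The FOO_* names are hypothetical, not a real kernel interface:

  /* Illustrative only: FOO_* is a made-up interface, not real kernel UAPI. */
  #include <errno.h>
  #include <stdint.h>

  #define FOO_FLAG_A      (1u << 0)
  #define FOO_FLAG_B      (1u << 1)
  #define FOO_VALID_FLAGS (FOO_FLAG_A | FOO_FLAG_B)

  /* Reject flag bits that have no meaning today. If unknown bits were
   * silently ignored, userspace could start passing them, and a future
   * kernel could never give them a meaning without changing behaviour
   * for binaries that already exist. */
  static int foo_check_flags(uint32_t flags)
  {
          if (flags & ~FOO_VALID_FLAGS)
                  return -EINVAL;
          return 0;
  }

  int main(void)
  {
          /* A bit the interface doesn't define yet must be refused. */
          return foo_check_flags(FOO_FLAG_A | (1u << 7)) == -EINVAL ? 0 : 1;
  }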

[1] https://www.phoronix.com/news/Linux-5.15-Raising-GCC / https://docs.kernel.org/process/changes.html

[2] https://docs.kernel.org/process/programming-language.html

[3] https://opensource.com/article/19/6/continuous-kernel-integr...

  • syntheweave a year ago

    Systems code is extremely prone to slowing down into a "legacy" maintenance project because it's at the base of what software does - not the future itself, but a vehicle to arrive in the future with. To make a vehicle that can arrive, you also have to live with a past decision about how it's designed, or else you've already started over in trying to redesign it.

    I don't think Linux is unique in that, just uniquely visible: it has allowed the barriers to entry to steadily rise in favor of corporate contributors instead of making a faster break to a streamlined mechanism. I hate it, but I don't see it as a project-specific issue, because big open-source projects routinely develop an unwelcoming level of complexity; it's just a "pick your poison" kind of deal.

    Aiming for an intentionally small project is also hard, but for different reasons: you have to have a pretty good sense of your philosophy to not end up either being a useless toy that nobody uses as a serious vehicle or falling right back into mindless accumulation.

    • mschuster91 a year ago

      The thing is, a lot of the barriers would be pretty easy to remove - especially by moving from cgit and mailing lists to GitLab; hell, even a self-hosted Community Edition instance would be enough. Merge requests with detailed annotation threads are just a much better experience for everyone than patch series on mailing lists, but it seems like many of the core contributors have grown so set in their ways that they can no longer see the sunlight on the horizon.

      • aragilar a year ago

        But that wouldn't change the need to support an older gcc, nor the "no breaking userspace" rule. CI won't run on every platform (nor on all the different hardware that's out there). Random PRs will be just as ignored as random patches (probably more so, if the idea is to increase the volume of contributions). A "build it and they will come" attitude only works if you're ready for the volume; if not (and based on the original discussion, the problem is a lack of maintainer time), it just adds more work.

andsoitis a year ago

> developers and maintainers

What’s the difference?

  • detaro a year ago

    Maintainers make the decisions about what gets accepted (or at least strongly considered to be accepted) in "their" part of the kernel.

  • electroly a year ago

    Here it's talking specifically about the maintainers list in the Linux kernel's MAINTAINERS file[1]. Maintainer in the Linux kernel project is a role for a particular subsystem. They receive, review, and merge patches for their subsystem. Kernel developers, on the other hand, write the patches and submit them. I believe that kernel maintainers are often also kernel developers.

    [1] https://www.kernel.org/doc/linux/MAINTAINERS
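
    For a sense of what that file encodes, an entry looks roughly like this (the subsystem name, people, and paths below are placeholders, not a real entry):

      SOME SUBSYSTEM NAME
      M:  Jane Maintainer <jane@example.org>
      R:  Rex Reviewer <rex@example.org>
      L:  some-subsystem-list@vger.kernel.org
      S:  Maintained
      F:  drivers/some-subsystem/
      F:  include/linux/some-subsystem.h

    Roughly: M: is who reviews and merges patches for the subsystem, R: a designated reviewer, L: the mailing list patches should be sent to, S: the support status, and F: the file patterns the entry covers.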

  • aloisdg a year ago

    I would say:

    - Maintainers' job is triage and long-term vision.

    - Developers write code, tests, and reviews.

stcroixx a year ago

[flagged]

  • burnished a year ago

    Ah yes, ongoing moral degeneracy, an observation with the finest of vintages.

    • andsoitis a year ago

      what do you think is the root cause of the frustration?

      • krger a year ago

        Gatekeepers clinging jealously to their gates.