• jaybone@lemmy.world · 2 days ago

    It’s exactly what I said they would do after the original news of the bans the other day. And I got downvoted for it. Lol

  • Em Adespoton@lemmy.ca · 2 days ago

    They haven’t been removed from the community though — just the maintainers list. Now they need someone else’s review to commit code to the kernel.

    Personally, I think even maintainers should be required to have that — you can be the committer for pre-reviewed code from others, but you shouldn’t be able to check in anything you want on your own, no matter your reputation (even if you’re Linus). That way a security breach is less likely to cause havoc.
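
    For a git-hosted project, that kind of gate could be enforced server-side. Here’s a minimal, purely illustrative sketch of a hypothetical pre-receive hook that rejects commits lacking a Reviewed-by: trailer (the kernel’s real workflow is email patches and pull requests, so the trailer-based policy and the hook itself are just assumptions):

    ```python
    #!/usr/bin/env python3
    # Hypothetical pre-receive hook: reject any pushed commit that lacks a
    # Reviewed-by: trailer. Illustrative only; not how kernel.org actually works.
    import subprocess
    import sys

    ZERO_SHA = "0" * 40  # git passes an all-zero SHA when a ref is created/deleted


    def commit_message(sha: str) -> str:
        return subprocess.run(
            ["git", "log", "-1", "--format=%B", sha],
            capture_output=True, text=True, check=True,
        ).stdout


    def pushed_commits(old: str, new: str) -> list[str]:
        # Simplified: a brand-new branch lists its whole history.
        rev_range = new if old == ZERO_SHA else f"{old}..{new}"
        out = subprocess.run(
            ["git", "rev-list", rev_range],
            capture_output=True, text=True, check=True,
        ).stdout
        return out.split()


    def main() -> int:
        # pre-receive reads "<old-sha> <new-sha> <refname>" lines on stdin
        for line in sys.stdin:
            old, new, ref = line.split()
            if new == ZERO_SHA:  # ref deletion, nothing to check
                continue
            for sha in pushed_commits(old, new):
                if "Reviewed-by:" not in commit_message(sha):
                    print(f"rejected: {sha[:12]} on {ref} has no Reviewed-by: trailer")
                    return 1
        return 0


    if __name__ == "__main__":
        sys.exit(main())
    ```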

    • Ephera@lemmy.ml · 2 days ago

      I find that difficult. Aside from code reviews, oftentimes your job as a maintainer involves:

      • getting a refactor or code cleanup in while everyone’s asleep
      • shuffling commits around between branches
      • fixing the CI toolchain
      • rolling back or repairing a broken change
      • unfucking the repo
      • fixing a security vulnerability

      A required review slows all of these tasks to a crawl (a rough sketch of the rollback case is below). I do agree that the kernel is important enough that it might be worth the trade-off.
      But at the same time, I do not feel like I could do my (non-kernel) maintainer job without direct commit access…
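
      To make that concrete, here is a hypothetical sketch of the “roll back a broken change while everyone is asleep” case (the helper and the branch name are made up; it just wraps plain git commands). A maintainer with direct access can do this in minutes; with mandatory review, the final push would be rejected and the fix would wait.

      ```python
      #!/usr/bin/env python3
      # Hypothetical emergency-rollback helper: revert a broken commit on the main
      # branch and push it straight upstream. Only possible with direct commit
      # access; a mandatory-review gate would stop at the final push.
      import subprocess
      import sys


      def run(*args: str) -> None:
          print("+", " ".join(args))
          subprocess.run(args, check=True)


      def emergency_revert(bad_sha: str, branch: str = "master") -> None:
          run("git", "switch", branch)
          run("git", "pull", "--ff-only", "origin", branch)
          run("git", "revert", "--no-edit", bad_sha)  # records the rollback commit
          run("git", "push", "origin", branch)        # rejected if reviews are mandatory


      if __name__ == "__main__":
          emergency_revert(sys.argv[1])
      ```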

      • Em Adespoton@lemmy.ca · 1 day ago

        I feel your pain. I have maintainer roles for a few projects where things could be slowed down by a week or more if I didn’t have direct commit access. And I do use that access to make things run faster and smoother, and to step in and get something fixed up and committed while everyone else is asleep. But. For security-critical code paths, I’ve come to realize that, much like Debian, sometimes slow and secure IS better, even if it doesn’t feel like it in the moment (like when you’re trying to commit and deploy a critical security patch that’s already being exploited in the wild and NOBODY is around to do the review, or something upstream needs to be fixed before your job can go out).

    • OwlPaste@lemmy.world · 2 days ago

      Will we finally get the “Putinix” distribution that mines cryptocurrency for the regime by default? It will have to be a new coin called “RuOil”

    • ReakDuck@lemmy.ml · 2 days ago

      Especially because they can choose existing names, as there is no copyright in Russia (afaik; probably just a myth, but idk)

      • OwlPaste@lemmy.world · 2 days ago

        No, there was copyright; it was just only somewhat enforced between roughly 2000 and 2015, and then probably only in tourist-heavy areas. In the olden days you could find any software on “black markets” in open stalls.

    • Dessalines@lemmy.ml · 9 hours ago

      For sure. Stuxnet is just the beginning; who knows what the US will subject the world to next.

    • tetris11@lemmy.ml · 2 days ago

      At first I thought you meant it’d be a bad fork, but then I realised you meant it’d be a BAD fork.

      As long as it’s open source and vetted by the public, I don’t see how it could go bad tbh

        • tetris11@lemmy.ml · 11 hours ago

          Then it won’t be Linux, just a shittily maintained private copy that will quickly fall into disuse, unless they merge all upstream changes without much oversight to keep feature parity (in which case, why bother?)