• onlinepersona@programming.dev · 9 hours ago

    Somewhere, somebody’s having a meltdown because Rust is spreading more and more in the kernel.

    Good to see that NVIDIA is writing open-source drivers (or starting to). I guess it’s too much to ask for them to support old graphics cards, with NVIDIA mostly caring about money and a Linux driver being an incentive for some to choose NVIDIA over AMD.

    Anti Commercial-AI license

    • Cosmic Cleric@lemmy.world · 8 hours ago

      > Somewhere, somebody’s having a meltdown because Rust is spreading more and more in the kernel.

      Probably more than just one somebody, based on the drama these last few weeks. 😜

      > Good to see that NVIDIA is writing open-source drivers (or starting to). I guess it’s too much to ask for them to support old graphics cards, with NVIDIA mostly caring about money and a Linux driver being an incentive for some to choose NVIDIA over AMD.

      It’s too bad that there’s still a proprietary binary layer that this driver will talk to. (I’m assuming, rightly or wrongly, that it’s not open source, since it’s a binary.)

      Best to support AMD if you game on Linux. Really wish Intel would step up their GPU game.

      This comment is licensed under CC BY-NC-SA 4.0

      • onlinepersona@programming.dev · 4 hours ago

        > It’s too bad that there’s still a proprietary binary layer that this driver will talk to. (I’m assuming, rightly or wrongly, that it’s not open source, since it’s a binary.)

        I must’ve missed that in the post. Do you have more information on it?

        Anti Commercial-AI license

        • Semperverus@lemmy.world · 57 minutes ago (edited)

          One of the (now ex-)maintainers, Christoph Hellwig, said that he didn’t want multiple languages in his area of the kernel because it becomes hard to maintain, and specifically called out that this wasn’t targeted at Rust - he would have rejected Assembly too. The Rust developer pushing the change, Hector (can’t remember his last name), took it as a personal attack, flipped his shit, and quit after trying to attack Christoph and get him removed for describing the introduction of another language as akin to a “cancer.”

          Then Linus came in, noticed that the change wasn’t actually pushing any non-C code into the kernel, and told the maintainer that it wasn’t his area to block in the first place and that he had no place telling others what to do outside of the kernel.

          So we lost a kernel maintainer and a Rust developer over one issue.

      • randomaside@lemmy.dbzer0.com · 6 hours ago

        I’ve bought two AMD GPUs in the last two years, but I still have three Nvidia GPUs that I use. The cost of moving everything over to AMD is high, so even in the best case it just takes time to get rid of the old hardware.

        • Cosmic Cleric@lemmy.world · 6 hours ago (edited)

          > The cost of moving everything over to AMD is high, so even in the best case it just takes time to get rid of the old hardware.

          Totally understand. I hang on to my current GPU for as long as I can before switching to a new one (five-ish years), especially these days.

          Having said that, if your goal is to move to Linux for gaming, best to go with a whole AMD setup if possible. Also a distro that updates often but is not bleeding edge. (For me, Fedora/KDE.)

          This comment is licensed under CC BY-NC-SA 4.0

    • MudMan@fedia.io · 9 hours ago

      Does it?

      I mean, the goal here should be transparent setup, full feature support across all applications, and very quick updates to parity with the official driver. My bar for “promising” may be in a different place.

      • davidgro@lemmy.world · 9 hours ago

        I don’t think an open-source driver will ever fully catch up to the proprietary ones in this case, but for people who want to use only open drivers, it might be enough if it eventually gets somewhat close.

        • MudMan@fedia.io · 9 hours ago

          I guess? Ultimately Nvidia has like 90% plus market share in dedicated GPUs. This needs a very good solution to be acceptable for most potential users.

          I guess for some applications, if you get access to hardware acceleration in some form, it’s at least not a hard blocker. But unless your machine is strictly dedicated to just a subset of applications, who is paying a ton of money for an Nvidia GPU only to use it partially?

          Ah, never mind. I’m just frustrated because I’m part of that 90% and even on the proprietary driver things have been flaky enough to get in my way. I’d still argue that the bar should be set at full usability, not remedial minimum functionality, though.

          • davidgro@lemmy.world · 9 hours ago

            I think you’re absolutely right at the high end, but if I have a cheaper or older machine (especially a laptop) and I’m not going to play AAA games on it anyway, this driver could eventually deliver decent performance with even greater stability than the proprietary ones.

            • MudMan@fedia.io · 8 hours ago

              Sure, I guess? But I also feel like the further you go down that list, the more stable things already are, especially if you’re willing to go shopping for distros that offer specific Nvidia-focused variants.

              I’m also not super clear on what “high end” means in Linux circles, because a bunch of the Nvidia-proprietary features in question have been in place for over half a decade now and are tied to generations, not how expensive the cards are.

              At some point you need to develop the ability to catch up to the proprietary side of things, which means progressing faster than they iterate. I’m not keyed in to day-to-day updates to the point where I can tell if that’s the case, but from the stuff that reaches me organically that doesn’t seem to be what’s happening so far.

              • davidgro@lemmy.world · 8 hours ago

                I meant it in the sense of “could possibly”; I don’t have a guess as to how likely that is.

                I’m extrapolating on the stability thing just from the language it’s written in, which isn’t any kind of guarantee, but I think it’s a good sign.
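
                To make that concrete, here’s a minimal sketch of the kind of guarantee I mean. The Device/probe names are invented for illustration, nothing from the actual driver; the point is that a whole class of crashes gets rejected at compile time instead of surfacing as runtime instability:

                ```rust
                // Hypothetical device handle, invented for this example.
                struct Device {
                    name: &'static str,
                }

                // Imagine this returning None when the hardware is absent.
                fn probe() -> Option<Device> {
                    Some(Device { name: "gpu0" })
                }

                fn main() {
                    // probe() returns an Option<Device>, and using the value
                    // without handling the None case is a compile error, so the
                    // classic "forgot the NULL check" driver bug can't compile,
                    // let alone crash at runtime.
                    match probe() {
                        Some(dev) => println!("found {}", dev.name),
                        None => eprintln!("no device"),
                    }
                }
                ```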

    • Ephera@lemmy.ml · 36 minutes ago

      Well, it shouldn’t. Both C and Rust can achieve the same performance. There’s also no overhead for calling Rust from C or vice versa. Theoretically, some low-level optimizations look less horrid in C, but on the other hand, writing parallel code is significantly easier in Rust. Graphics drivers tend to be all about parallelism, although I can’t say how relevant that actually is in this case.
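
      To illustrate the “no overhead” point: an extern "C" function in Rust uses the C ABI directly, so a call across the boundary is a plain function call with no translation layer in between. This is a standalone sketch with an invented function name, not the kernel’s actual bindings:

      ```rust
      // rust_add is an invented name for illustration, not part of any
      // real binding. #[no_mangle] keeps the symbol name visible to C.
      #[no_mangle]
      pub extern "C" fn rust_add(a: i32, b: i32) -> i32 {
          a + b
      }

      fn main() {
          // Callable from Rust as well, so the sketch runs standalone; from C
          // it would be declared as `int rust_add(int a, int b);` and called
          // like any other function, with no marshalling in between.
          println!("2 + 3 = {}", rust_add(2, 3));
      }
      ```

      And a small example of why parallel code is easier to get right: with scoped threads, the compiler itself proves the borrows are disjoint and race-free, where the equivalent C relies on the programmer being careful:

      ```rust
      use std::thread;

      fn main() {
          let data = [1, 2, 3, 4, 5, 6, 7, 8];
          let (lo, hi) = data.split_at(4);
          let mut sums = (0i64, 0i64);
          // Scoped threads: the compiler checks that the borrows of lo,
          // hi, and sums cannot outlive this scope, and the Send/Sync
          // rules turn data races into compile errors instead of crashes.
          thread::scope(|s| {
              let (a, b) = (&mut sums.0, &mut sums.1);
              s.spawn(move || *a = lo.iter().map(|&x| x as i64).sum());
              s.spawn(move || *b = hi.iter().map(|&x| x as i64).sum());
          });
          println!("total = {}", sums.0 + sums.1);
      }
      ```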

      Having said that, the initial versions of this new driver will likely have worse performance until the codebase matures.