Old but gold. Posting for anybody who hasn’t seen this yet.

  • chaorace@lemmy.sdf.org · 1 year ago

    I’m particularly amused by the pro-NVIDIA “it just works” comments. Compared to what exactly? With AMD, the 3D acceleration driver is bundled directly into Mesa, so it’s already working out of the box on the first boot of almost all desktop distros. That’s how drivers are supposed to work on Linux, and it has taken NVIDIA 10+ years (and counting…) to get with the basic program.
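
    For anyone who wants to check this on their own machine, here’s a rough sketch (assuming pciutils and mesa-utils / mesa-demos are installed; exact package names vary by distro):

        lspci -k | grep -A 3 -E 'VGA|3D'                  # kernel driver bound to the GPU (e.g. amdgpu)
        glxinfo -B | grep -E 'OpenGL (vendor|renderer)'   # Mesa userspace driver / renderer string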

    I applaud the long overdue decision to move their proprietary firmware directly onto the card and open-source the rest of the kernel driver, but I’ll remind you folks of a few things:

    • The open-source driver is still in alpha, with no timeline for a stable release
    • NVIDIA has so far elected to control their own driver releases instead of incorporating 3D acceleration support into Mesa

    NVIDIA had to be dragged kicking and screaming to get this far, and they’re still not up to scratch. There’s still plenty of fuel left in the “Fuck NVIDIA” gas tank.

    • Fryboyter@discuss.tchncs.de · 1 year ago

      I’m particularly amused by the pro-NVIDIA “it just works” comments. Compared to what exactly?

      Compared to nothing. I have used Nvidia graphics cards under Linux for many years. The last one was a GTX 1070. In order for the cards to work, I had to install the driver once with the command pacman -S nvidia-dkms. So the effort was very small.
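
      For reference, the whole setup on Arch amounts to something like this sketch (the headers package depends on which kernel you run; linux-headers is assumed for the stock kernel):

          sudo pacman -S nvidia-dkms linux-headers   # DKMS rebuilds the module automatically on kernel updates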

      By the way, I am currently using a 6800 XT from AMD. I therefore don’t want to defend Nvidia graphics cards across the board.

      Unfortunately, when it comes to Nvidia, many people do not judge objectively. Torvalds’ “fuck you”, for example, referred to what he saw as Nvidia’s lack of cooperation with the kernel developers, and I think he was right. But it was never about how well or poorly the graphics cards actually worked under Linux, which is unfortunately what many Linux users claim, whether out of a lack of knowledge or on purpose.

      Since then, some things have changed, and Nvidia has contributed code to several projects such as Plasma and Mesa to improve the situation regarding Wayland.

      • chaorace@lemmy.sdf.org · 1 year ago

        Compared to nothing. I have used Nvidia graphics cards under Linux for many years. The last one was a GTX 1070. In order for the cards to work, I had to install the driver once with the command pacman -S nvidia-dkms. So the effort was very small.

        Kernel modules work until they don’t. I’m genuinely glad that you’ve had a good experience and – despite appearances – I’m not interested in provoking a vendor flamewar… but the fact remains that among the three major patterns (builtin, userland, module), modules are the most fragile and least flexible. I’ll cite this response to my parent comment as an example.
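
        As a sketch of what “until they don’t” looks like in practice, these are the kinds of checks you end up running after a kernel update breaks an out-of-tree DKMS module (the commands are standard; the exact output obviously varies per setup):

            dkms status                         # did DKMS rebuild the nvidia module for the new kernel?
            lsmod | grep nvidia                 # is the module actually loaded?
            journalctl -b -k | grep -i nvidia   # kernel log for version-mismatch or taint messages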

        Unfortunately, when it comes to Nvidia, many people do not judge objectively. Torvalds’ “fuck you”, for example, referred to what he saw as Nvidia’s lack of cooperation with the kernel developers, and I think he was right. But it was never about how well or poorly the graphics cards actually worked under Linux, which is unfortunately what many Linux users claim, whether out of a lack of knowledge or on purpose.

        That’s a fair point, but to a certain extent I think this overlooks the importance of developer sentiment on a project like Linux. Take (Intel) MacBooks as an extreme example: kernel developers liked the hardware enough to support it despite utter vendor indifference. It’s clearly a case of hypocrisy compared to NVIDIA, who (at the very least) participates, but at the end of the day people will show love for the things that they love. NVIDIA remains unloved, and I do feel that this bleeds through to the user experience a fair amount.

        In any case, you’re right to say that legitimate criticisms are often blown out of proportion. Developer problems aren’t necessarily user problems, even if we sometimes romanticize otherwise.

    • filister@lemmy.world · 1 year ago

      The problem is that Nvidia’s software stack is much more advanced. Take machine learning acceleration, for example: CUDA is miles better than ROCm and far more widely supported. I wish AMD were more serious about GPUs and made greater strides, but they overslept and let Nvidia become a de facto monopolist with their anti-competitive, anti-consumer strategies and closed-source approach.
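
      To illustrate the two stacks side by side, a rough sketch (tool names assumed installed; ROCm officially supports a much narrower set of GPUs and distros than CUDA does):

          nvidia-smi && nvcc --version   # NVIDIA: driver status plus the CUDA toolkit compiler
          rocm-smi && hipcc --version    # AMD: the ROCm equivalents, where the hardware is supported at all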

      Nvidia is the new Apple, unfortunately.

      • De Lancre@lemmy.world · 1 year ago

        You basically have two options: suffer on Nvidia, because some features may never get developed, or suffer on AMD, because the developed features just straight up do not work.

    • j4k3@lemmy.world · 1 year ago

      I’m messing with shitvidia right now on a new AAA laptop after people said it just works. I just spent all day trying to set up EFI keys for Secure Boot because shitvidia doesn’t sign their kernel modules. Plus their drivers are outdated and their documentation is terrible. I failed today because Gigabyte is another shit company with a proprietary (theft) bootloader setup, so no one can lock down UEFI Secure Boot with any PK except theirs. I can boot with Fedora’s Microsoft-signed key and keep Secure Boot enabled, but then I can’t use the goddamn GPU I bought the piece of shit for in the first place. Shitvidia will always be shitvidia. This proprietary bullshit is straight up theft. It should be illegal to sell anything with digital restrictions of any kind. Dealing with all this, I think Stallman was being conservative. Fuck these criminals.
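
      For anyone hitting the same wall: when the firmware won’t take your own PK, the usual workaround is to keep the Microsoft/shim keys, enroll your own Machine Owner Key, and sign the module yourself. A rough sketch with illustrative paths and key names (not taken from my actual setup):

          openssl req -new -x509 -newkey rsa:2048 -nodes -days 3650 \
              -keyout MOK.key -out MOK.der -outform DER -subj "/CN=Local module signing/"
          sudo mokutil --import MOK.der                     # enroll the key; confirm it in the MOK manager on next reboot
          sudo /usr/src/kernels/$(uname -r)/scripts/sign-file sha256 MOK.key MOK.der \
              /lib/modules/$(uname -r)/extra/nvidia.ko      # path to the unsigned module is illustrative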

      • De Lancre@lemmy.world · 1 year ago

        Your other option would be to use an AMD iGPU, because good luck finding an AMD discrete GPU in a notebook these days. And even then, you would just be “messing with shit-amd” instead.