• The state of nVidia Prime render offload in Linux


    #2289993

    I’ve been using Ubuntu and related distros exclusively since I moved to Linux, but something has gone wrong with the latest release (20.04) that has had me looking at alternatives. KDE Connect, a tool I’ve come to value a great deal, does not work properly on any of the 20.04 releases, and Neon has discontinued support for the 18.04-based version, so I’m writing to you from Fedora now. Everything is as well tuned and working as it ever was in Neon… and KDE Connect works.

    In the process of learning how to tweak some of the things in Fedora that are different from Ubuntu, I’ve found it’s a common opinion that Ubuntu has better support for nVidia Prime than other distros. Indeed, it comes with a package preinstalled that lets the user switch between the nVidia GPU and the Intel one at will, though it’s still not as convenient as in Windows, where (so I am told) you can pretty much ignore all of it. Run ordinary applications that don’t need the discrete GPU and the nVidia remains powered down, but run something more demanding and it springs to life and handles the task, without the user ever perceiving that anything special just happened. When the demanding task finishes, the nVidia is powered off again and the integrated Intel unit takes over.
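
    If memory serves, that preinstalled tool is the nvidia-prime package, driven from the command line by prime-select. Roughly like this (exact output varies, so treat it as illustrative):

        prime-select query        # report the current mode
        sudo prime-select intel   # integrated GPU only; takes effect after logging out and in
        sudo prime-select nvidia  # route everything through the nVidia GPU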

    That automatic switching is apparently not perfect, so some people prefer to assign certain applications to the nVidia GPU and leave everything else to the Intel. It works well: there’s pretty much never a time a gamer will want to run a game on the slower integrated GPU rather than the nVidia, and everything else runs fine on the integrated one, so there’s no need to fix what isn’t broken.

    With the 435 series driver, the nVidia developers finally introduced a long-desired feature for Linux: Prime render offload. It’s no longer necessary to manually switch to the nVidia GPU and log out and in again; you can have certain applications always run on the nVidia while everything else runs on the Intel.
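
    Under the hood, the offload is requested per application with a couple of environment variables that nVidia documents for the 435+ drivers. A quick sanity check (assuming glxinfo from mesa-utils is installed) looks something like this:

        # run a single program on the discrete GPU without changing the global mode
        __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"
        # the renderer string should name the GeForce rather than the Intel iGPU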

    The tools that are used to select Intel or nVidia mode got a third choice: On-demand. That enables the usually-Intel, nVidia-on-demand setup. Initially that was one of the new, great features that drew me to Ubuntu 20.04, but I soon figured out how easy it was to bring it to 18.04. Just a few files from the 20.04 repo and it worked just as in 20.04.
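
    On 20.04 (and on 18.04 once the newer files are in place), the third mode is selected the same way as the other two, as far as I recall:

        sudo prime-select on-demand  # Intel by default, nVidia available for per-application offload
        prime-select query           # should report on-demand after the next log out/in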

    On my G3 gaming laptop, the external HDMI port is wired directly to the nVidia card, while internally there is no connection between the nVidia card and the built-in display panel. The display panel is wired to the Intel GPU only. When one turns on the nVidia (or when it turns itself on, as in Windows), the OS sends the rendering work to the nVidia GPU, and the fully rendered, ready-to-display frames are streamed back over the PCIe bus to the CPU package, which is also where the integrated GPU lives, and the Intel displays them. It’s a clever concept… when a graphically demanding program is running (typically a game), the PCIe lanes from the CPU to the GPU are heavily utilized, but the ones going the opposite direction are essentially idle. The nVidia GPU is already wired to the PCIe bus, and so is the CPU, and the lanes going back to the CPU are empty, so why not use them?
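
    You can actually see the two GPUs and the roles they advertise with xrandr. The provider names vary by driver and setup, so take this as a rough guide:

        xrandr --listproviders
        # expect two providers: one named "modesetting" (the Intel) and one named
        # something like "NVIDIA-G0" (the discrete card), each with its list of
        # Source/Sink Output and Offload capabilities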

    It works really well in Linux, even though it is one of those things that sounds like it would be a real headache.

    After I installed Fedora, it was (as you would expect) working quite well with the Intel integrated GPU, but Fedora doesn’t ship non-free software, so there are no nVidia drivers in its repos. Ubuntu’s repos contain the nVidia drivers, so it’s really easy to get them up and running, but Fedora required an extra step. I went to the nVidia site, downloaded the distro-agnostic .run driver, and installed that. It was a little fiddly, as installing .run drivers often is, but I got it working.
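
    For anyone curious, the .run route went roughly like this (package names from memory, so check them against dnf’s suggestions). The installer wants kernel headers and a compiler, and it is happiest run from a text console with the display manager stopped:

        sudo dnf install kernel-devel kernel-headers gcc make  # build prerequisites for the kernel module
        chmod +x NVIDIA-Linux-x86_64-*.run                     # whatever version was downloaded from nvidia.com
        sudo systemctl isolate multi-user.target               # drop to a text console, stopping the display manager
        sudo ./NVIDIA-Linux-x86_64-*.run                       # follow the prompts (nouveau blacklisting and so on)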

    After that, it was hard to tell right off the bat whether the nVidia was working. The little temperature-monitoring widget in the systray was reporting readings, which was a good sign, so I tried running something on it using the on-demand method. In this case that meant toggling the “use Prime render offload” option in the Lutris settings to ON (it was actually already on from my Neon installation, but you get the idea), and it immediately switched to the nVidia and ran the thing. There was no tool to switch between Intel, nVidia, or on-demand mode, but the on-demand part worked nicely even without being specifically selected.
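
    Another way to confirm that a given program really landed on the discrete GPU is nvidia-smi, which the .run driver installs. While the game is running, its process should show up in the list:

        nvidia-smi  # shows driver version, GPU utilization, and the processes currently using the nVidia GPU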

    The thing I was not certain about was whether the external HDMI port would work without my specifically switching to the nVidia. I had read about people with my model of laptop who did not have the nVidia driver properly installed plugging in an external monitor and seeing only a blank, black screen. Is that what I would see?

    I haven’t been able to test that idea for a while, as my barely-not-a-kitten cat is still in that phase where she likes to chew everything, and she wrecked my HDMI cable. My monitor is old enough not to have an HDMI port… it has a DVI port, and I use an adapter to convert it to HDMI. I ordered a new cable, and in the meantime I used a DVI cable. The GPU in my desktop PC has two DVI ports and one HDMI, so that was no problem, but the laptop has no DVI port. Now I have the new cable, which I armored with some automotive-style plastic split loom (as I did with all the other vulnerable wires).

    So just a bit ago, I verified that the G3 was using the Intel GPU. One of the widgets for the KDE desktop is a simple indicator in the systray that shows the nVidia logo grayed out when the PC is using the Intel GPU and turns it green when the nVidia is in use. I plugged in the HDMI cable, and a picture appeared on the monitor. The laptop display panel showed the KDE popup asking which screen configuration I wanted to use… and at the same time, the little nVidia logo went green.

    I had been concerned that without being able to switch exclusively to the nVidia, I’d lose the ability to use the external monitor, but happily, that’s not the case at all. It just switches by itself when it needs to, and as soon as I unplug the HDMI cable, the green nVidia logo turns to gray again.

    The only caveat so far is that the nVidia GPU is not turned off while in Intel mode. It’s idle, in a low-power state, but it’s still sipping a bit of juice. So far, nVidia has only enabled automatic power-off on its newest generation of GPUs, and there is apparently a technical reason for this, not just nVidia trying to sell more GPUs. It has something to do with older generations lacking a standard hardware mechanism (in the hardware of the PC, not the GPU) for turning the GPU off, while the latest generations build a standard method into the GPU itself. Still, if Windows can manage it across varying hardware configurations, then so can Linux.
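
    For what it’s worth, the driver will tell you what it thinks about runtime power management, and on GPUs of the newest (Turing-era) generation there is a documented module option to request the fine-grained power-down. On older chips it simply has no effect, as far as I can tell:

        cat /proc/driver/nvidia/gpus/*/power  # reports the runtime D3 (power-off) status the driver supports
        # on supported GPUs, fine-grained power management can be requested with a module option:
        echo 'options nvidia NVreg_DynamicPowerManagement=0x02' | sudo tee /etc/modprobe.d/nvidia-pm.conf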

    From my experience trying to get Prime render offload working in 18.04, I know of a way to put it in Intel mode and have the nVidia turned off completely (using bbswitch, a third-party kernel module), though it would still require logging out and in. With or without that, I don’t really use the G3 on battery. As a gaming PC, it has a rather bulky and heavy cooling system (by laptop standards), so it’s not something that would be convenient to carry around even if the nVidia were completely off. And gaming on battery is certainly not practical; the current draw is so high that it would be a very short gaming session without the AC power adapter.
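
    The bbswitch approach is blunt but effective: with the nvidia kernel modules unloaded, it uses the laptop’s ACPI methods to cut power to the discrete GPU entirely. From memory it looks about like this:

        sudo modprobe -r nvidia_drm nvidia_modeset nvidia_uvm nvidia  # the dGPU must not be in use
        cat /proc/acpi/bbswitch                                       # e.g. "0000:01:00.0 ON"
        echo OFF | sudo tee /proc/acpi/bbswitch                       # power the nVidia GPU off
        cat /proc/acpi/bbswitch                                       # should now report OFF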

    It’s a neat time to be in the Linux world. Things are improving at a sprightly clip, while in the Windows world, you get a special dedicated button to bring up the emoji picker. If you were a PC hobbyist or enthusiast who used to think computing was fun before the modern era of Windows, perhaps consider giving Linux a try.

     

    Dell XPS 13/9310, i5-1135G7/16GB, KDE Neon 6.2
    XPG Xenia 15, i7-9750H/32GB & GTX1660ti, Kubuntu 24.04
    Acer Swift Go 14, i5-1335U/16GB, Kubuntu 24.04 (and Win 11)

    • #2292395

      … hm, funny thing, I thought I’d replied to this…

      Been playing with that a bit myself, on Ubuntu 20.04.

      There’s this one laptop that’s much newer than anything else here, though not more powerful, and in many ways less-than-optimal hardware… (one of the kids needed “any laptop” in a hurry a while back, and this was on discount at a local shop.)

      It’s a Lenovo Ideapad, fairly small and light, with a GeForce MX series nVidia GPU in addition to the usual Intel integrated graphics in an 8th-gen processor.

      Offload seems to work fine with the 440 series driver even in lockdown mode, after enrolling the MOK (the mokutil bit is sketched below)… now I’m waiting for the kernel to allow hibernation during lockdown.

      (Or I could just turn off Secure Boot, or override Ubuntu’s policy of enforcing lockdown when Secure Boot is detected, I suppose. Meh, this thing has its kernel living on an all-encrypted disk anyway, including /boot, swap, and the hibernation image, so…)
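
      For reference, the MOK enrolment was just the usual mokutil dance; the key path below is Ubuntu’s standard DKMS location, if I remember it right:

          sudo mokutil --sb-state                                 # confirm Secure Boot (and thus lockdown) is on
          sudo mokutil --import /var/lib/shim-signed/mok/MOK.der  # Ubuntu’s DKMS signing key, from memory
          # set a one-time password, reboot, and finish enrolment in the MOK Manager screen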

    • #2292400

      … hm, funny thing, I thought I’d replied to this…

      For the sake of your sanity, you did! On a very similarly titled topic by the same author 🙂

      Windows - commercial by definition and now function...