I have a question. I see people, with some frequency, sugar-coating the marriage between Nvidia GPUs and Linux. I get that if you already have an Nvidia GPU, or you need CUDA, or you work with AI and want to use Linux, that is possible. Nevertheless, this is still a very questionable relationship.
Shouldn’t we be raising awareness for anyone planning to play titles that use DX12? I mean, a 15% to 30% performance loss on Nvidia compared to Windows, versus 5% to 15% (and sometimes the same performance or better) on AMD, isn’t that something we should be alerting others about?
I know we wanna get more people on Linux, and NVIDIA’s getting better, but don’t we need some real talk about this? Or is there some secret plan to scare people away from Linux that I missed?
Am I misinformed? Is there some strong reason to buy an Nvidia GPU if your focus is gaming on Linux?
I’m putting together my new Nvidia PC build tonight. I was planning on putting Bazzite on it; should I just use Windows then?
Nvidia cards work just fine on Linux, old issues are parroted around by people who don’t know any better.
I had to roll back my kernel when Debian flung out the last one due to drivers. Have they updated them yet? I haven’t noticed anything in updates.
For anybody trying to make sense of who has trouble on NVIDIA, keep in mind that Debian uses ancient drivers.
Thankfully Debian Stable updated recently, so it has gotten a lot better, but the last I checked, the Debian drivers still did not support explicit sync. This could lead to problems on Wayland.
Remember that “stable” in Debian means that your system will not change much. That is often a good thing, but it can also mean progress comes to Debian much later than to other distros.
Never use Debian as a benchmark for what works on Linux.
Heh, you described my problem exactly. Wayland got busted in my attempt to get it working again. I was hoping that, given the kernel update broke Nvidia, they would push a fix.
Tbh I have been experimenting for a while to see what I need to keep Windows for, and I’m happy enough to make Linux my big partition now. Given the Nvidia driver issue on the new kernel, I’m tempted to look elsewhere.
My Nvidia 1080 just failed driver upgrades on the most recent edition of Fedora. Can’t parrot that myself…
I’d go with Linux no matter what, but this seems exactly like why I feel we should be clearer. People may be building PCs out there because others keep saying that everything is smooth sailing with Nvidia. A lot of it is working now, but there are still downsides, and the recommendation is to go with AMD if you can.
No, you should at least try Bazzite first. I’ve seen people recently talking about how they have no issues with Nvidia and Linux.
I’d say that, given AMD’s contributions to Linux, it’s better to support them with your money instead of NVIDIA.
AMD will have superior support and better power management out of the box hands down.
Nvidia may have a minor performance advantage in some areas depending on the card, but not in a way you would care about unless you’re obsessed with the technical specifics of AAA graphics.
I’ve been on Linux as a dev and daily driver for 20 years, and Nvidia drivers are just problematic unless you know exactly how to fix them when there are issues. That’s an Nvidia problem, not a Linux problem. CUDA on AMD is also a thing if you want to go that route.
The choice is yours.
I’m glad you mentioned knowing how to fix them. My server has hosted Nvidia GPUs for 15 odd years now, working great, and has remained stable through updates by some miracle.
Getting it set up was a nightmare back then though, do not recommend for the faint of heart.
You’re misinformed, mostly.
NVIDIA had driver issues, incompatibility with gamescope (which was required for HDR), and a few bugs in WINE/Proton that caused performance problems in specific games/configurations.
Now, driver issues for the mainline cards (the most common ones on Steam’s hardware survey) occur at about the same frequency as on AMD hardware, and we use Wayland’s native HDR, so gamescope isn’t a concern.
I’ve been using NVIDIA on Linux for 2 years now and I have never seen anything like a 30% performance reduction on any game, and I can also run local AI with acceleration.
As long as you’re using current hardware then you’re fine. If your graphics card was released 2 days ago, or is from the ‘00s then you may experience issues but otherwise NVIDIA cards work just fine.
I had freezes on the latest Nvidia drivers as recently as yesterday, in Wine. Also, Wayland Wine is not ready; it doesn’t even fullscreen properly.
The osu! Linux version is ten times slower than Wine using the same graphics backend. Yes, I get over 1000 fps in Wine and only 100 natively. It would be fine if it didn’t get choppy and drop even lower during the busiest parts of the game.
Just because it works for you doesn’t mean it doesn’t have issues
Conversely, just because it doesn’t work for you that doesn’t mean that there are issues. I use Wayland Wine for everything, it works fine for me and even eliminates hitching caused by XWayland.
If you’re using a graphics card driver that’s newer than the version of wine then you could have problems, but this is true if you’re using AMD, NVIDIA or Intel.
Comparing osu native vs wine has nothing to do with NVIDIA or AMD hardware.
No, if there are issues, there are issues. The logic only works one way:
One person not having issues doesn’t mean nobody has issues. But if even one person has issues, it necessarily means some people have issues.
So are you saying that those are false claims?
https://forums.developer.nvidia.com/t/directx12-performance-is-terrible-on-linux/303207
Sorry for the Reddit link: https://www.reddit.com/r/linux_gaming/comments/1nr4tva/does_the_nvidia_dx12_bug_20ish_performance_loss/
As you can see, people have been reporting this from 2 years ago up to 14 days ago.
There isn’t a global 30% performance loss. There are specific games/configurations that have performance issues and bugs, but it isn’t all games.
For example, there is a current bug when using some features in VKD3D (like ray tracing), which NVIDIA has identified and is working on a fix for. The problem isn’t Linux-specific; if you use VKD3D on Windows, it has the same problem.
The second Reddit link is a user confusing a Baldur’s Gate 3 bug, where Vulkan was implemented in a buggy way, with a performance problem.
There are always bugs and performance issues that appear and get fixed, that’s the nature of Linux. The social media meme “NVIDIA sucks on Linux” is based on old issues when NVIDIA cards had bugs that broadly affected games and other software to the point where it required a lot of effort (like patching your own software using git).
This is not the case now, NVIDIA works without major issues. The strongest reason to use NVIDIA over AMD would be if you used CUDA to run local AI. AMD doesn’t work with CUDA and the projects that fix this are in the alpha stages.
Gaming-wise, unless you play video games by staring at MangoHUD and comparing your historical frame-time graphs across multiple OSs, it works just fine.
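If you want to sanity-check which acceleration stack your local AI setup is actually using, here’s a minimal PyTorch sketch. One assumption worth flagging: ROCm builds of PyTorch deliberately reuse the torch.cuda API, so “cuda” showing up doesn’t necessarily mean NVIDIA.

```python
# Minimal sketch: report which GPU backend PyTorch sees.
# Assumes a CUDA (NVIDIA) or ROCm (AMD) build of torch is installed;
# ROCm builds report through the torch.cuda namespace as well.
import torch

if torch.cuda.is_available():
    backend = "ROCm/HIP" if torch.version.hip else "CUDA"
    print(f"Backend: {backend}")
    print(f"Device:  {torch.cuda.get_device_name(0)}")
else:
    print("No GPU acceleration available; running on CPU.")
```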
There isn’t a global 30% performance loss. There are specific games/configurations that have performance issues and bugs, but it isn’t all games.
That was not what I said. I don’t recall saying that there is a global 30% performance loss. I’m sorry if I left room for that interpretation.
There are always bugs and performance issues that appear and get fixed, that’s the nature of Linux.
This one in particular seems to be taking Nvidia some time to fix.
This is not the case now, NVIDIA works without major issues.
I don’t think I was implying that it doesn’t work. My point is that for certain games that rely on certain technologies, the Nvidia drivers are not optimized enough to reach Windows levels, or even the levels AMD reaches on Linux with equivalent cards. It may be worth reviewing the Nvidia forum link that I posted first.
I still allow for the possibility that I’m missing something and have more to learn, although I’m not following your reasoning completely.
Finally, I just want to point out that I have no strings attached to any GPU maker. I wish we had more options, but it sounds like, if we want something reasonable with good open source driver support across many combinations of games, hardware, and technology, AMD is our only choice on Linux, given this incidental poor DX12 performance on Nvidia GPUs under Linux.
If you want to use Linux, please choose AMD. I helped install CachyOS on my sister’s RTX 5080 system and it’s horrible. 40% performance loss. She’s going back to Windows.
I guess this is because the 5080 drivers are still fresh? I used to play on a 1650S, and although the Wayland environment was the worst shit I have ever experienced, gaming in general on X11 was normal.
Nevermind, she’s sticking with Linux. Tinkering with it actually fixed most of the major issues.
I only play older games, open source games (like Pioneer Space Sim, Luanti), and mostly emulate PS2 (could do PS3/PS4, you bet), so AMD is fine for my use case and works out of the box. I know Nvidia’s Linux support has improved, which means the latest graphics cards pretty much work out of the box too. But on principle, I support AMD for the work they put into Linux.
I feel like most people who use Nvidia on Linux just got their machine before they were Linux users, with a small subset for ML stuff.
Honestly, I hear ROCm may finally be getting less horrible, is getting wider distro support, and supports more GPUs than it used to, so I really hope AMD will become as livable an ML dev platform as it is a desktop GPU.
Yep, that’d be me. That said, if I were to buy a new GPU today (well, tomorrow; waiting on Valve’s announcement of its next HMD), I might still get an NVIDIA, because even though I’m convinced 99% of LLM/GenAI is pure hype, if the 1% turns out to be useful, ethically built, and able to run on my hardware, I’d be annoyed if it couldn’t because ROCm is just a tech demo that’s too far behind performance-wise. That said, the chance is so ridiculously low I’d probably pick the card that treats the open ecosystem best.
From what I’ve heard, ROCm may be finally getting out of its infancy; at the very least, I think by the time we get something useful, local, and ethical, it will be pretty well-developed.
Honestly, though, I’m in the same boat as you and actively try to avoid most AI stuff on my laptop. The only “AI” thing I use is I occasionally do an image upscale. I find it kind of useless on photos, but it’s sometimes helpful when doing vector traces on bitmap graphics with flat colors; Inkscape’s results aren’t always good with lower resolution images, so putting that specific kind of graphic through “cartoon mode” upscales sometimes improves results dramatically for me.
Of course, I don’t have GPU ML acceleration, so it just runs on the CPU; it’s a bit slow, but still less than 10 minutes.
I use local AI for speech/object recognition in my video security system and for control of my HomeAssistant and media services. These services are isolated from the Internet for security reasons, which wouldn’t be possible if they required OpenAI to function.
ChatGPT and Sora are just tech toys, but neural networks and machine learning are incredibly useful components. You would be well served by staying current on the technology as it develops.
ROCm works just fine on consumer cards for inferencing, is competitive or superior in $/tokens-per-second, and beats NVIDIA on power consumption. ROCm 7.0 seems to be giving a >2x uplift on consumer cards over 6.x, so that’s lovely. Haven’t tried 7 myself yet, waiting for the dust to settle, but I have no issues with image gen, text gen, image tagging, video scanning, etc. using containers and distroboxes on Bazzite with a 7800XT.
Bleeding-edge and research work tends to be CUDA, but mainstream use cases are getting ported reasonably quickly. TL;DR: unless you’re training or researching (unlikely on consumer cards), AMD is fine and performant, plus you get a stable Linux and great gaming.
I fall into this category. I went Nvidia back in ’16 when I built my gaming rig, expecting I’d be using Windows for a while since gaming on Linux at that point still wasn’t the greatest. A few years later I decided to try out a 5700 XT (yeah, piss-poor decision, I know) because I wanted to future-proof in case I decided to swap to Linux. The 5700 XT had the worst driver reliability I’ve ever seen in a graphics card, and I eventually got so sick of it that I went back to Nvidia with a 4070. Since then my life opened up more, so I had the time to swap my gaming rig to Linux, and here we are.
Technically I guess I could still put the 5700 XT back in, and it would probably work better there than in my media server, since Nvidia seems to have better isolation support in virtualized environments. But I haven’t bothered, mostly because getting the current card working on my rig was a pain, and I don’t feel like taking apart two machines to play hardware musical chairs.
I completely upgraded my desktop like a month before I decided to make the switch. If I had planned ahead just a bit more, I would’ve gone with an AMD card for sure. This 4090 is still new enough that I can probably trade it in, but that’s such a pain in the ass.
That is correct in my case; I started with Linux earlier this year. Will be switching to AMD for the next upgrade.
I did this.
From:
Intel i7-14700K + RTX 3080 Ti
To:
Ryzen 7700X + RX 7900 XTX.
The difference on Wayland is very big.
Did you see any performance change? Because that setup seems pretty equivalent to me.
Absolutely. All my issues just disappeared, performance went way up, and the smoothness is very noticeable even on the desktop. On top of that, there are things like Steam Game Mode that only work on AMD because of their FOSS driver.
NVIDIA has finally learned the lesson, but they are a few years behind AMD; it will take time for their FOSS driver to mature.
No, nvidia are evil unreliable pieces of shit.
I use AMD wherever possible, simply because they support Linux. There’s really no other reason needed. I don’t care about CUDA or anything else that’s only vaguely relevant to me. I’d rather drive a mid-range car that gives me freedom than a high-end car that ties me down.
CUDA acceleration.
No
Thanks. That is what I thought, but it’s good to confirm we’re not missing something.
Literally only CUDA. ROCm mostly works.
It is better to go with AMD because the AMD drivers are built into the ISO, and there’s less headache for gaming.
Two pretty massive facts for anybody trying to answer this question:
- Since driver version 555, explicit sync has been supported. This makes a massive difference to the experience on Wayland. Most of the problems people report are with drivers earlier than this (e.g. black screens and flicker).
- Since driver version 580, NVIDIA uses Open Source modules to interact with the kernel. These are not Open Source drivers. They are the proprietary drivers from NVIDIA that should now “just work” across kernel upgrades (like AMD has forever). This solves perhaps the biggest hassle of dealing with NVIDIA on Linux.
Whether you get to enjoy these significant improvements depends on how long it takes stuff to make it to your distribution. If you are on Arch, you have this stuff today. If you are on Debian, you are still waiting (even on Debian 13).
This is not an endorsement of either distro. They are simply examples of the two extremes regarding how current the software versions are in those distros. Most other distros fall somewhere in the middle.
All this stuff will make it to all Linux users eventually. They are solved problems. Just not solved for everyone.
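If you’re not sure what your distro has actually shipped you, here’s a quick sketch to check the loaded driver against those two version thresholds. It assumes the nvidia kernel module is loaded (so the sysfs file below exists); note that a version >= 580 alone doesn’t prove the open kernel modules are in use, since the proprietary ones can still be installed.

```python
# Sketch: check the loaded NVIDIA driver against the milestones above
# (555 = explicit sync, 580 = open kernel modules by default).
# Assumes the nvidia kernel module is loaded so this sysfs file exists.
from pathlib import Path

version_file = Path("/sys/module/nvidia/version")
if not version_file.exists():
    raise SystemExit("nvidia kernel module not loaded")

version = version_file.read_text().strip()  # e.g. "580.82.07"
major = int(version.split(".")[0])

print(f"Driver version: {version}")
print(f"Explicit sync (>= 555):       {'yes' if major >= 555 else 'no'}")
print(f"Open kernel modules (>= 580): {'possibly' if major >= 580 else 'no'}")
```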
Does KMS work with an Nvidia GPU now? I remember ages ago the boot sequence would be stuck at 640x480 until X started.
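For anyone who wants to check their own machine: kernel mode setting on NVIDIA hinges on the nvidia_drm module’s modeset flag. A small sketch, assuming the nvidia_drm module is loaded and exposes that parameter in sysfs:

```python
# Sketch: check whether NVIDIA kernel mode setting (KMS) is enabled.
# KMS is what lets the console/boot sequence run at native resolution.
# Assumes the nvidia_drm module is loaded.
from pathlib import Path

param = Path("/sys/module/nvidia_drm/parameters/modeset")
if param.exists():
    enabled = param.read_text().strip() == "Y"
    print(f"nvidia_drm modeset: {'enabled' if enabled else 'disabled'}")
    if not enabled:
        print("Try adding nvidia_drm.modeset=1 to the kernel command line.")
else:
    print("nvidia_drm module not loaded.")
```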
Are you sure that the DX12 performance loss is already addressed on Nvidia GPUs? Do you have a source?
https://forums.developer.nvidia.com/t/directx12-performance-is-terrible-on-linux/303207
They are the proprietary drivers from NVIDIA that should now “just work” across kernel upgrades (like AMD has forever).
Are you sure that is how it works for AMD in Linux?
Yes: HDMI 2.1. If you use a TV as a monitor, you won’t get 4K120 with AMD cards on Linux because the HDMI Forum are assholes.
You can still get it if you use DisplayPort though, no?
Yes, but TVs don’t have DP.
I have a 6900 XT and it outputs 4K 120, and I never had issues with it on multiple distros. Lately Bazzite has been behaving as expected, so I don’t know where this information is coming from, besides the argument that HDMI is closed source as opposed to DisplayPort.
HDMI 2.0 doesn’t have the bandwidth for 4K120; DisplayPort and HDMI 2.1 do. The AMD driver doesn’t support HDMI 2.1 because the HDMI Forum didn’t allow AMD to implement it in their open source Linux driver. You can still get 4K120 with DP, and even over HDMI if you use a limited colorspace.
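The back-of-the-envelope arithmetic checks out; here’s a rough sketch that ignores blanking intervals and link encoding overhead (so real requirements run somewhat higher), with the “limited colorspace” trick presumably being 4:2:0 chroma subsampling:

```python
# Rough uncompressed video bandwidth for 4K @ 120 Hz,
# ignoring blanking/encoding overhead (real figures run higher).
def gbps(width: int, height: int, refresh: int, bits_per_pixel: int) -> float:
    return width * height * refresh * bits_per_pixel / 1e9

rgb_444 = gbps(3840, 2160, 120, 24)  # 8-bit RGB   -> ~23.9 Gbps
yuv_420 = gbps(3840, 2160, 120, 12)  # 8-bit 4:2:0 -> ~11.9 Gbps

print(f"4K120 8-bit RGB:   {rgb_444:.1f} Gbps")
print(f"4K120 8-bit 4:2:0: {yuv_420:.1f} Gbps")
# HDMI 2.0 carries roughly 14.4 Gbps of data (18 Gbps link rate with
# 8b/10b encoding), so full RGB doesn't fit but 4:2:0 does. HDMI 2.1
# (~42 Gbps) and DP 1.4 (~25.9 Gbps) both have room for the full signal.
```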
I agree. I mentioned the lack of open source support for HDMI 2.1 because that organization is exceptionally lazy.
It’s greed. Not laziness.
That is a fair reason and a good reminder, actually. Thanks!