One is not necessarily better than the other, as both have areas where they excel and where they leave something to be desired. For the most part, AMD is better when it comes to budget and mid-range cards, whereas Nvidia is the only way to go for high-end graphics cards.
But like most arguments, there can be a lot of bias regarding this topic. So, is one necessarily better than the other from an objective standpoint?
Let’s find out.
Performance and Pricing
Naturally, the first thing that comes to mind whenever GPUs come up is performance. What do the benchmark scores look like? How many frames per second can it push? Can it run Crysis at medium-high?
Well, we simply cannot generalize here, since performance varies greatly from model to model, and with Nvidia and AMD the race runs in a zig-zag pattern, at least at the low end and in the mid-range.
Namely, the two giants are fairly evenly matched when it comes to the more affordable solutions, though more often than not, AMD has the upper hand here. Their graphics cards almost always offer better value for your money, as they can represent a significant step up in performance while barely costing more than Nvidia's offerings. Sometimes they even perform better and cost less.
However, past the mid-range, i.e., above the $250-$350 price bracket, things look different. From there on, AMD has virtually no foothold (apart from the two Vega GPUs, which are hardly cost-effective), and Nvidia holds a monopoly on high-end GPUs. As such, prices climb considerably, but so does overall performance.
For now, the only viable new AMD GPUs are the RX 500-series graphics cards, which consistently offer more bang for your buck than competing Nvidia models. Beefier cards such as the RX 580 and the RX 590 are quite competent even in QHD, but for serious QHD performance or 4K gaming, Nvidia is still the only way to go. That said, the situation might change in the near future with the upcoming releases of AMD's Navi cards and Nvidia's mid-range Turing models.
There is simply no denying that Nvidia uses technology that is more advanced overall. Their GPUs tend to perform better at computing tasks, they generate less heat, and they consume less power.
AMD cards, on the other hand, make up for what they lack in the processing department by increasing the memory bandwidth on their lower-priced models. Still, they use more power and are notorious for how hot they can get.
However, this gap has been closing year after year, so the differences in all of the aforementioned factors are now marginal at best. AMD's Polaris lineup consists mainly of 14nm GPUs, plus a single 12nm model, while Nvidia's latest Turing models are all 12nm GPUs. AMD's upcoming Navi microarchitecture will reportedly use a 7nm fabrication process, and if that ends up being true, it will be interesting to see how Nvidia handles the new competition.
CUDA Cores vs Stream Processors
CUDA cores and Stream Processors are, respectively, Nvidia's and AMD's names for their GPU cores. Since they are essentially the same thing, neither is inherently better than the other, and no concrete performance estimates can be drawn by comparing the number of CUDA cores in one GPU with the number of Stream Processors in another.
In the end, it’s all about software optimization. As such, the only area where you’ll see a noticeable difference in performance is when certain proprietary technologies come into play, such as Nvidia PhysX or Nvidia HairWorks.
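One way to see why cross-vendor core counts mislead is to compute theoretical FP32 throughput using the standard back-of-the-envelope formula: cores × clock × 2 operations per clock (a fused multiply-add counts as two operations). Here is a minimal Python sketch using published specs for two real mid-range cards of this era; the function name is ours:

```python
def theoretical_tflops(cores: int, boost_clock_ghz: float) -> float:
    """Peak single-precision throughput: cores * clock * 2 (FMA = 2 ops/clock)."""
    return cores * boost_clock_ghz * 2 / 1000  # GFLOPS -> TFLOPS

# AMD RX 580: 2304 Stream Processors at ~1.34 GHz boost
rx_580 = theoretical_tflops(2304, 1.34)
# Nvidia GTX 1060 6GB: 1280 CUDA cores at ~1.71 GHz boost
gtx_1060 = theoretical_tflops(1280, 1.71)

print(f"RX 580:   {rx_580:.2f} TFLOPS")   # ~6.17 TFLOPS
print(f"GTX 1060: {gtx_1060:.2f} TFLOPS") # ~4.38 TFLOPS

# Despite a ~40% gap on paper, these two cards trade blows in real games,
# which is why raw core counts across vendors predict very little.
```

The RX 580 leads by roughly 40% in theoretical throughput, yet in actual games the two cards perform similarly, which illustrates the point: the numbers only become comparable within a single architecture.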
Good, well-optimized software can make a world of difference for any piece of hardware, something that a certain company has aptly demonstrated time and time again. For graphics cards, there are two things to consider: the drivers and the control panel.
There is not much to say about the drivers themselves, as both Nvidia and AMD release new and stable drivers frequently. If we had to be nitpicky, we’d have to say that Nvidia does have a slightly better track record in terms of stability and consistency.
As for the control panels, we have the Nvidia Control Panel and the AMD Control Center. You will immediately notice that the Nvidia Control Panel looks quite dated – as a matter of fact, it still looks like it’s running on the long-discontinued Windows XP. AMD’s Control Center, on the other hand, looks a whole lot better, boasting a clean and modern design, complete with some eye candy in the form of background blur effects.
As for the actual functionality, the two are largely on even terms, barring the features that are unique to each company’s GPUs. Speaking of which…
In this section, we will take a closer look at several features that are specific to either Nvidia or AMD GPUs and see how they fare when pitted against one another.
Recording and Streaming – Nvidia ShadowPlay vs AMD ReLive
As you probably know, you will inevitably take an FPS hit when recording or streaming your gaming sessions. So, unless you are willing to invest in a decent capture card, your best bet for keeping a stable framerate is the software that comes with your GPU drivers: ShadowPlay for Nvidia and ReLive for AMD.
You can see the exact recording and streaming data over at GamersNexus. As evident from the table shown in the linked article, Shadowplay seems to have the upper hand in terms of video quality when it comes to both recording and streaming, as it supports higher bitrates. Other than that, they are on fairly even terms, as they can both only record and stream at either 30 or 60 FPS.
Vertical Synchronization Substitute – Nvidia G-Sync vs AMD FreeSync
While V-Sync is great for 60Hz monitors, it simply won't do once refresh rates get higher. V-Sync prevents screen tearing by capping the number of frames the GPU dishes out, but certain problems arise when we go beyond 60 Hz. For one, there is the FPS cap itself, but stuttering and input lag are bigger problems that you definitely don't want to deal with if you've invested in a 144Hz or a 240Hz monitor.
Now, Nvidia and AMD have both come up with their own hardware-reliant adaptive sync alternatives. With adaptive sync, the refresh rate of the monitor is adapted to the framerate, so the two are always in sync, there is no screen tearing, and there is no input lag. However, there’s a downside to everything, and FreeSync and G-Sync are no exception.
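The frame-pacing difference can be sketched with some back-of-the-envelope timing. This is a toy model assuming a 60 Hz panel, not how drivers actually work, and the function names are purely illustrative:

```python
# With V-Sync on a fixed 60 Hz display, a frame that misses the ~16.7 ms
# refresh deadline must wait for the NEXT refresh, so it appears at ~33.3 ms
# (a visible stutter). With adaptive sync, the display refreshes the moment
# the frame is ready (within the panel's supported range).

FIXED_REFRESH_MS = 1000 / 60  # one refresh interval at 60 Hz

def vsync_display_time(render_ms: float) -> float:
    """Frame is held until the next fixed refresh boundary."""
    intervals = -(-render_ms // FIXED_REFRESH_MS)  # ceiling division
    return intervals * FIXED_REFRESH_MS

def adaptive_sync_display_time(render_ms: float) -> float:
    """Display refreshes as soon as the frame is done."""
    return render_ms

for render_ms in (12.0, 18.0):  # one fast frame, one slightly late frame
    print(f"render {render_ms:5.1f} ms -> "
          f"V-Sync shows it at {vsync_display_time(render_ms):5.1f} ms, "
          f"adaptive sync at {adaptive_sync_display_time(render_ms):5.1f} ms")
```

The 18 ms frame misses the 16.7 ms deadline and gets delayed to roughly 33.3 ms under V-Sync, doubling its effective frame time, while adaptive sync shows it with no added wait. That delay-then-catch-up pattern is exactly the stutter and input lag described above.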
First of all, FreeSync is only compatible with AMD graphics cards and G-Sync is only compatible with Nvidia graphics cards. However, these two technologies aren’t reliant solely on the GPU but also on the monitor.
Now, in order to be compatible with either of these technologies, a monitor needs a built-in scaler module. When it comes to G-Sync, these are proprietary Nvidia modules, so because the OEMs have to pay licensing fees to Nvidia to implement this technology, G-Sync monitors tend to be on the pricey side. AMD takes a more liberal approach, as FreeSync can work with any third-party scaler module. As such, it can be found in monitors at virtually any price point.
But of course, Nvidia’s strict control ensures that G-Sync is properly implemented in every G-Sync monitor, and it goes beyond mere adaptive sync – it adds other handy features such as motion blur reduction, the elimination of ghosting, etc. In contrast, the implementation of FreeSync isn’t always flawless, and many FreeSync monitors only support this technology in a framerate range that’s specified by the manufacturer.
With all of that in mind, FreeSync is obviously the better choice for those on a tighter budget, although G-Sync is objectively superior if we take the pricing out of the equation.
So, all things considered, which is better, Nvidia or AMD?
The answer is: neither. In truth, it all comes down to your requirements and your budget, as both Nvidia and AMD graphics cards are great at what they do.
The bottom line is, AMD is still a better choice for low-end and mid-range gaming setups, as it has been for a while now. Radeon cards simply present much better value for your money in this range. On the other hand, if you are ambitious and aiming for high framerates in QHD or even 4K, then Nvidia is the only real choice.
But of course, as already mentioned in the article, things might very well change soon.