The 2010s haven’t been a good time for AMD. In addition to struggling with their FX series of CPUs that were years behind what Intel was selling, AMD also had trouble competing with Nvidia in the GPU market, as “Team Green” dominated the high-end while AMD could only really keep up in the low-end and the mid-range.
However, AMD finally made a comeback in 2017, starting with the release of their long-awaited Ryzen CPUs that are now a very popular choice for gaming builds. Then, in 2019, AMD launched its 7nm RDNA-based Navi GPUs in an attempt to catch up with Nvidia, too.
In this article, we’ll see just how well AMD manages to compete with Nvidia in 2020 and, ultimately, which company offers better gaming GPUs at the moment.
Pricing & Performance
Naturally, the main question when discussing GPUs is the question of performance: how does a given card score in benchmarks, and what kind of framerates can it deliver at different resolutions?

Well, it’s impossible to generalize on this subject, since performance obviously varies wildly from model to model and from one price point to another.
If we look at budget models, AMD used to have the upper hand in this department, as their budget GPUs almost consistently outperformed whatever Nvidia could offer at the same price point. When comparing the latest budget offerings from both companies, Nvidia pulls slightly ahead, though ironically, some of AMD’s older Polaris-based RX 500 cards are still the best picks if you’re pinching pennies, as they offer very good value for your money.
The new RX 5500 XT is a bit disappointing. The 4 GB ($169) and 8 GB ($199) variants of the card are on roughly even terms with Nvidia’s cheaper GTX 1650 Super ($159) and can’t quite catch up to the GTX 1660 Super ($229), which ultimately makes the RX 5500 XT a rather unappealing card for those looking to get the best value for their money in this price range.
Meanwhile, in the mid-range, the competition is a bit tighter, and AMD’s beefier Navi models can more than hold their own against both the GTX and the RTX cards that Nvidia is offering at the moment.
The RX 5600 XT ($279) performs significantly better than the similarly-priced GTX 1660 Ti ($279), all the while keeping up with the original RTX 2060 ($349), although the upgraded RTX 2060 Super ($399) does have the upper hand.
Next, there’s the RX 5700 XT ($399), which can not only outperform the RTX 2060 Super at the same price point, but can also keep up with and even outperform the more expensive RTX 2070 Super ($499) in some games.
However, as we get to the high-end, it’s the same old story – Nvidia pretty much has a monopoly here, as their RTX 2070 Super, RTX 2080 Super, and RTX 2080 Ti dominate the market. And while the RX 5700 XT can compete with the RTX 2070 Super and offers very good value for the money, AMD currently has nothing that can challenge Nvidia’s flagship cards.
However, there are some big changes coming up.
AMD’s upcoming RX 6000 GPU lineup will include the so-called “Big Navi“, the RX 6900 XT, which should finally level the playing field in the high-end. However, we still don’t know exactly what kind of performance to expect from this GPU, i.e., which Nvidia GPU it will aim to compete with.
Sure enough, Nvidia is stepping its game up as well, as the new RTX 30 GPUs make for a big upgrade over the RTX 20 ones. For example, the RTX 3080 ($700) is faster than even the RTX 2080 Ti ($1000+), so AMD will be facing even tougher competition by the time Big Navi drops.
Keep in mind that we used MSRP pricing and average performance for reference when commenting on the different GPUs above, as both the performance and the pricing will inevitably vary slightly from model to model and from game to game.
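For the curious, here’s a rough sketch of the kind of value-for-money math behind comparisons like the ones above. The MSRPs are the ones quoted in this article, but the average FPS figures are purely hypothetical placeholders for illustration – they are not benchmark results:

```python
def fps_per_dollar(avg_fps, msrp):
    """Simple value metric: higher means more performance per dollar spent."""
    return avg_fps / msrp

# MSRPs are from this article; the FPS numbers are hypothetical placeholders.
cards = [
    ("RX 5600 XT", 100, 279),
    ("GTX 1660 Ti", 90, 279),
    ("RX 5700 XT", 120, 399),
]

for name, fps, price in cards:
    print(f"{name}: {fps_per_dollar(fps, price):.3f} FPS per dollar")
```

As the sketch shows, a card that costs the same but delivers more frames wins the value comparison, which is exactly why MSRP matters so much in the budget and mid-range segments.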
If you’re shopping for a new graphics card right now, you might want to check out our selection of the best graphics cards of 2020. Note that we do our best to keep our buying guides up to date, so if you notice some outdated info, that means the article is likely slated for an update in the near future.
Real-Time Ray Tracing – Is It Worth It?
The most heavily-marketed new feature of the 2018 Turing GPUs was their real-time ray tracing capability. So, what is real-time ray tracing, and is it worth it in 2020?
As the name implies, with real-time ray tracing, in-game lighting can be rendered much more realistically as the GPU traces the paths of virtual rays of light and thus more accurately simulates the way that they interact with objects in the environment. Naturally, the benefits of ray tracing are most noticeable when there are a lot of reflective surfaces around.
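For the technically curious, the core operation can be illustrated in a few lines of code. The sketch below is a bare-bones ray-sphere intersection test – the fundamental building block of any ray tracer – and is purely illustrative; it has nothing to do with how RTX hardware actually implements ray tracing:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    which reduces to a quadratic equation in t.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    lx, ly, lz = ox - center[0], oy - center[1], oz - center[2]
    a = dx * dx + dy * dy + dz * dz
    b = 2 * (dx * lx + dy * ly + dz * lz)
    c = lx * lx + ly * ly + lz * lz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearest of the two solutions
    return t if t >= 0 else None

# A camera ray fired straight down the z-axis toward a sphere at z = 5
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0
```

A real renderer fires millions of such rays per frame and bounces them off reflective surfaces, which is precisely why the technique is so demanding on hardware.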
Now, there’s no denying that ray tracing is an important step forward when it comes to the ongoing decades-old quest for photorealistic graphics. But at the moment, is it really that big of a deal?
Well, there are several factors that put the brakes on the ray-tracing hype train, and those are:
- It is demanding on the hardware. When turned on, ray tracing can deliver a big performance hit, sometimes outright cutting the FPS in half. This performance hit is most noticeable with cheaper RTX models, and it also varies from game to game.
- The benefits aren’t always obvious. Sure, tech demos and segments designed to show off ray tracing will look amazing with this feature turned on, but when there are no reflective surfaces around, ray tracing still delivers a noticeable performance hit all the while offering little to no discernible improvement in terms of visuals.
- Relatively few games support it. As of September 2020, the list of games with ray tracing support is fairly short, and only a handful of them are mainstream AAA titles.
However, ray tracing has been an Nvidia-exclusive feature for the past two years. Now that AMD is also introducing ray tracing with their new RDNA2 GPUs, it is bound to become a much more accessible feature.
As a matter of fact, both the PlayStation 5 and the Xbox Series X will support ray tracing, as they feature custom RDNA2-based GPUs. And since even the consoles will support it, that means ray tracing will become a mainstream thing.
And what that means, in turn, is that more and more developers will take ray tracing seriously and will attempt to implement it in their games.
VRR – AMD FreeSync vs Nvidia G-Sync
While V-Sync is good enough for 60Hz monitors, it’s simply not viable for monitors with high refresh rates e.g., 120 Hz, 144 Hz, 240 Hz, and others.
This is because V-Sync prevents screen tearing by imposing a cap on the number of frames that the GPU dishes out, thus ensuring that the framerate and the monitor’s refresh rate never fall out of sync. However, it comes with its share of drawbacks such as stuttering and input lag.
This is where variable refresh rate (VRR) technologies come in: AMD’s FreeSync and Nvidia’s G-Sync. At their core, both function in a similar way – they use dedicated hardware to dynamically match the monitor’s refresh rate to the GPU’s framerate, so the two can never fall out of sync no matter how wildly the framerate varies, thus removing screen tearing without any stuttering or input lag.
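To make the difference concrete, here’s a deliberately simplified toy model – our own illustration, not how real display hardware works. A fixed-refresh monitor starts a new scanout at rigid intervals regardless of when frames finish, so a frame that completes between two refresh ticks can cause a tear; a VRR monitor instead starts the scanout the moment a frame is ready, so by construction tearing can’t occur:

```python
def count_tears_fixed(frame_times, refresh_hz=60):
    """Count frames that would tear on a fixed-refresh monitor.

    Toy model: a tear is counted whenever a frame finishes rendering
    strictly between two refresh ticks rather than exactly on one.
    A VRR monitor in this model always scores zero, since it delays
    the refresh until the frame is complete.
    """
    interval = 1.0 / refresh_hz
    tears = 0
    t = 0.0
    for ft in frame_times:
        t += ft  # time at which this frame finishes rendering
        ticks = t / interval
        if abs(ticks - round(ticks)) > 1e-9:  # lands mid-scanout
            tears += 1
    return tears

# A steady 60 FPS on a 60 Hz panel: every frame lands on a tick
print(count_tears_fixed([1 / 60] * 10))       # -> 0
# Uneven frame times: frames finish mid-scanout, causing tears
print(count_tears_fixed([1 / 70, 1 / 50] * 5))
```

The toy model also shows why V-Sync’s frame cap helps at exactly 60 FPS but falls apart the moment the framerate fluctuates – which is precisely the problem FreeSync and G-Sync solve.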
However, there’s a downside to everything.
Nvidia G-Sync monitors are notoriously expensive, and for a number of reasons. First, Nvidia runs tight quality control, and all G-Sync monitors need to meet its standards before being approved. Moreover, OEMs have to pay licensing fees to use G-Sync, and they have to buy the scaler modules directly from Nvidia, the sole manufacturer.
A scaler module is the piece of hardware built into the monitor that makes VRR possible. While Nvidia has a monopoly on G-Sync scalers, AMD allows OEMs to use third-party scaler modules and doesn’t charge licensing fees for FreeSync. As a result, FreeSync is much more popular and readily available in cheaper monitors, but its implementation isn’t always flawless, and some monitors only support it within a limited refresh rate range.
Finally, we have to address the question of compatibility. In the past, FreeSync was only compatible with AMD GPUs, and G-Sync was only compatible with Nvidia GPUs. Now, the situation is a bit different.
Currently, there are some G-Sync Compatible monitors out there, i.e., FreeSync monitors that Nvidia has tested and verified to work with its GPUs, although not all G-Sync features are available on them. Meanwhile, while G-Sync monitors couldn’t be used with AMD GPUs in the past, this is also starting to change.
At the end of the day, both of these technologies will get the job done, but FreeSync is obviously a budget choice, while G-Sync is the premium one. The proprietary nature of G-Sync still makes it quite expensive, but Nvidia is slowly shifting to a more liberal approach, so who knows what might happen further down the road.
Conclusion – Nvidia or AMD?

So, all things considered, which company currently offers better GPUs, Nvidia or AMD?
The answer is – neither.
Well, simply because it’s impossible to generalize without comparing specific GPU models, as both companies offer different solutions at different price points to suit the requirements and budget constraints of a wide range of customers. On top of that, the situation can change drastically from year to year.
Competition is much healthier now than it has been over the past few years, that much is certain. AMD used to be the definitive choice for budget and mid-range builds while Nvidia had a monopoly on the high-end, but things are changing: Nvidia is offering more competition in the lower price ranges, while AMD is preparing to take Nvidia on in the high-end.
Overall, the two are on fairly even terms now, and the more AMD manages to close the gap and provide adequate competition across the entire price spectrum, the better it will be for consumers’ wallets.