The 2010s haven’t been a good time for AMD. In addition to struggling with their FX series of CPUs that were years behind what Intel was selling, AMD also had trouble competing with Nvidia in the GPU market, as “Team Green” dominated the high-end while AMD could only really keep up in the low-end and the mid-range.

However, AMD finally made a comeback in 2017, starting with the release of their long-awaited Ryzen CPUs that are now a very popular choice for gaming builds. Then, in 2019, AMD launched its 7nm RDNA-based Navi GPUs in an attempt to catch up with Nvidia, too.

In this article, we’ll see just how well AMD manages to compete with Nvidia in 2020 and, ultimately, which company offers better gaming GPUs at the moment.


Performance and Pricing


Naturally, the main question when discussing GPUs is the question of performance. How does a given card score in benchmarks, and what kind of framerates can it manage at different resolutions?

Well, it’s impossible to generalize on this subject, since performance obviously varies wildly from model to model and from one price point to another.

If we look at budget models, AMD used to have the upper hand in this department, as their budget GPUs almost consistently outperformed whatever Nvidia could offer at the same price point. When comparing the latest budget offerings from both companies, Nvidia pulls slightly ahead, though ironically, some of AMD’s older Polaris-based RX 500 cards are still the best picks if you’re pinching pennies, as they offer very good value.

The new RX 5500 XT is a bit disappointing. The 4 GB ($169) and 8 GB ($199) variants of the card are on roughly even terms with Nvidia’s GTX 1650 Super ($159) and cannot quite catch the GTX 1660 Super ($229), which ultimately makes the RX 5500 XT a very unappealing card for those looking to get the best value for their money in this price range.

Meanwhile, in the mid-range, the competition is a bit tighter, and AMD’s beefier Navi models can more than hold their own against both the GTX and the RTX models that Nvidia can offer at the moment.

The RX 5600 XT ($279) performs significantly better than the similarly priced GTX 1660 Ti ($279) while also keeping up with the original RTX 2060 ($349), although the upgraded RTX 2060 Super ($399) usually has the upper hand.

Next, there’s the RX 5700 XT ($399), which can not only outperform the RTX 2060 Super at the same price point, but can also keep up with and even outperform the more expensive RTX 2070 Super ($499) in some games.

However, as we get to the high-end, it’s the same old story: Nvidia pretty much has a monopoly here, as their latest RTX 2070 Super, RTX 2080 Super, and RTX 2080 Ti are currently the most powerful cards on the market. While the RX 5700 XT can compete with the RTX 2070 Super, AMD currently has nothing that would challenge Nvidia’s flagship cards.

In summary, Nvidia has currently taken the lead as far as budget solutions are concerned. However, AMD is offering better competition in the mid-range than ever before, and while they currently have no high-end cards, “Big Navi” is expected to launch sometime in 2020 and will hopefully level the playing field and bring high-end prices down a notch.

Keep in mind that we used MSRP pricing and average performance for reference when commenting on the different GPUs above, as both the performance and the pricing will inevitably vary slightly from model to model and from game to game.
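For readers who want to run their own numbers, a simple FPS-per-dollar calculation is one way to sanity-check the value comparisons above. The MSRPs below come from this article, but the FPS figures are placeholders rather than real benchmark results; substitute averages from a benchmark source you trust.

```python
# FPS-per-dollar as a rough value metric. MSRPs are from this article;
# the avg_fps numbers are PLACEHOLDERS, not real benchmark data.
cards = {
    "RX 5600 XT":  {"msrp": 279, "avg_fps": 100},  # placeholder FPS
    "GTX 1660 Ti": {"msrp": 279, "avg_fps": 90},   # placeholder FPS
    "RTX 2060":    {"msrp": 349, "avg_fps": 100},  # placeholder FPS
}

def fps_per_dollar(card):
    """Higher is better: average framerate divided by launch price."""
    return card["avg_fps"] / card["msrp"]

# Rank the cards from best to worst value.
ranked = sorted(cards, key=lambda name: fps_per_dollar(cards[name]), reverse=True)
```

With these placeholder numbers, two cards at the same price but different framerates separate immediately, which is exactly the situation described for the RX 5600 XT and GTX 1660 Ti above.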

If you’re shopping for a new graphics card right now, you might want to check out our selection of the best graphics cards of 2020. Note that we do our best to keep our buying guides up to date, so if you notice some outdated info, that means the article is likely slated for an update in the near future.

Real-Time Ray Tracing - Is It Worth It?


The most heavily-marketed new feature of the 2018 Turing GPUs was their real-time ray tracing capability, so what is ray tracing, and is it worth it in 2020?

As the name implies, with real-time ray tracing, in-game lighting can be rendered much more realistically as the GPU can trace the path of virtual rays of light and thus more accurately simulate the way that they interact with objects in the environment. Naturally, the benefits of ray tracing are most noticeable when there are a lot of reflective surfaces around.
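To make the idea concrete, here is a toy sketch of the core calculation: tracing a single ray against a single sphere and shading the hit point based on its angle to a light. All of the scene values are made up for illustration; real GPUs trace millions of rays per frame against full scene geometry.

```python
# Toy illustration of the core idea behind ray tracing: follow a ray
# from the camera, test whether it hits an object, then shade the hit
# point based on the light's direction. Scene values are illustrative.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance to the nearest hit, or None if the ray misses."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is unit length, so the quadratic's a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(origin, direction, center, radius, light_dir):
    """Simple diffuse shading: brightness depends on the surface angle to the light."""
    t = ray_sphere_hit(origin, direction, center, radius)
    if t is None:
        return 0.0  # ray missed everything: background
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = normalize(tuple(h - c for h, c in zip(hit, center)))
    return max(0.0, dot(normal, normalize(light_dir)))
```

A real renderer repeats this per pixel and then recursively spawns reflection and shadow rays from each hit point, which is why the reflective-surface scenes mentioned above are where the technique both shines and costs the most.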

Now, there’s no denying that ray tracing is an important step forward when it comes to the ongoing decades-old quest for photorealistic graphics. But at the moment, is it really that big of a deal?

Well, there are several factors that put the brakes on the ray-tracing hype train, and those are:

  1. It is demanding on the hardware. When turned on, ray tracing can deliver a big performance hit, sometimes outright cutting the FPS in half. This performance hit is most noticeable with cheaper RTX models, and it also varies from game to game.
  2. The benefits aren’t always obvious. Sure, tech demos and segments designed to show off ray tracing will look amazing with this feature turned on, but when there are no reflective surfaces around, ray tracing still delivers a noticeable performance hit while offering little to no discernible improvement in terms of visuals.
  3. Relatively few games support it. As of April 2020, only a limited number of games feature ray tracing, and only a handful of those are mainstream AAA titles.

DLSS, or deep learning super sampling, is an AI-powered upscaling and anti-aliasing technique currently available only on RTX GPUs, and it can deliver a good performance boost when ray tracing is turned on. However, much like ray tracing itself, only a few games support DLSS at the moment.

In any case, real-time ray tracing is currently viable only on Nvidia’s RTX GPUs, i.e., only those GPUs can deliver playable framerates with ray tracing turned on.

While AMD’s current GPU lineup doesn’t support ray tracing, their upcoming RDNA 2-based Navi GPUs that will be powering the PlayStation 5 and the Xbox Series X will support it.

In other words, ray tracing won’t be an Nvidia-exclusive feature for long, and seeing as it is of dubious value at the moment, we’d say that it’s not a big selling point for Nvidia’s current lineup, though that’s ultimately a subjective matter.

VRR – AMD FreeSync vs Nvidia G-Sync


While V-Sync is good enough for 60Hz monitors, it’s simply not viable for monitors with high refresh rates, e.g., 120Hz, 144Hz, and beyond.

This is because V-Sync prevents screen tearing by capping the number of frames the GPU dishes out, thus ensuring that the framerate and the monitor’s refresh rate never fall out of sync. However, it comes with drawbacks of its own, such as stuttering and input lag.

Needless to say, if you intend to get a monitor with a high refresh rate, it will inevitably come with one of two technologies: AMD FreeSync or Nvidia G-Sync.

At their core, these technologies function in a similar way – they both use hardware to ensure that the refresh rate of the monitor is always identical to the in-game framerate, so the two can never fall out of sync regardless of how wildly the framerate might vary, thus removing screen tearing without any stuttering or input lag.
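The difference between a fixed refresh rate with V-Sync and a variable refresh rate can be modeled in a few lines. This is a simplified sketch assuming a hypothetical 60Hz panel and an example 48–144Hz VRR range; actual supported ranges vary by monitor.

```python
# Simplified model of why VRR helps. With V-Sync on a fixed 60 Hz panel,
# a frame that misses the ~16.7 ms refresh window must wait for the next
# one, so a 20 ms frame isn't shown until ~33.3 ms (an effective drop to
# 30 FPS, perceived as stutter). With VRR, the monitor refreshes the
# moment the frame is ready, within its supported range.
import math

REFRESH_INTERVAL_MS = 1000.0 / 60.0  # fixed 60 Hz refresh window
VRR_RANGE_MS = (1000.0 / 144.0, 1000.0 / 48.0)  # example 48-144 Hz range

def vsync_display_time(frame_time_ms):
    """The frame appears at the first fixed refresh tick after it finishes rendering."""
    ticks = math.ceil(frame_time_ms / REFRESH_INTERVAL_MS)
    return ticks * REFRESH_INTERVAL_MS

def vrr_display_time(frame_time_ms):
    """The frame appears as soon as it is ready, clamped to the panel's VRR range."""
    low, high = VRR_RANGE_MS
    return min(max(frame_time_ms, low), high)
```

Under this model, a 20 ms frame is delayed to 33.3 ms with V-Sync but displayed at 20 ms with VRR, which is exactly the stutter-free behavior both FreeSync and G-Sync are designed to deliver.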


However, there’s a downside to everything.

Nvidia G-Sync monitors are significantly more expensive, and for a number of reasons. First, Nvidia runs tight quality control, and all G-Sync monitors need to meet their standards before being approved. Moreover, OEMs have to pay licensing fees to use G-Sync, and they have to buy the scaler modules directly from Nvidia, as they are the only ones who make them.

A scaler module is a piece of hardware built into the monitor that makes VRR possible. While Nvidia essentially has a monopoly on G-Sync scalers, AMD allows OEMs to use third-party scaler modules and charges no licensing fees for the technology. As a result, FreeSync is much more popular and readily available in cheaper monitors, but its implementation isn’t always flawless, and some monitors only support it within a limited refresh rate range.


Finally, we have to address the question of compatibility. In the past, FreeSync was only compatible with AMD GPUs, and G-Sync was only compatible with Nvidia GPUs. Now, the situation is a bit different.

Currently, there are a number of G-Sync Compatible monitors out there, i.e., FreeSync monitors that Nvidia has verified to work with its GPUs. G-Sync can also be enabled manually on FreeSync monitors that haven’t passed Nvidia’s tests, but in that case there’s no guarantee that all G-Sync features will work properly. Meanwhile, while G-Sync monitors couldn’t be used with AMD GPUs in the past, this is also starting to change.

At the end of the day, both of these technologies will get the job done, but FreeSync is obviously a budget choice, while G-Sync is the premium one. The proprietary nature of G-Sync still makes it quite expensive, but Nvidia is slowly shifting to a more liberal approach, so who knows what might happen further down the road.


Conclusion

So, all things considered, which company currently offers better GPUs, Nvidia or AMD?

The answer is – neither. Why?

Well, simply because it’s impossible to make generalizations without comparing specific GPU models, as both companies offer different solutions at different price points that can suit the requirements and budget constraints of a wide range of customers.

Competition is definitely much better than it has been over the past few years. AMD used to be the definitive choice for budget and mid-range builds while Nvidia had a monopoly in the high-end, but now things are changing, as Nvidia is offering more competition in the lower price ranges while AMD is also preparing to take Nvidia on in the high-end, as “Big Navi” is set to launch in 2020.

Overall, the two are on fairly even terms now, and the more AMD manages to close the gap and provide adequate competition across the entire price spectrum, the better it will be for consumers’ wallets.
