The 2010s weren’t a good time for AMD, that’s for sure.
Besides struggling with their FX series of CPUs that were years behind what Intel was selling, AMD also had trouble competing with Nvidia in the GPU market. “Team Green” dominated the high-end and AMD could only keep up in the low-end and mid-range.
AMD only made a comeback in 2017, starting with the release of their long-awaited Ryzen CPUs that are now a very popular choice for gaming builds.
In 2019, AMD launched its 7nm RDNA-based GPUs, which are now being succeeded by the new RDNA 2 models.
In this guide, we’ll see just how well AMD manages to compete with Nvidia in 2023 and, ultimately, which company offers better gaming GPUs at the moment.
Pricing & Performance
Naturally, the main question when discussing GPUs is performance. How does a given card score in benchmarks, and what kind of performance can it deliver at different resolutions?
Well, it’s impossible to generalize on this subject since performance obviously varies wildly depending on the model you get and the money you’re willing to spend on a GPU.
AMD’s budget GPUs used to almost consistently outperform whatever Nvidia could offer at the same price point. At the moment, however, when comparing the latest budget offerings from both companies, Nvidia pulls slightly ahead. Ironically, though, some of AMD’s old Polaris-based RX 500 cards are still the best picks if you’re pinching pennies, as they offer good value for your money.
As for Navi, the RX 5500 XT was a bit disappointing. The 4 GB ($169) and 8 GB ($199) variants of the card were on roughly even terms with Nvidia’s GTX 1650 Super ($159) and could barely come close to the GTX 1660 Super ($229), which ultimately made the RX 5500 XT a very unappealing card for those looking to get the best value for their money in this price range.
However, since the latest-gen RTX 3000 and RX 6000 budget models aren’t out yet, now is not an ideal time to build a budget gaming PC, especially if you want it to be a bit more future-proof.
Meanwhile, in the mid-range, the competition was a bit tighter, and AMD’s beefier RDNA-based GPUs could more than hold their own against both the GTX and the RTX cards that Nvidia was offering at the time.
The RX 5600 XT ($279) performed significantly better than the similarly priced GTX 1660 Ti ($279), all the while keeping up with the original RTX 2060 ($349). However, the upgraded RTX 2060 Super ($399) did have a slight upper hand, plus it also had ray tracing. In turn, the RX 5700 XT ($399) remained a great overall value pick in this range to compete with the noticeably pricier RTX 2070 Super ($499).
However, the new RTX 3060 Ti ($399) blows them all out of the water, as it offers performance comparable to last-gen high-end models such as the RTX 2080 Super. Until AMD comes forward with an RX 6700 XT or something else that can compete at the aforementioned price point, the RTX 3060 Ti will remain the best mid-range card currently on the market.
When it comes to high-end, it used to be the same old story – Nvidia pretty much had a monopoly here, as their RTX 2070 Super, RTX 2080 Super and RTX 2080 Ti dominated the market. While the RX 5700 XT could compete with the RTX 2070 Super and offered very good value for money, AMD had nothing that could challenge Nvidia’s flagship cards.
However, that changed.
AMD’s new RX 6000 series have proven to be more than capable of taking on Nvidia in the high-end. The RX 6800 ($579) outperforms the RTX 3070 ($499) by a noticeable margin in most games, thus potentially justifying the slightly higher price.
Meanwhile, the RX 6800 XT ($649) generally goes toe-to-toe with the pricier RTX 3080 ($699), although Nvidia does have a slight lead here. The AMD GPU does ostensibly offer better value for your money, but if you’re aiming at a high-end GPU in this price range, chances are that a $50 difference won’t matter much.
Finally, we have the best of the best: the RX 6900 XT ($999) and the RTX 3090 ($1499), both of which are very niche products and unlikely to appeal to a broader audience.
The RX 6900 XT pulls slightly ahead of the RTX 3080 in terms of performance, but at a significant price premium. The overly expensive RTX 3090, with its whopping 24 GB of GDDR6X memory, will be more appealing to professional users than to gamers.
Now, keep in mind that we used MSRP pricing and general performance for reference when commenting on the aforementioned GPUs. Performance will inevitably vary from game to game and the pricing will also depend on the model as well as on the market conditions.
If you’re shopping for a new graphics card right now, you might want to check out our selection of the best graphics cards of 2023. Note that we do our best to keep our buying guides up to date, so if you notice some outdated info, that means the article is likely slated for an update in the near future.
Real-Time Ray Tracing – Is It Worth It?
The most heavily marketed new feature of the 2018 Turing GPUs was their real-time ray tracing capability. So, what is real-time ray tracing, and is it worth it in 2023?
As the name implies, with real-time ray tracing, in-game lighting can be rendered much more realistically as the GPU traces the paths of virtual rays of light and thus simulates the way that they interact with objects in the environment more accurately. Naturally, the benefits of ray tracing are most noticeable when there are a lot of reflective surfaces around.
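At its core, a ray tracer fires rays from the camera and solves intersection equations against the scene’s geometry. Here’s a toy Python sketch of the basic math, a single ray-sphere test; this is only an illustration of the principle, nothing like the optimized BVH traversal that dedicated RT hardware actually accelerates:

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.
    `direction` is assumed to be a unit vector, so the quadratic's a == 1."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A ray cast straight down the z-axis at a sphere 5 units away:
hit = intersect_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)   # -> 4.0
miss = intersect_sphere((0, 0, 0), (0, 1, 0), (0, 0, 5), 1.0)  # -> None
```

A real renderer repeats this test millions of times per frame, bouncing rays off surfaces to gather reflections and shadows, which is exactly why the feature is so demanding on hardware.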
For the past two years, ray tracing was an Nvidia-exclusive feature found only in their RTX GPUs, although AMD has now introduced ray tracing support with their latest RX 6000 models.
Now, there’s no denying that ray tracing is an important step forward when it comes to the ongoing decades-old quest for photorealistic graphics. But is it really that big of a deal at the moment?
Well, there are several factors that put the brakes on the ray tracing hype train:
It is demanding on the hardware. When turned on, ray tracing can deliver a big performance hit, sometimes outright cutting the FPS in half. This performance hit is most noticeable with weaker GPUs, and it also varies from game to game.
The benefits aren’t always obvious. Sure, tech demos and segments designed to show off ray tracing will look amazing with this feature turned on, but when there are no reflective surfaces around, ray tracing will offer little to no discernible improvement in terms of visuals.
Not all games support it. As of 2023, more games support ray tracing than before, but it is still far from being a universal feature, even when it comes to big AAA titles.
It’s worth noting that Nvidia GPUs do still have the advantage when it comes to ray tracing performance, especially in games that support DLSS. So, if you really care about ray tracing, Nvidia would probably be a better pick, although the new RX 6000 cards do fare admirably in terms of overall in-game performance and value.
Ray tracing is definitely becoming increasingly popular, now that both the PlayStation 5 and the Xbox Series X support it. But in any case, as we mentioned above, it’s still far from being a mainstream feature.
VRR – AMD FreeSync vs Nvidia G-Sync
While V-Sync can be good enough for 60 Hz monitors, it’s simply not viable for monitors with high refresh rates such as 120 Hz, 144 Hz, or 240 Hz.
This is because V-Sync prevents screen tearing by imposing a cap on the number of frames that the GPU dishes out, thus ensuring that the framerate and the monitor’s refresh rate never fall out of sync. However, it comes with its share of drawbacks, such as stuttering and input lag.
This is where variable refresh rate (VRR) technologies come in: AMD’s FreeSync and Nvidia’s G-Sync. At their core, the two function in a similar way – both rely on hardware in the monitor to ensure that its refresh rate is always the same as the framerate. This prevents the two from ever falling out of sync regardless of how wildly the framerate might fluctuate, thus removing screen tearing without any stuttering or input lag.
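The scheduling difference can be sketched with a toy Python model. With a fixed refresh, a finished frame waits for the next refresh tick; with VRR, the monitor refreshes as soon as the frame is ready, limited only by the panel’s minimum refresh interval. The timings below are illustrative, not measurements from any real panel:

```python
# All times in milliseconds.

def vsync_display_times(ready_ms, refresh_ms):
    """V-Sync: each finished frame waits for the next fixed refresh tick."""
    return [-(-t // refresh_ms) * refresh_ms for t in ready_ms]  # ceil to tick

def vrr_display_times(ready_ms, min_interval_ms):
    """VRR: the monitor refreshes the moment a frame is ready, subject
    only to the panel's minimum interval between refreshes."""
    shown, last = [], -min_interval_ms
    for t in ready_ms:
        last = max(t, last + min_interval_ms)
        shown.append(last)
    return shown

frames = [12, 25, 33, 51]  # completion times of a fluctuating framerate
vsync = vsync_display_times(frames, refresh_ms=10)     # 100 Hz fixed refresh
vrr = vrr_display_times(frames, min_interval_ms=7)

vsync_lag = [d - r for d, r in zip(vsync, frames)]  # added display latency
vrr_lag = [d - r for d, r in zip(vrr, frames)]
```

In this toy run, every V-Sync'd frame sits in a buffer for several milliseconds waiting for a tick, while the VRR frames are shown the instant they finish – which is exactly the latency and stutter advantage described above.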
However, there’s a downside to everything.
Nvidia G-Sync monitors tend to be more expensive for a number of reasons. First, Nvidia runs tight quality control, and all G-Sync monitors need to meet their standards before being approved. Moreover, OEMs have to pay licensing fees to use G-Sync, and they have to buy the scaler modules directly from Nvidia, as they are the only ones who make them.
A scaler module is a piece of hardware built into the monitor that makes VRR possible. While Nvidia essentially has a monopoly on G-Sync scalers, AMD allows OEMs to use third-party scaler modules and they don’t require them to pay licensing fees to use the technology.
As a result, FreeSync is much more popular and readily available in cheaper monitors, but its implementation isn’t always flawless, and some monitors only support it within a limited refresh rate range.
On the bright side, there are now many G-Sync Compatible monitors out there, i.e., monitors that don’t use Nvidia’s scaler modules and haven’t gone through Nvidia’s testing process but still work with G-Sync. However, they lack some of the features you’d get with a certified G-Sync display, such as ultra-low motion blur, overclocking, or variable overdrive.
Compatibility also used to be a big issue in the past as FreeSync only worked with AMD GPUs and G-Sync only worked with Nvidia GPUs. But the situation isn’t as black and white anymore. For instance, Nvidia GPUs now work with FreeSync monitors. AMD GPUs aren’t yet compatible with G-Sync, but this will also change soon.
At the end of the day, both of these technologies will get the job done, but FreeSync is obviously a budget choice, while G-Sync is the premium one. The proprietary nature of G-Sync still makes it quite expensive, but Nvidia is slowly shifting to a more liberal approach. So really, who knows what might happen further down the road.
Conclusion

So, all things considered, which company currently offers better GPUs, Nvidia or AMD?
The answer is neither.
Why? Simply because it’s impossible to make generalizations without comparing specific GPU models. Both companies offer different solutions at different price points that can suit the requirements and budget constraints of a wide range of customers. On top of that, the situation can change drastically from one generation to the next.
Competition is definitely a lot better than it has been over the past few years, that much is certain. AMD used to be the definitive choice for budget and mid-range builds while Nvidia had a monopoly in the high-end. Now things are changing as Nvidia is offering more competition in the lower price ranges while AMD is finally taking Nvidia on in the high-end.
Overall, the two are on fairly even terms now. Ultimately, the more AMD manages to close the gap and provide adequate competition across the entire price spectrum, the better it will be for every consumer’s wallet.