If you haven’t been keeping up with the recent events in the hardware world or if you’ve only lately become interested in PC building and are thinking about getting an Nvidia graphics card, then you’ve undoubtedly noticed that the aforementioned company offers two seemingly different types of GPUs: GTX and RTX.
So, what’s the deal with that, what are the differences between GTX and RTX GPUs, and which should you pick? We’ll go over all of those questions in this article, so read on!
All of Nvidia’s gaming-oriented GPUs belong to their GeForce brand, which goes back to 1999 and the original GeForce 256. Since then, the company has released dozens and dozens of different GPUs over the years, culminating in what are currently the two latest GPU lineups: the GeForce 20 series that launched in 2018 and the GeForce 16 series that launched in 2019.
Now, the GeForce 20 series is comprised exclusively of RTX GPUs, and the GeForce 16 series is comprised solely of GTX GPUs. So, what do those letters mean?
Well, neither RTX nor GTX is an abbreviation, and neither has a specific meaning per se. Instead, they are simply there for the sake of branding.
Nvidia has been using several similar two-letter and three-letter designations to provide a general indication of the kind of performance a GPU can offer. For example, they have used designations such as GT, GTS, and GTX, along with many others over the years, though only GTX and the new RTX designation have “survived” into the present day, at least for the time being.
As mentioned above, all of the latest 20 series and 16 series GPUs come with RTX and GTX designations, respectively, but why is the series number going down instead of up? Moreover, does this imply that GTX GPUs aren’t as good as their RTX counterparts?
GeForce 20 vs. GeForce 16
First and foremost, we should note that the 20 series and the 16 series, i.e., the latest RTX and the latest GTX GPUs, are both based on the same Turing GPU microarchitecture that Nvidia originally introduced in 2018.
However, despite both series being based on the same architecture, the 20 series came first. And when it launched in 2018, Nvidia wanted to focus on the advanced features that the new architecture could offer. The lineup was comprised of upper mid-range and high-end GPUs that could demonstrate said features, and these were the first GPUs to come with the newly-introduced RTX designation.
Meanwhile, the 16 series came a year later because Nvidia also needed to offer some more budget-friendly solutions for those who couldn’t afford to spend $400 or more on a graphics card. These GPUs, however, came without the aforementioned advanced features, and so they retained the old GTX designation.
That said, GTX GPUs are indeed currently weaker than RTX GPUs, but that is by design. The new RTX designation was introduced mainly for the sake of marketing, in order to make the new GPUs come across as a big step forward, as something truly new. The designation itself was inspired by the most marketable new feature introduced with the GeForce 20 series: real-time ray tracing.
Now, real-time ray tracing is made possible by RT Cores that are currently found only in the 20 series and are absent from the 16 series. On top of that, there are also the Tensor Cores that provide AI acceleration, and when it comes to gaming, these enhance ray tracing performance and enable Deep Learning Super Sampling.
If we take those two key characteristics out of the picture, the 16 series GTX GPUs and the 20 series RTX GPUs aren’t really that different. Obviously, the more expensive RTX GPUs come with more transistors, more cores, better memory, and more, all of which allows them to offer better overall performance than their cheaper GTX counterparts. However, they don’t necessarily offer better value for your money.
With that out of the way, what are these new features and are they worth getting an RTX GPU for?
What Are RT Cores?
As mentioned above, RT cores, short for ray tracing cores, are GPU cores dedicated solely to real-time ray tracing.
What ray tracing does for video game graphics is it allows for much more realistic lighting and reflections. As the name suggests, this is achieved by tracing the paths of virtual rays of light, which allows the GPU to run a much more realistic simulation of how light interacts with the environment.
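The core idea described above – tracing a ray, finding what it hits, and shading that point based on the light – can be sketched in a few lines of code. This is a hypothetical, minimal illustration in Python (the function names and the single-sphere scene are our own invention, not anything from Nvidia); a real ray tracer fires millions of such rays per frame, which is exactly why dedicated hardware matters.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c  # direction is assumed normalized, so a == 1
    if disc < 0:
        return None  # the ray never touches the sphere
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def shade(origin, direction, center, radius, light_dir):
    """Trace one ray and return a brightness value between 0 and 1."""
    t = intersect_sphere(origin, direction, center, radius)
    if t is None:
        return 0.0  # ray missed everything: background
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    # Lambert's cosine law: surfaces facing the light appear brighter
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

# A ray shot straight down the z-axis at a sphere 5 units away, lit head-on:
brightness = shade((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0, (0, 0, -1))
print(round(brightness, 2))  # → 1.0, the surface point directly faces the light
```

Even this toy version has to solve a quadratic equation per ray; scaling that to reflections, shadows, and millions of pixels at 60 FPS is the workload RT cores accelerate.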
Ray tracing is still possible even on GPUs with no RT cores, but in that case, the performance is nothing short of abysmal, even on older flagship GPUs such as the GTX 1080 Ti.
And speaking of performance, real-time ray tracing actually delivers a big performance hit even when used with RTX GPUs, which inevitably leads to the question – is ray tracing even worth it?
As of April 2020, there are just over twenty titles that support ray tracing, and only a few of them are new AAA games.
Looking at how ray tracing performs in Control, the graphics enhancements it provides are minor, and you probably wouldn’t even notice them unless you were specifically looking for them. Meanwhile, it cuts the FPS in half, from a stable 60 down to 30, and that’s with the high-end RTX 2070 Super.
While the framerate drop is not as drastic in every game, the above is generally true – ray tracing currently offers very limited visual enhancements, all the while delivering a big performance hit.
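To put that performance hit in concrete terms, framerate translates directly into a per-frame time budget, so halving FPS doubles the time the GPU spends on every single frame. A quick sketch of the arithmetic:

```python
def frame_time_ms(fps):
    """Convert frames per second into milliseconds spent per frame."""
    return 1000 / fps

# The drop quoted above: a stable 60 FPS falling to 30 with ray tracing on.
for fps in (60, 30):
    print(f"{fps} FPS -> {frame_time_ms(fps):.1f} ms per frame")
# 60 FPS -> 16.7 ms per frame
# 30 FPS -> 33.3 ms per frame
```

In other words, the ray-traced frame takes roughly an extra 16.7 ms to render, which is an entire non-ray-traced frame’s worth of work.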
Now, we don’t mean to say that real-time ray tracing is a gimmick, even though it might seem that way right now. Quite the contrary, it is an important advancement that will significantly enhance video game graphics in the years to come. Still, at the moment, the hardware just isn’t powerful enough, and developers aren’t making full use of the feature yet.
What Are Tensor Cores?
Despite ray tracing being the most heavily-marketed feature of the 20 series RTX GPUs, the Turing architecture also introduced another major new feature to the mainstream GeForce lineup – enhanced deep learning capabilities made possible with the help of specialized Tensor Cores.
These cores were originally introduced in 2017 in Nvidia’s Volta GPUs, but no gaming GPUs were based on this architecture. The Tensor cores present in the Turing GPUs are actually second-generation Tensor cores.
Now, when it comes to gaming, deep learning currently has one main application: deep learning super sampling, or DLSS for short, which is a brand-new anti-aliasing method.
So, how exactly does DLSS work, and is it better than conventional anti-aliasing methods?
What DLSS does is use deep learning models to generate detail and upscale the image to a higher resolution, thus making it sharper and reducing aliasing in the process. The aforementioned deep learning models are trained on Nvidia’s supercomputers and are then executed by your GPU’s Tensor cores.
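The upscaling half of that process can be illustrated with a deliberately naive stand-in. In the hypothetical sketch below, a 2x nearest-neighbor upscale simply repeats pixels; the point of DLSS is that its trained model replaces this dumb interpolation and instead predicts the missing detail. (The function and the tiny 2x2 "frame" are our own illustration, not Nvidia’s actual pipeline.)

```python
def upscale_2x(image):
    """Double the width and height of a 2D grid of pixel values."""
    out = []
    for row in image:
        wide = [p for p in row for _ in range(2)]  # duplicate each pixel
        out.append(wide)
        out.append(list(wide))  # duplicate each row
    return out

low_res = [[0, 255],
           [255, 0]]  # a tiny 2x2 grayscale checkerboard "frame"
for row in upscale_2x(low_res):
    print(row)
```

A naive upscale like this produces a blocky, soft image; DLSS earns its performance win by rendering fewer pixels and letting the Tensor cores reconstruct a result that looks close to native resolution.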
Not only does DLSS make for a crisper image, but it is also less hardware-intensive than most other anti-aliasing methods. Moreover, it can noticeably improve performance when ray tracing is turned on, which is a good thing considering just how big of a performance hit ray tracing can deliver.
However, much like ray tracing, DLSS is currently supported in lamentably few games – even fewer, in fact, than support real-time ray tracing.
So, to summarize, the RTX designation was introduced by Nvidia mainly for the sake of marketing, making their 20 series Turing GPUs appear like a bigger upgrade than they actually are.
Granted, they do introduce two major new elements that are going to be essential in upcoming years, but as far as raw performance is concerned, they aren’t too far ahead of older Pascal-based GTX GPUs that came with similar price tags attached.
With all of the above in mind, we wouldn’t say that RTX GPUs are worth getting for the ray tracing and the DLSS alone – performance-per-dollar should always come first, especially if you want to get the best value for your money.
If you’re shopping for a new graphics card, then you might want to check out this article where we list some of the best graphics cards that you can get at the moment.