If you’re building a new gaming PC, then the graphics card is definitely the most important component to focus on. After all, it does most of the heavy lifting when it comes to in-game graphics, so it’s clear why it should be at the top of your priority list.
But as is always the case with computer hardware, there are quite a few factors to keep in mind if you want to find the best graphics card for your needs, from performance to compatibility.
So, in this guide, we’ll be addressing the common questions that those who are new to PC building might have in this department, and if you believe you count among that crowd, read on!
GPU vs Graphics Card – What’s The Difference?
First and foremost, you might have noticed that the terms “GPU” and “graphics card” are often used seemingly interchangeably, but they are not the same thing.
And what is the difference?
Well, GPU stands for “graphics processing unit”, and it refers specifically to the graphics chip itself. At the moment, Nvidia and AMD are the two leading suppliers of GPUs.
The graphics card, however, doesn’t refer only to the chip. Rather, it also includes the PCB, the memory, the cooler, exterior design elements, and connectors.
Now, when buying a graphics card, there is always the option of buying one directly from Nvidia or AMD. However, most graphics cards are made by partner companies such as Asus, Gigabyte, MSI, EVGA, Sapphire, and others.
That said, while a graphics card manufacturer can alter virtually any aspect of a graphics card, they cannot modify the GPU itself, meaning that an RTX 3070 is an RTX 3070, no matter what PCB the manufacturer installs it on or what sort of cooling solution they set it up with.
And now that we have made this distinction, it’s time to get to the main subject, and that is finding the ideal graphics card.
Finding The Right GPU
When trying to find the right GPU for your needs, there are two key questions that you need to answer: what resolution do you want to play games in and what framerate are you hoping to achieve?
The resolution indicates the number of pixels displayed on-screen, and the more pixels there are, the more realistic and detailed the game will appear. In 2023, 1080p (FHD), 1440p (QHD), and 2160p (4K) are all popular resolutions for gaming, but playing games at a higher resolution requires more processing power. As such, the higher the resolution, the lower the framerate will be.
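To put that jump in concrete terms, here's a quick back-of-the-envelope calculation (a rough sketch, ignoring everything except raw pixel counts) of how many pixels the GPU has to render for each frame at each resolution:

```python
# Rough per-frame pixel counts for common gaming resolutions.
resolutions = {
    "1080p (FHD)": (1920, 1080),
    "1440p (QHD)": (2560, 1440),
    "2160p (4K)": (3840, 2160),
}

base = 1920 * 1080  # use 1080p as the baseline workload
for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the 1080p workload)")
```

So 1440p is roughly 1.8 times the pixel workload of 1080p, and 4K is a full 4 times, which is why the framerate takes such a hit as the resolution climbs.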
Generally speaking, the latest mid-range GPUs can now handle QHD at 60+ FPS and even 4K at 30-60 FPS. However, performance can vary wildly from game to game depending on how demanding and how optimized it is. On the bright side, you’re only a quick Google search away from discovering what sort of performance you can expect from a particular GPU in a specific game.
Speaking of the framerate, i.e., frames per second (FPS), it indicates how many frames the graphics card can render and output to the display each second, as the name implies. A higher framerate means a more fluid and responsive gaming experience, so it’s easy to see why some gamers (especially those focusing on competitive multiplayer) place performance first and graphics fidelity second.
So, while the average gamer would likely be more than happy with a stable 60 FPS, some inevitably want to aim even higher, which is where monitors with high refresh rates come in. Essentially, the average monitor or TV can usually display a maximum of 60 frames-per-second, so if you wanted more, you’d have to invest in a 144 Hz or a 240 Hz gaming monitor.
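The responsiveness angle is easier to see if you look at the time each frame stays on screen, which is simply 1000 ms divided by the framerate. A quick sketch:

```python
# Time budget per frame at common framerate targets.
# A new frame every 1000 / FPS milliseconds.
for fps in (30, 60, 144, 240):
    frame_time_ms = 1000 / fps
    print(f"{fps} FPS -> {frame_time_ms:.2f} ms per frame")
```

At 60 FPS a new frame arrives roughly every 16.7 ms, while at 240 FPS it's closer to 4.2 ms, which is why high framerates feel so much more responsive in fast-paced games.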
For a more detailed look at monitors, we suggest checking out our monitor buying guide, where we answer more questions on that subject and suggest some very good picks if you’re also thinking about getting a monitor soon.
Speaking of monitors, there’s also the question of VRR.
Much like the refresh rate, VRR (variable refresh rate) is a feature of the monitor, so what is it, and what purpose does it serve?
High-refresh-rate gaming monitors typically utilize VRR technology in order to keep the gaming experience smooth, without any unsightly screen tearing or jarring stuttering. Today, you’ll find gaming monitors equipped with either AMD FreeSync or Nvidia G-Sync.
These technologies were developed by AMD and Nvidia respectively, and they both have their pros and cons. If you want, you can read the full article on that subject here, but here’s the gist of it:
- FreeSync is royalty-free and cheaper to implement, so you can find it in a wider range of monitors, including some remarkably affordable ones. However, each FreeSync monitor only supports adaptive sync within a limited frequency range, and the supported ranges are listed on AMD’s site.
- G-Sync, unlike FreeSync, is a proprietary Nvidia technology, so it’s more expensive to implement, meaning that G-Sync monitors tend to be a fair bit pricier than their FreeSync counterparts. However, they make up for this with more stable performance and some additional features.
Then, there’s also the question of compatibility. Up until recently, FreeSync was compatible only with AMD GPUs while G-Sync was compatible only with Nvidia GPUs. Today, the situation is a bit more complicated, as some FreeSync monitors are now certified as “G-Sync Compatible”, meaning that they can support adaptive sync when used with an Nvidia GPU but not access the full G-Sync feature set.
At the moment, AMD GPUs are not compatible with G-Sync, but that could change soon, as Nvidia will reportedly be making G-Sync more accessible for consumers who don’t own Nvidia GPUs.
Real-Time Ray Tracing
One of the major new features that you’ll come across in every discussion about graphics today is real-time ray tracing. And what does this feature offer?
Essentially, it traces the paths of individual in-game rays of light in order to create extremely realistic reflections and lighting. Up until recently, it was a feature exclusive to Nvidia RTX GPUs and it had some big downsides that made it feel more like a gimmick than a major feature. Mainly, it was very demanding and could lead to massive FPS drops, not to mention that not that many games implemented it well.
Now, AMD has added ray tracing to their arsenal with the new RDNA 2-based RX 6000 GPU lineup, and both the new Xbox and PlayStation consoles use RDNA 2 GPUs as well, so ray tracing is becoming a much more accessible, more mainstream feature.
What this means is that developers are bound to start focusing on ray tracing more, but Nvidia still has an edge when it comes to ray tracing performance thanks to DLSS.
Finding The Right Graphics Card
Once you’ve settled on which GPU you’re going to get, the next step is to pick the actual card. Now, as mentioned before, while Nvidia and AMD are the ones manufacturing the chips, i.e., the GPUs themselves, their partners are the ones manufacturing and selling the bulk of the graphics cards that you can buy today.
And in this section, we’ll go over all the factors that can vary from card to card.
First up, there’s the design and the aesthetics of the card. Granted, if you’re a function-over-form kind of gamer, you won’t care about this at all. However, with cases featuring transparent side panels rising in popularity, it’s obvious why hardware manufacturers place more stock in aesthetics than they did before.
When we’re talking about graphics card design, by that we are usually referring to the design of the shroud, the design of the backplate (if the card has one), and RGB lighting.
As you might expect, manufacturers want to make their products feel distinct from the competition, so different graphics cards can come with different design elements that accomplish precisely this.
In the past, companies often included colored highlights that functioned as something of a manufacturer signature, e.g., Gigabyte cards had orange highlights and Zotac cards had yellow ones. In 2023, however, manufacturers usually opt for a neutral overall design in order to ensure that their cards can fit in aesthetically with as many PCs as possible.
Of course, that doesn’t mean that color is completely gone. Instead, it is relegated to RGB, which is far superior to static decals. The customizability of RGB lighting allows the card to fit in with virtually any setup, so it’s easy to understand why it’s such a popular feature not only among graphics cards but also other components and gaming accessories.
In addition to RGB, backplates have also become a fairly common part of modern graphics cards’ design, so what purpose do they serve?
For the most part, backplates help support the card, i.e., they prevent the PCB from bending under the weight of the cooler, plus they can also make the back of the card easier to clean. However, the aesthetics are the main appeal behind backplates, as they are obviously much more pleasant to look at than a messy PCB.
Moreover, some backplates come complete with thermal pads that allow them to function as heat spreaders, but most backplates don’t really make much of a difference when it comes to cooling.
Something that you’ll also notice when shopping for a graphics card is that they are not all the same size. Some seem quite small while others look like massive bricks that make standard-sized graphics cards seem diminutive in comparison.
So, what’s the deal with that?
Well, as we’ve already established, the manufacturer is the one designing the PCB, and there are several good reasons why “mini” graphics cards are a thing.
First, they tend to be slightly cheaper since they require less material to manufacture, and second, they can fit inside small form factor cases and external GPU enclosures.
However, smaller graphics cards come with a caveat—their cooling is less efficient.
Naturally, due to their small size, compact graphics cards are equipped with smaller heatsinks and are commonly cooled by only a single fan. This results in higher temperatures, more noise, and limited overclocking potential.
In contrast, larger cards are usually bulkier because they come with a larger heatsink that improves cooling efficiency and allows the manufacturer to install a dual or even triple-fan cooler, but more on that below.
All in all, a compact graphics card might be a good buy if you’re pinching pennies or if you’re simply looking for a graphics card to install in a Mini ITX case or use as an external GPU. On the other hand, a larger card will likely have better cooling and will generate less noise, although you’d need to make sure it can actually fit your case.
Now that we’ve touched upon the cooling, let’s take a closer look at it, shall we?
In short, modern graphics cards utilize one of three main types of cooling: open-air, blower, and liquid.
Open-air cooling is by far the most common, and it is the best fit for most gaming PCs. Graphics cards cooled in this manner feature an open heatsink and anywhere from one to three fans that push air through it.
As mentioned above, a single fan can be enough to keep a graphics card running at acceptable temperatures but it ultimately leads to more noise generation. As such, dual and triple-fan cards are usually a better choice unless you’re specifically looking for a compact card or are really pinching pennies and are willing to settle for the cheapest one that you can get.
Blower-cooled graphics cards are less common, and there’s a reason for that. These cards have a closed heatsink and rely on a single blower fan that sucks cool air in and blows the hot air straight out of the case.
This helps prevent heat buildup inside the case, which can be good for smaller cases or workstations with multiple GPUs, but blower-cooled cards tend to run both significantly louder and significantly hotter than those equipped with an open-air cooler, so it’s understandable why they’re not as popular.
Liquid coolers, much like blowers, are far from mainstream, albeit for a different reason. While they can offer unprecedented cooling efficiency—potentially with lower noise generation, too—they are prohibitively expensive and are only really worth using with high-end GPUs if you intend on overclocking the card and really want to push the hardware to its limits.
Now that we’ve touched on the subject of overclocking, we should discuss it in greater detail as well. So, how important is overclocking when it comes to GPUs?
Well, the answer is: not very, at least not for the average gamer.
The reason for this is simple: with most cards, it’s just not possible to squeeze any significant amount of extra performance out of the GPU this way.

While high-end graphics cards equipped with quality liquid cooling can yield noticeable performance improvements, the average air-cooled graphics card’s clock speed can only be pushed so far, usually by about 5-10%. As you might expect, the performance increase this nets you is quite negligible, especially with budget and mid-range GPUs.
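To put some rough numbers on that, here's a best-case sketch (the clock and framerate figures are hypothetical, and real gains are usually smaller, since memory speed and other factors also bottleneck performance):

```python
# Best case: performance scales linearly with the core clock.
# All figures below are hypothetical, for illustration only.
base_clock_mhz = 1800   # hypothetical stock boost clock
overclock_pct = 7       # a typical ~5-10% air-cooled overclock
base_fps = 60           # hypothetical framerate at stock settings

oc_clock_mhz = base_clock_mhz * (1 + overclock_pct / 100)
best_case_fps = base_fps * (1 + overclock_pct / 100)
print(f"Clock: {base_clock_mhz} MHz -> {oc_clock_mhz:.0f} MHz")
print(f"Framerate: {base_fps} FPS -> at most ~{best_case_fps:.1f} FPS")
```

In other words, even if performance scaled perfectly with the clock speed, a 7% overclock would turn 60 FPS into roughly 64 FPS at best, which is hardly a game-changing difference.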
Now, knowing this, you might ask yourself: why even worry about whether the card has a good cooler?
Well, the answer to that is simple as well: lower noise generation, lower load temperatures, and better longevity are all reasons why you might want to get a card with a better cooler, and it is often a good idea to do so.
As mentioned before, every graphics card comes with its own VRAM, but what should you keep in mind here?
Well, video memory is a fairly simple matter at the moment, as capacity is the only thing that you really need to concern yourself with. Most modern graphics cards come with 4, 6, or 8 GB of VRAM, and while 4 GB remains viable for 1080p gaming, 6 or 8 GB are a must if you intend on playing with high-resolution textures or at higher resolutions, i.e., 1440p or 4K.
In 2023, the latest mainstream graphics cards come with GDDR6 memory, which offers roughly twice the data transfer rate of GDDR5. Getting an older GDDR5 model can save you some cash if you’re really pinching pennies, but it’s probably not a good idea if you want a more future-proof build and/or you intend on gaming at a high resolution.
We’ve mentioned that the manufacturers are the ones deciding which ports the graphics card will have and how many, so what should you pay attention to in this department?
There are three main connectors that you will encounter in graphics cards today: HDMI, DisplayPort, and Dual Link DVI-D.
DVI is the oldest connector of the bunch, and Dual Link DVI-D is its newest iteration, one that can still be seen in some monitors and graphics cards, as it can support both 1080p and 1440p at a refresh rate of 60 Hz. That said, while it is still viable in 2023, it is far from the ideal choice.
HDMI is by far the most common connector today, found in TVs and monitors alike, and you are likely to encounter both HDMI 1.4 and HDMI 2.0 today. And how do the two differ?
The main differences come down to the supported resolutions and refresh rates: HDMI 1.4 only supports 1080p at 144 Hz, 1440p at 75 Hz, and 4K at up to 30 Hz, whereas HDMI 2.0 can do 1080p at 240 Hz, 1440p at 144 Hz, and 4K at 60 Hz. In addition to that, HDMI 2.0a and HDMI 2.0b also added HDR support, and HDMI 2.0b is the version of the port that you’ll see in newer hardware.
Fortunately, HDMI 2.0 is also backward-compatible with HDMI 1.4, but both the graphics card and the monitor would need to have an HDMI 2.0 port if you wish to take full advantage of the newer technology’s features.
Next, we have DisplayPort, which is a staple of modern gaming monitors, and it is the one you will likely end up using.
As with HDMI, you can encounter two versions of DisplayPort today: DisplayPort 1.2 and DisplayPort 1.4.
The older 1.2 port can support 1080p at an impressive 240 Hz and 1440p at 144 Hz, all the while also supporting 4K at up to 75 Hz. However, the newer 1.4 port takes things even further, as it can push 1440p all the way up to 240 Hz and 4K to an impressive 120 Hz.
It’s not all about high refresh rates, though. DisplayPort is also the only connector that currently supports Nvidia G-Sync, and DisplayPort 1.4 additionally supports HDR, unlike DisplayPort 1.2.
As with HDMI, DisplayPort is also backwards compatible, but both the graphics card and the monitor would need to have the appropriate ports if you wish to take full advantage of the newer version’s full feature set.
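These resolution and refresh rate ceilings for both HDMI and DisplayPort ultimately come down to bandwidth: an uncompressed video signal needs roughly width × height × refresh rate × bits per pixel of data every second. Here's a rough sketch (it ignores blanking intervals and link encoding overhead, so the figures are approximations rather than exact spec limits):

```python
# Approximate uncompressed video data rate for a given display mode.
# Ignores blanking intervals and link encoding overhead.
def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"4K @ 30 Hz:     ~{data_rate_gbps(3840, 2160, 30):.1f} Gbps")
print(f"4K @ 60 Hz:     ~{data_rate_gbps(3840, 2160, 60):.1f} Gbps")
print(f"1440p @ 144 Hz: ~{data_rate_gbps(2560, 1440, 144):.1f} Gbps")
```

Doubling the refresh rate doubles the required data rate, which is why 4K at 60 Hz needs roughly twice the bandwidth of 4K at 30 Hz and simply won't fit through an older connector version.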
Finally, we should also mention USB-C, as some high-end graphics cards now also come with this increasingly popular port included, and it is something to keep in mind if you intend on using a VR headset on your PC.
In any case, both HDMI and DisplayPort are viable for gaming in 2023, but performance-minded gamers, understandably, prefer DisplayPort.
And that covers all the important factors that you should keep in mind when choosing the right graphics card for your needs.
If you’re shopping for one right now, we suggest checking out our complete graphics card buying guide if you’d like to take a look at what we feel are the best gaming GPUs at the moment.