Monitors are better for competitive gaming since they are much more responsive than TVs. However, for those who don’t mind the higher response times and potential aliasing, a TV is just as viable a choice as a monitor, especially for those who want a bigger screen.
Not all displays are made equal. But in the world of gaming, your choices are not limited to just different types of computer monitors. Any gaming platform, be it a PC or a console, can use either a monitor or a TV as a display. You can even use a projector if you’d like.
But which one is ultimately better?
That is not an easy question to answer, so we will be looking at the differences, as well as discussing the advantages and disadvantages of both monitors and television sets.
TV vs Monitor – How Do They Differ?
The most obvious difference between a monitor and a TV is the size of the screen.
Monitors usually range from 19 inches to 27 inches, while modern TVs start at around that desk-friendly size but also go much higher, with some models going over 100 inches. The most popular sizes for TVs, however, are between 30 inches and 60 inches.
The bigger size of TV screens makes games easier to enjoy from your couch and also makes split-screen multiplayer more enjoyable.
Something that ties directly into screen sizes is the display resolution. As you most likely already know, resolution determines how many individual pixels there are on the screen, and the more pixels there are, the sharper and more detailed the image will be.
Today, monitors come with the following resolutions:
- 1080p, or Full HD
- 1440p, or QHD, also commonly referred to as 2K
- 2160p, or UHD, also most commonly known as 4K
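To put those names in perspective, the jump between each step is bigger than it sounds. The short sketch below is just arithmetic over the standard pixel dimensions of each resolution:

```python
# Standard pixel dimensions for the common monitor resolutions above.
resolutions = {
    "1080p (Full HD)": (1920, 1080),
    "1440p (QHD)":     (2560, 1440),
    "2160p (4K UHD)":  (3840, 2160),
}

# Total pixel count for each, in millions.
for name, (w, h) in resolutions.items():
    print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} million pixels")
```

As the numbers show, 4K pushes roughly four times as many pixels as Full HD, which is why it looks so much sharper at the same screen size.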
They stick to the abovementioned size range since 27 inches is pretty much the biggest a monitor can be without becoming too large for viewing up close.
The situation with TVs, on the other hand, is a bit different. They come in 720p (HD Ready), 1080p (Full HD) and 2160p (4K) variants. Not only is there nothing in between Full HD and 4K, but screen size is also a factor. Full HD TVs start at 32 inches, and any smaller TV will almost definitely be 720p, a resolution that is outdated by modern gaming standards.
This is because TVs are meant to be viewed from afar, not up close as monitors are, so using a TV up close as a desktop monitor is a definite no-go. Furthermore, a bigger screen also means lower pixel density, which will inevitably result in aliasing.
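The pixel density point is easy to quantify. Pixel density (PPI, pixels per inch) is the diagonal pixel count divided by the diagonal size in inches. The monitor and TV sizes below are hypothetical but typical examples, not figures from any specific model:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Hypothetical but typical examples: a 27-inch 1440p monitor vs a 55-inch 4K TV.
print(f'27" 1440p monitor: {ppi(2560, 1440, 27):.0f} PPI')
print(f'55" 4K TV:         {ppi(3840, 2160, 55):.0f} PPI')
```

Even though the TV has the higher resolution, its much larger screen leaves it with a noticeably lower pixel density, which is why viewing distance matters so much.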
Response time, or to be more precise, pixel response time, determines how quickly a pixel can change color from black to white or from one shade of gray to another.
What makes it important for gaming is that a low response time allows for smooth camera movement, whereas high response times lead to extensive motion blur and, potentially, ghosting.
With modern monitors, you can generally choose between a 1ms TN monitor or a 4ms IPS monitor, the latter being limited to a higher response time due to the technology used. TVs, conversely, generally use IPS or VA panels and tend to have higher overall response times, since responsiveness is not as important for their intended application: multimedia.
All in all, since they are higher than those of monitors, TV response times are almost never disclosed by manufacturers in order to avoid consumer bias. They tend to be over 10ms, but there is no definite way to know how much motion blur you will be dealing with. The safest route is to stick with renowned brands such as Sony, Samsung, LG, and Philips. And if you are exceptionally worried about motion blur, be sure to see if you can test the TV in person.
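To get a feel for why response time matters, consider how far the on-screen image travels while a pixel is still transitioning. This is a back-of-the-envelope sketch; the pan speed and the list of response times are illustrative assumptions, not measured figures:

```python
# Back-of-the-envelope blur estimate: during a fast camera pan, how far does
# the image move while a pixel is still mid-transition? The pan speed below
# is a hypothetical value for a quick turn in a fast-paced game.
pan_speed_px_per_s = 3000

# Roughly: 1ms TN monitor, 4ms IPS monitor, and two plausible TV figures.
for response_ms in (1, 4, 10, 20):
    smear_px = pan_speed_px_per_s * response_ms / 1000
    print(f"{response_ms:>2} ms response time -> ~{smear_px:.0f} px of smear")
```

Under these assumptions, a 10ms+ TV smears the moving image over tens of pixels where a 1ms monitor smears it over just a few, which is exactly the motion blur described above.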
And finally, we have connectors, which are perhaps where the majority of differences between monitors and TVs lie.
Monitors tend to have both DisplayPort and HDMI connectors for video, with some models also having DVI. In addition to that, most also have audio inputs and outputs in the form of 3.5mm jacks. Some models also have USB ports.
TVs do not use DisplayPort, but stick exclusively to HDMI for video transfer – usually at least two ports. They also have at least one USB port and a 3.5mm headphone jack, and commonly include an optical audio port (TOSLINK) for Hi-Fi speakers and home cinema setups. For TV signals, they have a coaxial connector for cable and antenna connections. Some models may also have analogue SCART, composite, and component inputs. And finally, an Ethernet port is common in smart TVs to enable a wired internet connection.
In the end, you will only be able to use DisplayPort when connecting your PC to your monitor, as neither consoles nor TVs use it, preferring HDMI instead.
The Final Verdict
Ultimately, the biggest problems with gaming on a TV are high response times and, potentially, aliasing.
As mentioned above, a high response time can lead to extensive motion blur during fast camera movements, which may end up making fast-paced games a nauseating experience. However, this won’t be as big of a problem with a high-quality TV, especially if you are not used to super-fast response times anyway.
Aliasing is a perennial problem on consoles, since they often lack the graphics processing power for proper anti-aliasing and are also commonly connected to large TVs. And, as mentioned before, this means lower pixel density and, as such, more aliasing. If you're connecting a solid gaming PC to a TV, however, this shouldn't be as much of a problem when viewing the TV from an adequate distance.
So, let’s summarize!
With that said, remember that a TV can never replace a computer monitor, but neither can a monitor replace a TV.
Objectively, monitors are more responsive and provide a cleaner image due to their higher pixel density. On the other hand, if the higher response times and aliasing don't bother you that much, a TV can suit the purpose just as well.
Samuel is GamingScan’s editor-in-chief. He describes himself as a hardcore gamer & programmer and he enjoys getting more people into gaming and answering people’s questions. He closely follows the latest trends in the gaming industry in order to keep you all up-to-date with the latest news.