We know many of you are already familiar with monitor jargon, but we still get questions about these terms from time to time. You’ve seen these words in our reviews, so a better understanding will help you discern which display product is best for your needs. Monitors, especially those made for gaming and professional use, aren’t cheap, so it’s of utmost importance to understand their characteristics.
Image Quality and What’s Involved
Image quality and color quality are two of the most common monitor jargon terms we and other reviewers use to describe how good a monitor’s output is. Color accuracy is heavily factored into this aspect, but it’s also affected by the monitor’s other characteristics, such as its contrast ratio. This mix of specifications will dictate how well a monitor performs in tasks such as editing or whether it’s suitable for competitive gaming.
This also means that the monitor in question will only be considered a product with fantastic image quality if it excels in the majority of the monitor jargon listed below. It’s still a case-by-case basis for each individual since we all have varying visions and tastes. However, the terms listed below and their results in reviews will help you estimate how good a display product is, especially if it’s for a specific use case.
Gamut Coverage
Gamut coverage is the range of colors the panel can represent, compared against an industry standard gamut such as sRGB or DCI-P3. Monitors with wide gamuts are often used for editing or HDR, while office variants or even some E-Sports models have lower numbers. It’s affected by the panel’s bit depth and dithering, with top choices having at least 8-bit+FRC or native 10-bit for full coverage.
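To get a feel for why bit depth is mentioned here, the quick sketch below counts how many colors an RGB panel can address at a given bit depth. It’s back-of-the-envelope arithmetic rather than an actual gamut measurement, and the example depths (6-, 8-, and 10-bit) are just typical panel specs:

```python
# Rough illustration of why bit depth matters:
# each extra bit per channel multiplies the number of displayable shades.
def total_colors(bits_per_channel: int) -> int:
    """Number of distinct colors an RGB panel can address at a given bit depth."""
    shades = 2 ** bits_per_channel   # shades per color channel
    return shades ** 3               # red x green x blue combinations

for depth in (6, 8, 10):
    print(f"{depth}-bit panel: {total_colors(depth):,} colors")
# 6-bit:  262,144 colors (older panels, often paired with FRC dithering)
# 8-bit:  16,777,216 colors
# 10-bit: 1,073,741,824 colors
```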
Size
A monitor’s size is the length of its diagonal from a bottom corner to the upper one on the opposite side. We now have massive monitors of up to 65 inches, but not all of them are ideal for a specific use such as editing or competitive gaming.
Aspect Ratio
Aspect ratio is the proportion of the screen’s width to its height, with 16:9 being the most common by today’s standards. Ultrawides are usually 21:9, but we now have wider, 32:10 and 32:9 screens that will envelop your visual senses.
Resolution
Resolution is a term expressed as two numbers that describe the horizontal and vertical pixel counts of a monitor. The most common are 1920 x 1080 full HD, 2560 x 1440 QHD, and 3840 x 2160 4K. We now have wider monitors with resolutions such as 3440 x 1440, 3840 x 1080, and 5120 x 1440.
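Size, aspect ratio, and resolution combine into pixel density (PPI), which is a handy way to compare how sharp different pairings look. The snippet below is a minimal calculation; the size and resolution pairings are common examples, not recommendations:

```python
import math

def pixels_per_inch(h_pixels: int, v_pixels: int, diagonal_inches: float) -> float:
    """Pixel density: diagonal pixel count divided by the diagonal size in inches."""
    diagonal_pixels = math.hypot(h_pixels, v_pixels)
    return diagonal_pixels / diagonal_inches

# Compare common size/resolution pairings
print(round(pixels_per_inch(1920, 1080, 24), 1))   # ~91.8 PPI: 24" full HD
print(round(pixels_per_inch(2560, 1440, 27), 1))   # ~108.8 PPI: 27" QHD
print(round(pixels_per_inch(3840, 2160, 27), 1))   # ~163.2 PPI: 27" 4K
```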
Refresh Rate
Refresh rate is a number that describes how many times the screen refreshes every second and is expressed in Hz or Hertz. 60Hz displays are common for most uses, but gaming variants often offer 144Hz, 240Hz, and even 360Hz for smoother and seemingly faster visuals.
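Another way to read those numbers is as frame intervals: a refresh rate of X Hz means the screen draws a new frame roughly every 1000/X milliseconds. Here’s a tiny calculation for the rates mentioned above:

```python
# Each refresh rate corresponds to a frame interval of 1000 ms / Hz,
# which is one way to see why higher refresh rates look smoother.
for hz in (60, 144, 240, 360):
    print(f"{hz} Hz -> a new frame roughly every {1000 / hz:.2f} ms")
# 60 Hz  -> ~16.67 ms
# 144 Hz -> ~6.94 ms
# 240 Hz -> ~4.17 ms
# 360 Hz -> ~2.78 ms
```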
Response Time
Response time is how long the panel’s pixels take to transition from one shade to another, which in turn limits how quickly it can draw a clean frame. It’s not the same as input lag, another monitor jargon term that makers don’t mention in their spec sheets. Monitors with fast response times are often blur- or smudge-free, so they are great for all-around gaming.
Color Accuracy
Color accuracy is a screen’s ability to reproduce colors that match the output from the source device through its video signal. It’s usually expressed as an averaged deltaE (dE) score, and an average lower than 2.2 is generally preferred. Color-accurate monitors will show red images as red instead of orangey-red or magenta, so they are preferred by enthusiasts and professionals.
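As an illustration of how that average works, the sketch below takes hypothetical per-patch deltaE readings (the kind a colorimeter sweep produces) and averages them against the rough 2.2 threshold. The patch values are made up; real deltaE figures come from calibration software comparing measured colors to their targets:

```python
# Hypothetical per-patch deltaE readings from a colorimeter sweep;
# real tools report deltaE values computed from measured vs. target colors.
patch_scores = [0.8, 1.4, 2.1, 1.0, 3.2, 0.6, 1.8, 1.2]

average_de = sum(patch_scores) / len(patch_scores)
print(f"Average deltaE: {average_de:.2f}")          # 1.51 in this example
print("Accurate enough" if average_de < 2.2 else "Needs calibration")
```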
Contrast
Contrast, as monitor jargon, is the ratio between a panel’s darkest and brightest output, affecting the depth and saturation of its image. Low-contrast monitors will often show black shades as grayish, while VA panels that excel in this regard can sometimes create “black crush” that eats up the fainter colors or objects on the screen.
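Contrast ratio is simply the measured white luminance divided by the black luminance, which is why it’s written as something like 1000:1. The figures in the sketch below are illustrative, not readings from any specific monitor:

```python
# Contrast ratio is the measured white luminance divided by the black luminance.
# Example figures are illustrative, not from a specific monitor.
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    return white_nits / black_nits

print(f"IPS-like: {contrast_ratio(300, 0.30):.0f}:1")   # ~1000:1
print(f"VA-like:  {contrast_ratio(300, 0.10):.0f}:1")   # ~3000:1
```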
Brightness
Brightness is the amount of illumination a monitor’s backlight or pixels (OLED) can produce and is expressed in cd/m2, or candelas per square meter, a unit of luminance that describes how much light the panel puts out. A good monitor usually offers up to 400 cd/m2, but calibrated monitors only need around 120 cd/m2, while true HDR monitors need a whopping 1000 cd/m2.
Panel Type
The monitor’s panel type dictates what kind of performance it can offer or for which task it is best suited. Each panel type has its strengths and weaknesses, so you can also use this aspect as the basis for your decision. The common panel types are:
TN – TN panels are considered the fastest when it comes to pixel response time, but they suffer from poor viewing angles and limited color reproduction. They are popular for office displays since they are cheaper, but they are also used in competitive scenarios since they are virtually blur-free.
IPS – IPS panels are the most popular now for their crisp colors, clarity, and wider viewing angles. The market has developed Fast IPS panels for gamers, so the line between IPS and TN has been blurred since Fast IPS offers the best of both worlds. The biggest weakness of IPS technology is its susceptibility to backlight bleeding, but it also has lower contrast ratios and longer pixel transition times.
VA – VA panels are considered the middle ground between the two, but their main advantage is a substantially higher contrast ratio for deep blacks and saturation. VA has the slowest pixel response time of the three, so monitor manufacturers often package gaming variants with effective overdrive solutions to counter it.
OLED – OLED monitors are still rare and prohibitively expensive, but they are considered the best when it comes to image quality. OLED, or organic light-emitting diode, technology’s main draw is its ability to light each pixel individually for better contrast and fidelity. Its main fault, aside from the high price, is that it’s prone to burn-in, where static images leave an imprint on the screen that is sometimes permanent.
Connectivity/Inputs
A monitor usually carries various connectors or ports to satisfy modern requirements, with video inputs being the primary ones since they serve its main function. Some monitors only have the basics, while premium variants often include valuable extras for your convenience. Here are the most common input or connector types:
DisplayPort – DisplayPort is widely preferred for PC use since it’s usually faster and has more bandwidth than a comparable HDMI version. It works best for 4K and high refresh rates and is usually the standard that supports Adaptive Sync.
HDMI – HDMI is the most popular connector standard for consumer electronics, so it’s still common on monitors. It transmits both audio and video, making it great for all-around use with all types of host devices such as gaming consoles.
USB – USB ports are now becoming common on monitors so they can accommodate peripherals and allow for better cable management. Some even have fast-charging capabilities, so they are nice to have for a workstation or a streaming setup. Others use the Type-C connector, which can provide high wattages while transmitting and receiving signals for single-cable operation.
3.5mm jack – These round ports are intended for analog audio signals and equipment such as headphones or desktop speakers. Some monitors also have input slots for microphones, but they are usually found on higher-tiered models.
Adaptive Sync/VRR
Adaptive Sync and VRR, or variable refresh rate, pertain to the ability of a monitor to match its refresh rate to the framerate of games. This eliminates tearing and stuttering when your frames drop, giving a smoother appearance with cleaner transitions. AMD calls its implementation FreeSync, while Nvidia calls its version G-Sync.
Both used to be brand-exclusive, with Nvidia locking theirs behind a price premium and a dedicated FPGA controller. However, we now have monitors that work with both brands, so you don’t have to stay loyal to one to enjoy buttery smooth frames.
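A rough way to picture what the monitor does with VRR: instead of ticking along at a fixed 60Hz or 144Hz, the panel waits for each finished frame and refreshes then, as long as the framerate stays inside the range it supports. The sketch below is a simplified model with made-up range values, not any vendor’s actual algorithm:

```python
# A simplified model of variable refresh rate: instead of refreshing on a
# fixed clock, the panel tracks the GPU's framerate, clamped to the range
# the monitor supports (the range values here are illustrative).
def effective_refresh_hz(game_fps: float, vrr_min_hz: float = 48, vrr_max_hz: float = 144) -> float:
    """Refresh rate the panel settles at when tracking the game's framerate."""
    return max(vrr_min_hz, min(game_fps, vrr_max_hz))

for fps in (30, 75, 110, 200):
    print(f"{fps} fps -> panel refreshes at {effective_refresh_hz(fps):.0f} Hz")
# Below the minimum, real monitors typically use frame doubling (LFC);
# above the maximum, the panel simply caps at its top refresh rate.
```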
Input Lag
Input lag is the time or delay between your GPU’s rendered output and its appearance on the monitor’s display. There is no fixed standard for measuring input lag, so reviewers often use dedicated tools and procedures to measure or estimate it. A monitor with 10ms or lower is preferable for gaming, but some users will find higher figures all the way up to 25ms acceptable.
Panel Uniformity
Panel uniformity describes how evenly lit or saturated a screen is, using its hotspot as the basis. The screen is measured with a colorimeter that divides it into quadrants, finds the hotspot, and measures the differences of the other zones against it. A 10% variance won’t be troublesome, but anything beyond that can become annoyingly noticeable, especially in the dark.
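Here’s a minimal sketch of that comparison, using made-up luminance readings rather than data from a real measurement; the 10% threshold mirrors the rule of thumb above:

```python
# Hypothetical luminance readings (cd/m2) from a colorimeter moved across
# a grid of screen zones; the brightest zone serves as the reference hotspot.
zone_readings = [352, 340, 331, 345, 360, 338, 329, 341, 335]

hotspot = max(zone_readings)
for reading in zone_readings:
    deviation = (hotspot - reading) / hotspot * 100
    flag = "OK" if deviation <= 10 else "noticeable"
    print(f"{reading} cd/m2 -> {deviation:.1f}% below hotspot ({flag})")
```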
About the Author: Paolo has been a gaming veteran since the golden days of Doom and Warcraft and has been building gaming systems for family, friends, and colleagues since junior high. High-performance monitors are one of his fixations, and he believes that it’s every citizen’s right to enjoy one. He has gone through several pieces of hardware in pursuit of every bit of performance gain, much to the dismay of his wallet. He now works with Monitornerds to scrutinize the latest gear and create reviews that accentuate the seldom-explained aspects of a PC monitor.