Take a good look at this paragraph. You’re reading it thanks to the magic of a computer display, whether it be LCD, CRT, or even a paper printout. Since the beginning of the digital era, users have needed a way to view the results of the programs they run on a computer–but the manner in which computers have spit out data has changed considerably over the last 70 years. Let’s take a tour.

Blinking Indicator Lights
Cathode-ray tubes first appeared in computers as a form of memory, not as displays (see Williams tubes). It wasn’t long before someone realized that they could use even more CRTs to show the contents of that CRT-based memory (as shown in the two computers on the left). Later, designers adapted radar and oscilloscope CRTs to use as primitive graphical displays (vector only, no color), such as those in the SAGE system and the PDP-1. They were rarely used for text at that time.

The Early Teletype Monitor
Teletypes (even paper-based ones) cost a fortune in 1974–far out of reach of the individual in the do-it-yourself early PC days. Seeking cheaper alternatives, three people (Don Lancaster, Lee Felsenstein, and Steve Wozniak) hit on the same idea at the same time: Why not build a cheap terminal device using an inexpensive CCTV video monitor as a display? It wasn’t long before both Wozniak and Felsenstein built such video terminals into computers (the Apple I and the Sol-20, respectively), creating the first computers with factory video outputs in 1976.

More Composite Monitors
In addition to RF television output, many early home PCs supported composite-video monitors (shown here) for a higher-quality image. (The Commodore 1702 also offered an alternative, higher-quality display through an early S-Video connection.) As the PC revolution got into full swing, computer makers (Apple, Commodore, Radio Shack, TI) began to design and brand video monitors–both monochrome and color–especially for their personal computer systems. Most of those monitors were completely interchangeable.
With video outputs came the ability to use ordinary television sets as computer monitors. Enterprising businesspeople manufactured “RF modulator” boxes for the Apple II that converted composite video into a signal that simulated an over-the-air broadcast–something a TV set could understand. The Atari 800 (1979), like video game consoles of the time, included an RF modulator in the computer itself, and others followed. However, bandwidth constraints limited the useful output to low resolutions, so “serious” computers eschewed TVs for dedicated monitors.
In the 1960s, an alternative display technology emerged that used a charged gas trapped between two glass plates. When a charge was applied across the sheets in certain locations, a glowing pattern emerged. One of the earliest computer devices to use a plasma display was the PLATO IV terminal. Later, companies such as IBM and GRiD experimented with the relatively thin, lightweight displays in portable computers. The technology never took off for PCs, but it surfaced again years later in flat-panel TV sets.

The Early LCD Era
Yet another alternative display technology–the liquid crystal display–arrived on the scene in the 1960s and made its commercial debut in pocket calculators and wristwatches in the 1970s. Being extremely energy-efficient, lightweight, and thin, LCDs proved perfect for the early portable computers of the 1980s. Early LCDs were monochrome and low contrast, and they required a separate backlight or direct illumination for users to read them properly.

Early IBM PC Displays
In 1981, the IBM PC shipped with a directly attached monochrome video display standard (MDA) that rivaled a video terminal in sharpness. For color graphics, IBM designed the CGA adapter, which hooked to a composite-video monitor or the IBM 5153 display (which used a special RGB connection). In 1984, IBM introduced EGA, which brought with it higher resolutions, more colors, and, of course, new monitors. Various third-party IBM PC video standards competed with these in the 1980s–but none won out as IBM’s did.
The first Macintosh (1984) included a 9-inch monochrome monitor that crisply rendered the Mac’s 512-by-342-pixel bitmapped graphics in either black or white (no shades of gray here). It wasn’t until the Macintosh II (1987) that the Mac line officially supported both color video and external monitors. The Mac II video standard was similar to VGA. Mac monitors continued to evolve with the times, always known for their sharpness and accurate color representation.
The 1980s saw the launch of PC competitors to both the Macintosh and the IBM PC that boasted sharp, high-resolution, color graphics. The Atari ST series and the Commodore Amiga series both came with proprietary monochrome and RGB monitors that allowed users of those systems to enjoy their computer’s graphics to the fullest.
In the early days of the IBM PC, users needed a different monitor for each display scheme, be it MDA, CGA, EGA, or something else. To address this, NEC invented the first multisync monitor (called “MultiSync”), which dynamically supported a range of resolutions, scan frequencies, and refresh rates all in one box. That capability soon became standard in the industry.
In 1987, IBM introduced the VGA video standard and the first VGA monitors alongside IBM’s PS/2 line of computers. Almost every analog video standard since then has built off of VGA (and its familiar 15-pin connector).
When LCDs first appeared, they were low-contrast monochrome affairs with slow refresh rates. Throughout the 1980s and 1990s, LCD technology continued to improve, driven by a market boom in laptop computers. The displays gained more contrast, better viewing angles, and advanced color capabilities, and they began to ship with backlights for night viewing. The LCD would soon be poised to leap from the portable sector into the even more fertile grounds of the desktop PC.
In the mid-1990s, just about all monitors–for PCs and for Macs–were beige. This was the era of the inexpensive, color, multisync VGA monitor that could handle a huge range of resolutions with aplomb. Manufacturers began experimenting with a wide assortment of physical sizes (from 14 inches to 21 inches and beyond) and shapes (the 4:3 ratio or the vertically oriented full-page display). Some CRTs even became flat in the late 1990s.
Computer companies had experimented with desktop LCD monitors since the 1980s in small numbers, but those monitors tended to cost a lot and offer horrible performance in comparison with the more prevalent CRTs. That changed around 1997, when a number of vendors such as ViewSonic (left), IBM (center), and Apple (right) introduced color LCD monitors with qualities that could finally begin to compete with CRT monitors at a reasonable price. These LCDs used less desk space, consumed less electricity, and generated far less heat than CRTs, which made them attractive to early adopters.
Today, LCD monitors (many widescreen) are standard across the PC industry (except for tiny niche applications). Ever since desktop LCD monitors first outsold CRT monitors in 2007, their sales and market share have continued to climb. Recently, LCD monitors have become so inexpensive that many people experiment with dual-monitor setups like the one shown here. A recent industry trend emphasizes monitors that support 3D through special glasses and ultrahigh refresh rates.
With most TV sets becoming fully digital, the lines between monitor and TV are beginning to blur just as they did in the early 1980s. You can now buy a 42-inch high-def flat-panel display for under $999 that you can hook to your computer, something that would make anyone’s head explode if you could convey the idea to people in the 1940s–back when they were still using paper.
The Xerox Alto computer, released on March 1, 1973, included the first computer monitor. The monitor used CRT technology and had a monochrome display.
LED display technology was developed by James P. Mitchell in 1977, but LED monitors were not readily available for purchase on the consumer market until about 30 years later.
LCD monitors outsold CRT monitors for the first time in 2003. By 2007, LCD monitors consistently outsold CRT monitors, and became the most popular type of computer monitor.
NEC was one of the first companies to manufacture LED monitors for desktop computers. Their first LED monitor, the MultiSync EA222WMe, was released in late 2009.
Touch screen LCD monitors started to become more affordable for the average consumer in 2017, when prices for 20- to 22-inch touch screen monitors dropped below $500.
Electrically operated display devices have developed from electromechanical systems for display of text, up to all-electronic devices capable of full-motion 3D color graphic displays. Electromagnetic devices, using a solenoid coil to control a visible flag or flap, were the earliest type, and were used for text displays such as stock market prices and arrival/departure times. The cathode ray tube was the workhorse of text and video display technology for several decades until being displaced by plasma, liquid crystal (LCD), and solid-state devices such as thin-film transistors (TFTs), LEDs and OLEDs. With the advent of metal-oxide-semiconductor field-effect transistors (MOSFETs), integrated circuit (IC) chips, microprocessors, and microelectronic devices, many more individual picture elements ("pixels") could be incorporated into one display device, allowing graphic displays and video.
One of the earliest electronic displays is the cathode ray tube (CRT), which was first demonstrated in 1897 and made commercial in 1922. A CRT uses an electron gun that forms images by firing electrons onto a phosphor-coated screen. The earliest CRTs were monochrome and were used primarily in oscilloscopes and black and white televisions. The first commercial colour CRT was produced in 1954. CRTs were the single most popular display technology used in television sets and computer monitors for over half a century; it was not until the 2000s that LCDs began to gradually replace them.
1984: The super-twisted nematic (STN) LCD was developed to improve passive-matrix LCDs, allowing higher-resolution panels of 540x270 pixels for the first time.
1987: The digital micromirror device, an optical micro-electro-mechanical technology, was invented. While the Digital Light Processing (DLP) imaging device was invented by Texas Instruments, the first DLP-based projector was introduced by Digital Projection Ltd in 1997.
Most computer monitors range from 19” to 29”, but larger screens of 40” to 55” are also increasingly popular. Additionally, 60” to 90” monitors are perfect for boardroom or classroom interaction. It really depends on how you’re using your computer. If it’s just for emailing and word processing, a smaller screen should be sufficient. But if you’re using your computer for designing, gaming or for viewing movies, a larger screen may be a better choice.
LED (Light-Emitting Diode) monitors are essentially the same as LCD monitors – the only difference is in the backlighting technology. Traditional LCD monitors use fluorescent lamps to light the display, while LED monitors use the more energy-efficient light-emitting diodes. LED monitors use up to 40% less energy and contain no mercury, making them a more environmentally friendly choice. They may also help reduce eye strain, which is important if you spend a lot of time in front of your computer.
LED monitors tend to be more expensive than LCDs. But when selecting a computer monitor, it’s best to go with the highest quality and largest size that fits your budget.
You’ve probably seen terms like HD and Full HD on the boxes of monitors and TVs, but what do they mean? As you may have guessed, HD refers to “High Definition,” a quick way to refer to a high-quality video output. So if you see the term “Full HD” on a monitor box, that’s just shorthand for its resolution, which would be 1920 by 1080, also called 1080p. The reason it’s specified as “Full HD” is that there are also some TVs and monitors that output at 720p (high definition, but not as high as 1080p), which is 1280 by 720 pixels. 1080p is considered the current standard for monitors, and popular manufacturers, including Dell, Acer, Samsung, LG, BenQ and Viewsonic, offer a variety of 1080p monitors in their product lineups.
As you can imagine, the more pixels there are to display, the more critical it is that your monitor has a high refresh rate, especially when it comes to gaming. Typically, the standard has been a 120-hertz refresh rate in gaming monitors, but many now feature a 144-hertz refresh rate. The quicker a monitor can refresh the display, the smoother the visual experience will be. This is because the refresh rate in the monitor works in tandem with a low response time (which specifies how quickly a pixel can change from one color to another) to make a seamless visual transition. Sometimes, if the response time is not quick enough, some residual pixels can remain on the screen as the monitor is trying to refresh new ones. This is called ‘ghosting.’ Although it’s standard to have a four-millisecond response time on many gaming monitors, Samsung, LG, BenQ, Viewsonic, and more all offer 2K and 4K monitors with one-millisecond response times. It is also important to ensure refresh rates are identical if you plan to sync two monitors for your display.
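To make the relationship concrete, here is a rough Python sketch (using illustrative numbers, not measurements from any particular monitor) that converts a refresh rate into the time available per frame and shows how much of that window a 4 ms or 1 ms response time consumes:

```python
# Rough arithmetic only: how much time each refresh gives the panel, and how
# much of that budget a given pixel response time uses up.

def frame_time_ms(refresh_hz: float) -> float:
    """Time between refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144, 240):
    budget = frame_time_ms(hz)
    for response_ms in (4.0, 1.0):  # illustrative manufacturer-quoted figures
        share = response_ms / budget * 100
        print(f"{hz:>3} Hz -> {budget:5.2f} ms per frame; "
              f"a {response_ms:.0f} ms response uses {share:3.0f}% of that window")
```

At 240 Hz the full frame budget is only about 4.2 ms, which is why slower pixel responses start to matter much more as refresh rates climb.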
Regarding the internal specs, response time and refresh rate are the main factors contributing to a smooth, immersive viewing experience. Still, the physical panel type of the monitor can also play into this. First, there’s the matter of how the monitor lights up: with either CCFL or LED backlighting. The main difference lies in the material that is used to light the liquid crystals in the display. In a traditional LCD it’s cold cathode fluorescent lamps (CCFLs), and in LED monitors it’s tiny, low-energy light-emitting diodes. LED backlighting is the preferred type in most monitors because it consumes less power and produces less harsh light, so darker colors appear more vivid. Additionally, LED monitors can be much thinner than LCD ones.
Newer LCD monitors have improved with the implementation of IPS (In-Plane Switching) panels. For some it’s a matter of preference, but IPS panels have shown their strength in accurate color reproduction, which is great for content creators who want to do photo editing or graphic design. The panel type you choose depends more on preference than anything else. Samsung is well known for championing the IPS panel in their monitors, and many people also enjoy using them for gaming.
As touch navigation becomes more normalized due to mobile browsing, you may also want to consider a touch monitor for maximum accessibility. Planar offers a 22-inch optical touchscreen monitor that is highly portable due to its USB connection type. For something you can use in meetings for presentations, Dell provides a capacitive touchscreen monitor that also features an IPS panel.
Finally, another consideration is whether there are enough HDMI (High-Definition Multimedia Interface) ports. HDMI allows simultaneous digital video and audio transmission from one source to another. While HDMI ports are often standard, especially on gaming monitors, verifying that a monitor has enough HDMI compatibility for your setup before purchasing is essential.
Since monitors have to be lit in order for the viewer to see anything, the difference between the two types is in what is used to light up the crystals within the display. For LCD, that’s cold cathode fluorescent lamps (CCFLs), and in LEDs, it’s tiny light emitting and low-energy consuming diodes. LED monitors tend to be thinner and more power-efficient, but improvements in the panel types have made LCDs more competitive.
Most ultrawide monitors are also curved. This design helps minimize viewing-angle problems—when you’re sitting centered, things on the far edges of the screen won’t look as washed out as they would on a flat display of a similar width. But this also makes ultrawide monitors inaccurate for precision tasks requiring straight lines, such as drawing, photo editing, or similar design work.
When LCD monitors first became widely available in the late 1990s, their bezel widths rivaled those of CRT monitors. EIZO changed all that with the release of the FlexScan L675, the world’s first thin-bezel desktop LCD monitor. With a bezel width of what at the time was a revolutionary 18.5 mm, the L675 was an instant hit in trading rooms and back offices.
To commemorate five years of manufacturing LCD monitors, EIZO released "Placeo", a 17-inch model with an aluminum cabinet. Only 3,000 pieces of this limited edition monitor were manufactured.
Although the migration to LCD technology in the corporate world was well underway, professionals in color-critical fields such as pre-press and photography were still clinging to their aging CRTs. EIZO made it safe for them to finally make the switch when it introduced its ColorEdge series – the world’s first line of LCD monitors specifically targeted at graphics professionals.
Building on the initial success of its ColorEdge monitors, EIZO introduced the ColorEdge CG220, the world’s first LCD monitor capable of reproducing the Adobe RGB color space.
EIZO became the first stand-alone monitor manufacturer to receive ISO 13485 certification for the quality management system for its medical display devices.
EIZO entered into a definitive agreement with eg-electronic GmbH of Wolfratshausen, Germany to acquire the latter’s business units for industrial monitors, air traffic control monitors, and monitor control boards. The acquisition was finalized in February 2009 and a new subsidiary, EIZO Technologies GmbH, in Wolfratshausen, Germany, officially began operations.
Previously only available in Japan, EIZO expanded its FORIS line of TVs to include PC monitors for home entertainment and released FORIS monitors worldwide.
Anticipating rapid growth in China for medical services and medical devices, EIZO established a wholly-owned subsidiary in China, EIZO Display Technologies (Suzhou) Co. Ltd., to manufacture medical monitors for the Chinese market.
EIZO also released a line of extreme and ultra-narrow bezel LCD displays that provides a video wall solution for the demanding requirements of 24x7 mission-critical applications and high ambient light environments.
Are you in the market for a new computer monitor, but you’re not sure how to decide which one best suits your needs? Not to worry. Sam’s Club® has an excellent selection of computer monitors with the latest technologies.
There are a few things to think about when you’re choosing the display size of your monitor. First, consider what you’ll be using the monitor for. If you’re doing graphic design work, or you’ll use the monitor to play games or watch TV shows and movies, a larger monitor makes sense. Smaller monitors may work just fine if you’re mostly using it for surfing the web, word processing or work that’s not graphics-intensive. Also, consider the size of the space where you’ll place your monitor. There are several size categories for monitors at Sam’s Club: Under 20”, 21” – 23”, 24” – 26” and 27” and above. When you’re shopping, look at the specs and pay particular attention to the monitor’s “display area.” That way, you can get an idea of the screen size without the monitor’s casing. Screen size is typically measured on the diagonal, so it’s the distance between opposite corners.
Resolution refers to the number of pixels that your monitor is capable of displaying. A common resolution you’ll see is 1920 x 1080. The first number, 1920, refers to the number of pixels displayed horizontally (across) and the second number, 1080 refers to the number of pixels displayed vertically (top to bottom). The more resolution you have, the clearer your picture will be. Standard resolutions are fine for most users, but if you’re doing detailed work with images, such as professional photo editing, you may want to consider a monitor with higher resolution.
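To tie screen size and resolution together, the small Python sketch below (a generic illustration that assumes square pixels and ignores the bezel; the example sizes are arbitrary) converts a diagonal measurement and a resolution into physical width, height, total pixels, and pixels per inch:

```python
import math

def screen_metrics(diagonal_in: float, res_w: int, res_h: int):
    """Width, height (inches) and pixels per inch for a diagonally measured screen."""
    aspect = res_w / res_h
    height = diagonal_in / math.sqrt(aspect ** 2 + 1)   # Pythagorean theorem
    width = height * aspect
    ppi = res_w / width                                  # same as res_h / height
    return width, height, ppi

for diag, w, h in ((24, 1920, 1080), (27, 2560, 1440), (27, 3840, 2160)):
    width, height, ppi = screen_metrics(diag, w, h)
    print(f'{diag}" {w}x{h}: {width:.1f}" x {height:.1f}" display area, '
          f"{w * h / 1e6:.1f} million pixels, {ppi:.0f} PPI")
```

The same resolution on a larger panel spreads the pixels out, which is why pixels per inch (not resolution alone) determines how sharp the image looks up close.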
Do you need more room on your desk or the flexibility to move your screen around? If so, you can get a monitor that’s mounted to your desk with an adjustable arm. These types of monitors are huge space-savers. Because you can change the monitor’s height, anyone who sits at the monitor can easily adjust it. If you need two screens, purchase two monitors and a dual monitor arm, so you can arrange your monitors to sit side-by-side.
Karl Ferdinand Braun created the cathode ray tube, the basis of the first computer monitors. The cathode ray tube (CRT) is a vacuum tube that has one of its ends coated with phosphors, which emit light when electrons strike them. Early CRTs were mainly used to display colorless vector graphics rather than text, before the advancement to color cathode ray tubes that displayed both graphics and text.
Five types of monitors are CRT (Cathode Ray Tube), LCD (Liquid Crystal Display), LED (Light-Emitting Diode), OLED (Organic Light Emitting Diode), and plasma monitors.
Karl Ferdinand Braun created the first computer monitors by introducing a fluorescent screen to the cathode ray tube. Whenever electrons hit the screen, they emitted light. He also contributed to the development of television and radio technology. In 1909, he won the Physics Nobel Prize, together with Guglielmo Marconi.
A computer monitor is a display screen that shows visuals from a computer, TV, camera or other video-emitting device. The two main screen technologies today are LCD and OLED.
The first computer monitors were created by introducing a fluorescent screen to the cathode ray tube, producing the cathode ray oscilloscope. The fluorescent screen displayed visible light whenever electrons struck it. The purpose of creating the cathode ray tube was to display text and graphics used for data processing.
There are five types of monitors: CRT (Cathode Ray Tube), LCD (Liquid Crystal Display), LED (Light-Emitting Diode), OLED (Organic Light Emitting Diode), and plasma monitors. All of them have been used in computer desktops and TVs.
George H. Heilmeier, an American electrical engineer, invented the LCD monitor in 1968 at the RCA research laboratories. Heilmeier, together with his colleagues at Radio Corporation of America’s research laboratory, began experiments on electronic image creation involving the manipulation of tiny liquid crystals on thin layers of glass. In May 1968, Radio Corporation of America approved the technology and started planning its application in products like watches. To answer the question, ‘what is an LCD monitor?’: it is a thin flat panel used in phones, watches, televisions, and computer monitors to display text and graphics.
George H. Heilmeier was an American electrical engineer who pioneered the liquid crystal display (LCD) invention. He was born on May 22, 1936, in Philadelphia. He went to Abraham Lincoln High School and later joined the University of Pennsylvania.
The first LCD monitor was invented by devising electronic control for light reflected from liquid crystals. George Heilmeier, Louis Zanoni, Lucian Barton, and other engineers and scientists at the RCA laboratories pioneered this invention.
The technology behind LCD monitors is active-matrix liquid-crystal display technology, a flat-panel display technology used for high-resolution computer monitors, phones, and televisions. It was proposed by Bernard J. Lechner, an engineer at the RCA laboratories, in 1968.
Pixels are tiny picture elements that are manipulated to display information. The display comprises several layers, with the liquid crystals held in a thin layer between glass panels.
An LCD monitor doesn’t suffer from screen burn-in. Screen burn-in typically occurs in monitors with phosphor-based pixels. LCDs use organic liquid-crystal material that isn’t affected by screen burn-in.
LCDs support a variety of shapes and sizes, in both small and large profile designs. Smartphones use low-profile LCDs in form factors that would be impossible with other display technologies.
An LCD monitor is durable and can last longer than many other types of computer monitors. It can have a lifespan of up to 60,000 hours, depending on how frequently you use it.
Nick Holonyak, an American electrical and computer engineer, invented the first visible LED (the technology behind LED monitors) in 1962, while working at the General Electric electronics laboratory in Syracuse, New York.
Nick Holonyak used gallium arsenide phosphide, a semiconductor material, and stimulated emission technology to develop that first LED, which produced red light. In 1962, Holonyak successfully operated the first practical visible LED device.
Nick Holonyak, Jr. is an American engineer well known for the invention of the light-emitting diode. He created the first visible light-emitting diode in 1962. He was born on November 3, 1928, in Zeigler, Illinois, U.S.
In 1954 and 1955, he worked at Bell Telephone Laboratories and later joined the military from 1955 to 1957. Nick Holonyak, while in Syracuse, New York, joined the General Electric electronics laboratory, where the team’s previous developments inspired him to develop the first light-emitting diode in 1962.
The first LED monitor was invented using gallium arsenide phosphide, a semiconductor material and stimulated emission technology. This first LED monitor produced red light.
The idea behind the creation of the first LED monitor was optoelectronics. This is a concept that deals with how electric currents can be converted into light.
The purpose for creating the first LED monitor was to produce a diode that emits visible light, as opposed to Hall’s laser, which emitted only infrared radiation, invisible to the naked eye.
The technology behind the LED monitor varies from one monitor to another. There are many types of technologies applied in LED monitors, as listed below:

Edge LED. This LED technology has light-emitting diodes distributed along the edge of the screen, all around or at the corners. Light reflectors allow a homogeneous illumination of the screen surface. Computer monitors using this technology have thinner screens than the others.
The LED monitor is environmentally friendly. Because LED monitors consume less power, they are responsible for fewer emissions. LED computer monitors also do not emit ultraviolet radiation.
The LED monitor has a longer lifespan than other monitors, which saves resources that would otherwise be spent on repair or replacement. LED monitors can last 50,000 to 100,000 hours.
The first OLED was invented by Ching Wan Tang and Steven Van Slyke at Kodak in 1987. OLED displays were developed to advance flat panel displays in televisions, smartphones, and computer monitors.
The first OLED monitor was made by placing organic thin films between two conductors. A bright light is produced when a beam of electric current passes through.
Ching Wan Tang is a Hong Kong–American chemist who in 1987 pioneered the invention of the first OLED monitor. He was born on July 23, 1947, in Hong Kong.
At the Kodak laboratories, he researched organic electronics. It is there that, in 1987, he invented the first OLED. He left Kodak in 2006 and joined the University of Rochester as a professor of chemical engineering.
The first OLED monitor was made by using several organic thin films placed between two conductors. A bright light is emitted when a beam of electric current strikes between the conductors.
The technology behind OLED monitors exists in two varieties. The first technology uses small molecules, while the other technology uses polymers. When mobile ions are added to OLED, a light-emitting electrochemical cell is created.
The control scheme for the OLED can be a passive or active matrix. The passive scheme has the rows controlled one by one. Conversely, the active scheme applies a thin-film transistor backplane to control all pixels, enabling a higher resolution and more significant display sizes.
Gaming monitors are designed to make the output of your graphics card and CPU look as good as possible while gaming. They’re responsible for displaying the final result of all of your computer’s image rendering and processing, yet they can vary widely in their representation of color, motion, and image sharpness. When considering what to look for in a gaming monitor, it’s worth taking the time to understand everything a gaming monitor can do, so you can translate gaming monitor specs and marketing into real-world performance.
Common resolutions include 1920 × 1080 (sometimes called “Full HD” or FHD), 2560 × 1440 (“Quad HD”, QHD, or “Widescreen Quad HD”, WQHD), and 3840 × 2160 (UHD, or “4K Ultra HD”). Ultrawide monitors are also available with resolutions such as 2560 × 1080 (UW-FHD), 3440 × 1440 (UW-QHD), 3840 × 1080 (DFHD), and 5120 × 1440 (DQHD).
You might already know that a screen with 4K display resolution doesn’t magically make everything it displays look 4K. If you play a 1080p video stream on it, that content usually won’t look as good as a 4K Blu-ray. However, it may still look closer to 4K than it used to, thanks to a process called upscaling.
Upscaling is a way to scale lower-resolution content to a higher resolution. When you play a 1080p video on a 4K monitor, the monitor needs to “fill in” all of the missing pixels that it expects to display (as a 4K monitor has four times as many pixels as 1080p). A built-in scaler interpolates new pixels by examining the values of surrounding pixels. HDTVs often feature more complex upscaling than PC monitors (with line-sharpening and other improvements), as the latter often simply turn one pixel into a larger block of the same pixels. The scaler is likely to cause some blurring and ghosting (double images), especially if you look closely.
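As a minimal sketch of the simplest scaler behavior described above (nearest-neighbor, where one source pixel just becomes a block of identical output pixels), here is a hypothetical Python example over a tiny grayscale patch; real scalers usually interpolate between neighboring pixels and may add sharpening:

```python
def upscale_nearest(image, out_w, out_h):
    """Nearest-neighbor upscale of a grayscale image stored as a list of rows."""
    in_h, in_w = len(image), len(image[0])
    return [
        [image[y * in_h // out_h][x * in_w // out_w] for x in range(out_w)]
        for y in range(out_h)
    ]

# A 2x2 patch blown up to 4x4: each source pixel becomes a 2x2 block.
patch = [[10, 20],
         [30, 40]]
for row in upscale_nearest(patch, 4, 4):
    print(row)
# [10, 10, 20, 20]
# [10, 10, 20, 20]
# [30, 30, 40, 40]
# [30, 30, 40, 40]
```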
Monitors can also change resolution. Modern screens have a fixed number of pixels, which defines their "native resolution" but can also be set to approximate lower resolutions. As you scale down, onscreen objects will look larger and fuzzier, screen real estate will shrink, and visible jaggedness may result from interpolation. (Note that it wasn’t always this way: older analog CRT monitors can actually switch between resolutions without interpolation, as they do not have a set number of pixels.)
Players sit or stand close to their monitors, often within 20”-24”. This means that the screen itself fills much more of your vision than an HDTV (when seated at the couch) or a smartphone/tablet. (Monitors boast the best ratio of diagonal screen size to viewing distance among common displays, with the exception of virtual reality headsets). The benefits of 1440p or 4K resolution are more immediately perceptible in this close-range situation.
A monitor"s aspect ratio is the proportion of width to height. A 1:1 screen would be completely square; the boxy monitors of the 1990s were typically 4:3, or “standard”. They have largely been replaced by widescreen (16:9) and some ultrawide (21:9, 32:9, 32:10) aspect ratios.
Most online content, such as YouTube videos, also defaults to a widescreen aspect ratio. However, you’ll still see horizontal black bars onscreen when watching movies or TV shows shot in theatrical widescreen (2.39:1, wider than 16:9), and vertical black bars when watching smartphone videos shot in thinner “portrait” mode. These black bars preserve the original proportions of the video without stretching or cropping it.
Ultrawides

Why opt for an ultrawide screen over regular widescreen? They offer a few advantages: They fill more of your vision, they can provide a movie-watching experience closer to the theater (as 21:9 screens eliminate “letterboxing” black bars for widescreen films), and they let you expand field of view (FOV) in games without creating a “fisheye” effect. Some players of first-person games prefer a wider FOV to help them spot enemies or immerse themselves in the game environment. (But note that some popular FPS games do not support high FOV settings, as they can give players an advantage).
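Many games widen the horizontal FOV on wider screens while keeping the vertical FOV fixed (often called Hor+ scaling). Assuming that convention, and using an arbitrary 60-degree vertical FOV purely for illustration, the Python sketch below estimates the horizontal FOV at different aspect ratios:

```python
import math

def horizontal_fov(vertical_fov_deg: float, aspect: float) -> float:
    """Horizontal FOV implied by a fixed vertical FOV under Hor+ scaling."""
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

vfov = 60  # illustrative; the actual vertical FOV varies by game and settings
for name, aspect in (("16:9", 16 / 9), ("21:9", 21 / 9), ("32:9", 32 / 9)):
    print(f"{name}: roughly {horizontal_fov(vfov, aspect):.0f} degrees horizontal FOV")
```

With these assumptions, a 21:9 screen shows roughly 15 extra degrees of horizontal view compared with 16:9, which is the effect players notice when spotting enemies at the edges.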
Curved screens are another common feature on ultrawide monitors. These can correct one typical issue with larger ultrawides: Images at the distant edges of the screen look less distinct than those in the middle. A curved screen helps compensate for this and provides a clearer view of the extreme edges of the screen. However, its benefits are most noticeable on larger screens over 27”.
When viewing two monitors side-by-side, it’s sometimes easy to see which has more brilliant hues, deeper blacks, or a more lifelike color palette. It can be harder to put the picture together in your head when reading specifications, however, because color in monitors is evaluated in many different ways. There’s no one spec to focus on: Contrast ratio, brightness, black level, color gamut, and more all come into play. Before moving on to larger color features, let’s define these terms one-by-one.
Use caution when LCDs advertise very high “dynamic contrast ratios”, which are achieved by changing the behavior of the backlight. For gaming or everyday use, the standard “static” contrast ratio discussed above is a better marker of the monitor’s quality.
Black Level

In all LCD screens, light from the backlight inevitably leaks through the liquid crystal. This provides the basis for the contrast ratio: For example, if the screen leaks 0.1% of the illumination from the backlight in an area that’s supposed to be black, this establishes a contrast ratio of 1,000:1. An LCD screen with zero light leakage would have an infinite contrast ratio. However, this isn’t possible with current LCD technology.
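The contrast-ratio arithmetic in that example is worth spelling out; the snippet below pairs the 0.1% leakage figure from the text with an illustrative 300-nit white level (not a spec from any particular monitor):

```python
# Static contrast ratio = white level / black level.
peak_white_nits = 300.0      # illustrative full-white brightness
leakage_fraction = 0.001     # 0.1% of the backlight leaks through "black" areas

black_level_nits = peak_white_nits * leakage_fraction
contrast_ratio = peak_white_nits / black_level_nits
print(f"black level: {black_level_nits:.1f} nits, contrast ratio {contrast_ratio:,.0f}:1")
# -> black level: 0.3 nits, contrast ratio 1,000:1
```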
“Glow” is a particular issue in dark viewing environments, which means that achieving low black levels is a major selling point for LCD monitors. However, an LCD screen can’t reach a black level of 0 nits unless it’s completely turned off.
Color Depth

Monitors need to display many subtle shades of color. If they can’t smoothly transition between slightly different hues, we see onscreen color “banding” — a stark shift between two different colors, creating visibly lighter and darker bands where we should see a seamless gradient. This is sometimes referred to as “crushing” the colors.
True 10-bit monitors are rare — many monitors use forms of internal color processing, such as FRC (frame rate control), to approximate a greater color depth. A “10-bit” monitor could be an 8-bit monitor with an additional FRC stage, often written as “8+2FRC”.
Some inexpensive LCD panels use 6-bit color along with “dithering” to approximate 8-bit color. In this context, dithering means the insertion of similar, alternating colors next to one another to fool the eye into seeing a different in-between color that the monitor cannot accurately display.
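To make the bit-depth numbers concrete, and to sketch the idea behind dithering and FRC (alternating two adjacent displayable shades so the eye averages them into an in-between level), here is a toy Python model; it is not how any specific panel controller actually works:

```python
# Shades per channel and total colors at common panel bit depths.
for bits in (6, 8, 10):
    shades = 2 ** bits
    print(f"{bits}-bit: {shades} shades per channel, {shades ** 3:,} total colors")

# Toy temporal dither: approximate an in-between level by alternating two
# adjacent shades the panel can actually show.
low, high = 128, 129
frames = [low, high, low, high]          # what the panel displays over time
perceived = sum(frames) / len(frames)    # what the eye roughly integrates
print(f"alternating {low}/{high} reads as roughly {perceived}")
```

The jump from 6-bit (about 262 thousand colors) to 8-bit (about 16.8 million) is why dithering and FRC exist: they let a cheaper panel approximate shades it cannot natively produce.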
Monitors sometimes feature a Look-Up Table (LUT) corresponding to a higher color depth, such as 10-bit color. This helps speed up color correction calculations that take place within the monitor as it converts color input to a color output appropriate for your screen. This intermediate step can help create smoother color transitions and more accurate output. These are usually reserved for more professional grade monitors than general consumer and gaming displays.
Your eye can see a much wider spectrum of color than current displays can reproduce. To visualize all visible colors, a standard called CIE 1976 maps them to a grid, creating a horseshoe-shaped graph. The color gamuts available for monitors appear as subsets of this graph:
Common, mathematically defined color gamuts include sRGB, Adobe RGB, and DCI-P3. The first is a common standard for monitors (and the officially designated color space for the web). The second, wider standard is mostly used by photo and video editing professionals. The third, DCI-P3, is even wider, and is commonly used for HDR content.
Monitors advertising "99% sRGB" are claiming the screen covers 99% of the sRGB color gamut, which is often considered indistinguishable from 100% when viewed with the naked eye.
In LCD screens, the backlight and color filters determine the color space. All of the light created by the backlight passes through a color filter with red, green, and blue spots. Narrowing the “band-pass” of this filter restricts the wavelengths of light that can pass through, increasing the purity of the final colors produced. Although this lessens the screen’s efficiency (as the filter now blocks more of the backlight’s output), it creates a wider color gamut.
HDR monitors display brighter images with better contrast and preserve more detail in both light and dark areas of the screen. Using an HDR monitor, you might be better able to spot something moving down a dark corridor in a horror game, or see more dramatic shafts of sunlight in an open-world title.
Though they work best with HDR content (which only some games and movies support), these monitors typically support 10-bit color depth and backlights that support a wide color gamut, which will also improve standard content (SDR). (Note that HDR monitors are often not true 10-bit color, but rather 8+2FRC displays that accept a 10-bit input signal).
For LCD displays, a high-end backlight feature called local dimming is critical to HDR quality. Dimming zones for the backlight behind the screen control the brightness of groups of LEDs; more dimming zones means more precise control, less “blooming” (where light areas of the image brighten dark ones), and generally improved contrast.
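As a toy model of that trade-off, the Python sketch below takes one row of target pixel brightness values and sets each backlight zone to the brightest pixel it has to show: with only a couple of zones, a single bright object forces a large dark region to light up (blooming), while more zones give much tighter control. The numbers are purely illustrative.

```python
def zone_backlight(levels, zones):
    """Set each backlight zone to the brightest pixel value inside that zone."""
    per_zone = len(levels) // zones
    return [max(levels[i * per_zone:(i + 1) * per_zone]) for i in range(zones)]

row = [0, 0, 0, 0, 255, 0, 0, 0]   # one bright object on a black background
print(zone_backlight(row, 2))      # [0, 255] - the whole right half lights up (blooming)
print(zone_backlight(row, 8))      # [0, 0, 0, 0, 255, 0, 0, 0] - per-pixel precision
```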
The DisplayHDR standard is more reliable than specs that are advertised as "Typical", as that wording allows manufacturers to list results that are true averages. Look for monitors that meet the minimum specification for different levels of DisplayHDR.
Refresh rate is the frequency at which your entire screen refreshes the image. Higher refresh rates make onscreen motion look smoother, because the screen updates the position of each object more rapidly. This can make it easier for competitive players to track moving enemies in a first-person shooter, or just make a screen feel more responsive as you scroll down a webpage or open an app on your phone.
Refresh rates are measured in hertz: A refresh rate of 120Hz, for example, means that the monitor refreshes every pixel 120 times per second. While 60Hz was once the standard for both PC monitors and smartphones, manufacturers are increasingly adopting higher refresh rates.
The benefits of jumping from 60Hz to 120Hz or 144Hz are clear to most players, especially in fast-paced first-person games. (However, you’ll only see benefits if you also have a GPU powerful enough to render frames faster than 60fps at the resolution and quality settings you’ve chosen).
A higher refresh rate makes it easier to track moving objects with your eye, makes sharp camera movements feel smoother, and reduces perceived motion blur. Online communities are divided about the improvement provided by monitors over 120Hz. If interested, it’s worth checking one out in person to see how much of a difference it might make for you.
Players sometimes confuse response time with input lag, a measurement of the delay before your actions appear onscreen, similarly measured in milliseconds. Input lag is felt rather than seen, and is often a priority for players of fighting games and first-person shooters.
G-Sync monitors use NVIDIA’s proprietary G-Sync scaler chip to match monitor refresh rates to GPU output, as well as predict GPU output based on recent performance. It also helps prevent stutter and input lag, which can result from duplicate frames being drawn as the first one waits to be displayed.
AMD Radeon FreeSync monitors operate along similar lines, matching the display to GPU output to avoid screen tearing and stutters. Rather than using a proprietary chip, they’re built on open Adaptive Sync protocols, which have been built into DisplayPort 1.2a and all later DisplayPort revisions. Though FreeSync monitors are often cheaper, the trade-off is that they aren’t subject to standard testing before release, and vary widely in quality.
Variable Refresh Rate (VRR) is a general term for technologies that sync up your monitor and GPU. Adaptive Sync is an open protocol included in DisplayPort 1.2a and later revisions. Recent Intel, AMD, and NVIDIA graphics technologies can all work with Adaptive Sync monitors.
Both LCDs and OLEDs “sample and hold”, displaying moving objects as a series of static images that are rapidly refreshed. Each sample remains onscreen until it’s replaced with the next refresh. This “persistence” causes motion blur, as the human eye expects to track objects smoothly rather than see them jump to a new position. Even at high refresh rates, which update the image more often, the underlying sample-and-hold technology causes motion blur.
This mimics the operation of older CRT monitors, which worked differently than current LCD technology. CRT screens were illuminated by phosphors that rapidly decayed, providing brief impulses of illumination. This meant that the screen was actually dark for most of the refresh cycle. These quick impulses actually created a smoother impression of motion than sample-and-hold, and motion blur reduction features work to replicate this effect.
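A back-of-the-envelope way to see why persistence matters: for an eye tracking a moving object on a sample-and-hold display, the perceived smear is roughly the object's speed multiplied by how long each frame is held. The Python sketch below uses an illustrative object speed and a hypothetical strobed mode with 25% persistence; it is a simplification, not a measurement.

```python
def smear_px(speed_px_per_s: float, refresh_hz: float, persistence: float = 1.0) -> float:
    """Approximate motion smear in pixels for a tracked object on a sample-and-hold display."""
    hold_time_s = persistence / refresh_hz   # how long each frame stays lit
    return speed_px_per_s * hold_time_s

speed = 1920  # object crossing a 1920-pixel-wide screen in one second (illustrative)
for hz in (60, 144):
    print(f"{hz} Hz, full persistence: about {smear_px(speed, hz):.0f} px of smear")
print(f"144 Hz, 25% persistence (strobed): about {smear_px(speed, 144, 0.25):.0f} px")
```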
Cathode Ray Tube (CRT)

These boxy computer monitors were common from the 1970s until the early 2000s, and are still prized by some players today for their low input lag and response times.
CRTs used three bulky electron guns to send a beam to excite red, green, and blue phosphors on the screen. These phosphors decayed within a few milliseconds, meaning the screen was illuminated by brief impulses on each refresh. This created a smooth illusion of motion, but also visible flickering.
Liquid Crystal Display (LCD)

In TFT LCDs (thin-film-transistor liquid crystal displays), a backlight shines light through a layer of liquid crystals that can twist, turn, or block it. The liquid crystals do not emit light themselves, which is a key difference between LCDs and OLEDs.
Older LCDs used Cold-Cathode Fluorescent Lamps (CCFLs) as backlights. These large, energy-inefficient tubes were incapable of controlling the brightness of smaller zones of the screen, and were eventually phased out in favor of smaller, energy-efficient light-emitting diodes (LEDs).
LCD panels are available in a range of technologies and can vary widely in color reproduction, response time, and input lag, especially among high-end options. However, the following generalizations about panels usually hold true:
TN (Twisted Nematic): The oldest and most affordable LCD panel type. High refresh rates and fast response times suit high-speed gaming such as first-person shooters or fighting games.
Organic Light-Emitting Diode (OLED)

OLED screens are emissive, meaning they create their own light, rather than transmissive screens that require a separate light source (like LCDs). Here, the application of electric current causes a layer of organic molecules to light up on the front of the screen.
Backlights may be imperfectly blocked by the liquid crystals in an LCD, causing black areas of an image to appear gray. Because OLEDs have no backlight, they can achieve “true black” by simply turning off a pixel (or at least 0.0005 nits, the lowest measurable brightness).
OLEDs therefore boast very high contrast ratios and vibrant color. The elimination of the backlight also makes them slimmer than LCDs. Much as LCDs were a thinner, more energy-efficient evolution of CRTs, OLEDs may prove a thinner evolution of LCDs. (They can also be more energy-efficient when displaying dark content, like movies, but less energy-efficient with white screens, such as word processing programs).
Gaming monitors often include a mount with adjustable height, tilt, and degree of rotation. These help you find an ergonomic position for your monitor and help it fit in varied workspaces.
VGA (Video Graphics Array): Older monitors may feature this legacy port, a 15-pin analog connection introduced in 1987. It transmits only video, at resolutions up to 3840 × 2400.
Single-Link DVI (Digital Visual Interface): The oldest display interface found on many modern monitors, this 24-pin digital connection dates back to 1999. It transmits only video and can connect to VGA or HDMI with an adapter. It supports resolutions up to 1920 × 1200.
DisplayPort: High-bandwidth ports that transmit video and audio. All DisplayPort cables work with all DisplayPort versions up to 2.0, which requires active cables (cables that include an electronic circuit) for full bandwidth. Revisions 1.2 and later allow you to link multiple monitors together via “daisy chaining” (though this also requires compatible monitors).
USB: These common ports transfer both data and power. Many monitors let you connect keyboards and mice to them to free up USB ports on your PC. USB Type-C ports feature a reversible design and can double as DisplayPorts.
Figuring out what to look for in a gaming monitor depends heavily on the choices you’ve made about the rest of your computer. Modern monitors can generally help you avoid the dropped frames, input lag, and visual artifacts common in older technologies, but the value of increased resolution, color depth, and motion smoothing features will vary from player to player. It’s down to you to separate the must-haves from the nice-to-haves.