Origin of lag in LCD displays
Display lag is a phenomenon associated with most types of liquid crystal displays (LCDs), such as smartphone and computer screens, and nearly all types of high-definition televisions (HDTVs). It refers to latency, or lag, between when the signal is sent to the display and when the display starts to show that signal. This lag time has been measured as high as 68 ms, the equivalent of more than four frames on a 60 Hz display. Display lag is not to be confused with pixel response time, which is the amount of time it takes for a pixel to change from one brightness value to another. Currently, the majority of manufacturers quote the pixel response time but neglect to report display lag.
For older analog cathode ray tube (CRT) technology, display lag is nearly zero, due to the nature of the technology, which does not have the ability to store image data before display. The picture signal is minimally processed internally, simply for demodulation from a radio-frequency (RF) carrier wave (for televisions), and then splitting into separate signals for the red, green, and blue electron guns, and for the timing of the vertical and horizontal sync. Image adjustments typically involve reshaping the signal waveform but without storage, so the image is written to the screen as fast as it is received, with only nanoseconds of delay for the signal to traverse the wiring inside the device from input to the screen.
For modern digital signals, significant computer processing power and memory storage are needed to prepare an input signal for display. For either over-the-air or cable TV, the same analog demodulation techniques are used, but after that the signal is converted to digital data, which must be decompressed using the MPEG codec and rendered into an image bitmap stored in a frame buffer.
For progressive scan display modes, the signal processing stops here, and the frame buffer is immediately written to the display device. In its simplest form, this processing may take several microseconds to occur.
For interlaced video, additional processing is frequently applied to deinterlace the image and make it seem clearer or more detailed than it actually is. This is done by storing several interlaced frames and then applying algorithms to determine areas of motion and stillness, and to either merge interlaced frames for smoothing or extrapolate where pixels are in motion; the resulting calculated frame buffer is then written to the display device.
De-interlacing imposes a delay that can be no shorter than the duration of the frames being stored for reference, plus an additional variable period for calculating the resulting extrapolated frame buffer; delays of 16–32 ms are common.
While the pixel response time of the display is usually listed in the monitor's specifications, no manufacturers advertise the display lag of their displays, likely because the trend has been to increase display lag as manufacturers find more ways to process input at the display level before it is shown. Possible culprits are the processing overhead of HDCP, Digital Rights Management (DRM), and DSP techniques employed to reduce the effects of ghosting – and the cause may vary depending on the model of display. Investigations have been performed by several technology-related websites, some of which are listed at the bottom of this article.
LCD, plasma, and DLP displays, unlike CRTs, have a native resolution. That is, they have a fixed grid of pixels on the screen that shows the image sharpest when running at the native resolution, since nothing has to be scaled, which blurs the image. In order to display non-native resolutions, such displays must use video scalers, which are built into most modern monitors. As an example, a display that has a native resolution of 1600x1200 being provided a signal of 640x480 must scale width and height by 2.5x each to display the image provided by the computer on the native pixels. In order to do this while producing as few artifacts as possible, advanced signal processing is required, which can be a source of introduced latency. Interlaced video signals such as 480i and 1080i require a deinterlacing step that adds lag. Anecdotally, display lag is reduced when this step is avoided by feeding the display a signal in a progressive scanning mode. External devices have also been shown to reduce overall latency by providing faster image-space resizing algorithms than those present in the LCD screen.
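The scaling arithmetic above can be sketched in a few lines. This is a minimal illustration of the idea, not any display's actual scaler; the function names are invented for this example, and nearest-neighbor is used only because it is the simplest mapping.

```python
# Hypothetical sketch: how a video scaler maps a non-native input
# resolution onto a display's fixed native pixel grid.
def scale_factors(native, source):
    """Return the (width, height) multipliers the scaler must apply."""
    return (native[0] / source[0], native[1] / source[1])

def nearest_neighbor_source_pixel(x, y, native, source):
    """Map a native-grid pixel back to the source pixel it samples
    (nearest-neighbor, the simplest and fastest scaling method)."""
    sx, sy = scale_factors(native, source)
    return (int(x / sx), int(y / sy))

# The article's example: a 1600x1200 panel fed a 640x480 signal
# must scale both dimensions by 2.5x.
print(scale_factors((1600, 1200), (640, 480)))  # (2.5, 2.5)
```

Real scalers use far more elaborate filtering (bicubic, edge-adaptive, and so on), which is exactly where the extra processing latency comes from.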
Many LCDs also use a technology called "overdrive" which buffers several frames ahead and processes the image to reduce blurring and streaks left by ghosting. The effect is that everything is displayed on the screen several frames after it was transmitted by the video source.
Display lag can be measured using a test device such as the Video Signal Input Lag Tester. Despite its name, the device cannot independently measure input lag. It can only measure input lag and response time together.
Lacking a measurement device, measurement can be performed using a test display (the display being measured), a control display (usually a CRT) that would ideally have negligible display lag, a computer capable of mirroring an output to the two displays, stopwatch software, and a high-speed camera pointed at the two displays running the stopwatch program. The lag time is measured by taking a photograph of the displays running the stopwatch software, then subtracting the two times on the displays in the photograph. This method only measures the difference in display lag between two displays and cannot determine the absolute display lag of a single display. CRTs are preferable to use as a control display because their display lag is typically negligible. However, video mirroring does not guarantee that the same image will be sent to each display at the same point in time.
In the past it was seen as common knowledge that the results of this test were exact, as they seemed to be easily reproducible even when the displays were plugged into different ports and different cards, which suggested that the effect is attributable to the display and not the computer system. An in-depth analysis released on the German website Prad.de revealed that these assumptions were wrong. Averaging measurements as described above leads to comparable results only because they include the same amount of systematic error. Across different monitor reviews, the values determined this way for the very same monitor model differ by margins of 16 ms or even more.
To minimize the effects of asynchronous display outputs (the points in time an image is transferred to each monitor differ, or the actual frequency used for each monitor differs), a highly specialized software application called SMTT was developed.
Several approaches to measuring display lag have since been revived in slightly changed forms, but they reintroduce old problems that had already been solved by the aforementioned SMTT. One such method involves connecting a laptop to an HDTV through a composite connection, running a timecode that shows on the laptop's screen and the HDTV simultaneously, and recording both screens with a separate video recorder. When the video of both screens is paused, the difference in the times shown on the two displays is interpreted as an estimate of the display lag.
Display lag contributes to the overall latency in the interface chain from the user's inputs (mouse, keyboard, etc.) to the graphics card to the monitor. Depending on the monitor, display lag times between 10 and 68 ms have been measured. However, the effects of the delay on the user depend on each user's own sensitivity to it.
Display lag is most noticeable in games (especially on older video-game consoles), with different games affecting the perception of delay. For instance, in PvE (player versus environment) games, a slight input delay is not as critical as in PvP (player versus player) or in other games that favor quick reflexes.
If the game's controller produces additional feedback (rumble, the Wii Remote's speaker, etc.), then the display lag will cause this feedback to not accurately match up with the visuals on-screen, possibly causing extra disorientation (e.g. feeling the controller rumble a split second before a crash into a wall).
TV viewers can be affected as well. If a home theater receiver with external speakers is used, then the display lag causes the audio to be heard earlier than the picture is seen. "Early" audio is more jarring than "late" audio. Many home-theater receivers have a manual audio-delay adjustment which can be set to compensate for display latency.
Many televisions, scalers, and other consumer-display devices now offer what is often called a "game mode", in which the extensive preprocessing responsible for additional lag is sacrificed to decrease, but not eliminate, latency. While typically intended for video-game consoles, this feature is also useful for other interactive applications. Similar options have long been available on home audio hardware and modems for the same reason. Connection through a VGA or component cable should eliminate perceivable input lag on many TVs even if they already have a game mode, since advanced post-processing is nonexistent on analog connections and the signal traverses them without delay.
A television may have a picture mode that reduces display lag for computers. Some Samsung and LG televisions automatically reduce lag for a specific input port if the user renames the port to "PC".
LCD screens with a high response-time value often do not give a satisfactory experience when viewing fast-moving images: they leave streaks or blur, called ghosting. An LCD screen with both a high response time and significant display lag is unsuitable for playing fast-paced computer games or performing fast, high-accuracy operations on the screen, due to the mouse cursor lagging behind.
When you're using a monitor, you want your actions to appear on the screen almost instantly, whether you're typing, clicking through websites, or gaming. If you have high input lag, you'll notice a delay from the time you type something on your keyboard or when you move your mouse to when it appears on the screen, and this can make the monitor almost unusable.
For gamers, low input lag is even more important because it can be the difference between winning and losing in games. A monitor's input lag isn't the only factor in the total amount of input lag because there's also delay caused by your keyboard/mouse, PC, and internet connection. However, having a monitor with low input lag is one of the first steps in ensuring you get a responsive gaming experience.
Any monitor adds at least a few milliseconds of input lag, but most of the time, it's small enough that you won't notice it at all. There are some cases where the input lag increases to the point where it becomes noticeable, but that's very rare and may not necessarily be caused by the monitor alone. Your peripherals, like keyboards and mice, add more latency than the monitor, so if you notice any delay, it's likely because of those and not your screen.
There's no definitive amount of input lag at which people start noticing it because everyone is different. A good estimate is that it starts to become noticeable around 30 ms, but even a delay of 20 ms can be problematic for reaction-based games. You can try this tool that adds lag to simulate the difference between high and low input lag. You can use it to estimate how much input lag bothers you, but keep in mind this tool is relative and adds lag on top of the latency you already have.
There are three main stages at which input lag arises during computer use, and it isn't just the monitor that contributes: the acquisition of the image, the processing, and finally actually displaying it.
The acquisition of the image has to do with the source and not with the monitor. The more time it takes for the monitor to receive the source image, the more input lag there'll be. This has never really been an issue with PCs since previous analog signals were virtually instant, and current digital interfaces like DisplayPort and HDMI have next to no inherent latency. However, some devices like wireless mice or keyboards may add delay. Bluetooth connections especially add latency, so if you want the lowest latency possible in the video acquisition phase, you should use a wired mouse or keyboard or get something wireless with very low latency.
Once the image is in a format that the video processor understands, it will apply at least some processing to alter the image somehow. A few examples:
The time this step takes is affected by the speed of the video processor and the total amount of processing. Although you can't control the processor speed, you can control how many operations it needs to do by enabling and disabling settings. Most picture settings won't affect the input lag, and monitors rarely have any image processing, which is why the input lag on monitors tends to be lower than on TVs. One of these settings that could add delay is variable refresh rate, but most modern monitors are good enough that the lag doesn't increase much.
Once the monitor has processed the image, it's ready to be displayed on the screen. This is the step where the video processor sends the image to the screen. The screen can't change its state instantly, and there's a slight delay from when the image is done processing to when it appears on screen. Our input lag measurements consider when the image first appears on the screen and not the time it takes for the image to fully appear (which has to do with our Response Time measurements). Overall, the time it takes to display the image has a big impact on the total input lag.
Now, let's talk about how we measure the input lag. It's a rather simple test because everything is done by our dedicated photodiode tool and special software. We use this same tool for our response time tests, but it measures something different there. For the input lag, we place the photodiode tool at the center of the screen so that it records the data in the middle of the refresh cycle and doesn't skew the results toward the beginning or end of the cycle. We connect our test PC to the tool and the TV. The tool flashes a white square on the screen and records the amount of time it takes until the screen starts to change the white square; this is an input lag measurement. It stops the measurement the moment the pixels start to change color, so we don't account for the response time during our testing. It records multiple data points, and our software reports an average of all the measurements, not considering any outliers.
When a TV displays a new image, it progressively draws it on the screen from top to bottom, so the image first appears at the top. As we have the photodiode tool placed in the middle, it records the image when it's halfway through its refresh cycle. A 120Hz TV displays 120 images every second, so every image takes 8.33 ms to be drawn on the screen. Since we have the tool in the middle of the screen, we're measuring halfway through the cycle, so it takes 4.17 ms to get there; this is the minimum input lag we can measure on a 120Hz TV. If we measure an input lag of 5.17 ms, then in reality it's only taking an extra millisecond of lag to appear on the screen. For a 60Hz TV, the minimum is 8.33 ms.
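The scanout arithmetic above reduces to a couple of divisions. This is a minimal sketch of that calculation (function names are invented for illustration):

```python
# With the photodiode placed at the screen's vertical center, the
# earliest measurable lag is half of one refresh period, because the
# image scans top-to-bottom and only reaches the center mid-cycle.
def frame_time_ms(refresh_hz):
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / refresh_hz

def min_measurable_lag_ms(refresh_hz):
    """Measurement floor for a sensor at the center of the screen."""
    return frame_time_ms(refresh_hz) / 2

print(round(frame_time_ms(120), 2))          # 8.33 ms per frame at 120Hz
print(round(min_measurable_lag_ms(120), 2))  # 4.17 ms floor at 120Hz
print(round(min_measurable_lag_ms(60), 2))   # 8.33 ms floor at 60Hz
```

So a reading of 5.17 ms on a 120Hz TV implies only about 1 ms of lag beyond the unavoidable scanout delay.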
Some people may confuse our response time and our input lag tests. For input lag, we measure the time it takes from when the photodiode tool sends the signal to when it appears on-screen. We use flashing white squares, and the tool stops the measurement the moment the screen changes color so that it doesn't include the response time measurement. As for the response time test, we use grayscale slides, and this test measures the time it takes to make a full transition from one gray shade to the next. In simple words, the input lag measurement stops when the color on the screen changes, and the response time measurement starts when the color changes.
This test measures the input lag of 1080p signals with a 60Hz refresh rate. This is especially important for older console games (like the PS4 or Xbox One) or PC gamers who play with a lower resolution at 60Hz. As with other tests, this is done in Game Mode, and unless otherwise stated, our tests are done in SDR.
We repeat the same process but with Game Mode disabled. This is to show the difference between in and out of Game Mode. It could be important if you scroll a lot through your TV's smart OS and you easily notice delay, so if you find it's too high and it's bothering you, simply switch into Game Mode when you need to scroll through menus.
This result can also be important if you want to play video games with the TV's full image processing. You might consider this if you're playing a non-reaction-based game.
This result is important if you play 1440p games, like from an Xbox or a PC. However, 1440p games are still considered niche, and not all TVs support this resolution, so we can't measure the 1440p input lag of those.
The 4k @ 60Hz input lag is probably the most important result for most console gamers. Along with 1080p @ 60Hz input lag, it carries the most weight in the final scoring since most gamers are playing at this resolution. We expect this input lag to be lower than the 4k @ 60Hz with HDR, chroma 4:4:4, or motion interpolation results because it requires the least amount of image processing.
With the PC sending a 4k @ 60Hz signal, we use an HDFury Linker to add an HDR signal. This is important if you play HDR games, and while it may add some extra lag, it's still low for most TVs.
The average input lag when all the TV settings are optimized to reduce it at this specific resolution with proper full 4:4:4 chroma, without subsampling.
This test is important for people wanting to use the TV as a PC monitor. Chroma 4:4:4 is a video signal format that doesn't use any image compression, which is necessary if you want proper text clarity. We want to know how much delay is added, but for nearly all of our TVs, it doesn't add any delay at all compared to the 4k @ 60Hz input lag.
Like with 1080p @ 60Hz Outside Game Mode, we measure the input lag outside of Game Mode in 4k. Since most TVs have a native 4k resolution, this number is more important than the 1080p lag while you're scrolling through the menus.
Motion interpolation is an image processing technique that increases the frame rate, for example taking a 30 fps video up to 60 fps. However, for most TVs you need to disable Game Mode to enable the motion interpolation setting, as only Samsung offers motion interpolation in Game Mode. As such, most TVs will have high input lag with motion interpolation. We measure this with the motion interpolation settings at their highest because we want to see how much the input lag increases at the strongest setting, as a worst-case scenario.
We repeat most of the same tests but with 120 fps signals instead. This is especially important for gaming on some gaming consoles, like the Xbox Series X or Xbox One X, as some other devices don't output signals at 120 fps. The 120Hz input lag should be around half the 60Hz input lag, but it's not going to be exactly half.
Once again, this result is only important for PC and Xbox gamers because they use 1440p signals. Not all TVs support this resolution either, so we can"t always test for it.
This test is important if you're a gamer with an HDMI 2.1 graphics card or console. Since most 4k @ 120Hz signals require HDMI 2.1 bandwidth, you don't have to worry about this if your TV or gaming console is limited to HDMI 2.0. For this test, we use our HDMI 2.1 PC with an NVIDIA RTX 3070 graphics card because we need an HDMI 2.1 source to test it.
We also measure the input lag with any variable refresh rate (VRR) support enabled, if the TV has it. VRR is a feature gamers use to match the TV's refresh rate with the frame rate of the game, even if the frame rate drops. Enabling VRR could potentially add lag, so that's why we measure it, but most TVs don't have any issue with this. We measure this test by setting the TV to its maximum refresh rate and enabling one of its VRR formats, like FreeSync or G-SYNC.
Input lag is not an official spec advertised by most TV companies because it depends on two varying factors: the type of source and the settings of the television. The easiest way you can measure it is by connecting a computer to the TV and displaying the same timer on both screens. Then, if you take a picture of both screens, the time difference will be your input lag. This is, however, an approximation, because your computer does not necessarily output both signals at the same time. In this example image, an input lag of 40 ms (1:06:260 – 1:06:220) is indicated. However, our tests are a lot more accurate than that because of our tool.
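The subtraction in that example can be sketched directly. This is a minimal illustration of the arithmetic; the "minutes:seconds:milliseconds" timestamp format and the function names are assumptions based on the example values, not any particular timer program.

```python
# Two-screen timer method: parse the timestamps visible in the photo
# and subtract them to approximate the laggy display's input lag.
def timestamp_to_ms(stamp):
    """Convert a 'M:SS:mmm' timer reading into total milliseconds."""
    minutes, seconds, millis = (int(part) for part in stamp.split(":"))
    return (minutes * 60 + seconds) * 1000 + millis

def approx_input_lag_ms(control_stamp, test_stamp):
    # The control (reference) display shows the later time, because the
    # laggy display is still drawing an older frame.
    return timestamp_to_ms(control_stamp) - timestamp_to_ms(test_stamp)

# The article's example: 1:06:260 on the control vs 1:06:220 on the TV.
print(approx_input_lag_ms("1:06:260", "1:06:220"))  # 40
```

As the article notes, this only approximates the difference between the two displays, since the computer may not output both signals at the same instant.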
Most people will only notice delays when the TV is out of Game Mode, but some gamers might be more sensitive to input lag even in Game Mode. Keep in mind that the input lag of the TV isn't the absolute lag of your entire setup; there's still your PC/console and your keyboard/controller. Every device adds a bit of delay, and the TV is just one piece in a line of electronics that we use while gaming. If you want to know how much lag you're sensitive to, check out this input lag simulator. You can simulate what it's like to add a certain amount of lag, but keep in mind this tool is relative to your current setup's lag, so even if you set it to 0 ms, there's still the default delay.
The most important setting to ensure you get the lowest input lag possible is the Game Mode setting. This varies between brands; some have Game Mode as its own setting that you can enable within any picture mode, while others have a Game picture mode. Go through the settings on your TV to see which it is. You'll know you have the right setting when the screen goes black for a second because that's the TV putting itself into Game Mode.
Many TVs have an Auto Low Latency Mode feature that automatically switches the TV into Game Mode when you launch a game from a compatible device. Often, you need to enable a certain setting for it to work.
Some peripherals like Bluetooth mice or keyboards add lag because Bluetooth connections have inherent lag, but those are rarely used with TVs anyways.
Input lag is the time it takes a TV to display an image on the screen from when it first receives the signal. It's important to have low input lag for gaming, and while high input lag may be noticeable if you're scrolling through Netflix or other apps, it's not as important for that use. We test for input lag using a special tool, and we measure the input lag at different resolutions, refresh rates, and with different settings enabled to see how changing the signal type can affect the input lag.
Yes, we all know that CRTs have the lowest input lag of any display ever made. The problem with this longstanding fact is that any non-CRT screen is shunned by smashers as if they might contract the bubonic plague instantly upon use.
There has been talk of LCD monitors being used at upcoming international events such as Evo. This talk, as expected, has elicited a negative reaction from many smashers. The primary goal of this article is to outline just how close these monitors can get to the response time of a CRT, what the effect of lag is on gameplay, and to dispel some myths people have about LCD monitors.
In order to determine if a particular monitor lags, people will often try the subjective strategy. That is, they ask a smasher to play on a monitor and comment on how it feels. This strategy usually results in comments such as:
Fantastic responses! We have determined nothing. One major problem here is that smashers tend to placebo hard as soon as something that isn’t 2 feet in depth is placed in front of them. The other major problem is that humans are actually really bad at perceiving fraction-of-a-second differences.
With that said, it makes sense to attempt an objective strategy. With some help from Mofo, I developed a method for objectively testing lag on any type of screen. One of the beautiful things about this method is that it actually uses Melee itself as part of the test. This means that we can say for a fact that every single possible source of lag has been accounted for.
As players, the lag we are sensitive to is the time between when an input is pressed to the time when the game’s reaction to that input appears on screen. In simpler words, lag is when stuff happens later than it should.
The game itself, however, cannot physically react to inputs immediately when they are received. This is because the game operates in discrete frames. For example, a character’s move can only ever begin on a frame, it cannot begin at any point in between. That said, players can and do press buttons in the time between frames. This means that if the A button is pushed to jab, the game will only begin the jab startup in a variable amount of time between 0 ms and 16.66 ms (length of one frame) – it will not start the jab immediately upon receiving the input.
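This frame quantization can be sketched in a few lines. A minimal illustration, assuming the game's 60 fps rate; the function name is invented for this example:

```python
# An input landing between frames is not acted on until the next frame
# boundary, adding a variable 0-16.66 ms before any monitor lag is
# even considered.
import math

FRAME_MS = 1000.0 / 60.0  # Melee runs at 60 frames per second

def quantization_delay_ms(press_time_ms):
    """Time from the button press until the next frame boundary."""
    next_frame = math.ceil(press_time_ms / FRAME_MS) * FRAME_MS
    return next_frame - press_time_ms

# A press 5 ms into a frame waits ~11.67 ms before the game reacts;
# a press exactly on the boundary is handled immediately.
print(round(quantization_delay_ms(5.0), 2))  # 11.67
print(quantization_delay_ms(FRAME_MS))       # 0.0
```

This built-in variable delay is the baseline on top of which any constant monitor lag is added.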
The way this reality interacts with button combinations is rather interesting. It means that even if, on two separate occasions, the same two buttons are pressed with precisely the same delay in between, the same result is not guaranteed.
Players understand this margin of error and time their inputs accordingly. In the above graphic, this would mean adding some time between button presses to guarantee the timing is always met.
The point to make here is that, even on a monitor that is lagless, lag still exists in some way. A fully lagless experience is impossible. One way or another, players are capable of dealing with some lag. Lag induced by a laggy screen, however, is an added constant on top of this variable lag – the effect of which will be explored later.
The original concept for determining the latency of a monitor was to somehow detect the time difference between when an input is pressed to when a particular frame shows up on screen. Unfortunately, this approach is iffy at best for the reasons described in the previous section. The result of such a test would be (true lag) + (time until next frame) where time until next frame is between 0 and 16.66 ms. Luckily, there is another signal that is separate from the video and yet is related to it time wise – sound.
Let’s consider Captain Falcon standing on FD about to Falcon Punch. When the B button is registered by the console, the console will send the video frame information consistent with displaying a Falcon Punch as well as the sound information containing the famous words “FALCON PUNCH!”. The sound and the video will always be sent out by the console at the same times relative to each other, irrespective of where the input landed in the subframe region.
When the video signal reaches the display, it is first processed, and then displayed. This processing time is effectively the lag of the monitor – it will cause the two signals, audio and video, to appear desynchronized. It is the amount of time of this desynchronization that will be measured to determine lag.
When a player pauses a match in Melee, two events happen nearly simultaneously: a white decal surrounds the screen and a high-pitched sound is played, both denoting the pause has happened. These are both very strong and easy to recognize signals.
The audio output of the console is hooked up to an Arduino. The Arduino lights an LED upon the detection of an audio signal. This effectively turns an audible signal (sound) into a visible signal (light).
When this test is executed on a CRT, the gold standard for response time, a time difference is obtained from step 4. This time difference is the expected value required if another system is to be called truly lagless.
When this test is executed on a laggy monitor, the time difference will be greater than what was seen on the CRT. The audio output will be detected at the same time, but the pause decal will show up some time later. The lag of this TV can then be calculated by the formula: lag = (time difference measured on the test monitor) − (time difference measured on the CRT).
There are a few sources of error in the testing method. In order to improve the accuracy of the results, the test was executed multiple times on each monitor and the results averaged. That said, there is likely still about plus or minus 1 ms of error for the LCD results.
CRTs have less error associated with their measurement because determining when the video signal has appeared is less subjective. On an LCD, as can be seen in the gif above, the decal shows up in an incomplete fashion before the signal is accepted. This is done to provide a fairer comparison to the CRT – on which the top part of the decal is instantly fully clear.
If you have ever been introduced to the website www.displaylag.com, you might wonder how it is possible for the results to be so low with the RL2455HM.
The RL2455HM monitor displays a frame from top to bottom. This method of showing a frame is identical to how a CRT displays a frame. Display lag database uses the average latency across three zones (top, center, and bottom). Using this metric, even a CRT would not have zero lag – it would have 8.3 ms of lag. This is because it takes a full 16.66 ms to display the entire frame from top to bottom. In this particular case and in many others, when comparing to a CRT it is more fair to subtract about 8 ms from the number reported by Display Lag.
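The zone-averaging argument above can be sketched numerically. A minimal illustration under simplified assumptions (uniform top-to-bottom scanout at 60Hz, three evenly spaced measurement zones); the function name is invented for this example:

```python
# Averaging top, center, and bottom readings charges even an "instant"
# top-to-bottom scanning display half a frame of lag.
FRAME_MS = 1000.0 / 60.0  # one 60Hz frame takes 16.66 ms to scan out

def zone_average_lag_ms(processing_lag_ms):
    """Average latency across top, center, and bottom zones.

    On a top-to-bottom scan, the top appears right after processing,
    the center half a frame later, and the bottom a full frame later.
    """
    zones = [processing_lag_ms + offset * FRAME_MS for offset in (0, 0.5, 1)]
    return sum(zones) / len(zones)

# Even an ideal CRT with zero processing lag scores half a frame:
print(round(zone_average_lag_ms(0.0), 1))  # 8.3
```

This is why subtracting roughly 8 ms from a zone-averaged figure gives a fairer comparison against a CRT.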
Because CRTs actually take time to display a whole frame, it is technically possible for a flat screen monitor to appear faster than a CRT. Given a small initial delay such as the RL2455HM + LGP and a faster refresh rate, it may be possible for the center of a frame and certainly the bottom of a frame to appear on screen earlier than it otherwise would on a CRT.
It is important to make a distinction between events that are affected by monitor lag and events that are not. Events that are executed via muscle memory timing, pressing one button at the correct timing after another, such as a wavedash are very easy to execute even on very laggy monitors because they do not utilize much visual prompting.
Let’s consider a person who can successfully power shield a laser 95% of the time. Assuming that human reaction follows a Gaussian pattern, a Gaussian response that could meet this success rate has a mean at 1 frame before the laser hits and a variance of about 72.25 ms² (a standard deviation of 8.5 ms). Introducing the lag of the monitor, and assuming that the distribution is shifted over by an amount equal to the lag, the probability of a successful power shield only drops to 93.7%.
Now a 95% success rate on a power shield is rather good. Let’s assume the person is still good but not super human – they have a lower success rate of 50% caused by an increase in variance. Given the same amount of lag, their success rate only drops to 49.7%. The takeaway from this is that given a higher variance, larger variation in a person’s ability to respond in a given amount of time, the effect of monitor latency diminishes.
It is also possible for the mean to not be perfectly centered along the target area. For example, consider a person has the same variance as described in the first example – a variance which signifies the person is quite proficient at hitting a 2 frame window. Now consider that this person tends to power shield late, late enough that their success rate is only 62.3% on a lagless system. With the lag added, this person’s success rate would drop to 49.1%. This scenario is just about the worst case given this kind of variance. The best case scenario is when the lag actually helps the player. If the same person had a tendency to hit early instead, their success rate would actually increase from 62.3% to 74.2%.
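The figures in these examples can be reproduced with a standard normal CDF. This is a sketch, assuming a 2-frame (33.33 ms) window with the mean measured relative to its center, a standard deviation of 8.5 ms (variance 72.25 ms²), and a 2.86 ms monitor; the ~14 ms late offset is inferred here to reproduce the 62.3% figure, since the article does not state it explicitly.

```python
# Reproduce the power-shield success rates under a Gaussian reaction model.
import math

FRAME_MS = 1000.0 / 60.0  # half of the 2-frame window
SIGMA = 8.5               # sqrt(72.25)

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def success_rate(mean_offset_ms, monitor_lag_ms):
    """P(reaction lands in the window); offset>0 means a late tendency."""
    shifted = mean_offset_ms + monitor_lag_ms  # lag shifts the whole curve
    half = FRAME_MS
    return normal_cdf((half - shifted) / SIGMA) - normal_cdf((-half - shifted) / SIGMA)

# Centered mean: 95% lagless, 93.7% on a 2.86 ms monitor, 50% at a full frame.
print(round(100 * success_rate(0.0, 0.0), 1))       # 95.0
print(round(100 * success_rate(0.0, 2.86), 1))      # 93.7
print(round(100 * success_rate(0.0, FRAME_MS), 1))  # 50.0
# A habitually late player (~14 ms offset): 62.3% drops to 49.1% with lag,
# while the same tendency toward early rises to 74.2%.
print(round(100 * success_rate(14.0, 0.0), 1))      # 62.3
print(round(100 * success_rate(14.0, 2.86), 1))     # 49.1
print(round(100 * success_rate(-14.0, 2.86), 1))    # 74.2
```

Under this model, the lag hurts most when it pushes an already off-center mean further from the window, and can even help a player who tends to be early.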
Notice that in the example where the percent of success dropped from 62.3% to 49.1%, the person was not extremely proficient at hitting the window to begin with. In contrast, when the success rate was 95% to start, the percent of success dropped a very small amount. A person proficient at hitting a 1 or 2 frame window either has a mean that is very close to centered on that target window, or has a very small variance. That said, there is a limit to how small a human’s variance can be. If a person has a 95%+ success rate hitting a 1 or 2 frame window, it is likely safe to assume it is caused by a well placed mean. Hence, players that can hit these timings very often will be very minimally affected by the added latency.
Now let’s consider what happens if the lag is a bit worse: a monitor that is slow by one full frame, 16.66 ms. In the first example with the 95% success rate, that person’s rate on this monitor would drop all the way to 50%. That is, 93.7% on a 2.86 ms monitor, 50% on a 16.66 ms monitor. This highlights the fact that there is a major difference between a monitor that is pretty good and one that is very good. Most monitors that people have tried would likely fall under the “pretty good” category at best. Do not allow past experiences with other monitors to influence your conception of these “very good” monitors.
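These figures are easy to reproduce. Below is a minimal Python sketch under the assumptions stated above: a Gaussian response with a standard deviation of 8.5 ms (variance 72.25 ms²), a 2-frame (≈33.3 ms) power shield window, a mean one frame into that window, and monitor lag modeled as a pure shift of the mean. The function names are mine, not from any published tool.

```python
from math import sqrt
from statistics import NormalDist

FRAME_MS = 1000 / 60  # one frame at 60 fps, about 16.67 ms

def success_rate(mean_ms, sigma_ms, window_ms=2 * FRAME_MS, lag_ms=0.0):
    """Probability the (lag-shifted) response lands inside the timing window."""
    dist = NormalDist(mean_ms + lag_ms, sigma_ms)  # lag pushes the whole distribution later
    return dist.cdf(window_ms) - dist.cdf(0.0)

sigma = sqrt(72.25)  # variance of 72.25 ms² -> standard deviation of 8.5 ms
mean = FRAME_MS      # mean one frame into the 2-frame window

print(f"lagless:       {success_rate(mean, sigma):.3f}")             # ≈ 0.950
print(f"2.86 ms lag:   {success_rate(mean, sigma, lag_ms=2.86):.3f}")   # ≈ 0.937
print(f"one frame lag: {success_rate(mean, sigma, lag_ms=16.66):.3f}")  # ≈ 0.500
```

The 50% figure for a one-frame-slow monitor falls out directly: shifting the mean by 16.66 ms pushes it to the very edge of the 33.3 ms window, so half the distribution lands outside it.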
All the calculations in this section were made under the assumption that the human does not adapt to the new lag. It may also be possible that the brain notices the slight offset and corrects to some degree. If the brain does do some correction, then the difference would be even smaller than described.
By the reasoning outlined in the previous section, minor lag does not appear to be a major factor for player performance. That said, I ask the reader, have you ever heard someone claim that some CRTs lag? Why do people think this? In my test results, CRT 2 is a 14 inch CRT. I have heard many negative comments about CRT 2. People just don’t seem to like it, often claiming that it lags. As shown by the results, the lag difference is essentially non-existent – it is well within the error of the test. So then, why do people not like it?
My theory is that people are also sensitive to image distortion. CRT 2 has a very clear image, but being a small monitor, it has a rather rounded screen. This rounded screen causes the image to appear somewhat distorted. This is very minor and difficult to notice but it may be the cause for the hate it has received.
The takeaway here is that lag is not the only problem with a screen. When an image looks different than another, it can throw a player off. This precise issue leads to one of the most powerful arguments for having a pro-LCD position. All the screens are the same. No more swapping back and forth between small CRTs and large CRTs. No more old, ugly, discolored CRTs. No more terrible terrible audio. The same image – same experience – every time.
It is true that very fast monitors such as the RL2455HM have some problems. Periods of fast movement can lead to minor ghosting. But overall the image quality is extremely good. After a few hours of using one, I fully expect a player to be used to it and be capable of ignoring any of its image defects.
We have seen that major tournament hosts and companies are reluctant to use CRTs. Maybe having an assortment of unique, archaic TVs gives their venue an unprofessional look. Maybe obtaining CRTs from the community is a hassle. Regardless of the reason, it is certainly a point which weakens these entities’ desire to host smash events.
That said, maybe they will accept our CRTs this year. But what about the next? And then the year after that? I expect if you are reading this you have a desire to see smash grow. CRTs are dead technology. Can we not adapt to changing technology? What kind of image does that portray to people that are not part of our scene?
There’s no game quite like Melee. The fluidity of movement and the execution skill cap enable a brilliant form of art we’ve come to worship. The love is real; the potential for growth is now.
Everyone has noticed the growth which our exposure at Evo provided. These big events are paramount to our continued growth. If dealing with an extremely small amount of delay helps aid that cause, how can you not support it? This small amount of lag, by the way, is bound to reduce even further as the technology improves. Maybe our exposure at these big events and our willingness to try new technology will encourage companies like BenQ or Asus to come out with new monitors that are even better for our use – monitors with support for native component input, or even maybe composite inputs.
To those that own these setups or plan on getting one, I implore you to configure them properly and invite people to try them. For those that haven’t tried them, I encourage you to give them a fair chance. Who knows? You may find that these convenient monitors are not so bad after all.
For gamers, display lag is a very real concern, and display processing lag is nebulously reported, if at all, for just about every LCD on the market. Ultimately, what matters isn’t GTG, full-on, or full-off pixel response times, or what’s reported on the spec sheet, but the holistic latency of the monitor compared to something we can all agree is lag-free. We previously used a baseline LCD as our benchmark of no display lag; we’ve since started using a 17” Princeton EO700 CRT, which supports 1024x768 at 85 Hz.
To do these tests, we connect the CRT to a DVI-to-VGA adapter on our test computer’s ATI Radeon HD5870, and the LCD panel under test to DVI using an HDMI-to-DVI cable. I debated for some time the merits of using the same VGA signal for both; however, what really matters here is how the two displays perform in the way that you, readers, are most likely to set things up. In addition, using the VGA input on any LCD is bound to add additional lag, as going from analog to digital signaling is definitely a hardware scaler operation, compared to the entirely digital DVI datapath. We run the CRT at 1024x768 and 85 Hz, its highest refresh rate, and clone the display to the LCD panel.
We use the same 3Dmark03 Wings of Fury benchmark on constant loop, take a bunch of photos with a fast camera (in this case, a Nikon D80 with a 17-50mm F/2.8) with wide open aperture for fast shutter speeds, in this case up to 1/800 of a second. Any differences on the demo clock will be our processing lag, and we’ll still get a good feel for how much pixel response lag there is on the LCD. As I mentioned earlier, the only downside is that this means our old data is no longer a valid reference.
To compute the processing lag, I do two things. First, I watch for differences in the clock between the CRT and LCD, noting these whenever they are visible. I did this for 10 captures of the same sequence. Second, one can compute the processing difference by taking into account the FPS and the frame number difference: the time between two frames is simply the frame-number difference divided by the frame rate.
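As a concrete sketch of that second method, the arithmetic is just the frame-number difference divided by the frame rate. The numbers below are hypothetical, not taken from the test captures:

```python
def processing_lag_ms(frame_delta: int, fps: float) -> float:
    """Time difference implied by a frame-number difference at a given frame rate."""
    return frame_delta / fps * 1000.0

# hypothetical example: two frames apart while the benchmark renders at 700 fps
print(f"{processing_lag_ms(2, 700.0):.3f} ms")  # 2.857 ms
```

This is why the frame-number method resolves differences far smaller than the on-screen clock, which only ticks in hundredths of a second.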
Of course, not every one of those frames is written to the display, but we can still glean how much time difference there is between these respective frames with much more precision than from averaging the clock readout, which only reports down to 1/100th of a second. An example shot of what this difference looks like on the X270W is the following:
There’s an interesting trend emerging already, and we’ve only got two data points. First off, it’s obvious by now from doing these tests that relying on the time counter at the bottom of the 3Dmark 03 window is relatively unreliable - you either get 10 ms of difference (.01 seconds), or no difference at all. It’s very binary since the processing lag we’re looking for is effectively below our sampling rate, and as a consequence it takes a lot of these points to get data (I averaged 15). On the other hand, it’s very easy to weight the frame difference by FPS and compute the time between, and that tells a different story with greater precision. From those metrics, it’s apparent that the X270W does have lower processing lag than the G2410H. The difference is slight, however, at 2.84 ms - way under what the human eye can perceive - but a difference nonetheless.
We still don’t get near the 2 ms response quoted by Sceptre, but being roughly 6 ms slower than the CRT is pretty darn good, so good that I honestly don’t think it’s humanly possible to tell the difference.
LCD performance still isn’t technically at parity with CRTs, but you’d be hard pressed to tell the difference. There’s still a visible ghosting image before and after the primary frame, visible in the photo above. This is something virtually all the LCDs we’ve tested exhibit, but in practice the ghosting isn’t discernible at all.
I consider myself an avid PC gamer and threw the X270W at FPS, RTS, and RPG titles alike and never noticed ghosting or any perceptible lag, ever. I think it’s more than fair to say that the X270W is a worthy choice for gamers that are generally very discerning about their input lag. By the numbers, the X270W is the best we’ve tested with our new methods thus far, but then again we’ve only got two data points.
When you have a setup that includes a projector, you have the projector itself and the input source that tells the projector what to display on the screen. There are several ways that your projector can lag, including the following:
Input lag is the delay between the action requested by a remote or any device that you are using and your projector. For example, if you press the pause button and it takes some time before the projector pauses, this is input lag.
Video lag occurs when you get a sudden drop in frames per second, which shows up as stuttering. It is usually a problem that comes from streaming or from your video source.
Component lag happens when the cables that connect your devices are damaged or improperly connected. This lag can slow the image down and make it glitch, or it can stop the projector from working altogether.
Any of these types of lag will make it very difficult to enjoy watching whatever is playing on your screen. It can make the images stutter and glitch, and you will want to fix it.
Most of the time, you can measure the input lag in a projector in milliseconds. Projectors come with this spec, and it lets you know the amount of time between the projector receiving the image and actually displaying it.
Input lag is measured at a specific resolution and refresh rate, and manufacturers will usually tell you the conditions under which the figure was obtained. If the projector has a higher refresh rate, it will likely deliver smoother performance.
How much input lag matters depends on how you use your projector. For gamers, it is important to have an input lag of under 40ms, while people who use the projector to watch movies can tolerate an input lag of 50ms or more.
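One way to put those millisecond thresholds in perspective is to express lag as a number of refresh cycles. A small illustrative sketch (the function name is mine, not an established metric):

```python
def lag_in_frames(lag_ms: float, refresh_hz: float) -> float:
    """Express an input-lag figure as a number of refresh cycles."""
    frame_time_ms = 1000.0 / refresh_hz  # duration of one refresh cycle
    return lag_ms / frame_time_ms

# the same 40 ms threshold costs more frames on a faster display
print(round(lag_in_frames(40, 60), 2))   # 2.4 frames at 60 Hz
print(round(lag_in_frames(40, 120), 2))  # 4.8 frames at 120 Hz
```

The same absolute delay represents more missed refresh cycles at higher rates, which is one reason competitive gamers care about lag figures that movie watchers barely notice.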
The first thing that you need to do is determine what is causing the lag. You can find out what is causing it by testing the projector and trying to find out when the lag occurs.
You need to determine whether the lag is caused by the input, the video, or the component. Then, you can investigate further to find out what is causing the lag.
It is important that you aren’t trying to project an image that has a higher quality than the projector is capable of handling. If you do, you will experience lag.
Another cause of input lag can be the room temperature. Check that your projector hasn’t overheated, because when it gets too hot, its performance can degrade.
Finally, you should check to make sure that your device is compatible with the projector. You can find a copy of your user manual to double-check what devices are compatible with your projector.
Another way to prevent lag is to ensure that your streaming device is up to date. You should check to see that you have installed the latest software updates before you try to stream a movie.
In addition, you may need to install the latest drivers and configure your device to work with the projector. Make sure that your video file isn’t corrupted as well.
You need to keep your projector clean and avoid eating or drinking anything near the projector or any of your other devices. Electronic devices don’t do well if they get wet.
Most importantly, make sure you aren’t trying to project an image at a higher quality than the projector can handle. Doing so will lead to lag and other types of distortion.
Input lag can be very frustrating because it ruins the movies and videos that you are trying to project on your big screen. It can be caused by different factors, but you can take steps to fix this problem.
You need to determine the cause of the lagging, and then you can correct the problem. Go through the list to make sure that you have your projector set up the right way.
Check your connections and look at your cables; you may need to replace them. Be sure to check your streaming device to see if there is any reason that it might have a lag.
The lag can be caused by the connection, the input device, the video device, and more. It is disruptive when you are trying to watch a movie, and it can make it impossible to play a game.
Adjust until the dark and light squares appear as close to exactly the same size as possible. Once this is done, the result is your display's measured "Moving Picture Response Time" (MPRT) / measured "Motion Clarity Ratio" (MCR).
MPRT is not the same thing as GtG. See FAQ: GtG versus MPRT. A different animation is TestUFO: GtG versus MPRT. Moving Picture Response Time (MPRT) is display persistence. GtG is the pixel transition time, while MPRT is pixel visibility time. MPRT can still create a lot of display motion blur even if pixel response (GtG) is instant.
MPRT is a more accurate representation of visible motion blur (see Eye Tracking Motion Blur Animation demo). For the scientifically ideal instant-response sample-and-hold display, MPRT is exactly equal to the time period of one refresh cycle.
For the scientifically ideal impulse-driven display (e.g. square-wave strobe backlight), MPRT is exactly equal to strobe flash length. (see Black Frames Insertion Animation).
The more scientific term, Moving Picture Response Time (MPRT), is the one found in scientific papers on Google Scholar. For more information about display persistence, see Blur Busters Law.
Motion Clarity Ratio (MCR) is equal to 1000 divided by MPRT. Similar terms are sometimes used by TV manufacturers ("Clear Motion Ratio", "Motion Clarity Index", etc.) to represent an equivalence to a refresh rate. Techniques such as frame interpolation and impulse driving (scanning backlights, strobing) frequently combine to create higher Motion Clarity Ratios (MCR). An MCR value represents the same perceived display motion blur as an ideal sample-and-hold display refreshing at a Hz matching that value.
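The definitions above reduce to simple arithmetic. A small sketch, assuming the scientifically ideal cases described (instant-GtG sample-and-hold persistence equal to one refresh period, and MCR = 1000 / MPRT):

```python
def mprt_sample_and_hold_ms(refresh_hz: float) -> float:
    """Ideal instant-GtG sample-and-hold display: MPRT equals one refresh period."""
    return 1000.0 / refresh_hz

def motion_clarity_ratio(mprt_ms: float) -> float:
    """MCR = 1000 / MPRT: the equivalent sample-and-hold Hz for the same blur."""
    return 1000.0 / mprt_ms

print(round(mprt_sample_and_hold_ms(120), 2))  # 8.33 ms of persistence at 120 Hz
print(motion_clarity_ratio(1.0))               # a 1 ms strobe flash scores MCR 1000
```

This is why a strobe backlight with a 1-2 ms flash can look as clear in motion as a hypothetical 500-1000 Hz sample-and-hold display.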
- For more accurate measurements, do both black/white and inverse white/black tests, and average the results. LCD GtG pixel transitions are often asymmetric.
- A larger "Size" setting allows a more accurate measurement. However, faster displays and higher refresh rates are easier with a smaller "Size" setting.
- With lower-persistence displays and/or high refresh rates, try using smaller Checkerboard Size numbers. Avoid Thickness values bigger than Checkerboard Size.
- Strobe backlight technologies (e.g. ULMB and LightBoost) often have MPRT values of 1ms to 2ms, and may require Checkerboard Size of 1 or 2 to measure.
TV manufacturers have been trying to combat something called "motion blur" for years. You may have noticed the blur before and not been able to put your finger on what exactly was so bothersome about it. Or you may be enjoying watching television in blissful ignorance, never even realizing that your TV looks soft. Sorry in advance for ruining your viewing experience, but there are a few potential solutions to consider. However, these methods often have side effects that, for many people, are worse than the cure.
High refresh rates and motion smoothing are just the beginning; numerous other anti-blur technologies are built into modern TVs. Learning the pros and cons of each should help you get a TV image you're happier with. Or at least, happier than before I ruined TVs for you.
Motion blur is when anything on-screen blurs, becoming fuzzy and less distinct, when it moves. This can be a single object, like a ball or car, or the entire screen, as when the camera pans across a landscape.
I always notice it when there's a closeup of a face, and then the person turns away. One second you're seeing every eyelash and wrinkle, the next it's a blurry mess.
In the early days of flat TVs and displays, the culprit was often the slow speed of the liquid crystal elements that create an image on an LCD TV. These days most LCDs can change their states fast enough that motion blur is caused by something else: "sample and hold."
LCDs -- and modern OLED TVs -- configure their pixels to show an image and then hold that image until the screen refreshes. With most TVs this means that for a full one-sixtieth of a second, the image is stationary on screen. Then the screen refreshes and a new image is held there for another one-sixtieth of a second. Some TVs have faster refresh rates, and in some countries TVs refresh every one-fiftieth of a second, but the process is the same.
Sixty still images every second is fast enough to exceed your brain's flicker fusion threshold. You don't see still images, you see fluid motion. However, your brain is working fast enough that it's expecting to see motion during those hold times. The images are held long enough that your brain assumes anything in motion is going to continue being in motion… but it isn't. It's actually stationary and then jumps to the next position, which is also stationary.
Your brain and eyes, expecting smooth motion, blur the object by moving to follow where it should be. The physiological reasons behind this are beyond the scope of this article, but the key aspect is that motion blur is in your head (isn't everything?), which is important when it comes to discussing how we get rid of it.
Higher refresh rates don't, in and of themselves, fix the motion blur problem. The images are still being held, and if you just double the number of still images to fit 60 into 120 you haven't really changed anything. You need something to change to, and that's when things get interesting.
The processing in modern TVs can determine, with a surprising amount of accuracy, what happens in between two frames of video. For instance, if a ball is on the left side of the screen in frame A, and the right side of the screen in frame B, the TV could safely assume that if there was a frame between A and B, the ball would be in the center of the screen.
A 120Hz TV determines what this "AB" frame would look like, then inserts it between frames A and B. This means there are more frames to switch between, and less time "held" on each frame. This is called frame or motion interpolation. With video content like sports, a new frame is inserted between every original frame, and the result is less motion blur and greater apparent detail. With movies and scripted TV shows, however, there's a problem.
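As a toy illustration of interpolation, here is a naive per-pixel blend. Real TVs use motion-vector estimation rather than blending; this sketch only shows why:

```python
def interpolated_frame(frame_a, frame_b, t=0.5):
    """Naive per-pixel blend between two frames (t=0.5 gives the midpoint)."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

# 1-D "frames": a bright pixel moves from the left edge to the right edge
frame_a = [1.0, 0.0, 0.0]
frame_b = [0.0, 0.0, 1.0]
print(interpolated_frame(frame_a, frame_b))  # [0.5, 0.0, 0.5]
```

Note the blend leaves a half-bright ghost at both positions rather than a single pixel in the middle. That is exactly why a TV estimates where the ball is moving and draws it in the center of the screen for the inserted frame, instead of just averaging frames A and B.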
Nearly every movie and nonreality TV show is recorded at a frame rate of 24 frames per second. This goes back to when nearly everything was shot on film. Though the early days had a variety of frame rates, Hollywood settled on 24, and it has been that way for decades.
Interpolating frames increases the apparent frame rate, so 24fps content no longer looks like 24fps content, because when shown on these TVs, it isn't 24fps content. The interpolation effectively increases the frame rate so 24fps content looks more like 30 or 60fps. More like sports, reality TV or the content that gives this effect its name: the soap opera effect. That's where our friend Tom comes in: "I'm taking a quick break from filming to tell you the best way to watch Mission: Impossible Fallout (or any movie you love) at home." pic.twitter.com/oW2eTm1IUA — Tom Cruise (@TomCruise), December 4, 2018
Many people don't notice, or don't care about, the soap opera effect. Others, like Tom and me, can't stand it. The ultrasmooth motion is not just artificial-looking, but can be distracting and unpleasant. Most Hollywood creators hate it, too, because it isn't what the director intended for his or her creative vision. If they wanted to record at 48fps, they'd have recorded at 48fps.
Fortunately, most TVs not only give you the option to turn it off, but let you adjust how intense the frame interpolation is. So instead of a created frame that's halfway between A and B, maybe it's only slightly different from A or slightly different from B. If your TV has this adjustment, it's worth playing with to see if you can find a setting that reduces motion blur enough that you're not bothered by it, but isn't as intrusive as the more intense frame interpolation modes. Some TVs even separate out the processing that reduces the judder of 24fps content.
This, too, has its history in cinema. Though filmed at 24 frames per second, movies weren't shown at 24 frames per second. This was slow enough that some people saw the flicker. Instead, each film frame was shown twice, with a shutter blocking the light in between. Some cinemas went even further, showing each film frame three times. This blanking was a simple way to give some of the "performance" of a higher frame rate without the cost of additional film stock.
With black frame insertion, there's less "hold" in the sample-and-hold. It fools your brain better into thinking there's smooth motion. Once again, however, there are a couple of downsides.
When the TV spends half of its time showing a black screen, its light output drops. In many cases this trade-off is acceptable, as modern TVs are exceptionally bright. In other cases, not as much. I have a front projector, for example, and the BFI mode can make the image look very dim.
There's also the potential for visible flicker, as the TV essentially flashes on and off with the inserted black frames. CNET's TV reviews often find that the flicker from BFI is too intense to be worth the improvement in motion blur.
Like frame interpolation, black frame insertion has different implementations. Rarely would a TV with a BFI mode show a black frame for the same length of time it shows a real frame. It's also not necessarily a "frame" at all: all LCDs create light with a backlight, so a TV can simply switch the backlight off briefly instead of drawing an actual black frame.
There are also levels of how "black" the black frame is. A 120Hz TV could insert a frame that's a duplicate of the previous frame, but darker. Not "black," just dimmer. There are pros and cons to this method, too. Not as much light is lost, but perhaps the motion doesn't seem quite as sharp.
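The fully black and merely dimmed variants can be sketched as one sequence transform; `dim` here is a hypothetical knob, not a real TV setting name:

```python
def insert_dark_frames(frames, dim=0.0):
    """After each real frame, insert a copy with every pixel scaled by `dim`.

    dim=0.0 gives classic black frame insertion (roughly half the light output);
    dim=0.5 gives a dimmer duplicate: less light lost, less blur reduction.
    """
    out = []
    for frame in frames:
        out.append(frame)
        out.append([pixel * dim for pixel in frame])
    return out

frames = [[1.0, 0.8]]  # one tiny two-pixel "frame", values are brightness
print(insert_dark_frames(frames))           # [[1.0, 0.8], [0.0, 0.0]]
print(insert_dark_frames(frames, dim=0.5))  # [[1.0, 0.8], [0.5, 0.4]]
```

The trade-off in the article falls out of the one parameter: the closer `dim` is to zero, the less "hold" your eyes see but the more total light the TV gives up.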
The only two flat-panel TV technologies available today, LCD and OLED, both suffer from motion blur. However, there is still one display technology that doesn't:
Currently only found in front projectors, Digital Light Processing uses millions of tiny mirrors that rapidly flash on and off to build an image on a screen.
If you loathe motion blur, though, this is easily the best option. I am a projector proselytizer, but it's definitely a lifestyle choice. You'd really, really have to hate motion blur for this to be the reason you switch.
Many new TVs, especially midrange and high-end models, have some adjustability in how they handle motion blur. Hopefully, if motion blur bothers you, you can find a setting that works for you without annoying the rest of the family.
I have long loathed motion blur, being far more aware and annoyed by it than my peers. Since I also hate the soap opera effect, the only current option for reducing motion blur on my current projector is black frame insertion. And after a few months… I turned it off. The trade-off of a dimmer picture, and a just-noticeable flicker, was no longer worth the better apparent detail.
I'm not telling you to just give up, fellow blur haters. If you've had your TV for a while and just can't get past the motion blur, definitely try the various settings mentioned above. If you've gotten a new TV, perhaps upgrading from an old plasma or DLP rear-projection TV, see if any of the settings give you relief. If not, give it a bit of time and see if you get used to it. Hopefully you will.
The size and shape of displayed images. Typical HD/UHD content is 16:9 widescreen, while movie theaters use 2.35:1. Older standard-definition content from before HD was 4:3; widescreen material shown on 4:3 displays was “letterboxed” with black bars.
The professional adjustment of a projector or display to accepted standards such as Adobe RGB, DCI-P3, Rec. 709, and Rec. 2020. If a projector offers good calibration, the image you get on your screen stays very close to the source material, or to the way filmmakers and TV producers wanted the work to be viewed by audiences. In other words, accurate color means faithful reproduction of content. A poorly calibrated projector may depict a red car as orange, as an extreme example. You certainly don’t want that.
Since color is effectively energetic light, temperature determines what we see. Temperatures are measured in kelvins, with almost all content in the 4000-7000K range. You may encounter the term “D65”, which refers to the temperature of daylight and serves as a reference in many color spaces, including Rec. 709. D65 measures approximately 6500 kelvins.
The range and depth of color displayed. Different devices have varying color capabilities, essentially the number of colors they can show (even if the human eye and brain can’t perceive all of them). Examples include DCI-P3, Rec. 709, and Rec. 2020.
Component responsible for generating color in projectors. Basic color wheels have just three segments: red, green, and blue. Having multiple segments of each primary color helps produce richer colors and a wider color gamut, so good projectors have RGBRGB color wheels (two segments per primary color). Variations exist, such as RGBW (which adds a white-dedicated segment) and RGBCWY (red, green, blue, cyan, white, yellow). Color wheels with high primary color purity offer the best color performance.
The difference between white and dark. Higher contrast helps create more dramatic, impactful images. Contrast ratios describe the ratio of the brightest white to the darkest black, for example 30,000:1.
DCI-P3 is a specialized color gamut developed for digital movie projection in professional cinemas. It is wider than Rec. 709, especially in the green and red spaces; that is to say, DCI-P3 produces more nuanced greens and reds, while blues generally equal what you get with Rec. 709.
Digital light processing, a technology that uses millions of micromirrors to generate a precise image sent by a powerful light source. Compared to other projection technologies like LCD, DLP mechanisms resist dust build up and don’t require complex filters that end up degrading image quality. Most importantly, the mirrors used in DLP are extremely long-lasting, while LCD panels deteriorate much more quickly. Even if the light source (lamp) requires replacement, image quality remains at peak with DLP, while with LCD quality worsens over time regardless of how new the lamp may be.
Digital micromirror device, a precision-crafted component that has millions of tiny mirrors working together with a processor to allow projection of images in resolutions up to true 4K.
A very useful mechanism in select projectors, integrated between the projector lamp and the lens, that opens and closes depending on the overall brightness of the projected image to adjust light output. It fine-tunes the projected image by letting the projector enhance contrast performance, resulting in accurate dark scenes and optimized bright areas, thus preventing the loss of image detail.
The number of times a display refreshes its image per second, measured in hertz (Hz). The higher the rate, the smoother video appears. TV and movie content usually runs at 24 frames per second, while video games typically run at 30 or 60fps and increasingly go up to 144fps.
High-Definition Multimedia Interface, the most common home media connector and cable standard since the mid-2000s. HDMI 2.0 or higher is required for 4K UHD at 60fps.
Hybrid Log-Gamma (HLG) is a version of HDR that was jointly developed by NHK and the BBC. Unlike other forms of HDR, HLG does not use metadata, which means it is compatible with both SDR and HDR displays.
The delay between an image being sent to a projector and the same image being displayed on screen. In the context of gaming, total latency also includes the additional delay between the controller and the screen (also known as controller latency). Measured in milliseconds; input lag of over 40ms can make fast-paced video games feel unresponsive and may cause lip-sync issues when watching movies and TV.
When a projector isn’t properly aligned in front of a screen, the image stops being rectangular and loses its aspect ratio. This is fixed by (automatic) keystone correction on select projectors.
Liquid crystal display, one of the leading technologies in TVs and monitors.