Enter the world of gaming and you will hear the terms refresh rate and frame rate thrown about frequently. To explore brand new alien landscapes and capture enemy lines in smooth, seamless movements, gamers seek out equipment that delivers ultra-fast refresh rates and super-high frame rates.
Continue reading to learn about how Adaptive Sync prevents screen tearing and game stuttering for the smoothest gameplay possible. Or discover ViewSonic ELITE’s range of professional gaming monitors equipped with the latest sync capabilities.
However, no matter how advanced the specifications are, the monitor’s refresh rate and the graphics card’s frame rate need to be synced. Without that synchronization, gameplay is marred by screen tearing and judder. Manufacturers such as NVIDIA, AMD, and VESA have developed display technologies that sync frame rates and refresh rates to eliminate screen tearing and minimize game stuttering. One such technology is Adaptive Sync.
To understand what Adaptive Sync is, we need to first understand what causes screen tearing, game stuttering, and input lag, and how Adaptive Sync resolves them.
Traditional monitors refresh their image at a fixed rate. When a game’s frame rate climbs above that fixed rate, especially during fast-motion scenes, the monitor cannot keep up. The monitor then ends up showing part of one frame and part of the next frame at the same time.
As an example, imagine your game is running at 90 FPS (frames per second) while your monitor’s refresh rate is 60Hz: the graphics card produces 90 updates per second but the display can only show 60. This overlap produces split images – almost like a tear across the screen – which spoils the viewing experience and hampers gameplay.
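To make that mismatch concrete, here is a minimal, purely illustrative sketch in Python (not any vendor’s code, and with a deliberately simplified timing model) that counts how many of a 60Hz panel’s refreshes would contain a mid-scan buffer change when the GPU renders at 90 FPS with no synchronization at all.

```python
# Illustrative sketch only: count refreshes in one second where the newest
# completed frame changes while the panel is still scanning out (a tear),
# assuming 90 FPS rendering, a fixed 60 Hz refresh, and no synchronization.

FPS = 90                     # frames the GPU finishes per second
HZ = 60                      # fixed panel refresh rate

frame_time = 1.0 / FPS
refresh_time = 1.0 / HZ

def latest_frame(t):
    """Index of the newest frame the GPU has completed at time t."""
    return int(t // frame_time)

tears = 0
for r in range(HZ):          # one second of refreshes
    scan_start = r * refresh_time
    scan_end = (r + 1) * refresh_time
    # If a new frame completes mid-scanout, part of the screen shows the old
    # frame and part shows the new one.
    if latest_frame(scan_start) != latest_frame(scan_end):
        tears += 1

print(f"{FPS} FPS on a fixed {HZ} Hz panel: {tears} of {HZ} refreshes contain a tear")
```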
Game stuttering, or micro-stuttering, is when frames are repeated, skipped, or frozen. This usually happens when frame delivery between the GPU and your display falls out of step. Games, especially fast-paced and graphics-intensive titles, feel slow and laggy, and players experience delayed actions and sudden screen hiccups.
Input lag is often caused by a drop in FPS, when the GPU renders frames more slowly than the monitor refreshes. A drop in frame rate below your display’s refresh rate leads to game stuttering and input delays – and V-Sync itself can be a cause.
V-Sync, also known as Vertical Sync, is the original GPU technology that synchronizes the game’s frame rate to the monitor’s refresh rate and is featured in most modern graphics cards. It was developed primarily to combat screen tears.
When V-Sync is enabled, it limits the frame rate output of the graphics card to the monitor’s refresh rate. The monitor never has to handle a higher FPS than it can manage, which eliminates screen tearing. However, if the game’s frame rate drops below the monitor’s refresh rate, V-Sync causes the FPS to drop even further to stay in step with the monitor. This adds latency, which impedes performance and increases input delay.
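That extra drop can be illustrated with a short, hedged sketch. Assuming classic double-buffered V-Sync behavior (a frame that misses one vertical blank waits for the next), the displayed frame rate snaps down to whole-number divisors of the refresh rate; the render times below are hypothetical.

```python
import math

# Illustrative sketch, assuming classic double-buffered V-Sync: a frame that
# misses a vertical blank waits for the next one, so the displayed frame rate
# snaps to refresh_rate / 1, / 2, / 3, ...

REFRESH_HZ = 60
refresh_interval_ms = 1000.0 / REFRESH_HZ

def displayed_fps(render_ms):
    """Frame rate the player actually sees when each frame takes render_ms to draw."""
    intervals_waited = math.ceil(render_ms / refresh_interval_ms)
    return REFRESH_HZ / intervals_waited

for render_ms in (10, 16, 17, 25, 40):                 # hypothetical render times
    print(f"render time {render_ms:>2} ms -> {displayed_fps(render_ms):.0f} FPS displayed")
# A frame taking 17 ms (about 59 FPS of raw GPU speed) already drops to 30 FPS.
```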
In any game, different scenes demand different frame rates. The more effects and detail a scene has (explosions and smoke, for example), the longer it takes to render, so the frame rate varies. Instead of forcing the same refresh rate across all scenes, whether they are graphics-intensive or not, it makes more sense to sync the refresh rate to whatever the GPU is delivering.
Developed by VESA, Adaptive Sync adjusts the display’s refresh rate on the fly to match the frames the GPU is outputting. Every frame is displayed as soon as it is ready and is never repeated, which prevents input lag, game stuttering, and screen tearing.
Outside of gaming, Adaptive Sync also enables seamless video playback at various frame rates, from 23.98 to 60 fps. It changes the monitor’s refresh rate to match the frame rate of the video content, banishing video stutter and even reducing power consumption.
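A rough arithmetic sketch shows why this matters for film content. On a fixed 60Hz panel, 23.98 fps video needs a 3:2 repeat cadence, so frames are held for alternating 50ms and 33ms; with Adaptive Sync the panel can simply refresh once per film frame. The numbers below are plain arithmetic, not a real playback pipeline.

```python
# Straightforward arithmetic, not a playback engine: 23.98 fps film on a fixed
# 60 Hz panel needs 3:2 pulldown (frames held for 3, 2, 3, 2, ... refreshes),
# which produces an uneven cadence that Adaptive Sync removes.

FILM_FPS = 24000 / 1001            # ~23.976 fps
FIXED_HZ = 60

fixed_interval_ms = 1000.0 / FIXED_HZ
ideal_interval_ms = 1000.0 / FILM_FPS

for i, holds in enumerate([3, 2, 3, 2]):
    shown_for = holds * fixed_interval_ms
    print(f"film frame {i}: on screen for {shown_for:.1f} ms (ideal {ideal_interval_ms:.1f} ms)")
# With Adaptive Sync the panel refreshes once per film frame, every ~41.7 ms.
```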
Unlike V-Sync, which caps your GPU’s frame rate to match your display’s refresh rate, Adaptive Sync dynamically changes the monitor’s refresh rate in response to the frame rate the game requires. It not only eliminates screen tearing but also addresses the juddering effect V-Sync causes when the FPS falls.
In the diagram VESA uses to illustrate Adaptive Sync, Display A waits until Render B is completed and ready before updating to Display B. Each frame is displayed as soon as possible, reducing the possibility of input lag; frames are not repeated within the display’s supported refresh range, avoiding game stuttering; and the refresh rate adapts to the rendering frame rate, avoiding screen tearing.
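The idea in that diagram can be captured in a few illustrative lines of Python: the refresh interval simply tracks each frame’s render time, clamped to the panel’s supported range. The per-frame render times and the 48–144Hz range are assumptions for the example.

```python
# Illustrative only: the display refreshes the moment each render completes,
# with the interval clamped to the panel's supported variable refresh range.
# Render times and the 48-144 Hz range are assumptions for the example.

render_times_ms = [12.0, 19.5, 14.2, 27.8, 16.1]
MIN_HZ, MAX_HZ = 48, 144

t = 0.0
for i, render_ms in enumerate(render_times_ms):
    # Clamp the refresh interval between 1/MAX_HZ and 1/MIN_HZ seconds.
    interval = min(max(render_ms, 1000.0 / MAX_HZ), 1000.0 / MIN_HZ)
    t += interval
    print(f"frame {i}: rendered in {render_ms:.1f} ms -> shown at t={t:.1f} ms "
          f"({1000.0 / interval:.0f} Hz for this frame)")
```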
AMD FreeSync is built directly on VESA Adaptive Sync. It utilizes VESA’s royalty-free technology to sync the refresh rate to the FPS, and it works on most monitors, which keeps prices down. However, AMD leaves the supported frame rate range in the hands of manufacturers, which can reduce the usefulness of the sync technology.
NVIDIA G-Sync uses the same principle as Adaptive Sync, but it relies on proprietary hardware that must be built into the display. With the additional hardware and strict requirements enforced by NVIDIA, monitors supporting G-Sync have tighter quality control and command a premium price.
Both solutions are also hardware bound. If you own a monitor equipped with G-Sync, you will need an NVIDIA graphics card; likewise, a FreeSync display requires an AMD graphics card. However, AMD has released the technology for open use as part of the DisplayPort interface, allowing anyone to enjoy FreeSync on competing devices. There are also G-Sync Compatible monitors on the market to pair with an NVIDIA GPU.
Choosing a sync technology depends on your needs and preferences. For a smoother gaming experience, it is ideal for your gaming monitor to offer Adaptive Sync on top of V-Sync – especially if you play fighting or shooting games that demand precise clicks and lightning reflexes, where a few frames of difference can mean victory or defeat.
If you need a refresher on the differences between refresh rate and frame rate, you can check our guide here. Or you can browse through ViewSonic ELITE’s professional gaming monitors for a tear-and-stutter-free gaming experience.
We’re displaying for keeps at CES this week in Las Vegas with the expansion of the G-SYNC ecosystem, pre-orders of our new BFGD monitors and the announcement of new G-SYNC displays.
And we broke new ground with G-SYNC HDR monitors that are the very best displays for HDR gaming on PC. They offer the best tech, the highest brightness, cinematic color and never-before-seen contrast in gaming monitors, displaying deep, dark black tones.
Since the launch of G-SYNC, gaming monitors have evolved quickly with higher refresh rates, HDR and new form factors. They’ve become the standard for pro gaming, and we continue to help guide their evolution by working with partners and with end-to-end development and certification testing.
There are hundreds of monitor models available capable of variable refresh rates (VRR) using the VESA DisplayPort Adaptive-Sync protocol. However, the VRR gaming experience can vary widely.
To improve the experience for gamers, NVIDIA will test monitors. Those that pass our validation tests will be G-SYNC Compatible and enabled by default in the GeForce driver.
G-SYNC Compatible tests will identify monitors that deliver a baseline VRR experience on GeForce RTX 20-series and GeForce GTX 10-series graphics cards, and activate their VRR features automatically.
Support for G-SYNC Compatible monitors will begin Jan. 15 with the launch of our first 2019 Game Ready driver. Already, 12 monitors have been validated as G-SYNC Compatible (from the 400 we have tested so far). We’ll continue to test monitors and update our support list. For gamers who have monitors that we have not yet tested, or that have failed validation, we’ll give you an option to manually enable VRR, too.
For the most demanding gamers, G-SYNC and G-SYNC Ultimate HDR displays featuring an NVIDIA G-SYNC processor will represent the very highest image quality, performance and experience. These displays will benefit from an end-to-end certification process that includes more than 300 tests for image quality.
They’ll feature a full refresh rate range from 1 Hz to the display panel’s maximum rate, plus other advantages like variable overdrive, refresh rate overclocking, ultra-low motion blur display modes and industry-leading HDR with 1,000 nits, full matrix backlight and DCI-P3 color.
In February, G-SYNC HDR tech will be available in 65-inch super-sized NVIDIA Big Format Gaming Displays featuring 4K, 144 Hz with 1,000 nit HDR, 384 zone matrix backlight and cinematic DCI-P3 color.
If you want the biggest and best G-SYNC Ultimate PC gaming display, it’s available for pre-order now from HP. Other partners will start taking pre-orders as we approach the launch of BFGDs later this quarter.
Soon, ASUS will unleash its curved 35-inch, 3440×1440, 200 Hz G-SYNC Ultimate display. It reaches a brilliantly bright 1,000 nits, has a 512 zone matrix backlight and best-in-class color and contrast for the best possible HDR gaming experience.
LG has just started shipping its 34-inch, 3440×1440, 120 Hz 34GK950G display, which is the first G-SYNC monitor to feature LG’s Nano IPS technology, delivering stunning color fidelity.
And Lenovo unveiled its 27-inch, 2560×1440, 240 Hz monitor, bringing esports-class refresh rates to QHD resolution for crisp, detailed visuals during gameplay.
23.8" LED-backlit LCD monitor with an FHD (1920x1080) IPS panel; 178° (H) / 178° (V) viewing angles; 250 cd/m² brightness; 5ms (GTG) response time in FAST mode. Connectivity: 1x USB-C input (video/audio/60W charging), 1x DisplayPort in, 1x DisplayPort out with MST, 1x HDMI, 1x USB-C output (data/15W charging), and 3x SuperSpeed 10Gbps USB Type-A ports. Includes one DisplayPort cable and one USB-C to USB Type-A cable; 3-year system/panel warranty.
It’s difficult to buy a computer monitor, graphics card, or laptop without seeing AMD FreeSync and Nvidia G-Sync branding. Both promise smoother, better gaming, and in some cases both appear on the same display. But what do G-Sync and FreeSync do, exactly – and which is better?
Most AMD FreeSync displays can sync with Nvidia graphics hardware, and most G-Sync Compatible displays can sync with AMD graphics hardware. This is unofficial, however.
The first problem is screen tearing. A display without adaptive sync will refresh at its set refresh rate (usually 60Hz, or 60 refreshes per second) no matter what. If the refresh happens to land between two frames, well, tough luck – you’ll see a bit of both. This is screen tearing.
Screen tearing is ugly and easy to notice, especially in 3D games. To fix it, games started to use a technique called V-Sync that locks the framerate of a game to the refresh rate of a display. This fixes screen tearing but also caps the performance of a game. It can also cause uneven frame pacing in some situations.
Adaptive sync is a better solution. A display with adaptive sync can change its refresh rate in response to how fast your graphics card is pumping out frames. If your GPU sends over 43 frames per second, your monitor displays those 43 frames, rather than forcing 60 refreshes per second. Adaptive sync stops screen tearing by preventing the display from refreshing with partial information from multiple frames but, unlike with V-Sync, each frame is shown immediately.
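To see how the 43 FPS example also fixes frame pacing, consider this small illustrative simulation (the numbers are assumptions, not measurements): on a fixed 60Hz display with V-Sync, evenly rendered 43 FPS frames end up held for an uneven mix of one and two refreshes, while adaptive sync holds each frame for the same ~23.3ms.

```python
import math

# Illustrative simulation with assumed numbers: 43 evenly spaced frames per
# second on a fixed 60 Hz panel with V-Sync are held for an uneven mix of one
# and two refreshes, whereas adaptive sync holds each frame the same ~23.3 ms.

FPS, HZ = 43, 60
frame_time = 1.0 / FPS
refresh = 1.0 / HZ

holds = []
for i in range(FPS):
    ready = (i + 1) * frame_time                   # when frame i finishes rendering
    shown_at = math.ceil(ready / refresh)          # first vblank that can display it
    next_shown_at = math.ceil((i + 2) * frame_time / refresh)
    holds.append(next_shown_at - shown_at)         # refreshes this frame stays on screen

print("refreshes per frame, fixed 60 Hz + V-Sync:", holds[:12], "...")
print(f"adaptive sync: every frame held for {1000 * frame_time:.1f} ms")
```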
Enthusiasts can offer countless arguments over the advantages of AMD FreeSync and Nvidia G-Sync. However, for most people, AMD FreeSync and Nvidia G-Sync both work well and offer a similar experience. In fact, the two standards are far more similar than different.
All variants of AMD FreeSync are built on the VESA Adaptive Sync standard. The same is true of Nvidia’s G-Sync Compatible, which is by far the most common version of G-Sync available today.
VESA Adaptive Sync is an open standard that any company can use to enable adaptive sync between a device and display. It’s used not only by AMD FreeSync and Nvidia G-Sync Compatible monitors but also other displays, such as HDTVs, that support Adaptive Sync.
AMD FreeSync and Nvidia G-Sync Compatible are so similar, in fact, that they’re often cross compatible. A large majority of the displays I test that support either AMD FreeSync or Nvidia G-Sync Compatible will work with graphics hardware from the opposite brand.
AMD FreeSync and Nvidia G-Sync Compatible are built on the same open standard. Which leads to an obvious question: if that’s true, what’s the difference?
Nvidia G-Sync Compatible, the most common version of G-Sync today, is based on the VESA Adaptive Sync standard. But Nvidia G-Sync and G-Sync Ultimate, the less common and more premium versions of G-Sync, use proprietary hardware in the display.
This is how all G-Sync displays worked when Nvidia brought the technology to market in 2013. Unlike Nvidia G-Sync Compatible monitors, which often (unofficially) work with AMD Radeon GPUs, G-Sync is unique and proprietary. It only supports adaptive sync with Nvidia graphics hardware.
It’s usually possible to switch sides if you own an AMD FreeSync or Nvidia G-Sync Compatible display. If you buy a G-Sync or G-Sync Ultimate display, however, you’ll have to stick with Nvidia GeForce GPUs. (Here’s our guide to the best graphics cards for PC gaming.)
This loyalty does net some perks. The most important is G-Sync’s support for a wider range of refresh rates. The VESA Adaptive Sync specification has a minimum required refresh rate (usually 48Hz, but sometimes 40Hz). A refresh rate below that can cause dropouts in Adaptive Sync, which may let screen tearing sneak back in or, in a worst-case scenario, cause the display to flicker.
G-Sync and G-Sync Ultimate support the entire refresh range of a panel – even as low as 1Hz. This is important if you play games that may hit lower frame rates, since Adaptive Sync matches the display refresh rate with the output frame rate.
For example, if you’re playing Cyberpunk 2077 at an average of 30 FPS on a 4K display, that implies a refresh rate of 30Hz – which falls outside the range VESA Adaptive Sync supports. AMD FreeSync and Nvidia G-Sync Compatible may struggle with that, but Nvidia G-Sync and G-Sync Ultimate won’t have a problem.
AMD FreeSync Premium and FreeSync Premium Pro have their own technique for dealing with this situation, called Low Framerate Compensation. It repeats frames to double the effective output so that it falls within the display’s supported refresh rate range.
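Conceptually, Low Framerate Compensation works like the following sketch, which is illustrative rather than AMD’s actual algorithm: when the game’s frame rate falls below the panel’s minimum variable refresh rate, each frame is repeated enough times to land back inside the supported window (the 48–144Hz range here is an assumption).

```python
import math

# Illustrative sketch, not AMD's actual algorithm: when the game's frame rate
# falls below the panel's minimum variable refresh rate, repeat each frame
# enough times that the effective refresh lands back inside the supported range.

PANEL_MIN_HZ, PANEL_MAX_HZ = 48, 144        # assumed VRR window

def lfc(fps):
    """Return (repeats per frame, resulting panel refresh rate in Hz)."""
    if fps >= PANEL_MIN_HZ:
        return 1, fps                        # already inside the window
    repeats = math.ceil(PANEL_MIN_HZ / fps)
    return repeats, fps * repeats            # e.g. 30 FPS -> each frame shown twice at 60 Hz

for fps in (30, 24, 45, 90):
    repeats, hz = lfc(fps)
    print(f"{fps} FPS: each frame shown {repeats}x -> panel refreshes at {hz} Hz")
```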
Other differences boil down to certification and testing. AMD and Nvidia have their own certification programs that displays must pass to claim official compatibility. This is why not all VESA Adaptive Sync displays claim support for AMD FreeSync and Nvidia G-Sync Compatible.
AMD FreeSync and Nvidia G-Sync include mention of HDR in their marketing. AMD FreeSync Premium Pro promises “HDR capabilities and game support.” Nvidia G-Sync Ultimate boasts of “lifelike HDR.”
This is a bunch of nonsense. Neither has anything to do with HDR itself, though the branding does signal that some level of HDR support is included in those panels. The most common HDR standard, HDR10, is an open standard from the Consumer Technology Association. AMD and Nvidia have no control over it. You don’t need FreeSync or G-Sync to view HDR, either, even on each company’s graphics hardware.
PC gamers interested in HDR should instead look for VESA’s DisplayHDR certification, which provides a more meaningful gauge of a monitor’s HDR capabilities.
Both standards are plug-and-play with officially compatible displays. Your desktop’s video card will detect that the display is certified and turn on AMD FreeSync or Nvidia G-Sync automatically. You may need to activate the respective adaptive sync technology in your monitor’s settings, though that step is a rarity in modern displays.
Displays that support VESA Adaptive Sync, but are not officially supported by your video card, require you to dig into AMD’s or Nvidia’s driver software and turn on the feature manually. This is a painless process, however – just check the box and save your settings.
AMD FreeSync and Nvidia G-Sync are also available for use with laptop displays. Unsurprisingly, laptops that have a compatible display will be configured to use AMD FreeSync or Nvidia G-Sync from the factory.
A note of caution, however: not all laptops with AMD or Nvidia graphics hardware have a display with Adaptive Sync support. Even some gaming laptops lack this feature. Pay close attention to the specifications.
VESA’s Adaptive Sync is on its way to being the common adaptive sync standard used by the entire display industry. Though not perfect, it’s good enough for most situations, and display companies don’t have to fool around with AMD or Nvidia to support it.
That leaves AMD FreeSync and Nvidia G-Sync searching for a purpose. AMD FreeSync and Nvidia G-Sync Compatible are essentially certification programs that monitor companies can use to slap another badge on a product, though they also ensure out-of-the-box compatibility with supported graphics cards. Nvidia’s G-Sync and G-Sync Ultimate are technically superior, but require proprietary Nvidia hardware that adds to a display’s price. This is why G-Sync and G-Sync Ultimate monitors are becoming less common.
My prediction is this: AMD FreeSync and Nvidia G-Sync will slowly, quietly fade away. AMD and Nvidia will speak of them less and less while displays move towards VESA Adaptive Sync badges instead of AMD and Nvidia logos.
If that happens, it would be good news for the PC. VESA Adaptive Sync has already united AMD FreeSync and Nvidia G-Sync Compatible displays. Eventually, display manufacturers will opt out of AMD and Nvidia branding entirely – leaving VESA Adaptive Sync as the single, open standard. We’ll see how it goes.
With Computex kicking off today NVIDIA has a number of announcements hitting the wire at the same time. The biggest news of course is the launch of the GeForce GTX 980 Ti, however the company is also releasing a number of G-Sync announcements today. This includes the launch of Mobile G-Sync for laptops, Windowed G-Sync support for laptops and desktops, new G-Sync framerate control functionality, and a number of new G-Sync desktop monitors.
We’ll kick things off with the biggest of the G-Sync announcements, which is Mobile G-Sync. Today NVIDIA is announcing a very exciting product for notebook gamers. After much speculation (and an early prototype leak) NVIDIA’s G-Sync technology is now coming to notebooks.
Anand took a look at the original G-Sync back in 2013 and for those that need a refresher on the technology, this would be a great place to start. But what G-Sync allows for is a variable refresh rate on the display which allows it to stay in sync with the GPU’s abilities to push out frames rather than forcing everything to work at a single fixed rate as dictated by the display.
From a technical/implementation perspective, because desktop systems can be hooked to any monitor, desktop G-Sync originally required that NVIDIA implement a separate module - the G-Sync module - to be put into the display and to serve as an enhanced scaler. For a desktop monitor this is not a big deal, particularly since it was outright needed in 2013 when G-Sync was first introduced. However with laptops come new challenges and new technologies, and that means a lot of the implementation underpinnings are changing with the announcement of Mobile G-Sync today.
With embedded DisplayPort (eDP) now being a common fixture in high-end notebooks these days, NVIDIA will be able to do away with the G-Sync module entirely and rely just on the variable timing and panel self-refresh functionality built into current versions of eDP. eDP’s variable timing functionality was of course the basis of desktop DisplayPort Adaptive-Sync (along with AMD’s FreeSync implementation), and while the technology is a bit different in laptops, the end result is quite similar. Which is to say that NVIDIA will be able to drive variable refresh laptops entirely with standardized eDP features, and will not be relying on proprietary features or hardware as they do with desktop G-Sync.
Removing the G-Sync module offers a couple of implementation advantages. The first of these is power; even though the G-Sync module replaced a scaler, it was a large and relatively power-hungry device, which would make it a poor fit for laptops. The second advantage is that it allows G-Sync to be implemented against traditional, lower-cost laptop eDP scalers, which brings the price of the entire solution down. In fact for these reasons I would not be surprised to eventually see NVIDIA release a G-Sync 2.0 for desktops using just DisplayPort Adaptive-Sync (for qualified monitors only, of course), however NVIDIA obviously isn’t talking about such a thing at this time. Laptops as compared to desktops do have the advantage of being a known, fixed platform, so there would be a few more issues to work out to bring something like this to desktops.
Moving on, while the technical underpinnings have changed, what hasn’t changed is how NVIDIA is approaching mobile G-Sync development. For laptops to be enabled for mobile G-Sync they must still undergo qualification from NVIDIA, and while NVIDIA doesn’t release specific financial details, there is a fee for this process (and presumably per-unit royalties as well). Unfortunately NVIDIA also isn’t commenting on what kind of price premium G-Sync enabled laptops will go for, though they tell us that they don’t expect the premium to be dramatically different, if only because they think that all gaming laptops will want to have this feature.
As far as qualification goes, the qualification process is designed to ensure a minimum level of overall quality in products that receive G-Sync branding, along with helping ODMs tune their notebooks for G-Sync. This process is something NVIDIA considers a trump card of sorts for the technology, and something they believe delivers a better overall experience. From what we’re hearing on quality, it sounds like NVIDIA is going to put their foot down on low-quality panels, for example, so that the G-Sync brand and experience doesn’t get attached to subpar laptops. Meanwhile the tuning process is similar to that on the desktop, with laptops and their respective components going through a profiling and optimization process to determine their refresh properties and pixel response times in order to set G-Sync timings and variable overdrive.
Which on that note (and on a slight tangent), after initially staying mum on the issue in the early days of G-Sync (presumably as a trade secret), NVIDIA is now confirming that all G-Sync implementations (desktop and mobile) include support for variable overdrive. As implied by the name, variable overdrive involves adjusting the amount of overdrive applied to a pixel in order to make overdrive more compatible with variable refresh timings.
As a quick refresher, the purpose of overdrive in an LCD is to decrease the pixel response time and resulting ghosting by overdriving pixels to get them to reach the desired color sooner. This is done by setting a pixel to a color intensity (voltage) above or below where you really want it to go, knowing that due to the response times of liquid crystals it will take more than 1 refresh interval for the pixel to reach that overdriven value. By driving a pixel harder and then stopping it on the next refresh, it’s possible to reach a desired color sooner (or at least, something close to the desired color) than without overdrive.
Overdrive has been a part of LCD displays for many years now, however the nature of overdrive has always implied a fixed refresh rate, as it’s not possible to touch a pixel outside of a refresh window. This in turn leads to issues with variable refresh, as you don’t know when the next refresh may happen. Ultimately there’s no mathematically perfect solution here – you can’t predict the future with 100% accuracy – so G-Sync variable overdrive is a best-effort attempt to predict when the next frame will arrive and adjust the overdrive values accordingly. The net result is that in motion it’s going to result in a slight decrease in color accuracy versus using a fixed refresh rate due to errors in prediction, but it allows for an overall reduction in ghosting versus not running overdrive at all.
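The following sketch is a conceptual illustration of that prediction problem, not NVIDIA’s implementation; the gain value and tuning interval are made-up numbers. It shows the basic idea that the overshoot applied to a pixel is scaled by how long the next refresh interval is expected to last.

```python
# Conceptual illustration, not NVIDIA's implementation: overdrive overshoots the
# target gray level so the slow liquid crystal settles in time; under variable
# refresh the next interval must be predicted, so the overshoot is scaled by
# that prediction. The 0.35 gain and 6.9 ms tuning interval are made-up numbers.

def overdrive(current, target, gain):
    """Drive past the target in proportion to how far the pixel has to travel."""
    drive = target + gain * (target - current)
    return max(0, min(255, drive))           # clamp to an 8-bit panel range

def variable_overdrive(current, target, predicted_interval_ms, tuned_interval_ms=6.9):
    # Less time to settle means a harder push; a longer predicted interval needs less.
    gain = 0.35 * (tuned_interval_ms / predicted_interval_ms)
    return overdrive(current, target, gain)

# A pixel transitioning from gray 60 to gray 180 on a panel tuned at 144 Hz (6.9 ms):
for predicted_ms in (6.9, 10.0, 16.7):
    print(f"predicted interval {predicted_ms:>4} ms -> drive value "
          f"{variable_overdrive(60, 180, predicted_ms):.0f}")
```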
But getting back to the subject at hand of mobile G-Sync, this is a big win for notebooks for a couple of reasons. First, more notebooks are sold now than desktops, so this makes G-Sync available to a bigger audience. Of course not all those devices even have GPUs, but NVIDIA has seen steady growth in the mobile GeForce segment over the last while, so the market is strong. The other reason this is important though is because mobile products are much more thermally constrained, as well as space constrained, so the mobile parts are always going to be slower than desktop parts. That gap has reduced with the latest Maxwell parts, but it is still there. G-Sync on mobile should help even more than it does on the desktop due to the lower overall framerate of laptop parts.
In order for G-Sync to be available on a laptop, a couple of things need to be true. First, the laptop must have a GeForce GPU obviously. Second, the laptop manufacturer needs to work with NVIDIA to enable this, since NVIDIA has to establish the parameters for the particular laptop panel in order to correctly know the maximum and minimum refresh rate as well as the amount of over/under-drive necessary. But the third is the big one. The laptop display must be directly connected to the GeForce GPU.
What this means is that in order for G-Sync to be available, Optimus (NVIDIA’s ability to switch from the integrated CPU graphics to the discrete NVIDIA graphics) will not be available. They are, at least for now, mutually exclusive. As a refresher for Optimus, the integrated GPU is actually the one that is connected to the display, and when Optimus is enabled, the iGPU acts as an intermediary and is the display controller. The discrete GPU feeds through the iGPU and then to the display. Because the GeForce GPU must be directly connected to the display, Optimus-enabled notebooks will not have G-Sync available.
Obviously this is a big concern because Optimus is found on almost all notebooks that have GeForce GPUs, and has been one of the big drivers of reasonable battery life on gaming notebooks. However, going forward, it is likely that true gaming notebooks will drop this support in order to offer G-Sync, and more versatile devices which may use the GPU just once in a while, or for compute purposes, will likely keep it. There is going to be a trade-off that the ODM needs to consider. I asked specifically about this and NVIDIA feels that this is less of an issue than it was in the past because they have worked very hard on the idle power levels on Maxwell, but despite this there is likely going to be a hit to battery life. Going forward this is something we’d like to test, so hopefully we’ll be able to properly quantify the tradeoff in the future.
As for release details, mobile G-Sync is going to be available starting in June with laptops from Gigabyte’s Aorus line, MSI, ASUS, and Clevo. Expect more soon, though, since this should be a killer feature on laptops, where GPUs are less powerful.
Wrapping things up, as I mentioned before, mobile G-Sync seems like a good solution to the often lower capabilities of gaming laptops, and it should really bring G-Sync to many more people since a dedicated G-Sync capable monitor is not required. It really is a shame that it does not work with Optimus, though, since that has become the standard on NVIDIA-based laptops. ODMs could use a hardware multiplexer to get around this, which was the solution prior to Optimus, but given the added cost and complexity my guess is that this will not appear on very many, if any, laptops that want to leverage G-Sync.
The second major G-Sync announcement coming from NVIDIA today is that G-Sync is receiving windowed mode support, with that functionality being rolled into NVIDIA’s latest drivers. Before now, running a game in Windowed mode could cause stutters and tearing because once you are in Windowed mode, the image being output is composited by the Desktop Window Manager (DWM) in Windows. Even though a game might be outputting 200 frames per second, DWM will only refresh the image with its own timings. The off-screen buffer for applications can be updated many times before DWM updates the actual image on the display.
NVIDIA will now change this using their display driver, and when Windowed G-Sync is enabled, whichever window is the current active window will be the one that determines the refresh rate. That means if you have a game open, G-Sync can be leveraged to reduce screen tearing and stuttering, but if you then click on your email application, the refresh rate will switch back to whatever rate that application is using. Since this is not always going to be a perfect solution – without a fixed refresh rate, it’s impossible to make every application perfectly line up with every other application – Windowed G-Sync can be enabled or disabled on a per-application basis, or just globally turned on or off.
Meanwhile SLI users will be happy to know that Windowed G-Sync works there as well. However there will be a slight catch: for the moment it works for 2-way SLI, but not 3-way or 4-way SLI.
Finally, NVIDIA is also noting at this time that Windowed G-Sync is primarily for gaming applications, so movie viewers looking to get perfect timing in their windowed media players will be out of luck for the moment. The issue here isn’t actually with Windowed G-Sync, but rather current media players do not know about variable refresh technology and will always attempt to run at the desktop refresh rate. Once media players become Windowed G-Sync aware, it should be possible to have G-Sync work with media playback as well.
Third up on NVIDIA’s list of G-Sync announcements is support for controlling the behavior of G-Sync when framerates reach or exceed the refresh rate limit of a monitor. Previously, NVIDIA would cap the framerate at the refresh rate, essentially turning on v-sync at very high framerates. However with their latest update, NVIDIA is going to leave that choice to the user, who can enable or disable the framerate cap as they please.
The tradeoff here is that capping the framerate ensures that no tearing occurs since there are only as many frames as there are refresh intervals, but it also introduces some input lag if frames are held back to be displayed rather than displayed immediately. NVIDIA previously opted for a tear-free experience, but now will let the user pick between tear-free operation or reducing input lag to the bare minimum. This is one area where NVIDIA’s G-Sync and AMD’s Freesync implementations have significantly differed – AMD was the first to allow the user to control this – so NVIDIA is going for feature parity with AMD in this case.
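The tradeoff is easy to quantify with a little illustrative arithmetic (the 144Hz panel and 4ms render time are assumed values): capping at the maximum refresh can hold a finished frame for up to one full refresh interval, while leaving the cap off shows the newest frame immediately at the cost of possible tearing.

```python
# Illustrative arithmetic with assumed numbers (144 Hz panel, 4 ms render time):
# the choice NVIDIA now exposes is between holding finished frames for the next
# refresh (tear-free, extra latency) and flipping them immediately (minimal
# latency, possible tearing).

MAX_HZ = 144
refresh_ms = 1000.0 / MAX_HZ
render_ms = 4.0                              # GPU far outrunning the display

# Capped (V-Sync-like) behavior at the top of the refresh range:
print(f"capped at {MAX_HZ} Hz: a finished frame may wait up to {refresh_ms:.1f} ms, no tearing")

# Uncapped behavior:
print(f"uncapped: the frame on screen is only ~{render_ms:.1f} ms old, but refreshes can tear")
```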
Last but certainly not least from today’s G-Sync announcements, NVIDIA is announcing that their partners Acer and Asus are preparing several new G-Sync monitors for release this year. Most notably, both will be releasing 34” 3440x1440 ultra-wide monitors. Both displays are IPS based, with the Asus model topping out at 60Hz while the Acer model tops out at 75Hz. Meanwhile Acer will be releasing a second, 35” ultra-wide based on a VA panel and operating at a resolution of 2560x1080.
Asus and Acer will also be releasing some additional traditional format monitors at 4K and 1440p. This includes some new 27”/28” 4K IPS monitors and a 27” 1440p IPS monitor that runs at 144Hz. All of these monitors are scheduled for release this year, however as they’re third party products NVIDIA is unable to give us a precise ETA. They’re hoping for a faster turnaround time than the first generation of G-Sync monitors, though how much faster remains to be seen.