not supported with gsync lcd panel pricelist

Information on this error message is really sketchy online. Some say the G-Sync LCD panel is hardwired to the dGPU and the iGPU is connected to nothing. Some say the dGPU is connected to the G-Sync LCD through the iGPU. Some say they got the MUX switch working after an intentional install order on a clean install: BIOS update, then iGPU drivers, then dGPU drivers.

I'm suspecting that if I connect an external 60Hz IPS monitor to one of the display ports on the laptop and make it the only display, the Fn+F7 key will actually switch the graphics, because that display is not a G-Sync LCD panel. Am I right on this?

If I'm right on this, does that mean that if I purchase this laptop, order a 15-inch Alienware 60Hz IPS screen, and swap it with the FHD 120+Hz screen currently inside, I will continue to have MUX switch support, just without G-Sync? The price for these screens is not outrageous.


At first I thought that maybe I was sent a laptop with a G-Sync display, but when I checked in Device Manager the display is listed as "Generic PnP Display" with no mention of G-Sync. Yet I can't seem to turn off the GPU, and whenever I press Fn+F7 I get the message "not supported with g-sync ips display", even though the display is not a G-Sync display.


G-Sync gaming monitors don't have to be crazy expensive anymore. Thanks to Nvidia's support for many FreeSync gaming monitors when paired with an Nvidia graphics card, you can find both lower-cost G-Sync Compatible gaming monitors as well as those equipped with an actual G-Sync module. G-Sync ensures your graphics card and monitor play together nicely to avoid the nasty screen tearing caused by misalignment between your GPU's frame rate output and your monitor's refresh rate. G-Sync also avoids the stuttering and input lag of V-Sync (not to mention V-Sync capping your frame rate at the display's refresh rate, often just 60fps).

So, if you're in the market for a monitor with variable refresh rate (VRR) technology and you have an Nvidia graphics processor, it's worth your while to see the affordable options available to you. We've picked out budget monitors that deliver either true G-Sync or confirmed compatibility through FreeSync.

Screen size: 31.5" | Aspect ratio: 16:9 | Resolution: 2,560 x 1,440 | Panel type: IPS | VRR: FreeSync Premium, G-Sync Compatible | Brightness: 400cd/m2 | Refresh rate: 165Hz | Response time: 1ms | Inputs: 2 x HDMI 2.0, 1 x DisplayPort 1.4

The LG UltraGear 32GP750-B brings G-Sync support and much more to love for around $350. You’re getting a big 31.5-inch display to see all the action, and the 1440p resolution helps keep it all crisp. LG is giving you an HDR boost with an IPS panel capable of dazzling at 400 nits of brightness and a solid contrast ratio. Colors should be vivid and accurate thanks to the monitor covering 99% of the sRGB color gamut and offering a 10-bit color depth.

Screen size: 24" | Aspect ratio: 16:9 | Resolution: 1,920 x 1,080 | Panel type: TN | VRR: FreeSync Premium | Brightness: 350cd/m2 | Refresh rate: 165Hz | Response time: 0.5ms | Inputs: 2 x HDMI 1.4, 1 x DisplayPort 1.2

If you're looking to get G-Sync for the lowest price possible, the Asus TUF Gaming VG248Q1B delivers. Asus has an incredible lineup of budget-friendly G-Sync Compatible displays, a few of which are on this list, but this one will give you the best bang for your buck, coming in at around $170. You can game at a fast refresh rate of up to 165Hz at the high end, or keep pace with your Nvidia graphics card all the way down to 48Hz. That's a wide range of variable refresh rate support to ensure you're gaming with the best possible visuals at all times. If you've got an AMD GPU in another rig, you also get support for FreeSync Premium. And your smooth gameplay is further enhanced by a 0.5ms response time.

The monitor itself is a fairly straightforward affair, with no fancy HDR or extra-high peak brightness. You're getting a 24-inch 1080p display toting basic features like flicker-free technology, low motion blur, and plenty of connectivity. Asus also packs in Shadow Boost software, so you can spot enemies lurking in the darkness. The Asus TUF Gaming VG248Q1B does the job it set out to do, so don't shy away from this monitor if your focus is fast, tear-free gaming rather than the sharpest possible image.

Screen size: 28" | Aspect ratio: 16:9 | Resolution: 3,840 x 2,160 | Panel type: IPS | VRR: FreeSync | Brightness: 350cd/m2 | Refresh rate: 60Hz | Response time: 5ms | Inputs: 2 x HDMI 2.0, 1 x DisplayPort 1.2

Speed is great for competitive gaming, but some games are all about the eye candy, and a 4K display can make them really shimmer. The Asus TUF Gaming VG289Q1A brings that 4K capability at an affordable price and backs it up with support for G-Sync. Since a steady framerate can be hard to maintain at 4K, having G-Sync to back you up is crucial for clear visuals free of tearing or stutter.

Screen size: 32" (1500R curve) | Aspect ratio: 16:9 | Resolution: 2,560 x 1,440 | Panel type: VA | VRR: FreeSync Premium Pro, G-Sync Compatible | HDR: DisplayHDR 400 | Brightness: 350cd/m2 | Refresh rate: 165Hz | Response time: 1ms | Inputs: 1 x DisplayPort 1.2, 2 x HDMI 2.0

You can get a lot of features all in one place if you know where to look. If you want a large display with a sharp resolution, a fast refresh rate, HDR support, and VRR compatibility, then the Gigabyte G32QC is where you should look.

This monitor delivers a 32-inch display with a 1500R curvature to wrap it around your field of view and immerse you in your games. That display has DisplayHDR 400 certification, ensuring a strong picture quality that is made even better by its 1440p resolution, solid contrast ratio, and 165Hz refresh rate. G-Sync support will let you stay smooth even when you're not displaying 165 frames per second.

Screen size: 24.5" | Aspect ratio: 16:9 | Resolution: 1,920 x 1,080 | Panel type: TN | VRR: G-Sync Compatible | HDR: DisplayHDR 400 | Brightness: 400cd/m2 | Refresh rate: 280Hz | Response time: 0.5ms | Inputs: 2 x HDMI 2.0, 1 x DisplayPort 1.2

Going with 1080p may not mean you'll get the sharpest image, but you can get something those folks with 1440p and 4K monitors can't yet: blistering speeds. And, at 1080p, you can even get it at a budget price. The Asus TUF Gaming VG258QM brings you a 280Hz refresh rate for just $300.

There are a few tradeoffs to hit that low price. For one thing, Asus used a TN panel. These don't tend to have the best color or viewing angles, but when you're gaming this hardcore, you should be looking at your monitor head-on anyway. Asus did manage to make the monitor fairly bright with a 400-nit peak brightness and DisplayHDR 400 certification, so you won't miss out on enemies on account of a dim display. So, on a budget, this is a good pick for fast-paced esports.

Screen size: 27" | Aspect ratio: 16:9 | Resolution: 2,560 x 1,440 | Panel type: IPS | VRR: FreeSync, G-Sync Compatible | Brightness: 300cd/m2 | Refresh rate: 165Hz | Response time: 1ms | Inputs: 2 x HDMI 2.0, 1 x DisplayPort 1.2, 1 x USB-C (DP Alt)

Getting a G-Sync Compatible monitor instead of one that features a true G-Sync module may seem like cheating, but with everything that the NZXT Canvas 27Q has to offer, you'll forget all about the fact that it's not actual G-Sync. And if you also have another PC packing an AMD graphics card, you can take advantage of FreeSync Premium. Thanks to this impressive adaptive sync technology, this monitor delivers a tear-free gaming experience with a max refresh rate of 165Hz on the DisplayPort and USB-C connections. You also get an admirable 144Hz over the HDMI ports. Not bad for a monitor that'll only set you back $320, and that's just the start.

The NZXT Canvas 27Q totes a 1440p resolution on a 27-inch IPS panel. For such an affordable display, you still get a relatively high peak brightness of 300 nits which, along with the anti-glare coating, makes it a great option even for brighter spaces. This display truly shines in color performance, though, covering 99% of the sRGB color gamut. You get a wide range of vibrant colors, though you may find darker objects show a bit of haloing and darker scenes have a gray glow. And, annoyingly, you have to purchase the stand separately for $40. If those slight flaws aren't deal breakers, most should find this sturdy and attractive monitor a suitable option for their PC build.

Screen size: 27" (1500R curve) | Aspect ratio: 16:9 | Resolution: 1,920 x 1,080 | Panel type: VA | VRR: FreeSync, G-Sync Compatible | Brightness: 250cd/m2 | Refresh rate: 240Hz | Response time: 1ms | Inputs: 2 x HDMI 2.0, 1 x DisplayPort 1.2

Your budget doesn’t have to define your speeds with the Acer Nitro ED270. This 240Hz monitor is wildly affordable. It starts at $300, but we’ve seen it go as low as $219. That’s impressively low for any 27-inch gaming monitor, let alone one that can cruise at 240Hz. Aside from speed, the Acer Nitro ED270 isn’t entirely special. You won’t find a bright, HDR-capable panel to make AAA games really pop, and the monitor is achieving its speeds by using a VA panel rather than the IPS panels that tend to offer more attractive visuals.

But, at 1080p and 27 inches, the Nitro ED270 provides sharp-enough visuals, with a ton of extra clarity thanks to the high refresh rate. If you're on a budget, your hardware might not always hit a full 240 fps, and that's where the monitor's G-Sync compatibility comes in to prevent stutter and tearing. Get ready for super-smooth gaming.

In general, it works really well, but there are requirements. You need an Nvidia graphics card (minimum of a GTX 650 Ti Boost) and a G-Sync-enabled monitor connected by DisplayPort 1.2 or newer, and you need to turn it on in the Nvidia control panel software.

All G-Sync monitors have a proprietary G-Sync module in them that allows the technology to work, and with the right monitor and video card, you can get synced frame rates up to 240Hz.

G-Sync monitors tend to be around $100 to $200 more than the comparable FreeSync panels (which is attributed to the required module), so ‘cheap’ has a slightly different connotation here. That said, the best cheap G-Sync monitors let you experience a tear-free gaming experience without having to break the bank.

Better yet, Nvidia has certified that some FreeSync monitors are G-Sync Compatible, making them an amazingly affordable way to get a completely tear-free and smooth gaming experience with an Nvidia-powered system.

While FreeSync monitors don't include a dedicated G-Sync display module, Nvidia has tested over 400 displays and certified that at least 50 of them will definitely work with G-Sync. That said, our own testing has revealed there are even more compatible monitors out there.


Please note: some Adaptive Sync monitors ship with the variable refresh rate setting disabled. Consult your monitor's user manual to confirm the Adaptive Sync setting is enabled. Also, some monitors may have their DisplayPort mode set to DisplayPort 1.1 for backwards compatibility; the monitor must be configured for DisplayPort 1.2 or higher to support Adaptive Sync.

If your Adaptive Sync monitor isn’t listed as a G-SYNC Compatible monitor, you can enable the tech manually from the NVIDIA Control Panel. It may work, it may work partly, or it may not work at all. To give it a try:

3. From within Windows, open the NVIDIA Control Panel -> select "Set up G-SYNC" from the left column -> check the "Enable settings for the selected display model" box, and finally click the Apply button at the bottom right to confirm your settings.

For the best gaming experience we recommend NVIDIA G-SYNC and G-SYNC Ultimate monitors: those with G-SYNC processors that have passed over 300 compatibility and quality tests, and feature a full refresh rate range from 1Hz to the display panel's max refresh rate, plus other advantages like variable overdrive, refresh rate overclocking, ultra-low motion blur display modes, and industry-leading HDR with 1000 nits, full matrix backlight and DCI-P3 color.


When shopping for a gaming monitor, you’ll undoubtedly come across a few displays advertising Nvidia’s G-Sync technology. In addition to a hefty price hike, these monitors usually come with gaming-focused features like a fast response time and high refresh rate. To help you know where your money is going, we put together a guide to answer the question: What is G-Sync?

In short, G-Sync is a hardware-based adaptive refresh technology that helps prevent screen tearing and stuttering. With a G-Sync monitor, you’ll notice smoother motion while gaming, even at high refresh rates.

G-Sync is Nvidia's hardware-based monitor syncing technology. It mainly solves screen tearing, by synchronizing your monitor's refresh rate with the frames your GPU is pushing out each second.

V-Sync emerged as a solution. This software-based feature essentially forces your GPU to hold frames in its buffer until your monitor is ready to refresh. That solves the screen tearing problem, but it introduces another: input lag. V-Sync forces your GPU to hold frames it has already rendered, which causes a slight delay between what's happening in the game and what you see on screen.
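The input-lag cost of holding frames is easy to put numbers on. The following is a hypothetical illustration only, not any driver's real code: a frame that finishes rendering just after a refresh must sit in the buffer until the next refresh before it can be shown.

```python
# Hypothetical illustration (not any driver's real code) of why V-Sync adds
# input lag: a frame that finishes rendering just after a refresh must sit
# in the buffer until the next refresh before it can be shown.

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.7 ms between refreshes

def vsync_wait_ms(frame_done_ms: float) -> float:
    """How long a finished frame waits in the buffer before the next refresh."""
    since_last_refresh = frame_done_ms % REFRESH_INTERVAL_MS
    return REFRESH_INTERVAL_MS - since_last_refresh

# A frame completed 1 ms after a refresh waits almost a full interval (~15.6 ms).
print(round(vsync_wait_ms(17.7), 1))
```

At 60Hz the worst case is nearly 16.7ms of added delay per frame, which is exactly the slight lag between game and screen described above.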

Nvidia introduced a hardware-based solution in 2013 called G-Sync. It shares the core idea behind VESA's Adaptive-Sync technology, which enables variable refresh rates on the display side. Instead of forcing your GPU to hold frames, G-Sync forces your monitor to adapt its refresh rate to the frames your GPU is rendering. That deals with both input lag and screen tearing.

However, Nvidia uses a proprietary board that replaces the typical scaler board, which controls everything within the display like decoding image input, controlling the backlight, and so on. A G-Sync board contains 768MB of DDR3 memory to store the previous frame so that it can be compared to the next incoming frame. It does this to decrease input lag.

With G-Sync active, the monitor becomes a slave to your PC. As the GPU rotates the rendered frame into the primary buffer, the display clears the old image and gets ready to receive the next frame. As the frame rate speeds up and slows down, the display renders each frame accordingly as instructed by your PC. Since the G-Sync board supports variable refresh rates, images are often redrawn at widely varying intervals.

For years, there’s always been one big caveat with G-Sync monitors: You need an Nvidia graphics card. Although you still need an Nvidia GPU to fully take advantage of G-Sync — like the recent RTX 3080 — more recent G-Sync displays support HDMI variable refresh rate under the “G-Sync Compatible” banner (more on that in the next section). That means you can use variable refresh rate with an AMD card, though not Nvidia’s full G-Sync module. Outside of a display with a G-Sync banner, here’s what you need:

For G-Sync Ultimate displays, you’ll need a hefty GeForce GPU to handle HDR visuals at 4K. They’re certainly not cheap, but they provide the best experience.

As for G-Sync Compatible, it’s a newer category. These displays do not include Nvidia’s proprietary G-Sync board, but they do support variable refresh rates. These panels typically fall under AMD’s FreeSync umbrella, which is a competing technology for Radeon-branded GPUs that doesn’t rely on a proprietary scaler board. Nvidia tests these displays to guarantee “no artifacts” when connected to GeForce-branded GPUs. Consider these displays as affordable alternatives to G-Sync and G-Sync Ultimate displays.

Since G-Sync launched in 2013, it has always been specifically for monitors. However, Nvidia is expanding. Last year, Nvidia partnered with LG to certify recent LG OLED TVs as G-Sync Compatible. You’ll need some drivers and firmware to get started, which Nvidia outlines on its site. Here are the currently available TVs that support G-Sync:

FreeSync offers more freedom in supported monitor options, and you don't need extra hardware, so it is a budget-friendly alternative to G-Sync Compatible hardware. Asus's MG279Q is around $100 less than the aforementioned ROG Swift monitor.

In addition, users point to a lack of compatibility with Nvidia’s Optimus technology. Optimus, implemented in many laptops, adjusts graphics performance on the fly to provide the necessary power to graphics-intensive programs and optimize battery life. Because the technology relies on an integrated graphics system, frames move to the screen at a set interval, not as they are created as seen with G-Sync. One can purchase an Optimus-capable device or a G-Sync-capable device, but no laptop exists that can do both.


On Friday NVIDIA announced G-Sync, and considering the few details available out there, I wanted to write a quick follow-up on this new technology, as it really is a big announcement. In recent years we have all been driven by the knowledge that on a 60 Hz monitor you want 60 FPS rendered, and for good reason: you want the two as close to each other as possible, as that offers you not only the best gaming experience but also the best visual experience. This is why framerate limiters are so popular; you sync each rendered frame with your monitor's refresh rate. Obviously, 9 out of 10 times that is not happening. This results in two anomalies that everybody knows and experiences: stutter and tearing.

Very simply put, the graphics card is always firing off frames as fast as it possibly can; that FPS is dynamic and can bounce from, say, 30 to 80 FPS in a matter of split seconds. On the viewing side you have the monitor, a fixed device that refreshes at, for example, 60 Hz. Fixed and dynamic are two different things and collide with each other. So on one end we have the graphics card rendering at a varying framerate while the monitor refreshes at 60 images per second. That causes a problem: with an FPS slower or faster than 60, you'll get multiple images displayed on the screen per refresh of the monitor. Graphics cards simply don't render at fixed speeds. In fact, their frame rates will vary dramatically even within a single scene of a single game, based on the instantaneous load the GPU sees.
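The collision between a dynamic frame rate and a fixed scanout can be sketched with a little arithmetic. This hypothetical snippet estimates where on screen a tear lands when a buffer flip happens mid-scanout; the line count and timing model are illustrative assumptions, not how any GPU actually exposes this.

```python
# Hypothetical sketch: where a tear appears when a new frame arrives while
# a fixed 60 Hz display is partway through scanning out the previous frame.
# Line count and timing model are illustrative assumptions.

REFRESH_HZ = 60
SCANOUT_MS = 1000 / REFRESH_HZ  # one top-to-bottom scanout every ~16.7 ms
LINES = 1080                    # vertical lines scanned per refresh

def tear_line(flip_ms: float) -> int:
    """Scanline being drawn when the buffer flip lands -- the visible tear."""
    phase = (flip_ms % SCANOUT_MS) / SCANOUT_MS  # 0..1 progress into scanout
    return int(phase * LINES)

# A flip halfway through the scanout tears near mid-screen (~line 540).
print(tear_line(25.0))
```

Because frame completion times drift relative to the fixed scanout, the tear line wanders up and down the screen from refresh to refresh, which is what makes it so noticeable.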

Enabling VSYNC helps a lot, but with the video card firing off all these images per refresh you can typically see some pulsing (I don't want to call it vsync stuttering) when the framerate varies and you pan from left to right in a 3D scene. So that is not perfect either.

Nvidia is releasing G-Sync. As I explained, the graphics card runs at a dynamic rate while the monitor runs at a static Hz; the two don't really match. G-Sync is both a software and a hardware solution that will solve screen tearing and stuttering. A daughter board (it actually looks a little like a mobile MXM module) is placed into a G-Sync-enabled monitor, and it does something very interesting: with G-Sync the monitor becomes a slave to your graphics card, as its refresh rate in Hz becomes dynamic. Yes, it is no longer static. Each time your graphics card has rendered a frame, that frame is aligned with the monitor's refresh. With the graphics card and monitor dynamically in sync with each other, you have eliminated stutter and screen tearing completely.
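The idea can be sketched in a few lines. This is an illustrative model only; the clamping behavior and panel limits below are assumptions for the sketch, not NVIDIA's actual implementation. The display's refresh interval simply follows the GPU's frame time, within the range the panel can physically support.

```python
# Illustrative model only (assumed panel limits, not NVIDIA's implementation):
# with G-Sync the display's refresh interval follows the GPU's frame time,
# clamped to the refresh range the panel can physically support.

PANEL_MIN_HZ = 30    # assumed slowest refresh the panel tolerates
PANEL_MAX_HZ = 144   # assumed fastest refresh the panel supports

def refresh_interval_ms(frame_time_ms: float) -> float:
    """Scanout interval matching this frame, within the panel's limits."""
    fastest = 1000 / PANEL_MAX_HZ  # can't refresh faster than the max Hz
    slowest = 1000 / PANEL_MIN_HZ  # can't hold the image past the min Hz
    return min(max(frame_time_ms, fastest), slowest)

# Variable GPU frame times (ms) map directly onto the panel's refresh timing:
for ft in (5.0, 12.5, 20.0, 40.0):
    print(ft, "->", round(refresh_interval_ms(ft), 2))
```

Note how frame times inside the panel's range are matched exactly, while very fast or very slow frames get clamped, which is why the low-FPS caveats discussed below still matter.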

It gets even better: without stutter and screen tearing, on a nice IPS LCD panel even at 30+ Hz you'd have an incredibly good gaming experience (visually). By the way, monitors up to 177 Hz will be supported with G-Sync, as well as 4K monitors.

Not a lot, really. But sure, low FPS could get nasty, as say 10 FPS on an LCD panel would look weird. Now, 10 fps doesn't mean your panel will flicker at 10 Hz, as LCDs do not flicker, unlike CRTs, which have a physical refresh. Even if your video card delivers 3 frames per second it will be a slideshow, but it should be a pretty nice one: when a new frame arrives, it will be drawn in 5ms (or 2ms, or 1ms), according to the monitor's specs. Still, in an optimal situation you will need a graphics card that can stay above 30 FPS as a minimum. Secondly, dynamically altering the refresh rate of your monitor has to put some load on the monitor hardware, and it MIGHT have an effect on your monitor's lifespan. Last but not least, it is Nvidia proprietary technology and thus works with selected Nvidia GeForce graphics cards only.

That has not yet been disclosed, but we think you can expect a 75 EUR/USD price premium per monitor for this solution. After such an upgrade, even a GeForce GTX 760 running at 30+ Hz/FPS would deliver a very nice visual gaming experience. We learned that Asus will release the VG248QE (used in the demo) in a G-Sync-enhanced version for 399 US dollars.

In the end, we feel Nvidia G-Sync has the potential to be a game changer in the PC gaming industry. Even with a more mainstream graphics card you'll be enhancing your graphics experience greatly; think of it: no more vsync stutter or screen tearing. That means silky smooth, input-lag-free gaming at, say, 40 FPS. As such, G-Sync has huge potential for you gamers and for the hardware industry.


Continue reading to learn about how Adaptive Sync prevents screen tearing and game stuttering for the smoothest gameplay possible. Or discover ViewSonic ELITE’s range of professional gaming monitors equipped with the latest sync capabilities.

However, no matter how advanced the specifications are, the monitor’s refresh rate and the graphics card’s frame rate need to be synced. Without the synchronization, gamers will experience a poor gaming experience marred with tears and judders. Manufacturers such as NVIDIA, AMD, and VESA have developed different display technologies that help sync frame rates and refresh rates to eliminate screen tearing and minimize game stuttering. And one such technology is Adaptive Sync.

Traditional monitors tend to refresh their images at a fixed rate. However, when a game requires higher frame rates outside of the set range, especially during fast-motion scenes, the monitor might not be able to keep up with the dramatic increase. The monitor will then show a part of one frame and the next frame at the same time.

As an example, imagine that your game is running at 90 FPS (frames per second) but your monitor's refresh rate is 60Hz. This means your graphics card is producing 90 updates per second while the display shows only 60. This overlap leads to split images, almost like a tear across the screen. These lines take away from the viewing experience and hamper gameplay.
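The arithmetic behind that example, as a quick sketch:

```python
# The arithmetic behind the 90 FPS / 60 Hz example above.

fps, hz = 90, 60
frame_ms = 1000 / fps    # a new frame arrives every ~11.1 ms
refresh_ms = 1000 / hz   # a scanout starts every ~16.7 ms

# Each refresh spans 1.5 frame intervals on average, so scanouts regularly
# start on one frame and finish on the next -- the visible tear line.
frames_per_refresh = refresh_ms / frame_ms
print(round(frames_per_refresh, 2))  # 1.5
```

Any non-integer ratio between frame rate and refresh rate produces the same effect; 90-on-60 just makes the mismatch easy to see.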

In any gameplay, different scenes demand varying framerates. The more effects and details a scene has (such as explosions and smoke), the longer it takes to render, and the more the framerate varies. Instead of the display refreshing at the same fixed rate across all scenes, whether they are graphics-intensive or not, it makes more sense to sync the refresh rate accordingly.

Developed by VESA, Adaptive Sync adjusts the display's refresh rate on the fly to match the frames the GPU is outputting. Every frame is displayed as soon as possible to prevent input lag, and none is repeated, thus avoiding game stuttering and screen tearing.

Outside of gaming, Adaptive Sync can also be used to enable seamless video playback at various framerates, anywhere from 23.98 to 60 fps. It changes the monitor's refresh rate to match the framerate of the video content, banishing video stutters and even reducing power consumption.

Unlike V-Sync, which caps your GPU's frame rate to match your display's refresh rate, Adaptive Sync dynamically changes the monitor's refresh rate in response to the framerate the game needs to render. This means it not only eliminates screen tearing but also addresses the juddering effect V-Sync causes when the FPS falls.

To illustrate Adaptive Sync with the diagram VESA uses to explain it: Display A waits until Render B is completed and ready before updating to Display B. This ensures that each frame is displayed as soon as possible, reducing the possibility of input lag; frames are not repeated within the display's refresh window, which avoids game stuttering; and the refresh rate adapts to the rendering framerate, which avoids screen tearing.

NVIDIA G-Sync uses the same principle as Adaptive Sync. But it relies on proprietary hardware that must be built into the display. With the additional hardware and strict regulations enforced by NVIDIA, monitors supporting G-Sync have tighter quality control and are more premium in price.

Both solutions are also hardware bound: if you own a monitor equipped with G-Sync, you will need an NVIDIA graphics card, and likewise a FreeSync display will require an AMD graphics card. However, AMD has released the technology for open use as part of the DisplayPort interface, which allows anyone to enjoy FreeSync on competing devices. There are also G-Sync Compatible monitors available on the market to pair with an NVIDIA GPU.


With Computex kicking off today NVIDIA has a number of announcements hitting the wire at the same time. The biggest news of course is the launch of the GeForce GTX 980 Ti, however the company is also releasing a number of G-Sync announcements today. This includes the launch of Mobile G-Sync for laptops, Windowed G-Sync support for laptops and desktops, new G-Sync framerate control functionality, and a number of new G-Sync desktop monitors.

We"ll kick things off with the biggest of the G-Sync announcements, which is Mobile G-Sync. Today NVIDIA is announcing a very exciting product for notebook gamers. After much speculation (and an early prototype leak) NVIDIA’s G-Sync technology is now coming to notebooks.

Anand took a look at the original G-Sync back in 2013 and for those that need a refresher on the technology, this would be a great place to start. But what G-Sync allows for is a variable refresh rate on the display which allows it to stay in sync with the GPU’s abilities to push out frames rather than forcing everything to work at a single fixed rate as dictated by the display.

From a technical/implementation perspective, because desktop systems can be hooked to any monitor, desktop G-Sync originally required that NVIDIA implement a separate module - the G-Sync module - to be put into the display and to serve as an enhanced scaler. For a desktop monitor this is not a big deal, particularly since it was outright needed in 2013 when G-Sync was first introduced. However with laptops come new challenges and new technologies, and that means a lot of the implementation underpinnings are changing with the announcement of Mobile G-Sync today.

With embedded DisplayPort (eDP) now being a common fixture in high-end notebooks these days, NVIDIA will be able to do away with the G-Sync module entirely and rely just on the variable timing and panel self-refresh functionality built in to current versions of eDP. eDP's variable timing functionality was of course the basis of desktop DisplayPort Adaptive-Sync (along with AMD's FreeSync implementation), and while the technology is a bit different in laptops, the end result is quite similar. Which is to say that NVIDIA will be able to drive variable refresh laptops entirely with standardized eDP features, and will not be relying on proprietary features or hardware as they do with desktop G-Sync.

Removing the G-Sync module offers a couple of implementation advantages. The first of these is power; even though the G-Sync module replaced a scaler, it was a large and relatively power-hungry device, which would make it a poor fit for laptops. The second advantage is that it allows G-Sync to be implemented against traditional, lower-cost laptop eDP scalers, which brings the price of the entire solution down. In fact for these reasons I would not be surprised to eventually see NVIDIA release a G-Sync 2.0 for desktops using just DisplayPort Adaptive-Sync (for qualified monitors only, of course), however NVIDIA obviously isn't talking about such a thing at this time. Laptops as compared to desktops do have the advantage of being a known, fixed platform, so there would be a few more issues to work out to bring something like this to desktops.

As far as qualification goes, the qualification process is designed to ensure a minimum level of overall quality in products that receive G-Sync branding, along with helping ODMs tune their notebooks for G-Sync. This process is something NVIDIA considers a trump card of sorts for the technology, and something they believe delivers a better overall experience. From what we're hearing on quality, it sounds like NVIDIA is going to put their foot down on low-quality panels, for example, so that the G-Sync brand and experience doesn't get attached to subpar laptops. Meanwhile the tuning process involves a similar process as on the desktop, with laptops and their respective components going through profiling and optimization to determine their refresh properties and pixel response times in order to set G-Sync timings and variable overdrive.

Which on that note (and on a slight tangent), after initially staying mum on the issue in the early days of G-Sync (presumably as a trade secret), NVIDIA is now confirming that all G-Sync implementations (desktop and mobile) include support for variable overdrive. As implied by the name, variable overdrive involves adjusting the amount of overdrive applied to a pixel in order to make overdrive more compatible with variable refresh timings.

As a quick refresher, the purpose of overdrive in an LCD is to decrease the pixel response time and resulting ghosting by overdriving pixels to get them to reach the desired color sooner. This is done by setting a pixel to a color intensity (voltage) above or below where you really want it to go, knowing that due to the response times of liquid crystals it will take more than 1 refresh interval for the pixel to reach that overdriven value. By driving a pixel harder and then stopping it on the next refresh, it's possible to reach a desired color sooner (or at least, something close to the desired color) than without overdrive.

Overdrive has been a part of LCD displays for many years now, however the nature of overdrive has always implied a fixed refresh rate, as it's not possible to touch a pixel outside of a refresh window. This in turn leads to issues with variable refresh, as you don't know when the next refresh may happen. Ultimately there's no mathematically perfect solution here - you can't predict the future with 100% accuracy - so G-Sync variable overdrive is a best-effort attempt to predict when the next frame will arrive, and adjusting the overdrive values accordingly. The net result is that in motion it's going to result in a slight decrease in color accuracy versus using a fixed refresh rate due to errors in prediction, but it allows for an overall reduction in ghosting versus not running overdrive at all.
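That best-effort prediction can be sketched roughly as follows. This is purely illustrative pseudologic, not NVIDIA's actual algorithm: it guesses the next frame interval from a short history of recent intervals, then scales how hard pixels are overdriven to fit that prediction. All names and constants here are made up for the sketch.

```python
# Purely illustrative sketch of "variable overdrive" (not NVIDIA's algorithm):
# guess the next frame interval from recent history, then push pixels harder
# when frames are expected to arrive quickly. All names/constants are made up.

from collections import deque

class OverdrivePredictor:
    def __init__(self, window: int = 4):
        self.intervals = deque(maxlen=window)  # recent frame times, in ms

    def observe(self, interval_ms: float) -> None:
        self.intervals.append(interval_ms)

    def predicted_interval(self) -> float:
        """Best-effort guess: average of the last few frame intervals."""
        return sum(self.intervals) / len(self.intervals)

    def overdrive_value(self, current: int, target: int) -> float:
        """Drive past the target; push harder when frames arrive quickly."""
        gain = 16.7 / self.predicted_interval()  # ~1.0 at 60 Hz pacing
        return target + 0.25 * gain * (target - current)

p = OverdrivePredictor()
for ms in (10.0, 11.0, 12.0, 11.0):
    p.observe(ms)
print(p.predicted_interval())       # average of the observed intervals
print(p.overdrive_value(100, 200))  # overshoots above the target of 200
```

When the prediction is wrong, the pixel is driven slightly too hard or not hard enough for the interval that actually occurs, which is exactly the small motion color-accuracy error described above.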

But getting back to the subject at hand of mobile G-Sync, this is a big win for notebooks for a couple of reasons. First, more notebooks are sold now than desktops, so this makes G-Sync available to a bigger audience. Of course not all those devices even have GPUs, but NVIDIA has seen steady growth in the mobile GeForce segment over the last while, so the market is strong. The other reason this is important though is because mobile products are much more thermally constrained, as well as space constrained, so the mobile parts are always going to be slower than desktop parts. That gap has reduced with the latest Maxwell parts, but it is still there. G-Sync on mobile should help even more than it does on the desktop due to the lower overall framerate of laptop parts.

In order for G-Sync to be available on a laptop, a couple of things need to be true. First, the laptop must have a GeForce GPU obviously. Second, the laptop manufacturer needs to work with NVIDIA to enable this, since NVIDIA has to establish the parameters for the particular laptop panel in order to correctly know the maximum and minimum refresh rate as well as the amount of over/under-drive necessary. But the third is the big one. The laptop display must be directly connected to the GeForce GPU.

What this means is that in order for G-Sync to be available, Optimus (NVIDIA’s ability to switch from the integrated graphics to the discrete NVIDIA graphics) will not be available. They are, at least for now, mutually exclusive. As a refresher on Optimus, the integrated GPU is the one actually connected to the display, and when Optimus is enabled, the iGPU acts as an intermediary and is the display controller. The discrete GPU feeds through the iGPU and then to the display. Because G-Sync requires the GeForce GPU to be directly connected to the display, Optimus-enabled notebooks will not have G-Sync available.

Obviously this is a big concern because Optimus is found on almost all notebooks that have GeForce GPUs, and it has been one of the big drivers of reasonable battery life on gaming notebooks. Going forward, it is likely that true gaming notebooks will drop this support in order to offer G-Sync, while more versatile devices which may use the GPU only once in a while, or for compute purposes, will likely keep it. There is a trade-off that the ODM needs to consider. I asked specifically about this, and NVIDIA feels that it is less of an issue than it was in the past because they have worked very hard on the idle power levels of Maxwell, but despite this there is likely going to be a hit to battery life. This is something we'd like to test, so hopefully we'll be able to properly quantify the trade-off in the future.

As for release details, mobile G-Sync is going to be available starting in June with laptops from Gigabyte’s Aorus line, MSI, ASUS, and Clevo. Expect more soon, though, since this should be a killer feature for less powerful laptops.

Wrapping things up, as I mentioned before, mobile G-Sync seems like a good solution to the often lower capabilities of gaming laptops, and it should really bring G-Sync to many more people since a dedicated G-Sync capable monitor is not required. It really is a shame that it does not work with Optimus, though, since that has become the standard on NVIDIA based laptops. ODMs could use a hardware multiplexer to get around this, which was the solution prior to Optimus, but due to the added cost and complexity, my guess is that this will not be available on very many, if any, laptops which want to leverage G-Sync.

The second major G-Sync announcement coming from NVIDIA today is that G-Sync is receiving windowed mode support, with that functionality being rolled into NVIDIA's latest drivers. Before now, running a game in windowed mode could cause stutters and tearing because once you are in windowed mode, the image being output is composited by the Desktop Window Manager (DWM) in Windows. Even though a game might be outputting 200 frames per second, DWM will only refresh the image with its own timings. The off-screen buffer for applications can be updated many times before DWM updates the actual image on the display.

NVIDIA will now change this using their display driver, and when Windowed G-Sync is enabled, whichever window is the current active window will be the one that determines the refresh rate. That means if you have a game open, G-Sync can be leveraged to reduce screen tearing and stuttering, but if you then click on your email application, the refresh rate will switch back to whatever rate that application is using. Since this is not always going to be a perfect solution - without a fixed refresh rate, it's impossible to make every application perfectly line up with every other application - Windowed G-Sync can be enabled or disabled on a per-application basis, or just globally turned on or off.

Meanwhile SLI users will be happy to know that Windowed G-Sync works there as well. However there will be a slight catch: for the moment it works for 2-way SLI, but not 3-way or 4-way SLI.

Finally, NVIDIA is also noting at this time that Windowed G-Sync is primarily for gaming applications, so movie viewers looking to get perfect timing in their windowed media players will be out of luck for the moment. The issue here isn’t actually with Windowed G-Sync, but rather current media players do not know about variable refresh technology and will always attempt to run at the desktop refresh rate. Once media players become Windowed G-Sync aware, it should be possible to have G-Sync work with media playback as well.

Third up on NVIDIA’s list of G-Sync announcements is support for controlling the behavior of G-Sync when framerates reach or exceed the refresh rate limit of a monitor. Previously, NVIDIA would cap the framerate at the refresh rate, essentially turning on v-sync at very high framerates. However with their latest update, NVIDIA is going to delegate that option to the user, allowing users to either enable or disable the framerate cap as they please.

The tradeoff here is that capping the framerate ensures that no tearing occurs since there are only as many frames as there are refresh intervals, but it also introduces some input lag if frames are held back to be displayed rather than displayed immediately. NVIDIA previously opted for a tear-free experience, but now will let the user pick between tear-free operation or reducing input lag to the bare minimum. This is one area where NVIDIA’s G-Sync and AMD’s Freesync implementations have significantly differed – AMD was the first to allow the user to control this – so NVIDIA is going for feature parity with AMD in this case.
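
The trade-off NVIDIA is now exposing can be modeled in a few lines. This is a simplified sketch of the behavior described above, not NVIDIA's implementation, and the function name and return values are our own illustration.

```python
def above_refresh_behavior(frame_rate: float, refresh_hz: float, cap: bool) -> dict:
    """What happens when the GPU outruns the monitor: capping holds each
    finished frame for the next refresh (tear-free, adds latency), while
    uncapping scans out the newest frame immediately (tearing, minimal lag).
    Simplified model of the trade-off, not NVIDIA's implementation."""
    if frame_rate <= refresh_hz:
        # Below the refresh limit, G-Sync behaves the same either way.
        return {"tearing": False, "max_added_lag_ms": 0.0}
    if cap:
        # Worst case, a finished frame waits nearly one full frame time.
        return {"tearing": False, "max_added_lag_ms": 1000.0 / frame_rate}
    return {"tearing": True, "max_added_lag_ms": 0.0}

print(above_refresh_behavior(200, 144, cap=True))   # tear-free, ~5 ms worst-case hold
print(above_refresh_behavior(200, 144, cap=False))  # immediate, but may tear
```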

Last but certainly not least from today’s G-Sync announcements, NVIDIA is announcing that their partners Acer and Asus are preparing several new G-Sync monitors for release this year. Most notably, both will be releasing 34” 3440x1440 ultra-wide monitors. Both displays are IPS based, with the Asus model topping out at 60Hz while the Acer model tops out at 75Hz. Meanwhile Acer will be releasing a second, 35” ultra-wide based on a VA panel and operating at a resolution of 2560x1080.

5K monitors. Take it up a notch with the LG UltraFine monitor, boasting an immersive 27-inch display with 5120 x 2880 resolution and 218 ppi that lets you experience the beauty of a MacBook or MacBook Pro on a large 5K screen. Add to that an unbelievable color spectrum of P3 99% and you’ll be editing and enjoying high-res photos like never before. Too much for you to handle? There’s an impressive 21.5-inch Mac-friendly 4K version as well.

Gaming monitors. Get panoramic multitasking and immersive gaming with features like a 144Hz refresh rate—the highest of any gaming monitor—plus a response time of less than 1ms (1/1000 of a second) in Motion 240 Mode. It’s the ideal monitor for real-time strategy and first-person shooters. Dynamic Action Sync lets you catch every single moment in real time, while LG’s Black Stabilizer brightens dark scenes so the enemy can’t hide. Get true-to-life color and brilliant clarity from virtually every angle.

TV monitors. Get Full HD 1080p entertainment from the TV, as well as your computer. With features like a built-in digital tuner and Dolby Surround sound, LG TV monitors can bring your favorite movies and TV shows to life, along with your favorite online content and games.

This single PCIe Gen4 slot cannot be included in any type of RAID setup on your laptop. Only the 3 NVMe PCIe Gen3 drives can be put into a RAID 0, 1, or 5 array.

The reason for this is that the technology used in the newer Gen4 slot is not compatible enough with the Gen3 slots for the drives to be combined in a RAID array.

Starting your notebook from powered off, you hit the power button and press F2 to go into the BIOS; however, the computer boots directly into Windows, not reacting to F2 and not entering the BIOS.

You just upgraded your RAM to the maximum capacity of 128GB in your Eurocom Sky Z7 R2 but now your notebook will not boot. It starts but just runs and runs and never boots up.

The way to fix this issue is to add your RAM one stick at a time. If you are doing a major upgrade in RAM, sometimes the system resources are overwhelmed by the large change. Simply put one 32GB stick of RAM in your notebook and start your notebook. Then shut off your notebook, add a second stick, and restart. Then do the same with the 3rd and 4th.

Yes...If you load Linux Mint, the IRST (Intel Rapid Storage Technology) will not run correctly. This means you cannot set up any type of RAID array with your hard drives. If you are not interested in running any type of RAID array, this issue will not affect you.

All our systems have been tested with Linux/Ubuntu and they do work properly. Until recently the only known restriction was that you could not use a Killer WLAN card with Linux/Ubuntu. Recent releases of Linux have resolved this issue.

If your notebook is configured with G-Sync you may encounter some older video games that do not support G-Sync. To play these games you simply temporarily disable G-Sync using the following method...

If you are interested, when you purchase a new Eurocom Notebook you can request we install a Linux OS for you. We can install Ubuntu 14.04 LTS and the cost is $75.00

Unfortunately no. In order for a notebook to use NVIDIA G-Sync it must come from the factory with G-Sync enabled. Part of the requirements are as follows...

If your notebook is equipped with an M.2 hard drive you can use it alone and as the primary drive in your notebook. There are no restrictions on how it can be used.

Occasionally Microsoft will use their Auto Update feature in Windows 10 to send out new drivers to your notebook. Generally this is fine; however, sometimes you do not want certain drivers updated, such as your video card drivers, as the newest drivers may not work properly in your notebook. To turn off these driver updates please follow this easy procedure...

During normal use the fans will increase speed on their own as heat builds; however, many users like to prevent the temperatures in their notebook from building, so they toggle the fans to full speed before stressing the notebook.

If you are just sitting down to intense gaming or starting some heavy workload on your notebook and you want to turn your fans on full speed to keep things nice and cool, simply use the keystroke FN-1 and this will toggle the fans to full speed. Toggle the FN-1 again and the fans will return to their normal speed as you choose.

On the Eurocom Tornado notebook there is a Fan button on the top case that will toggle the fans between full speed and normal as it is pushed. This is the only Eurocom notebook model that does not use the FN-1 keystroke to control fan speed.

The drivers for all Video Cards are available in our Product Showroom. On the Orange Options Menu you will see a series of links. Please choose Drivers and you will be taken to the proper drivers for your notebook.

Yes, it sure will. New drivers have been released that have resolved past problems. The only restriction is that RAID will not work with a Linux OS on Eurocom notebooks.

This can be fixed by v-sync, but that has its own drawbacks: mainly input lag and deteriorated performance. Input lag is rather straightforward: you press a key and there’s a delay before you see the intended result. These two problems (though they may not seem like much of an issue) can be the difference between victory and defeat in eSports and competitive titles.

Traditional monitors come with a fixed refresh rate, most commonly 60 Hz. This is the rate at which the monitor refreshes the screen before displaying the next frame; it tells you how many frames your monitor can display per second without tearing. However, as I’m sure you already know, games don’t always run at a fixed frame rate. There are inconsistencies: sometimes the GPU ends up rendering more frames than your monitor can display, sometimes fewer. This results in screen tearing and lag, respectively.
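
That mismatch is easy to see in a toy classifier. This is a deliberate simplification (it ignores exactly where in the scanout a new frame lands), and the function name is ours.

```python
def fixed_refresh_outcome(frame_time_ms: float, refresh_hz: float = 60.0) -> str:
    """Classify what a fixed-refresh display with v-sync off does with a
    rendered frame (simplified: ignores exact scanout position)."""
    refresh_ms = 1000.0 / refresh_hz
    if frame_time_ms < refresh_ms:
        return "tear"     # a new frame replaces the old one mid-scan
    if frame_time_ms > refresh_ms:
        return "stutter"  # the frame missed a refresh; the old frame repeats
    return "smooth"       # frame rate exactly matches the refresh rate

print(fixed_refresh_outcome(8.3))        # ~120 FPS on 60 Hz -> "tear"
print(fixed_refresh_outcome(25.0))       # ~40 FPS on 60 Hz -> "stutter"
print(fixed_refresh_outcome(1000 / 60))  # exactly 60 FPS -> "smooth"
```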

With NVIDIA’s G-Sync monitors (not G-Sync compatible), there’s one big caveat. They require a proprietary G-Sync module which costs a good $100-200, greatly adding to the cost of the monitor. To make matters worse, NVIDIA is the ONLY supplier of these G-Sync enabling kits.

AMD’s FreeSync technology, on the other hand, uses the VESA-certified Adaptive-Sync standard built atop DP 1.2a. So basically any monitor with DisplayPort 1.2a or higher can integrate FreeSync. The best part is that AMD itself doesn’t manufacture the FreeSync scaler hardware. Instead, a number of third-party OEMs do the job. And since there is more than one manufacturer (competition), the prices are much lower. You can get a FreeSync monitor for as low as $100!

In contrast, even the cheapest G-Sync Compatible monitors which basically use the same technology cost nearly twice as much. Apparently, Jensen and Co. vouch for each monitor individually and not all displays make the cut. That requires resources and thereby the additional cost. Or so they say. Regardless, you can now use most FreeSync monitors with NVIDIA cards over DP 1.2a out of the box.

There is one main difference between how G-Sync (again, not G-Sync Compatible) and FreeSync function. NVIDIA’s G-Sync displays support variable refresh across the monitor’s full range (using LFC when the average FPS is too low), while with FreeSync (and G-Sync Compatible) monitors, it’s supported within a range, usually between 45 and 75 Hz. If the frame rate goes above 75 Hz, there will be tearing. However, there still won’t be any input lag, which is more important in eSports and fast-paced shooters.

These days, most higher-end FreeSync and FreeSync 2 monitors have a wide supported range, and very rarely will you face screen tearing if you’re using one of these displays.

One of the main differences between FreeSync and G-Sync is that the former, like most of AMD’s technologies, is open source. It leverages the VESA Adaptive-Sync standard that comes along with DisplayPort 1.2a. As such, there are no penalties or royalties that need to be paid to implement FreeSync, allowing OEMs to integrate it into even the cheapest of monitors. The lower-end FreeSync models cost less than $120.

When it comes to quality, G-Sync monitors take the cake. Of course, they cost more than most modern graphics cards, and that’s the drawback. Low-end FreeSync monitors “get the job done”. Not to say that they are inferior per se, but most of the cheap ones only support Adaptive Sync between 48Hz and 75Hz. Basically, if your frame rate goes below 48, it’ll result in unbearable stuttering.

AMD has come up with something called Low Framerate Compensation (LFC) to deal with this, but only the higher-end models support it. LFC duplicates the frames to push up the average frame rate to the minimum supported by your monitor. Say you’re getting 25 FPS in a game and your monitor supports FreeSync north of 50 FPS. Then LFC will render an identical copy of each frame and display it between constant intervals to increase the average FPS to 50.
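
The duplication logic described above can be sketched directly. This mirrors the article's 25 FPS / 50 Hz example; the function is our illustration, not AMD's actual algorithm, and it ignores the upper bound of the VRR window for simplicity.

```python
def lfc_multiplier(fps: float, vrr_min_hz: float) -> int:
    """How many times each frame must be shown so the effective refresh
    rate lands at or above the monitor's minimum VRR frequency.
    Illustrative sketch of Low Framerate Compensation, not AMD's code."""
    if fps >= vrr_min_hz:
        return 1  # already inside the supported range, no duplication
    n = 2
    while fps * n < vrr_min_hz:
        n += 1
    return n

# The article's example: 25 FPS on a monitor whose VRR window starts at
# 50 Hz -> each frame is displayed twice, for an effective 50 Hz.
print(lfc_multiplier(25, 50))  # -> 2
print(lfc_multiplier(12, 50))  # -> 5 (12 * 5 = 60 Hz)
```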

As already mentioned, G-Sync Compatible monitors are FreeSync monitors (VESA Adaptive-Sync) that NVIDIA has tested and approved. While technically you can use any FreeSync monitor as a G-Sync compatible display, the ones that are not approved might run into issues like flickering, frame skipping, etc.

Keep in mind that most G-Sync Compatible monitors certified by NVIDIA are usually the more expensive FreeSync models, the ones that come with LFC. Of course, NVIDIA claims that these monitors undergo several dozen tests, but LFC support is the main advantage.

Lastly, there’s FreeSync 2 and G-Sync Ultimate. These come with advanced features such as HDR1000, LFC, 95% DCI-P3 color gamut, and a brightness rating higher than 1000 nits. They are the best monitors in all the land. However, they will cost you an astronomical amount, and that goes doubly for the G-Sync Ultimate ones.

AMD has got another advantage with respect to connectivity. Traditional G-Sync monitors only work over Display Port. Many G-Sync Compatible monitors support HDMI as well, but the more expensive ones are largely limited to DP. Both FreeSync and FreeSync2 monitors come with HDMI as well as DP support, providing more versatile connectivity options.

So there you have it, G-Sync vs FreeSync, vs G-Sync Compatible AKA “FreeSync NVIDIA Edition”. Earlier, the main difference was that FreeSync was for the masses: Not the best but affordable. G-Sync was largely limited to enthusiasts with deep pockets. The differences are more subtle now. FreeSync2 is improved and older screens with LFC are mostly on par with G-Sync Compatible models, but can’t be had for the same dirt cheap prices. With NVIDIA’s adoption of the VESA standard, the playing field has mostly leveled. However, you can still find FreeSync monitors much cheaper than rival G-Sync screens. They’re not the best, but considering the dirt cheap prices, they’re well worth it.

Ever since I had that Vsync epiphany, or at least up until last fall when NVIDIA announced G-SYNC, I considered its design and effect on games to be very good. But, I hadn’t put thought into its downsides; the biggest one being that your game is unlikely to be able to run in perfect sync with your display, which is to say that if you’re running a 60Hz monitor, your game would have to perform at 60 FPS 100% of the time. There’s also the side-effect of the GPU and display not working in unison to deliver the best frames.

Admittedly, it wasn’t until I saw G-SYNC in person at a press event last fall that I truly realized how lacking Vsync’s design was. It’s not that I suddenly believe Vsync is a waste of time, because it’s not. Instead, it’s that I realized how much better display sync could be. When NVIDIA’s Tom Peterson showed G-SYNC off to a room full of press, I was reassured pretty quickly by my colleagues that I wasn’t the only one genuinely impressed. NVIDIA seemed to have a winner on its hands.

As much as I’d love to splash the first-half of this article with an explanation of how G-SYNC works, being a review, I think it’d be a little more fair to first take a look at the product on hand; ASUS’ Republic of Gamers SWIFT PG278Q.

The fact that this display includes G-SYNC is a massive clue that it’s targeting gamers, but ASUS didn’t stop there with game-related features. For the sake of getting a look at the hardware out of the way first, I’ll talk about those later.

As great as what G-SYNC brings to the table is, it does unfortunately come with a caveat: displays that use G-SYNC are limited to DisplayPort. This is because G-SYNC requires the bandwidth and display-timing control that, at present, only DisplayPort offers. It really shouldn’t prove to be a limitation to anyone buying a G-SYNC display, since all current NVIDIA graphics cards are guaranteed to include at least one DisplayPort connector.

The downside with this is obvious: Some people, me included, want to use their display for more than one thing. Previous to the PG278Q, I was using a very similar ASUS display called the PB278Q. Because I have somewhat limited space, I used it for multiple things. I hooked my SHIELD portable up to it via HDMI, for starters, and I had another PC here hooked in via DVI. With the PG278Q, it’s G-SYNC / DisplayPort or bust. For a lot of people, this isn’t going to be an issue at all, but it’s definitely something worth noting, because I wouldn’t blame you if you assumed any high-end display would come with more than just a single video connector.

While the PG278Q’s buttons are rarely going to be seen, they carry on with the aesthetics established by the rest of the display. Pushing the top button (which can be moved in four directions and be pressed in) brings up a great-looking menu system. While I couldn’t do it justice with a camera, I included a shot of it below anyway.

Navigating this menu is straightforward, and all of it is controlled with the top multi-directional button. The other buttons are also straightforward, though there are only two that are really special. One of these is GamePlus, which enables an on-screen crosshair and / or timer, either of which can be repositioned on the screen as necessary. The crosshair is meant to aid those who need an easier-to-see crosshair on their screen, as admittedly, it’s easy to lose your in-game one sometimes in the heat of battle.

The other special button is called Turbo, and it’s used to change the refresh rate between 60Hz, 120Hz, and 144Hz. This is a feature that might not be used by many, outside of setting it to 144Hz and being done with it, but it could potentially be useful to those who are suffering compatibility issues. I encountered just such an issue with Tony Hawk’s Pro Skater HD. While the game worked fine for the most part at 120Hz+, it was prone to crashing. When I set the monitor back to 60Hz, the issue disappeared.

As I touched on briefly in the intro, while Vsync serves its purpose pretty well, it has a couple of flaws that can’t be worked around unless the display is able to talk directly to your PC’s graphics card. That’s of course the problem G-SYNC, or “GPU-SYNC”, fixes. With a G-SYNC module installed in a display, the graphics card – in this case, NVIDIA’s own GeForce – gets to call the shots. Rather than have the display inaccurately choose which frames to display, the GPU does instead.

If you’ve ever played a game without Vsync, what I’m about to say should come as no surprise: It’s not good. Framerates might be high, but so too will be the amount of tearing and stuttering. This will be especially evident when quickly turning or moving at a really quick pace. In an FPS, for example, you could see the result simply by moving the mouse left and right; you may not even have to do it quickly. NVIDIA provides an example of what will be displayed during one frame when this tearing occurs:

That may look a little exaggerated, but remember that we’re talking about a single frame here. Even if tearing is very obvious while playing, it’s not going to look quite that bad during gameplay since you’ll be seeing dozens of frames per second.

If you’re running such a high-end PC that your games will never run below your display’s refresh rate, G-SYNC’s benefit isn’t going to be quite as easily seen. The vast majority of people don’t run PCs like that, however, and even the highest-end rigs are still likely to succumb to sub-60 FPS framerates at some point with today’s hottest games running at high detail levels.

The problem becomes more evident when you’re wanting to run a display with an even higher refresh rate. Following the same logic above, a 120Hz display would allow 120 individual frames to be shown each second – and I’m sure it’s obvious, but running today’s best-looking games at good detail without dipping below 120 FPS – even at a modest resolution – is just not going to happen. Even with a killer rig, a hiccup is bound to occur somewhere.

The next graphic shows a side-by-side example of how a game will behave when Vsync is either on or off. When off, the delivered FPS is all over the place, which results in the tearing, while with it on, unpredictable variance in the framerate causes the stuttering.

It might have taken a good number of paragraphs to explain the problem we’re dealing with, but it’ll take just this one to explain NVIDIA’s solution. With extra hardware installed into the monitor, by way of the G-SYNC module, the graphics card and the monitor have extremely good communication. Whenever the GPU renders a frame, it tells the G-SYNC module, and then that frame is seen by you. Because the monitor won’t display anything new until a new frame is available, there’s no tearing, no stuttering, and less lag. Also, unlike Vsync, your framerate isn’t capped; it’s like having Vsync off, but without the problems of having Vsync off.
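
That paragraph can be condensed into a timing sketch. This is a simplified model (zero scanout and transmission time, hypothetical names) comparing when a finished frame becomes visible under G-SYNC versus traditional v-sync.

```python
import math

def visible_times(frame_done_ms, mode, refresh_hz=60.0):
    """When each finished frame appears on screen.  Under G-SYNC the panel
    refreshes the moment the GPU signals a completed frame; under v-sync
    the frame waits for the next fixed refresh boundary.  Simplified model
    that ignores scanout and transmission time."""
    period = 1000.0 / refresh_hz
    if mode == "gsync":
        return list(frame_done_ms)  # displayed as soon as they are ready
    # v-sync: round each completion time up to the next refresh tick.
    return [math.ceil(t / period) * period for t in frame_done_ms]

frames = [10.0, 30.0, 55.0]            # irregular frame completion times
print(visible_times(frames, "gsync"))  # shown immediately
print(visible_times(frames, "vsync"))  # quantized to ~16.7 ms boundaries
```

The v-sync output shows every frame waiting for a refresh boundary; the G-SYNC output shows frames appearing exactly when they finish, which is where the reduced lag and smoothness come from.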

As unfortunate as it is, it’s difficult to truly appreciate G-SYNC without seeing it in person. Even good video cameras will have a hard time accurately portraying the benefits. What you have to really envision is total smoothness. There could still be a lot of variance in the framerate, but because of the way G-SYNC delivers each one of those frames to the monitor, the result will be much more pleasing to the eye. If you’re ever at an NVIDIA-sponsored event, and G-SYNC happens to be shown-off, you’ll likely spot this demo:

Simply called “Pendulum”, this demo lets you turn Vsync on or off, as well as enable G-SYNC, to see the differences between all three modes. This is the exact demo that wowed so many members of the press last fall, and if you’ve already been won over by G-SYNC, you can impress all of your friends with the same demo by grabbing it here.

Not long after NVIDIA took the wraps off of G-SYNC, AMD followed up with ‘FreeSync’, a technology that aims to do the exact same thing. As time went on, we learned that FreeSync is actually going to be part of an upcoming VESA standard, and once released, it should work with any GPU, as long as that GPU supports the feature. I’d suspect that we’ll see good examples of this in action this coming CES. It’s really hard to settle on how FreeSync will truly compare to G-SYNC until actual products hit the market.

At this point, I’ve been using this ASUS monitor for a solid two months, and in that two months, I’ve gamed quite a bit. Well, that’s an understatement. Throughout it all, my experience has been great; even with simpler games I notice the difference. When I load a game up now, I know in advance I am going to get a smooth experience, and that counts for a lot.

What really surprised me about G-SYNC is that even in games that gave me great performance, I still noticed the difference. Defiance is a good example. I logged into this game a couple of weeks ago to do a couple of quick missions, and it just hit me – the game was running so smoothly. Then I of course clued in that G-SYNC was working its magic.

A big reason G-SYNC can make for such smooth gameplay is because you’re essentially running Vsync off, but without the issues of having Vsync off, as I touched on earlier. Your frame rate could vary from 60 to 90 in an instant, and because of the continued smoothness, it’s not going to stand out. And believe me, if you’re running this ASUS monitor at 144Hz, your frame rate is going to vary a lot. Even with a console port like Defiance, I could barely hit 120 FPS standing still while looking at the ground.

Borderlands: The Pre-Sequel (our review) is another game I thoroughly “tested” with G-SYNC, and once again, the experience is fantastic. At max resolution and equipped with a single GeForce GTX 980, the game most often runs at below 100 FPS, and again, there’s heavy variance in the framerate, but it doesn’t matter in the grand scheme.

Not all games are so friendly with high refresh rates, so it’d be wise to not expect the 144Hz or even 120Hz refresh rate to come in handy in every given scenario. Even with Vsync turned off, many games like to cap at 60 FPS; even games that have built-in benchmarks that will tell you a higher framerate than that. Ultra Street Fighter IV, for example, does just that. Its benchmark will peak at 144-ish FPS, but in gameplay, there’s a cap of 60 FPS. Then we have games like King of Fighters XIII that actually should be capped, but are not. In this particular title, running the game at 144Hz is like running it at super-speed. It’s actually pretty ridiculous that the game is designed in such a way.

After having used the PG278Q so much, I’m at the point now where I simply don’t want to have to give up G-SYNC. It’s made me hate ‘Vsync off’ even more, something that became evident with the Maxwell launch, throughout all of my benchmarking. It seems like really simple technology, and maybe it is in a way, but the difference is so good, I want it to catch on in a big way. It’s no wonder AMD was so quick to latch onto FreeSync. Would that have happened had NVIDIA not released G-SYNC? I’m not so sure. The technology would have still existed with this future VESA revision, but I’m not sure we would have paid as much attention to it.

What about the PG278Q itself? Well, I admit that at first, the TN panel was really hard to get used to, especially since I was coming from a wonderful PLS panel of the same size. The TN limitations were especially noticeable when I used portrait mode, as the sides faded a bit unless you were looking at the monitor head-on. Admittedly, though, after just a couple of weeks of use, this was a limitation of the display I didn’t even notice anymore, even though it certainly didn’t disappear. I am very confident in saying that this is one of the best TNs out there.

Another potential downside is that the monitor costs $800. That’s a bit pricey for a 27-inch, given the fact that similar displays (in size and resolution) can be had for less. But, this display not only includes G-SYNC, it also supports up to 144Hz. There’s a premium here, there’s no doubt about that, but for those goods, it’s not that hard to justify.

Overall, this is a fantastic display, and if it’s not obvious by now, I highly recommend it. That’s the upside; the downside is that finding it in stock is tough. You’ll have to become a stalker to get one, and as this monitor has been available for a couple of months, that’s disappointing. I’m hoping to hear back from either NVIDIA or ASUS soon about this availability problem, and see if we can’t get an ETA of when the situation will be remedied.

In a driver update last May, NVIDIA added the ability for users to turn G-Sync off when frame rates are above the maximum refresh rate cap. To do so, simply open the NVIDIA Control Panel and head to