
You won’t need a new monitor for G-Sync soon


G-Sync becoming a free update?

Nvidia’s G-Sync technology is astounding, and it looks like it will finally obliterate screen tearing from PC gaming for good. The problem is that right now you have to shell out a ton of cash for a compatible monitor, or at the very least get fancy with the company’s DIY kit. It’s a pain, and Nvidia probably knows it, which is why they may be getting rid of those steep hardware requirements for good.

At least that’s what a recently leaked driver seems to suggest. Someone managed to dig into a new internal driver (346.87), leaked by ASUS Nordic Support, and found some interesting hidden G-Sync features. Tweaking these allowed a graphics card running the driver to enable G-Sync on most eDP (embedded DisplayPort) monitors, without the need for the G-Sync module at all. In short, it’s G-Sync implemented directly in software.

That’s a very different approach from the one Nvidia and most monitor manufacturers have taken in recent months, and it could spell the end of the need to upgrade your monitor at all. Right now the leaked driver only adds support for some notebook displays, but Nvidia has confirmed that the software is still in the very early stages of development. At the very least, this means they’re working on drivers to support a wider range of monitors in the near future.

If you’re still in the dark, G-Sync is a technology that all but eliminates screen tearing, input lag and stuttering by syncing your monitor’s refresh rate to the frames your graphics card actually delivers. Not only is that important for competitive gaming and eSports; PC gamers in general will experience smoother gameplay with the technology enabled, without the performance hit of traditional V-Sync.
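The difference can be sketched with a toy timing model (purely illustrative — the function names and numbers here are made up for the example, not Nvidia’s code). With a fixed refresh, a frame that misses a refresh tick waits for the next one; with a variable refresh, the display updates the moment each frame is ready:

```python
import math

# Conceptual sketch only (not Nvidia's implementation): compare when
# frames appear on screen with a fixed 60 Hz refresh versus a variable
# refresh rate that follows the GPU.

FIXED_REFRESH_MS = 1000 / 60  # one refresh tick every ~16.7 ms

def display_times_fixed(frame_ready_ms):
    """Fixed refresh: each frame waits for the next refresh tick,
    so late frames cause uneven pacing (stutter)."""
    return [math.ceil(t / FIXED_REFRESH_MS) * FIXED_REFRESH_MS
            for t in frame_ready_ms]

def display_times_vrr(frame_ready_ms):
    """Variable refresh: the monitor refreshes the moment a frame
    is ready, so pacing matches the GPU exactly."""
    return list(frame_ready_ms)

# Frames finished rendering at slightly irregular times (in ms)
frames = [10, 28, 50, 65, 90]
print(display_times_fixed(frames))  # frames pushed onto the ~16.7 ms grid
print(display_times_vrr(frames))    # frames shown exactly when ready
```

The fixed-refresh list comes out unevenly spaced even though the GPU delivered frames fairly steadily, which is exactly the judder that variable refresh removes.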

It’s revolutionary, and the idea that I won’t have to break the bank on a new monitor to get it working is a sweet symphony to my ears. Now that it’s public knowledge, I fully expect Nvidia to start talking about their G-Sync plans in more detail soon.

Last Updated: February 3, 2015

48 Comments

  1. Well well well, this is interesting indeed!

  2. Blood Emperor Trevor

    February 3, 2015 at 15:11

    What’s the catch? Nvidia don’t do anything for free.

    • RinceThis

      February 3, 2015 at 15:13

      They are doing it for the children.

      • Blood Emperor Trevor

        February 3, 2015 at 15:14

        Those FUCKERS!

    • Axon1988

      February 4, 2015 at 08:23

Probably that screens older than, say, a year will not be supported, or something along those lines.

    • Wesley Fick

      February 21, 2015 at 16:20

      The catch is that G-Sync will still be superior, and the scalers that do allow variable refresh rates inside the laptops like MSI’s GT80 have only been manufactured and designed in the past year. Everyone will still need to upgrade their GPU or monitor (or both) in order to take advantage of variable refresh rates.

      Your current monitor still needs to be junked. And if you’re a competitive FPS player, G-Sync or FreeSync won’t make a lick of difference for you – having a 120Hz monitor with a strobed backlight gives you far greater gains, because variable refresh rates can and will affect your ability to actually shoot people in heavy firefights.

  3. Ghost In The Rift

    February 3, 2015 at 15:19

    Well this is neat, love it, now we wait for Team Red….patience grasshopper……..

    • Tosh MA

      February 3, 2015 at 15:20

      That’s becoming a recurring theme. “Let’s just wait for AMD.” I’ve heard that phrase more in the last week than I care for. XD

      • Ghost In The Rift

        February 3, 2015 at 15:22

LOL, makes me wonder though what they’re up to, given the dead silence…

        • Blood Emperor Trevor

          February 3, 2015 at 15:24

          • Ghost In The Rift

            February 3, 2015 at 15:31

            Good to know, thanks for sharing..:-P

          • Tosh MA

            February 3, 2015 at 15:33

            But AMD still needs a compatible monitor, graphics card AND driver.

          • Ghost In The Rift

            February 3, 2015 at 15:41

So only monitors that use DisplayPort will be compatible? On both AMD and Nvidia?

          • Tosh MA

            February 3, 2015 at 15:48

Nvidia will only require DisplayPort, but AMD still needs further tech in the monitor itself, like Nvidia currently does.

Nvidia will eliminate the need for that extra tech.

          • Johan Krüger Haglert

            February 3, 2015 at 21:26

            You sure?

            It isn’t just the case that these few notebook displays already use the Adaptive-Sync that AMD FreeSync is based on?

          • Tosh MA

            February 3, 2015 at 23:41

            All info gained from the link Trevor posted above. Read up. 🙂

          • HvR

            February 3, 2015 at 23:59

“…and a DisplayPort Adaptive-Sync-compatible display.”

This is a VESA DisplayPort standard that AMD convinced VESA to bring over from the Embedded DisplayPort standard, where it is used as a power-saving mechanism. Clever move by AMD to catch up to Nvidia after G-Sync was unveiled.

          • Wesley Fick

            February 21, 2015 at 16:23

These notebook displays use either embedded DP or LVDS, which is a much more direct connection that can enable variable refresh rates (VRR) if the scaler inside the monitor supports it. In notebooks much of the scaling is done by the GPU, but there’s still logic attached to the display that will advertise that it is capable of VRR.

G-Sync is actually a hacked form of eDP/LVDS for the monitors that Nvidia puts its scaler into and certifies, which is why it works with DP 1.2 and doesn’t use the extension in DP 1.2a.

          • HvR

            February 3, 2015 at 15:51

Yes, specifically DisplayPort 1.2a/1.3 or higher; these are the only interfaces (AFAIK) that provide the necessary hardware support to vary the monitor refresh rate at driver level.

With HDMI, DVI, VGA and older DisplayPort you will always need supplementary hardware like the G-Sync module.

          • Wesley Fick

            February 21, 2015 at 16:27

            Correct, because G-Sync uses a DP1.2 connection.

          • Johan Krüger Haglert

            February 3, 2015 at 21:26

            And how fantastic that this doesn’t need any of the monitor, graphics card or driver..

            The thing with the AMD solution is of course that no special shit is required at all.
            And there’s always been hope Nvidia would accept using it too.

          • Wesley Fick

            February 21, 2015 at 16:32

VRR still requires updated monitor scalers that will broadcast the capability to the GPU through the display’s EDID, and you need a new GPU to take advantage of the VESA-standardised form of VRR. Nvidia will support DP 1.2a+ (as it’s now referred to in the industry). That’s what’s happening here with the laptops that have G-Sync available through the leaked alpha drivers.
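[Ed: the EDID handshake described above can be illustrated with a rough sketch. The descriptor offsets and the 0xFD range-limits tag come from the standard 128-byte base EDID block; the `refresh_range` helper and the sample bytes are hypothetical, and real Adaptive-Sync detection also involves vendor extension blocks, which this ignores.]

```python
# Rough sketch: reading a display's supported vertical refresh range
# from its base EDID block. The 128-byte EDID holds four 18-byte
# descriptors at offsets 54/72/90/108; a Display Range Limits
# descriptor is tagged 0x00 0x00 0x00 0xFD, with the min/max vertical
# rate in Hz at descriptor bytes 5 and 6.

def refresh_range(edid: bytes):
    """Return (min_hz, max_hz) from the range-limits descriptor, or None."""
    if len(edid) < 128:
        return None
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]  # min/max vertical field rate in Hz
    return None

# Hypothetical EDID: the second descriptor slot advertises a 40-144 Hz
# range, the kind of wide span a variable-refresh display would expose.
edid = bytearray(128)
edid[72:76] = b"\x00\x00\x00\xFD"
edid[72 + 5] = 40
edid[72 + 6] = 144
print(refresh_range(bytes(edid)))  # -> (40, 144)
```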

          • Wesley Fick

            February 21, 2015 at 16:22

            Same boat as Nvidia then, which also needs all three currently. New scalers will take out the requirement to have a G-Sync ASIC, but G-Sync will still be far superior.

  4. Uberutang

    February 3, 2015 at 15:25

    NICE!

  5. HvR

    February 3, 2015 at 15:26

This will only ever work on eDP monitors and hardware that support DisplayPort 1.2a or higher, which provides the interface to vary the monitor refresh rate.

AMD is busy with the same thing; they demoed FreeSync last year.

    • Johan Krüger Haglert

      February 3, 2015 at 21:27

      Or rather “Nvidia drivers tweakable to unlock support for Adaptive-Sync”

      • HvR

        February 3, 2015 at 23:40

        Yes.

Or rather, Nvidia seeing the damn light and adopting the open standard all manufacturers adhere to.

  6. Ghost In The Rift

    February 3, 2015 at 15:30

    Oh look, smooth has 7 O’s, that’s weird.*runs*

    • Tosh MA

      February 3, 2015 at 15:34

      Smooth. 😛

  7. Corrie

    February 3, 2015 at 15:33

I’m still undecided on what to do. Do I go with a 970 after their blunders, or do I wait for an AMD 300X card? And with DX12 needing a new set of cards to take full advantage of it, what do I do?

    • Blood Emperor Trevor

      February 3, 2015 at 15:45

      Well unless you need one right now, wait a couple of months to see what the 300 series is about. The 970 is already DX12 compatible if I remember right.

      • Corrie

        February 3, 2015 at 16:05

Yeah, it is. It’s just that I believe there are certain DX12 hardware-specific things it can’t do, but then it’s not like DX12 will be implemented in every game from the get-go.

But I do have a few months still, so I’ll see whether I jump to the red team, though I hear their driver support is still atrocious.

        • Wesley Fick

          February 21, 2015 at 16:34

          Maxwell is fully DX12 compliant, you won’t have any issues there. The only thing that Nvidia’s introduced that isn’t on the GTX 980/970 is hardware-accelerated H.265 decoding for 4K.

    • Alessandro Barbosa

      February 3, 2015 at 15:46

      GTX 970 is a DX12 card, and is still one of the best on the market. The memory design in no way invalidates the amazing review the card got, so don’t let it put you off (unless you’re planning 4K)

      • Corrie

        February 3, 2015 at 16:07

Good point. I did always have my eye on a 970; I was just a bit put off by Nvidia’s lack of “communication”, but with the earlier article on stackable memory it actually doesn’t bother me so much. I kinda did want to do 4K with DSR, but I’m happy with 1080p; games look great at 1080 on my 40″.

      • Shorty20122012 .

        February 4, 2015 at 03:03

There are issues even at 1440p (2K). Look at PCPer.com’s new benchmarks and the frame time variance.

        • Alessandro Barbosa

          February 4, 2015 at 17:59

          “If you are an owner of a GTX 970, I totally understand the feelings of betrayal, but realistically I don’t see many people with access to a wide array of different GPU hardware changing their opinion on the product itself. NVIDIA definitely screwed up with the release of the GeForce GTX 970 – good communication is important for any relationship, including producer to consumer. However, they just might have built a product that can withstand a PR disaster like this.”

From their article 🙂 I’m in no way disputing that the memory design has an impact on higher-resolution gaming, or for that matter that it only shows at high resolutions like 4K. I am saying that the reviews that tested the card when it launched (including ours) came to the conclusion that the 970 was a phenomenal card, and the memory design doesn’t in any way invalidate that.

    • Maxiviper117

      February 4, 2015 at 10:56

You won’t need to upgrade your GPU to be compatible: “One of the surprise announcements at the show is that Nvidia will support DX12 on every Fermi, Kepler, and Maxwell-class GPU. That means nearly every GTX 400, 500, and 600 series card will be supported.” http://www.extremetech.com/gaming/178904-directx-12-detailed-backwards-compatible-with-all-recent-nvidia-gpus-will-deliver-mantle-like-capabilities

But you will need Windows 10. Still, Microsoft is offering Windows 10 free for the first year to everyone on Windows 7 and up.

      • Corrie

        February 4, 2015 at 12:23

Thanks for that; I saw a similar article, but this one is a bit clearer.

But this is what has me confused about what I should do:

        http://www.polygon.com/2015/1/22/7874793/directx-12-wont-require-a-new-graphics-card-after-all

        “To get the full support of DX12 will users need to get a new graphics card?”

        To get the “full benefits of DX12,” Ybarra replied, “the answer is yes.”

“There will be DX 11.1 cards that take advantage of a lot of the driver and software tech that we’re bringing in Windows 10, but if you want the full benefits of DX12, you’re going to need a DX12 card.”

        • Alessandro Barbosa

          February 4, 2015 at 18:46

          All the information surrounding this is a little vague at the moment. From what I gather “some” DX11/10 cards will be able to utilize some DX12 features. However, a DX12 card is required for the full suite of options and enhancements.

          • Corrie

            February 4, 2015 at 19:30

Yeah, that’s why I was confused about whether to wait for a fully DX12 card or upgrade soon. I’ve realised that, screw it, I’ll bite for a 970 (or a new 970/970 Ti if they decide to “fix the 4GB”), and if new full-DX12 cards come out I’ll look at getting one. I’ve always used Nvidia cards, and changing seems silly when I’m still happy with my MSI 560.

There’s a point to why I’m working, after all; even if it ends up being a longer-term upgrade, I’ll just save for the next build.

  8. The Top Five

    February 3, 2015 at 20:32

Another interesting article on G-Sync. If Nvidia live up to their promise, this could be a game changer!

    • Johan Krüger Haglert

      February 3, 2015 at 21:31

      Except it’s not G-SYNC now is it?

  9. z1n

    February 4, 2015 at 11:24

Sooo, this is like V-Sync except with a G? But the real question is… will it hit the V spot? [cue Horatio Caine sunglasses manoeuvre]

  10. Meh

    July 20, 2015 at 14:55

It’s just catchy corporate talk for “ooo, we found new ways to rip people off”. I imagine monitor prices will skyrocket when this takes off.

  11. Frantz Miles

    October 28, 2015 at 12:05

Update? What’s happening? Confirmed? Absolute excrement?

    • Starfals

      November 3, 2016 at 20:39

Nothing so far… and it’s even one more year on from when we first heard about it.
