The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia G-SYNC

    Discussion in 'Gaming (Software and Graphics Cards)' started by 1nstance, Oct 18, 2013.

  1. 1nstance

    1nstance Notebook Evangelist

    Reputations:
    517
    Messages:
    633
    Likes Received:
    221
    Trophy Points:
    56
  2. InspiredE1705

    InspiredE1705 Notebook Evangelist

    Reputations:
    329
    Messages:
    328
    Likes Received:
    8
    Trophy Points:
    31
    Well, I have never noticed tearing or stutter in any games I've played. The only thing I recognize is low frame rates.
     
  3. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    I would be interested if this were more open and not specific to only Nvidia GPUs, as I mostly build desktops with AMD GPUs.

    How would it work with surround/multiple displays?

    The idea sounds great, but hopefully we will see a more consumer-friendly approach instead of a proprietary one.
     
  4. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    I'm extremely excited about this tech. It's about time someone did something about the tearing + stuttering that happens when gaming, and this solution pretty much solves it all. @ryzeki it has to be proprietary because it's a hardware and software solution that utilizes Kepler's core. Besides, I wouldn't be surprised if AMD copies NVIDIA and makes their own version in the future. Here's an article on it: Say Goodbye to Tearing, Stuttering and Lag: NVIDIA G-SYNC Technology For Gaming Monitors - The Digital HQ
     
  5. Saucycarpdog

    Saucycarpdog Notebook Guru

    Reputations:
    0
    Messages:
    69
    Likes Received:
    7
    Trophy Points:
    16
    Nvidia is slowly trying to make full-fledged products out of their hardware... wonder what it would be like if they created a laptop by themselves? 0__o

    Another overpriced laptop! lol
     
  6. Captmario

    Captmario Notebook Consultant

    Reputations:
    50
    Messages:
    200
    Likes Received:
    9
    Trophy Points:
    31
    I am kind of excited about this technology, as Nvidia states that it will also remove the input lag created by V-Sync. Now imagine V-Sync without input lag, that should be so awesome.
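
    A rough back-of-the-envelope way to see where that lag comes from (my own sketch and numbers, not anything published by Nvidia): with V-Sync on a fixed 60 Hz panel, a finished frame has to sit and wait for the next refresh tick before it can be shown, and that is before any extra buffering the driver adds on top.

    # Back-of-the-envelope sketch (my own numbers): the wait V-Sync adds on a 60 Hz panel
    # just from the fixed refresh tick, ignoring any additional buffer-queue delay.
    refresh_ms = 1000 / 60      # ~16.7 ms between refresh ticks
    print(refresh_ms / 2)       # ~8.3 ms average wait for a finished frame
    print(refresh_ms)           # ~16.7 ms worst case (frame just missed a tick)
    # With a variable-refresh display the finished frame is scanned out right away: ~0 ms wait.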

    Although it seems like it won't be available for existing monitors, so maybe not that useful for my current notebook.
     
  7. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Thanks for mentioning this, very interesting.

    I don't ever notice tearing. I never use V-Sync except in rare cases where it's needed (Dead Space, Skyrim). I may be astonished once I see it, though. It all depends on the price hit from this technology. But more importantly, will it be implemented in laptops at all? It's yet another piece of hardware to cram into an already space-limited machine. Does it run hot and need any active cooling?

    Note this comment here from nVidia: Introducing Revolutionary NVIDIA G-SYNC Display Technology: Ultra-Smooth, Stutter-Free Gaming Is Here | GeForce

    " Hi, ASUS has announced that a modded VG248QE sold by a retailer like Newegg or Amazon will have a MSRP of $399.
    The VG248QE mod kit is currently estimated to cost $175, though we hope to get that down to $130.
    "

    The VG248QE currently has an MSRP of $279, so it's adding $120 to the cost of the monitor. Tough sell, I think.
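
    Running the numbers from that quote (using only the figures above; all of them are estimates and subject to change):

    # Price comparison using the figures quoted above (estimates, not final pricing).
    stock_vg248qe = 279            # current MSRP of the plain VG248QE
    modded_vg248qe = 399           # announced MSRP with G-Sync pre-installed
    kit_now, kit_hoped = 175, 130  # DIY mod kit: current estimate vs. Nvidia's target

    print(modded_vg248qe - stock_vg248qe)   # 120 -> premium if you buy it pre-modded
    print(stock_vg248qe + kit_now)          # 454 -> buying the monitor plus the kit today
    print(stock_vg248qe + kit_hoped)        # 409 -> if the kit really drops to $130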
     
    1nstance likes this.
  8. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Awesome to see some promising new technology regarding sync. I'm really tired of V-Sync; it has become virtually useless now.

    Let's see how this plays out though.
     
  9. be77solo

    be77solo pc's and planes

    Reputations:
    1,460
    Messages:
    2,631
    Likes Received:
    306
    Trophy Points:
    101
    Well, costs aside, and assuming I'm willing to buy the Asus monitor and the kit, will it work with a laptop? I don't see why not, but I have to ask, as I don't have any intention of building another desktop but do use a monitor or two when at my desk with my laptop. If so I'm game, as I need a good monitor to complement my machine.
     
  10. ajnindlo

    ajnindlo Notebook Deity

    Reputations:
    265
    Messages:
    1,357
    Likes Received:
    87
    Trophy Points:
    66
    I looked at some YouTube videos of it, and it did look better. It was hard to see a difference in some areas, but not too hard in others. And I understand in person the effect is even more noticeable. This seems to smooth out games that have variable frame rates as well. The reviews seem to be wow, amazing, big step for gaming, etc. So I wish I could see it in person.

    But from what I have seen, it seems worth $120, even $175.

    It doesn't seem to need cooling, and users can mod their own monitors. I haven't seen any mention of laptops. So I wonder if they will be modifiable, or not. Or if you will have to buy a new panel.

    It should work with an external display, assuming the laptop has a DisplayPort output. But the Intel GPU might get in the way. Maybe a firmware update will be needed for Intel, or maybe the Intel GPU will need a hardware fix. ???
     
  11. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    You can be certain that it won't work with current laptops. It will need a connector and power source. There's a reason it only works as an add-on for one monitor right now, because they built in the proper connectors. Most LCD's you won't want to open up anyhow, so most likely they will add an access panel to pop in the card for LCD's that are "G-Sync Ready". This just seems like a gimmick to me though. Only for the totally OCD gamer. If it's massively successful (which I doubt it will be, it will likely be another PhysX debacle) then I'm sure LCD manufacturers will come up with their own generic work around so it's not nVidia only.
     
    jaug1337 and TBoneSan like this.
  12. ajnindlo

    ajnindlo Notebook Deity

    Reputations:
    265
    Messages:
    1,357
    Likes Received:
    87
    Trophy Points:
    66
    If you do a search on Google, you find a lot of complaints about tearing and stuttering, as well as lag with V-Sync. This supposedly fixes it.

    Anandtech said after watching the demo, " I can't stress enough just how smooth the G-Sync experience was, it's a game changer." ... " We've seen a resurgence of PC gaming over the past few years, but G-Sync has the potential to take the PC gaming experience to a completely new level."

    Engadget said of the demo, " A quick demo we were shown of a V-Sync'd monitor versus one with G-Sync did what NVIDIA promised: screen tearing was eliminated and lag was imperceptible. ... It looked fantastic in person, but due to the limitations of our camera equipment, some of the improvements may not be apparent in our video."

    "It's just a better experience. Almost every single game can benefit from this," says John Carmack, founder of id software and CTO of Oculus VR.
     
  13. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,049
    Trophy Points:
    431
    I would imagine boutique gaming notebook manufacturers like Alienware, Razer, MSI, Asus, etc. will build this tech into their gaming units in the future, especially if we start seeing more high-DPI displays.
     
  14. EpicBlob

    EpicBlob Notebook Evangelist

    Reputations:
    49
    Messages:
    410
    Likes Received:
    16
    Trophy Points:
    31
    From what I was seeing in the videos, 40 FPS and 60 FPS looked exactly the same. In other words, you wouldn't have to upgrade your graphics card as soon as games start dropping under the desired 60 FPS. This is going to be huge. Just hoping that monitors compatible with G-Sync aren't insanely expensive.
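
    One way to see why 40 FPS can look that smooth (a simplified sketch of how variable refresh is described, not measured data): on a fixed 60 Hz panel a 40 FPS game has to hold frames for alternating one and two refresh intervals, which reads as judder, while a variable-refresh panel can hold every frame for exactly 25 ms.

    # Simplified frame-pacing sketch (illustration only, not measured data):
    # how long each 40 FPS frame stays on screen on a fixed 60 Hz panel vs. variable refresh.
    from fractions import Fraction
    import math

    refresh = Fraction(1, 60)   # 60 Hz panel: one refresh every ~16.7 ms
    frame = Fraction(1, 40)     # 40 FPS game: one new frame every 25 ms

    # Fixed refresh + V-Sync: a new frame can only be swapped in on a refresh tick,
    # so each frame is held for a whole number of refresh intervals.
    def first_tick(i):          # refresh tick on which frame i first appears
        return math.ceil(i * frame / refresh)

    hold_ms = [float((first_tick(i + 1) - first_tick(i)) * refresh * 1000) for i in range(4)]
    print([round(t, 1) for t in hold_ms])    # [33.3, 16.7, 33.3, 16.7] -> uneven cadence (judder)

    # Variable refresh (G-Sync): the panel refreshes when the frame is ready,
    # so every frame is held for exactly one frame time.
    print([float(frame * 1000)] * 4)         # [25.0, 25.0, 25.0, 25.0] -> even cadence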
     
  15. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I'm of the entirely opposite opinion. G-Sync is nothing like the gimmick that is PhysX. It has far-reaching implications for the entire gaming industry. Not just for stutter- and artifact-free gaming on traditional displays but also for the issues of low persistence and input latency in the VR applications that are taking off. Tearing might be fine when you're sitting a few feet away from your desktop or laptop screen, but getting a drop or tear with an Oculus Rift inches from your eyes is like getting a swift kick in the balls and totally ruins the immersion. Games have always been moving toward the ultimate goal of becoming indistinguishable from reality and this moves us one step closer to that. Those three luminaries on-stage (Sweeney, Andersson, and Carmack) wouldn't have made such a huge deal about it if it weren't special. Well sure, it was an Nvidia event and they probably weren't going to blatantly bad-mouth it, but Carmack has never been afraid to speak his mind on anything, and the journalists on hand to see the live demonstrations of G-Sync all raved about it. I think if you were there to see it with them you might feel differently about it.

    I'm really excited about G-Sync because it's one of the first truly revolutionary things we've seen in quite some time and it addresses one of the most fundamental flaws of display technology to this point: the fixed refresh rate. There are so many problems this fixes, and I have no doubt this sort of technology will be widely implemented in the near future. You can bet that if G-Sync is proprietary and not licensable then AMD and Intel will have their own solutions. That's how important this is.
     
    ajnindlo likes this.
  16. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    I think G-Sync will be dedicated-mode only (won't work with Optimus); other than that it just needs a modern Nvidia GPU and a standard DisplayPort connector.
     
  17. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Would this even work on laptops? Do you think Clevo/Alienware will have modified G-Sync panels?

    And this seems to be OEM-independent; it doesn't seem to matter if you have an Nvidia or AMD graphics card, right?

    And how much will it be to buy a panel with the G-Sync modification?
     
  18. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    I see a big problem, primarily because it's for nVidia GPUs only, and as Meaker mentioned, likely not Optimus supported.

    Second, if it's an issue for Oculus Rift, then just incorporate it into the Oculus Rift, not add $100-$150 to a monitor. That's like a 50% price hike from your decent gaming 24"-27" monitors.

    I'd rather see an independent company develop something that works universally. There's no need for it to be nVidia only.

    "nVidia" G-Sync. What do you think? Read the article. It's proprietary to nVidia which stinks.
     
  19. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I don't see why it wouldn't work on a G-Sync equipped external monitor connected to a Kepler-based laptop. Then again, Nvidia might just impose an artificial limitation and not allow G-Sync to work on laptops, period. Here are the current system requirements though:

    NVIDIA G-Sync System Requirements
    Video Card: GeForce GTX 650 Ti Boost or higher
    Display: G-Sync equipped display
    Driver: R331.58 or higher
    Operating System: Windows 7/8/8.1

    I'm not saying G-Sync or a similar technology won't show up in the Oculus Rift in the future, it most likely will. But if you've been following the development of the Rift you'd know that they have much more pressing issues to deal with right now. Specifically resolution and latency, which affect motion sickness. Plus it's pretty hard to fit this in a head-mounted display. This variable refresh rate technology stuff is more of an afterthought at this point and might show up in a second- or third-gen Rift device, for instance.

    Right now G-Sync is only showing up on a few popular desktop monitors because it's super-new and Nvidia just wants to nudge it out the door first. Get it into the hands of the public and see how well it's received, testing the waters so to speak, before expanding it.

    Your point about an independent company developing this, while ideal, is completely unrealistic. Given that this has been an issue dogging displays for decades and the concept sounds simple enough--instead of syncing the GPU’s output to the monitor’s refresh rate, sync the monitor’s refresh rate to the GPU’s output--I'm sure many others have already thought of the same thing and/or attempted to put it into practice. But this is one of those things that's easy to conceive but extremely difficult to execute. Nobody except a GPU maker, especially a powerhouse like Nvidia, would have the hardware access and R&D muscle to bring something like this to fruition.
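
    To make that inversion concrete, here's a toy simulation of the two models in Python (my own simplification for illustration; the render times are made up and none of this is Nvidia's actual implementation):

    # Toy simulation of the two models (illustration only, not Nvidia's code).
    # Given varying per-frame render times, record when each frame reaches the screen.
    import math

    render_times_ms = [20, 30, 18, 35, 22]     # hypothetical per-frame GPU render times

    def fixed_refresh(render_times, hz=60):
        """Traditional panel: the monitor ticks on its own clock, so with V-Sync a
        finished frame waits for the next tick before it can be scanned out."""
        tick = 1000 / hz
        t, shown = 0.0, []
        for r in render_times:
            t += r                                    # frame finishes rendering at time t
            shown.append(math.ceil(t / tick) * tick)  # ...but only appears on the next tick
        return shown

    def variable_refresh(render_times):
        """G-Sync-style panel: the GPU drives the refresh, so a frame is scanned out
        the moment it is finished (this sketch ignores the panel's minimum refresh limit)."""
        t, shown = 0.0, []
        for r in render_times:
            t += r
            shown.append(t)                           # appears immediately
        return shown

    print([round(x, 1) for x in fixed_refresh(render_times_ms)])     # uneven on-screen gaps
    print([round(x, 1) for x in variable_refresh(render_times_ms)])  # gaps match the render times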

    And by the same token, AMD and Intel could develop their own variable refresh rate technologies, and I fully expect them to given how much of a game-changer this is, if G-Sync does end up being proprietary. Nvidia hasn't precluded G-Sync being licensable, but I'm pretty sure it won't be. The implications of this technology are too far-reaching for the other GPU makers to stand idly by; otherwise it would be too much of an ace up Nvidia's sleeve.
     
    5150Joker likes this.
  20. ajnindlo

    ajnindlo Notebook Deity

    Reputations:
    265
    Messages:
    1,357
    Likes Received:
    87
    Trophy Points:
    66
    If you don't want to pay extra, then don't get it. Right now it is a drop-in card, so you can buy a monitor with the slot and just not buy the card.

    Yes, it stinks that every single company tries to create proprietary products. AMD has Mantle, Nvidia has PhysX, etc. Why can't they get together and sing kumbaya? That is not the way the world works, which is good. If it were, then they would have no reason to come out with new products.

    But just like Blu-ray and HD DVD worked themselves out, this will too. You might think, well, that is different. Then remember SLI: when it came out it was exclusive. Later CrossFire came out. So things evened up.

    The long run is this will be good for all gamers, even those with AMD.
     
  21. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Hm yeah, the nVIDIA forums suggest something else.

    It's been hinted several times that we are in for a surprise. Starting to get eager lol, btw cba on a source, go Google :D
     
  22. ajnindlo

    ajnindlo Notebook Deity

    Reputations:
    265
    Messages:
    1,357
    Likes Received:
    87
    Trophy Points:
    66
    What do you mean a surprise? Are you saying they have another major announcement regarding g-sync? Or is it unrelated to g-sync, or what?
     
  23. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    I think it's related to G-SYNC since they've kept it under the same topic.

    I don't know though :p
     
  24. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Anybody else feel that G-Sync is an indirect response to Mantle? One implication of Mantle is "oh you won't have to spend a bajillion dollars on the latest-and-greatest GPU since our low-level API will provide a big performance boost," while G-Sync is "oh you won't have to spend a bajillion dollars on the latest-and-greatest GPU to chase that perfect 60+ FPS mark since even 30 FPS looks buttery-smooth with our variable refresh rate technology."
     
    jaug1337 and TBoneSan like this.
  25. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    I must admit, that thought did come to mind...
     
  26. ajnindlo

    ajnindlo Notebook Deity

    Reputations:
    265
    Messages:
    1,357
    Likes Received:
    87
    Trophy Points:
    66
    It could be, but more of a "they made an announcement, so we need to announce something" situation. Maybe this was a bit premature, which is why there will be an additional announcement on it, or so the rumor goes.
     
  27. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Starting with the GeForce 331.58 WHQL driver, G-Sync is available as one of the options under V-Sync in the Nvidia Control Panel, for Kepler GPUs of course. Seems to support the notion that G-Sync will work for laptops as long as a G-Sync equipped external monitor is attached.
     
  28. ajnindlo

    ajnindlo Notebook Deity

    Reputations:
    265
    Messages:
    1,357
    Likes Received:
    87
    Trophy Points:
    66
    Nice. Maybe soon we will see reports from the field. It would be nice to see some real reviews.
     
  29. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Not for a while. G-Sync equipped monitors and mod kits come out Q1 2014.
     
  30. ajnindlo

    ajnindlo Notebook Deity

    Reputations:
    265
    Messages:
    1,357
    Likes Received:
    87
    Trophy Points:
    66
    Hopefully some reviewers will get to see some pre-release hardware. So far all we have is a controlled demo session run by Nvidia.
     
  31. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
  32. ajnindlo

    ajnindlo Notebook Deity

    Reputations:
    265
    Messages:
    1,357
    Likes Received:
    87
    Trophy Points:
    66
    That doesn't look quite right for a G-Sync patent. They basically are looking for the differences between the frames and using that to control the refresh rate. Instead, as I understand it, when G-Sync has a frame done, it sends it to the display and tells it to refresh. I.e., G-Sync isn't looking for differences, as far as I know.
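
    In rough terms, the contrast between the two approaches as I read them (just an illustrative sketch, not based on the actual patent text or on Nvidia's implementation):

    # Rough contrast of the two approaches as described above (illustration only;
    # the mapping from "amount changed" to a refresh rate is entirely made up).
    def patent_style_refresh_hz(prev_frame, new_frame, base_hz=60):
        """Patent as read above: compare successive frames and pick a refresh rate
        based on how much of the image changed."""
        changed = sum(1 for a, b in zip(prev_frame, new_frame) if a != b)
        change_ratio = changed / max(len(new_frame), 1)
        return base_hz * (0.5 + 0.5 * change_ratio)   # hypothetical mapping

    def gsync_style_refresh(frame_is_done):
        """G-Sync as described: no content analysis at all; the GPU simply tells
        the panel to refresh the moment a frame is finished."""
        return "refresh now" if frame_is_done else "hold the last frame"

    print(patent_style_refresh_hz([0, 0, 1, 1], [0, 1, 1, 0]))   # rate scales with how much changed
    print(gsync_style_refresh(True))                             # 'refresh now'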

    Maybe this patent work led them to discovering G-Sync, but I don't think it would protect G-Sync. I would assume Nvidia has, or soon will have, a patent for G-Sync. And I would assume they would license it to AMD. They don't want AMD to figure out another way of doing the same thing, so it's best to license it so AMD isn't so strongly motivated to do so. This assumes that G-Sync is as good as it has been hyped.
     
  33. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    My question is, why does this need to be in the monitor and not designed into the card itself? I guess it's something that we have to "see to believe". This patent was filed in 2007, but only issued in 2012. Great patent system. So we likely could have had this five years ago if the patent system wasn't so slow and/or nVidia had decided to make it open tech.

    I'm just surprised this wasn't ever done before, and it seems to have been an issue ever since the switch from CRT to LCD. The next thing they need to work on is making images below native resolution look good when stretched to the screen. That was never an issue with CRTs either. But I think the only real way to do that is to increase the pixel density so high that lower resolutions don't show any "jaggies" or offset pixels. That or change graphic design from pixels to vectors. But that would require all new hardware... but why not? Force users to invest in the latest technology.
     
  34. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
  35. ajnindlo

    ajnindlo Notebook Deity

    Reputations:
    265
    Messages:
    1,357
    Likes Received:
    87
    Trophy Points:
    66
    That was a good article; it has background and talks about how G-Sync applies. It also told me one reason it took so long: DisplayPort is the only interface that, up to now, supports this.

    Yes, vector graphics would be bad. Now if we had ray traced graphics...

    It is good to finally see some real advancement come out. This article does say they will license it...
     
  36. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    It will take a long time, but I think it's a foregone conclusion that ray tracing will ultimately win out in the end.
     
  37. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Obviously G-Sync needs to be seen to believed, but here's a slow-mo video shot at 120 FPS which pretty clearly demonstrates its benefits. G-Sync monitor is on the right.



    First half with the swinging pendulum compares no V-Sync to G-Sync. You can clearly see the screen tearing regardless of FPS on the left.

    Second half with Tomb Raider (2013) compares V-Sync to G-Sync. The difference is subtle for the most part, but the screen judder and vertical shake on the V-Sync monitor from 1:05-1:10 is very clear. What's not shown in this comparison is the massive input lag introduced by V-Sync which G-Sync doesn't have.
     
    Last edited by a moderator: May 12, 2015
  38. lightbulbfury

    lightbulbfury Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    5
    Is G-Sync dead? I've been thinking about buying an Asus ROG PG278Q, but I'm wondering whether I should wait for FreeSync. The monitor is rated highly, but I'm just not sure.
     
  39. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    I think so. It's been hyped up and, apparently, way over-hyped, since it hasn't seen the light of day as far as I'm aware.
     
  40. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Nvidia makes good products but its marketing is horrendous. Just look at Shield for example, or Titan Z. They remind me of Nintendo in that respect. :D
     
  41. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    These crooks want $200 for a DIY kit. :eek: :mad:

    Holy sh!t my Asus monitor only cost $50 more than that.
     
  42. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Early adopters always get burned, no matter how great the tech. It'll come down for sure once there's some competition (FreeSync).
     
    TBoneSan likes this.