Introducing Revolutionary NVIDIA G-SYNC Display Technology: Ultra-Smooth, Stutter-Free Gaming Is Here | GeForce
Holy smokes this looks amazing.
-
InspiredE1705 Notebook Evangelist
Well, I have never noticed tearing or stutter in any games I've played. The only thing I recognize is low frame rates.
-
I would be interested if this were more open and not specific to Nvidia GPUs only, as I mostly build desktops with AMD GPUs.
How would it work with surround/multiple displays?
The idea sounds great, but hopefully we shall see a more consumer-friendly approach instead of a proprietary one. -
I'm extremely excited about this tech. It's about time someone did something about the tearing + stuttering that happens when gaming, and this solution pretty much solves it all. @ryzeki it has to be proprietary because it's a hardware and software solution that utilizes Kepler's core. Besides, I wouldn't be surprised if AMD copies NVIDIA and makes their own version in the future. Here's an article on it: Say Goodbye to Tearing, Stuttering and Lag: NVIDIA G-SYNC Technology For Gaming Monitors - The Digital HQ
-
Nvidia is slowly trying to make full-fledged products out of their hardware... wonder what it would be like if they created a laptop by themselves? 0__o
Another overpriced laptop! lol -
I am kind of excited about this technology, as Nvidia states it will also remove the input lag created by V-Sync. Now imagine V-Sync without input lag; that would be so awesome.
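To put a rough number on that V-Sync lag, here's a quick back-of-the-envelope Python sketch (my own toy model, not Nvidia's figures): a finished frame has to sit and wait for the next refresh tick before it can be shown.

```python
# Toy model (assumed numbers): extra latency V-Sync adds on a 60 Hz
# panel, because a finished frame can only be shown at the next
# refresh tick rather than immediately.
REFRESH_MS = 1000 / 60  # ~16.7 ms between refresh ticks at 60 Hz

def vsync_extra_wait(finish_offset_ms):
    """Extra wait for a frame that finishes rendering this many ms
    after the last refresh tick (0 if it lands exactly on a tick)."""
    return (REFRESH_MS - finish_offset_ms % REFRESH_MS) % REFRESH_MS

print(round(vsync_extra_wait(1.0), 1))   # just missed a tick -> 15.7 ms
print(round(vsync_extra_wait(16.0), 1))  # just made the next tick -> 0.7 ms
```

With a variable refresh rate the display refreshes the moment the frame is done, so this wait drops to roughly zero.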
Although it seems like it won't be available for existing monitors, so maybe not that useful for my current notebook -
Thanks for mentioning this, very interesting.
I don't ever notice tearing. I never use V-Sync except in the rare cases where it's needed (Dead Space, Skyrim). I may be astonished if I try this, though. It all depends on the price hit from this technology. But more importantly, will it be implemented in laptops at all? It's yet another piece of hardware to cram into an already space-limited chassis. Does it run hot and need any active cooling?
Note this comment here from nVidia: Introducing Revolutionary NVIDIA G-SYNC Display Technology: Ultra-Smooth, Stutter-Free Gaming Is Here | GeForce
" Hi, ASUS has announced that a modded VG248QE sold by a retailer like Newegg or Amazon will have a MSRP of $399.
The VG248QE mod kit is currently estimated to cost $175, though we hope to get that down to $130."
The VG248QE currently has an MSRP of $279, so this adds $120 to the cost of the monitor. Tough sell, I think. -
Awesome to see some aspiring new technology regarding sync. Really tired of V-Sync; it has become virtually useless by now.
Lets see how this plays out though. -
Well, costs aside, assuming I'm willing to buy the Asus monitor and the kit, will it work with a laptop? I don't see why not, but I have to ask, as I don't intend to build another desktop but do use a monitor or two when at my desk with my laptop. If so I'm game, as I need a good monitor to complement my machine.
-
I looked at some YouTube videos of it, and it did look better. It was hard to see a difference in some areas, but not too hard in others. And I understand the effect is even more noticeable in person. This seems to smooth out games that have variable frame rates as well. The reviews seem to be wow, amazing, big step for gaming, etc. So I wish I could see it in person.
But from what I have seen, it seems worth $120, even $175.
It doesn't seem to need cooling, and users can mod their own monitors. I haven't seen any mention of laptops. So I wonder if they will be modifiable, or not. Or if you will have to buy a new panel.
It should work as an external display, assuming the laptop has a DisplayPort. But the Intel GPU might get in the way. Maybe a firmware update will be needed for Intel, or maybe the Intel side will need a hardware fix. ??? -
You can be certain that it won't work with current laptops. It will need a connector and power source. There's a reason it only works as an add-on for one monitor right now: they built in the proper connectors. You won't want to open up most LCDs anyhow, so most likely they will add an access panel to pop in the card for LCDs that are "G-Sync Ready". This just seems like a gimmick to me, though, only for the totally OCD gamer. If it's massively successful (which I doubt; it will likely be another PhysX debacle), then I'm sure LCD manufacturers will come up with their own generic workaround so it's not Nvidia-only.
-
If you do a Google search, you'll find a lot of complaints about tearing and stuttering, as well as lag with V-Sync. This supposedly fixes all of it.
Anandtech said after watching the demo, " I can't stress enough just how smooth the G-Sync experience was, it's a game changer." ... " We've seen a resurgence of PC gaming over the past few years, but G-Sync has the potential to take the PC gaming experience to a completely new level."
Engadget said of the demo, " A quick demo we were shown of a V-Sync'd monitor versus one with G-Sync did what NVIDIA promised: screen tearing was eliminated and lag was imperceptible. ... It looked fantastic in person, but due to the limitations of our camera equipment, some of the improvements may not be apparent in our video."
"It's just a better experience. Almost every single game can benefit from this," says John Carmack, founder of id Software and CTO of Oculus VR. -
I would imagine boutique gaming notebook manufacturers like Alienware, Razer, MSI, Asus, etc. will build this tech into their gaming units in the future, especially if we start seeing more high-DPI displays.
-
From what I was seeing in the videos, 40 FPS and 60 FPS looked exactly the same. In other words, you wouldn't have to upgrade your graphics card as soon as games start dipping under the desired 60 FPS. This is going to be huge. Just hoping that monitors compatible with G-Sync aren't insanely expensive.
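To illustrate why 40 FPS can look as smooth as 60 FPS with G-Sync, here's a small Python simulation (my own toy model with assumed numbers) of what happens to a steady 40 FPS stream on a fixed 60 Hz display with V-Sync:

```python
# Toy simulation: frames finish every 25 ms at 40 FPS, but a 60 Hz
# panel with V-Sync can only flip on ~16.7 ms refresh boundaries,
# so each frame waits for the next refresh tick before being shown.
import math

REFRESH_MS = 1000 / 60  # 60 Hz refresh interval
FRAME_MS = 1000 / 40    # steady 40 FPS render time

def onscreen_times(n_frames):
    """How long each frame stays on screen under V-Sync at 60 Hz."""
    times, prev_tick = [], 0.0
    for i in range(1, n_frames + 1):
        done = i * FRAME_MS                               # frame ready
        tick = math.ceil(done / REFRESH_MS) * REFRESH_MS  # next refresh
        times.append(tick - prev_tick)
        prev_tick = tick
    return times

print([round(t, 1) for t in onscreen_times(6)])
# -> [33.3, 16.7, 33.3, 16.7, 33.3, 16.7]
```

On-screen time alternates between one and two refresh intervals, which is the judder you see; with a variable refresh rate the panel would simply show every frame for a uniform 25 ms.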
-
I'm really excited about G-Sync because it's one of the first truly revolutionary things we've seen in quite some time, and it addresses one of the most fundamental flaws of display technology to this point: the fixed refresh rate. There are so many problems this fixes, and I have no doubt this sort of technology will be widely implemented in the near future. You can bet that if G-Sync is proprietary and not licensable, then AMD and Intel will have their own solutions. That's how important this is. -
Meaker@Sager Company Representative
I think G-Sync will be a dedicated-only mode (it won't work with Optimus); other than that, it just needs a modern Nvidia GPU and a standard DisplayPort connector.
-
Would this even work on laptops? You think Clevo/Alienware will have modified G-sync panels?
And this seems to be OEM-independent; it doesn't seem to matter whether you have an Nvidia or AMD graphics card, right?
And how much will it be to buy a panel with G-sync modification? -
Second, if it's an issue for the Oculus Rift, then just incorporate it into the Oculus Rift instead of adding $100-$150 to a monitor. That's like a 50% price hike on your decent gaming 24"-27" monitors.
I'd rather see an independent company develop something that works universally. There's no need for it to be nVidia only.
-
NVIDIA G-Sync System Requirements
Video card: GeForce GTX 650 Ti Boost or higher
Display: G-Sync-equipped display
Driver: R331.58 or higher
Operating system: Windows 7/8/8.1
I'm not saying G-Sync or a similar technology won't show up in the Oculus Rift in the future; it most likely will. But if you've been following the development of the Rift, you'd know they have much more pressing issues to deal with right now, specifically resolution and latency, which affect motion sickness. Plus it's pretty hard to fit this into a head-mounted display. This variable refresh rate stuff is more of an afterthought at this point and might show up in a second- or third-gen Rift device, for instance.
Right now G-Sync is only showing up on a few popular desktop monitors because it's super-new and Nvidia just wants to nudge it out the door first. Get it into the hands of the public and see how well it's received, testing the waters so to speak, before expanding it.
Your point about an independent company developing this, while ideal, is completely unrealistic. Given that this has been an issue dogging displays for decades and the concept sounds simple enough--instead of syncing the GPU’s output to the monitor’s refresh rate, sync the monitor’s refresh rate to the GPU’s output--I'm sure many others have already thought of the same thing and/or attempted to put it into practice. But this is one of those things that's easy to conceive but extremely difficult to execute. Nobody except a GPU maker, especially a powerhouse like Nvidia, would have the hardware access and R&D muscle to bring something like this to fruition.
And by the same token, AMD and Intel could develop their own variable refresh rate technologies, and I fully expect them to, given how much of a game-changer this is, if G-Sync does end up being proprietary. Nvidia hasn't precluded G-Sync being licensable, but I'm pretty sure it won't be. The implications of this technology are too far-reaching for the other GPU makers to stand idly by; otherwise it would be too much of an ace up Nvidia's sleeve. -
Yes, it stinks that every single company tries to create proprietary products. AMD has Mantle, Nvidia has PhysX, etc. Why can't they get together and sing kumbaya? Because that is not the way the world works, which is good. If it were, they would have no reason to come out with new products.
But just like Blu-ray and HD-DVD worked itself out, this will too. You might think, well, that is different. Then remember SLI: when it came out it was exclusive. Later CrossFire came out, so things evened up.
In the long run this will be good for all gamers, even those with AMD. -
Hm yeah, nVIDIA forums suggest something else.
It's been hinted several times that we are in for a surprise. Starting to get eager lol. Btw, can't be bothered to dig up a source, go Google it -
What do you mean a surprise? Are you saying they have another major announcement regarding g-sync? Or is it unrelated to g-sync, or what?
-
I don't know though -
Anybody else feel that G-Sync is an indirect response to Mantle? One implication of Mantle is "oh you won't have to spend a bajillion dollars on the latest-and-greatest GPU since our low-level API will provide a big performance boost," while G-Sync is "oh you won't have to spend a bajillion dollars on the latest-and-greatest GPU to chase that perfect 60+ FPS mark since even 30 FPS looks buttery-smooth with our variable refresh rate technology."
-
I must admit, that thought did come to mind...
-
It could be, but more of a "they made an announcement, we need to announce something." Maybe this was a bit premature, which is why there will be an additional announcement on it, or so the rumor goes.
-
Starting with the GeForce 331.58 WHQL driver, G-Sync is available as one of the options under V-Sync in the Nvidia Control Panel, for Kepler GPUs of course. That seems to support the notion that G-Sync will work for laptops as long as a G-Sync-equipped external monitor is attached.
-
Nice. Maybe soon we will see reports from the field. It would be nice to see some real reviews.
-
Hopefully some reviewers will get some pre-release hardware. So far all we have had is a controlled demo session run by Nvidia.
-
AMD users may be out of luck if it turns out this tech is patented by NVIDIA: NVIDIA G-SYNC Patent? United States Patent: 8120621 - The Digital HQ I don't see NVIDIA licensing it to AMD anytime soon--they'd much rather have people buy NV cards and a g-sync monitor to increase revenue + market share.
-
That doesn't look quite right for a G-Sync patent. That patent basically looks for the differences between frames and uses them to control the refresh rate. G-Sync instead, as far as I know, sends a frame to the display as soon as it's done and tells the display to refresh; it isn't looking for differences.
Maybe this patent work led them to discovering G-Sync, but I don't think it would protect G-Sync. I would assume Nvidia has, or soon will have, a patent for G-Sync. And I would assume they would license it to AMD. They don't want AMD to figure out another way of doing the same thing, so it's best to license it so AMD isn't so strongly motivated to do so. This all assumes that G-Sync is as good as it has been hyped. -
My question is, why does this need to be in the monitor and not designed into the card itself? I guess it's something we have to "see to believe". This patent was filed in 2007 but only issued in 2012. Great patent system. So we could likely have had this five years ago if the patent system weren't so slow and/or Nvidia had decided to make it open tech. I'm just surprised this was never done before; it seems to have been an issue ever since the switch from CRT to LCD. The next thing they need to work on is making images below native resolution look good when stretched to the screen. That was never an issue with CRTs either. But I think the only real way to do that is to increase the pixel density so high that lower resolutions don't show any "jaggies" or offset pixels. That, or change graphics rendering from pixels to vectors. But that would require all-new hardware... but why not? Force users to invest in the latest technology.
-
I think it would be helpful to read up a little more on display technology:
NVIDIA G-Sync: Death of the Refresh Rate | PC Perspective
And your point about vector vs. raster images is really a terrible idea, unless you're fine with playing Flash games for the rest of your life. -
That was a good article; it gives background and talks about how G-Sync applies. It also told me one reason this took so long: DisplayPort is the only interface that, up to now, supports it.
Yes, vector graphics would be bad. Now if we had ray traced graphics...
It is good to finally see some real advancement come out. This article does say they will license it... -
It will take a long time, but I think it's a foregone conclusion that ray tracing will ultimately win out in the end.
-
Obviously G-Sync needs to be seen to be believed, but here's a slow-mo video shot at 120 FPS which pretty clearly demonstrates its benefits. The G-Sync monitor is on the right.
First half with the swinging pendulum compares no V-Sync to G-Sync. You can clearly see the screen tearing regardless of FPS on the left.
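To see why that tear shows up without V-Sync, here's a toy Python model (my own, not from the video): the panel scans out top to bottom over one refresh interval, and if the buffer swap lands mid-scan, everything below the current scanline comes from the newer frame.

```python
# Toy model: where the tear line appears when an unsynced buffer
# swap happens mid-scanout on a fixed 60 Hz display.
REFRESH_MS = 1000 / 60  # one 60 Hz scan-out takes ~16.7 ms
LINES = 1080            # assumed vertical resolution of the panel

def tear_scanline(swap_time_ms):
    """Scanline being drawn at the moment the buffer swap happens."""
    phase = (swap_time_ms % REFRESH_MS) / REFRESH_MS  # fraction scanned
    return int(phase * LINES)

print(tear_scanline(8.3))  # swap mid-refresh -> tear near mid-screen
```

With G-Sync the swap and the refresh are the same event, so there is no mid-scan swap and no tear line.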
Second half with Tomb Raider (2013) compares V-Sync to G-Sync. The difference is subtle for the most part, but the screen judder and vertical shake on the V-Sync monitor from 1:05-1:10 are very clear. What's not shown in this comparison is the massive input lag introduced by V-Sync, which G-Sync doesn't have. -
Is G-Sync dead? I've been thinking about buying an Asus ROG PG278Q, but I'm wondering whether I should wait for FreeSync. The monitor's rated highly, but I'm just not sure.
-
These crooks want $200 for a DIY kit.
Holy sh!t my Asus monitor only cost $50 more than that. -
Nvidia G-SYNC
Discussion in 'Gaming (Software and Graphics Cards)' started by 1nstance, Oct 18, 2013.