Oh man, FreeSync just came out and it's a doozy. Not only are FreeSync monitors undercutting equivalently or worse-specced G-Sync monitors by hundreds of dollars, the FreeSync technology itself has some key advantages over G-Sync.
G-Sync carries a small 3-5% performance penalty due to VBLANK polling. FreeSync does not.
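Just to make the mechanism concrete, here's a minimal C sketch of what that per-frame poll amounts to. Every function name here is made up; the only thing taken from the reports is that the driver has to query the display's status each frame before it can flip:

```c
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical stand-ins for driver/display calls. */
static bool module_ready_for_flip(void) { return true; }            /* aux-channel status query */
static void flip_frame(const void *fb)  { (void)fb; puts("flip"); }

/* G-Sync (as reported): the driver queries the module every frame to check
 * that the previous scan-out has finished before it flips. That extra
 * per-frame round trip is where the 3-5% penalty comes from. */
static void present_gsync(const void *fb)
{
    while (!module_ready_for_flip())
        ;                       /* usually passes immediately, but the query itself costs time */
    flip_frame(fb);
}

/* FreeSync/Adaptive-Sync: the display times its refresh to the flip, so the
 * driver presents with no status poll at all. */
static void present_freesync(const void *fb)
{
    flip_frame(fb);
}

int main(void)
{
    int frame = 0;
    present_gsync(&frame);
    present_freesync(&frame);
    return 0;
}
```

In the common case the query passes straight away, but it's still an extra round trip every single frame, which is where the few-percent hit comes from.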
Furthermore, and this is the key one for me, G-Sync forces V-Sync on whenever the frame rate falls outside the range in which G-Sync is effective (30-144Hz). Not only does FreeSync have a wider dynamic refresh range of 9-240Hz (although the first shipping FreeSync monitors are somewhere around 40-144Hz), it also gives you the choice of whether to enable or disable V-Sync outside that range.
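In pseudo-driver terms, the out-of-range difference boils down to something like the sketch below. The ranges come from the numbers above; the struct and function names are invented purely for illustration:

```c
#include <stdbool.h>
#include <stdio.h>

typedef struct {
    float vrr_min_hz;   /* ~30 for current G-Sync panels, ~40 for the first FreeSync ones */
    float vrr_max_hz;   /* e.g. 144 */
    bool  user_vsync;   /* FreeSync exposes this choice to the user; G-Sync does not */
} sync_caps;

typedef enum { MODE_VRR, MODE_VSYNC_ON, MODE_VSYNC_OFF } present_mode;

/* Inside the window both behave the same; outside it, G-Sync decides for you. */
static present_mode pick_mode_gsync(float fps, const sync_caps *c)
{
    if (fps >= c->vrr_min_hz && fps <= c->vrr_max_hz)
        return MODE_VRR;
    return MODE_VSYNC_ON;                                     /* V-Sync forced on */
}

static present_mode pick_mode_freesync(float fps, const sync_caps *c)
{
    if (fps >= c->vrr_min_hz && fps <= c->vrr_max_hz)
        return MODE_VRR;
    return c->user_vsync ? MODE_VSYNC_ON : MODE_VSYNC_OFF;    /* your call: latency vs. tearing */
}

int main(void)
{
    sync_caps panel = { 40.0f, 144.0f, false };               /* first-gen FreeSync panel, V-Sync off */
    printf("25 fps: G-Sync=%d FreeSync=%d\n",
           pick_mode_gsync(25.0f, &panel), pick_mode_freesync(25.0f, &panel));
    return 0;
}
```

So when you dip under the window (or blow past the top of it), G-Sync quietly picks V-Sync for you, while FreeSync falls back to whichever V-Sync setting you chose.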
To give you an example of how large the cost disparity between FreeSync and G-Sync is right now, get this. The Acer XG270HU 27" 1440p 144Hz IPS FreeSync monitor only costs $499 while the G-Sync equivalent, the XB270HU, costs a whopping $799.
And because FreeSync is based on a free and open standard, Adaptive-Sync, which has been part of the Embedded DisplayPort spec for years, it could potentially find its way into notebooks sooner than G-Sync.
Just make sure you take the PCPer review with a grain of salt, because they focused a lot more on what Freesync DIDN'T do outside the VRR window, as opposed to what it DOES when inside said window, and how it compares to GSync overall. The entire review reeks of bias, and that Allyn guy is clearly on nVidia's payroll.
And apparently they think Freesync causes ghosting.
Then again, it's PCPer, so why should I even be surprised?
#GimmeFreeSyncPlsAndTy
Anyway, I still enjoy their podcasts, mostly for Josh and Ryan's input - it tends to be more objective.
Back on topic, the stars might be aligning for AMD. FreeSync and an Nvidia-spanking GPU are just what the industry needs. Some indisputable competition.
8GB 390X Lightning + waterblock + Freesync monitor
My wallet is reeling in fear already
But for once I'm glad I didn't impulse buy the ROG Swift. Being locked into nVidia's ecosystem is the last thing I want.
I remember Allyn Malventano calling BS and giving Gamenab a hard time on OCN when the mobile G-Sync news first broke. Of course, once it was proven to actually work, PCPer published an article on it. Jumping to conclusions before proper investigation and fact-checking: what kind of journalism is that?
But I digress. 390X + FreeSync (especially that Acer monitor) looks like a winning combo. Cheaper and better than Nvidia. Exciting.
Well, to be fair, if you take away the G-Sync module, which sold for $200 or $250 if memory serves, the ROG Swift comes down to $550-600, so on par with the BenQ FreeSync. Although the BenQ is IPS... oops, misread, it's actually TN. But there is a cheaper Acer option available.
Actually, what prevented me from pulling the trigger on the ROG Swift was the QC issues. The impression I got was that you basically had to play the panel lottery to get a perfect panel. That was utterly unacceptable to me, because for $800, I expect nothing less than perfection.
That's true. But that massive BOM cost increase from the G-Sync FPGA and the licensing fee added on top of it is one of G-Sync's major drawbacks.
Besides the massive price difference, what really strikes me about the FreeSync launch compared to the G-Sync launch (*cough* more like delayed rollout over many months) last year is the sheer number of quality monitors with seemingly immediate availability. G-Sync monitors took forever to become available for sale and for the longest time all you could get was 1080p TN gaming junk. Now with FreeSync we have 1440p, 144Hz, IPS, 21:9 ultrawide right out of the gate. Of course, it helps that FreeSync's open nature encourages monitor vendor adoption and G-Sync laid the groundwork for proving the viability of variable refresh rate technology, but still.
I may just jump over to AMD this time around. Would be nice for an R9 M390X with G-Sync 1440p laptop LCD.
Oops... yes freesync... Nvidia grabbed my brain cells.
Come on AMD, hurry up and do something already! TAKE MY MONEY! I'm holding out for the Clevo Batman with AMD R9 m390x, Intel Skylake, and Windows 10... Then I will gladly shell out $2500 for the laptop of my dreams*
*disclaimer - laptop for a year.
Little review here
Shots fired - nVidia explains why G-Sync is superior to Freesync
Regardless of the technical merits and drawbacks of Freesync right now, if AMD has nVidia this worried then I think they must be onto something.
Also this part made me LOL
I thought FreeSync was supposed to be universal. Or am I confusing it with Bulldozer (or something)?
Bulldozer is an AMD CPU.
^^ I was about to say this. Just like it was with Mantle... We have a name for it already: it's nGreedia - if they can't make money out of something, they won't support it.
That cheaper Acer FreeSync monitor is a TN panel vs the IPS panel of the G-Sync version. While I do have an nVidia card, the DisplayPort-only, single-input restriction is a deal killer for me. Bring on FreeSync!
I may very well be out of touch here... feel free to chime in. But I wonder if a hack / third-party software / driver would ever get these working with Nvidia cards? Zero chance?
I'm guessing even if it's possible, it wouldn't be easy, seeing how long it's taking AMD to bring it.
Might be possible (more or less the same technology, different implementation/hardware), but I'm pretty certain that nVidia has tied it up all nice and dandy, so a massive rework would be needed. We won't know for certain unless someone digs into it.
I'm guessing here, but it took them so long mostly because they had to go back and forth with VESA on moving/implementing the standard, so everyone could benefit (whereas nVidia just made it proprietary, which saved them the hassle and, "as a side effect", lets them cash in on it). At least half of the delay is probably because of this. Again, guessing.
Ignoring all the ghosting brouhaha about FreeSync atm, I'm just really glad I can finally grab a 144Hz 1440p monitor for less than $700. Screw nVidia and G-Sync.
FreeSync looks like it's going to be a real win. Nvidia needs to yield on this one... the greedy bastards.
TechReport also has a review of BenQ's XL2730Z FreeSync monitor.
While they also touch upon the ghosting issue, compare the language used with PCPer's, especially this bit here:
So there you have it, folks: TR has basically confirmed what I said earlier about the PCPer "review" really being an anti-FreeSync propaganda piece, possibly sponsored by nVidia. Not surprised given PCPer's stance during the 970 VRAM fiasco, but you'd think they'd at least have some journalistic integrity -- NOPE.
Also extremely ironic considering they implied the 970 VRAM issue was a mountain made out of a molehill, yet they're doing the exact same thing with FreeSync's ghosting issue. I swear, some people read/hear the word "AMD" in a sentence and immediately start spilling trash out of their pie holes.
Sorry for the rant.
Just curious, since the reviews I've read (AnandTech & TechSpot) don't touch upon the ghosting issue: is it inherent to FreeSync itself, or is it due to IPS panels' higher response times or the specific panels used in the review monitors?
Don't know about other panels, but in the case of this BenQ monitor (TN panel, btw), it's because for some reason turning FreeSync on forces overdrive off, which is what causes the ghosting. I've seen some users report that disabling FreeSync and turning overdrive back on completely eliminates any ghosting. So yeah, it should be as simple as a driver fix or, at worst, a firmware upgrade. Not sure why PCPer didn't bother testing something so simple and went straight to the doom-and-gloom conclusion.
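To make that concrete, here's a toy C sketch of the behavior being described. Monitor firmware obviously isn't public, so every name and number here is invented just to illustrate the current behavior versus what a fix might look like:

```c
#include <stdbool.h>
#include <stdio.h>

typedef struct {
    bool vrr_active;        /* FreeSync/Adaptive-Sync engaged */
    int  overdrive_level;   /* user's OSD setting, 0 = off */
} panel_state;

/* What the XL2730Z seems to do today (per the reviews): overdrive is simply
 * dropped whenever variable refresh is active, hence the ghosting. */
static int overdrive_current(const panel_state *p)
{
    return p->vrr_active ? 0 : p->overdrive_level;
}

/* What a driver/firmware fix might look like: keep overdrive on and scale it
 * with the instantaneous frame time instead of disabling it. The linear scale
 * here is purely illustrative; real tuning would be per-panel. */
static int overdrive_fixed(const panel_state *p, float frame_time_ms)
{
    if (!p->vrr_active)
        return p->overdrive_level;
    float scale = 16.7f / frame_time_ms;          /* gentler OD at lower refresh to avoid overshoot */
    if (scale > 1.0f)
        scale = 1.0f;
    int od = (int)(p->overdrive_level * scale);
    return od < 1 ? 1 : od;
}

int main(void)
{
    panel_state p = { true, 3 };                  /* FreeSync on, OD set to level 3 */
    printf("today: %d, fixed: %d\n", overdrive_current(&p), overdrive_fixed(&p, 20.0f));
    return 0;
}
```

At least going by the user reports above, nothing about variable refresh inherently requires overdrive to be off; it just needs to be handled instead of switched off wholesale.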