Do my usual routine - just go ahead and ask everyone (that has it listed) if they have the exact part number. At least two of them will respond positively, and you can go by the reputation of either of them. Good luck!
-
D2 Ultima likes this.
-
I love 144Hz and I love G-Sync. I'll get used to the TN, but I will eventually be getting an IPS with G-Sync. That 144Hz is making using the computer fun again, though.
TBoneSan likes this. -
-
I've been itching for a nice 27-30" 2560x1440 144Hz external LCD, though my desktop will need some updating first.
-
If there were a 1440p 144Hz 24" I'd be all over that, since that's the size limit of my desk.
Since there isn't, I'm happy enough trying to chase 144fps at 1080p.
D2 Ultima likes this. -
-
Yeah, it appears that I was correct. Cherry-picked 60Hz displays.
TBoneSan likes this. -
For 17" HTWN is already using the only screen that has been licensed for ZM systems.
If he uses any other screen in this so-called g-sync model > g-sync will no longer work
If he puts this g-sync screen into another Notebook model > g-sync will no longer work
If he uses any GPU other than GTX980M, GTX970M or GTX975M > g-sync will no longer work
If he flashes a pre-g-sync BIOS version > g-sync will no longer work
They could enable g-sync for all eDP screens, or even all so-called 'approved' screen models, for all notebooks (on non-Optimus systems) simply at the driver level, but that's not how money is made.
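To make that point concrete, here is a hypothetical sketch (not NVIDIA's actual code; the panel name and parameters are illustrative) of what a driver-level whitelist reduces to:

```python
# Hypothetical sketch of what a driver-level panel whitelist boils down to.
# Panel IDs and field names here are illustrative; this is not NVIDIA's code.

APPROVED_PANELS = {"LP173WF4-SPD1"}   # example entry mentioned in this thread

def gsync_allowed(panel_id: str, is_edp: bool, optimus_enabled: bool) -> bool:
    """Variable refresh needs the GPU driving the panel directly (eDP link,
    no Optimus); beyond that, the per-model whitelist is purely a policy check."""
    if optimus_enabled or not is_edp:
        return False
    return panel_id in APPROVED_PANELS

# Enabling it for every eDP panel on non-Optimus systems would be a one-line
# change: drop the APPROVED_PANELS lookup and just return True here.
```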
Last edited: Jun 7, 2015
triturbo, Mr Najsman, D2 Ultima and 1 other person like this. -
-
"If I whitelist the LP173WF2-TPB1 in the BIOS in my Prema Mod we could have 120Hz gSync because take that nVidia"Samot likes this. -
So if you can play at ~60 fps, you can increase the game's quality settings and play at 40-50 fps with a barely visible difference.
A caveat: with time you may start to notice a difference between G-Synced 40 fps and 60 fps, because your eyes and brain adapt and learn to react faster once given the opportunity. But that takes a lot of time (it depends).
So yeah, I can definitely tell you the biggest, huge disadvantage of G-Sync: "After you get used to it, you won't be able to play without it (below 60 fps on a normal monitor)." The stuttering will start killing you because you've already gotten used to its absence. You get a similar effect after a major PC upgrade, when you start playing games at twice the frame rate you played at before. After some time you just can't play at the lower frame rate you used to, which is why the "minimum playable" frame rate grew from 24 fps back in 2003 to 30 fps a few years later and 60 fps now. Once you've upgraded, going back means getting used to the bigger stutters all over again, which you obviously don't want to do.
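To put numbers on the stutter being described, here is a minimal sketch (plain Python, idealized timings, hypothetical helper names) comparing how long each frame stays on screen at a steady 45 fps on a fixed 60Hz v-synced panel versus a variable-refresh panel:

```python
import math

# Minimal sketch with idealized numbers: on-screen hold time per frame when a
# game renders a steady 45 fps on a fixed 60 Hz v-synced panel versus a
# variable-refresh (G-Sync style) panel. Real drivers add their own buffering.

REFRESH_MS = 1000.0 / 60   # ~16.7 ms between refreshes on the fixed panel
FRAME_MS = 1000.0 / 45     # ~22.2 ms to render each frame
NUM_FRAMES = 10

def fixed_refresh_holds():
    """With v-sync, a finished frame waits for the next refresh boundary, so
    hold times are always whole multiples of the refresh interval."""
    ready = [FRAME_MS * i for i in range(1, NUM_FRAMES + 1)]          # render done
    flips = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in ready]   # next vblank
    return [b - a for a, b in zip(flips, flips[1:])]                  # hold per frame

def variable_refresh_holds():
    """The panel refreshes when the frame is ready, so every hold equals the
    render interval and pacing stays even."""
    return [FRAME_MS] * (NUM_FRAMES - 1)

print("60 Hz v-sync:", [round(h, 1) for h in fixed_refresh_holds()])
print("variable    :", [round(h, 1) for h in variable_refresh_holds()])
# Expected: the fixed panel mixes ~16.7 ms and ~33.3 ms holds (visible judder),
# while the variable-refresh panel holds every frame for ~22.2 ms.
```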
You may quote me.
Prema likes this. -
Meh. I don't know. I used to be fickle, now I don't care any more. I'm just happy to get the time to game.
-
Really, @James D? I've yet to see someone who got killed by the CRT -> LCD transition, and LCDs were a nightmare back in the day; we have yet to see a technology comparable to CRT.
-
You obviously haven't seen a good one; it's not like schools get the best possible parts and equipment. The same goes for LCDs, by the way. A picture is worth more than a thousand words, so how on earth am I going to explain something that MUST be seen?
-
I don't miss the size, weight, heat, or power consumption of a nice CRT, but I do miss the quality image, the viewing angles, and most of all the scaling. There was no issue with scaling. Ever. If someone could fix the resolution-scaling issue of LCDs to be as non-problematic as on a CRT, they'd make millions. You wanted 1600x1200, bam, you got it. Want 640x480, bam, you got it. No pixel interpolation or jaggies or anything due to the screen. The screen image was the same quality regardless of resolution; it was just the actual digital resolution that would cause the jaggies, not a matrix of pixels. But if you can stick with the native resolution, there are so many advantages to LCDs.
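A quick illustration of the scaling point (hypothetical resolutions, not tied to any particular panel): on an LCD the problem is simply that the source-to-native scale factor is rarely a whole number.

```python
# Illustrative sketch of why non-native resolutions go soft on an LCD: the
# scale factor onto the fixed pixel grid is usually fractional, so the scaler
# has to interpolate. The numbers below are examples, not any specific panel.

NATIVE = (1920, 1080)

for src_w, src_h in [(1600, 1200), (1280, 720), (960, 540)]:
    sx, sy = NATIVE[0] / src_w, NATIVE[1] / src_h
    clean = sx == sy and sx.is_integer()
    verdict = "integer scale, stays sharp" if clean else "fractional, must interpolate"
    print(f"{src_w}x{src_h} -> {sx:.2f}x / {sy:.2f}x ({verdict})")

# A CRT has no fixed grid: the beam simply scans whatever resolution it is fed,
# which is why any supported mode looked equally "native" on it.
```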
killkenny1 and triturbo like this. -
Response time, one of the key benefits - it's second to none. Just like LCDs, CRTs varied greatly, but the response time was minimal (almost non-existent) on most of them (not to say all). Then, at the higher end, came better phosphors - wider gamut (unmatched by LCDs until recently; most animation studios used CRTs well into the 2000s until LCDs caught up); a narrower grille - more pixels (higher supported resolutions); and higher refresh rates - or lower resolutions at even higher refresh rates. For example, the Sony FW900 (a 24" CRT, 2001 model, pointed to as the pinnacle of CRT technology) has a "native" resolution of 1920x1200@90Hz, yet you can get up to 2304x1536@80Hz by default, or 2560x1600@75Hz if you tweak it a bit. The other way works as well - 1280x800@120Hz. Oh, and if it wasn't worth it, explain this:
http://www.ebay.com/itm/SONY-GDM-FW900-24-TRINITRON-WIDESCREEN-PRO-DISPLAY-/271706446945
Some people pay the price and accept the disadvantages, so to speak, but the ultimate advantage is there - the picture, and that's why you buy a display in the first place, no? BTW, that's why my current build is an 8740w. Anyway, it could be turned into quite the list, but it's a thing of the past, like it or not. The thing is, as I said - you have to see it to believe it. If you haven't seen a quality CRT, there's no way to explain it, and again, the same goes for LCDs.
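For a sense of the resolution/refresh trade-off behind those FW900 modes, here is a rough pixel-clock estimate (a sketch assuming a generic ~30% blanking overhead, not the monitor's actual timings):

```python
# Back-of-the-envelope pixel-clock estimates for the FW900 modes mentioned
# above, assuming a typical ~30% blanking overhead for CRT timings. The real
# modelines differ, so treat these strictly as ballpark figures.

BLANKING_OVERHEAD = 1.3

def pixel_clock_mhz(width, height, refresh_hz):
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

for w, h, hz in [(1920, 1200, 90), (2304, 1536, 80), (2560, 1600, 75), (1280, 800, 120)]:
    print(f"{w}x{h}@{hz}Hz  ~ {pixel_clock_mhz(w, h, hz):.0f} MHz pixel clock")

# The three high-resolution modes all land in a similar ~270-400 MHz band: the
# tube trades resolution against refresh rate inside a roughly fixed bandwidth
# budget, which is why dropping to 1280x800 leaves headroom for 120 Hz.
```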
Technologies I would really like to see: MEMS displays - close to CRTs in most aspects, minus the weight (which is good) and the half-pixel drawing/resolution independence (which is not); and CNT FED - the successor to CRTs, and I mean that quite literally; unlike plasmas, this one can draw half pixels. Development seems to be at a standstill though, which is a real shame.
Not to go completely off-topic - that's why you need technology like G-Sync in the first place; CRTs just worked.
HTWingNut likes this. -
And yes, I forgot to mention that, lol. Response time was basically a non-issue.
Last edited: Jun 5, 2015
triturbo likes this. -
http://gamenab.net/2015/06/05/last-statement-about-the-nvidia-gsync/
Hi,
Since a lot of websites and people contacted me about G-Sync, which was misrepresented by different brands including NVIDIA itself, they wanted to hear my story.
Here I will clarify a few points.
- NVIDIA/ASUS didn't know that G-Sync could work in a windowed mode; I proved them wrong (Article)
- G-Sync on Mobile/eDP/DP was always possible without the need for a module
- I've been working on a payload and algorithm for a while
- I worked with ASUS voluntarily, and they stole my work (by the way, I never contacted ASUS for anything; they contacted me so I could help them with something else, that's all)
I'm actually from France and I've been working in the video game and cinema industry for over 15 years. I've been working under a certain username, providing patches and mods to improve how PC games run... I was doing mapping/modding and texturing. PCPer spun the entire story to make themselves look good. They are arrogant and disrespectful. In their view, anything that doesn't come from some kind of big company should not be told to the public, and I should have kept my research a secret. So everyone made up their own story of my life, and no one was even close.
The truth is simple. I made a payload/algorithm that I shared with different companies. I wanted to improve frame timing (variable, controlled timing) in a certain way. I knew that every panel had this capability anyway; what was needed was a software way to properly translate the information that passes between the GPU and the monitor.
Unfortunately for me, I was too stupid and shared too much information. So I wasted my time on this for nothing.
What people don't realise today is this:
Companies like NVIDIA have only one thing in mind: to sell a license for anything and everything. They try to steal or buy as many patents as possible and make money out of them.
So what did you expect from them? To tell you that the software-only G-Sync approach was always possible and to give it to you for free? To tell you that it was possible because a person shared certain research with them?
Everything has a price, so why not exploit this.
No one will know the truth behind this, because you will always see the media hiding what they really know, or making up whatever story pushes you to believe them.
Someone sent me a message recently saying that he laughed a lot about those guys from PCPer saying that G-Sync in windowed mode was not possible. I proved them wrong and nobody believed me. NVIDIA announced it a few months later, and everybody believes them, even those guys. Oh man, this is unbelievable.
So what's going on today?
- G-Sync on mobile (eDP and DP, they said) by buying their supposed license for software use? Seriously?
- G-Sync working in windowed mode, using what I was explaining back in February? No, seriously?
- G-Sync working with DSR, SLI, etc... no, seriously, exactly what I was explaining?
You laugh at me, but at least I'm not the one who is ignorant out there. It doesn't matter who wins or loses. In the end you can see the truth starting to be revealed each day by NVIDIA itself; don't be fooled or blind.
Anyway, I don't have any ASUS, MSI, Alienware or any related laptop, or a desktop to base work on.
So in the end, they won, I lost; good for them.
Hopefully the people here who believe in me will try to help let them know that they should at least apologize to me; a proper thank-you would not be too much to ask.
Obviously they found a way to make a lot of money from it, while I gain nothing. I only shared my experience and knowledge (that was the better way, I could help).
Thanks for your support, guys.
TusionOS is cancelled and all of my work is completely over…
be77solo, Prema, HTWingNut and 1 other person like this. -
-
So will a laptop like mine (in my sig) eventually be G-Sync-capable if I put the 120Hz eDP display in it and someone writes it into the BIOS?
-
The P750ZM is getting a 4K screen. The P770ZM is not using the same panel (obviously). The Aorus X7, I'd believe, uses the same screen, as so few eDP panels are available for 17". -
EDIT: PM'd to Prema
Last edited: Jun 7, 2015 -
Are the screens used in the MSI GT72 G and Asus G751 also the LP173WF4-SPD1?
What's the 3K screen model used in the Aorus X5? -
-
Is there a list of the G-Sync models coming out?
Seems like G-Sync is the real deal and I'm so glad it's finally coming out in laptops. -
So this leaked driver, 346.87, has mobile G-Sync. If my display is at least 75Hz, then from all the reading and googling I've done I don't see a reason why it wouldn't work... NVIDIA is getting really greedy lately, that's for sure, making sure they collect any royalties they can and insisting that G-Sync needs a module for a display panel to work... which is BS. The display panel really only needs to be 75Hz (does it have to be eDP?), plus a file in the monitor's driver software basically saying that you are granted access to use the G-Sync feature. It's crazy...
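On the "file saying you are granted access" point: the panel's own capabilities are advertised in its EDID. Below is a hedged, simplified sketch of pulling the refresh range out of a raw EDID block; the 75Hz threshold is only this thread's speculation, not a documented requirement, and handling of the extended-rate offset flags is omitted.

```python
# Hedged sketch: read the vertical refresh range from a raw 128-byte EDID
# block and apply the "at least 75 Hz" rule of thumb speculated about in this
# thread. Offsets follow the EDID 1.3/1.4 display-range-limits descriptor
# (tag 0xFD); extended-rate offset flags are ignored for brevity.

def vertical_refresh_range(edid: bytes):
    """Return (min_hz, max_hz) from the range-limits descriptor, or None."""
    for base in (54, 72, 90, 108):              # the four 18-byte descriptors
        d = edid[base:base + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]                   # min / max vertical rate in Hz
    return None

def meets_75hz_guess(edid: bytes) -> bool:
    rng = vertical_refresh_range(edid)
    return rng is not None and rng[1] >= 75
```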
-
Now the next thing is, will there be ULMB (ultra low motion blur) implemented as well?
-
Edit: I'm wrong, it involves backlight strobing...
Last edited: Jun 7, 2015 -
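For what that strobing buys in practice, here is an illustrative persistence calculation (example numbers only; actual ULMB pulse widths vary by monitor and setting):

```python
# Illustrative persistence comparison: sample-and-hold vs. a strobed backlight.
# The 25% duty cycle is an example value; real ULMB pulse widths are
# monitor-dependent and usually adjustable.

def hold_persistence_ms(refresh_hz: float) -> float:
    """Sample-and-hold: each frame stays lit for the full refresh period."""
    return 1000.0 / refresh_hz

def strobed_persistence_ms(refresh_hz: float, duty_cycle: float) -> float:
    """Strobing: the backlight is lit for only a fraction of each period."""
    return duty_cycle * 1000.0 / refresh_hz

print(f"120 Hz sample-and-hold : {hold_persistence_ms(120):.2f} ms")
print(f"120 Hz strobed at 25%  : {strobed_persistence_ms(120, 0.25):.2f} ms")
# Lower persistence means less eye-tracking motion blur, which is the point of
# ULMB; the trade-offs are reduced brightness and, on the desktop G-Sync
# monitors of this era, not being usable at the same time as G-Sync.
```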
Can I ask what eDP version mobile G-Sync needs? Is it 1.2 / 1.2a (FreeSync support started with this, iirc) / 1.3? (Anyone with knowledge of AW machines, please let me know which version the Haswell version of the AW17 and the Ivy Bridge M17x R4 use.)
And what version do the P570WM / P377SM-A / P370SM3 and the P7x0ZM machines use...?
Thanks!!
Last edited: Jun 7, 2015 -
Yes. I'm not exactly sure why, but it seems like the laptop displays that are going to get G-Sync are 75Hz stock (well, kind of). I'm not sure if 75Hz is a requirement for G-Sync or what, but I know the display in my beast (in my sig) can overclock to around 90Hz, maybe more, though I didn't want to push it. Even though I do have the 120Hz display and eDP cable, I'm too scared to try to install it, and Xotic won't install it for me, so I don't know what to do except throw it in the parts pile and either save it until my warranty is up and I'm more comfortable with this laptop and then attempt the install, or maybe just sell the display and cable and be done with it. Not sure... Sorry, got a little off topic there, but yeah, I have seen G-Sync in person and to me it really did look good. But if you have a really good laptop and use V-Sync, like I do most of the time with my 60Hz panel, it looks the same as G-Sync. I don't know if it's the SLI 980Ms or what, but it seems that if there is enough GPU power, G-Sync is moot. Maybe in a year or so G-Sync will be more suited to me, but it would be nice to have it now, just so I have the option to use it if I want to. I don't know what's going to happen, though, given how NVIDIA is rolling out the G-Sync feature; I'm sure there would have to be some sort of BIOS update saying yes, you can use G-Sync now... who knows. But all this new stuff is exciting for me. I know there is always new tech coming out, but recently I have noticed a lot of good stuff is going to be coming out within the next six months to a year... can't wait.
@Ashtrix I am pretty sure you have to have DisplayPort 1.2a? From what I have read...
Ashtrix likes this.
G-sync coming to notebooks with 75Hz displays?
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, May 31, 2015.