The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Clevo 980M G-Sync

    Discussion in 'Sager and Clevo' started by Zymphad, Aug 10, 2015.

  1. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Hello,

    Is there a difference between a Clevo 980M and a Clevo 980M sold with G-Sync advertised? I was under the impression that ALL 980Ms are G-Sync capable, since mobile G-Sync is NOT hardware but software over eDP, whereas desktop G-Sync is hardware and requires an actual G-Sync module.

    I was hoping that by upgrading to the LG IPS panel that ships with the G-Sync-advertised models, and by using a BIOS that enables G-Sync support, I would have G-Sync.

    Anyone have input on this idiocy created by Clevo?
     
  2. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    It's not Clevo, it's Nvidia. According to Nvidia it's hardware/firmware and you do need a "G-sync enabled" 980m and LCD to use G-sync. You can't just use any 980m or LCD. Of course there are apparent "hacks" that allow you to enable G-sync with an eDP LCD.
     
  3. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    But Gamenab ded (fo real this time)
     
  4. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    Alternatively, all GM204 mobile chips are advertised as "Supported GPUs" for G-Sync.....if you connect them to an external G-Sync display via DisplayPort. So they're not wrong.
     
  5. t456

    t456 1977-09-05, 12:56:00 UTC

    Reputations:
    1,959
    Messages:
    2,588
    Likes Received:
    2,048
    Trophy Points:
    181
    Clevo made the P770ZM G-Sync capable six months before Nvidia announced G-Sync laptops. This is from the ZM Service Manual dated December 2014:
    Hence ... praise Clevo (also for distributing schematics :cool: :cool: :cool: :cool: :cool: :vbthumbsup: :vbthumbsup: :vbthumbsup: :vbthumbsup: :vbthumbsup:). Seriously; how can one even contemplate shackling down to a Compal or Quanta???!

    The interesting thing is that GP = 'General Purpose' and hooks up to Lynx Point, whereas on SLI systems it was used for audio-sync. So ... is it necessary or is it only a tie-in to the G-Sync cookie (for the $100 nvidia licence) and, hence, enforced on Clevo to make their systems 'viable' for G-Sync? Mainly ... how can we make the HDA_SYNC pin #42 (=6x9) do our bidding on the P37**M systems? Or, heck ... even the P1*0*M, for that matter? They're all eDP and the pin is there:
    Thus; 'it can be used as general purpose'. Not only that, it has been re-used for general purpose (G-Sync = "yes"). Of course, that means sacrificing HDA_SYNC:
    In other words; ditch nvidia's High Definition Audio altogether:
    Which is, it seems, exactly what the ZM does; the Realtek hooks directly to Lynx Point, so no issues there. The P1*0*M does something weird:
    Hooked to thermal or battery throttle? Meh ... who cares; cut R21 or drop to ground with wire and problem solved. For the P37**M it may be a neater option to terminate HDA_BITCLK and HDA_SDOUT properly, as indicated above:
    Might just wing it ... the main thing is the meaning-of-life pin; what signal does the nvidia driver expect on the pin for it to switch 0 -> 1? Anyone with a ZM-G and a multimeter? It could be just that simple; bios ties into chipset -> chipset controls GP-pin; why bother making things more difficult after validation?

    Either that or wait for AMD to retroactively enable FreeSync on their (ancient) mobile cards. At least there'll be no shenanigans with cookies, licences and 'proprietary' technology; it conforms to the (e)DP spec, so it would make a drop-in option.
     
  6. Brent R.

    Brent R. Notebook Evangelist

    Reputations:
    37
    Messages:
    673
    Likes Received:
    148
    Trophy Points:
    56
    Man, sorry but I can't follow this at all. Any way for you to dumb it down for a dummy like me? lol, thanks. I would love to use G-Sync on my NP9377SM-A; do you think that would ever be possible? Thanks
     
  7. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Why wouldn't it be G-Sync supported if I connect it to an eDP display using a BIOS that enables G-Sync?
     
  8. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,151
    Trophy Points:
    931
    afaik the GSync vbios is different from the standard vbios of the 980M. u cant interchange them, otherwise ull get a black screen. based on that i guess theres also some different hardware involved on the 980M, maybe different kinda vbios chip or something...
     
  9. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    According to @Prema the resistors are different on the G-Sync MXM cards just like GeForce vs. Quadro MXM
     
    jaybee83 likes this.
  10. t456

    t456 1977-09-05, 12:56:00 UTC

    Reputations:
    1,959
    Messages:
    2,588
    Likes Received:
    2,048
    Trophy Points:
    181
    Going to guess ... also id-only again :( ?
    So ... resistor change from '13D7' to '1617', flash '84.04.48.00.1C' or '84.04.48.00.20' and all is well? You'd think with any physical change there'd be a different revision GM204, yet they're all A1 (the 'FF' is a fluke; old GPU-Z, perhaps?).

    The third was from a Eurocom P770ZM, so that should be fine straight away. Someone did successfully flash ZM -> ZM-G without issue, except G-Sync wouldn't work anyway due to unsupported panel.

    Yet ... which resistors? Can find them on the W230SS since it has a soldered gpu; the Service Manual lists these (hurray!) and managed to measure them (want a Quadro). But without a schematic ... a visual compare alone is useless; they're about the size of an ant's leg, so no markings.
     
  11. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Unbelievable horseshit. Unbelievable. Clevo was G-Sync ready for months and sold laptops with it disabled for months before deciding to enable it? Or NVidia decided to sell 980m with a few resistors different?

    Mobile gaming absolutely sucks. Why do I keep getting suckered by these dipsh%t companies, screwing me over on features after paying $1850? How is this even ethical?
     
  12. i_pk_pjers_i

    i_pk_pjers_i Even the ppl who never frown eventually break down

    Reputations:
    205
    Messages:
    1,033
    Likes Received:
    598
    Trophy Points:
    131
    Why do you guys even want G-Sync so badly? I thought it's only useful if your frame rate dips below your refresh rate (which shouldn't happen if you have good specs)?
     
    hmscott likes this.
  13. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    It's usually the opposite. Screen tearing happens when the frames being generated are out of sync with your display, which means it can happen at any time. But for most people, it happens when you have more than 60 FPS on a 60 Hz screen.

    To me it's the biggest update and improvement from NVidia in the last few years. And disabling it for some users by changing a few resistors on the 980M is a slap in the face to their customers. It's a great example of what is wrong with mobile gaming.
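    To make the timing mismatch concrete, here is a minimal Python sketch, assuming a 60 Hz panel and a hypothetical ~90 fps renderer with vsync off (illustrative only, not any vendor's implementation). Every buffer swap that lands mid-scanout puts parts of two different frames on screen at once, and that boundary is the tear line:
    [CODE]
    # Minimal sketch: tearing happens because frame completion is not synchronized
    # with the display's fixed refresh cadence. The numbers below are assumptions.
    REFRESH_HZ = 60.0
    SCANOUT_MS = 1000.0 / REFRESH_HZ      # one full top-to-bottom scanout (~16.7 ms)
    GPU_FRAME_MS = 11.1                   # hypothetical ~90 fps renderer, vsync off

    def tear_positions(duration_ms=200.0):
        """Fraction of the screen height at which each tear line lands."""
        tears = []
        t = 0.0
        while t < duration_ms:
            t += GPU_FRAME_MS                          # a new frame finishes and the buffer swaps
            phase = (t % SCANOUT_MS) / SCANOUT_MS      # how far down the panel the scanout is
            if phase > 0.0:                            # swap landed mid-scanout -> visible tear
                tears.append(round(phase, 2))
        return tears

    print(tear_positions())   # the tear line wanders up and down the screen every refresh
    [/CODE]
    With G-Sync or any variable-refresh scheme the panel instead waits for the frame to finish, so the swap always lines up with the start of a scanout and the tear never appears.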
     
  14. i_pk_pjers_i

    i_pk_pjers_i Even the ppl who never frown eventually break down

    Reputations:
    205
    Messages:
    1,033
    Likes Received:
    598
    Trophy Points:
    131
    I have never noticed any screen tearing on any display, under any circumstances, ever. I wasn't aware that screen tearing reduction was one of the features of G-Sync.
     
    hmscott likes this.
  15. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    Some people are sensitive to it, a bit like lower frame rates or input lag.
     
  16. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,151
    Trophy Points:
    931
    GSync is cool, but thing is: as long as u havent experienced it urself you dont miss it :p :D
     
    hmscott likes this.
  17. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    I don't believe you, for one. Secondly, whether you notice it or not, it happens. I guarantee it.

    What do you think G-Sync even is? It synchronizes your display with your GPU output...

    Not true. I miss it without having experienced it. I experience screen tearing consistently. And if NVidia and Clevo market this as a feature for the 980M, I miss it and I want it. I don't have to experience it to miss it. It would get rid of one of the most irritating parts of mobile gaming, where we often suffer with garbage display panels. Compared to what is available for desktops, and considering we pay up to 2x-3x for our machines at equivalent performance, that's theft as far as I'm concerned.

    I dealt with it and ignored it before G-Sync and FreeSync became available on mobile machines. But now that it is here, I can't ignore that it exists, nor ignore that NVidia and Clevo stole from me; I paid $1850 for a purposely gimped machine.

    The G-Sync model is a P770ZM. It's called P770ZM-G because it is a P770ZM with no changes at all except they decided not to be a dick. That is the difference, Clevo and NVidia for the G model opted to not be dicks.
     
    Last edited: Aug 11, 2015
  18. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    FYI, that resistor change may actually be important.

    With eDP the display scaler is essentially built into the GPU. One of the major issues with the original G-Sync tests (i.e. leaked drivers) was that the display would drop out because the "scaler" was incapable of re-displaying frames when VBLANK got too big (i.e. at frame rates below ~40 fps it didn't revert to regular double-buffering, so it dropped). So it's possible that there is in fact a hardware change that allows this to happen or extends the length of that buffer. Attempting to send variable VBLANK to a panel which doesn't support it properly could produce some very strange results.

    Either way, mobile G-Sync is based on variable VBLANK, but that is NOT enough to make it work (exactly why the G-Sync module on desktop monitors exists). Other changes must exist to deal with that low-frame-rate issue, so don't get all upset because you bought too early or did not consider this.
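    A rough sketch of that low-frame-rate problem, assuming made-up panel limits of 40-75 Hz rather than the P770ZM-G's actual figures: once the frame interval exceeds what the longest allowable VBLANK can cover, the previous frame has to be re-presented from a buffered copy somewhere, which is the job the desktop module (or, as discussed below, a PSR TCON) takes on:
    [CODE]
    # Sketch of variable VBLANK vs. the panel's minimum refresh rate.
    # The 40-75 Hz window is an assumption for illustration only.
    PANEL_MAX_HZ = 75.0
    PANEL_MIN_HZ = 40.0                        # longest refresh interval the panel tolerates
    MAX_INTERVAL_MS = 1000.0 / PANEL_MIN_HZ    # 25 ms
    MIN_INTERVAL_MS = 1000.0 / PANEL_MAX_HZ    # ~13.3 ms

    def schedule_scanouts(frame_interval_ms):
        """How one frame interval could be covered by the panel."""
        if frame_interval_ms <= MAX_INTERVAL_MS:
            # Fits inside one stretched VBLANK: hold scanout until the new frame arrives.
            return [max(frame_interval_ms, MIN_INTERVAL_MS)]
        # Too slow for a single refresh: the previous frame must be re-scanned from a
        # buffered copy (desktop module DRAM, or a PSR TCON's local buffer).
        repeats = int(frame_interval_ms // MAX_INTERVAL_MS) + 1
        return [frame_interval_ms / repeats] * repeats

    print(schedule_scanouts(20.0))   # 50 fps -> one refresh, VBLANK simply stretched
    print(schedule_scanouts(33.3))   # 30 fps -> previous frame re-presented once
    [/CODE]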

    Things really won't be properly sorted out until Intel picks a side. When Intel implements some sort of eDP passthrough or adaptive sync, that will have major implications for Nvidia. I.e., if Intel implements Adaptive-Sync only, then Nvidia may not be able to make use of it via Optimus etc.
     
  19. Brent R.

    Brent R. Notebook Evangelist

    Reputations:
    37
    Messages:
    673
    Likes Received:
    148
    Trophy Points:
    56
    Dude, you got what you paid for....you knew G-Sync was coming out soon, you could have waited, but you didn't...now you have buyer's remorse and try to blame Clevo for it....not cool
     
  20. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    @Stooj interesting info. Asus released their laptop G-Sync ready prior to my purchase of this P770ZM, and it functions fine. So Clevo doesn't have an excuse for why they sold me an identical laptop but with one feature eliminated, for a reason I can't fathom. I do not believe it was NVidia telling Clevo not to, since Asus did it and Asus is a far bigger partner than Clevo.

    Also, just a personal disgust with Optimus: I'll never buy a performance laptop with that again, and if a machine has only an Intel GPU it will be a thin-and-light 2-in-1 laptop.

    Wrong, I did not know G-Sync was coming to the P770ZM. And it's not buyer's remorse, since I did not make a wrong choice in what I bought; there was no G-Sync option to buy at the time. I can blame Clevo for releasing the exact same model but with G-Sync enabled. The G-Sync model is not a different laptop; it is the same laptop, except they sold a gimped, incomplete version before releasing the full, complete version. That is BS. Clevo knew about G-Sync and they already had the license for it; they chose to sell me the P770ZM with G-Sync disabled when the hardware was ready.
     
    Last edited: Aug 12, 2015
  21. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    This is incorrect. Mobile G-Sync uses eDP Adaptive-Sync and PSR. The PSR TCON on the LCD has its own DRAM buffer to store frames for when it needs to fill in, serving the same purpose as the proprietary module in desktop G-Sync monitors.

    Before Gamenab gave in to pressure from Nvidia/OEMs and shut down his site for good, he released a couple of betas of his G-Tools. It successfully unlocked G-Sync on current Nvidia drivers for MSI GT72 and ASUS G751 notebooks that were sold before mobile G-Sync was officially "released", as long as they had the compatible eDP IPS panel from LG. There was no flicker and no blackouts.

    So no, the resistor change isn't important. As far as I can tell, it's just some form of malicious security. As has been pointed out, modding the resistors on a "non G-Sync" MXM card to change its device ID turns it into a "G-Sync" MXM. Same concept as the people who hack GeForces into their Quadro counterparts.
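    As a toy model of the PSR behaviour described above (the class and method names are made up for illustration, not taken from any TCON datasheet): the TCON keeps the last frame in its own DRAM and can keep scanning it out without fresh eDP traffic, which is the same fill-in role the desktop module's memory plays:
    [CODE]
    # Toy model of a PSR-capable TCON with its own DRAM frame buffer.
    class PsrTcon:
        def __init__(self):
            self.buffer = None                   # last full frame received over eDP

        def receive_frame(self, frame):
            self.buffer = frame                  # GPU pushed a new frame: store it and scan it out
            return "scan out new frame"

        def idle_tick(self):
            if self.buffer is None:
                return "nothing to show"
            # No new frame arrived in time: refresh the panel from local DRAM,
            # the same fill-in job the desktop G-Sync module's memory performs.
            return "self-refresh from local DRAM"

    tcon = PsrTcon()
    print(tcon.receive_frame("frame 0"))
    print(tcon.idle_tick())
    print(tcon.idle_tick())
    [/CODE]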

    As much as I disagree with @Zymphad on many things, I have to agree with him on this one. Nvidia/OEMs are bending customers over a table. There is a great deal of rightful outrage on various Clevo, MSI, Asus, and Aorus forums.
     
    hmscott and TomJGX like this.
  22. t456

    t456 1977-09-05, 12:56:00 UTC

    Reputations:
    1,959
    Messages:
    2,588
    Likes Received:
    2,048
    Trophy Points:
    181
    Yes, this is Nvidia, after all; it charges laptop builders for the G-Sync sticker, so it's not a big step to think they'd do the same to MXM builders. So:
    1. 980M chip: $***
    2. 980M chip: $*** + $100
    The only difference is that for the cheaper option you'd have to promise to solder different resistors onto your pcb. If true, it saved $100 on the retail price (also for a few others, besides Clevo). From the 353.62 driver:
    Maybe a simple device id change is no longer possible, but what if you'd swap the actual ids' install sections? Either way, even with a 'DEV_1617&SUBSYS_75011558' vbios, using a 13D7 card the installer will not see:
    • DEV_1617&SUBSYS_75011558
    ... but, instead:
    • DEV_13D7&SUBSYS_75011558
    ... since it pulls those bits from the card's resistors.

    Going to try either an extensive inf mod (for a Quadro or 960M) or, failing that, intercept the device id in memory. That last option is certain to work, but a total * :eek:* process, considering the time it takes for a gpu driver to install ... soldering might be easier.
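    A quick Python sketch of the kind of inf mod meant here: point the 13D7 hardware ID at whatever install section the driver's inf maps the 1617 ID to. The line format is assumed from typical Nvidia inf device entries, whether this alone would be sufficient is an open question, and a modified inf breaks driver signature enforcement, so treat it as a starting point only:
    [CODE]
    # Hypothetical inf-mod helper: re-point the non-G-Sync hardware ID at the
    # install section used by the G-Sync ID for the same subsystem.
    import re, sys

    GSYNC_ID     = r"PCI\VEN_10DE&DEV_1617&SUBSYS_75011558"
    NON_GSYNC_ID = r"PCI\VEN_10DE&DEV_13D7&SUBSYS_75011558"

    def swap_install_section(inf_text):
        # Find which install section the G-Sync ID maps to, e.g.
        #   %NVIDIA_DEV....% = SomeSection, PCI\VEN_10DE&DEV_1617&SUBSYS_75011558
        m = re.search(r"=\s*([^,\s]+)\s*,\s*" + re.escape(GSYNC_ID), inf_text)
        if not m:
            raise SystemExit("no install line found for " + GSYNC_ID)
        gsync_section = m.group(1)
        # Rewrite the 13D7 line(s) to point at that section instead.
        pattern = r"(=\s*)[^,\s]+(\s*,\s*)(" + re.escape(NON_GSYNC_ID) + r")"
        return re.sub(pattern,
                      lambda mo: mo.group(1) + gsync_section + mo.group(2) + mo.group(3),
                      inf_text)

    if __name__ == "__main__":
        # Usage: python swap_inf.py <extracted driver .inf>  (prints the modded inf)
        with open(sys.argv[1], encoding="utf-8", errors="ignore") as f:
            print(swap_install_section(f.read()))
    [/CODE]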

    ps.
    Bug: CODE tag does not disable double-space replacement.
     
    jaybee83 likes this.
  23. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    From my understanding of the usual PSR-enabled TCONs (i.e. Parade Tech), the onboard buffer only stores a single frame for retransmission at a fixed frequency determined by the panel. However, the TCON still needs to be aware of the panel's minimum refresh rate for it to actually function (many panels probably go out prepped only for fixed refresh). Just because the TCON is Adaptive-Sync aware and PSR capable does not mean it will play well with the panel. I'm guessing there's also some other mucking around required since they're running it at 75 Hz.
    Most notably, in the spec sheet for the LG panel used in the P770ZM-G, there's a note specifically mentioning potential problems with backlight flicker when operating at 40 Hz.

    I really can't find enough information on the TCON or the panels to put all the pieces together, though.
     
  24. Brent R.

    Brent R. Notebook Evangelist

    Reputations:
    37
    Messages:
    673
    Likes Received:
    148
    Trophy Points:
    56
    So it's Clevo's fault you didn't research that they were coming out with G-Sync soon....okay w/e....I know I researched for months, any time that I had time to do it, before I bought my laptop...I think most people do research before they buy a laptop...guess just not you, but that is not Clevo's fault that you didn't do the research..
     
  25. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    Even if it were really simple to enable hardware-wise, I don't see people railing against Intel for disabling cache on lower-end models or support for ECC RAM on the normal Core series.

    It's pretty standard practice across the entire tech industry.
     
  26. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Bad analogy because what's the benefit of that for gaming? Gamers actually care about G-Sync.
     
  27. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    http://forum.techinferno.com/clevo/3119-[bios-vbios-mods]-prema-mod-stock-305.html#post142592

     
  28. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    Which proves how widespread it is; it goes beyond gaming. Within gaming, the 970M/980M have units disabled that may be working.
     
  29. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    So how do we unlock the disabled SMs?
     
  30. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    People used to flash modified bios files, so chip manufacturers now laser-cut the units to physically disable them.
     
  31. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Oh I thought you knew something I didn't
     
  32. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    I wish it did still work, it was a fun time :)
     
  33. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
  34. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    No difference. The fact that some dude on the Internet could enable it with a software change on MSI and Asus notebooks sold without the G-Sync sticker is proof of that.
     
  35. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    It was only shown to function correctly on the ASUS G751 and MSI GT72 because some of them already had the 13D7 version of the 980M as well as the same LG IPS panel. This has no bearing on whether Clevo or other manufacturers had/have pre-existing machines that are G-Sync capable.

    Here's a review of the Old Haswell G751 from 7 months ago with no G-Sync:
    http://www.eteknix.com/asus-rog-g751jy-t7051h-gaming-notebook-review/
    Note the Device ID in the GPU-Z screen shot. 13D7, the "supposed" G-Sync enabled 980M from long before G-Sync for mobile was even considered.

    Even then, the device ID seems to be meaningless to the whole G-Sync issue anyway. HTWingnut's own review of the P770ZM-G G-Sync laptop has a 1617 980M in it:
    http://forum.notebookreview.com/thr...0zm-g-980m-g-sync-review-by-htwingnut.777906/

    I found some more interesting stuff in my research. Here's the Original GTX980M specification page that was put up by nVidia:
    http://web.archive.org/web/20141010...notebook-gpus/geforce-gtx-980m/specifications

    Biggest thing to note there:
    LCD – eDP 1.2 support

    Now have a look at ParadeTech (makers of the PSR/Adaptive Sync capable TCONs used in the G-Sync "approved" panels):
    http://www.paradetech.com/products/displayport-lcd-timing-controller-products/
    The PSR capable TCons are ALL eDP 1.3 only.

    In around May, that eDP 1.2 line was removed from the 980M spec sheet, so I'm guessing something changed there. What I'm not sure of, though, is whether going from eDP 1.2 -> 1.3 can be done purely in software. I'm guessing that may not be the case, and that this is possibly the deciding factor in why a "vanilla" P770ZM may not be able to be simply upgraded.
     
    jaybee83 likes this.
  36. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Maybe I'm blind, but my honest first impression of G-Sync was "wow, what a letdown, and what an overhyped technology". My experience closely mirrors Eurogamer's, so I won't elaborate here. But suffice to say 40 FPS still didn't feel like 60 FPS, and if there was any improvement in fluidity, it was minuscule. I guess I got hyped up by the reviews too much. (OK, to be fair, high-FPS gaming did feel more fluid, but we're talking diminishing returns here; really, the whole point of G-Sync to me was what it could do when my frames go below 60, or when the frames jump around even while staring at a wall, a la Watch Dogs.)

    IMO Zymphad you're really not missing out on much, but then again I seem to be in the minority. *shrug*
     
    TomJGX and jaybee83 like this.
  37. i_pk_pjers_i

    i_pk_pjers_i Even the ppl who never frown eventually break down

    Reputations:
    205
    Messages:
    1,033
    Likes Received:
    598
    Trophy Points:
    131
    That's kind of exactly what I was expecting for G-Sync - over-hyped, unnecessary, and not really noticeable.
     
  38. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I'm not bothered by tearing, so I couldn't care less about G-Sync, and I certainly won't invest in G-Sync to fill Nvidia's coffers, not with the unethical way they're approaching the whole thing
     
  39. Brent R.

    Brent R. Notebook Evangelist

    Reputations:
    37
    Messages:
    673
    Likes Received:
    148
    Trophy Points:
    56
    How can I tell what version of 980M I have? Thanks
     
  40. TheFallenPenguin

    TheFallenPenguin Notebook Geek

    Reputations:
    3
    Messages:
    85
    Likes Received:
    15
    Trophy Points:
    16
    I don't blame the manufacturers because they were probably under NDA, but Nvidia is just being ridiculous by expecting us to buy a completely "new" 970M/980M just to have G-Sync support, even though the "old" ones are perfectly capable of supporting G-Sync. It's just such a greedy thing to do; they should have just enabled G-Sync for all 900M series cards.
     
  41. Brent R.

    Brent R. Notebook Evangelist

    Reputations:
    37
    Messages:
    673
    Likes Received:
    148
    Trophy Points:
    56
    From everything I have read, I am thinking the 980Ms I have in my machine might actually be the ones that support G-Sync, but I'm not sure. Honestly, I tried G-Sync out for the first time last week on someone else's laptop and I don't really see what all the fuss is about. I think it is more of a hype thing, imo. It might be good for a lower-end system, but imo anything that has G-Sync is most likely using a 900 series card and shouldn't have much need for G-Sync, because I only saw a small difference when frames were very low.
     
  42. TheFallenPenguin

    TheFallenPenguin Notebook Geek

    Reputations:
    3
    Messages:
    85
    Likes Received:
    15
    Trophy Points:
    16
    G-Sync would make a huge difference on laptops, especially with demanding games like GTA V and Witcher 3.
    Post a GPU-Z screenshot and I'd be able to tell you if you have a 'G-Sync Approved' GPU.
     
  43. Brent R.

    Brent R. Notebook Evangelist

    Reputations:
    37
    Messages:
    673
    Likes Received:
    148
    Trophy Points:
    56
  44. TheFallenPenguin

    TheFallenPenguin Notebook Geek

    Reputations:
    3
    Messages:
    85
    Likes Received:
    15
    Trophy Points:
    16
    Nope, not G-Sync approved. Your Device ID needs to be 1617 for G-Sync. Your ID is 13D7, which is the old, non-G-Sync 980M.
     
  45. Brent R.

    Brent R. Notebook Evangelist

    Reputations:
    37
    Messages:
    673
    Likes Received:
    148
    Trophy Points:
    56
    Ah okay, I was told that the ASUS G751 and MSI GT72 both used the 13D7.
     
  46. TheFallenPenguin

    TheFallenPenguin Notebook Geek

    Reputations:
    3
    Messages:
    85
    Likes Received:
    15
    Trophy Points:
    16
    They do, the non G-Sync ones at least. The newer 980m laptop models which support G-Sync all have the 1617 ID.

    GPUs that aren't G-Sync approved:
    • 980m - 13D7
    • 970m - 13D8

    GPUs that are G-Sync approved:
    • 980m - 1617
    • 970m - 1618

    Even if you do have a G-Sync approved GPU, you won't be able to use G-Sync if you don't have the following as well:
    • G-Sync license/cookie in BIOS
    • G-Sync approved display
    Older 970M and 980M laptops like the ASUS G751, MSI GT72 and Aorus X7 Pro all have the G-Sync approved display (LP173WF4-SPD1), but lack the GPU HWID change and the BIOS cookie, which is why they don't have G-Sync. The GPU HWID change and the BIOS license are both DRM measures put in place by Nvidia to stop owners of the 13D7 and 13D8 GPUs from also getting G-Sync.

    I'm no expert, but someone could probably enable G-Sync on their non G-Sync, 13D7/13D8, LP173WF4-SPD1 laptop by changing the resistors responsible for the HWID (flashing a new vBIOS?), and by flashing a custom BIOS that has the G-Sync cookie (Gamenab posted the cookies for all the laptops, but that's probably the reason his website got suspended).
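    For anyone unsure which variant they have, a quick way to check against the ID table above from Windows is to read the GPU's PnP device ID. A small Python sketch doing that is below (it assumes the wmic tool is available, and it only looks at the GPU ID; it says nothing about the BIOS cookie or the panel):
    [CODE]
    # Report which 900M variant Windows sees, per the ID table in this post.
    import re, subprocess

    IDS = {
        "13D7": "GTX 980M (non G-Sync)",
        "1617": "GTX 980M (G-Sync approved)",
        "13D8": "GTX 970M (non G-Sync)",
        "1618": "GTX 970M (G-Sync approved)",
    }

    # PNPDeviceID looks like PCI\VEN_10DE&DEV_13D7&SUBSYS_...
    out = subprocess.run(
        ["wmic", "path", "Win32_VideoController", "get", "PNPDeviceID"],
        capture_output=True, text=True, check=True,
    ).stdout

    for dev in re.findall(r"VEN_10DE&DEV_([0-9A-F]{4})", out.upper()):
        print(dev, "->", IDS.get(dev, "not in the table above"))
    [/CODE]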
     
    invertedsilence and Scerate like this.
  47. Brent R.

    Brent R. Notebook Evangelist

    Reputations:
    37
    Messages:
    673
    Likes Received:
    148
    Trophy Points:
    56
    Thanks for the info. I tried G-Sync on a laptop last week and wasn't impressed, but hey, that's just me; everyone has different eyes, and all sorts of different things cause people to see displays differently from one person to the next.
     
  48. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    SLI would be another clue, since G-Sync MXM cards don't work in SLI
     
  49. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,909
    Trophy Points:
    931
    Which is interesting, since technically that should not matter; seems like Nvidia being slow to approve things.
     
  50. Stooj

    Stooj Notebook Deity

    Reputations:
    187
    Messages:
    841
    Likes Received:
    664
    Trophy Points:
    106
    Did you not read my post?

    Pre-1617, the 980M GPUs appeared to only support eDP 1.2. No PSR available, problem solved. Maybe it is just a firmware flash (and accompanying ID change), but that is never something that nVidia will officially approve.
     