k.
next on the agenda: hostile takeover of nGreedia's headquarters.
- 
 
 I agree, boss.
 
 Also read my addendum as it pertains to you.
- 
 WOW.
 
 #nVidia
 #BestCompany
- 
 
They're making a strong push for this year's award.
 
![[IMG]](images/storyImages/golden-poo.jpg)
- 
Let me guess: the 990M won't support SLI either. I bet it'll be G-Sync-only too. =D
- 
 
I had the same problem too, but now it's gone.
I bought the P771ZM-G Sync without an OS. I installed Win 8.1, but without a UEFI setup in the BIOS, and the NVidia 353.30 driver.
Sometimes I got a black screen on startup. I shut the laptop off with the power button, and after starting it up again it booted in about 4 seconds, as if it had been asleep.
On every startup I saw "Resume from Hibernation".

I upgraded to Win 10 with the NVidia 355.60 driver and had the same problem again from time to time.
I read about the damaged Alienware notebooks and went back to Win 8.1.
BUT...

This time I did a new clean install of Win 8.1, now as a UEFI install, but with the same NVidia 353.30 driver.
And now my problem is gone. No black screen and no "Resume from Hibernation" for the last 7 days.
 
I checked my RAW EDID data with "Monitor Asset Manager", and it checks out on www.edidreader.com:
"valid checksum: TRUE"
 
@t456: I don't know why my problem is gone; perhaps hibernation was doing something wrong?
- 
 
Can someone please make a video showing how to remove the keyboard and access the 2 RAM slots underneath it? I want to get 32GB of RAM, lol...
- 
 
It's been almost a year since the release of DSR, and the SLI+G-Sync+DSR combo is still not possible. I think Nvidia is preemptively disabling SLI on G-Sync MXM cards until they figure that one out (if they ever do). Well, at least that's my hypothesis.
- 
I still feel that SLI is getting less and less love, from both nVidia's side and from developers'. I think they're probably waiting for Pascal to design their cards for SFR support or something. I cannot see how so many technologies just don't work with literally the only way to get more GPU power on a PC.
- 
 
 No disagreement from me here. SLI just isn't as viable as it was a few years ago when I got my Y500.
- 
Yeah. When I got it, I literally expected every game to use it to some degree (and so said, so done), and I was more surprised than not to hear that some games were bad with it. Fast forward to late 2014 and suddenly it's "meh, maybe SLI will work for these new games". Especially with the stupidly popular Unity engine (a single-threaded, single-GPU engine which I have NO IDEA why people like) and the blooming popularity of Unreal Engine 4, SLI is just going down the drain. Maxwell GPUs' issues with voltage and clockspeeds in SLI are another matter entirely, and the slaying of new tech (DSR + G-Sync + SLI as you said, as well as MFAA) and old tech (SLI-forced 64x CSAA, etc.) just makes me want to make sure people have at least a 980Ti before they even consider SLI.

Did I mention to you how I couldn't get MGSV: Ground Zeroes to use more than 70-80% of my GPUs, and I had to overclock my GPUs to get a constant 60fps at max settings? Literally, OCing my GPUs caused the FPS to rise and stick there (without drops), because ~75% of 850/5000 vs ~75% of 1006/6000 is a fairly large difference. It wasn't CPU-limited in the slightest.
- 
 
But MFAA replaces CSAA. Except MFAA doesn't work in SLI.

No, I don't remember you telling me that. That's really weird.
- 
I know, right?

Yeah, it surprised me too. But the proof is there; I alt-tabbed from the game, simply double-clicked my 1006/6000 profiles, and the FPS instantly went up and stuck there. Before, I was getting ~55fps with drops as low as 45; with the OC, the minimum was something like 58. My CPU was being used decently, but it wasn't near max, I wasn't limited at all, and CPU load didn't increase when I overclocked. The engine itself (at least with the 353.06 drivers) simply did not allow me to pass about 80% utilization, and mostly hovered near 70%.
 
 ![[IMG]](images/storyImages/Screenshot1869.jpg)  
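
A quick back-of-the-envelope check of those numbers (assuming FPS scales roughly linearly with core clock while the engine keeps utilization pinned at ~75%):

```python
# Rough scaling estimate for the MGSV: Ground Zeroes case above.
base_clock, oc_clock = 850, 1006   # core MHz, stock vs overclocked
base_fps = 55                      # observed average at stock

scale = oc_clock / base_clock                       # ~1.18x
print(f"clock ratio: {scale:.2f}x")
print(f"predicted OC fps: {base_fps * scale:.0f}")  # ~65, enough to hold 60
```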
- 
 
I recently purchased an ASUS ROG PG278Q G-Sync LCD. DisplayPort keeps dropping out while playing Witcher 3. I can't figure it out. Driving me nuts.
- 
Fastidious Reader Notebook Evangelist
Thinking on SLI, as it's a tech that has been with us since the late 90s (I had a Voodoo 2 PCI card that supported the feature), I wonder what hurdles still exist to getting it to work correctly. I remember watching someone play Far Cry 4, and their problem was that some effects were being processed twice, doubling their visual effect.
- 
 
About that Nvidia MXM GTX 980M: it does have a resistor of some sort to differentiate between the G-Sync and non-G-Sync versions, so I'm waiting for a new GPU card from them to replace mine. A driver from a recent update probably killed the GPU; they're still doing a post-mortem on it.
- 
It's probably because the rendering methods and engine tech, as well as the bandwidth requirements, have increased. For example, split-frame rendering and scissor-frame rendering were around back then, but now way too much data has to be transferred. Not only in required memory bandwidth, but in the amount of memory (64MB cards are really different from 2GB/4GB/8GB cards, after all =D).

Cards weren't meant to transfer this much data to each other across the PCIe interface, so we're stuck. nVidia, in all their beautiful "DX12! DX12! Yay DX12!" hullabaloo, "forgot" to design their current-gen "DX12 compatible" cards for such things, while AMD, the butt of all jokes, seems to be far more ready for it.
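
To put a ballpark number on "way too much data": just shipping finished 1080p frames between the cards eats a big chunk of a classic SLI bridge's bandwidth (often quoted at roughly 1 GB/s; that figure is an assumption here), before counting shared render targets or synchronization traffic:

```python
# Ballpark: bandwidth needed just to move completed frames between GPUs.
width, height = 1920, 1080
bytes_per_pixel = 4                # 8-bit RGBA
fps = 60

frame_mb = width * height * bytes_per_pixel / 1e6   # ~8.3 MB per frame
transfer_gbs = frame_mb * fps / 1e3                 # ~0.5 GB/s at 60fps

bridge_gbs = 1.0                   # assumed classic SLI bridge bandwidth
print(f"{frame_mb:.1f} MB/frame -> {transfer_gbs:.2f} GB/s")
print(f"share of assumed bridge: {transfer_gbs / bridge_gbs:.0%}")
```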
- 
That's because Microsoft adopted elements of Mantle to make DirectX 12. Have you seen how the 290X does against the 980 Ti in Ashes?
- 
That's a different story. That's because Maxwell GPUs apparently suck at parallel processing and AMD cards don't, so AMD pulled ahead greatly.
- 
That is because Maxwell was purposely gimped to hyper-focus on DirectX 11. Nvidia assumed it would still rock over AMD, and now blames the software designers as if Nvidia were infallible. Nvidia thought Microsoft would continue to suppress AMD; Mantle made that impossible once Microsoft had to respond quickly!
- 
 wat.
 
 I understand the gist of what you're saying, but you need sleep XD
- 
Although I only got 4 hours of sleep: they removed some functionality to tune their Maxwell architecture for gaming. What they removed from the prior design is what made them worse than AMD for bitcoin mining. The same thing that helps bitcoin mining, parallelism in part, is what makes AMD better at the new standard. The deal is that Nvidia f*ed themselves by hyper-focusing on DirectX 11. It's fact. AMD is cutting-edge on standards that get suppressed unless they're so good everyone adopts them. Here, Microsoft had no choice but to adopt asynchronous processing done the way AMD does it, while better utilizing the parallel nature of so many cores. Look it up!

Edit: this isn't to say Nvidia doesn't have good cards, just that its cards lose efficacy on a new API they didn't design for. AMD also developed hUMA and HSA in the 2011-2012 time frame; Intel and Nvidia helped slow adoption. Now Intel is backing FreeSync, and HSA will allow the CPU, to a degree, to utilize HBM2 on Zen APUs. It will be interesting to see if AMD beats Pascal, having both HBM2 and years of honing what makes it good at DirectX 12, Mantle, and Vulkan.
- 
 
 Small Spoiler from me   
 
 ![[IMG]](images/storyImages/4ghz5vqic.png)  
4.0GHz @1.02V = 61°C auto fan / 55°C max fan
 
 
 ![[IMG]](images/storyImages/45uzqn0.png)  
4.5GHz @1.14V = 70°C auto fan / 65°C max fan
 
XTU stress test with a 20°C room temp.
Look at my signature for what the temps were before, with the old mods!
At 4.5GHz the CPU runs 8°C cooler with max fans than before.

I also ran some Firestrike tests with the CPU at 4.5GHz and the GPU at stock, only as a temperature test.
Fan auto = GPU 48°C and CPU 65°C
Fan max = GPU 42°C and CPU 61°C
- 
 
Wow, what does your 4790K run on? That is such a low voltage... Even my 3940XM can't run at that voltage for those speeds...
- 
 
 In the specific case of Far Cry 4, the ghosting in SLI was because temporal SMAA was enabled. Ubisoft fixed it 5 months after release with patch 1.10.0.
 
 
- 
 
Yeah, I also found that odd... -30mV at 4.0GHz and then -80mV at 4.5GHz?!?
- 
 
@TomJGX
It's a selected 4790K that I bought from another user.
The max OC I've tested on my PC was 4.8GHz at 1.310V, Prime stable.
I didn't test any higher.

But in my P771ZM-G, 4.5GHz is enough.
I try to get good temperatures, as you can see in the screens above.
But these very nice temps don't come only from the mods (delidded CPU, polished heatspreader and heatsink).

I'll tell more once I'm done with my modification (WC).
For this I bought an extra new heatsink and bottom cover, so I can try some things.
- 
 
@Samot
Why not?

4.0GHz @1.024V is more than Prime stable. With 0.985V it's not Prime stable, only XTU stable.
4.5GHz @1.142V is Prime stable too. With 1.121V, only XTU stable.

But the lower voltages are Prime stable on my PC, and only XTU stable in my P771.
So I have to set the voltage a bit higher in the P771.
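
As a rough cross-check of why those two operating points land about 9°C apart: dynamic CPU power scales approximately with frequency times voltage squared, so the 4.5GHz point should dissipate noticeably more heat:

```python
# Estimate the dynamic power ratio between the two Prime-stable points,
# using P ~ f * V^2 (a rough approximation; leakage is ignored).
f1, v1 = 4.0, 1.024   # GHz, volts
f2, v2 = 4.5, 1.142

ratio = (f2 / f1) * (v2 / v1) ** 2
print(f"estimated dynamic power increase: {ratio:.2f}x")  # ~1.40x
```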
- 
 
It's just that I found those values (-80mV at 4.5GHz; the -30mV at 4GHz not so much) surprisingly good. But take caution, because that -80mV may be unstable at lower multipliers.
- 
Meaker@Sager Company Representative
Every chip is going to react differently, especially when they can have different stock voltages.
- 
 
Yep, that's right. Chrack seems to have a really good chip.
- 
Meaker@Sager Company Representative
Yes, looks that way. Some very nice clocks.
- 
DaveFromGameaVision Notebook Consultant
Is it possible to install a 120Hz panel on the P770ZM?
- 
Theoretically, if you get the P770ZM with the IPS panel (so it has the eDP connector), buy the LVDS LCD cover separately, and then buy the 50-pin eDP cable and the 120Hz screen separately, you should be able to install and mount it correctly.

As for whether it will WORK or not... I dunno.
- 
DaveFromGameaVision Notebook Consultant
Thank you for the info.

So this model would have the eDP port, right? I've got an Alienware 17 with the 120Hz screen right now; would the cable/panel from it possibly work with the P770ZM? And what is the LVDS LCD cover?
- 
 That model would work, correct.
 
The LCD cover needs to be the one that houses the Chi Mei Innolux and AUO panels. It's NOT the same one that houses the LG IPS panel. You should be able to order the LCD cover from Eurocom (or maybe from RJTech, if you can find it on their site). I don't know what your AW17 uses... if it's the LG LP173WF2-TPB1, then yes, it will work. You can use MonInfo and/or HWiNFO64 to check whether the info matches up with that LG panel on www.panelook.com.
- 
DaveFromGameaVision Notebook Consultant
Looks like I've got a Samsung panel right now, which I don't think is compatible because it's a 40-pin connector, not a 50-pin like the LG you listed. So I would need the LG panel, a new cover and a cable, I assume? Do you know the part number for the cable? Has anyone on here upgraded to 120Hz? I've looked but I can't find anything.

edit: According to Notebookcheck the stock screen is this LG model, correct? It's listed as an eDP 2-lane screen; does that mean I can just plug the 120Hz LG in even though it is a 4-lane?
- 
 No sorry, you're out of luck.
 
You need the IPS laptop SPECIFICALLY, because your board won't have the eDP connector otherwise, according to most people who checked.
- 
DaveFromGameaVision Notebook Consultant
Yeah, I figured the Alienware display working was a long shot, oh well. This is what I've got so far:
- I need the IPS version of the P770ZM so I have an eDP port.
- The P770ZM IPS screen is this model; I want to replace it with this model (or the glossy version, depending on preference).
- Do I need a new cable connecting the display panel to the motherboard? The stock screen is a "2 lane" and the 120Hz is a "4 lane".
- I will need a new LCD cover (bezel?) from the non-IPS version to match the different-sized screen.
 
- 
1 - Correct. I.e. the P770ZM-G from Sager/Myth/etc., or the Eurocom model with the IPS panel selected.
2 - Correct.
3 - Yes, you need a 50-pin (4-lane) eDP cable; that IPS panel uses a 30-pin (2-lane) eDP connector.
4 - Yes, you will need the LCD cover from the non-IPS version, the one that supports the Chi Mei and AUO panels. The reason is that the IPS panel has a different mounting orientation, but the P37xSM and P37xSM-A models used the same Chi Mei and AUO panels as the P770ZM, as well as the 120Hz panel, all in the same LCD cover. So any LCD cover that can fit both the Chi Mei and AUO panels should fit the 120Hz panel.

If you're willing to do the necessary modifications and attempt to get a 120Hz panel into that P770ZM, you'd be doing this whole place a favour that nobody else has bothered going through XD.
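
For anyone wondering why point 3 forces the 4-lane cable: a 1080p/120Hz modeline needs a pixel clock around 285 MHz, which is more payload than two eDP lanes can carry at the common HBR rate. A rough check (the 2.7 Gbit/s HBR lane rate and 8b/10b encoding are assumptions; the actual panel link may differ):

```python
# Why 2 eDP lanes aren't enough for 1080p at 120Hz (rough estimate).
pixel_clock_mhz = 285.53                    # typical 1080p/120Hz modeline
bits_per_pixel = 24
payload_gbps = pixel_clock_mhz * bits_per_pixel / 1000   # ~6.85 Gbit/s

lane_eff_gbps = 2.7 * 8 / 10   # assumed HBR rate after 8b/10b overhead

for lanes in (2, 4):
    capacity = lanes * lane_eff_gbps
    verdict = "OK" if capacity >= payload_gbps else "not enough"
    print(f"{lanes} lanes: {capacity:.2f} Gbit/s vs {payload_gbps:.2f} needed ({verdict})")
```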
- 
DaveFromGameaVision Notebook Consultant
Alright, that is awesome news. I'm seriously considering picking one of these up; I'm tired of Dellianware restrictions and I'd like to see just how far my 980M will go. Where is the best place to get the eDP connector? I'd rather pick up that and the LCD first, and worry about mounting once it's confirmed working.
- 
Eurocom is expensive, but will sell you the parts.
- 
What is your Alienware 3D display? Is it from LG? Did you run HWiNFO and/or MonInfo?
- 
DaveFromGameaVision Notebook Consultant
Yeah, HWiNFO confirmed it's a Samsung.
- 
 And is it eDP or LVDS? MonInfo should also clarify this.
- 
DaveFromGameaVision Notebook Consultant
It's eDP; here is a picture of the motherboard connectors.
 
 Monitor
 Manufacturer............. Samsung
 Plug and Play ID......... SEC5044
 Data string.............. GN36T€173HT [*CP437]
 Serial number............ n/a
 Manufacture date......... 2012, ISO week 1
 Filter driver............ None
 -------------------------
 EDID revision............ 1.4
 Input signal type........ Digital (DisplayPort)
 Color bit depth.......... 6 bits per primary color
 Color encoding formats... RGB 4:4:4
 Screen size.............. 380 x 210 mm (17.1 in)
 Power management......... Not supported
 Extension blocs.......... 1 (CEA-EXT)
 -------------------------
 DDC/CI................... Not supported
 
 Color characteristics
 Default color space...... Non-sRGB
 Display gamma............ 2.20
 Red chromaticity......... Rx 0.553 - Ry 0.318
 Green chromaticity....... Gx 0.352 - Gy 0.586
 Blue chromaticity........ Bx 0.165 - By 0.110
 White point (default).... Wx 0.313 - Wy 0.329
 Additional descriptors... None
 
 Timing characteristics
 Range limits............. Not available
 GTF standard............. Not supported
 Additional descriptors... None
 Preferred timing......... Yes
 Native/preferred timing.. 1920x1080p at 60Hz (16:9)
 Modeline............... "1920x1080" 146.870 1920 1968 2000 2140 1080 1083 1088 1144 +hsync -vsync
 
 Standard timings supported
 
 EIA/CEA-861 Information
 Revision number.......... 1
 IT underscan............. Not supported
 Basic audio.............. Not supported
 YCbCr 4:4:4.............. Not supported
 YCbCr 4:2:2.............. Not supported
 Native formats........... 0
 Detailed timing #1....... 1920x1080p at 100Hz (16:9)
 Modeline............... "1920x1080" 237.940 1920 1968 2000 2080 1080 1083 1088 1144 +hsync -vsync
 Detailed timing #2....... 1920x1080p at 110Hz (16:9)
 Modeline............... "1920x1080" 261.730 1920 1968 2000 2080 1080 1083 1088 1144 +hsync -vsync
 Detailed timing #3....... 1920x1080p at 120Hz (16:9)
 Modeline............... "1920x1080" 285.530 1920 1968 2000 2080 1080 1083 1088 1144 +hsync -vsync
 
 Reserved general related data
 
 Report information
 Date generated........... 9/1/2015
 Software revision........ 2.90.0.1000
 Data source.............. Real-time 0x0100
Operating system......... 6.2.9200.2
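
Those modelines are internally consistent, by the way: the pixel clock should equal total horizontal pixels times total vertical lines times the refresh rate. A quick check against the dump above (the computed values land within ~0.02 MHz of the reported ones, since the refresh rates are nominally, not exactly, 60 and 120Hz):

```python
# Verify pixel clock = h_total * v_total * refresh for the modelines above.
# h_total/v_total are the last horizontal/vertical values in each modeline.
modelines = [
    (2140, 1144, 60, 146.870),    # native/preferred 60Hz timing
    (2080, 1144, 120, 285.530),   # detailed 120Hz timing
]

for h_total, v_total, hz, reported_mhz in modelines:
    computed_mhz = h_total * v_total * hz / 1e6
    print(f"{hz}Hz: computed {computed_mhz:.3f} MHz, reported {reported_mhz:.3f} MHz")
```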
- 
 Okay, thanks very much.
- 
Meaker@Sager Company Representative
Look at all of those display chips near the connectors!
- 
DaveFromGameaVision Notebook Consultant
Alright, I think I found the cable part number: 6-43-P37E1-020-J. It's expensive ($155)! Plus the screen for ~$90. Any input on glossy vs matte? My AW17 screen is glossy and I don't mind it; is the matte coating pretty high quality? Any idea what part number I'm looking for on the cover, or what to ask for?

The non-G model works as well, right? I've already got a 980M, which will not work with the G model.

edit: Apparently the P770ZM is discontinued... does anyone know where else I could get a barebones one besides RJTech? I've already got a 4790K in one of my computers that I was going to use... I guess I have to wait and see if the P770DM gets a screen upgrade.

second edit: If I got the G-Sync model, could I theoretically upgrade it to 120Hz, or is G-Sync dependent on the display panel?
- 
The non-G IPS panel model will work, but only Eurocom offered those.

The P770ZM is discontinued for the time being due to Skylake's existence; however, I can't promise the Skylake refresh will have the same LCD cover that houses the LVDS chips. Eurocom might be able to help you out with the specifics of the system; they always have all the options, but they'll be expensive. You should be able to either use your own CPU or send them your CPU to use, if you want.

I believe it's display-panel dependent, but a BIOS upgrade might be able to handle that, assuming you have the G-Sync 980M *hint at Prema*