The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    ***The Official MSI GT83VR Titan SLI Owner's Lounge (NVIDIA GTX-1080's)***

    Discussion in 'MSI Reviews & Owners' Lounges' started by -=$tR|k3r=-, Aug 13, 2016.

  1. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    Yeah, I'm not sure if I should upgrade, assuming I can flip whatever laptop I swap cards with for close to the price I bought it for, or if I should wait for the next-gen GT83VR to come out. 8th-gen GT83VR models just came out earlier this year, but MSI still went with 60Hz 1080p screens even though they gave their GT73/GT75 models 120Hz screens. Why they did this I have no idea, but it's odd that they gave 120Hz screens to laptops that can't fully utilize them, while only giving an SLI laptop, which can fully utilize 120Hz, a 60Hz screen...
     
    hmscott likes this.
  2. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    There is only one other manufacturer that makes 18-inch screens, and they won't share.


    Upgrading from 1070s to 1080s is a bit moot. IMO you won't see enough gains to justify it.

    Hell, I have dual 1080 cards, and a lot of games don't stress them much unless I start using DSR or the like.
    Sent from my SM-G930W8 using Tapatalk
     
    hmscott, bennyg and gt83vr6reHelp like this.
  3. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    Yeah, DSR definitely pushes the cards to the max. I noticed I would go from about 60% usage at 1080p in Crysis 3 to 99% usage at 4K DSR. It's pretty crazy how that game still pushes cards to the max (it's somewhat CPU-heavy too). Ah, interesting, I wasn't aware of that. Hopefully MSI can figure it out for the next model, or just go with an even bigger screen.

    I think MSI has figured out the best internal engineering for cooling on the GT Titan SLI models (if only they made a separate exhaust for the CPU instead of sharing one with the second GPU). MSI also seems to have the lowest overall temperatures among SLI laptops. They would have the ultimate model if they combined their internal/external structure (plus the suggested cooling change) with a bigger and better screen. I'm thinking of a product similar to the Acer Predator 21 (Acer's cooling is terrible on that model or I'd have gotten it; every test of the most demanding games I've seen on the Predator 21 has the CPU and 1080 GPUs screaming at around 85c-90c). Maybe I will write a letter to MSI to share suggestions for the next GT SLI Titans. Whatever they do with their new flagship model next year, I hope they keep the easily removable, convenient hood and the full-size mechanical keyboard (keeping the keyboard in the same location, too).
     
    Last edited: May 20, 2018
    hmscott likes this.
  4. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    I see you found the 1080 cards. Yes, they will work IF THEY ARE THE CORRECT FORM FACTOR - there are at LEAST two versions of the MSI 1080 with the GPU in different locations. But how are you going to get the heatsinks? Good luck finding GPU AND VRM heatsinks for the 1080s!

    A 780W Eurocom PSU, or 2x 330W PSUs + the MSI adapter, would be REQUIRED.
     
  5. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    I was under the impression that the 1070 heatsinks were compatible with the 1080. Hmmm. I don't get why they make an upgrade cost as much as a brand-new 1070 SLI laptop, lol. This makes me want to buy a new laptop instead. I think MSI should rethink their pricing for upgrades. If owners end up purchasing a new laptop instead, the secondary market gets flooded with old laptops, and MSI makes no money off the secondary market. If the secondary, non-retail market is full of options, MSI has a harder time selling brand-new laptops when someone can buy a slightly used model for about half the price of a new one. MSI should think about this and consider lowering GPU upgrade costs. I suppose I'll include this in my letter to them.
     
    hmscott likes this.
  6. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    MSI tried to go the upgradeable-GPU route a couple of years ago, but when they tried to scale the idea up, it was abandoned. Whether it was too risky to have the upgrade done by the owner - it was rumored that MSI required the owner to send the laptop in under the scaled-up plan - or MSI figured out that the manpower and plant resources needed would be too high, MSI decided to provide a trade-in upgrade instead.

    The MSI trade-in upgrade is still going for those that bought MXM GTX 9xx GPU models like the GT72 and GT8x, and they got two generations of GPU upgrades: to Pascal, and then to ???? - whatever name Nvidia goes with next.

    Unfortunately, MSI stopped that idea with the GT72 / GT8x series, so whatever you buy, that's it; you need to sell it on to upgrade to the next generation of models.

    If you think about it, what happened makes sense. The next generation of GPUs, Pascal, had requirements not planned for in the design of the previous generation - thermal, power, control - different enough to require engineering resources best used to make new laptops at scale.

    It just made more sense to put the old laptop into an established refurb program and recoup value by selling it on to a new owner, and to build new laptop models on the standard production lines, instead of creating a third production line for upgrades.

    The cost of Pascal MXM cards, and the time and effort to do the upgrade even in LGA laptops "designed to be upgraded", is a pain. Then you have the old GPUs to sell on instead of a whole laptop, and so the upgrade pain is passed on to the people wanting your old MXM card, until the last schmuck is stuck with an old MXM card that no one wants.

    It's overall better value and less resource cost to sell your old laptop and buy a new one. :)
     
  7. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    O i totally understand why MSI wasn't able to uphold the promise of the old gen laptops being upgradable to the next gen of gpus, no one could have foreseen what the pascal requirements were(On the other hand, if they didn't know what the pascal requirements were, why would they make a promise they weren't confident they could keep?-i'm sure they learned from this mistake and it sounds like they've compensated for this via the trade-in program). I just think in general, a gtx 1080 mobile card should not cost 1,299 USD(that's more than DOUBLE of what a desktop gtx 1080 costs not to mention the mobile card is slightly weaker than the desktop version). Like, 2 cards would cost me about 2.5k since a couple places offer a discount on the second card if you are buying 2. That's not factoring in the SLI bridge, heatsinks, new power supplies, thermal paste and labor(plus anything else I forgot). It will literally come out to over 3k USD just to upgrade from 1070s to 1080s. It's ridiculous. I was heavily considering buying either a 1080 SLI model or two separate 1080 laptops, swapping out the gpus and then reselling the units listed with 1070s taking a very minimal loss but needing the heatsinks/power supplies make it not worth it. I think i'm going to wait for next year's gt83vr flagship model(hopefully MSI makes more updates to it than just the processor, I took a look at this year's model and it's a bit dissapointing as they cut the ram in half while only swapping out for the latest processor but still charging the same price.) They need to go big with 18.4"-21" monitors @ 120hz or at least a 4k monitor @ 60hz). I paid 1800 for my gt83vr-6re, 40 for express shipping, 200 for the repaste, 80 for a 2nd NvMe drive, 100 for the m.2 sata 500gb drive and 300 for the 32gb hyperx ram upgrade-I can only see myself sinking another 600 USD into this thing.
     
    Last edited: May 20, 2018
    hmscott likes this.
  8. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    The MXM costs have always been outrageous, with few "deals" or exceptions; even years after the MXMs have stopped being in current production, the prices stay high.

    There aren't enough individual buyers to purchase MXM cards - usually they are meant for the service market, not for upgrades - and the volume doesn't motivate makers to run enough production to reach any efficient scale, so they are few and far between, and costly.

    Desktop GPUs are made in the thousands and are constantly being built in new production runs. Not so for MXM GPUs outside of contracted quantity purchases - which probably go for 25%-50% of the spot / individual cost. Most MXMs are probably made by the vendor using them in their own laptops or mini-desktops, and aren't intended for individual sale / use.

    There really isn't any intention to create an MXM upgrade market; most vendors aren't making an upgrade market happen. It's a side effect of MXM GPUs being made available for service replacements.

    Since there is a small level of active interest - people who would actually follow through with the desire to upgrade - there is a small support network of people who provide service MXMs to enthusiasts, but there isn't any chance of that being "cheap" for current-production GPUs.
     
    Kevin@GenTechPC likes this.
  9. Kevin@GenTechPC

    Kevin@GenTechPC Company Representative

    Reputations:
    1,014
    Messages:
    8,500
    Likes Received:
    2,098
    Trophy Points:
    331
    Agreed with Scott. Investing in MXM is neither worthwhile nor cost-effective, and because technology changes every day, we are seeing it downplayed by manufacturers.
    Even if a machine can be upgraded, to manufacturers there are costs associated with performance testing, power testing, thermals, BIOS, compatibility, WHQL qualification, etc., since it's a full system... too many details to go into here.
    No enthusiast wants to see this, but unfortunately that is the direction many big industry players are going.
     
    Pedro69 and hmscott like this.
  10. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    The 3 Jeyi 2mm heatspreaders arrived today from China. Here is a temperature update with the heatspreaders installed on the 2 M.2 NVMe drives and the M.2 SATA 3 drive.

    For some reason my Crucial MX500 500GB SATA 3 M.2 drive was hitting temps as high as 88c during CrystalDiskMark tests. After installing the heatspreader with 0.5mm pads on top and a bare 1.0mm pad on the bottom, temps went down to 65c during CrystalDiskMark tests. Idle temperatures went down 3c.

    During CrystalDiskMark tests, the stock Toshiba NVMe drive topped out at 60c and the Samsung PM961 drive topped out at 56c, down by about 7c-10c each. Idle temperatures for the Samsung drive went down 9c, from 47c to 38c, and idle temperatures for the Toshiba drive (the Windows boot drive) went down 2c, from 52c to 50c. I'm curious why my Toshiba NVMe is running warmer than the Samsung (could it be because it is the boot drive?).

    I think 20 dollars for all 3 heatspreaders/pads is a worthy investment for a little extra cooling on M.2 drives, as laptop cooling isn't as good as a desktop's. The GT83VR in general doesn't have much ventilation for the M.2s, so every little bit helps. The Jeyi 2mm spreaders + 0.5mm pads fit perfectly under the hood with space to spare.

    On a side note, the userbenchmark.com test keeps getting stuck during the 4K mixed test for the Crucial MX500 drive. The entire benchmark used to complete before I installed the heatspreaders (CrystalDiskMark tests still complete just fine, though). Any idea why this might be happening?
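    If anyone wants to log drive temps over a whole benchmark run instead of just catching the peak, here's a minimal sketch using Python plus smartmontools - this assumes smartctl 7.0+ (for its JSON output) is installed and on the PATH, and the device path is a placeholder you'd change for your own drive:

    ```python
    # Minimal drive-temperature logger - a sketch, assuming smartmontools
    # (smartctl 7.0+ for JSON output) is installed and on the PATH.
    import json
    import subprocess
    import time

    DRIVE = "/dev/nvme0"  # placeholder; point this at your own drive

    def read_temp_c(drive: str) -> int:
        # -A dumps the SMART attributes, -j requests JSON output
        out = subprocess.run(
            ["smartctl", "-A", "-j", drive],
            capture_output=True, text=True, check=True,
        ).stdout
        return json.loads(out)["temperature"]["current"]  # degrees C

    if __name__ == "__main__":
        while True:
            print(f"{time.strftime('%H:%M:%S')}  {read_temp_c(DRIVE)}c")
            time.sleep(5)  # sample every 5s while the benchmark runs
    ```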
     
    hmscott likes this.
  11. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    Personally I'd call [image: 143.jpg] on that one.

    @Kittys has a lot of experience with SSDs and stuff. Maybe she can help?
     
  12. Windfurian

    Windfurian Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    4
    Trophy Points:
    6
    As a recent owner of a GT83 Titan 8RG, I am finding myself very happy with this beast. My previous laptops were a GT72 and a GS73, and once I touched the keyboard I knew this was the next model for me.

    I actually use this on my lap, which sounds a little crazy. The only problem with this is that the damn dongle for the dual supplies hangs and tends to pull out, so I am looking at the Eurocom as an upgrade; but I am based in NZ, and it looks like no one wants to ship them down here, so I might have to get creative with that.

    The main question I have: does anyone have any baselines for overclocking this model?
     
    gt83vr6reHelp and hmscott like this.
  13. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    From what I heard, the BIOS is hardcoded so that you must use the two power bricks. I'm not sure if this changed for the 8th-generation model. Congrats on your purchase; the keyboard is also what attracted me to the GT83VR. Welcome to the sick baller club.
     
    Last edited: May 23, 2018
    hmscott likes this.
  14. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    Someone said that the Eurocom 780W PSU works on the "dual" power brick versions. It HAS been tested on the i9-8950HK version of the GT75 Titan without any strange throttling (whereas running on just ONE of the two default power bricks, e.g. if one is disconnected intentionally, causes extreme power throttling). Can't speak for the GT83 units, though.
     
    gt83vr6reHelp likes this.
  15. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    Here is a thread with information about OCing the i7-8850H: http://forum.notebookreview.com/thr...-i7-8850h-i9-8950h-coffee-lake.810891/page-45 - hmscott chimes in towards the bottom of the page. I think 4.5GHz as a daily driver is doable. @Windfurian, do you have XTU?

    Once you find a stable OC for your processor, we can talk about a potential undervolt on the core voltage to lower CPU temperatures by 5c-10c. After that, we can OC the monitor to increase your refresh rate by 50%, going from 60Hz to 90Hz or even 100Hz. Monitors OCed to 100Hz seem to be the stable norm for GT83VR models, with only one user reporting a 120Hz OC. Might as well take advantage and OC your monitor, since you have dual GTX 1080s to put out the frames. Games look buttery smooth at a higher refresh rate; you'll love it. Keep us updated on your progress and let us know if you need any guidance or help.

    Some programs you may want to download if you haven't already:
    XTU (for OCing & monitoring your processor; also for stress-testing the RAM/processor)
    HWiNFO64 (all-around monitoring, fantastic program)
    MSI Afterburner (GPU & CPU monitoring, GPU overclocking)
    CrystalDiskMark & CrystalDiskInfo (SSD/HDD monitoring & testing)
    3DMark Fire Strike (GPU benchmarking/stress tests; they also have versions for 1440p and 4K)

    On a side note, what games do you like to play? Have you given your machine the Crysis 3 test yet? Crysis 3 is one of the most GPU *and* CPU intensive games on the market. At 4K DSR with max settings, my two GTX 1070s scream at 99% usage and average about 60 fps, and CPU usage is also pretty high. I view this game as a real-world stress test for computers. You'll have no problem crushing it with dual GTX 1080s + an i7-8850H, but I'm curious how big of a difference your GPUs/CPU make compared to my dual 1070s / i7-6820HK OCed to 4.0, and what the usage levels are at.
     
    Last edited: May 23, 2018
    Pedro69 and hmscott like this.
  16. Windfurian

    Windfurian Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    4
    Trophy Points:
    6
    Clocked up to 4.3 without any hassle last night, so I will give 4.5 a try tonight, but so far everything has been effortless.

    Game-wise it has pretty much only been BDO for now, plus a little bit of Conan Exiles. Sadly I haven't had a lot of time for gaming with it, and it has been used more for office work (which it absolutely crushes), but I have a long weekend starting tomorrow, so I will load up some more stressful tests for it.
     
    hmscott and gt83vr6reHelp like this.
  17. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    Has anyone here ever used a RAM cache for their SSDs? I just enabled Momentum Cache (Crucial's built-in RAM cache feature) for my Crucial MX500 M.2 SSD and my speeds are 8-10x faster: read/write went from 500MB/s to 4500MB/s. This thing is now faster than my NVMe drives. I'm thinking of configuring a RAM cache for both of my NVMe drives. Does anyone have experience using a RAM cache for their boot drive?
     
  18. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    The Eurocom power brick works with our laptops. Problem is, it's a $400 upgrade. Lmao.

    The hardcode, if I recall, is to draw a certain number of watts.


    I used a RAM disk for a while. Great for loading games.

    I wouldn't recommend using it for a boot drive as it will wipe the drive every time you shut down the computer.

    Sent from my SM-G930W8 using Tapatalk
     
    Last edited: May 25, 2018
    gt83vr6reHelp likes this.
  19. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    Thank you for the clarification on the Eurocom PSU.

    I definitely wouldn't use a RAM disk as a drive for an OS, as it would take time to reload the RAM disk with Windows on every boot, as well as to save it back to another location on every shutdown/restart. It would also take a ton of RAM to do. I was thinking more along the lines of a 2-4GB RAM cache for all the temp files and a 2-4GB RAM cache for my other SSD. Anyone else use a RAM cache for stuff like this?
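    For anyone wondering where numbers like the 8-10x above come from: a write-back RAM cache acknowledges writes the moment they land in RAM and only pushes them to the SSD later, which is also why a power cut before the flush loses whatever was buffered. Here's a toy sketch of that idea - purely illustrative, not how Crucial's Momentum Cache is actually implemented:

    ```python
    # Toy write-back cache - illustrative only, not Crucial's implementation.
    # Writes return at RAM speed; the SSD only sees them on flush, which is
    # why benchmarks look so fast and why a power cut can lose buffered data.

    class WriteBackCache:
        def __init__(self, backing_file: str, limit_bytes: int = 4 * 2**20):
            self.backing_file = backing_file
            self.limit = limit_bytes
            self.buffer: list[bytes] = []
            self.buffered = 0

        def write(self, data: bytes) -> None:
            # "Fast" path: append to RAM and return immediately.
            self.buffer.append(data)
            self.buffered += len(data)
            if self.buffered >= self.limit:
                self.flush()  # spill to disk once the RAM budget is hit

        def flush(self) -> None:
            # "Slow" path: the actual SSD write, deferred until now.
            with open(self.backing_file, "ab") as f:
                f.writelines(self.buffer)
            self.buffer.clear()
            self.buffered = 0

    cache = WriteBackCache("scratch.bin")
    cache.write(b"x" * 4096)  # acknowledged at RAM speed
    cache.flush()             # data is only durable after this
    ```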
     
  20. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    Does anyone know what the safe operating temperatures are for the MS-1815 motherboard inside the GT83VR-6RE? I couldn't find anything on Google.
     
  21. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    As in?

    To my understanding, the motherboard would have the same operating temperature tolerances as the CPU/GPU, or slightly higher.
     
    hmscott and gt83vr6reHelp like this.
  22. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    Well, right now I have a portable Packed Pixels monitor plugged in and powered via a Thunderbolt 3 adaptor + extender while watching a Twitch stream, and motherboard temps are about 65c/66c while OCed to 4.0GHz. Is this high? I noticed that an extra monitor powered by my computer raises the temps of everything by about 10c. Do I have anything to worry about? Any idea what healthy temps for our motherboards are? I couldn't find any information on the MSI website regarding the MS-1815 motherboard.
     
    hmscott likes this.
  23. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    To be honest, I have no idea. I know most capacitors and VRMs can go up to 100-120c for bouts at a time, but no one anywhere seems to list motherboard temperature limits; the scant few things I can find suggest that would be a fine temperature band to be within.
     
    hmscott and gt83vr6reHelp like this.
  24. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    Gotcha, thanks mate.
     
    hmscott likes this.
  25. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    No problem, that's mostly what I lurk around for nowadays. lol
     
    hmscott and gt83vr6reHelp like this.
  26. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    Happen to know where I can find red or black SODIMM heatspreaders? I haven't had luck on Google. I know they won't offer much in cooling; I want them mainly for cosmetic purposes for when I pop the hood.
     
    hmscott likes this.
  27. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    Not too sure, to be honest. Other than a swap from an HDD to an SSD, I've kept my system stock, mostly due to not having either the money or the time.
     
    hmscott likes this.
  28. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    Gotcha. I just swapped my HDD for a HyperX Savage SSD (only because it matches the MSI color scheme - so much OCD here, haha). So far I have 3 red heatspreaders for the M.2/NVMe SSDs and the red/black/silver HyperX Savage 2.5" SSD. I plan on covering the silver metallic portion of the Blu-ray drive with either red/black paint or a red/black sticker.

    Last thing on the list is the RAM. It's HyperX Impact RAM (black PCBs), but it has the stickers/chips exposed, so I want to cover it with something that matches the color scheme. I may have to make something out of thin aluminum sheets plus a coating of red/black paint, since SODIMM heatspreaders are very uncommon. I'm also thinking of getting thin mesh grates and painting them red to cover the extra oddball-sized openings underneath the hood. Still working on finding someone who can make a transparent fiberglass hood so that everything can be displayed like a PC with a glass tower. This is a fun project so far, and I can't wait till it's finished.
     
    hmscott likes this.
  29. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    Nice. I thought about trying to find somewhere I could swap the cover above the keyboard for some kind of mesh or slotted vent to help draw air over the NVMe drives, but I couldn't find enough data to support doing it, or anywhere to actually have it done.

    Sent from my SM-G930W8 using Tapatalk
     
    hmscott likes this.
  30. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    I think hmscott mentioned that someone drew power from the 2.5" SSD hookup in order to supply a very, very small fan for extra SSD cooling. I'm not sure how much cooler things ran with the added fan; I'm sure it made at least a 5c or possibly 10c difference on the NVMe drives, though. I don't know anything at all about accomplishing such a feat, as I'm still very much a newb at all of this, but maybe get in touch with @hmscott to see if he (or she? the profile picture has me confused, sorry if I got it wrong) can give you a few pointers or connect you with the person who did it. I'm personally not comfortable doing that type of mod myself, and I'm not too comfortable paying anyone to do it unless they had a fair bit of experience with this specific model. I ended up settling for 2mm red heatspreaders with 0.5mm thermal pads, which dropped M.2 temps under load by around 10c.
     
    hmscott likes this.
  31. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    The key to any such thing is a multimeter and continuity tests.
    You need to know which pins are wired to what - ground, 12V, 5V, etc. - and the amp rating, so you know how much power is available (remember watts = volts * amps). You can't just hook up a 3.3V, 5V or 12V fan without knowing the amp rating of the rails.

    Then some wiring practice and you're good to go.
    Electronics classes can give you good fundamentals for this. For those who weren't so lucky, you can learn online and experiment on old hardware where you don't care about the magic smoke coming out.
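    To put numbers on the watts = volts * amps point, here's a quick sketch. The rail and fan figures are made-up examples, not measured values for the GT83VR's connectors - measure your own rails before wiring anything:

    ```python
    # Quick power-budget check for tapping a drive-bay connector for a fan.
    # The rail and fan figures below are illustrative examples, NOT measured
    # values for the GT83VR - verify your own rails with a multimeter first.

    def power_w(volts: float, amps: float) -> float:
        # watts = volts * amps
        return volts * amps

    rail_budget = power_w(5.0, 1.5)   # e.g. a 5V rail rated for 1.5A -> 7.5W
    fan_draw    = power_w(5.0, 0.18)  # e.g. a small 5V 0.18A fan -> 0.9W

    print(f"rail budget: {rail_budget:.1f}W, fan draw: {fan_draw:.1f}W")
    print("ok to wire" if fan_draw < rail_budget else "do NOT wire this fan here")
    ```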

     
    gt83vr6reHelp likes this.
  32. Andest2003

    Andest2003 Notebook Enthusiast

    Reputations:
    0
    Messages:
    19
    Likes Received:
    8
    Trophy Points:
    6
    Hi

    I have a GT83VR 6RF that I have had for a while but have hardly used. Can someone help? When switching from single GPU to SLI GTX 1080 (or the other way round), I always have a problem: it does not just switch like my PC does. It goes to a black screen and I have to push the power button to get out of Windows and then log back in. Sometimes this resets the display resolution, so I then restart, and it will have switched. Do others have this issue?

    Next question: I had a look at overclocking the screen. I tried some lower settings via the Nvidia Control Panel custom screen setup and ended up at 100Hz, so I thought I would try it for a bit. I played Far Cry 5 for about 20 minutes and all appeared OK; I also tested to make sure it was not dropping frames, and again all looked good. I tried another game later, World of Tanks, and on the loading/log-in screen it displayed a shaky image showing 4 copies of the picture, one in each corner. I pushed the power button to log back in, and my desktop display appeared OK. I then went to switch back to 60Hz, and when I did, my screen started shimmering towards the top on bright colours; I thought I had killed it. It eventually put itself right and stopped after a couple of restarts and a driver reload. I have tried again since, and sometimes it starts the game fine, and other times it has the same problem. Is it because 100Hz was too high, even though it was not skipping any frames? I decided to ask you guys first before I had another play.
     
    hmscott likes this.
  33. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yeah, it sounds like 100Hz is a bit too high on your unit; try dropping down to 95Hz or so - 90Hz is already a 50% higher refresh rate than 60Hz. :)

    IDK about the resetting of the SLI options, as I never needed to do that to get games to work in SLI or single mode; they just did what worked for them.

    Reinstalling the Nvidia driver will reset global options, including resolution, etc., so maybe that is what is happening?
     
  34. Andest2003

    Andest2003 Notebook Enthusiast

    Reputations:
    0
    Messages:
    19
    Likes Received:
    8
    Trophy Points:
    6
    It's the way I have to switch it from single GPU to SLI. On my main system, the monitor goes off a couple of times, then comes back on, switched.

    On my laptop, it ends up on a black screen; I then have to push the power button to log me out. I then log back in and it's switched, but sometimes the resolution is different. It seems very different from my full-size PC. I will try a different driver and see if it's that.
     
  35. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    I did play with it myself long ago on the GT80, but I don't recall having any issues. That was on Windows 7 / 8.1, though, not on Windows 10.

    There have been lots of reports, repeating over several years now, of black screen issues with Windows 10 getting solved by Windows 10 updates or Nvidia driver updates, and then reportedly being broken again by another Windows 10 update or Nvidia update. It's been a long-term PITA for everyone.

    That constant Windows 10 update + Nvidia update breakage is one of the many reasons I don't run Windows 10.

    Do you have to switch from single GPU to SLI? I recall no performance advantage between leaving SLI on vs. single GPU in games only supporting 1 GPU. I thought disabling one might provide more power headroom for the remaining GPU to give better performance, but on the GT80 I didn't see anything beyond normal test-run variance.
     
    Last edited: Jun 11, 2018
  36. Andest2003

    Andest2003 Notebook Enthusiast

    Reputations:
    0
    Messages:
    19
    Likes Received:
    8
    Trophy Points:
    6
    I have been playing with single-card and SLI benchmarks in Far Cry 5, as on my main PC (SLI GTX 1080 Ti) I have been having issues with the Nvidia driver. I have been getting rubbish SLI performance on every new driver since the first Far Cry 5 Nvidia update, which was great and gave me much better fps in SLI. Since then, I have been getting much lower fps in SLI than on a single card. The single card has been better, and it seems to be down to the driver.

    I was trying to see if my laptop was doing the same, or if it was getting better fps in SLI. I don't normally switch out of SLI on the laptop, but to be honest I've not used it a great deal; it was a way to see how the new Nvidia drivers were working on the laptop. That's when I noticed it was switching very differently from my main PC. I just thought I would see if anyone else had this on their SLI 1080 laptop.

    I have also noticed that when connected to a 4K TV, it does not like to switch from 1080p to 4K, and likewise from single card to SLI or vice versa. I thought it was just down to TV support.
     
    hmscott likes this.
  37. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    That sounds like Windows 10. Since the last update I've been having a lot of issues with scaling (I switch between a 4K 60Hz TV, a 1440p ultrawide @ 95Hz and 1080p @ 95Hz), and it's been giving me all sorts of issues, plus Windows crashes... and this odd thing called the Windows indexer eating up 1/3-2/3 of my CPU.

    Sent from my SM-G930W8 using Tapatalk
     
    hmscott likes this.
  38. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    You don't need to disable SLI globally; you can do it in the 3D Program Settings for one program / game at a time, for example:

    "You can disable SLI for the Sims only in the NVIDIA control panel.

    Manage 3D Settings > Program Settings > Select the Sims exe.

    In the list of options will be an SLI Related option, forget what its referred to as and dont have SLI setup atm. But by doing this you can disable SLI for specific games only instead of enabling/disabling SLI completely."
    https://forums.geforce.com/default/...the-sli-without-it-bothering-my-other-games-/
    Hopefully that method won't be as disruptive as doing it globally in the Nvidia Control Panel. Also, if there isn't an entry for your game in the Nvidia 3D Program Settings list, you can add one for your game / program.

    Nvidia Inspector can also edit the game profile should you want to take that route.

    These 2 download sites are the latest I can find. I'm not familiar with the sites - they look "ok", but I'd scan that exe a few different ways. :)

    NVIDIA Profile Inspector 2.1.3.20
    March 23, 2018
    https://www.asylab.com/single-post/2018/03/23/NVIDIA-Profile-Inspector-21320
    https://ci.appveyor.com/project/Orbmu2k/nvidiaprofileinspector/build/artifacts

    The original developer, Orbmu2k, has a Git repository, perhaps the "purest" place to get it:
    https://github.com/Orbmu2k/nvidiaProfileInspector

    Guru3d's version is a couple of years old and has the author's site download listed - IDK why that developer doesn't show any news about new development; his original site has 2011 content... weird.

    NVIDIA Profile Inspector Download Version 2.1.2.0
    http://www.guru3d.com/files-details/nvidia-profile-inspector-download.html

    Nvidia Inspector introduction and Guide
    https://forums.guru3d.com/threads/nvidia-inspector-introduction-and-guide.403676/

    The Nvidia Program Settings should be enough, but since you are getting into it all, I figured being made aware of Nvidia Inspector might be of interest to you.

    Disabling SLI through Nvidia Profiles
    https://www.bluegartr.com/threads/108057-Disabling-SLI-through-Nvidia-Profiles

    Please come back and let us know what works for you. :)
     
    Last edited: Jun 11, 2018
    Andest2003 likes this.
  39. hanban911

    hanban911 Newbie

    Reputations:
    5
    Messages:
    9
    Likes Received:
    12
    Trophy Points:
    6
    Hi, I will be getting my GT83 in a few days, and I wanted to know if you (or anyone) can give some instructions on how to OC the screen above 60Hz. Also, please share your experiences if you have done so. I'm not planning to go above 90Hz or so, but I want to know if it is safe.
    THANKS!
     
    hmscott likes this.
  40. Falkentyne

    Falkentyne Notebook Prophet

    Reputations:
    8,396
    Messages:
    5,992
    Likes Received:
    8,633
    Trophy Points:
    681
    If memory serves, 100% of these screens can run at 75Hz, and they can be overclocked with ToastyX CRU. I do NOT know if this applies to the 4K versions.
     
  41. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    Is there a 4K GT8x now? Most panels can do that, I would think.

    Sent from my SM-G930W8 using Tapatalk
     
    hmscott likes this.
  42. hanban911

    hanban911 Newbie

    Reputations:
    5
    Messages:
    9
    Likes Received:
    12
    Trophy Points:
    6
    There are no 4K screens on the GT83, just 1080p.

    So am I good to go for 75Hz on the GT83? Has anyone managed to get more than that?
     
    hmscott and Falkentyne like this.
  43. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    I get 95 on mine.

    Sent from my SM-G930W8 using Tapatalk
     
    hmscott and Falkentyne like this.
  44. hanban911

    hanban911 Newbie

    Reputations:
    5
    Messages:
    9
    Likes Received:
    12
    Trophy Points:
    6
    Nice - stable with no issues?
    Did you use ToastyX CRU, or just change settings in the Nvidia Control Panel?
     
    hmscott and Falkentyne like this.
  45. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    No issues so far; I just matched it to my X34. I use the Nvidia Control Panel.

    Sent from my SM-G930W8 using Tapatalk
     
    hmscott likes this.
  46. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    GT83VR-6RE owner here. To OC the screen, first make sure any DSR (Dynamic Super Resolution) settings have been disabled. You can do this by going into Nvidia Control Panel -> Manage 3D Settings -> turn off DSR factors. Next, go back into Nvidia Control Panel -> Change Resolution and hit the "Customize" button while you have your native resolution selected. Click "Create Custom Resolution", change the refresh rate (Hz) to 75 and click Test. If it passes, set this as your resolution.

    You'll then want to test the new refresh rate in games to check for artifacts or graphics glitches; as long as they are not present, the new refresh rate is stable. Once you are stable at 75Hz, edit the resolution and increase the Hz by 5, test in game again, and if it is stable, make it your new resolution and keep pushing the refresh rate higher in increments of 5. From what I've read, a stable overclock to 90Hz seems to be the norm, sometimes 100; right now I'm stable at 100Hz, and one user reported a stable overclock of 120Hz.

    You'll know you've raised the Hz too high if it fails the test, or if your machine hangs on a black screen after flickering between the old and new resolution. If that happens, just restart and bump the Hz back down to your last stable OC. Once you've found the highest refresh rate for your monitor that passes both the Windows test and the gaming tests, apply it and share your overclocking results with us.
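    Before testing a new refresh rate, you can roughly sanity-check it by estimating the pixel clock it needs, which is usually what actually limits a panel overclock. A quick sketch - the blanking totals below are generic reduced-blanking approximations, not the GT83VR panel's real timings (ToastyX CRU will show you the actual ones):

    ```python
    # Rough pixel-clock estimate for overclocking a 1920x1080 panel.
    # H_TOTAL/V_TOTAL are approximate reduced-blanking totals, NOT the
    # GT83VR panel's real timings - read the real values from ToastyX CRU.

    H_TOTAL = 2080  # 1920 active + ~160 px of horizontal blanking (approx.)
    V_TOTAL = 1120  # 1080 active + ~40 lines of vertical blanking (approx.)

    def pixel_clock_mhz(refresh_hz: float) -> float:
        # pixel clock = horizontal total * vertical total * refresh rate
        return H_TOTAL * V_TOTAL * refresh_hz / 1e6

    for hz in (60, 75, 90, 100, 120):
        print(f"{hz:3d} Hz -> ~{pixel_clock_mhz(hz):.1f} MHz pixel clock")
    ```

    If a target refresh rate needs a pixel clock far beyond what the panel runs at 60Hz, expect artifacts or a failed test well before you get there.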
     
    hmscott likes this.
  47. hanban911

    hanban911 Newbie

    Reputations:
    5
    Messages:
    9
    Likes Received:
    12
    Trophy Points:
    6
    Thanks for the detailed reply! Will try it as soon as I get the laptop and report back!
     
    hmscott likes this.
  48. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    Sounds good, mate. Let us know which model you get, too. Also, in case I didn't mention it before, I strongly recommend getting the CPU re-pasted with Thermal Grizzly Conductonaut liquid metal, or your processor will thermal throttle under full load while overclocked (this assumes you get a model with an unlocked, overclockable processor; no re-paste is required if you do not overclock).
     
  49. hanban911

    hanban911 Newbie

    Reputations:
    5
    Messages:
    9
    Likes Received:
    12
    Trophy Points:
    6
    Hey, OK, so I got the laptop. It is a GT83 7RE (7920HQ CPU) with 2x 1080s in SLI, 2x 512GB in RAID, an extra 512GB 960 EVO, 1TB of mechanical storage, and 64GB of RAM.

    I OC'ed the display to 90Hz stable using the Nvidia Control Panel. My only concern is how the monitor handles the OC in the long term. How long have you guys been running a ~90Hz OC on your GT83s?

    I'm afraid I'll mess up the screen (I don't have any warranty).


    https://image.ibb.co/f1cYX8/20180622_185610.jpg
     
    hmscott likes this.
  50. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    I have had my panel running at 95Hz for 2 years with no known issues.

    Sent from my SM-G930W8 using Tapatalk
     
    hmscott likes this.