The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, to make sure the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    ***The Official MSI GT83VR Titan SLI Owner's Lounge (NVIDIA GTX-1080's)***

    Discussion in 'MSI Reviews & Owners' Lounges' started by -=$tR|k3r=-, Aug 13, 2016.

  1. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    I've never read about anyone having LCD issues like that with the GT83 models; I'm pretty sure it's just you. Sorry to hear that, mate. I can totally understand how you would want to publicly devalue the product after that; I would too. Send it back to MSI to get it fixed or replaced, and please share the results with us!

    Fortunately, I've never experienced this issue, and I'm (at least) the second owner of the laptop. It's over 2 years old so far. This thing is a tank; my chip performs as well as an air-cooled, overclocked 6700K. I haven't had any display issues either, and it's overclocked to a steady 100 Hz. The only issues I've had were temperature problems while overclocking after I first acquired it from the previous owner (prior to repasting with liquid metal) and learning how to properly use the machine/BIOS.

    Definitely keep us updated.

    They sure don't make those Alienware models like they used to ;/. If only the quality were as good today as it was back in the day (when they actually cared, before it became a Dell reskin), I would have considered purchasing one. Unfortunately, Alienware's new stuff is utter garbage. It's all half-assed.
     
    Last edited: Jan 17, 2019
    Kevin@GenTechPC likes this.
  2. sportsurgeon

    sportsurgeon Notebook Enthusiast

    Reputations:
    0
    Messages:
    12
    Likes Received:
    5
    Trophy Points:
    6
    I am very happy that yours and other people's GT83s are still going strong, and glad if mine is just an isolated case. I will definitely post here once it is fixed. And you are right about the current Alienware since Dell took over. They are garbage!
     
  3. Lumlx

    Lumlx Notebook Consultant

    Reputations:
    38
    Messages:
    205
    Likes Received:
    159
    Trophy Points:
    56
    You could try playing with advanced BIOS settings. There's also a chance that the IC that converts eDP to LVDS is dead.
     
  4. sportsurgeon

    sportsurgeon Notebook Enthusiast

    Reputations:
    0
    Messages:
    12
    Likes Received:
    5
    Trophy Points:
    6
    I have great news. I replaced the LCD panel again and now it works. What are the odds of having two bad LCD panels? Anyhow, I avoided having to send it in: it was not the LVDS cable, and it wasn't the BIOS or anything else. The problem is totally fixed now, with no RMA needed. Not sure why the original LCD panel only lasted two years and the first replacement was also bad. Must be a quality issue with the panels.
     
    ryzeki, NuclearLizard and hmscott like this.
  5. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    My guess would be that all the good ones were bought up by the big companies. They probably are not in production anymore either, which would be why they are killing this line of laptops.

    Sent from my LM-Q710.FGN using Tapatalk
     
  6. etcetera

    etcetera Notebook Evangelist

    Reputations:
    52
    Messages:
    604
    Likes Received:
    166
    Trophy Points:
    56
    I agree that such an expensive computer should come with a better panel; at least 2560x1440 would be ideal, with 4K an interesting option.

    I think part of the problem is that nobody makes 18.4" QHD or UHD panels, only 17".

    The other aspect of the problem: higher resolution would mean lower runtime, since more pixels demand more power, and the runtime is poor enough as it is. I get 1 hour, barely.

    1080p is not *that* bad, although I think QHD would be ideal for this machine as an intermediate resolution between 1080p and 4K.

    The GT80S is a 2015 model, but I don't see anything interesting in the latest and greatest MSI models as of 2019. There is nothing the GT80S cannot process and spit out today. It was a fast machine then, and it's still a fast machine today.
    Yes, the video card is better in newer models, along with a few other tweaks. Replace the HDD with an SSD and the other SSDs with the latest Samsung models, and it flies.
    The only thing that would make me upgrade is an 18" QHD display with a mechanical keyboard. I think there are other machines with QHD displays, but they lack Cherry MX Brown keyboards.
     
    Kaloyan likes this.
  7. etcetera

    etcetera Notebook Evangelist

    Reputations:
    52
    Messages:
    604
    Likes Received:
    166
    Trophy Points:
    56
    This is an old but relevant post. The GT83VR is not a quantum leap away from the GT80S; it's a very similar machine. In fact, it takes an identical number of SSDs:


    3 M.2 2280 modules, 2 of them PCIe and 1 SATA.
    1 2.5" drive; I got a Samsung 860 1TB, and the other ones are also 1TB. The Samsung PM981 works well, and so does the PM951, guaranteed to work.
     
  8. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    From what I hear, this line of laptops is dead due to part-sourcing constraints and similar limitations.

    Sent from my LM-Q710.FGN using Tapatalk
     
  9. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    Great news for GT83 owners. With Nvidia's latest drivers (as of mid-January?), we now get "G-Sync Compatible" on the GT83 series. I enabled G-Sync Compatible and vsync in the Nvidia Control Panel while turning off any in-game vsync. I was getting a steady 10 ms GPU time @100 Hz (OCed) with vsync before; now I'm getting a steady 7 ms GPU time @100 Hz (OCed), which is equal to what I was getting on a G-Sync 1440p external monitor @144 Hz. As expected, everything is much smoother too. This laptop just got even better, mwahahaha.
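    As a quick sanity check on the frame-time numbers in this post, GPU frame time in milliseconds converts directly to an effective frame rate (a minimal sketch; the helper name is mine, not from the thread):

    ```python
    # Convert a GPU frame time in milliseconds to frames per second.
    def fps_from_frame_time(ms):
        return 1000.0 / ms

    print(fps_from_frame_time(10))  # 100.0 fps, i.e. vsync-capped at the 100 Hz OC
    print(fps_from_frame_time(7))   # ~142.9 fps, in line with a 144 Hz external panel
    ```

    That is why a drop from 10 ms to 7 ms feels like moving from a 100 Hz cap to 144 Hz-class smoothness.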
     
    Last edited: Jan 31, 2019
  10. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,084
    Trophy Points:
    431
    That's awesome. I suspect it is because Nvidia embraced FreeSync, and that's what our laptops were using. Great news :)
     
  11. Rengsey R. H. Jr.

    Rengsey R. H. Jr. I Never Slept

    Reputations:
    1,084
    Messages:
    2,771
    Likes Received:
    1,020
    Trophy Points:
    181
    4K at 18.4" is available ...
     
  12. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,084
    Trophy Points:
    431
    QHD would be 2560x1440, 4K is overkill on these machines in my opinion. Probably only good for specific content creators.
     
  13. etcetera

    etcetera Notebook Evangelist

    Reputations:
    52
    Messages:
    604
    Likes Received:
    166
    Trophy Points:
    56
    I agree. 4K is too much on any laptop screen, IMO. I had 4K on a 27" monitor, and even that was too much.

    4K rhymes well with a 32" screen; anything less than that should get either QHD or 1080p. I think 15" or so looks OK with 1080p, but an 18.4" screen would look fabulous with 2560x1440.
     
    hmscott likes this.
  14. etcetera

    etcetera Notebook Evangelist

    Reputations:
    52
    Messages:
    604
    Likes Received:
    166
    Trophy Points:
    56
    Just not in an MSI machine and I hate to give up the mechanical keyboard.
     
    gt83vr6reHelp likes this.
  15. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,084
    Trophy Points:
    431
    At my desk I use a 49-inch TV as a monitor, and even THAT feels small, haha, specifically when using native scaling at 100% DPI.
     
    Falkentyne likes this.
  16. etcetera

    etcetera Notebook Evangelist

    Reputations:
    52
    Messages:
    604
    Likes Received:
    166
    Trophy Points:
    56
    There is 5K resolution and now 8K... I can't imagine how fabulous 8K would look on a 49" monitor.
    That's what I really want for Christmas, in about 5 or 8 years when it becomes common and affordable.

    And I do wonder how 5K would look on a 32" monitor. For some reason the only size available is 27", and that's a total mismatch.
     
  17. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    A nighttime battle for the megahertz.......

    I took the plunge and spent the time learning RAM timings so I could properly overclock the HyperX Impact SO-DIMMs (4x8 GB) manually. These are XMP-rated for 14-14-14-35 @ 2400 MHz. In the past, I cheated the ratio up 1.5 bins with a custom profile using the same XMP settings. It gave me 2800 MHz, and I felt great! Reflecting on it, though, I felt dirty, and those results do not satisfy me at all. I felt like I was just roboclocking. I felt like a cheater....

    I went back into the BIOS and tested 3200 MHz: no bootsies, no matter what timings/voltages, at least not with these sticks, not ever. I tested 3000 MHz: bootsies! I tightened the timings to match the next strap relative to my XMP settings and went with 15-15-15-38. All 4 sticks boot. The only issue ended up being stability. "NOOOOOOOOOOOOOOOOOO". Programs would crash or not open at all, and eventually I would BSOD. I tried stock 3000 MHz timings, XMP timings from all the brands, etc. Nothing at all.

    No sweat, as I remembered that earlier in the year I had gotten 2800 MHz (14 multiplier) + the QCLK odd-ratio multiplier to boot with minor stability issues. I figured I'd take another crack at it now that I understand how most of the timings work. I went with 14-15-15-37-491 with a +200 mV offset on the system agent. Stability. I still wanted more... I was starving for those extra MHz. I was still unsatisfied.....

    Lately I had been thinking about bumping my CPU OC down, because in order to keep my 4.4 GHz overclock I have to enable it in Windows every single time after I boot, since multipliers are locked at 42 in the BIOS. Even if the programs open during boot, I still have to adjust manually, since the ratios are outside the BIOS limit. The extra 200 MHz also goes away anytime I switch to the balanced power plan. I'm lazy and I hate this.

    And then it hit me like a ****ing Mack truck......BCLK!!!!!!!! I HAVE TO TUNE MY BCLK.

    I'll OC the BCLK so I can achieve even higher RAM speeds, since I was stuck at the 14 strap (+QCLK odd ratio), and so I don't have to open ThrottleStop or XTU every time I want to go to 4.4 GHz.

    After a few hours, I ended up with this:
    BCLK: 104.75
    4-core multiplier: 42, 1.30 V
    Cache: 42, 1.30 V
    Memory ratio/timings: 14 / CL 14-15-15-37-508, +200 mV system agent

    Core and cache come out to 4.4 GHz, and the memory comes out to 2933 MHz with roughly the same latency as CL14 2800 MHz. I'm going to go back and try to tweak the tRFC a little more later on.

    I highly recommend bclk overclocking once you get to know your chips.

    Here is a photo of latency testing. Most of the tests toward the top of the list were with 2 sticks; the bottom half is mainly 4-stick tests after I thought I had found stability. The top of the list was abandoned once proven unstable for these sticks. I also tested different CPU frequencies to compare cache/RAM scaling, since they go hand in hand with Skylake. The 2933 MHz / 4.4 GHz sample (4-stick test) is the highest stable result. Till next time!

    @hmscott @NuclearLizard Would either of you happen to know if increasing the system agent ICCmax (amps?) can possibly help get 3000 MHz stable? What about "energy performance gain" in the BIOS memory sub-menu under the system agent menu? It has amps for each stick; I've never messed with those values before. Maybe the sticks need more amps? Or have I definitely reached the limit of these sticks?
    [Attached image: upload_2019-2-5_10-32-33.png]
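    The BCLK math in this post can be checked with a short script. A sketch using the standard formulas (core clock = BCLK x multiplier; DDR transfers twice per memory clock); the helper names are mine, and the numbers are the ones given above:

    ```python
    def core_clock_mhz(bclk, multiplier):
        """Effective core/cache frequency in MHz."""
        return bclk * multiplier

    def ram_mt_s(bclk, mem_ratio):
        """DDR effective rate in MT/s: two transfers per memory clock."""
        return bclk * mem_ratio * 2

    def first_word_latency_ns(cas, mt_s):
        """Approximate CAS latency in nanoseconds."""
        return cas * 2000.0 / mt_s

    bclk = 104.75
    print(core_clock_mhz(bclk, 42))         # 4399.5 MHz, the "4.4 GHz" above
    print(ram_mt_s(bclk, 14))               # 2933.0 MT/s on the 14 strap
    print(first_word_latency_ns(14, 2800))  # 10.0 ns at CL14 2800 MHz
    print(first_word_latency_ns(14, 2933))  # ~9.55 ns: "roughly the same latency"
    ```

    This also shows why raising BCLK helps when the memory straps are exhausted: every strap scales with it, so 2933 MT/s becomes reachable without a working 3000 MHz strap.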
     
    Last edited: Feb 5, 2019
    ryzeki likes this.
  18. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,084
    Trophy Points:
    431
    I don't think the ICC amps would help with memory speed, but you can try; they are for the CPU, so it can gulp more power if needed. It's 'safe' to adjust, and you will be limited by temps instead.
     
  19. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    I have no idea my friend. The most tweaking I did was to play with voltage and clocks in XTU and then in throttlestop.

    I mostly tune my system to keep it from throttling on long loads or to get an extra bit of performance in bursts. Other than that I let it be.

    Sent from my LM-Q710.FGN using Tapatalk
     
  20. Miguel1234

    Miguel1234 Notebook Enthusiast

    Reputations:
    5
    Messages:
    14
    Likes Received:
    11
    Trophy Points:
    6
    Hi, I would like to share my overclocking experience with the GT83VR 7RF, a great machine.

    This is my best score in 3DMark: https://www.3dmark.com/fs/18156112, with the latest Windows update, latest BIOS, firmware, drivers, etc. Performance has decreased from Windows 10 2017 to Windows 10 2018/2019, but anyway, I think it is interesting to share scores and optimizations to get better results with this machine.

    I've set the CPU @4.5 GHz (Intel Extreme Tuning Utility) and the GPU with the maximum offset that MSI Dragon Center allows (+250 core and +350 memory).

    If I try to increase the GPU core speed a little more (2 or 3 MHz) with MSI Afterburner, the GPU becomes unstable, and I don't want to make voltage changes.

    Could you share your experiences and results about this?
     
  21. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    From what I've heard, the voltage/power limits on mobile 10-series GPUs are locked. +250/+350 on the core/memory might be a little high, and it looks like the benchmark is showing that the cards didn't take the OC, as the core clock appears to be operating in the stock range + boost (if the OC is too high, it either doesn't apply at all or you crash). Try bumping the GPU OC down to +150 core / +200-250 memory and see if the core clock rises in the next benchmark. Make sure the GPUs are set to maximum performance in the Nvidia Control Panel and that you're on a max-performance power plan for your OC. Make sure Cooler Boost is on (fans at 100%) during the test. Report back with your results so we can take a look. Cheers.

    Also, I recommend using only one tool to OC any single component. Every laptop's proprietary overclocking software (such as MSI's Dragon Center, Asus's OC tool, or brand X's OC tool) automatically makes the changes for you; if you use multiple tools, they may conflict with each other.
     
    Last edited: Feb 7, 2019
  22. Lumlx

    Lumlx Notebook Consultant

    Reputations:
    38
    Messages:
    205
    Likes Received:
    159
    Trophy Points:
    56
    If you're using MSI Afterburner to monitor temps, clocks, etc. while benchmarking, there's an option for Power Limit. Set it to show in the OSD and check whether you're hitting power limits. I'm fairly sure you are, and that's why your clocks don't go much higher even though you set them in Afterburner.
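    The same power-limit check can be done from a terminal without Afterburner. A minimal polling sketch, assuming `nvidia-smi` is on PATH (the query field names come from `nvidia-smi`'s own `--help-query-gpu` list; the parsing helper and 97% threshold are my own choices):

    ```python
    import subprocess

    FIELDS = "power.draw,power.limit,clocks.gr,clocks.mem"

    def parse_sample(line):
        """Turn one CSV line from nvidia-smi into a list of floats."""
        return [float(v) for v in line.split(",")]

    def report_once():
        """Print power draw vs. limit and current clocks for each GPU."""
        out = subprocess.check_output(
            ["nvidia-smi", f"--query-gpu={FIELDS}",
             "--format=csv,noheader,nounits"], text=True)
        for gpu, line in enumerate(out.strip().splitlines()):
            draw, limit, core, mem = parse_sample(line)
            flag = "  <-- at power limit" if draw >= 0.97 * limit else ""
            print(f"GPU{gpu}: {draw:6.1f}/{limit:.0f} W  "
                  f"core {core:.0f} MHz  mem {mem:.0f} MHz{flag}")

    # Call report_once() in a loop while a benchmark runs to see whether
    # the clocks drop at the same moment draw pins to the limit.
    ```

    If the draw sits at the limit while clocks sag, the power limit, not the offset, is what is capping you.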
     
  23. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    He could also use a positive system agent offset to alleviate any PCIe bottlenecking so that any GPU OC sticks.
     
  24. Lumlx

    Lumlx Notebook Consultant

    Reputations:
    38
    Messages:
    205
    Likes Received:
    159
    Trophy Points:
    56
    On my MSI 1070 it is because of the 115 W power limit. The clocks bounce past the 1683 MHz boost clock for a split second, then the power limit kicks in and drops them.
     
  25. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    Which gt83 model do you have? Which model #?
     
  26. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    OK, so I just realized that the display on the GT83 is not a FreeSync display. My 2 retina displays are FreeSync, and I'm only able to turn on G-Sync Compatible when they are plugged in, yet my actual laptop display benefits from G-Sync Compatible. Technology behaves strangely sometimes, lol.
     
  27. Lumlx

    Lumlx Notebook Consultant

    Reputations:
    38
    Messages:
    205
    Likes Received:
    159
    Trophy Points:
    56
    It's not a GT83VR but a GT80 with a single 1070.
     
  28. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    What happened to your second GPU?
     
  29. Lumlx

    Lumlx Notebook Consultant

    Reputations:
    38
    Messages:
    205
    Likes Received:
    159
    Trophy Points:
    56
    This laptop originally came with dual 970Ms. I've upgraded it with a single 1070, so there's no second GPU (though I don't even need it).
     
  30. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    How? The GT80/GT80S models are not upgradable to Pascal cards.
     
  31. Lumlx

    Lumlx Notebook Consultant

    Reputations:
    38
    Messages:
    205
    Likes Received:
    159
    Trophy Points:
    56
    They work, but you cannot use the internal screen, because the iGPU and dGPU cannot be present at the same time. With an external monitor it is possible, and it works no differently than any other true Pascal laptop. Even G-Sync works through the mDP port.
     
  32. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    @hmscott You dealt with the whole upgrade fiasco a couple years back, any knowledge of this?
     
  33. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    The upgrade consisted of trading in your MSI Maxwell laptop for a Pascal laptop and paying an upgrade fee; there were no kits from MSI for upgrading only the MXM GPUs, and I used MSI's upgrade trade-in offer only.

    You might want to search for a Pascal MXM upgrade thread, probably mostly Clevo, and see what they have to say. There are some MXM dealers who work the forums and help people doing those kinds of upgrades; you might try them too.
     
    gt83vr6reHelp likes this.
  34. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    Disable MSR config/C-states, add a system agent offset of 50-100 mV to alleviate any potential bottlenecking, and make sure you are on a max-performance power plan. My dual 1070s sit around 2000 MHz each under load in CoD Black Ops 4 with a +144 core / +200 mem OC. I noticed they only go that high when core usage hits a certain point. For example, if I'm playing on low settings and core usage is low, one or both cards clock down to regular factory speeds; if the load is heavy, they go up and stay up. Also, even though the cards are locked, make sure to unlock all voltage controls in Afterburner. The cards end up behaving in what I can only describe as an "adaptive OC" mode.
     
  35. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,084
    Trophy Points:
    431
    It's not that the machines cannot physically use Pascal GPUs; it's that MSI did not offer official GPU upgrades. The GT80 line in particular suffered because 1060s couldn't do SLI, and prices for 1070s and 1080s were high.

    Anyway, the whole point was that MSI would not offer "official" upgrade kits and instead offered a trade-in program. I traded my GT80 with SLI 980Ms for a GT73VR with a 1070.
     
    gt83vr6reHelp and hmscott like this.
  36. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    I thought they didn't offer the MXM upgrades because of the power requirements of the Pascal architecture. It seems more of a "we wanted to move more total units of the next model instead of MXM upgrades" approach. Good to know that someone can at least stuff a 1070 into the old model.
     
  37. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,084
    Trophy Points:
    431
    Depends on the model. I think previous GTs couldn't use dual 1080s, for example, only a 1070, and that's it. I think any model that required an external power cable, like the 1080, couldn't work, but the 1060 and 1070 could be modded to work around the constraints, since they get power from the MXM slot.
     
  38. Lumlx

    Lumlx Notebook Consultant

    Reputations:
    38
    Messages:
    205
    Likes Received:
    159
    Trophy Points:
    56
    But do both of your cards run at 100% while gaming? If I run an undemanding game that uses 60-70% of the GPU, my clock speed sits at 1880 MHz, or 2000 MHz with +150 core. If I run a AAA game like The Witcher 3 maxed out, which pushes my GPU to 99%, my clocks hover around 1650-1730 MHz. I think with SLI both your GPUs run at 60-80% utilization, which allows them to stay at 2000 MHz.

    They didn't skip the upgrades because of power requirements. Well, maybe the 1080, which needs 150 W, could be somewhat "harder" to run, but the 1060 and 1070 you can run easily on single-GPU MSI laptops.

    In my opinion, they didn't offer upgrades because Pascal ditched LVDS support, and as we all know, the MSI GT80 runs on an LVDS panel. I also believe the first MSI GT72 models used LVDS panels, which made this upgrade impossible for every MSI "upgradable MXM GPU".
     
    gt83vr6reHelp and hmscott like this.
  39. Miguel1234

    Miguel1234 Notebook Enthusiast

    Reputations:
    5
    Messages:
    14
    Likes Received:
    11
    Trophy Points:
    6
    Thank you for your comments, gt83vr6reHelp and Lumlx.

    I'll try to reduce the offset as you said and report the results. You are right; I have only seen the "OCed core" speed in the results reaching 2073 MHz.

    I have to use Dragon Center for a safe GPU OC only, because it has no options to OC my CPU. I have to OC the CPU with Intel Extreme Tuning Utility (XTU), and because I have a 7920HQ processor, I can't make more changes than increasing the BCLK a little, to 104 MHz from 99.50, and raising the core and cache multipliers to x47, x48.

    I've noticed better performance @4K with everything maxed out with these offsets in games like Far Cry 5 or Ace Combat 7, but I've never tested with the offsets you are describing. Very interesting; let me try, and I'll write the results here.

    Oh! If you know of anything else I should do in the BIOS, let me know. (I've made the "classical optimizations": disabling core parking, stopping search indexing, maximum performance in the Nvidia Control Panel... but I don't know if I can do anything more.)

    Thank you, great forum and great thread!
     
    gt83vr6reHelp and ryzeki like this.
  40. Miguel1234

    Miguel1234 Notebook Enthusiast

    Reputations:
    5
    Messages:
    14
    Likes Received:
    11
    Trophy Points:
    6
    Hi, these are the results with core offset +150 and memory offset +250: https://www.3dmark.com/3dm/33259748?

    I get more speed, a higher core clock, than with the maximum core and memory offsets. The "show details" section says:

    Core clock
    2012 MHz (1557 MHz)
    Memory clock
    1315 MHz (1251 MHz)

    I'm going to try a little more and report it.
     
  41. Miguel1234

    Miguel1234 Notebook Enthusiast

    Reputations:
    5
    Messages:
    14
    Likes Received:
    11
    Trophy Points:
    6
    Increasing the offset a little, I have these results: https://www.3dmark.com/3dm/33260230?

    Core clock
    2063 MHz
    Memory bus clock
    1330 MHz

    These results are with the core offset at +180 and memory at +314 MHz.

    The number one in the chart has 2073 MHz but 1339 MHz memory, so I think the memory offset could be increased to +350 MHz to get the maximum speed. For the core it's totally different; maybe the "sweet spot" is at +190 (the core doesn't need the maximum offset to get the best speed, as you said).
     
  42. Lumlx

    Lumlx Notebook Consultant

    Reputations:
    38
    Messages:
    205
    Likes Received:
    159
    Trophy Points:
    56
    I suggest you enable the Power Limit flag in the MSI Afterburner OSD. This way you will see whether the power limit kicks in. If it does, there's not much you can do besides unlocking these cards. It's possible to tinker with the voltage/frequency curve editor, but you will still be limited mostly by the vBIOS.
     
  43. gt83vr6reHelp

    gt83vr6reHelp Notebook Consultant

    Reputations:
    40
    Messages:
    269
    Likes Received:
    137
    Trophy Points:
    56
    Yeah I think you might be right about that, I'm gonna take it for a spin and check it out.
     
  44. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    It's because of the janky way they do it. From what I understand, the panel either gets its feed from the motherboard controller, which can control it that way, or straight off the GPU itself, and the screen just "prints" whatever is thrown at it.

    Sent from my LM-Q710.FGN using Tapatalk
     
  45. Miguel1234

    Miguel1234 Notebook Enthusiast

    Reputations:
    5
    Messages:
    14
    Likes Received:
    11
    Trophy Points:
    6
    Yes, you are right, there's a power limit. These are the results: https://www.3dmark.com/3dm/33261894?

    I get a "perfectly stable" CPU clock at 4393 MHz and a "perfectly stable" GPU memory clock at the maximum offset, getting 1339 MHz. I hit the maximum core speed (2076 MHz) with an offset of +190.

    So by decreasing the core offset a little, I've gained an "extra" 25-26 MHz.

    The graphics score is nearly 44,000 points, which I think is incredible for a notebook (better than 98% of the other results).

    I don't want to modify the voltages, so I think this is the "safest OC" that can be done with my GT83VR.
     
    hmscott likes this.
  46. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,084
    Trophy Points:
    431
    A 44K GPU score is not only incredible, it's amazing for any type of PC, let alone a laptop. I am happy with my 24K score, and you are nearly double that; great scaling. If only SLI were implemented better in gaming, can you imagine the insane gains?
     
    ALLurGroceries likes this.
  47. Miguel1234

    Miguel1234 Notebook Enthusiast

    Reputations:
    5
    Messages:
    14
    Likes Received:
    11
    Trophy Points:
    6
    Totally agree, it's an amazing machine. One more question: I've set up the OSD in Afterburner, and I get the "power limit" flag ("LIM1 power") at default clock speeds (without OC). Is that normal? I don't notice any problems, but this is the first time I have seen it, because I never used the OSD before...
     
  48. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,084
    Trophy Points:
    431
    If you are wondering if you can hit power limits on default clocks, then yes, particularly if something is very demanding.
     
    Miguel1234 likes this.
  49. Miguel1234

    Miguel1234 Notebook Enthusiast

    Reputations:
    5
    Messages:
    14
    Likes Received:
    11
    Trophy Points:
    6
    Ah OK, great, thank you. It's amazing to get a 44K GPU score with this mega-laptop. You can connect it to a 4K TV or monitor and enjoy very playable framerates (even at "ultra-extreme" settings).

    Check these results: https://www.3dmark.com/3dm/33264394?

    44,049 GPU score points for two laptop GTX 1080s and a 13,066 physics score with an i7-7920HQ, both pieces of hardware designed for a laptop!

    I wonder what the next MSI "GT Titan" laptop could be like with a Core i9 and 2x RTX 2080 GPUs...
     
  50. NuclearLizard

    NuclearLizard Notebook Deity

    Reputations:
    162
    Messages:
    939
    Likes Received:
    728
    Trophy Points:
    106
    That is an extremely good score.


    Though I don't think we are getting another entry in this line of laptops. Sadly, Nvidia and the RAM manufacturers cranked their prices up so high that no one is buying them.

    Though if they wanted to keep the form factor and move to a desktop CPU and GPU, I would be happy to follow.

    Sent from my LM-Q710.FGN using Tapatalk
     
    Miguel1234 likes this.