The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    *OFFICIAL* Alienware "Graphics Amplifier" Owner's Lounge and Benchmark Thread (All 13, 15 and 17)

    Discussion in '2015+ Alienware 13 / 15 / 17' started by Mr. Fox, Dec 10, 2014.

  1. Ihtfp06

    Ihtfp06 Notebook Guru

    Reputations:
    5
    Messages:
    67
    Likes Received:
    18
    Trophy Points:
    16
    The performance hit is irrelevant when the performance I’m getting allows the game to run at maxed out settings. As I previously mentioned, I’m getting 60 FPS at 2560x1440 on Ultra with every game I’ve played except Deus Ex.


    Sent from my iPhone using Tapatalk
     
  2. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    Of course it's not irrelevant; more demanding games will expose the limitations far sooner than a setup without a bottleneck.
     
  3. judal57

    judal57 Notebook Deity

    Reputations:
    274
    Messages:
    1,164
    Likes Received:
    650
    Trophy Points:
    131
    Vasudev likes this.
  4. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,050
    Messages:
    11,278
    Likes Received:
    8,816
    Trophy Points:
    931
    Did you notice increased snappiness after BCLK OC?
     
  5. judal57

    judal57 Notebook Deity

    Reputations:
    274
    Messages:
    1,164
    Likes Received:
    650
    Trophy Points:
    131
    Yes, a little bit. I thought it was my imagination lol
     
    6.|THE|1|BOSS|.9 and Vasudev like this.
  6. Vasudev

    Vasudev Notebook Nobel Laureate

    Reputations:
    12,050
    Messages:
    11,278
    Likes Received:
    8,816
    Trophy Points:
    931
    When you add the Spoiler microcode, snappiness will be 3x. Ask @6.|THE|1|BOSS|.9, he's a beta tester.
     
    6.|THE|1|BOSS|.9 likes this.
  7. spikerules

    spikerules Notebook Guru

    Reputations:
    0
    Messages:
    55
    Likes Received:
    4
    Trophy Points:
    16
    I have a question about the AGA. I have an Alienware 51m and want to use a second GPU for DaVinci Resolve, which allows for a multi-GPU setup (NOT SLI). But I have heard the AGA turns off the built-in GPU; is this true? I've also heard people with a G-SYNC monitor (like I have) can access both GPUs. I would like to know before I purchase, otherwise it may be wiser to go for a TB3 eGPU.

    Thanks!
     
  8. 6.|THE|1|BOSS|.9

    6.|THE|1|BOSS|.9 Notebook Evangelist

    Reputations:
    915
    Messages:
    498
    Likes Received:
    970
    Trophy Points:
    106
    True, smoothness is observed since the microcode itself fixes bugs in scaling frequencies according to load, etc. ;) And with a BCLK OC the SA gets OC'ed too, so it needs a small bump in voltage for the sake of the OC and the RAM OC itself :)

    Using the right microcode always brings great benefits. I was hitting only a 102.7MHz BCLK OC, but after I used the latest microcode available here:
    http://forum.notebookreview.com/thr...fix-and-meltdown.806451/page-17#post-10888780

    http://forum.notebookreview.com/thr...ers-welcome-too.810490/page-468#post-10888790

    I got over the limit and could reach a 107.9MHz BCLK OC :)
    Have a look at some valuable information on how microcode can make a huge difference in OC potential ;)
    https://www.overclock.net/forum/5-i...de-update-cpu-microcode-through-software.html

    The same idea applies to Skylake, but I don't know if it really applies to later generations :rolleyes:
    Due to the BCLK my RAM is currently overclocked further than planned, over 2800MHz now :D Still trying to squeeze out performance along with the other OCs... I just need the time, because RAM is the most frustrating, time-consuming thing to get fully stable :(
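    As a rough sketch of why a BCLK OC drags the RAM clock up with it: the memory clock is derived as a ratio of BCLK, so raising BCLK raises the effective RAM data rate proportionally. The ratio and base values below are hypothetical illustrations, not my exact setup.

```python
# Sketch: effective RAM data rate scales linearly with BCLK,
# since the memory clock is derived as ratio * BCLK.
# Ratio and base values here are hypothetical, not an exact setup.

def ram_frequency(bclk_mhz: float, ram_ratio: float) -> float:
    """Effective RAM data rate in MT/s for a given BCLK and memory ratio."""
    return ram_ratio * bclk_mhz

# Hypothetical DDR4-2666 kit at stock BCLK (100 MHz) -> ratio of 26.66
ratio = 2666 / 100.0

print(ram_frequency(100.0, ratio))   # stock (~2666 MT/s)
print(ram_frequency(107.9, ratio))   # with a 107.9 MHz BCLK (~2877 MT/s)
```

    This is why a ~108MHz BCLK pushes a 2666 kit past 2800MHz without touching the memory multiplier.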
     
    Last edited: Apr 15, 2019
    judal57 and Vasudev like this.
  9. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    It only turns off the discrete GPU if that discrete GPU isn't running the laptop screen. Otherwise, both the dGPU and the AGA GPU are enabled.
     
  10. judal57

    judal57 Notebook Deity

    Reputations:
    274
    Messages:
    1,164
    Likes Received:
    650
    Trophy Points:
    131
    I will try that microcode and see if I can go up from 102.67, which is my max. THX
     
    propeldragon and c69k like this.
  11. judal57

    judal57 Notebook Deity

    Reputations:
    274
    Messages:
    1,164
    Likes Received:
    650
    Trophy Points:
    131
    @6.|THE|1|BOSS|.9 the install completed with code 0. Does that mean everything installed fine?
     
  12. TomC69

    TomC69 Notebook Geek

    Reputations:
    10
    Messages:
    97
    Likes Received:
    43
    Trophy Points:
    26
    If you project only to an external monitor via the AGA GPU's DisplayPort, are the 51m's iGPU and dGPU essentially disabled?

    I'm looking forward to seeing someone with an Area 51m and AGA w/rtx2080ti (and/or with a gtx1080ti) and external gaming monitor (and only projecting to it) to see how much more performance this gives compared to a 51m w/internal rtx2080. Plus, to see if this results in much lower 51m temps with the AGA.
     
    FXi likes this.
  13. gthirst

    gthirst Notebook Evangelist

    Reputations:
    97
    Messages:
    342
    Likes Received:
    231
    Trophy Points:
    56
    Anyone have a suggestion as to what the best RTX 2080 Ti cards are for an AGA?

    I have a 13 R3 i7-7700HQ with 1060 and a GTX 1080 in AGA. I just bought a desktop with a i7-8700 and RTX 2080. I think I could save a few bucks and sell my barely used 1080 if I just bought an RTX 2080 Ti and returned the desktop. I figure performance may even be better this way (entirely dependent on CPU/GPU dependence).
     
  14. TomC69

    TomC69 Notebook Geek

    Reputations:
    10
    Messages:
    97
    Likes Received:
    43
    Trophy Points:
    26
    I'm no PC expert, but I do not think your laptop's i7-7700hq will fully utilise an rtx2080ti in an AGA. If I were you I would try your desktop's rtx2080 in your AGA and see how its performance compares to your gtx1080. You may find that there is not a lot in it. You may also want to try your gtx1080 in the desktop to see how it performs compared to its rtx2080 and your laptop.

    Your laptop's i7-7700hq cannot be overclocked and is getting a little long in the tooth now, so I don't think adding an expensive rtx2080ti to it is really going to make it a lot more future-proof.

    Your desktop's i7-8700K is probably ~50% higher performance than your laptop's i7-7700hq so it should be able to handle the rtx2080 and rtx2080ti no problem, and hence is a lot more future-proof imho.

    If I were you I would probably sell your laptop and/or AGA w/1080 and keep the desktop w/rtx2080. Later, maybe look at selling the rtx2080 and upgrading to an rtx2080ti. Your $'s so your choice of course, lol!
     
  15. gthirst

    gthirst Notebook Evangelist

    Reputations:
    97
    Messages:
    342
    Likes Received:
    231
    Trophy Points:
    56
    Well I need the laptop for when I'm out and about. At least 50% of my gaming is when I'm away from home. I got a great deal on my 13 R3 and 1080+AGA and it is the perfect device for me right now... So the desktop is just icing on the cake.

    I figure I can sell the GTX 1080 and make a good amount back and the 2080 Ti will be a bit cheaper than the deal I got on the 2080/8700.

    I briefly looked into some benches:
    https://www.3dmark.com/compare/fs/19029516/fs/16754751/fs/18962704

    My desktop is there as well as my 13 R3+1080. In addition, the highest 7700HQ+2080Ti combo is on there as well - s1574a (may be a user here!). That run uses an AGA with some overclocking.

    Pretty interesting results. It annihilates the 2080 in my desktop on graphics score: a 30-40% increase with the 2080 Ti in the AGA. However, the CPU difference is huge; the 7700HQ clearly lags behind the 8700 by a large margin.

    I'd love to see comparisons in games, but it's a bit too specific without knowing the other user personally. I may just end up sticking with the 13R3+1080, which seems to have a lot of legs left if I'm only pushing 1080p/60. I would hate to return the desktop+monitor (34" UW 1440p MSI), but it may make the most sense for me.
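    For anyone comparing 3DMark runs the same way, the "30-40% increase" is just the relative difference between graphics scores. The scores below are made-up placeholders, not the actual numbers from the 3DMark links above.

```python
# Sketch of the graphics-score comparison; scores are hypothetical
# placeholders, not the real results from the linked 3DMark runs.

def percent_gain(new_score: int, old_score: int) -> float:
    """Percentage improvement of new_score over old_score."""
    return (new_score - old_score) / old_score * 100.0

desktop_2080_gfx = 25000   # hypothetical desktop RTX 2080 graphics score
aga_2080ti_gfx = 33000     # hypothetical 7700HQ + 2080 Ti (AGA) graphics score

print(f"{percent_gain(aga_2080ti_gfx, desktop_2080_gfx):.1f}% higher graphics score")
```

    Comparing graphics scores rather than overall scores matters here, since the overall score folds in the CPU (physics) result, where the 7700HQ drags the combo down.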
     
  16. TomC69

    TomC69 Notebook Geek

    Reputations:
    10
    Messages:
    97
    Likes Received:
    43
    Trophy Points:
    26
    gthirst, if you need a laptop and mainly play at 1080p/60hz, then it probably makes sense to just stick with what you have for now and return the desktop.

    Whether or not your i7-7700hq can handle much more than your gtx1080 probably depends on which apps you are running. I mainly use flight sims (like X-Plane 11) and racing sims (like Assetto Corsa), and these tend to be very CPU dependent.

    Maybe try the monitor connected to your laptop's AGA gtx1080 via HDMI (or better, DisplayPort), go to your Windows display settings, and select projecting only to that monitor (not your laptop screen). If you run the monitor at 1080p/60hz, you might be surprised how much extra performance you get compared to your laptop screen. Also, using a monitor at home is going to be a lot more fun than using a puny little 13in laptop screen imho. You probably only need an inexpensive 1080p 24-27in 120-144hz gaming monitor to begin with.
     
  17. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    They're not essentially disabled. They're simply not used for the monitor connected to the AGA GPU.
     
    TomC69 likes this.
  18. Y.U.KILL.ME

    Y.U.KILL.ME Notebook Enthusiast

    Reputations:
    5
    Messages:
    33
    Likes Received:
    8
    Trophy Points:
    16
    Hey guys,
    I just got a used gtx1080 ti.
    When I tested it with FurMark and Unigine Heaven, everything worked fine with no crashes; each test ran for 20 minutes and the temperature was around 65C.
    But I started experiencing crashes when gaming on ultra and high settings.
    I have the stock PSU and I think it is not functioning well.
    Unfortunately, I don't have another PSU to try.
    I really hope it's not a bad GPU.
    What do you guys think?
     
  19. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    Was it previously used for cryptomining? If so the GPU may be pretty worn out.
     
  20. Y.U.KILL.ME

    Y.U.KILL.ME Notebook Enthusiast

    Reputations:
    5
    Messages:
    33
    Likes Received:
    8
    Trophy Points:
    16
    The guy I bought it from said he only used it for gaming... but you never know!!
     
  21. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    Try testing it in a friend's desktop PC or something, just to rule out whether the card itself is faulty.
     
  22. Sentential

    Sentential Notebook Evangelist

    Reputations:
    76
    Messages:
    328
    Likes Received:
    272
    Trophy Points:
    76
    What kind of crashing are you seeing exactly? In my experience, crashing is usually memory related, either system or storage. The only GPU failures I've had were usually my fault (cracked/chipped), and they normally hard fault, as in a full VPU recovery, and display some kind of artifact. Furthermore, from previous experience with AGAs, they tend not to like same-vendor IDs when swapping. Those of us who used AMD-based AGAs had *far* fewer issues than Nvidia-to-Nvidia combos. I'm not saying AMD is better; I'm saying Microsoft blows, and you're probably seeing a driver/device-ID problem. In Device Manager, delete all existing driver software for the integrated GPU and install the desktop drivers over the AGA card while the 1060 is electrically disabled, and see if that helps. It'll probably hurt your battery life using non-mobile drivers, but the crashing will likely stop.

    I doubt that sincerely. Caps wear out, FETs wear out; silicon does *not* wear out. I work in a datacenter where we have old San Diego Opterons that have been fully loaded at 100% for the better part of 11 years, have had their CMOS batteries replaced, and are still "in production", so based on that I'd blame his PSU over the GPU.
     
  23. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    A CPU is MUCH simpler than a GPU simply because it is a contained semiconductor part. GPU boards contain that semiconductor as well as memory, power circuitry, I/O and other stuff. I did not mean the actual GPU chip would fail but rather one of the lesser components on the same board as you mentioned.
     
    Sentential likes this.
  24. Sentential

    Sentential Notebook Evangelist

    Reputations:
    76
    Messages:
    328
    Likes Received:
    272
    Trophy Points:
    76
    I hear ya, I just think it's more likely to be a driver issue than a hardware one.
     
  25. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    I am not well versed in cryptocurrency mining - always viewed it as a fad - but my understanding is that serious miners would run custom firmware on their cards to detune them somewhat for stability, then run them at 100% load for very long periods, which is not what consumer-focused GPUs were designed for. This could lead to accelerated fatigue and failure of components such as memory chips and power circuitry, both of which contribute to hard-to-troubleshoot instability.

    In the horse world they have a saying, "rode hard and put up wet" that feels like it would apply here.

    Due to the shifts in the crypto world after the last couple years, a ton of these cards will be entering the used market with no obvious external signs they were used as such making it a roll of the dice for the next buyer. For this reason, I would be extremely suspect of used high end GPUs for a while as they were specifically targeted by miners.
     
    TomC69 likes this.
  26. Sentential

    Sentential Notebook Evangelist

    Reputations:
    76
    Messages:
    328
    Likes Received:
    272
    Trophy Points:
    76
    I actually mined on an AGA and it isn't as hard on the components as you might think. The worst you'd see is dead fans from use; if anything, a mining card is better than a babied gaming one, as it has fewer thermal cycles on it, plus they are normally run 20-30% below OEM power targets.
     
    c69k likes this.
  27. Y.U.KILL.ME

    Y.U.KILL.ME Notebook Enthusiast

    Reputations:
    5
    Messages:
    33
    Likes Received:
    8
    Trophy Points:
    16
     
  28. Y.U.KILL.ME

    Y.U.KILL.ME Notebook Enthusiast

    Reputations:
    5
    Messages:
    33
    Likes Received:
    8
    Trophy Points:
    16
    It's a very simple crash that doesn't affect the system with a freeze or restart... like the one you see when overclocking the core too much.
    I also don't see any weird colors or aliens jumping around.
     
  29. Y.U.KILL.ME

    Y.U.KILL.ME Notebook Enthusiast

    Reputations:
    5
    Messages:
    33
    Likes Received:
    8
    Trophy Points:
    16
    Unfortunately I live abroad and don't have friends with gaming PCs, but I will take it to a PC shop and ask them nicely to test it for me.
     
  30. Sentential

    Sentential Notebook Evangelist

    Reputations:
    76
    Messages:
    328
    Likes Received:
    272
    Trophy Points:
    76
    Soft faults like that sound like either a driver or RAM problem, to be honest. A hard fault like a loss of power would lock the system to the point where you'd have to force restart it; same goes for a GPU failure. Remember, when your AGA is active it electrically disables the 1060, so the only GPUs you have "active" are the iGPU and the AGA. If the AGA failed due to power, you'd have to force restart, as there isn't a GPU to fail over to. Does that make sense?

    Try running memtest86 and see if it comes back with anything. Otherwise try going into device manager without the AGA and uninstall the 1060 plus remove driver software. Restart with the AGA installed (so the 1060 is disabled) and then install the desktop 1080 drivers. If that stops your crashing then disconnect the AGA, reboot and the 1060 should pick up the desktop nVidia drivers and that should fix it. Also uninstall Geforce Experience.

    If still no dice, try ruling out the integrated 1060 by using it by itself; if that fixes it and the previous steps don't, I'd send the 1080 back and pick up an RX Vega instead.
     
  31. Y.U.KILL.ME

    Y.U.KILL.ME Notebook Enthusiast

    Reputations:
    5
    Messages:
    33
    Likes Received:
    8
    Trophy Points:
    16
    Thanks for your patience and time.
    Games run absolutely fine on the internal Nvidia GPU, which makes me believe the crashes have nothing to do with bad RAM or storage.
    I also reinstalled Windows just to be sure it's not a bad driver.
    BTW, the crashes disappear when I underclock the GPU!!
    I will also try undervolting the GPU to see if it passes.
     
    c69k likes this.
  32. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    For those asking whether a 2080Ti will work fine in the AGA: it will.

    I had a Titan X (Pascal) in my 13 R3 running very well.

    As long as you're using an external screen the hit is very minimal:

    https://www.3dmark.com/fs/13653218
     
    FXi likes this.
  33. TomC69

    TomC69 Notebook Geek

    Reputations:
    10
    Messages:
    97
    Likes Received:
    43
    Trophy Points:
    26
    I see you have a nice new Area 51m w/rtx2080. It would be interesting to see how this would perform with an AGA w/rtx2080ti. Might even help reduce temps on your laptop.
     
    FXi likes this.
  34. dsmrunnah

    dsmrunnah Notebook Guru

    Reputations:
    5
    Messages:
    51
    Likes Received:
    39
    Trophy Points:
    26
    To be honest, with the Area 51m 9900K performing as well as it does, I would expect it to perform right alongside my desktop with a 2080Ti in an AGA. I have only tested my AGA with an EVGA 2080 (my 2080Ti went straight into watercooling), but I didn't really see any bottleneck of 4x lanes vs 16x lanes:

    https://www.3dmark.com/compare/fs/17632650/fs/18159004#

    Comparing only the graphics score, you can see they're pretty close. Keep in mind that I put a slightly higher clock on the memory in the desktop test (as you can see in the results). The 2080Ti might pull enough bandwidth to bottleneck with only 4 lanes, but I'm interested as well.
     
    FXi and TomC69 like this.
  35. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Not enough of a performance upgrade to justify buying a 2080Ti.

    My 2080 runs cool. Maybe next generation I'll try it out. Or if someone wants to lend me a 2080 or 2080Ti I can compare against the internal 2080. :p
     
    FXi and TomC69 like this.
  36. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    On the Area-51m, do the PCIe lanes for the AGA still come directly off of the CPU? (Same for Thunderbolt 3).
     
  37. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    TB3 is, for now, always connected to the PCH instead.
     
  38. gthirst

    gthirst Notebook Evangelist

    Reputations:
    97
    Messages:
    342
    Likes Received:
    231
    Trophy Points:
    56
    Simple question. I know the AGA performs better when outputting to an external, but can I use both an external monitor AND the built in display?

    For productivity purposes. I'm building a bit of a workstation and plan to have a monitor with an arm via VESA mount. I can buy the same mount with dual arms with a laptop stand on one arm. That would let me put the monitor next to the laptop. Basically, it is so I can transition between sitting and standing. I wouldn't mind it for some light gaming on one display and Netflix on the other.
     
    propeldragon likes this.
  39. judal57

    judal57 Notebook Deity

    Reputations:
    274
    Messages:
    1,164
    Likes Received:
    650
    Trophy Points:
    131
    You can use both displays for productivity; for gaming you should use the external one instead.
     
    Vasudev likes this.
  40. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    In any instance where you are sending the image back over the AGA cable to the laptop, you will hurt performance. So for gaming purposes I would output to the external monitor only.
     
    Vasudev and TomC69 like this.
  41. NomisR

    NomisR Notebook Consultant

    Reputations:
    22
    Messages:
    167
    Likes Received:
    0
    Trophy Points:
    30
    So I bought my AGA not long after I got my m15, and finally picked up my 2080Ti FE. I was getting coil whine from the PSU, so I replaced it with a Corsair CX550m and also replaced the fan. Regular games run without any issue on both the old and new PSU. But when I finally decided to get a game with ray tracing, Metro Exodus, the game crashes when ray tracing is turned on but runs fine without it. The various ray tracing demos also crash when I try to run them, although they work with ray tracing off (the ones that allow it to be turned off). I am on the 1809 version of Win 10 and the 430.39 graphics driver. Anyone have any idea whether it's an AGA issue, or would I have to return the video card?
     
  42. twin snakes

    twin snakes Notebook Consultant

    Reputations:
    97
    Messages:
    175
    Likes Received:
    117
    Trophy Points:
    56
    Maybe the problem is the game itself.
    Try running 3DMark Port Royal first to see whether the RT cores are working properly.
    If it passes, then the game is the culprit.
    If not, time to RMA the card.
    The AGA unit has nothing to do with it.
     
  43. NomisR

    NomisR Notebook Consultant

    Reputations:
    22
    Messages:
    167
    Likes Received:
    0
    Trophy Points:
    30
    So I ended up downgrading to the 419.67 drivers, and ray tracing seems to run now. I did notice one thing though: when I run the benchmarks on my laptop's built-in GPU, the scores are lower but the benchmarks play really smoothly. With the 2080Ti in the AGA, it seems to stutter every so often. Not sure if I need to try various drivers until it works smoothly.
     
  44. judal57

    judal57 Notebook Deity

    Reputations:
    274
    Messages:
    1,164
    Likes Received:
    650
    Trophy Points:
    131
    Do you have the stock PSU? I had the same problem; I ended up replacing the PSU with an NZXT 650 E, and I had to mod the case a little to make it fit. The point is that a gold-rated or greater PSU can handle the wattage spikes better than the stock one.
     
    Vasudev likes this.
  45. judal57

    judal57 Notebook Deity

    Reputations:
    274
    Messages:
    1,164
    Likes Received:
    650
    Trophy Points:
    131
    @NomisR to add, I use a gtx 1080ti with my AGA.
    At full load the stock PSU screams like a baby hahaha, the coil whine was insane.
     
    Vasudev likes this.
  46. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    So you are trying to do ray tracing on a GPU that has no fixed function hardware to handle it? That may be impacting the experience.
     
  47. judal57

    judal57 Notebook Deity

    Reputations:
    274
    Messages:
    1,164
    Likes Received:
    650
    Trophy Points:
    131
    Where did I say ray tracing? o_O
     
    Vasudev likes this.
  48. cookinwitdiesel

    cookinwitdiesel Retired Bencher

    Reputations:
    4,365
    Messages:
    11,264
    Likes Received:
    263
    Trophy Points:
    501
    Sorry, I saw nomis and mixed it up.
     
  49. NomisR

    NomisR Notebook Consultant

    Reputations:
    22
    Messages:
    167
    Likes Received:
    0
    Trophy Points:
    30
    I updated the PSU to a Corsair CX550m. I finally went down to the 417.XX driver that was released in Jan; that one seems smoother but still stutters in the benchmarks, although it seems to work fine when playing. I'll stick with that one for now.
     
  50. TomC69

    TomC69 Notebook Geek

    Reputations:
    10
    Messages:
    97
    Likes Received:
    43
    Trophy Points:
    26
    I don't change nvidia drivers all that often. I wait until there is a good reason, and lots of positive feedback. I'm on 417.35 (notebook version, not that it probably makes any diff) and it's nice and stable with my AGA w/gtx1080ti (Zotac blower, basically FE).

    I have not had any problems with the stock 450w psu or the front fan. The front fan is always blowing (a bit noisy but not really that bad) and I've never heard any whining from the psu. The only whining around here is when my wife goes over my monthly paypal account activity, lol!

    One thing I found with my AGA is that my gtx1080ti got a little on the hot side at high GPU load (anything above ~75%), and I never really noticed the GPU's blower fan actually ramping up. I installed MSI Afterburner and used it to make a fan curve, and this has worked out well. My GPU's nice and cool now (usually 65-70 deg C max). I'm using a flat 40% fan up to 50 deg C, then 60%/60 deg C, 100%/100 deg C.
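    A fan curve like that is just linear interpolation between points. A quick sketch of what Afterburner does with those settings (the points are from my post above; the straight-line behavior between points is an assumption):

```python
# Sketch of the fan curve described above:
# flat 40% up to 50 C, then linear to 60% at 60 C, then linear to 100% at 100 C.

FAN_CURVE = [(50, 40), (60, 60), (100, 100)]  # (temp deg C, fan duty %)

def fan_speed(temp_c: float) -> float:
    """Fan duty cycle (%) for a given GPU temperature, interpolating linearly."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]          # flat zone below the first point
    if temp_c >= FAN_CURVE[-1][0]:
        return FAN_CURVE[-1][1]         # clamp at the last point
    for (t0, f0), (t1, f1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(45))  # 40 (flat zone)
print(fan_speed(70))  # 70.0 (between 60%/60C and 100%/100C)
```

    The practical upshot: the fan stays quiet below 50C and only gets loud as the card approaches the danger zone, which is why the GPU now tops out around 65-70C.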
     