The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    *OFFICIAL* Alienware Area-51M R2 Owner's Lounge

    Discussion in '2015+ Alienware 13 / 15 / 17' started by Spartan@HIDevolution, May 9, 2020.

  1. ratchetnclank

    ratchetnclank Notebook Deity

    Reputations:
    1,084
    Messages:
    1,506
    Likes Received:
    898
    Trophy Points:
    131
    Secure boot can be disabled.

    This is on BIOS 1.3.0

    [​IMG]
     
  2. ratchetnclank

    ratchetnclank Notebook Deity

    Reputations:
    1,084
    Messages:
    1,506
    Likes Received:
    898
    Trophy Points:
    131
I don't have the game, but it sounds like it's CPU-bound and not making use of all your cores. It's likely just maxing out 4 cores, hence the ~50% utilisation but the high temps.

Check via Task Manager and you'll see.
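The arithmetic behind that guess can be sketched quickly. Task Manager's overall figure is just the average of the per-logical-core loads, so a game that pins a few cores at 100% (and runs them hot) can still report a modest overall number. The core counts below are hypothetical examples, not readings from the game:

```python
# Overall CPU utilisation as reported by Task Manager is roughly
# (sum of per-logical-core loads) / (number of logical cores).
def overall_utilisation(per_core_loads):
    return sum(per_core_loads) / len(per_core_loads)

# e.g. a 10-core / 20-thread 10900K with 10 logical cores fully busy
# still reads as 50% overall, even though those cores are maxed out:
loads = [100] * 10 + [0] * 10   # percent per logical core
print(overall_utilisation(loads))  # 50.0
```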
     
  3. Melvin Rousseau

    Melvin Rousseau Notebook Consultant

    Reputations:
    86
    Messages:
    203
    Likes Received:
    213
    Trophy Points:
    56
    I have it so I'm going to run it for you :)


Around 150 W or so, but the game is laggy tonight and I have to wait another 20 min to get into the server... will investigate further
     

    Attached Files:

    Last edited by a moderator: Aug 12, 2020
    Kalen likes this.
  4. Calipha

    Calipha Notebook Guru

    Reputations:
    4
    Messages:
    54
    Likes Received:
    10
    Trophy Points:
    16
    Same for me, that seems low considering the max TGP this particular card has.

    Though even so I don't even get the FPS you're getting.

    Did you do any undervolting or overclocking? If so, with what program(s), how, and what values?
     
    Last edited by a moderator: Aug 12, 2020
  5. Melvin Rousseau

    Melvin Rousseau Notebook Consultant

    Reputations:
    86
    Messages:
    203
    Likes Received:
    213
    Trophy Points:
    56
I just added 180 MHz on the core clock and 300 MHz on the memory clock.
Since I've repasted and changed the thermal pads, the thermals are pretty okay, so I have some room to overclock.

The CPU is undervolted -130 mV with XTU.
     
  6. Calipha

    Calipha Notebook Guru

    Reputations:
    4
    Messages:
    54
    Likes Received:
    10
    Trophy Points:
    16
    Ohhh- I see.

Would said undervolting & overclocking be possible with stock thermals? I'm not very familiar with XTU; how were you able to accomplish that?

    Did you add the GPU OC via AWCC or MSI Afterburner?
     
  7. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,916
    Messages:
    3,530
    Likes Received:
    3,482
    Trophy Points:
    331
  8. Melvin Rousseau

    Melvin Rousseau Notebook Consultant

    Reputations:
    86
    Messages:
    203
    Likes Received:
    213
    Trophy Points:
    56
Well, you can always slightly undervolt the GPU with MSI Afterburner using the voltage/frequency curve editor (Ctrl+F). There are a few tricks needed to enable the voltage monitoring and so on.
It's difficult to say, because by decreasing the voltage you have less headroom to increase the GPU clock, but you get better thermals and therefore a more stable clock.

Regarding the CPU: if everything is stock, it will be almost impossible to increase the clocks over the stock ones, as the 10900K runs quite warm. In order to keep good thermals I had to delid the CPU, change the paste as well as the pads, and undervolt, and even then I cannot overclock to 5.0 GHz on all cores without throttling in Cinebench R20. It was pretty okay while playing, but I didn't see any benefit from gaining 100 MHz on all cores.

If you haven't used XTU yet, I recommend you first go into Command Center and turn overclocking off if that's not done already.

No need to change the core voltage; you can leave it at default, as it requires a bit of knowledge to start tweaking it.

You should start by setting a core voltage offset of -80 mV, which applies to both core and cache, so no need to do both like with ThrottleStop.

Then run Cinebench R20, check the thermals and the stability, game a little bit, and if there's no freezing you're good to move in increments of -10 mV, doing all the tests again each time.
It is worth noting that from -100 mV I would only add increments of -5 mV, as each CPU is different, so you just wanna find the sweet spot :)

If it crashes, just roll back to your last offset; then you can save the profile so you can re-apply it.
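The stepping schedule described above can be sketched as a small search loop. Note `is_stable` is purely a hypothetical placeholder here, it stands in for the manual Cinebench R20 / gaming stability test, which is not something a script can do for you:

```python
# Sketch of the undervolt search described above: start at -80 mV,
# step by -10 mV until -100 mV, then by -5 mV, rolling back to the
# last stable offset on the first failure.
def next_offset(offset_mv):
    """Step the core-voltage offset per the schedule above."""
    step = 10 if offset_mv > -100 else 5
    return offset_mv - step

def find_sweet_spot(is_stable, start_mv=-80, floor_mv=-150):
    offset = start_mv
    last_good = None
    while offset >= floor_mv:
        if not is_stable(offset):
            return last_good  # roll back to the last stable offset
        last_good = offset
        offset = next_offset(offset)
    return last_good

# Example: suppose (hypothetically) the chip is stable down to -105 mV:
print(find_sweet_spot(lambda mv: mv >= -105))  # -105
```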
     
    aeffect and normand668 like this.
  9. Calipha

    Calipha Notebook Guru

    Reputations:
    4
    Messages:
    54
    Likes Received:
    10
    Trophy Points:
    16
    Alright, thanks.

    I'll try this out.
     
  10. Melvin Rousseau

    Melvin Rousseau Notebook Consultant

    Reputations:
    86
    Messages:
    203
    Likes Received:
    213
    Trophy Points:
    56
Has anyone else changed the thermal pads?
I just wanna compare the measurements of the GPU thermal pads, as I am not sure I am using the right ones.
Here are the measurements I took.
Sorry for the bad drawing lol
     

    Attached Files:

    Satanello and normand668 like this.
  11. normand668

    normand668 Notebook Geek

    Reputations:
    82
    Messages:
    80
    Likes Received:
    112
    Trophy Points:
    41
Thanks for the info mate. Same here: if anyone has any confirmation of this and recommendations for pad brand(s), that would be really appreciated. I'm over the moon already with the stock thermals, but will no doubt repaste and repad the heatsink if anyone can confirm the correct dimensions, and even better, provide any difference in temps.
     
  12. pathfindercod

    pathfindercod Notebook Virtuoso

    Reputations:
    1,940
    Messages:
    2,343
    Likes Received:
    2,345
    Trophy Points:
    181
    Can’t wait to order a 3080 for the r2. Oh wait might have to wait for the r3 next year..lol jk guys
     
  13. Melvin Rousseau

    Melvin Rousseau Notebook Consultant

    Reputations:
    86
    Messages:
    203
    Likes Received:
    213
    Trophy Points:
    56
I can confirm those measurements are almost 100% accurate.

I had 69 degrees with the stock pads in the Superposition benchmark; now I'm hitting 67 max with an ambient temp of 24 degrees.
Though these pads are Gelid GC Extreme, and I'm looking to upgrade to Fujipoly, which allows no mistakes at all, so I just gotta try first to confirm it would be okay with Fujipoly, as they are much harder.

Same with the CPU. I've ordered a spare heatsink from a reseller on eBay to have the stock pads. I tried yesterday: 95 degrees CPU in Cinebench R20 with the stock pad, and with Gelid GC Extreme I get 85 degrees max.

Will be posting tonight a neat picture with all the measurements I have with the Gelid pads.
     
    normand668 and Arkengelus like this.
  14. ratchetnclank

    ratchetnclank Notebook Deity

    Reputations:
    1,084
    Messages:
    1,506
    Likes Received:
    898
    Trophy Points:
    131
I get you're joking, but I'm pretty sure everyone who's bought one here went in knowing there won't be an upgrade to the 3080 and bought the system for its performance as-is.
     
  15. Hypertora

    Hypertora Notebook Enthusiast

    Reputations:
    8
    Messages:
    10
    Likes Received:
    19
    Trophy Points:
    6
    I'm yet to do any delidding/repasting (Too hot in the UK to want to do this at the moment). I have got my 64GB Kingston Hyper X (2x32GB) installed and all seems to be working fine on that front.
    [​IMG]

The main issue I am running into at the moment is with my NVMe replacements. My FireCuda drives in the main 2 slots are fine and in RAID 0. But my 4TB Sabrent just won't work in slot 3 without a Kernel Security Check Failure BSOD, while it works perfectly fine if I move it to slot 4! This would be less of an issue if I didn't have 2 of these drives to install...
     
  16. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,916
    Messages:
    3,530
    Likes Received:
    3,482
    Trophy Points:
    331
    As we know from earlier comments, some bought it believing Dell is providing reasonably lasting value here, since the next iteration is a year away. Alas, consumer Ampere will debut a month ahead of expectations on 1st Sep, and consequently the new 51M might well be available in time for Xmas.
     
    Last edited: Aug 12, 2020
    jc_denton likes this.
  17. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,567
    Messages:
    23,559
    Likes Received:
    36,825
    Trophy Points:
    931
    Small improvement in SSD scores after the recent IRST update from Dell:

    AS SSD Benchmark - 2x Western Digital PC SN730 NVMe SSDs in RAID 0 [64K Data Strip Size]

    [​IMG]

    AS SSD Benchmark - Single Western Digital PC SN730 NVMe SSD

    [​IMG]
     
    normand668 and jc_denton like this.
  18. pathfindercod

    pathfindercod Notebook Virtuoso

    Reputations:
    1,940
    Messages:
    2,343
    Likes Received:
    2,345
    Trophy Points:
    181
    Yeah it was a joke
     
  19. Cerreta28

    Cerreta28 Notebook Evangelist

    Reputations:
    74
    Messages:
    635
    Likes Received:
    311
    Trophy Points:
    76
Does the new 51m R2 have the F7 option where it would switch to the iGPU? I ordered one with the 144Hz panel and no G-Sync, and wanted to make sure I can do that.


    Sent from my iPhone using Tapatalk
     
  20. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,567
    Messages:
    23,559
    Likes Received:
    36,825
    Trophy Points:
    931
    Nope
     
  21. Cerreta28

    Cerreta28 Notebook Evangelist

    Reputations:
    74
    Messages:
    635
    Likes Received:
    311
    Trophy Points:
    76
Shoot, I would have gotten the 300Hz panel lol. Can the amplifier go back to the laptop display?


    Sent from my iPhone using Tapatalk
     
  22. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,567
    Messages:
    23,559
    Likes Received:
    36,825
    Trophy Points:
    931
    Yes, the amplifier can work with the internal laptop display.

    Why would you need the Intel GPU anyway on such a laptop?

The HDMI port is HDMI 2.0, which can support 4K 60Hz. link The display port is DisplayPort 1.4, which supports 4K 120Hz or 8K 60Hz. link


    https://www.dell.com/support/manual...940a51-116a-48c1-9082-9489b6182e0a&lang=en-us


    The thunderbolt port is connected to the Intel GPU though
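As a rough sanity check on those numbers: the uncompressed data rate is just pixels × refresh × bits per pixel, to be compared against each link's effective payload rate. This ignores blanking overhead and assumes 24-bit colour; the payload figures (HDMI 2.0 ≈ 14.4 Gbit/s after 8b/10b coding, DP 1.4 HBR3 ≈ 25.92 Gbit/s) are approximate:

```python
# Back-of-envelope link-bandwidth check (no blanking overhead, 24-bit colour).
def data_rate_gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9

modes = {"4K60": (3840, 2160, 60),
         "4K120": (3840, 2160, 120),
         "8K60": (7680, 4320, 60)}
for name, (w, h, hz) in modes.items():
    print(name, round(data_rate_gbps(w, h, hz), 1), "Gbit/s")
# 4K60 ≈ 11.9 Gbit/s fits HDMI 2.0; 4K120 ≈ 23.9 Gbit/s fits DP 1.4;
# 8K60 ≈ 47.8 Gbit/s exceeds HBR3, hence DSC compression on DP 1.4.
```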
     
    Last edited: Aug 13, 2020
    jc_denton likes this.
  23. Cerreta28

    Cerreta28 Notebook Evangelist

    Reputations:
    74
    Messages:
    635
    Likes Received:
    311
    Trophy Points:
    76
    Wow a lot has changed huh


    Sent from my iPhone using Tapatalk
     
  24. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,835
    Likes Received:
    59,597
    Trophy Points:
    931
We won't see Ampere for notebooks this year. Maybe mid-spring 2021, but not before that time frame.
     
  25. jc_denton

    jc_denton BGA? What a shame.

    Reputations:
    10,923
    Messages:
    3,036
    Likes Received:
    5,781
    Trophy Points:
    581
    They got to dump their 20x0Super stock first. :D
     
  26. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,691
    Messages:
    29,835
    Likes Received:
    59,597
    Trophy Points:
    931
Tech sites are already talking about Super Ampere for desktops next year. I expect Nvidia has started binning chips for those, and standard chips for notebooks, for next year. They have to reduce power draw for notebook Ampere graphics, aka they have to bin chips. This takes some time. But what is better than pushing Turing for a few more months while AMD has nothing to offer notebooks? Milk it while you can. The Ampere cards will be available in limited numbers before Christmas, and desktops will have first priority due to the coming Big Navi.

See... Nvidia gets a double gain: push out old chips for notebooks and enough new chips for desktops to stop AMD.
     
    Last edited: Aug 12, 2020
  27. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,916
    Messages:
    3,530
    Likes Received:
    3,482
    Trophy Points:
    331
Well, the RTX 2080 was revealed at the end of Sep 2018, and the 51M was introduced in Jan 2019, and that was a completely new model. We can safely assume NVIDIA has already been manufacturing Ampere and dumping old RTX stock; they are not stupid.
     
  28. Melvin Rousseau

    Melvin Rousseau Notebook Consultant

    Reputations:
    86
    Messages:
    203
    Likes Received:
    213
    Trophy Points:
    56

The amplifier can also work with the laptop display; I've tried it with my RTX 2080 Ti.
     
  29. Terreos

    Terreos Royal Guard

    Reputations:
    1,165
    Messages:
    1,846
    Likes Received:
    2,258
    Trophy Points:
    181
Your logic is so sound it hurts. Stop it. :(

I honestly wonder if AMD will ever be competitive in the GPU market. Nvidia isn't like Intel; they don't just sit on what they have until the competition catches up.
     
    Biker Gremling likes this.
  30. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,567
    Messages:
    23,559
    Likes Received:
    36,825
    Trophy Points:
    931
    @etern4l Weren't you who told me that the graphics amp is only to power an external monitor bro? Or was it @ssj92
     
    etern4l likes this.
  31. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,916
    Messages:
    3,530
    Likes Received:
    3,482
    Trophy Points:
    331
    I don't recall saying that, but obviously the amp works much better with an external display, because that way it doesn't lose bandwidth on pushing data back to the laptop. It can make a huge difference, especially at high FPS.
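The cost of driving the internal panel can be estimated. The AGA link is commonly reported as PCIe 3.0 x4; taking its roughly 3.94 GB/s payload rate as an assumption, every frame rendered for the laptop screen has to be copied back over that same link, eating into the bandwidth the GPU uses for everything else:

```python
# Rough estimate of the frame-readback traffic when the AGA drives the
# internal display (assumption: PCIe 3.0 x4 link, ~3.94 GB/s payload).
def readback_gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp / 8 / 1e9  # GB/s of rendered frames copied back

pcie3_x4 = 3.94  # GB/s, approximate payload rate
rate = readback_gbps(1920, 1080, 144)
print(round(rate, 2), "GB/s readback,", round(100 * rate / pcie3_x4), "% of the link")
```

At higher resolutions or frame rates the fraction grows quickly, which matches the observation that the penalty is worst at high FPS.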
     
  32. Melvin Rousseau

    Melvin Rousseau Notebook Consultant

    Reputations:
    86
    Messages:
    203
    Likes Received:
    213
    Trophy Points:
    56
That's why I don't bother using my 2080 Ti on the laptop, because with the bandwidth loss it's almost the same as the 2080 Super.
Though there is almost no loss if plugged into an external monitor.
I do believe that with Ampere, if there is really a 30-40% performance bump, it will be worth using on the laptop.
     
    Spartan@HIDevolution likes this.
  33. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,916
    Messages:
    3,530
    Likes Received:
    3,482
    Trophy Points:
    331
I don't bother using laptop screens for anything other than mobile use or as a secondary screen, due to the limited screen real estate, but yeah - using the AGA with the internal display is a waste.
     
  34. Melvin Rousseau

    Melvin Rousseau Notebook Consultant

    Reputations:
    86
    Messages:
    203
    Likes Received:
    213
    Trophy Points:
    56
Hey guys,
After changing the thermal pads of the GPU to Fujipoly ones, I have coil whine coming from it.
All the measurements I took are accurate and the GPU doesn't go over 69 degrees after a stress test, so I'm a bit confused.
Any idea what to do?
     
    etern4l likes this.
  35. ratchetnclank

    ratchetnclank Notebook Deity

    Reputations:
    1,084
    Messages:
    1,506
    Likes Received:
    898
    Trophy Points:
    131
    I assume the fujipoly don't dampen the sound as well as the dell ones did. There isn't much you can do about coil whine though.
     
    Biker Gremling likes this.
  36. Melvin Rousseau

    Melvin Rousseau Notebook Consultant

    Reputations:
    86
    Messages:
    203
    Likes Received:
    213
    Trophy Points:
    56
I'm just changing the one on top of the VRM back to Gelid GC Extreme, which was working fine before, because the noise is unbearable lol
     
    ratchetnclank likes this.
  37. alaskajoel

    alaskajoel Notebook Deity

    Reputations:
    1,088
    Messages:
    1,031
    Likes Received:
    964
    Trophy Points:
    131
    Don't expect a great experience doing this on the 51m. There is no direct display output connection between the AGA and internal display and it does not support Optimus so an image drawn on the AGA can only be output on the internal display via DXGI.
     
    Papusan, raz8020 and etern4l like this.
  38. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,567
    Messages:
    23,559
    Likes Received:
    36,825
    Trophy Points:
    931
    Last edited: Aug 13, 2020
  39. Calipha

    Calipha Notebook Guru

    Reputations:
    4
    Messages:
    54
    Likes Received:
    10
    Trophy Points:
    16
    Spartan@HIDevolution likes this.
  40. Melvin Rousseau

    Melvin Rousseau Notebook Consultant

    Reputations:
    86
    Messages:
    203
    Likes Received:
    213
    Trophy Points:
    56
  41. Spartan@HIDevolution

    Spartan@HIDevolution Company Representative

    Reputations:
    39,567
    Messages:
    23,559
    Likes Received:
    36,825
    Trophy Points:
    931
    Nope
     
    etern4l likes this.
  42. G46VW

    G46VW Notebook Consultant

    Reputations:
    142
    Messages:
    286
    Likes Received:
    217
    Trophy Points:
    56
That's not low at all. Every map requires different amounts of power from the GPU depending on how it's optimized to run. For me, as in BF5, some maps will pull the full wattage of my 2080, where the FPS will be almost maxed out at 200 fps, and some maps, like Underground, will not do that, as they are optimized differently and will not pull the full amount of wattage from the card. We're talking competitive multiplayer here, not single player.
     
    Last edited: Aug 13, 2020
  43. AMoo-Miki

    AMoo-Miki Notebook Enthusiast

    Reputations:
    10
    Messages:
    18
    Likes Received:
    19
    Trophy Points:
    6
    Can anyone think of a reason why G.SKILL Ripjaws DDR4 3000 CL16 wouldn't work on this? Being substantially cheaper and moderately "faster" than the HyperX Impact DDR4 2666 CL15, I am pretty much drawn to it.
     
  44. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,916
    Messages:
    3,530
    Likes Received:
    3,482
    Trophy Points:
    331
Sure, they could be incompatible with the fussy mobo. Actually, many pages back someone reported the Ripjaws weren't stable. BTW, not sure why you are talking about 3000 and 2666 rather than 3200 and 2933.
     
  45. mrboh

    mrboh Notebook Enthusiast

    Reputations:
    7
    Messages:
    27
    Likes Received:
    12
    Trophy Points:
    6
    You can always buy some and experiment for the rest of us ^^

    Sent from my Pixel 2 XL using Tapatalk
     
  46. AMoo-Miki

    AMoo-Miki Notebook Enthusiast

    Reputations:
    10
    Messages:
    18
    Likes Received:
    19
    Trophy Points:
    6
    I made an Excel sheet of all these speeds and latency values where the G.SKILL Ripjaws 3000MHz CL16 (10.7ns) stood way above:
    • HyperX Impact 3200MHz CL20 (12.5ns)
    • HyperX Impact 2933MHz CL17 (11.6ns)
    • HyperX Impact 2666MHz CL15 (11.25ns)
    Among the HyperX Impact ones, I feel the practical speed of 2933MHz CL17 to be the better of the 3.
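The nanosecond figures in that list follow from the standard first-word-latency formula: 2000 × CAS latency ÷ data rate in MT/s (the factor of 2000 converts the double-data-rate clock to nanoseconds):

```python
# First-word latency in ns = 2000 * CL / data rate (MT/s),
# reproducing the figures in the list above.
def latency_ns(cl, mts):
    return 2000 * cl / mts

print(round(latency_ns(16, 3000), 2))  # 10.67 - G.SKILL Ripjaws 3000 CL16
print(round(latency_ns(20, 3200), 2))  # 12.5  - HyperX Impact 3200 CL20
print(round(latency_ns(17, 2933), 2))  # 11.59 - HyperX Impact 2933 CL17
print(round(latency_ns(15, 2666), 2))  # 11.25 - HyperX Impact 2666 CL15
```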

    ... only if I had the machine :(
     
    raz8020 and etern4l like this.
  47. etern4l

    etern4l Notebook Virtuoso

    Reputations:
    2,916
    Messages:
    3,530
    Likes Received:
    3,482
    Trophy Points:
    331
    Yeah, looks like it. The 3200 runs at CL17 if downclocked to 2933, but it will probably be safer to just go with 2933 CL17.
     
    raz8020 likes this.
  48. Melvin Rousseau

    Melvin Rousseau Notebook Consultant

    Reputations:
    86
    Messages:
    203
    Likes Received:
    213
    Trophy Points:
    56
I called Dell about the coil whine of my graphics card, and the tech guy was very nice. They organised a pickup by UPS since I'm still within the 14 days; well, today is the 14th, so very lucky :)
The most incredible thing is that he right away proposed to create a new order for me free of charge, which has been accepted already and will be shipped within 2 to 3 working days.

I told the tech guy I replaced the thermal pads, and he said that has no impact whatsoever and it is not supposed to make that much noise.

Now the only thing is: should I let them pick it up with the new thermal pads, or should I put back the original ones that I kept from the spare heatsink I have?

When I send it back, are they going to take it apart and blame me for changing the pads?
     
    etern4l likes this.
  49. tmarshallg

    tmarshallg Notebook Enthusiast

    Reputations:
    0
    Messages:
    32
    Likes Received:
    10
    Trophy Points:
    16
    I would put it back to original as best as possible just to be safe.
     
    c69k and Melvin Rousseau like this.
  50. Melvin Rousseau

    Melvin Rousseau Notebook Consultant

    Reputations:
    86
    Messages:
    203
    Likes Received:
    213
    Trophy Points:
    56
    I was thinking so too lol
     