Thermals are pretty good. All my posts from a number of pages back are from a completely stock machine; I've not repasted.
-
ratchetnclank Notebook Deity
-
During the last 2 months, whenever I read about anyone buying a brand new R1 or R2 I get a bit of a question mark in my head. Luckily my 'hard-earned money' LOL is not being spent.
If I were to buy a laptop for gaming/video production on Nvidia I would definitely wait for the RTX 3080. I have a feeling that the new video cards will be a HUGE step up in performance. I also believe that Frank Azor will use his magic with ex-colleagues from Alienware and make an AMD processor happen in the R3 as an option, seeing that Intel is not getting to 7nm soon.
For me, the 1060M still kicks at 1080p, and if I want a built-in 120Hz+ OLED the wait might be long. The first laptop maker who pulls off high-Hz OLED will collect cash from all over the globe, def. from me too.
I would definitely wait, or buy a 2nd-hand R1 till the R3 comes out, if I had no gaming laptop at the moment. -
ratchetnclank Notebook Deity
I was thinking about waiting for the RTX 30xx series, but in the end I couldn't be bothered to wait any longer for what will likely be a 20-25% increase in performance across the board over the R2. Possibly slightly higher with ray tracing enabled.
I'd been with my aging GTX 680M for 8 years, so anything was a huge upgrade from that. Plus with the R2 I kinda knew the cooling would be pretty decent, as the R1 wasn't bad and they added a vapour chamber and more airflow above the keyboard. -
How exactly did you arrive at the modest 20-25% performance boost for Ampere? It's more likely to be in the region of 40% over the 2080 Super, and similar if not larger increases are expected in the ray-tracing department.
https://www.techradar.com/uk/news/n...he-rtx-2080-ti-according-to-leaked-benchmarks -
KINGPIN is misspelled for starters, so is that a genuine leak, or just some random kid in the wccftech message cesspool?
-
Melvin Rousseau Notebook Consultant
Mine is waiting for me at home.
I'll be doing an unboxing and review on Saturday; most likely I'll be the first one -
Can't wait to watch it! Still waiting for mine to ship!
-
Just received my A-51M R2 yesterday and these are my scores so far; I think I can still squeeze more out of it.
https://www.3dmark.com/3dm/49149879? -
You may want to bookmark this thread - http://forum.notebookreview.com/threads/alienware-area-51m-r2-review-by-spartan.833709/
It is a stub for now, but I imagine will be filled out within the upcoming week(s).
-
Hello, I'm new here. Thought I'd join since I just joined the club and splurged on an i9 10900K, 2080S, with the 4K screen.
This is a massive upgrade for me, coming from a 5-year-old XPS 13. Anyone got any day-one recommendations? Things I should do - I read about adjusting the way the battery is handled to preserve its longevity; any other tips I should know? Undervolting, I read here, and GPU adjustments (quite new to this side of things). Hoping to learn.
I have a few questions,
In regards to RAM, is the HyperX Impact DDR4 2933MHz CL17 regarded as the ultimate for this laptop? Or is there a 3200MHz XMP set with low latency that someone can recommend? Ultimately I just want it for performance. Bought it as a desktop replacement; it will be my main machine for gaming, photography, video editing etc.
Now, I ordered the base 256GB SSD version, since Dell literally tries to ruin you with the extortionate prices of their mediocre storage.
I'm sat here wondering what I'll be getting - I'm assuming another M.2 NVMe slot; is this PCIe 3.0 also (not limited in any way)? Also, will I get an empty SATA bay for a 2.5" drive, since I won't be getting the daughter board for the additional M.2 slots, having ordered the base version?
Speaking of which, I assume the motherboard doesn't support PCIe 4.0 in any way - is this correct? So Sabrent is out of the question. Thought I'd check, because sometimes companies do the whole "we don't support it" but then you try it and it turns out it works. I'm sure there are other questions but I'll try not to bombard for now. Thanks, and hope those who have received theirs are enjoying it. Mine's due to arrive here in Cornwall, UK on the 11th of August. -
ratchetnclank Notebook Deity
Because every year the leaks say a 50% increase over the previous generation, and it always actually ends up within 20-30% at most.
That's a really good score. Much better than mine. Although I think updating to the latest Nvidia driver might have something to do with that for me.
Last edited: Jul 30, 2020 -
Dell confirmed that it would replace my R1 with an R2 with:
i9 10900K, RTX 2080 Super, 300Hz screen, 16GB 3200MHz
All this without paying anything!
Last edited: Jul 30, 2020 -
ratchetnclank Notebook Deity
I'd take it just for the no thermal throttling.
Also, I just realised why my Fire Strike score was lower. I'd left the GeForce Experience instant replay enabled. (Doh!)
https://www.3dmark.com/3dm/49152132?
A bit better now. -
ratchetnclank Notebook Deity
Set the battery to start charging at 50% and stop at 80% from the BIOS. Undervolt, and overclock the GPU if you want to.
With regards to memory, the HyperX you listed will definitely work. We don't know if aftermarket 3200 sticks will work yet, but it's likely they won't at that speed.
Storage: you will get two M.2 PCIe 3.0 slots and the SATA 2.5" bay. You'll only get the additional M.2 slots (on the daughter board) if you ordered at least a 3-SSD configuration.
Intel boards don't support PCIe 4.0 at all yet.
Hope you enjoy it when it arrives
-
What do you mean by "always actually"? Well, let's go by the most recent generational shift: GTX 1080 vs RTX 2080.
Just looking at Fire Strike Graphics, an old DX11 benchmark, the RTX 2080 is 36% faster than the GTX 1080. The median Time Spy Graphics score is 48% higher for the RTX.
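(Those percentages are just score ratios, by the way. A quick sketch of the arithmetic, with illustrative scores rather than the exact medians:)

```python
def uplift_pct(new_score: float, old_score: float) -> float:
    """Relative performance uplift of new_score over old_score, in percent."""
    return (new_score / old_score - 1.0) * 100.0

# Illustrative numbers only (not the exact median benchmark scores):
print(round(uplift_pct(27200, 20000)))  # -> 36, i.e. "36% faster"
```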
GTX doesn't even have new features such as ray tracing or DLSS...
https://www.notebookcheck.net/Nvidi...sktop-Review-Pascal-has-arrived.165500.0.html
https://www.notebookcheck.net/NVIDI...phics-Card-Benchmarks-and-Specs.377901.0.html
https://benchmarks.ul.com/hardware/gpu/NVIDIA+GeForce+GTX+1080+review -
Just gotta supercharge that mobile GTX1080
-
ratchetnclank Notebook Deity
https://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html?
Looking at the Fire Strike Graphics averages between the GTX 1080 and RTX 2080 mobile, the difference is an 18% increase.
I'm not saying it can't be 50% better, but it never normally is, and the cases where it might be are usually cherry-picked scenarios. -
Not clear why you are looking at some ancient deprecated benchmarks like Ice Storm and Cloud Gate.
https://benchmarks.ul.com/news/anno...rk-cloud-gate-and-3dmark-ice-storm-benchmarks
DX12 Time Spy is the more relevant benchmark, and here RTX shows a whopping +45% performance improvement. Yeah, if you are going to play old games (as reflected by old benchmarks including DX11 Fire Strike, which is "just" 20% faster on the RTX 2080), you will be fine with old-gen graphics.
On the other hand, the case for upgrading from GTX to RTX was super clear due to the new RTX-only features such as ray tracing or DLSS. Not sure if any such Ampere gaming features are in the pipeline; there is new stuff on the computational side.
Last edited: Jul 30, 2020 -
https://www.3dmark.com/spy/13229800 nice job @therock2284
What temps are you getting on the CPU? Stock voltage? -
Spartan@HIDevolution Company Representative
Awesome! I missed the story though - how are they gonna do that for you? Did you have an issue with a recently purchased R1, or what? -
BTW, remember Ampere is likely fabricated at 7 or 8nm vs Turing's 12nm. That's huge. A better fabrication process is the main reason behind AMD's ability to leapfrog ahead of Intel in the desktop and enterprise space (to put it mildly - it's total destruction).
-
The Time Spy benchmark is rendered at a higher resolution. The reason you gain better performance in this bench with the 20xx is the faster VRAM. At 1080P it will be another matter.
And the mobile 3080 won't see the same performance uplift as the Ampere desktop cards. Mobile will still be crippled at the same low TGP, while the Ampere desktop cards will get higher TGP vs Turing. All the talk about a 50% performance increase for the mobile card will only be a wet dream.
As a reminder of the low Combined scores with 2933/3200MHz RAM in the R2:
A 2080 Super desktop card with a lower total and lower graphics score still provides a much better combined score. The faster RAM in the R2 is a performance downgrade vs the EOL 51M running 2133MHz sticks.
Both setups are with 3200MHz RAM.
https://www.3dmark.com/3dm/49149879?
https://www.3dmark.com/fs/22897416
Edit: see also the 3DM FS combined score from the thin Alienware m15 R2 with soldered RAM. Not the way it should be.
https://www.3dmark.com/fs/21012097
Last edited: Jul 30, 2020 -
Yes, good point about the higher Time Spy resolution, but perhaps the more important difference is that Time Spy is a proper DX12 benchmark. Check out the "DirectX 12" chapter, which clarifies the fundamental difference:
https://s3.amazonaws.com/download-aws.futuremark.com/3dmark-technical-guide.pdf
Basically FireStrike is far more likely to be limited by CPU speed, due to the use of a higher-level (CPU-heavy) API, whereas TimeSpy is more of a pure and massively larger/more complex GPU load. I am not sure if TimeSpy is that sensitive to VRAM performance - a 10% VRAM overclock does nothing for my TimeSpy score.
Either way, 3K is a very reasonable test resolution these days, and obviously the load on the GPU due to DX12 is much heavier and more representative of modern DX12 games (although I am curious if there is a good ray-tracing benchmark).
As regards Ampere, I understand you are referring to the rumours that desktop Ampere cards will run at higher TDP. Well, we can safely assume that this benefit won't be available in Alienware laptops (and most other brands, to be fair).
However, I think most performance improvement in Ampere GPUs will come from this move to 7-8nm, and this will obviously translate to potentially massive improvements in mobile graphics performance (although, as per the Nvidia thread, you are right in assuming that the gap to desktop will widen further, since this extra power won't be available in most laptops).
For completeness' sake, Ampere performance benefits can also result from internal design improvements and perhaps faster/better Video RAM. I haven't read too deeply about this aspect TBH, but both of these two avenues of improvement would benefit mobile platforms.
One thing is pretty much certain: it won't be another "Super" scam, it will be a truly exciting generational upgrade. We'll find out more very soon indeed
Last edited: Jul 30, 2020 -
There are still many who will play games in DX11. And not many newer gaming laptops have a 3K panel - it's either 1080P or 4K. See... gaming at 1080P will continue. Aka you won't get the performance uplift you think you'll get with Ampere (aside from RT, of course, which the coming cards will benefit).
My words stand on their own. Forget a 50% raw performance uplift with the coming 3080 Mobile. -
I hear you bro. I tend to play a DirectX 9 game in 1280x1024 lol
We'll see. 50% sounds overly optimistic, but the 30-40% people talk about could be in reach. That's 30-40% more performance at the same price, and the usual rapid devaluation of the previous gen... Tech progress can be brutal.
Last edited: Jul 30, 2020 -
If Nvidia wants to charge an arm and a leg for 3080 silicon, then expect the same for the power-crippled mobile top dog coming for notebooks!
https://www.tomshardware.com/uk/news/nvidia-rtx-3080-ampere-launch-by-september
" There's potentially bad news as well, however. While the RTX 2080 Ti and 2080 Super are becoming less common, RTX 2070 Super and lower are still reasonably available. My thought right now is that the RTX 30-series Ampere GPUs will start at the extreme end of the performance and pricing spectrum, which could mean prices of $1,000 or more at launch. Maybe I'm being pessimistic, but if RTX 3090 (or whatever the top part is called) ends up pummeling the RTX 2080 Ti the way I expect it to, it could easily go for $1,500 or more. RTX 3080 (3080 Ti, 3080 Super, 3080 whatever) could then take over the $1,000 mark, and RTX 3070 would fill the $700 slot. Yuck."
-
Well, that's pure pricing speculation, although not completely unfounded - given AMD's weakness in the graphics space, Nvidia could well decide it's time to exploit their position. But then, if they were realistically to charge huge amounts of money for Ampere GPUs, they would have to be a huge improvement over Turing.
-
More RT cores means bigger-than-needed silicon. And these added cores don't mean a huge performance uplift if that's not what you'll use - benching or gaming without ray tracing support (some prefer raw performance vs good looks).
-
My understanding is that, unlike CPU software in the context of gaming, GPU programs naturally and easily parallelise across all the extra cores, so no worries about them being unused, assuming reasonably ambitious quality settings.
Last edited: Jul 30, 2020
-
ratchetnclank Notebook Deity
I believe RT cores can't do the general compute work of the other graphics cores; at least from my understanding of it, they are a bit more specialised.
https://www.hardwarezone.com.sg/fea...-,RT cores,comprises of two specialized units.
I'm wondering if these 50% improvement stats come from "with DLSS enabled", with them adding more tensor cores onto the die for Ampere. -
Yeah, good point. The RT cores look quite specialised; maybe they could be used for some graph-related algos outside of graphics, but I don't immediately see it, and Nvidia doesn't seem to advertise it.
The good news is that they don't seem to take up that much die space.
Sorry, which 50% improvement stats exactly? Again, the step down in fab process from 12 to 7-8nm alone will automatically provide a substantial performance improvement, that much should be clear.
ratchetnclank Notebook Deity
Sorry, where I said stats I meant rumour. The fab process gains should help increase performance per watt over Turing; it also depends on whether the die size shrinks or increases too, I guess. It's all speculation at the moment anyway, and I guess we will see in a month or so when they announce them.
Apparently there is also a new memory type. From A100 specs:
HBM2: With 40 gigabytes (GB) of high-bandwidth memory (HBM2), A100 delivers improved raw bandwidth of 1.6TB/sec, as well as higher dynamic random-access memory (DRAM) utilization efficiency at 95 percent. A100 delivers 1.7X higher memory bandwidth over the previous generation.
Noice. -
ratchetnclank Notebook Deity
The A100 is the ampere successor the to Volta V100 which is a datacenter gpu which also used HBM memory though?
The RTX series of cards will likely use GDDR6 as per previous generations. -
I stand duly corrected. Moreover the improvement from 900GB/s to 1.7TB/s is due to additional channels, not a seismic breakthrough in memory technology. That said, according to leaks NV will upgrade both VRAM speed and capacity in consumer models.
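Quick sanity check on the "additional channels" point - aggregate HBM2 bandwidth is just stacks x per-stack bus width x per-pin data rate. The stack counts and pin rates below are the commonly quoted figures, so treat them as assumptions:

```python
def hbm2_bandwidth_gb_s(stacks: int, pin_rate_gbps: float, bus_bits: int = 1024) -> float:
    """Aggregate bandwidth in GB/s: stacks x 1024-bit bus x per-pin Gbps / 8 bits per byte."""
    return stacks * bus_bits * pin_rate_gbps / 8

# V100: 4 HBM2 stacks at ~1.75 Gbps/pin
print(hbm2_bandwidth_gb_s(4, 1.75))  # -> 896.0 (the ~900 GB/s figure)
# A100 40GB: 5 active stacks at ~2.43 Gbps/pin
print(hbm2_bandwidth_gb_s(5, 2.43))  # -> 1555.2 (the ~1.6 TB/s figure)
```

So the jump comes almost entirely from one extra stack plus a faster pin rate, not a new memory type.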
https://www.tomshardware.com/news/nvidia-rtx-3080-ampere-all-we-know -
Exciting laptop, I presume zero overheating issues, zero fires, no discolouration, and rock stable so far? If yes, my only gripe would be the 60Hz@4K...
-
Spartan@HIDevolution Company Representative
-
Great news! From what I can tell from the results below, the Crucial Ballistix RAM works at the full 3200MHz @ CL16, vs the stock Kingston 3200MHz @ CL22. Tested repeatedly before and after, and found roughly a 750-1000 point improvement in my FS scores with the same overclock.
https://www.amazon.co.uk/gp/product/B083W5ZRJ1/
https://www.3dmark.com/fs/23212789
Overall the laptop is cooler than I hoped and the screen is gorgeous. Certainly a massive upgrade from my AW17 R5 with its ridiculous temps from the 8950HK.
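For anyone comparing kits: CL on its own is misleading, what matters is the absolute CAS latency in nanoseconds (CL cycles divided by the real clock, which is half the MT/s rate). A quick sketch of the standard formula:

```python
def cas_latency_ns(cl: int, transfer_rate_mt_s: int) -> float:
    """True CAS latency in ns. DDR makes two transfers per clock,
    so real clock (MHz) = MT/s / 2, and ns = CL / MHz * 1000 = CL * 2000 / MT/s."""
    return cl * 2000 / transfer_rate_mt_s

print(cas_latency_ns(16, 3200))  # Ballistix CL16 @ 3200 -> 10.0 ns
print(cas_latency_ns(22, 3200))  # stock CL22 @ 3200     -> 13.75 ns
```

So at the same 3200MT/s, the CL16 kit is roughly 27% quicker on first-word latency than the stock CL22 sticks.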
-
That's awesome! What was all that noise about locked-out XMP then? I guess they couldn't figure out how to actually do this, so they decided to spread the FUD hoping to get people to buy their CL22 RAM.
The Ballistix looks like an amazing SODIMM kit. 2x16GB at CL16, at half the price of HyperX Impact CL20 2x16GB?
https://www.amazon.co.uk/HyperX-Impact-DDR4-32-SODIMM/dp/B07BGLKD4M/
Edit: just some bad Amazon price, should be around £200, but still more expensive than the faster Ballistix.
What's the catch lol
Edit: the catch was that the Crucial/Micron memory doesn't work/failed...
Last edited: Aug 3, 2020 -
ratchetnclank Notebook Deity
Had mine about 3 weeks but no discolouration so far, although I imagine it would happen months down the line.
Excellent, that's great news. Think I might upgrade my RAM soon then.
-
Looking good! -
-
Great news, this indicates that Dell is going in the right direction with its BIOS.
Please, when you have time, can you post a snip of HWiNFO64, as below?
I want to know the producer of the Ballistix SDRAM chips. Thanx in advance !!!
Greetings from skateboarding Amsterdam **_\|/_**
Melvin Rousseau Notebook Consultant
You think I will see a difference in gaming if I upgrade from 2933 to 3200?
I only have 8 gigs of RAM now, as I wanted to upgrade it myself. -
pathfindercod Notebook Virtuoso
-
Melvin Rousseau Notebook Consultant
-
In an earlier post you saw a reported improvement of about 4% in Fire Strike - not a huge gain in that use case, but noticeable. If you have just one stick, you will certainly see even larger gains by upgrading to dual-channel RAM. The best part is, you will pay half the price Dell charges for RAM.
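The dual-channel gain is easy to see on paper - peak DDR4 bandwidth scales linearly with channel count (standard arithmetic, assuming the usual 64-bit bus per channel):

```python
def ddr4_bandwidth_gb_s(channels: int, transfer_rate_mt_s: int) -> float:
    """Peak memory bandwidth in GB/s: channels x 64-bit bus x MT/s / 8 bits / 1000."""
    return channels * 64 * transfer_rate_mt_s / 8 / 1000

print(ddr4_bandwidth_gb_s(1, 3200))  # single stick -> 25.6 GB/s
print(ddr4_bandwidth_gb_s(2, 3200))  # dual channel -> 51.2 GB/s
```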
-
I used “disc” interchangeably with “drive”, pardon me Mr. Correct
-
No problem! It is Micron Tech. Screenshot attached, plus an exported report on the memory if that helps any further. Interesting to see all the various and impressive timings it supports at different speeds (lower latencies than the standard timings seen in the screenshot but detailed in the report - I think, anyway; you true techies may make more sense of it
). I can only guess Dell were coy about any support for other RAM modules due to wanting to sell their own slow ****e.
I was a fool to buy the 3200 XMP, though at least I took the plunge and things worked out well with the Crucial purchase. I nearly bought the HyperX until I read a review saying the Crucial was supposedly the fastest currently.
Attached Files:
Last edited: Aug 1, 2020 -
-
Yeah, it's surprising, since their lower-frequency memory doesn't seem to be any faster than HyperX. I've heard Micron is not ideal for overclocking, which is irrelevant for a 51M user. Nice find!
-
Melvin Rousseau Notebook Consultant
The beast on the right compared to the old 51M!
Finally!
Attached Files:
-
*OFFICIAL* Alienware Area-51M R2 Owner's Lounge
Discussion in '2015+ Alienware 13 / 15 / 17' started by Spartan@HIDevolution, May 9, 2020.