What was that part about how you're the only one who knows what you're doing when it comes to OCing RTX GPUs, and how you're world no. 1, BS this and that? Yeah, just what I thought.
A little humility goes a long way. Instead of boasting, try being a bit more humble.
-
Would undervolting help more for the RTX 20XX mobile series? Or are there more things going on this time around preventing better performance?
-
It's not BS if it's true, though. He's legit world #1.
-
There are good reasons why many are scared of ngreedia. Almost 3 years of waiting and they gave us a laptop GPU that's only slightly faster than the GTX 1080, while jacking up the price and making us pay for features that are either half-baked or MIA. Some of you might be well off or have easy access to new laptops, but the fact is most of us don't, and it's our hard-earned money that goes to ngreedia (it's not like we have a choice or anything. AMD, where are you??). Speculation tends to arise when something is overpriced and underdelivers, and when solid reviewers like Jarrod'sTech showed the whole world how poorly the RTX 2080 OCs, one can't help but wonder what's really going on here. Sure, some might say it's all speculation, but others (people like me) would argue otherwise based on an educated guess.
We all know Turing is a HUGE GPU, and when it comes to huge GPUs, the odds of failed or poor-performing chips are high. That's just the nature of silicon manufacturing, and nothing goes to waste: certain defective die areas get fused off and the chip is marketed as a lower-tiered product (i.e. a 2060 is a recycled 2070 that didn't make the cut). This is called die harvesting. Dies that don't make the cut for the desktop 2080 might just get recycled, have their clocks reduced significantly, and be marketed as Max-Q or lower-clocked laptop GPUs. OCing a GPU is not rocket science, at least not if you aren't aiming to be world "no. 1". Having stability issues at just over a 70MHz OC sounds off.
Having said that, we definitely need more data from more 2080 OC results to determine whether this die-harvesting theory is correct. A toy simulation of the binning idea is sketched below.
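To make the die-harvesting idea concrete, here's a toy Python sketch. The block count, defect rate, and SKU cutoffs are all invented for illustration; nothing here is Nvidia's actual binning policy:

```python
# Toy binning simulation: dies with more defective blocks get harvested
# into lower-tier SKUs. All numbers are made up for illustration.
import random

random.seed(0)
SM_COUNT = 48            # hypothetical shader blocks per die
DEFECT_RATE = 0.02       # hypothetical chance any one block is bad

def bin_die() -> str:
    defects = sum(random.random() < DEFECT_RATE for _ in range(SM_COUNT))
    if defects == 0:
        return "desktop 2080"        # fully enabled, best bin
    if defects <= 2:
        return "mobile/Max-Q 2080"   # blocks fused off, clocks reduced
    if defects <= 6:
        return "2070/2060 tier"      # heavier harvest
    return "scrap"

wafer = [bin_die() for _ in range(1000)]
for sku in ("desktop 2080", "mobile/Max-Q 2080", "2070/2060 tier", "scrap"):
    print(f"{sku:18s} {wafer.count(sku):4d}")
```

-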
16 more days... Come on, Dell, be a pal and send mine early.
BTW, my Area-51m backpack got delayed till Feb 19-21, which is fine as I don't have a use for it before then.
-
It must be a real kick having a 9900K!!! I am so hungry to tweak a new laptop. My next one will definitely have an unlocked desktop processor, and the new laptop will have to have a 10-bit OLED 120Hz G-Sync panel, otherwise no deal.
-
Yes. Just like all the other versions before it. You need to undervolt the card to stay below its 150W ceiling.
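If you want to see whether the card is actually bouncing off that ceiling while you dial in an undervolt, here's a minimal monitoring sketch in Python (assuming the pynvml package and an NVIDIA driver are installed; it only reads telemetry and changes nothing):

```python
# Watch GPU power draw vs. the enforced limit while testing an undervolt.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000.0  # mW -> W
print(f"Enforced power limit: {limit_w:.0f} W")

try:
    while True:
        draw_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0   # mW -> W
        mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        # Draw pinned at the limit under load = power throttling; a deeper
        # undervolt (same clock, lower voltage) buys back headroom.
        print(f"{draw_w:6.1f} W / {limit_w:.0f} W @ {mhz} MHz")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```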
-
propeldragon Notebook Evangelist
Personally, after overclocking several different 10-series cards, I can say your GPU is definitely more efficient. I've never been able to use that high a voltage without hitting the power limit. Under air, of course.
-
Mine is more efficient because mine are pretty much always modded. We don't sit and drink the Nvidia Kool-Aid. Why do you think most people want Prema Mod systems? Because the GPU and BIOS are almost always modded, my friend.
Edit:
For the record, the 1080N was able to reach 300+W. This can never be done on stock firmware.
-
HaloGod2012 Notebook Virtuoso
Exactly why I went for the P775 from HID.
-
Which makes perfect sense!
Right now I'm sitting here wondering how this A51M is going to pan out.
-
Vistar Shook Notebook Deity
So the single and SLI 1080N Fire Strike scores you posted are from cards TDP-modded to 300W+? Wow... nice.
-
Yes.
So with how many watts that card can pull, you would think they would give us a bit more, since it has been proven time and time again what it can handle. What do they do? Come out with some nonsense 150W ceiling when the card can legitimately do 300W+ as currently configured. BGA would of course be different, but nonetheless just as powerful this round.
-
C'mon Nvidia, we waited 3 years for this!???
The more I look into it, the more I think I might skip this gen or just get a cheap RTX 2060 laptop.
I'm still holding out for 51M reviews, but they had better live up to the hype.
I hope we don't see another 3 years between new GPUs.
I really like the 51M, but Nvidia really dropped the ball pushing RTX when it should have just focused on upping performance.
RTX is not ready for primetime. Who asked for this anyway?
-
OH SNAP!!!!!
-
Nvidia just follows the notebook trend. I'm sure they don't see the point in pushing out mobile graphics with a TGP as high as or higher than the first-gen desktop-in-laptop cards (Maxwell N). The cooling in many of today's (and tomorrow's) gaming laptop models won't handle it.
They know very well that almost all notebook manufacturers today have banned/abandoned thicker gaming laptops for thinner Apple-style designs.
The few thicker laptop models we still have don't get the Nvidia desktop power, thanks to the love for thin and flimsy joke-books.
-
Yeah, it's sad.
"The Future of Gaming is Thin on Pounds, Heavy on Pixels" — from MSI's CEO, Charles Chiang:
"When it comes to his company's bread and butter products, gaming PCs, Chiang believes the future is thin, mobile and high resolution. He said he expects to see slimmer and lighter gaming laptops become a much bigger portion of the market in the years ahead, particularly as new processes like Intel's 10nm and AMD's 7nm bring greater power efficiency."
-
And I'm guessing this guy must have a lucky 2080 as well....
https://www.3dmark.com/fs/18149840
We have plenty of folks who know what they are doing, but we can't seem to get compensated by Nvidia in the form of a performance boost. Just performance nonsense.
-
Is that from the MSI GE75 Raider?
-
I'm not sure, but it's beating an Area-51M in the GPU department, just like everyone else is... Well, until they show their hand, that is. You know... not using fake heatsinks designed for 2060s and 8700s and all.
-
Kuro Kensei Notebook Consultant
So, your CPU+GPU pulled 400W+ during this run? Wow, my Ryzen + Vega 10 rocks! It scored 1/10 of your result at 1/50 of the power draw! lol
On a more serious note, I do hope Dell's cards will be possible to mod and push past 300W in the end, even if that requires a custom-modded PSU as well.
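For what it's worth, the perf-per-watt quip above checks out (numbers taken straight from the joke, nothing measured):

```python
# 1/10 the score at 1/50 the power draw -> 5x the performance per watt.
score_ratio = 1 / 10
power_ratio = 1 / 50
print(f"Relative efficiency: {score_ratio / power_ratio:.0f}x perf/W")
```

-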
Holy crap. If this is true, I'm glad I ditched the 17R5, and customized ones should have a lot more to offer.
I really want to see more tests for this unit.
-
Dell has already confirmed the CPU power cap. Why wouldn't they do exactly the same for the PSU? They have done it before. You will most likely need fully unlocked firmware, but I'm not sure even that will help... because unlocking Razer's firmware doesn't help.
-
One thing is for sure: it can use 330W + 180W for a total of 510W.
With the CPU limited to 119W, that still leaves 391W. The GPU is limited to 180W, so there's still 211W left over, which doesn't make sense.
Unless AW just wanted to show off, they'd need to raise that CPU ceiling.
But we can't judge until an NBR member has one to put it through some real tests.
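The budget math spelled out, for anyone following along (the caps are the figures quoted above, nothing measured):

```python
# Area-51M dual-PSU budget per the reported caps.
total_w = 330 + 180          # both bricks combined: 510 W
cpu_cap_w = 119              # reported CPU power cap
gpu_cap_w = 180              # reported GPU power cap

after_cpu = total_w - cpu_cap_w      # 391 W left once the CPU is fed
leftover = after_cpu - gpu_cap_w     # 211 W with no obvious consumer
print(f"total={total_w} W, after CPU={after_cpu} W, unexplained={leftover} W")
```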
-
How can you be sure the notebook can utilize everything from both PSUs? Is this confirmed? Nope. All I have seen them state is that the iGPU can be used... The smaller 180W brick is nice for max portability if you don't have to fire up the Nvidia graphics on the go.
-
They've said countless times in the videos that you can use one PSU (180W OR 330W) for portability, but for maximum performance you'll need to use both. So it should be able to utilize both.
When I asked Mr. Azor about 2x 330W PSUs, he said they'd work but you won't get any extra performance. So either there is a 510W (330W + 180W) limit, or possibly a 2x 330W limit, which is more doubtful.
-
As I said... this doesn't mean you will be able to utilize everything (all the powa) from both PSUs. You can draw more power in this dual-PSU setup than from the single 330W brick, but that doesn't mean they haven't added a max power cap. Only a fully working, mod-unlocked firmware can confirm this.
Remember, a 330W + 180W PSU setup equals 630W+ from the wall at roughly 80% PSU efficiency.
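That wall-draw figure works out like this (the 80% efficiency is the poster's round number; real bricks vary with load):

```python
# DC output vs. implied AC draw at an assumed 80% PSU efficiency.
dc_output_w = 330 + 180                 # 510 W delivered to the machine
efficiency = 0.80
wall_w = dc_output_w / efficiency
print(f"{dc_output_w} W DC / {efficiency:.0%} = {wall_w:.1f} W from the wall")
# -> 637.5 W, i.e. the "630W+ from the wall" figure above
```

-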
Non-G-Sync models might have hardware-MUX-based Optimus switching, and G-Sync models will not have Optimus Prime. From the repair sheet, I think the AW 51M can draw power from both ports; which DC-IN is functional is yet to be seen in review models. It would have been better to include a single DC-IN port with the Eurocom 780W PSU.
-
VoodooChild Notebook Evangelist
See, this is the problem I see with this system. Where are the reviews for the "fastest laptop in the world"?
Azor was singing to everyone willing to listen at CES and afterwards that "this is the fastest laptop in the world" and "the first with a desktop-class CPU" and that this is a "great legend design which they've changed only 5 times since the founding of AW," and all that nonsense, while we see Clevos here again rocking out numbers, and that is what matters. Numbers! Performance numbers.
If this is truly an enthusiast-class system, then where are the numbers for it? Surely one good review would be out by now, don't you think?
I, for one, think they're still figuring things out, and when all is said and done it'll be up to us guys here at the forums to fix our individual systems (hopefully not again) and repair what should have been working in the first place. That's what we pay a premium for on such systems, isn't it?
I see no way that the 9900K isn't reaching 99°C while just gaming at 1080p. It's up to Dell to prove me wrong, and they still haven't shown me the numbers that matter; they'll not have a penny of mine until proven otherwise. I am skeptical nonetheless.
-
Just don't give them anything they can use against us like they have in the past.
-
I would interpret that to indicate that it is dropping to 95W, which means it is throttling to non-turbo clock speeds. Intel TDP is now always quoted at non-turbo (base) clocks; they do not publish a TDP for turbo clocks, as it can vary by CPU and system. It cannot hold turbo clocks under load at 95W because that is not enough power. It may be due to the GPU being under load, with cancer firmware castrating the CPU when the GPU is under stress. They started that nonsense with the Alienware 18. Rather than supply the power it needs to run at full performance, they cut the performance back to keep the power draw at a predefined maximum they deem to be adequate.
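A toy model of that cap-and-cut behavior, just to illustrate the mechanism (the clocks and wattages are placeholders, not Intel specs):

```python
# Crude power-cap throttle model: if the package would exceed the cap,
# scale clocks back, bottoming out at the base (non-turbo) clock.
def effective_clock_ghz(turbo_ghz: float, package_w: float, cap_w: float,
                        base_ghz: float = 3.6) -> float:
    if package_w <= cap_w:
        return turbo_ghz                 # enough power: turbo holds
    scale = cap_w / package_w            # shed clocks roughly with power
    return max(base_ghz, turbo_ghz * scale)

print(effective_clock_ghz(4.7, package_w=150, cap_w=95))  # -> 3.6, throttled to base
print(effective_clock_ghz(4.7, package_w=90,  cap_w=95))  # -> 4.7, turbo holds
```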
-
Ah, that was a sad story indeed. This is why I tend not to share anything like this with anyone except close people who won't say anything. Manufacturers seem to just block out everything they can.
There was a member here getting P5000 SLI to work, and we never learned how, since the person said Nvidia would block it, which is true.
I watched most of that video (skipping here and there), and at least the GPU seems to be running well. As long as the GPU was kept at a decent temp, it stayed at its turbo clock (no crazy fluctuations like I saw on my MSI 1060).
The CPU ran hot and throttled. The YouTuber mentioned the 2070+9700K heatsink being used on the 9900K+2080 config, so we'll see in a few weeks if that was true.
-
I don't see why they can't get an 8700K running in there; that would actually make sense.
I'm a layman, though.
-
I've been wondering as well why there is so much radio silence on this unit. If it really is an issue with parts availability, I suspect it could have affected review units too, which would mean they arrive late.
Just speculation.
-
I'm getting mixed answers when I ask them about this: Umar said no, but another rep said yes... An 8700K would really be a reasonable choice for this unit. At this point I'll wait till someone tries them out.
-
I saw this tweet & this one the other day & forgot to bring 'em in here; anybody seen these, or need to know this?
The GM tries to nix the idea of buying now with the intention of an LCD panel upgrade later (so why wait / just buy now!).
Obviously, if they'd designed it to be upgradeable, they could sell you the 1080p now & the 1440p/4K later & everybody wins; surely they knew 1080p wouldn't cut it in the long run & that owners would upgrade down the road, so why not prep the way now for easy mods later, or at least try? ... cui bono?
"Figured I'd do it meself, was able to on past Alienwares!" ... & still got no love from the GM, heheh.
Indeed.
*Note my prior post regarding CPU/GPU/heatsink swaps: Azor's tweet said those were worthy of a new how-to video on AlienTube; for a panel swap he just says "no." Make of it what u will.
Yeah, old ones.
-
If they can be put on, they can be taken off.
He's basically saying it's unlikely to still be under warranty if you do it yourself, but without actually saying so, to avoid deflating the hype for the machine.
-
I'm on my 4th redone order now, since the Dell CSR keeps messing it up. I've spent a few hours total on the phone, plus multiple e-mails. If I didn't want this unit so badly, I would certainly have cancelled. They are sending me another confirmation tomorrow with the order details. If it is wrong on absolutely anything, I'm done with them forever.
In short, this is what happened:
1st order: wrong RAM and HDD options
2nd order: wrong screen and case color
3rd order: same price as the first order, but with downgraded options
4th order: to be determined
I've also come to terms with having the 1080p screen. I suspect the RTX 2080 will be the sweet spot for many new AAA games at 1080p/144Hz. Anthem benched on similar specs (on the demo, which is fraught with problems) doesn't hit 144 AFAIK. Hopefully it is a very nice panel and I won't feel the need to upgrade it later.
-
HaloGod2012 Notebook Virtuoso
The screen is what made me decide most against it. I love extremely bright screens, and the only 400-nit screens around are the 1440p 120Hz G-Sync panels in the Alienware 17 R5 and the P775; they are gorgeous panels.
-
XxAcidSnowxX Notebook Consultant
Strange how confidently he's saying "no"... It doesn't seem right to just outright say "no"... Why not? Did they do something stupid like glue the panel into the upper assembly?
They used adhesive for the past couple of generations, but the panels were always removable. I have exchanged screen panels in the past 3 generations fairly easily, including on my current AW15R3. Just clean the old adhesive strip off, apply a new one afterwards, and you will be fine. It's extra work, of course, compared to just exchanging a full screen assembly, but it is certainly doable.
But in the current service manual they only show how to remove the whole assembly, so don't be surprised if they did something totally different this time. In previous models' service manuals they did show how to remove the panel itself.
-
The sweet spot for the RTX 2080 is 1440p at 120Hz, not 1080p.
-
A 1440p 120Hz 100% AdobeRGB panel would be perfect.
I personally wouldn't mind a 2160p 100Hz 100% AdobeRGB panel either, but I'm probably in the 0.001%.
Oh, OLED would be nice too. The AW 13 R3 still has one of the best displays I've used, aside from the 17.3" 1920x1200 RGB display used in the M17xR2 and Dell Studios.
-
Kuro Kensei Notebook Consultant
You could upgrade it but it's not easy, and that's true:
https://www.techinferno.com/index.php?/forums/topic/617-m18x-anti-glare-mod/
http://forum.notebookreview.com/thr...-the-m18xs-matte-display.599542/#post-7772260
Possible? Yes. Drop-in swap? Hell no!
-
This is what holds me back from purchasing a desktop replacement. I want to be able to swap screens, and not just that, I want it to be quick and easy. As an example, if I'm playing an FPS it'd be awesome to have a high-refresh-rate screen, but after I'm done with the gaming session, to be able to swap in a 4K screen for media consumption within 30 seconds.
-
Well... even in this situation it seems the whole LCD assembly is removable. I'm sure you'll see some assemblies on eBay in the future.
-
Yeah, but portability is still huge for some of us. That said, there is nothing stopping you from plugging into a different screen. I plan on doing that with the 51m: playing games will mostly be on my lap with the high refresh rate, but I still have a projector for movie night!
I have extra screens wherever I plan on transporting it to, though, so I've been thinking of switching to the MSI Trident X Plus, which has a 2080 Ti and is a mini PC. It is around $3,000, but will no doubt have better performance than even a fully spec'd 51m.
-
Ummm, not sure where it was posted that this ran 400+ watts?
It ran up to 350W max. So far.
Never ever that!!!!
-
Actually, that happened when someone requested that he load the CPU ONLY. He said it's pointless to stress test the CPU alone because his BIOS was buggy and it would throttle to 95W, but he did it anyway just to satisfy his viewers; I'm guessing he was filming it live. Earlier, before the CPU-only test, he did a CPU + GPU stress test and the CPU held its promised TDP, which was around 120W, so I have no reason to doubt that his firmware was buggy.
That part was easy to understand. The hard part to understand was when someone asked him about the memory speed being stuck at 2400.