I think I know which video you're referring to, and I believe the laptop was running Eyefinity. I would like to see a single-screen performance benchmark; my guess is the settings will be higher, but not by much.
-
Meaker@Sager Company Representative
In non-Enduro mode the 290X will crush the 770M at every turn.
-
Hmmm... no 880M on the Alienware Germany site.
-
Yeah, I tried looking it up and it's not there... I just ordered the beast in my signature... now I'm thinking maybe cancel the order and see what happens?
-
Meaker@Sager Company Representative
If you have the 780M on the way I would not worry about the 880M.
-
For some reason that gives me confidence in my decision to go with the 780M over an 880M that hasn't even officially come out yet, as well as the under-spec'd R9 M290X, which is really a rebrand of a rebrand - that in itself doesn't bode well.
So the 780M it is. -
-
People are running desktop GTX 680s with 2GB of VRAM just fine, ezpz. The 880M is the exact same chip but clocked even lower, and they slap 8GB of VRAM on it. It's nothing but an enormous gimmick to fool laymen and a waste of perfectly good GDDR5. It's almost criminal how absurd it is. There is no way GK104 with its 256-bit bus can drive anything requiring that much memory. So dumb.
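The bus argument can be put in rough numbers. Here's a back-of-envelope sketch; the ~5 Gbps GDDR5 data rate is an assumption (actual memory clocks vary by card), so treat the output as illustrative only:

```python
# Back-of-envelope: how long does a 256-bit GDDR5 bus need just to read
# every byte of the framebuffer once? Rates here are assumed, not measured.
BUS_WIDTH_BITS = 256
GDDR5_GBPS_PER_PIN = 5.0  # assumed effective data rate per pin

bandwidth_gbs = BUS_WIDTH_BITS * GDDR5_GBPS_PER_PIN / 8  # -> 160 GB/s

for vram_gb in (2, 4, 8):
    sweep_ms = vram_gb / bandwidth_gbs * 1000
    print(f"{vram_gb} GB VRAM: {sweep_ms:.1f} ms for one full pass")
```

At 60 fps a frame budget is only ~16.7 ms, so even one full pass over 8 GB would cost roughly three whole frames of bus time - which is the core of the "the bus can't feed that much memory" claim.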
It's POSSIBLE to make an argument for 4GB (SLI/Crossfire in high-resolution setups), but there is no justification for this other than big numbers to fool people who don't know better. -
Meaker@Sager Company Representative
Saying it's due to the 256-bit bus does not really mean much; it's to do with other things, really.
-
IMO the 880M with 8GB of RAM is just stupid, although AMD is no better. At this point, cramming in more memory we'll never use at 1080p will just increase overall system heat and waste more power. Alienware's 330W PSU is already at its limit, yet they pull stuff like this LOL
-
The weird thing is I feel like Nvidia are shooting themselves in the foot, in that the 980M and any future GPU they release is also going to have to have 8GB, or otherwise the same people they are fooling with the 8GB are going to wonder why the new GPU has less. GDDR5 is expensive; they are setting the bar too high and they'll always have to reach that bar for any new high-end mobile GPU they release. It pretty much rules out any possibility of a more cheaply priced Nvidia flagship in the future.
-
I don't understand why they have an 8GB 880M, yet a 3GB 780 Ti and 6GB Titan Black, then 4GB 680s? This all makes no sense. I mean, is 3GB enough for a 780 Ti, or should I be like "hey, let's get a 690 with 4GB or a Titan Black with 6GB"? There needs to be natural progression. Throw into that the Titan Black with its extra compute ability over a 780 Ti and you're like, what the hell!!!! What do I buy? All I want to play is Crysis 3 lol :'(
-
Anyone have any idea when the 880M will be available in Aus? Usually we're around a month behind.
-
I remember reading in the Maxwell whitepapers that a higher VRAM amount is a necessity of the way the pipelines of the new Maxwell cores are fed. It's not only about how many assets you're trying to fit into the framebuffer.
Besides, there are already at least ten 3K+ resolution laptops. Get used to it.
As a game developer, I can tell you that we're already seeing huge bottlenecks in memory address space preventing us from using high-quality assets even though there's power available for it. -
Meaker@Sager Company Representative
Well, 8GB on GK104 is still a little questionable, but it's not really hurting anything either.
-
Meaker@Sager Company Representative
-
From what I can see from John's testing, GTX 880M is going to be at least slightly better than GTX 780M in much the same way that GTX 780M was better than GTX 680M. Clock-for-clock they appear about the same, but that's with no true driver support for GTX 880M yet and a vBIOS mod that is still a work in progress. We have already seen that it is probably going to overclock even better than GTX 780M, and that alone makes it worth having if you find great value in that like some of us do. Remember, in the beginning GTX 680M gave GTX 780M a real run for the money, but now there is no question of GTX 780M's performance superiority. The only thing we can say with certainty right now is that a stock vBIOS sucks on just about any NVIDIA GPU.
So while it may be the same with more vRAM, it may not be by the time the fat lady is done singing. I still think it's too early to make the call... time will tell whether it is "only a rebrand" or clearly better. NVIDIA is doing what appears to be some back-door tweaking with their drivers, so they might surprise us with more than we can see. They might even quietly tone down GTX 780M performance just a touch through drivers (yes they can do that) to make GTX 880M look better. More to come as we watch the story unfold.
And what's up with the "well" naming kick these chip makers are peddling? Haswell, Maxwell, Broadwell... what next? Cornwell? Coswell? Roswell? Well, well, well? Hopefully none of them go down the same road of performance mediocrity as Haswell or it won't end well.
-
The thing I worry about is that efficiency might totally go out the window once overclocked - just like Haswell. Then we are back to simply comparing performance increases. Sure, low temps will be nice, but it's not like 780Ms run hot anyway. What's more, 330W for 2 x 780Ms is plenty for most people, since they are usually too worried to even OC, let alone keep an eye on temps. I've noticed most people who harp on about the 780M using too much power haven't even attempted to tap it out, or got lumped with power-hungry Haswell, which also claimed 'efficiency'; more often than not they are repeating what they hear. I don't see why people are waiting around for an efficient 8xx when, if performance is what they want at the end of the day, we've got the daddy 780Ms sitting under their noses for bargain prices right now.
-
Meaker@Sager Company Representative
Did you run any firestrike extreme on it?
-
Mr. Fox said: ↑Brother John has already shown us that the 880M can do more than 780M... I haven't seen a single 780M that can touch results like this single 880M. Considering the driver support is not even present, this seems like something with potential to shape up into something more than a little bit better, or just a rebrand with more vRAM. We will need to wait and see, but I am expecting more than a rebrand with respect to performance potential.Click to expand...
-
Here's what nvidia are claiming and releasing for CeBIT.
To be honest, it doesn't look like a high-end 20nm Maxwell will be out for a long time. nVidia seem to be very careful making sure each GPU tier has only a marginal increase, whether it be Kepler or Maxwell. Anyway, for what it's worth, they are claiming a 15% increase with the 880M over the 780M.
Mixed technologies as part of the 8xx series. With the high-end flagship still being Kepler, it makes me think they are still a way off from making things scale well for higher-end cards. -
unityole said: ↑since I dont understand the score of GPU benchmark, better as in how much better in % for performance ?Click to expand...
Single 780M below compared to the above single 880M. Note benchmark score and difference in max stable overclock on core and memory. Without proper driver support and fully tuned vBIOS it is still an improvement. Give the drivers and vBIOS more time and the gap will most likely widen further.
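Since the question was "how much better in %", the math is just a percent difference between the two scores. A quick sketch with placeholder numbers (not John's actual results):

```python
# Percent gain between two benchmark scores. The values below are
# hypothetical placeholders, not real 780M/880M results.
def pct_gain(old_score: float, new_score: float) -> float:
    return (new_score - old_score) / old_score * 100

score_780m = 5000  # placeholder
score_880m = 5400  # placeholder
print(f"880M is {pct_gain(score_780m, score_880m):.1f}% faster")  # 8.0% faster
```

Plug in the actual scores from page 1 to see where the gap sits today; the claim above is that it should widen as drivers and the vBIOS mature.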
-
TBoneSan said: ↑Here's nvidia are claiming and releasing for CeBIT.
View attachment 109787
To be honest it doesn't look like a high end 20nm Maxwell will be out for a long time. nVidia seem to be very careful making sure each GPU tear has only a marginal increase wether it be Kepler or Maxwell. Anyway, for what it's worth they are claiming a 15% increase with 880m over 780m.
Mixed technologies as part of the 8xx series. With the high end flagship still Kepler makes me think that are still away from making things scale well for higher end cards.Click to expand...Mr. Fox said: ↑I'll let you do the math...
Single 780M below compared to the above single 880M. Note benchmark score and difference in max stable overclock on core and memory. Without proper driver support and fully tuned vBIOS it is still an improvement. Give the drivers and vBIOS more time and the gap will most likely widen further.
Click to expand...
As for OC, the card may or may not OC well; that's not too much of a worry. There's competition in GPUs, not so much in CPUs, so even if Haswell won't OC well, GPUs will probably do just fine. -
Hence why I wrote nvidia are 'claiming'. Yes, I'd also take it with a pinch of salt. However, you asked for a %, so that nVidia-speak seemed to fit the bill.
Really, you need to look no further than Johnksss's (lol, too many 's's going on) OC testing on p1 to get a decent idea. It showed a measurable improvement, so 15% down the road could be entirely possible once a decent set of drivers comes out.
Yeah, it could have been better silicon, but that unknown factor is always going to be there, so who really knows right now.
Here's something interesting in this article: it says the 880M will also be available in 4GB, and flashing a 780M is entirely possible.
QUOTE
" As you can see below, GeForce GTX 880M is just a rebranded GTX 780M. Card still uses GK104 GPU, which is now used in 3rd generation straight as a mobile high-end model. The only difference is that GTX 880M will be available with 8GB memory, but of course there will also be 4GB models as well. There are sellers already offering a chip modification for GTX 780M owners. So if you feel like owning the first 800 Series GPU, this is the cheapest way to go."
I'd like to know if Johnksss and SLV7 could find anything interesting from the vBIOS about the finer differences between the two cards - if any. -
I recall John mentioning his 880M overclocked higher than his 780M.
-
Meaker@Sager Company Representative
Every sample will be different of course.
-
Meaker said: ↑Every sample will be different of course.Click to expand...
It sucks. I still can't believe Nvidia and AMD are pulling this crap lol, but I guess it can't be helped with no 20nm. Nvidia could have given the 880M Maxwell, but dunno why it didn't happen. Maybe they wanted their next-gen mobile GPU to be 20nm + Maxwell and be way better than AMD's GPU to gain the most market share, who knows. -
Until we see some talented overclockers that prove otherwise, I am going to assume the new AMD mobile GPUs still suck compared to NVIDIA. We have a track record of lameness that needs to be broken with AMD. In performance, reliability and driver support there is a clear distinction with AMD winning at nothing but being the low price king in the mobile market.
-
Mr. Fox said: ↑Until we see some talented overclockers that prove otherwise, I am going to assume the new AMD mobile GPUs still suck compared to NVIDIA. We have a track record of lameness that needs to be broken with AMD. In performance, reliability and driver support there is a clear distinction with AMD winning at nothing but being the low price king in the mobile market.Click to expand...
-
Take a look at this: Maxwell goes mobile with Geforce 800M series
An 840M at 17 watts running Skyrim at 30 fps? Imagine if the 880M had Maxwell; then we probably wouldn't need dual AC adapters anymore, so the future of gaming laptop power consumption is looking brighter. Less power used means less heat, and if the GPUs themselves stay in good shape we could see some real OC, perhaps 1250 core and 1700 memory LOL -
Meaker@Sager Company Representative
You see the potential of not needing dual power bricks; I see the potential for performance if you push it there anyway.
-
Will the new Alienware 17r5 with the current 770m GPU support these new 880m GPUs? Assuming so, the CPU also needs an upgrade from the 4700, correct?
-
Meaker said: ↑You see the potential of not needing dual power bricks, I see the potential of performance if you push it there anywayClick to expand...
turilo said: ↑Will the new Alienware 17r5 with current 770m gpu support these new 880m gpu? Assuming. Cpu also needs upgrade from 4700 correct?
Sent from my SM-N9008 using TapatalkClick to expand... -
Instead of dual AC adapters it would be a lot nicer to have just one huge 600W to 800W AC adapter, then you would not have to mess with the junction box. You could still have two if you wanted to... one ordinary 330W for web surfing and business use, then the beastly big one for the more exciting stuff.
I actually kind of like the fact that wicked hardware draws tons of power. It distinguishes those products and their users from the "meh mainstream" casual user. I am skeptical greatness will ever be achieved by hardware with conservative power demands. All that should be good for is creating more headroom for even more ridiculously extreme performance. In other words, if you could get a GPU that performs well at 75W and pushes 1250 core/1500 memory, why not push it to 120W and get 2800 or 3000 for a core overclock, still needing 600W+, instead of being excited about low power consumption? The trouble with making low-power trash is they make it low capacity, so you don't really gain anything in terms of performance, and you can't push it harder because it will blow up or melt. The notion of maintaining status quo performance with less power consumption kind of sucks in my mind. -
Meaker@Sager Company Representative
The nice thing about two 330W bricks is that if I'm going somewhere and just need the machine for basic work, I can take a single brick with me; it's a bit lighter and takes up less space, especially when the junction box is tiny.
I've been monitoring the power on each brick, and it's very good at load balancing the bricks at high loads. -
That's what I said... you could have one 330W brick for that kind of thing and one huge one for the fun stuff. Same concept except you would not use two in unison, just one or the other.
-
Do you really need all 660W, though? If I remember correctly, you only reached like 480W. Am I off on that? My point being, if they can make something between the two options you've suggested, it may be the best of both worlds.
-
The necessities in life
-
Meaker@Sager Company Representative
J.Dre said: ↑Do you really need all 660W, though? If I remember correctly, you only reached like 480W. Am I off on that? My point being, if they can make something between the two options you've suggested, it may be the best of both worlds.Click to expand...
I have drawn over 600W from the wall. Of course mine is a bit more power hungry, but not absurdly so. -
Wow, I didn't realize you could reach that level of overclock with these systems... I suppose you learn something new every day. :thumbsup: In that case, scratch my idea for something in the middle. The dual PSU modification would be the best option. Even if they were both 330W with some form of dock/adapter, you could still leave one at home and take the other on trips.
-
Wow, lots of..er...umm.. stuff in this thread.
Edit:
unityole said: ↑imo 880m with 8GB ram is just stupid, although AMD not any better either. at this point cramming more memory we'll never use on 1080p will just increase overall system heat, and waste more power. alienware 330watts PSU is already at its limit yet they pull stuff like this LOLClick to expand...
gschneider said: ↑I don't understand why they have 8gb 880m Yet, 3gb 780ti and 6gb Titian Black, then 4gb 680's? This all makes no sense I mean is 3Gb for a 780ti enough or should I be like hey lets get a 690 with 4gb or a titan black with 6gb. There needs to be natural progression. Through into that the Titan Black with its extra compute ability over a 780ti and your like what the hell!!!! What do I buy all I want to play is crysis 3 lol :'(Click to expand...
Turmoil said: ↑From the information available, that seems to be the case... I already have an Alienware 17 on order with the 780m so I'll just hang on to that and wait to upgrade to a Maxwell chip... which brings me to the question: Will I, currently with a Kepler, be able to upgrade to a Maxwell when they are released?Click to expand...
unityole said: ↑just couple somethings I'd like to disagree on.
Originally Posted by Mr. Fox
From what I can see from John's testing, GTX 880M is going to be a least slightly better than GTX 780M in much the same way that GTX 780M was better than GTX 680M.
That's quite different. The 880M is likely not better than the 780M; it may be better because Nvidia had more time to optimize an old architecture, that's about it, while the 780M was better than the 680M simply because it had a higher core config, thus very likely higher performance.
It's likely that you are wrong here. It is in fact better - by no great margin, but better all the same.
You can't overclock the 4GB video RAM very much past 1500, while the 8GB video RAM can be clocked up to around 1650 and in some cases 1700.
It uses less voltage at stock and in the beginning stages of overclocking, and it has the potential to overclock higher. Who's to say we didn't test a set of weak cards?
Clock-for-clock they appear about the same, but that's with no true driver support for GTX 880M yet and a vBIOS mod that is still a work in progress.
I recall the vBIOS was what made the card real powerful; Nvidia drivers don't provide as much of a performance boost to the cards as AMD's do. It's a trend, and ongoing on the desktop.
Well, almost every single Nvidia GPU from the 580M up to the 880M (not all lesser GPUs needed a vBIOS to run alright at stock) needed a vBIOS to run correctly at stock and without having to set profiles in the startup menu.
We have already seen that it is probably going to overclock even better than GTX 780M, and that alone makes it worth having if you find great value in that like some of us do.
Johnksss tested one card, out of how many that are going to be sold. Since all cards aren't equal, I'm just going to assume he got a decent one that could go higher than normal.
I'm just going to assume I had the worst card out of the bunch.
With that in mind, I can't see how the 880M is going to OC better than the 780M.
Every single test in the 880M testing already shows this.
From the trend, more memory (Corsair 2133MHz 8GB) is harder to make than a 4GB stick; there must be a reason, and it could apply to VRAM as well.
No it will not.
And more crammed together likely produces more heat and draws more power, needs more dissipation, and thus OCs less.
I see you didn't see where the 8GB VRAM was overclocked and ran at 1650MHz - without modding the board for voltage like Meaker did for the 780M?
Remember, in the beginning GTX 680M gave GTX 780M a real run for the money, but now there is no question of GTX 780M's performance superiority. The only thing we can say with certainty right now is that a stock vBIOS sucks on just about any NVIDIA GPU.
I think we really need John's input on this as he experienced it first hand.
John is giving input now. I hope it's not taken as hostile retaliation of sorts.
So while it may be the same with more vRAM, it may not be by the time the fat lady is done singing. I still think it's too early to make the call... time will tell whether it is "only a rebrand" or clearly better. NVIDIA is doing what appears to be some back-door tweaking with their drivers, so they might surprise us with more than we can see. They might even quietly tone down GTX 780M performance just a touch through drivers (yes they can do that) to make GTX 880M look better. More to come as we watch the story unfold.
Sounds like you're talking about Haswell, LOL. Newer ain't always better. What the chip designers are marketing (i.e. telling lies to make the sale) as efficient might actually be deficient when it comes to performance. If that's the case, they can stuff their "efficiency" nonsense where the sun don't shine.
Although the 860M was an early benchmark, it used just over half the power for the same performance. I never once mentioned that I want longer battery life and lower TDP. Lower wattage can be great in multiple ways: less power means less heat and thus higher OC, longer battery life, only needing a single AC adapter, etc. We can benefit from all of this rather than just "going green" and saving power.
The part you forgot was where they tried to push the performance up and it choked royally. In a few cases it did worse.
And what's up with the "well" naming kick these chip makers are peddling? Haswell, Maxwell, Broadwell... what next? Cornwell? Coswell? Roswell? Well, well, well?Hopefully none of them go down the same road of performance mediocrity as Haswell or it won't end well.
Click to expand...
Meaker said: ↑Did you run any firestrike extreme on it?Click to expand...
unityole said: ↑since I dont understand the score of GPU benchmark, better as in how much better in % for performance ?Click to expand...
unityole said: ↑that chart is just so wrong and to blind the public and we alienware folks are better than that. nvidia post the chart with info on default 780m clock to default 880m clock, then they get 15%. from my point of view here we are comparing 3920xm and 3940xm, where as 3940xm is likely to be better optimized and nothing more. Here for the real performance talk, so bring up the big gun, OC 880m to its max compare to 780m to it's max, if we will get 15% from it, if we do, great job nvidia optimized the card like true masters, if not it totally make sense cause its well within realm of reasons.
As to this 15%? Not even close when going up against a 780M, but everything else yes and more.
View attachment 109790
as for OC card it may or may not oc well, thats not too much of a worry. theres a competition in GPU, not so much in CPU so even if haswell won't OC well, GPU will probably do just fine.Click to expand...
unityole said: ↑I recall john mention 880m overclocked higher than his 780mClick to expand...
Meaker said: ↑Every sample will be different of course.Click to expand...
Mr. Fox said: ↑Until we see some talented overclockers that prove otherwise, I am going to assume the new AMD mobile GPUs still suck compared to NVIDIA. We have a track record of lameness that needs to be broken with AMD. In performance, reliability and driver support there is a clear distinction with AMD winning at nothing but being the low price king in the mobile market.Click to expand...
unityole said: ↑14.2 driver is amazing with all the games supporting mantle its awesome. take a look at the desktop graphics and their driver, nvidia is barely on par with the new cards. mobile side AMD lacks but not by much, especially with newer drivers and all. for now i'm just going to assume 880m isn't as good because its pratically a rebrand, even if you can oc higher than 780m it'll be proportional for performance to percentage of how much you can actually OC it.Click to expand...
All the driver optimizations in the mobile world will not help AMD here. Now on the desktop side, that is a whole other story; the 290 is a very comparable card at benching and gaming. Not so much on the laptop side.
Meaker said: ↑You see the potential of not needing dual power bricks, I see the potential of performance if you push it there anywayClick to expand...
Mr. Fox said: ↑Instead of dual AC adapters it would be a lot nicer to have just one huge 600W to 800W AC adapter, then you would not have to mess with the junction box. You could still have two if you wanted to... one ordinary 330W for web surfing and business use, then the beastly big one for the more exciting stuff.
I actually kind of like the fact that wicked hardware draws tons of power. It distinguish those products and their users from the "meh mainstream" casual user. I am skeptical greatness will ever be achieved by hardware with conservative power demands. All that should be good for is creating more headroom for even more ridiculous extreme performance. In other words, if you could get a GPU that performs well at 75W and pushes 1250 core/1500 memory, why not push it to 120W and get 2800 or 3000 for a core overclock and still need 600W+ instead of being excited about low power consumption? The trouble with making the low power trash is they make it low capacity so you don't really gain anything in terms of performance, and you can't push it harder because it will blow up or melt. The notion of maintaining status quo performance with less power consumption kind of sucks in my mind.Click to expand...
J.Dre said: ↑Do you really need all 660W, though? If I remember correctly, you only reached like 480W. Am I off on that? My point being, if they can make something between the two options you've suggested, it may be the best of both worlds.Click to expand...
J.Dre said: ↑Wow, I didn't realize you could reach that level of overclock with these systems... I suppose you learn something new every day. :thumbsup: In that case, scratch my idea for something in the middle. The dual PSU modification would be the best option. Even if they were both 330W with some form of dock/adapter, you could still leave one at home and take the other on trips.Click to expand... -
J.Dre said: ↑Do you really need all 660W, though? If I remember correctly, you only reached like 480W. Am I off on that? My point being, if they can make something between the two options you've suggested, it may be the best of both worlds.Click to expand...
As you can see in this video, a modest 4.3GHz on the CPU, 1125/1500, and only 1.137V pulls about 560W.
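That kind of wall draw lines up loosely with the usual dynamic-power rule of thumb, P ∝ f·V². Here's a sketch with made-up baseline numbers (none of these are measured from any actual card) just to show how fast draw climbs once voltage goes up:

```python
# Rough dynamic power scaling: P scales with frequency and voltage squared.
# Baseline values are illustrative only, not measured from real hardware.
def scaled_power(p0_watts, f0_mhz, v0, f1_mhz, v1):
    return p0_watts * (f1_mhz / f0_mhz) * (v1 / v0) ** 2

# Pushing a hypothetical 100 W GPU from 1000 MHz @ 1.00 V
# up to 1125 MHz @ 1.137 V:
print(f"estimated draw: {scaled_power(100, 1000, 1.00, 1125, 1.137):.0f} W")
```

The voltage term is squared, which is why a seemingly small voltage bump costs so much more at the wall than the frequency increase alone would suggest.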
johnksss said: ↑AW would need to redesign the power jack. Where as clevo has the right power jack to start with. 4 pin jack.Click to expand... -
Meaker@Sager Company Representative
I have attached a comparison of your single 880M score vs. my SLI 780Ms.
Attached Files:
-
Dell adds Nvidia GTX 880M / 860M, AMD R9 M290X to Alienware 17 & 18
Discussion in 'Alienware' started by danijelzi, Jan 17, 2014.