The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Dell adds Nvidia GTX 880M / 860M, AMD R9 M290X to Alienware 17 & 18

    Discussion in 'Alienware' started by danijelzi, Jan 17, 2014.

  1. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    I think I know which video you're referring to, and I believe the laptop was running Eyefinity. I would like to see a single-screen performance benchmark; my guess is the settings will be higher, but not by much.

    Sent from my One using Tapatalk
     
  2. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    In non-Enduro mode the 290X will crush the 770M at every turn.
     
    bumbo2 and unityole like this.
  3. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    Hmmm... no 880M on the Alienware Germany site.
     
  4. Turmoil

    Turmoil Notebook Evangelist

    Reputations:
    115
    Messages:
    485
    Likes Received:
    30
    Trophy Points:
    41
    Yeah, I tried looking it up and it's not there... I just ordered the beast in my signature... now I'm thinking maybe I should cancel the order and see what happens?
     
  5. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    If you have the 780M on the way I would not worry about the 880M.
     
  6. thegh0sts

    thegh0sts Notebook Nobel Laureate

    Reputations:
    949
    Messages:
    7,700
    Likes Received:
    2,819
    Trophy Points:
    331
    For some reason that gives me confidence in my decision to go with the 780M over an 880M that hasn't even officially come out yet, as well as the under-spec'd R9 M290X, which is really a rebrand of a rebrand. That in itself doesn't bode well.

    So 780M it is.
     
  7. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    I wouldn't say it's exactly a rebrand of a rebrand. You don't call the 4940MX a rebrand of the 4930MX, because it is likely more optimized, which is usually the trend. Given more time, companies are usually able to fix and optimize things before releasing a better version. But all in all, yes, we see it as a rebrand, because if the performance difference can be covered by simply overclocking, it's not something worth that much money.
     
  8. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    People are running desktop GTX 680s with 2GB of VRAM just fine, ezpz. The 880M is the exact same chip but clocked even lower, and they slap 8GB of VRAM on it. It's nothing but an enormous gimmick to fool laymen and a waste of perfectly good GDDR5. It's almost criminal how absurd it is. There is no way GK104 with its 256-bit bus can drive anything requiring that much memory. So dumb.

    It's POSSIBLE to make an argument for 4GB (SLI/Crossfire in high-resolution setups), but there is no justification for this other than big numbers to fool people who don't know better.
     
    unityole likes this.
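    A rough, back-of-the-envelope check of the bandwidth point above (a sketch only; the 256-bit bus and 5 GHz effective GDDR5 figures are the commonly quoted specs for this part, not numbers taken from this thread):

        # How long would GK104 take just to read 8 GB of VRAM once at its
        # quoted peak memory bandwidth? Rough, assumed figures throughout.
        bus_width_bits = 256        # GK104 memory bus width
        effective_clock_ghz = 5.0   # commonly quoted GDDR5 effective data rate
        bandwidth_gbs = bus_width_bits / 8 * effective_clock_ghz   # ~160 GB/s

        vram_gb = 8
        seconds_per_full_pass = vram_gb / bandwidth_gbs            # ~0.05 s

        frames_at_60fps = seconds_per_full_pass / (1 / 60)
        print(f"Peak bandwidth: {bandwidth_gbs:.0f} GB/s")
        print(f"One pass over {vram_gb} GB: {seconds_per_full_pass * 1000:.0f} ms "
              f"(~{frames_at_60fps:.0f} frames at 60 fps)")

    In other words, merely touching 8 GB once would take roughly three 60 fps frames at peak bandwidth, which is the gist of the objection (though, as noted further down, bus width alone is not the whole story).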
  9. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    Saying it's due to the 256-bit bus does not really mean much; it's down to other things, really.
     
  10. Turmoil

    Turmoil Notebook Evangelist

    Reputations:
    115
    Messages:
    485
    Likes Received:
    30
    Trophy Points:
    41
    Can you go into detail on this? I'm stoked with the build I've got coming, but I have a feeling they'll be dropping the 880M in the next few weeks, and I know I'll have the ability to return what's coming and order a new one... maybe not as deeply discounted as I got my current build for, but still...
     
  11. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    I meant that the GPU is not powerful enough to drive anything that would need that much VRAM in the first place. In any case, you'd have to be deliberately trying to fill it up; you'd never see that kind of usage normally. The 780 Ti and 290X could probably drive 4K just fine with their 4GB or smaller frame buffers.

    The 880M will have better resale value, but that is the only benefit.
     
  12. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    IMO the 880M with 8GB of RAM is just stupid, although AMD isn't any better. At this point, cramming in more memory that we'll never use at 1080p will just increase overall system heat and waste more power. Alienware's 330W PSU is already at its limit, yet they pull stuff like this, LOL.
     
  13. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    The weird thing is I feel like NVIDIA are shooting themselves in the foot, in that the 980M and any future GPU they release is also going to have to have 8GB, or otherwise the same people they are fooling with the 8GB are going to wonder why the new GPU has less. GDDR5 is expensive; they are setting the bar too high, and they'll always have to reach that bar for any new high-end mobile GPU they release. It pretty much rules out any possibility of a more cheaply priced NVIDIA flagship in the future.
     
  14. gschneider

    gschneider Notebook Evangelist

    Reputations:
    248
    Messages:
    568
    Likes Received:
    166
    Trophy Points:
    56
    I don't understand why they have an 8GB 880M, yet a 3GB 780 Ti, a 6GB Titan Black, and then 4GB 680s? This all makes no sense. I mean, is 3GB for a 780 Ti enough, or should I be like, hey, let's get a 690 with 4GB or a Titan Black with 6GB? There needs to be natural progression. Throw into that the Titan Black with its extra compute ability over a 780 Ti and you're like, what the hell!!!! What do I buy? All I want to play is Crysis 3 lol :'(
     
  15. vs3074

    vs3074 Notebook Evangelist

    Reputations:
    349
    Messages:
    588
    Likes Received:
    159
    Trophy Points:
    56
    Does anyone have any idea when the 880M will be available in Aus? Usually it's around a month behind the US :(
     
  16. sangemaru

    sangemaru Notebook Deity

    Reputations:
    758
    Messages:
    1,551
    Likes Received:
    328
    Trophy Points:
    101
    I remember reading in the Maxwell whitepapers that a higher VRAM amount is a necessity because of the way the pipelines of the new Maxwell cores are fed. It's not only about how many assets you're trying to fit into the framebuffer.
    Besides, there are already at least ten 3K+ resolution laptops. Get used to it.
    As a game developer I can tell you that we're already seeing huge bottlenecks in memory address space preventing us from using high-quality assets even though there's power available for it.
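    To put some rough numbers on the asset-memory point above (illustrative sizes only, not figures from this thread; real engines use block compression, which cuts these by roughly 4-8x):

        # Approximate VRAM cost of uncompressed RGBA textures with full mip chains.
        def texture_mb(width, height, bytes_per_pixel=4, mipmaps=True):
            base = width * height * bytes_per_pixel
            total = base * 4 / 3 if mipmaps else base   # full mip chain adds ~33%
            return total / (1024 ** 2)

        for size in (2048, 4096, 8192):
            print(f"{size}x{size} RGBA: ~{texture_mb(size, size):.0f} MB with mips")

    A few dozen 4K-class textures plus the render targets for a 3K panel already add up to several gigabytes, which is the kind of address-space pressure being described.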
     
  17. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    Well, 8GB on a GK104 is still a little questionable, but it's not really hurting anything either.
     
  18. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    So somehow the 860M is Maxwell and uses way less power, but the 880M is just a rebrand with more memory? How the hell does that work, NVIDIA? And just when will we see 20nm + Maxwell?
     
  19. Turmoil

    Turmoil Notebook Evangelist

    Reputations:
    115
    Messages:
    485
    Likes Received:
    30
    Trophy Points:
    41
    From the information available, that seems to be the case... I already have an Alienware 17 on order with the 780m so I'll just hang on to that and wait to upgrade to a Maxwell chip... which brings me to the question: Will I, currently with a Kepler, be able to upgrade to a Maxwell when they are released?
     
  20. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    When TSMC has a 20nm process that's ready to produce chips lol.
     
  21. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    Well, aside from 20nm, the 860M uses the new Maxwell but the 880M doesn't. It seems Johnksss has tested the 880M, and per Tech Inferno it is strictly a 780M rebrand with more VRAM and that's it, whereas the 860M, a lower-performance part, gets the new design.
     
    Mr. Fox likes this.
  22. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    From what I can see from John's testing, GTX 880M is going to be at least slightly better than GTX 780M in much the same way that GTX 780M was better than GTX 680M. Clock-for-clock they appear about the same, but that's with no true driver support for GTX 880M yet and a vBIOS mod that is still a work in progress. We have already seen that it is probably going to overclock even better than GTX 780M, and that alone makes it worth having if you find great value in that like some of us do. Remember, in the beginning GTX 680M gave GTX 780M a real run for the money, but now there is no question of GTX 780M's performance superiority. The only thing we can say with certainty right now is that a stock vBIOS sucks on just about any NVIDIA GPU.

    So while it may be the same with more vRAM, it may not be by the time the fat lady is done singing. I still think it's too early to make the call... time will tell whether it is "only a rebrand" or clearly better. NVIDIA is doing what appears to be some back-door tweaking with their drivers, so they might surprise us with more than we can see. They might even quietly tone down GTX 780M performance just a touch through drivers (yes they can do that) to make GTX 880M look better. More to come as we watch the story unfold.

    Sounds like you're talking about Haswell, LOL. Newer ain't always better. What the chip designers are marketing (i.e. telling lies to make the sale) as efficient might actually be deficient when it comes to performance. If that's the case, they can stuff their "efficiency" nonsense where the sun don't shine. ;)

    And what's up with the "well" naming kick these chip makers are peddling? Haswell, Maxwell, Broadwell... what next? Cornwell? Coswell? Roswell? Well, well, well? :D Hopefully none of them go down the same road of performance mediocrity as Haswell or it won't end well.
     
  23. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    Just a couple of things I'd like to disagree on.
     
    Mr. Fox likes this.
  24. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    The thing I worry about is that efficiency might totally go out the window once overclocked - just like Haswell. Then we are back to simply comparing performance increases. Sure, low temps will be nice, but it's not like 780Ms run hot anyway. What's more, 330W for 2 x 780Ms is plenty for most people, since they are usually too worried to even OC, let alone keep an eye on temps. I've noticed most people that harp on about the 780M using too much power haven't even attempted to tap it out anyway, or got lumped with power-hungry Haswell, which also claimed 'efficiency'; more often than not they are repeating what they hear. I don't see why people are waiting around for an efficient 8xx when, at the end of the day, if performance is what they want, we've got the daddy 780Ms sitting under their noses for bargain prices right now.
     
    Mr. Fox and unityole like this.
  25. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    Brother John has already shown us that the 880M can do more than the 780M... I haven't seen a single 780M that can touch results like this single 880M. Considering the driver support is not even present yet, this seems like something with the potential to shape up into more than a little bit better, not just a rebrand with more vRAM. We will need to wait and see, but I am expecting more than a rebrand with respect to performance potential.

     
  26. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    Did you run any firestrike extreme on it?
     
  27. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    Since I don't understand GPU benchmark scores, better as in how much better in % performance?
     
  28. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Here's what NVIDIA are claiming and releasing for CeBIT.

    image.jpg

    To be honest it doesn't look like a high-end 20nm Maxwell will be out for a long time. NVIDIA seem to be very careful about making sure each GPU tier has only a marginal increase, whether it be Kepler or Maxwell. Anyway, for what it's worth, they are claiming a 15% increase with the 880M over the 780M.

    Mixed technologies as part of the 8xx series, with the high-end flagship still Kepler, makes me think they are still a way off from making things scale well for the higher-end cards.
     
    Mr. Fox likes this.
  29. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    I'll let you do the math... ;)

    Single 780M below compared to the above single 880M. Note the benchmark score and the difference in max stable overclock on core and memory. Without proper driver support and a fully tuned vBIOS it is still an improvement. Give the drivers and vBIOS more time and the gap will most likely widen further.
    [IMG]
     
  30. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    That chart is just so wrong, there to blind the public, and we Alienware folks are better than that. NVIDIA posted the chart comparing the default 780M clock to the default 880M clock, and that's where they get 15%. From my point of view we are comparing a 3920XM and a 3940XM here, where the 3940XM is likely to be better optimized and nothing more. For the real performance talk, bring out the big guns: OC the 880M to its max and compare it to the 780M at its max, and see if we still get 15%. If we do, great job NVIDIA, they optimized the card like true masters; if not, it totally makes sense, because it's well within the realm of reason.

    images.jpg

    As for overclocking, the card may or may not OC well; that's not too much of a worry. There's competition in GPUs, not so much in CPUs, so even if Haswell won't OC well, GPUs will probably do just fine.
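    For what it's worth, the ~15% in NVIDIA's slide falls out of the stock core clocks alone. A quick check, using the commonly listed base clocks (they are not quoted in this thread, so treat them as assumptions):

        # Claimed stock-to-stock gain from core clocks alone (assumed base clocks).
        gtx_780m_base_mhz = 823
        gtx_880m_base_mhz = 954

        gain = (gtx_880m_base_mhz - gtx_780m_base_mhz) / gtx_780m_base_mhz
        print(f"Stock-to-stock core clock increase: ~{gain:.1%}")   # ~15.9%

    Which is consistent with the point above: the slide can be explained by clock speed alone, so the interesting comparison is still max OC vs max OC on the same chip.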
     
  31. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Hence why I wrote NVIDIA are 'claiming'. Yes, I'd also take it with a pinch of salt. However, you asked for a %, so that NVIDIA-speak seemed to fit the bill :p Really, you need to look no further than Johnksss's (lol, too many 's' going on) OC testing on page 1 to get a decent idea. It showed a measurable improvement, so 15% down the road could be entirely possible once a decent set of drivers comes out.

    Yeah it could have been better silicon but that unknown factor is always going to be there so who really knows right now.

    Here's something interesting in this article. It says the 880M will also be available in 4GB, and flashing a 780M is entirely possible.

    QUOTE

    " As you can see below, GeForce GTX 880M is just a rebranded GTX 780M. Card still uses GK104 GPU, which is now used in 3rd generation straight as a mobile high-end model. The only difference is that GTX 880M will be available with 8GB memory, but of course there will also be 4GB models as well. There are sellers already offering a chip modification for GTX 780M owners. So if you feel like owning the first 800 Series GPU, this is the cheapest way to go."

    I'd like to know if Johnksss and SLV7 could find anything interesting in the vBIOS about the finer differences between the two cards - if any.
     
    reborn2003 and unityole like this.
  32. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    I recall John mentioning that the 880M overclocked higher than his 780M.
     
  33. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    Every sample will be different of course.
     
  34. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    Yes, that was what I thought, but even though it is off to a good start, I really doubt we'll see 15%, since the shaders and everything else are almost the same. As for card optimization, that I do not know, but John didn't mention anything about a super OC or way higher performance; he just said "a new king", I think.

    It sucks; I still can't believe NVIDIA and AMD are pulling this crap, lol, but I guess they can't help it with no 20nm. NVIDIA could have given the 880M Maxwell, but I don't know why that didn't happen. Maybe they wanted their next-gen mobile GPU to be 20nm + Maxwell and be way better than AMD's GPU to gain the most market share, who knows.
     
  35. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    Until we see some talented overclockers that prove otherwise, I am going to assume the new AMD mobile GPUs still suck compared to NVIDIA. We have a track record of lameness that needs to be broken with AMD. In performance, reliability and driver support there is a clear distinction with AMD winning at nothing but being the low price king in the mobile market.
     
  36. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    The 14.2 driver is amazing, with all the games supporting Mantle; it's awesome. Take a look at the desktop cards and their drivers; NVIDIA is barely on par with the new cards. On the mobile side AMD lacks, but not by much, especially with the newer drivers and all. For now I'm just going to assume the 880M isn't as good as it sounds, because it's practically a rebrand; even if you can OC it higher than a 780M, performance will be proportional to the percentage you can actually OC it.
     
  37. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    Take a look at this: Maxwell goes mobile with GeForce 800M series

    An 840M at 17 watts running Skyrim at 30 fps? Imagine if the 880M had Maxwell; then we probably wouldn't need dual AC adapters anymore, so the future of gaming laptops is looking brighter in terms of power consumption. Less power used means less heat, and if the GPUs themselves stay in good shape we could see some real OC, perhaps 1250 core and 1700 memory, LOL.
     
  38. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    You see the potential of not needing dual power bricks, I see the potential of performance if you push it there anyway ;)
     
    Mr. Fox and unityole like this.
  39. turilo

    turilo Notebook Consultant

    Reputations:
    59
    Messages:
    117
    Likes Received:
    44
    Trophy Points:
    41
    Will the new Alienware 17r5 with the current 770M GPU support these new 880M GPUs? Assuming so, the CPU also needs an upgrade from the 4700, correct?

    Sent from my SM-N9008 using Tapatalk
     
  40. ole!!!

    ole!!! Notebook Prophet

    Reputations:
    2,879
    Messages:
    5,952
    Likes Received:
    3,982
    Trophy Points:
    431
    Yes, I'd still want dual PSUs for the next flagship Clevo machine for sure. No point not pushing it, lol.

    It'd support them depending on the vBIOS; check with your laptop vendor.
     
  41. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    Instead of dual AC adapters it would be a lot nicer to have just one huge 600W to 800W AC adapter, then you would not have to mess with the junction box. You could still have two if you wanted to... one ordinary 330W for web surfing and business use, then the beastly big one for the more exciting stuff.

    I actually kind of like the fact that wicked hardware draws tons of power. It distinguishes those products and their users from the "meh mainstream" casual user. I am skeptical greatness will ever be achieved by hardware with conservative power demands. All that should be good for is creating more headroom for even more ridiculously extreme performance. In other words, if you could get a GPU that performs well at 75W and pushes 1250 core/1500 memory, why not push it to 120W and get 2800 or 3000 for a core overclock and still need 600W+, instead of being excited about low power consumption? The trouble with making the low-power trash is they make it low capacity, so you don't really gain anything in terms of performance, and you can't push it harder because it will blow up or melt. The notion of maintaining status-quo performance with less power consumption kind of sucks in my mind.
     
  42. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    The nice thing about two 330W bricks is that if I'm going somewhere and just need the machine for basic work, I can take a single brick with me; it's a bit lighter and takes up less space, especially when the junction box is tiny.

    I've been monitoring the power on each brick and it's very good at load balancing the bricks at high loads.
     
  43. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    That's what I said... you could have one 330W brick for that kind of thing and one huge one for the fun stuff. Same concept except you would not use two in unison, just one or the other.
     
  44. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Do you really need all 660W, though? If I remember correctly, you only reached like 480W. Am I off on that? My point being, if they can make something between the two options you've suggested, it may be the best of both worlds.
     
  45. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    The necessities in life :D
     
  46. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    For overclocking, yes, certainly for my system.

    I have drawn over 600W from the wall. Of course mine is a bit more power hungry but not absurdly so.
     
    Mr. Fox likes this.
  47. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    Wow, I didn't realize you could reach that level of overclock with these systems... I suppose you learn something new every day. :thumbsup: In that case, scratch my idea for something in the middle. The dual PSU modification would be the best option. Even if they were both 330W with some form of dock/adapter, you could still leave one at home and take the other on trips. ;)
     
  48. Johnksss

    Johnksss .

    Reputations:
    11,531
    Messages:
    19,450
    Likes Received:
    12,805
    Trophy Points:
    931
    Wow, lots of..er...umm.. stuff in this thread. :D
    Edit:


    Has nothing to do with the added vRAM, since most gaming has yet to pass the 4GB mark, but there are other programs that can use it.


    Refer to prior post


    I would cancel and re-order with 880Ms. The price should be pretty close. Or you can wait six months to a year for high-performance Maxwell.


    :D:D

    Yes. Refer to HWBOT for all tests run.


    This is HTWingNut and Cloudfire's area of expertise.



    That it did.


    And the 2 samples we tested did pretty much exactly the same thing in two completely different machines. Don't know if they are good or bad. Could be bottom of the barrel or top of the line. Who knows at this point.


    We haven't tested the 290 yet, but everything else below it is nowhere near up to our standards.



    All the driver optimizations in the mobile world will not help AMD here. Now, on the desktop side, that is a whole other story; the 290 is a very comparable card at benching and gaming. Not so much on the laptop side.
    If you want outstanding numbers you will need dual PSUs no matter what. That's the way of the world. The only difference now is... you can go higher with the same watts as prior gens.


    AW would need to redesign the power jack, whereas Clevo has the right power jack to start with: a 4-pin jack.


    I was able to pull just under 600W with mine.


    This is what we have proposed to AW, more so than not.
     
  49. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,203
    Messages:
    39,332
    Likes Received:
    70,615
    Trophy Points:
    931
    Actually, I have gone a little over 600W and tripped one or both of the breakers on my dual AC adapter setup while pushing the system hard several times with the R2 recently. I think that might be what knocked out one of my 330W AC adapters on the dual setup a few weeks ago. If you run 4.8GHz at 1.2V with the GPUs at 1200/1500, 600W does not go far at all. And that's with Ivy Bridge. I hit 140W with the 4930MX in the 18 the other day; Ivy rarely hits 100W. If the Alienware 18 were capable of using the dual AC adapter, I suspect it would be pulling closer to 700W. We don't know what it is capable of, and probably never will, because it is crippled so badly in terms of power handling. We're only recently discovering what the R2 is capable of given enough power to work with. It's unfortunate the 18 is the way it is.

    As you can see here in this video, a modest 4.3GHz on CPU and 1125/1500 and only 1.137V pulls about 560W.

    And, they should get cracking at that right away. No need to reinvent the wheel... just get with the program and use what's already available.
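    A rough power-budget sketch of why 660W of brick capacity gets tight at those settings (the CPU figure is the ~140W reported above for the 4930MX; the per-GPU draw, "rest of system" draw, and adapter efficiency are assumptions for illustration only):

        # Rough wall-power estimate for a dual-GPU machine under heavy overclock.
        cpu_w = 140                # peak reported above for the 4930MX
        gpu_w = 150                # assumed per-GPU draw for an overclocked 780M/880M
        num_gpus = 2
        rest_w = 60                # assumed: board, drives, fans, display
        adapter_efficiency = 0.88  # assumed AC adapter efficiency

        dc_load_w = cpu_w + gpu_w * num_gpus + rest_w
        wall_w = dc_load_w / adapter_efficiency

        print(f"Estimated DC load: {dc_load_w} W")
        print(f"Estimated draw at the wall: {wall_w:.0f} W vs {2 * 330} W of bricks")

    That lands in the same ballpark as the ~560-600W wall readings described in the thread, and it leaves very little margin once voltage and clocks go up further.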
     
  50. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,426
    Messages:
    58,174
    Likes Received:
    17,885
    Trophy Points:
    931
    I have attached a comparison of your single 880M score vs my SLI 780Ms.
     

    Attached Files:
