The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Brace yourself: NEW MAXWELL CARDS INCOMING!

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Jul 14, 2014.

  1. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Architectures have a finite number of process generations they work over; AMD may have tuned theirs to shrink lower.
     
  2. justinb127

    justinb127 Notebook Enthusiast

    Reputations:
    0
    Messages:
    40
    Likes Received:
    0
    Trophy Points:
    15
    Is there anywhere i can preorder a laptop with the new 880m or 980m i was going to get one but decided not too seeing as though they will be out within the next month or so i dont want to spend cash now on something then right around the corner it something better is there well thank you
     
  3. Mr Najsman

    Mr Najsman Notebook Deity

    Reputations:
    600
    Messages:
    931
    Likes Received:
    697
    Trophy Points:
    106
    Y u no punctuation? As far as I know, no preorders are available yet. There's probably an NDA until the Nvidia event September 10-11th, and preorders will appear after that.
    And just to clarify, the 880M has been out for a while. The 980M and its lesser siblings are the new ones.
     
  4. justinb127

    justinb127 Notebook Enthusiast

    Reputations:
    0
    Messages:
    40
    Likes Received:
    0
    Trophy Points:
    15
    Sorry... I meant the new 880MX or 980M; I don't know what the new card will be called yet. Thank you.
     
  5. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Yeah, can't wait.
    I sincerely hope the live streaming is a ton better than the recent AMD one. That was a complete waste of time X_X

    Zero details have been given about the GTX 980M.
    We will have much more information to go by once GPU-Z leaks from the GTX 880 hit the internet. Preferably the GTX 980M, but Nvidia is keeping their cards close to their chest this time.

    I don't think we will see any information from Nvidia regarding the GTX 980M until AMD has released the M295X.
    Nvidia has got to market the 980M, you know. "GTX 980M is xx% faster than M295X. Go buy it" etc.
     
  6. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    I'm not gonna lie, part of this whole 880M debacle and 980M wait is why I'm going back to desktops after a 10 year hiatus. I mean yeah I use a desktop at work but I haven't used a gaming desktop since 2004.

    Pretty much settled on X79 with 4930K now. Good performance that doesn't break the bank (or rather, I don't feel like throwing wads of cash at a 3 year old platform), and those extra 2 cores will come in very handy for some of the molecular modelling stuff I do on occasion.
     
  7. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    If you think that a socket that is EOL is going to be better than a socket that should still have 3 years of life, suit yourself.
     
  8. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    X99 only supports DDR4, and I don't like paying more for less. Plus Haswell's inconsistency and poor OC headroom just about cancel any IPC advantage it has over Ivy Bridge.

    The only things I'm missing on X79 are native USB 3.0 support and more than 2 native SATA3 ports. Both are things I can live with.
     
  9. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    http://www.vcpost.com/articles/25819/20140825/nvidia-geforce-gtx-880-870-gm204-gpu-release-date.htm

    Here are the complete rumored specs of the Nvidia GeForce GTX880:

    28 nm GM204 silicon
    7.9 billion transistors
    3,200 CUDA cores
    200 TMUs
    32 ROPs
    5.7 TFLOP/s single-precision floating-point throughput
    256-bit wide GDDR5 memory interface
    4 GB standard memory amount
    238 GB/s memory bandwidth
    Clock speeds of 900 MHz core, 950 MHz GPU Boost, 7.40 GHz memory
    230W board power
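For what it's worth, those rumored throughput and bandwidth figures hang together arithmetically. A quick sanity check using the usual rules of thumb (2 FLOPs per CUDA core per clock; bus width in bytes times effective memory data rate), not anything official:

```python
# Rough sanity check of the rumored GTX 880 numbers (rules of thumb, not official figures)
cores = 3200
core_clock_ghz = 0.900           # rumored base clock
mem_clock_gbps = 7.4             # effective GDDR5 data rate per pin
bus_width_bits = 256

tflops = 2 * cores * core_clock_ghz / 1000        # 2 FLOPs (FMA) per core per clock
bandwidth = bus_width_bits / 8 * mem_clock_gbps   # bus width in bytes * data rate

print(round(tflops, 2))      # 5.76  -- close to the quoted 5.7 TFLOP/s
print(round(bandwidth, 1))   # 236.8 -- close to the quoted 238 GB/s
```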

    If true... Damn.

    Sent from my HTC One_M8 using Tapatalk
     
  10. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Forgot about the death of DDR3 as well; it's going to be a build that won't expand at all. But it doesn't matter, I don't consider PCs upgradeable anymore.
     
  11. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    I NEED DIS NAO

    GPU upgradeability is all I care about at this point. Honestly, IMO pushing DDR4 on X99 is going to become its undoing. At least for the next year, you'll be paying through the nose for DDR4, unless you settle for the bottom-of-the-barrel 2133MHz variety that will have terrible latency and perform worse than DDR3 at the same speed while still costing you more. I predict it won't be till 2016 that X99 finally reaches its full potential, but by that point you might as well wait for Skylake.

    In any case, I'm not buying anything until GTX 880 is released and performs as rumored with an equally impressive price tag (<$500) to match. I guess I'm technically also waiting for X99 as well but I don't have high hopes for it.
     
    Belly3D likes this.
  12. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    8-core Haswell-E looks very tempting for video editing, especially since I'm planning on moving to 4K video production as soon as I get some extra money. If the 880 rumors are true, then I'm definitely building a desktop. Cheaper and faster performance that will blow any laptop out of the water will be nice. Plus, a 4K monitor will be nice. The 980M will really have to blow my mind at this point. Laptop gaming is killing my shoulders because of the small screen real estate.
     
  13. Link4

    Link4 Notebook Deity

    Reputations:
    551
    Messages:
    709
    Likes Received:
    168
    Trophy Points:
    56
    7.9 billion transistors and 3,200 cores? That's not GM204 anymore. It's either BS or someone decided to mix the specs of GM200 with the GTX 880's known specs (memory bus and bandwidth).
     
  14. irfan wikaputra

    irfan wikaputra Notebook Consultant

    Reputations:
    80
    Messages:
    276
    Likes Received:
    94
    Trophy Points:
    41
    OMG, I thought mobile Maxwell would be delayed until next year??
    And I just bought my P377SM-A with GTX 870M SLI :(
     
  15. Mr Najsman

    Mr Najsman Notebook Deity

    Reputations:
    600
    Messages:
    931
    Likes Received:
    697
    Trophy Points:
    106
    No, it looks like an October release, but the actual performance difference is still uncertain. They might save some juice for 2015, so don't despair just yet.
     
  16. irfan wikaputra

    irfan wikaputra Notebook Consultant

    Reputations:
    80
    Messages:
    276
    Likes Received:
    94
    Trophy Points:
    41
    How can I not despair?
    After reading through this thread, there is a ridiculously high chance that my GTX 870M SLI will be beaten by a single GTX 970M, or a GTX 980M if I'm lucky enough, lol.
    And they would obviously cost less than what I just paid for the machine.
    Well, blame the technology.
     
  17. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Why so few ROPs?
     
  18. irfan wikaputra

    irfan wikaputra Notebook Consultant

    Reputations:
    80
    Messages:
    276
    Likes Received:
    94
    Trophy Points:
    41
    Because that's what you get with a 256-bit memory bus:
    16 ROPs = 128-bit
    24 ROPs = 192-bit
    32 ROPs = 256-bit
    40 ROPs = 320-bit
    48 ROPs = 384-bit

    It's 8 ROPs per 64 bits of bus width.
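That fixed pairing can be sketched in a couple of lines (an illustration of the ratio described above, assuming Nvidia's usual 8-ROPs-per-64-bit-memory-controller layout):

```python
# Each 64-bit memory controller partition carries 8 ROPs in Nvidia's
# Kepler/Maxwell-era designs, so ROP count follows bus width directly.
def rops_for_bus(bus_width_bits):
    return bus_width_bits // 64 * 8

for bus in (128, 192, 256, 320, 384):
    print(f"{bus}-bit -> {rops_for_bus(bus)} ROPs")
# A 256-bit bus gives 32 ROPs, matching the rumored GTX 880 spec.
```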
     
  19. deadsmiley

    deadsmiley Notebook Deity

    Reputations:
    1,147
    Messages:
    1,626
    Likes Received:
    702
    Trophy Points:
    131
    Even if this is true, you still got GTX 870M SLI for about $50 over the cost of a single GTX 880M. That's not a bad deal. Plus if you look around the forums you will notice people still using older SLI systems. Why? Because they are still viable. My next system will be SLI for this very reason.
     
    irfan wikaputra and gabrielmocan like this.
  20. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    The GPU-Z shot from the GTX 870 said 1664 cores (13 SMM).
    It may not be accurate though, since GPU-Z only added support for the GTX 870 a couple of days later.



    But say it is.
    Would you guys really expect the GTX 880 to have 3200 (25 SMM) cores? I think the GTX 880 (and GTX 980M) will have at most 2560 cores, but looking at the GTX 870, I'd say 1920 (15 SMM) cores sounds more plausible.
     
  21. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    SLI looks more expensive but has much more longevity

    Sent from my Nexus 5 using Tapatalk
     
  22. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Are you sure? Where can I find more info on this? As far as I know, ROPs have been decoupled from any fixed bus-width dependency, because you can see AMD cards with a 384-bit bus and 32 ROPs (Tahiti). Or maybe the dependency is only a design choice on Nvidia's part? (Then again, the R9 290X does have 64 ROPs with a 512-bit bus)
     
  23. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    Nvidia have never deviated from that ratio.
     
  24. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Makes sense. Though it is not fixed as a rule, it is a design choice. And yeah I did notice all their GPUs follow that ratio.
     
  25. joshwaan2k

    joshwaan2k Notebook Guru

    Reputations:
    0
    Messages:
    56
    Likes Received:
    10
    Trophy Points:
    16
    I've got a 770M in my Dell M6700, which is great. I'm looking at an 870MX; do you guys think the TDP would be about 75W instead of 100W like the 870M?

    Also, do you think MXM 3.0b will still be used with this line of cards?
     
    octiceps likes this.
  26. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    For Maxwell it should be. For Broadwell maybe not! Although I'd like to hope they keep the MXM 3.0b form factor and connector and just change a few small things to make it backwards compatible.
     
  27. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    This is very interesting.

    There is a rumor going around on a Chinese MSI forum that the MSI GT72 will be listed with both the GTX 970M and the GTX 980M on October 11th.
    The NDA might end then.

    Makes sense regarding the October availability I talked about earlier. Cheers :)
     
    trvelbug, Mr Najsman, RMXO and 2 others like this.
  28. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well... let's see if that 980M is gonna be 100W and better than the 780Ms in every respect =D
     
    Cloudfire likes this.
  29. RMXO

    RMXO Notebook Deity

    Reputations:
    234
    Messages:
    722
    Likes Received:
    242
    Trophy Points:
    56
    Let's say it does come out: would the current MSI GS60/70 Stealth Pro, Gigabyte P35W v2 and its barebone variants be able to take a 970M? Or no, due to needing a new motherboard to take advantage of the new architecture?
     
  30. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,323
    Likes Received:
    3,820
    Trophy Points:
    431
    It will be compatible. The socket did not change. It will probably change with Pascal.
     
    Cloudfire and RMXO like this.
  31. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I'm still guessing the GTX 980M will be a 100W GPU based on the GTX 880, and the GTX 970M will be a 75W GPU based on the GTX 870.
    I hope I am right, because that would put the GTX 980M quite high up in performance, since the GTX 870 scored X4600 in 3DMark11.

    Shouldn't be a problem at all as long as they don't use soldered GPUs
     
    RMXO likes this.
  32. Rupp3r

    Rupp3r Notebook Geek

    Reputations:
    7
    Messages:
    95
    Likes Received:
    3
    Trophy Points:
    16
    Do you think it will come out on the GT70 as well? (the 980M)
     
  33. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Isn't it about time they concentrate on the GT72 and remove the GT70 from the market, don't you think? :)
     
    D2 Ultima likes this.
  34. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I'd be more worried about being locked out due to BIOS issues.
     
    Mr Najsman likes this.
  35. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    New generation machines usually have different connectors.
     
    RMXO likes this.
  36. RMXO

    RMXO Notebook Deity

    Reputations:
    234
    Messages:
    722
    Likes Received:
    242
    Trophy Points:
    56
    Care to share more about that?
     
  37. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    The BIOS probably would simply prevent you from putting in a non-supported card.
     
    RMXO likes this.
  38. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    The GS60 already has a mobo with the 850M, which is Maxwell. So when needed, the rewiring of the mobo should go faster, or so I think.

    The P35W v2, on the other hand, doesn't have a Maxwell model, and thus should suffer more. However, Gigabyte was one of the first OEMs that made the Maxwell GPUs available.


    Basically, the main problem here is the wiring of the mobo and what needs to be changed to support the new GPU arch.

    In the soldered GPU realm, that's what matters.
     
    RMXO likes this.
  39. Templesa

    Templesa Notebook Deity

    Reputations:
    172
    Messages:
    808
    Likes Received:
    247
    Trophy Points:
    56
    I'm really glad that nVidia decided to focus mobile first; this may give us the last true 100% gains generation. (Or hopefully close...)
    I also hope it lights a fire under AMD's butt and they get it together!
     
    Cloudfire likes this.
  40. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Doesn't look like the lack of competition from AMD is stopping Nvidia, no.
    It looks like AMD and Nvidia have shifted roles though. AMD was almost always first with new architectures; now it seems Nvidia is leading the pack. Someone has to take the first step, I guess.

    Not sure what to make of the M295X though. Things are still up in the air on that one, but it doesn't look very promising.
     
  41. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I want AMD to bring out something amazing though. I want them to try to take back the mobile market from nVidia. In most desktop forums when someone wants power, they just throw R9 280X or R9 290X cards at people for good prices and most people I see just consider nVidia overpriced, whereas here we NEED to have nVidia for the good stuff. Anybody with crossfire 7970Ms or an Enduro laptop from 2012 or so will probably never want to go back to AMD in the mobile market.

    So let's see AMD bring the thunder, the fire, the water, the army of cows, let them bring everything.
    As for nVidia, I don't want a 680M-esque card with such limited memory, and I don't want a 880M overheat-central-with-crash-flags card. Then, we'll be set for a nice long time =D. Hopefully both companies can pull through and provide a healthy competition scale on the mobile market. I am about 300% certain the 880M would have been a fantastic and perfectly amazing card with 0 issues had there been something even close to 780M power from AMD
     
    Cloudfire likes this.
  42. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    My 7970M CrossFire was OK; there was nothing inherently bad about it that made me hate AMD's guts. There was this weird issue, though, where it looked like one frame was sliding down in front of the other frame, with horizontal bars running across the screen, that only showed itself in certain games.

    Sent from my Nexus 5 using Tapatalk
     
  43. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
  44. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    That's because AMD's drivers back then were a mess in regards to frame pacing in CrossFire. They still haven't fixed it in DX9 games and probably never will.
     
  45. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    That was how CrossfireX rendered your game in the past, and should not happen now. It's the frametime difference that people started making a huge deal about recently. CrossfireX previously rendered a small square of your screen, approximately 1/2 the pixels, on 1 card, and the L-shaped border to fill the screen was done by the other card, and it could cause green lines or sliding frames.

    Again, it shouldn't happen now, but it was a problem in the past I think.

    Edit: Also, you're one of the few people I've seen who had dual AMD GPUs in the mobile market who does not dislike them
     
  46. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    That sounds like some really funky SFR stuff going on there. SLI and CrossFire have both used AFR for years in games.
     
  47. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I knew CrossfireX had AFR, but my friend who had dual 7870s showed me how that worked and said that he got issues like that, which is why he sold off his second 7870 and ran a single card for a year until he got his R9 290. Oh well. This is why I haven't written a CrossFire guide. XD

    But still, I was shown and told how it happens by someone who had the cards.
     
  48. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    It was really weird, it would happen regularly in Payday 2 but most games would be fine. Also the problem would happen pretty much in every game I ran when I output through the mini-displayport to a U2713. Kind of turned me off tbh and I'm not too upset about not having the r2 anymore because of it.
     
  49. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Well, the big issue with CrossFire in the past was that there was no frame metering going on in either hardware or software. Each card would simply render and send its frames to the display as fast as possible with no regard to synchronizing/spacing them out evenly, which is why you had frames colliding/overlapping with each other and causing stepped tearing and runts, or dropping entire frames altogether. All of which, of course, added more frame time variation on top of what is already inherently present in multi-GPU configurations and contributed greatly to so-called microstuttering. Some games were so bad that they were dropping or runting every other frame, which meant your actual frame rate was half of what was being reported by FRAPS and you were getting an objectively worse experience with CrossFire than with a single GPU.

    AMD later added software frame metering through their drivers for pre-GCN 1.1 cards while GCN 1.1 cards additionally have hardware frame metering through XDMA engine. An issue with this is that there is a little bit of overhead as the driver is essentially limiting how fast each card can go in order to smooth out the overall delivery of frames. Absolute performance (how fast the cards can render if completely unfettered) is reduced while actual real-world performance goes up.

    SLI hasn't suffered from this problem because Nvidia's had hardware frame metering on their cards since G80.

    Yep, and Payday 2 is DX9 so frame pacing has always been broken in it.
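The effect of frame metering described above can be shown with a toy model (a simplified sketch, not AMD's or Nvidia's actual driver logic): two GPUs alternate frames, the second starting just behind the first, and the driver optionally holds each completed frame until a minimum pace has elapsed.

```python
def unmetered(completions):
    """Each frame hits the display the moment its GPU finishes it."""
    return sorted(completions)

def metered(completions, pace_ms):
    """Driver holds each frame until at least pace_ms after the previous present."""
    presented, last = [], float("-inf")
    for done in sorted(completions):
        t = max(done, last + pace_ms)
        presented.append(t)
        last = t
    return presented

# Two GPUs at 30 ms/frame, the second kicked off only 2 ms behind the first:
completions = [30.0, 60.0, 90.0] + [32.0, 62.0, 92.0]
gaps = lambda ts: [round(b - a, 1) for a, b in zip(ts, ts[1:])]

print(gaps(unmetered(completions)))      # [2.0, 28.0, 2.0, 28.0, 2.0] -- runt/long pairs, i.e. microstutter
print(gaps(metered(completions, 15.0)))  # [15.0, 15.0, 15.0, 15.0, 15.0] -- evenly paced
```

The metered frames arrive slightly later on average, which is the small overhead mentioned above, but the spacing between them is uniform.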
     
    Splintah and Robbo99999 like this.
  50. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    So do you guys trust me about GTX 980M and GTX 970M now? ;)
    Desktops will also be on the 900 series along with mobile.

    Only at VC: NVIDIA to skip GeForce 800 series, GeForce GTX 980 and GTX 970 mid-September | VideoCardz.com

    September 9-10th is when Nvidia will have the big announcement. Put a big cross on your calendar. We are veeeeery close :)
     
    Mr Najsman and Killerinstinct like this.