The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AMD 7970m vs GTX 680m

    Discussion in 'Gaming (Software and Graphics Cards)' started by x32993x, Apr 20, 2012.

  1. vuman619

    vuman619 Notebook Evangelist

    Reputations:
    381
    Messages:
    367
    Likes Received:
    2
    Trophy Points:
    31
    I'm fairly certain that ambient occlusion and TXAA are Nvidia-exclusive technologies, as is PhysX. I'll be glad to be proven wrong. Kind of sounds like we are going to be paying roughly $100 per technology, haha :rolleyes:
     
  2. Prasad

    Prasad NBR Reviewer 1337 NBR Reviewer

    Reputations:
    1,804
    Messages:
    4,956
    Likes Received:
    10
    Trophy Points:
    106
    But there are several games where I've enabled ambient occlusion while playing on my current laptop's ATI 6570M... Are you saying it isn't actually enabled at all?
     
  3. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,462
    Likes Received:
    12,847
    Trophy Points:
    931
    Thanks, brother Speedy.
    I wanted to see what effect it has on 3.0, and the other test would be 3DMark 11...


    since that's where it counts most. This is the reason Vantage was a good bench: you can't readily fool it into thinking something different. Turn LOD or tessellation on and off and the score is still the same. Same as if you ran one core or eight cores: the GPU score is still the same, with no influence from tessellation being off or from more cores.
     
  4. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    Ambient occlusion isn't an Nvidia-exclusive thing. And "ambient occlusion" is a generic term; there are multiple ways of doing it (e.g. HBAO, SSAO). It's like saying "antialiasing" without mentioning what specific kind of AA.
     
  5. Prasad

    Prasad NBR Reviewer 1337 NBR Reviewer

    Reputations:
    1,804
    Messages:
    4,956
    Likes Received:
    10
    Trophy Points:
    106
    Yes, and both HBAO and SSAO seem to work just fine on my ASUS laptop w/ ATI 6570M on BF3!
     
  6. vuman619

    vuman619 Notebook Evangelist

    Reputations:
    381
    Messages:
    367
    Likes Received:
    2
    Trophy Points:
    31
    Sorry, I forgot to mention any specifics; I was using it as a general term, as I was under the impression that AMD didn't have the technology to perform any form of AO. Thank you for the correction.

    That's good news :D as I haven't experienced AO before and would like to see it for myself on my 7970M.

    Can anyone else confirm whether AMD users receive TXAA?
     
  7. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    It's not that.

    Nvidia has the feature of driver enabled AO, which forces it in any designated game, regardless of whether said game has an AO option in its graphics option menu.
    With AMD we have driver enabled MLAA.
     
  8. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    AMD released a revised SSAA with 12.4 :).
     
  9. vuman619

    vuman619 Notebook Evangelist

    Reputations:
    381
    Messages:
    367
    Likes Received:
    2
    Trophy Points:
    31
    Speaking of drivers, I am really excited about Intel RST 11.5, which supposedly supports TRIM in an SSD RAID 0 array.

    OT:

    Can anyone link me to a page explaining MLAA and SSAA?

    Or does anyone know the differences between these and the combination of FXAA + MSAA, i.e. TXAA?

    All I get is Master Locksmith Association Australia and Sporting Shooter's Association Australia :eek:
     
  10. Arestavo

    Arestavo Notebook Evangelist

    Reputations:
    188
    Messages:
    559
    Likes Received:
    215
    Trophy Points:
    56
    Could you point me to some info regarding this? I've done Google searches, and I am unable to find information about this.
     
  11. vuman619

    vuman619 Notebook Evangelist

    Reputations:
    381
    Messages:
    367
    Likes Received:
    2
    Trophy Points:
    31
    I am 100% sure I read that somewhere, but I can't quite find it at the moment; I'm at work haha, but here is a snippet and another.

    I have 7 days to change my order from an AMD 7970M to a GTX 680M. C'mon people, where are the benchies :D I mean, TSMC, can we has 28nm chips?
     
  12. DrChips

    DrChips Notebook Guru

    Reputations:
    28
    Messages:
    55
    Likes Received:
    9
    Trophy Points:
    16
    Reading this thread from the beginning is like watching Groundhog Day.

    I think I've read approximately 50 pages of the same people reiterating the same speculation and self-praise that they've already given us 10 times in this thread.

    I hope the 680M is awesome, so I can regret my 7970M purchase but have something stunning to upgrade to.

    I'm getting tired of having to read 10+ pages every day looking for real info, only to read the same posts as the day before, reiterated in slightly different words by the same people. And I'm sure I'm not the only one.

    Hats off to those of you who put in the time to give us real numbers as soon as humanly possible. Slickdude, I like you.

    -Tristan
     
  13. drgandalf

    drgandalf Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    5
    Yooo vuman, how much did it cost to ship that Sager to Sydney? I'm there too.
     
  14. vuman619

    vuman619 Notebook Evangelist

    Reputations:
    381
    Messages:
    367
    Likes Received:
    2
    Trophy Points:
    31
    OT:

    I didn't order from Sager, I ordered from LogicalBlueOne. I didn't want to risk any warranty issues, so I went with a local reseller; plus they're great with communication and are cheaper than the 3 other resellers in AUS.

    Shipping was $29.99 (3-5 business days).
     
  15. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    With respect to benchmarks, especially games, there needs to be a more comprehensive system of testing and scoring. For example, static benchmarks like 3DMark 11/Vantage etc. should be weighted much less than actual games (which they typically are). However, even among games, SP titles should carry much less weight than popular AAA MP titles, which hold hundreds of hours of replay.

    For example, if a gamer finishes an 8-hour SP game, even if it's an awesome DX11 title, they likely won't go back to it, or at most once or twice, unless it has a huge modding community behind it. So if the Nvidia 680M gets 55 fps @ 1080p with Ultra settings and the AMD 7970M gets 70 fps @ 1080p, the fact that it is an SP game should weigh into how that result is scored on a performance scale.

    On the other hand, if a game like BF3 on Ultra @ 1080p (which stresses even high-end desktop cards) gets 60 fps with the 680M on a heavy MP level (multiple FRAPS runs should be taken for accuracy) vs 45 fps on the 7970M, then more weight should be placed on a benchmark like this, since it has a large community of players replaying the same title daily, unlike an SP game. This can extend to other MP titles as well; this is just one example.

    So before arguing about benchmarks take this into consideration and then decide which brand really provides the best performance and value for gaming. Of course there is always feature support, 3rd party development support and drivers that also make a huge impact. Most people here just simplify it to overall benchmark % + cost and that is quite misleading.
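    For illustration only, the weighting idea in this post could be sketched as below. The categories, weights, and the synthetic-bench ratio are invented for the sketch; the 55/70 and 60/45 FPS pairs are the hypothetical numbers from the post itself:

    ```python
    # Hypothetical weighting scheme: synthetic benches count least,
    # single-player games more, heavily replayed MP titles most.
    # The weight values here are arbitrary illustrations, not a proposal.
    WEIGHTS = {"synthetic": 0.5, "sp_game": 1.0, "mp_game": 2.0}

    def weighted_score(results):
        """results: list of (category, ratio) pairs, where ratio is
        card A's fps divided by card B's fps in that test."""
        total = sum(WEIGHTS[cat] for cat, _ in results)
        return sum(WEIGHTS[cat] * ratio for cat, ratio in results) / total

    # Card A (the "680M" in the example): ties the synthetic test,
    # loses the SP title (55 vs 70 fps), wins the MP title (60 vs 45 fps).
    results = [("synthetic", 1.00), ("sp_game", 55 / 70), ("mp_game", 60 / 45)]
    print(round(weighted_score(results), 3))  # 1.129
    ```

    With these (made-up) weights, the card that wins the heavily replayed MP title comes out ahead overall despite losing the SP benchmark, which is the point being argued.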
     
  16. vuman619

    vuman619 Notebook Evangelist

    Reputations:
    381
    Messages:
    367
    Likes Received:
    2
    Trophy Points:
    31
    I know, right? It's like half price for the US, and they're complaining "it's too much to upgrade" when they didn't spend half as much as we do in the first place.
     
  17. vuman619

    vuman619 Notebook Evangelist

    Reputations:
    381
    Messages:
    367
    Likes Received:
    2
    Trophy Points:
    31
    Wow, that was quite informative. I honestly wasn't considering the games themselves and how much value they add to each card's scores. Thank you very much, Joker. I am now determined to get the correct card once I can safely see the GTX 680M in use by the greater community.
     
  18. oan001

    oan001 Notebook Evangelist

    Reputations:
    256
    Messages:
    482
    Likes Received:
    0
    Trophy Points:
    30
    I agree! 10char
     
  19. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Since it's been happening often in this thread lately, I'll leave a general warning: do not bypass the language filter. It's there for a reason, and the forum rules explicitly say not to do it.

    Now back to the speculation until we have solid facts. :p Hoping to see some thorough 680M testing soon, as I'll soon be in the market for a new notebook.

    EDIT: since I saw someone ask the question, if you can't delete your own double post for some reason, report it and a mod will gladly remove it for you. ;)
     
  20. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Yeah, it looks like you are becoming a grumpy old man. Bad eyesight is also a good indicator that you are getting old. Bad memory too. A loss of logical thinking too. It doesn't look good for you.
     
  21. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,462
    Likes Received:
    12,847
    Trophy Points:
    931
    I'm going to have to disagree... since not a lot of people are actually running 3DMark 11 at Extreme settings (1080p); everyone is basing it off the 720p preset.
    If you get 60+ in all 4 tests in that version of the benchmark, then you're golden... I don't see that happening with this generation of mobile cards... maybe next gen, and that's a very big maybe. And that bench taxes your card far more than any game does at this current time.
     
  22. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30
    Indeed. The 3DM11 X score is a true test of a card's power. It's not a full-power stress test, but it shows how capable your card is overall.
     
  23. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,462
    Likes Received:
    12,847
    Trophy Points:
    931
    And just what exactly is a full-power stress test, other than Furmark? Because now I'm curious.
     
  24. arcticjoe

    arcticjoe Notebook Deity

    Reputations:
    66
    Messages:
    877
    Likes Received:
    67
    Trophy Points:
    41
    The whole point of benchmarks is to create an identical environment to accurately measure performance, which unfortunately is not possible in multiplayer games (too many variables, no way of ensuring that both machines are running off the same data).
    There will always be differences in optimisation between different applications, but you still get a reasonably accurate idea of the raw potential of a graphics card from a benchmark result.
     
  25. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    ^ I agree that you get a reasonable performance snapshot of the card, but that is mainly valid between cards of the same brand.

    When comparing these scores with a card in the same segment but from another manufacturer, people should consider that there are a few factors that can create a difference between them.
    Speaking of points, a huge difference on the order of 1,000 points should be taken as a real performance gap, but when you're in the 200-300 range it's not even worth having an argument.

    What people use these benchmarks for is mostly to brag or flame; there are, thank God, some exceptions in the form of users who bench in order to help others get an idea.

    Just to further explain my point, I've battled a few HD 6990Ms that supposedly should compete with the GTX 580M (I own the 485M).

    I was able to stay ahead of them in some benchmarks but was a bit behind (200ish) in some others.

    Bottom line: in game tests and benchmarks, both cards differ by maybe 1-2 fps, or they don't differ at all.
    Same happened with the 6970M vs 6990M, IIRC.

    Fact is that in real-life situations you can't be sure that the fabulous 10% (or more, as people say) advantage in benchmarks actually does anything ground-breaking.

    In my book, as long as it stays above 55 fps, it's fine. Also, AMD has yet to release an official driver for the HD 7970M, so as I said there's much room for improvement. Nvidia's drivers were already quite mature thanks to the desktop Kepler cards.

    I just want to put everything into context, guys. Throwing percentages and numbers around doesn't help at all; it can confuse people more than help them.

    OK, so BF3 does 60 fps Ultra/1080p on the 680M?
    Who cares. I'll do Very High with FXAA/AF and 2x MSAA at over 75 fps and call it a day.

    That's just an example, but that's how it should really work.
     
  26. fantomasz

    fantomasz Notebook Deity

    Reputations:
    147
    Messages:
    1,113
    Likes Received:
    0
    Trophy Points:
    55
    Any news about the Samsung with the 7970?
     
  27. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    The Performance preset is not a very good indicator for graphics cards that are going to be used at 1080p with max settings. There is a reason most reviewer sites nowadays test Extreme settings with 3DMark 11/Vantage and use extreme tessellation with the Heaven benchmark. Notebookcheck is a joke: they test GPUs at 1280x1024 with Heaven, and test games at either ridiculous settings like 720p with no extra goodies, then suddenly go up to 1080p with every setting on. No middle ground to make the tests more interesting...

    A simple example is Crysis 2. At 1024x768, which is about the same resolution as the 3DMark 11 test, the 580M scores 121 FPS while the 7970M scores 102 FPS. But at 1080p the 580M scores 35 FPS while the 7970M scores 55 FPS...

    You won't hit the GPU (speaking of the 680M/7970M) with a typical load at 720p. You won't load the memory and memory bus enough to see whether it will suffer. What's going to be interesting is to see how the 680M/7970M perform at Extreme settings, but most importantly in games. Once you crank up to 1080p, 16x AF and 4x/8x MSAA, you will see the difference in hardware reflected in the FPS.
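    The flip in those Crysis 2 numbers is easier to see as ratios; a quick sketch using only the FPS figures quoted in this post:

    ```python
    # Crysis 2 FPS figures as quoted in the post above.
    low_res = {"580M": 121, "7970M": 102}    # 1024x768
    high_res = {"580M": 35, "7970M": 55}     # 1920x1080

    # At low resolution the 580M leads by roughly 19%...
    print(round(low_res["580M"] / low_res["7970M"], 2))    # 1.19
    # ...but at 1080p the 7970M leads by roughly 57%.
    print(round(high_res["7970M"] / high_res["580M"], 2))  # 1.57
    ```

    Which is exactly the argument: a low-resolution test ranks the cards one way, while the resolution people actually play at ranks them the other way.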
     
  28. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    You're probably going to have to wait another few weeks before cards start to trickle into private hands.

    From everything we have seen, if you can afford the +$300, and it's gonna let you sleep at night, then get the 680M. I can see that you are already leaning that way. Your laptop will be delayed till early to mid July.

    As of right now, I'm seeing +/- 5% between the cards, and we have confirmed through many owners how well the 7970M overclocks. We don't know how well the 680M overclocks yet, and it bugs the crap out of me that they put 900MHz VRAM on these cards, for 3.6GHz effective.

    Would the $300 be better spent on an SSD and some ice cream?
     
  29. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    We should parrot what the tech sites do, and come up with our own suite of standardized testing for GPUs.

    I'm willing to help brainstorm.
     
  30. Torment78

    Torment78 Notebook Evangelist

    Reputations:
    261
    Messages:
    555
    Likes Received:
    0
    Trophy Points:
    30
    I must say that SweClockers does great testing. They do a couple of benches with 3DMark 11 and Heaven at loads of resolutions and settings, then run loads of games at different settings and resolutions, up to triple-screen resolution, so you get an all-around test.

    Oh yeah, I just watched a vid from SweClockers, and they have been told by Nvidia that the turbo boost function has been taken off the 680M; it runs at a set speed.
    Nvidia demonstrerar Geforce GTX 680M, surfplattor med Windows 8 och Tegra - Computex 2012 - SweClockers.com

    Here is a link to the segment.
     
  31. ObserverJLin

    ObserverJLin Notebook Evangelist

    Reputations:
    77
    Messages:
    382
    Likes Received:
    7
    Trophy Points:
    31
    I don't follow. So what does that mean in terms of performance?
    P.S. You say the 7970M is +/-5% the performance of the 680M. Is that with the 7970M at stock clocks or OCed?
     
  32. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    It means that with AA and at higher resolutions the performance would start dropping off more rapidly.
     
  33. ObserverJLin

    ObserverJLin Notebook Evangelist

    Reputations:
    77
    Messages:
    382
    Likes Received:
    7
    Trophy Points:
    31
    Higher resolution? I'll only be playing at 1920x1080 on my laptop, and even if I connect it to my external monitor it'll still be that. Does that rapid performance drop affect that res?

    P.S. I would really appreciate any recommendation on whether to choose the 7970M or the 680M.
    Thx in advance. Much appreciated :)
     
  34. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    OK, I will explain:

    I see this as another 580m vs 6990m situation that we saw last year. When both cards came out, all the PR from both companies was mind boggling. Each company was claiming they had the best and provided "benchmarks" to prove it.

    The bottom line: the 580M proved to be slightly faster, around 5%, but some games like Metro 2033 and Crysis 2 favored the 6990M. Even now, someone on the M17x forum was blown away getting 50+ in The Witcher 2 with their 7970M, so I don't trust PR BS.

    We are going to see the same situation here with 680m vs 7970m. We have already seen that both the 680m and 7970m score similarly in the synthetic benches (with the 7970m having a small lead). Now, this is a very good indication that they will score similarly in games with some games favoring nvidia and some games Amd.

    And just because one bench shows a card getting better frames in one game, it doesn't reflect on the performance as a whole. So if the 7970m is getting faster frames in Crysis 2, it doesn't make it overall faster than the 680m and vice versa.

    And the reason I'm irked that they put 900MHz VRAM on the 680M cards is that overclocking is what is going to make or break this for me. Nvidia put such low bandwidth on these cards on purpose (900MHz, 3.6GHz effective, 115 GB/s). It could be due to heat, or power, but who knows. If we are going to start overclocking the 680M, we may find that the bandwidth, or lack thereof, will hold the card back. It may not be an issue at stock clocks, but we will need more bandwidth as we push the core up. And Nvidia isn't going to put highly rated VRAM on these cards just to downclock it to 900MHz; it doesn't make financial sense to do that.

    I can go to 1GHz/1.5GHz on my 7970M at stock voltage. I saw yesterday someone hitting 1060MHz/1.6GHz @ 1.1V to score over 7.1K in 3DMark 11 on a 7970M without an XM CPU (probably over 8K without tess). Temps are great. So if the 680M won't overclock and I can already see the bandwidth limitation, this card is not for me... especially at a $300 premium.

    EDIT: This is just my opinion and isn't intended to sway anyone trying to decide between the cards. When I saw the initial Nvidia PR slides claiming a 15-30% increase over the 7970M, I was really happy, and it was time to sell my 7970M and get a 680M. But realistically, cards that score identically in synthetic benches aren't going to magically give you +30% game performance... so that's why I said +/- 5%. So, upon further logical thought, it doesn't make sense at this moment. If this card overclocks, then it may sway me back. As it stands right now, both cards are a wash... you are paying $100 for CUDA, $100 for PhysX and $100 for 3D... all of which I will never ever use.
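    The 115 GB/s figure follows directly from the memory clock, assuming the 680M's 256-bit memory bus and GDDR5's four transfers per clock:

    ```python
    # GDDR5 moves 4 transfers per clock: 900 MHz base -> 3600 MT/s effective
    # (the "3.6GHz effective" quoted above).
    base_clock_mhz = 900
    effective_mts = base_clock_mhz * 4            # 3600 MT/s
    bus_width_bits = 256                          # GTX 680M memory interface
    bandwidth_gb_s = effective_mts * 1e6 * bus_width_bits / 8 / 1e9
    print(bandwidth_gb_s)  # 115.2
    ```

    So pushing the effective memory clock from 3.6 to, say, 4.5 GT/s would scale the same formula to about 144 GB/s, which is why the memory overclocking headroom matters here.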
     
  35. Tyranids

    Tyranids Notebook Evangelist

    Reputations:
    332
    Messages:
    525
    Likes Received:
    0
    Trophy Points:
    30
    I have questions mainly about heat:
    1) The 7970M doesn't get too hot at stock, from what people have been saying. But, what about when you overclock it like that? At stock voltage anyway, what kinds of temps are you getting?
    2) I only read one thread but people were saying that the 675M didn't overclock much because it's already an overclocked 580M and runs really hot. Is this usually the way things are with high end Nvidia cards? If so, that would really suck, I like the idea of a slightly faster card (and CUDA could be useful for some applications), but not when it costs $300 more and runs at 80C+ all the time...
     
  36. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    The 7970M @ 1GHz/1.5GHz games in the mid-70s (°C); I get 78-79°C in Furmark.

    So dropping to a more reasonable 950/1400, my temps don't crack 70°C.

    The 675M is a 580M. It isn't faster or slower; they are the identical card. All current 580Ms can be BIOS-flashed to 675M.
     
  37. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    2) What thread was that? Don't listen to any of them.

    675M = 580M (same clocks, same ocing potential of around 25%-30%). Temps are somewhat high but you are at that point running a GTX560ti in a notebook (same clocks).

    675M and 680M are totally different though, so you can't draw conclusions on one based on the other.
     
  38. Tyranids

    Tyranids Notebook Evangelist

    Reputations:
    332
    Messages:
    525
    Likes Received:
    0
    Trophy Points:
    30
    Hmm those are really good temps. Do you use a cooler? I know you use Alienware systems, correct? I'd be using either card in a P150EM, but the cooling isn't bad there either I don't think... Also, I thought the 675M's stock clocks were higher than stock 580M? I knew they could be flashed though, that really makes me laugh. Only 1 person has to buy the 675M then everyone with last year's card gets it for free (well they already paid for their 580M, but whatever). Thanks for your info though, you're generally pretty helpful in these threads.

    I know they're different architectures, but are people expecting the 680M to run really hot? Or, is it impossible to tell? I don't know really, I guess it would be hard to tell now...
     
  39. mrm2x

    mrm2x Notebook Consultant

    Reputations:
    103
    Messages:
    151
    Likes Received:
    5
    Trophy Points:
    31
    That's basically all there is to say.
     
  40. Shizm

    Shizm Notebook Enthusiast

    Reputations:
    25
    Messages:
    27
    Likes Received:
    0
    Trophy Points:
    5
    Can someone tell me what kind of min/avg fps are you getting in BF3 ultra with the 7970m please? both stock and OC.. single player and multi.
     
  41. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    People can expect all the temperatures they like. Until the chip is properly tested and results released we can't really know.
     
  42. core²

    core² Notebook Consultant

    Reputations:
    6
    Messages:
    115
    Likes Received:
    0
    Trophy Points:
    30

    This has been posted like a hundred times by SlickDude and others... AFAIK it was always 40+ minimum (stock clocks).
     
  43. Supranium

    Supranium Notebook Evangelist

    Reputations:
    80
    Messages:
    412
    Likes Received:
    0
    Trophy Points:
    30

    There is a huge difference between resolutions, as well as between maps and MP vs SP.
    Even the HD 7970 and GTX 680 fall below 40 sometimes, on 64-player Caspian Border when huge action is going on. I have tested BF3 with various cards and maps at the Ultra preset, 1920x1200. For example, the Metro map is a joke compared to CB.
    To keep FPS above 60 in any situation, it's best to use the High preset and perhaps some easier AA.
     
  44. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    With BF3, you want to reduce everything anyway, makes it A LOT easier to see your target if you turn off all that post-processing.
     
  45. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    With a few simple tweaks in BF3, you can be flying at 65+ fps. I think it is more important to know that the game is playable at Ultra... then tweak a little. Use SMAA or FXAA, or bring the MSAA down to 2x; it really makes a diff. Also, a slight bump in clock speeds really helps ;)
     
  46. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    Listen to him, folks. He is much more of an Nvidia fan than any one of us (he has 670 SLI on desktop, which speaks for itself). The 680M will be about 5% faster than the 7970M; it cannot magically beat the card in games while doing the same in 3DMark 11 and Vantage... anyway, we'll see soon enough :)

    BTW Slick, 7.1K with tess ON :eek: CRAZY!!
     
  47. SlickDude80

    SlickDude80 Notebook Prophet

    Reputations:
    3,262
    Messages:
    4,997
    Likes Received:
    3
    Trophy Points:
    106
    lol... I don't know if I'm a bigger Nvidia fan than some people here, but there is no denying that the desktop GTX 670 is the card to have. I also had a desktop 680 but returned it in favor of 670 SLI... They were cherry-picked for me, though; both cards do 1400MHz core.

    I can't say I'm an Nvidia fan or an AMD fan; I get the best performance and do what makes sense to do.

    Yup... 7.1K in 3DMark 11 with tessellation on. A single 7970M OC'ed to 1060MHz/1.6GHz... and he did it with a non-XM CPU, which is impressive.
     
  48. ObserverJLin

    ObserverJLin Notebook Evangelist

    Reputations:
    77
    Messages:
    382
    Likes Received:
    7
    Trophy Points:
    31
    Hear, hear, dude. Whoever wants the best price/performance and chooses anything apart from the GTX 670 at the current stage of desktop technology is making an ill choice.

     
  49. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    ^^ Looking at these benches, it is pretty much impossible for the 680M to get 60+ fps in BF3 at the same settings, considering it is some 30% downclocked... yeah, Nvidia certainly bumped up some numbers ;)
     
  50. Yanm

    Yanm Notebook Enthusiast

    Reputations:
    2
    Messages:
    49
    Likes Received:
    1
    Trophy Points:
    16
    Mmmm... if there's one question I have about the GTX 680M: I see that it's primarily in 17-inch notebooks, but why put it in the Clevo P150EM and not the MSI GT60? I was always under the impression that MSI's cooling was just as good as Sager/Clevo's, occasionally better at times. I was looking forward to a GT60 sometime around Black Friday/Christmas, but if it won't get the GTX 680M, I'm not sure what I'll do. I don't believe I'll have the money to buy it plus a separate GTX 680M, and then potentially void my warranty by installing it :(
     