I'm guessing that is when the cards down-clock?
-
HaloGod2012 Notebook Virtuoso
It's because 3DMark gets the system information before your GPU gets loaded for the tests, which is dumb. It should take the system info clocks when the GPU and CPU are at full load, so we can see exactly what overclocks are running.
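To put that complaint in code form (a conceptual sketch only; readGpuCoreClockMHz() is a hypothetical placeholder for whatever vendor query the tool really uses, the point is only when you sample, not how):

```cpp
#include <cstdio>

// Hypothetical placeholder for a vendor clock query (ADL, NVAPI, etc.).
// A real tool would ask the driver; this stub just returns a canned value.
unsigned readGpuCoreClockMHz() { return 850; }

void runGpuWorkload() { /* ... render the benchmark scenes ... */ }

int main() {
    // What 3DMark reportedly does: sample once at startup, while the
    // GPU is still idling in a low power state, and report that.
    unsigned reported = readGpuCoreClockMHz();

    // What the poster wants: sample while the GPU is actually loaded,
    // so any overclock (and the real load clocks) show up in the report.
    runGpuWorkload();
    unsigned underLoad = readGpuCoreClockMHz();

    std::printf("reported: %u MHz, under load: %u MHz\n", reported, underLoad);
}
```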
-
Now back to your previously scheduled nonsense and speculation. -
Sent From My Rooted EVO 3D -
I really hate to say this, but I realized Nvidia only put the P150EM and P170EM in their article for the 680M. I think it also won't be compatible with the HM series
(for example, see that they cite the M17x, not the R4). Sorry, guys.
-
Wow, I'm very mad right now. I was stupid and went to work... and it seems I missed quite a bit of drama. I hate it when real life interferes with the reality internets.
It is funny how this argument comes up with every card release by the two companies. I saw it when the 5870m came out (vs. the 460m, I believe) and when the 485m came out (vs. the 6990m, I think). So I can't wait for the 17650m vs. the 999m in a few years. Déjà vu is happening all over again (to steal a phrase).
Both look like they will be absolutely great cards. Unless you absolutely need the absolute best, get the one with either the best price or the best availability. I have had both AMD and nV over the years, and with the exception of a couple of notable misfires, each company continues to either crank out great cards or refresh older ones. We fight over which is bigger, faster, stronger (did you see the Six Million Dollar Man reference... err, actually, did you GET the reference?) for anywhere from a few to several months, and then do it all over again later in the year, or early the next. Same discussions, different people.
Buy the card you want to buy. Then enjoy it.
See you all later this year.
Seriously, thanks to those of you that already have the 7970m for the feedback on it. Looks like a great card. In spite of all the name calling and such, most of us really appreciate what you're letting us know. And nice job on your card, Slick. That thing rocks. Meaker, will be looking forward to your feedback after you're done play... err, I mean, testing it out. See you on here after the weekend, if you know what I mean. All you people who actually use the hardware, test it, and tweak it are why so many of us come to NBR. I always have more respect for what I get from the users, and have learned so much more, than from the PR that gets tossed out there. Great job, sorry for the TL;DR version. -
I almost ordered my M17X R4 with a GTX 675M, so I wanted to say thanks to Slick and the rest for educating me. I have been seeing green for so long I almost forgot to do my homework. I couldn't care less if the 680M is faster; I just know that I didn't pay $150 more for a card that is like 50% slower. The 7970m is fast enough to stop thinking about GPUs and just play some games.
-
The 680M is $400 more than the 675M.
-
Haha wow, what a steal (for them). Good choice going with AMD then.
-
Apparently XoticPC says they have an ETA for the GTX 680m by the end of June, a $295 upgrade from the 7970m.
I think I'll wait for 680m benchmarks on games like BF3. 4 GB sounds awesome and good for OC'ing.
And lol @ Xotic now selling Alienware for people who can't wait for the 7970m. Good for them. -
And why would Nvidia put the HMs in an article about brand new graphics with brand new notebooks, when the machines are no longer being sold?
Chill, fam. You're trying to read into things, but you're drawing the wrong conclusions. All we can do is sit back and ask Sager to test it for us. -
Meaker@Sager Company Representative
Dirt 3, 1080p, ultra preset, 4x MSAA -
Any chance of FPS numbers in Crysis 1/W/2 and The Witcher 2? My 7970M should be with me next week.
Thanks boss. -
SlickDude80 Notebook Prophet
Almost 75 FPS? Yeah, I think that's overclocked (but not by much).
Based on Nvidia's own PR results, it will take an overclocked 680m to beat this score. -
Alright, so I think I'm going to go with the 7970m. I'd like to know, on stock clocks, what the average idle temperature is, and the temps when playing heavy games. Would anyone be kind enough to tell me what their average temps are when they OC, and what the OC is, please?
Sent From My Rooted EVO 3D -
Have any of you tried a 670/680 desktop card, and can you perhaps attest to its FXAA and adaptive v-sync? I've read a positive review on [H]ardOCP which makes me believe the adaptive v-sync alone is almost worth the price of admission. Then there's the superior drivers...?
Man, I hope the 680m will be worth it. -
Meaker@Sager Company Representative
Metro 2033, 1920x1080, very high, 4x AA, 4x AF, tessellation enabled (DX11)
900/1575
Average Framerate: 40.00
Max. Framerate: 99.34
Min. Framerate: 9.46
925/1575
Average Framerate: 43.24
Max. Framerate: 88.79 (Frame: 1744)
Min. Framerate: 10.57 (Frame: 1301) -
Meaker@Sager Company Representative
Max can fluctuate a lot. It's a meaningless number anyway.
-
Anyway, nice numbers there! I can't wait to order mine. I noticed we have a similar system, Meaker, except I'm running older-gen hardware lol, but similar speeds, config, etc.
Sucks that my SATA controller is not as fast -
@Frost451, I liked adaptive vsync when I was using it. Also, FXAA is crazy good, are you joking? It is like 4x MSAA without the performance hit (seriously). Because of that, the BF3 performance increase didn't look like much when I upgraded (the other games are way better, but in BF3 I needed to force 4x MSAA with the 7970m to get the same quality).
-
What? Adaptive vsync is good, but if you are not running at 60 FPS, it's essentially turned off.
It just switches between vsync on and off, to prevent tearing above the screen refresh rate without the performance hit of dropping from 60 FPS straight to 30 for a small dip in frame rate.
It's actually a pretty simple idea, which makes me wonder why it took a while to be developed... but if you don't have the performance to even reach your display's refresh rate, it's the same as running with vsync off. -
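That on/off rule is simple enough to fit in a few lines. A minimal sketch of the concept in C++ (made-up names and numbers, not Nvidia's actual driver logic):

```cpp
#include <cstdio>

// Core rule of adaptive vsync, as a sketch (not Nvidia's driver code):
// sync to vblank while the last frame was fast enough to hold the
// refresh rate, otherwise present immediately, so a dip below 60 FPS
// costs only the dip instead of snapping down to 30.
bool vsyncThisFrame(double lastFrameMs, double refreshHz) {
    double refreshIntervalMs = 1000.0 / refreshHz;  // ~16.7 ms at 60 Hz
    return lastFrameMs <= refreshIntervalMs;        // fast enough: wait for vblank
}

int main() {
    // 12 ms frames (~83 FPS): vsync on, capped at 60, no tearing.
    // 25 ms frames (40 FPS): vsync off, you keep 40 FPS instead of a hard 30.
    std::printf("12 ms -> vsync %s\n", vsyncThisFrame(12.0, 60.0) ? "on" : "off");
    std::printf("25 ms -> vsync %s\n", vsyncThisFrame(25.0, 60.0) ? "on" : "off");
}
```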
-
And then pages 4 and 5.
The bottom line: the "blur" is actually how the technology works, and I understand that it shouldn't bother you once you get used to it (can anyone attest to that?).
When looking at screenshots, it seems that FXAA is much better than 2x MSAA (or none), and nears 4x MSAA in quality, but isn't quite as good (i.e., the blur). The way I see it, though, at almost no performance hit it's a very nice compromise for later on, when games become more and more demanding and prevent you from activating MSAA while still being playable. I have to admit, though, that when the time comes that my laptop can't even run a game with 2x MSAA on, it will be time to get a new one.
So what's the point of it? Well, I've looked at reviews and screenshots, and high MSAA + FXAA combined looks much crisper than high MSAA alone, without degrading performance. So there's that.
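For anyone wondering where that blur comes from: FXAA is a post-process pass that estimates luminance contrast around each pixel and blends across anything that looks like an edge, texture detail included. A heavily simplified CPU sketch of the idea (the real thing is Timothy Lottes' single-pass pixel shader with an edge-direction search, not this):

```cpp
#include <algorithm>
#include <vector>

// Heavily simplified FXAA-style pass over a grayscale (luma) image.
// Where the local luma contrast exceeds a threshold, the pixel is
// averaged with its neighbours: stair-steps get smoothed, but fine
// texture detail near the same edge gets softened too. That softening
// is the "blur" people notice, and it is inherent to the approach.
std::vector<float> fxaaSketch(const std::vector<float>& luma, int w, int h) {
    std::vector<float> out(luma);
    const float edgeThreshold = 0.125f;  // contrast needed to count as an edge
    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            float c = luma[y * w + x];
            float n = luma[(y - 1) * w + x], s = luma[(y + 1) * w + x];
            float e = luma[y * w + x + 1],  wd = luma[y * w + x - 1];
            float lo = std::min({c, n, s, e, wd});
            float hi = std::max({c, n, s, e, wd});
            if (hi - lo > edgeThreshold)
                out[y * w + x] = (c + n + s + e + wd) / 5.0f;  // blend across the edge
        }
    }
    return out;
}
```

MSAA, by contrast, supersamples only geometry edges in the rasterizer, which is why it stays crisp and costs more, but also why it misses the shader and texture aliasing that FXAA catches.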
Regarding the adaptive v-sync: sure, if you can barely run the game at 40-50 FPS then it's completely irrelevant, but if the game usually runs at 80-90 and dips into the 30-50 range at times (like Skyrim, for example), then it's a very effective workaround, as far as I'm concerned. -
At any rate, adaptive vsync should be a standard... haha. -
In that regard, if that is indeed the case, then I agree completely that FXAA is pointless for you.
But I don't play FPS games. At all. I suck at them and I hate them. I like RPGs the most, and these get more and more demanding on the hardware each year. When playing those games, losing a bit of detail here and there in the distance shouldn't be that much of a problem, I think. -
PS: FXAA looks really bad to me in racing and flying games; just my opinion. -
Meaker@Sager Company Representative
*snuggles sample*
960/1575 at 1.025V?
AMD Radeon HD 7970M video card benchmark result - Intel Core i7-2820QM Processor, Micro-Star International Co., Ltd. MS-16F2, score: P6536 3DMarks
Couldn't have asked for a better 7970M to pair with my mobo. I'd be scared of what it could do at 1.075v.
It's also been checked in Metro and Dirt 3. -
Extremely good sample you got there... Imagine what you could do at 1.225v.
-
I could imagine something very close to the desktop 680 if overclocked correctly (well, maybe slightly less than that).
Edit: okay, no, I might have been a little too optimistic there... but the power would be insane -
I read today that 7970m is soldered to the motherboard of Clevo P170EM. Meaning I can't change or upgrade the GPU in the future.
Is this true? I thought P170EM was an upgradable laptop. -
No, it is not... -
vortex III post #4
-
Kingpinzero ROUND ONE,FIGHT! You Win!
The 7970m is an MXM card like the Nvidia cards, so yes, you can upgrade to whatever you want anytime. -
Not that I'm in any way affiliated to Eurocom, but this video is enough proof that he's wrong:
Starting from minute 2:00:
AMD Radeon HD 7970M CrossFireX Installation in EUROCOM Panther 2.0 - YouTube -
Pathetic how smart some people think they are once they get hold of an anonymous internet identity... so sad, false information everywhere.
-
I was referring to the 7970m card itself, not Clevo, as did this 'jamwllms' in the link you provided, from what I could tell.
Edit: I'm also undecided between a 680m and a 7970m. I currently pre-ordered a machine with a 680m, though. -
Hi everyone... I was originally planning to buy an M17x R4 this week, but this got my interest: http://vr-zone.com/articles/msi-s-g...7970m...-and-trinity-a10-4600m-apu/16190.html
I wanted to know if the AMD CPU will give any advantage over Intel's? I really like MSI, but I can't wait for the GT70 to get the new 680M... if I had a choice I would pick MSI over Alienware...
Sent from my GT-I9100 using Tapatalk 2
Never mind... I've decided on the M17x.
Sent from my GT-I9100 using Tapatalk 2 -
Pairing the 7970M with the A10-4600M is a total and utter epic fail. It will bottleneck the GPU and deliver crappy FPS in games where the CPU plays a big role, compared to the Sandy Bridge quads.
In pure computational power the 4600M isn't even as good as the Sandy Bridge dual-core i5s.
So sad to see MSI cheaping out, greatly hindering "what could have been" just to satisfy some AMD CEO. -
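A back-of-the-envelope illustration of the bottleneck claim, with completely made-up per-frame costs: a frame can't finish faster than its slower stage, so a slow CPU caps a fast GPU no matter what the GPU could do on its own.

```cpp
#include <algorithm>
#include <cstdio>

// Toy model of a CPU bottleneck (illustrative numbers, not measurements):
// the frame rate is set by whichever stage takes longer per frame.
int main() {
    double cpuMs = 20.0;  // hypothetical per-frame CPU cost on a weak CPU
    double gpuMs = 10.0;  // hypothetical per-frame GPU cost on a 7970M
    double fps = 1000.0 / std::max(cpuMs, gpuMs);
    std::printf("%.0f FPS\n", fps);  // prints 50: GPU could do 100, CPU caps it
}
```

It also suggests why a GPU-heavy benchmark like Heaven barely notices: there, the GPU stage dominates either way.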
The A10 will only bottleneck benchmarks like 3DMark06.
-
Anyhow, found this:
Computex: MSI Showing Off Five G-series Gaming Notebooks | PC Perspective
I wonder if they tested them with heavy tessellation or with 4x/8x AA or something, since the 7970M scores 59.5 FPS on normal tessellation with no AA enabled. It's not with tessellation disabled and no AA selected, since we know the 680M scores 65.9 FPS with that. It could be that the 4600M is dragging the performance down. Or that the 680M runs away once the visual settings are enabled. Shame it doesn't say which settings they used.
EDIT: Searched around, and the forums say that the Heaven benchmark is almost completely a GPU benchmark and the CPU doesn't matter. And yes, I still think benchmarks matter very little.