If the same thing that's plaguing AMD users is also what's happening here, I guess Clevo screwed up big time.
And I can already feel EM users getting pissed.
I'm not saying the 680M is hugely faster than the 7970M, but I believe something is just plain wrong with those units.
Even so, I wouldn't have posted a review of a system in this state before doing some research on the subject; clearly they're underestimating the Enduro problems on EM machines.
Since technically everything is the same (iGPU muxed to dGPU), it could be the very same problem.
Only an Alienware system could solve this puzzle, since there you can turn off the iGPU completely.
-
Kingpinzero ROUND ONE,FIGHT! You Win!
-
I have not read of serious Enduro issues with Clevo, so I'm not sure where you're getting that from. And more than anything, I wouldn't be surprised if it's user-caused.
Though personally I think you're just on a crusade against Clevo anyway. -
Kingpinzero ROUND ONE,FIGHT! You Win!
But you have a point though, it could be drivers or it could be user error.
By the looks of it, however, it seems Enduro is the main culprit. -
- And yes, I do know about that thread and read through it; I didn't find it very credible, and won't until I see it for myself.
And in the last few years, disaster drivers have all come from Nvidia. I know of 3 driver sets that caused Nvidia GPUs to overheat. So stop spreading FUD.
The number of issues in the Nvidia forum on Guru3D actually exceeds that of the AMD forum, as the AMD forum is just people complaining about lack of multi-GPU support from games, which isn't AMD's or Nvidia's fault. -
Thanks, I'm aware Kepler's CUDA cores are not as fast per core as Fermi's.
I would have lost 2 bets in this thread...
I was wrong when I told Cloudfire the card would not have 1344 cores.
And I would have been wrong to bet that it would be at least 10-15% faster in most games. -
Kingpinzero ROUND ONE,FIGHT! You Win!
-
And as you stated, let it settle with drivers. I'd wait for at least a few beta drivers from Nvidia to be released before getting really upset. Nvidia's performance in Civilization V was atrocious at first, but with a driver update it blew past AMD's.
-
Since I was disillusioned with Nvidia's usual boasting, I expected a mere slight performance gain, perhaps even similar results, but not worse. And this isn't a synthetic benchmark either; these are actual results with real gaming performance comparisons, including my beloved Skyrim. While it is only a few FPS less now, it will get worse a year or three from now. That could be the difference between enjoying a game and scaling it down just to get it to run, and that isn't why we buy expensive top-end hardware.
I'm honestly waiting a week to see whether Nvidia manages to rectify this disaster, and if nothing improves I'm cancelling my preorder and getting myself a nice laptop with a 7970M from a more respectable reseller, to boot. -
When has Notebookcheck ever been non-credible?
And yeah, if the 680M is turning out to be held back in performance by Optimus (just like the 7970M with Enduro), then I'm truly glad I didn't jump onto the switchable graphics train yet.
-
Huzzah for non-ambiguous benchmark results! My quest for the ultimate gaming laptop has finally come to an end! The 7970M is slain, and the 680M has prevailed! /sarcasm
This means another week of scouring the interwebs regularly, to decide once and for all whether the 680M is worthy of my laptop (and money) or not. -
-
SlickDude80 Notebook Prophet
Like I have been saying all along... when 2 cards score similarly in the synthetics, you aren't going to get incredible differences in gaming performance. The synthetics put the card in a specific ballpark without predicting individual game bias (i.e., some games favor Nvidia and some favor AMD).
That said, that Notebookcheck review was very disappointing. The 7970M has better performance and costs $300 less -
I can't pinpoint exactly what could have caused the problems, but there's got to be something wrong with the test they did on the 680M.
According to their own benchmarks (gameplay tests), the GTX 680M is on average 35% faster than the GTX 580M on Ultra with the games...
Turbo boost not working?
CPU/GPU throttling?
Driver issues?
I have no idea -
Kingpinzero ROUND ONE,FIGHT! You Win!
@Mr.Mischief: Well no I wasn't, but there's no point in me replying; he's stated his opinion very well.
Actually, what happened to my laptop is my business, and the petition I made helped those users who wish to upgrade their HM machines a bit, as much as I wanted it to.
I was trying to do justice for Nvidia users, because with the technical papers at hand there's no reason the 680M should be underperforming this badly.
It was just a supposition, since there are some similar issues happening with the new EM and the HD 7970M.
But he's probably right: everyone who got these problems is an inexperienced user who doesn't know how to properly install a GPU driver, let alone use a laptop.
Also I'm a non-trusted user and a donkey lol -
...
-
Meaker@Sager Company Representative
We don't know the settings Nvidia used. It could be that at 1080p max details, the lack of memory bandwidth comes home to roost.
The 2GB and 4GB models were always going to perform identically. -
Well, new AMD drivers and a slight clock speed bump put it back ahead as the performance leader in general....
The 680M seems a little bit too cut down, maybe? Nothing that a slight OC can't fix. Since Kepler cores run at considerably slower speeds than their Fermi counterparts, maybe that's why they are so affected by core clock speeds.
Even if the GTX 670 is faster than the HD 7950 (which, if I remember, the HD 7870 can match with some OC), did it get cut down in clocks a bit too much compared to the AMD card?
If I remember correctly, the reference 7870 is basically identical to the HD 7970M, and we can OC that to GHz Edition levels? Maybe Nvidia would have fared better with a reduced core count and higher clock speeds, similar to the GTX 680 to GTX 670 trade in specs. -
-
It could also be that the GTX 670 does not scale well underclocked (which is roughly what the 680M is).
-
-
How much more memory bandwidth does the 680M have over the 580M? -
Meaker@Sager Company Representative
900MHz vs 750MHz, so 20%.
-
1500 vs 1800
It could very well be the problem.
It would explain the higher score in 3DMark 11 and the lower score in Unigine Heaven.
I'm assuming it's memory related because on the GT 650M, DDR3 vs GDDR5, the difference is the same. -
-
Meaker@Sager Company Representative
Lol cloud, pot, kettle, black
I still think this chip could have loads of OCing headroom if it's designed right. -
SlickDude80 Notebook Prophet
I'm still dumbfounded to this day why they put 900MHz VRAM in the 680M, for 3.6GHz effective. That is too low for a high-end part. We may be seeing the results now.
The 680M may have hit the 100W ceiling, i.e. this is the best it can do at 100W. Someone mentioned scaling above and I think that's the issue: take a desktop 670 and make it work within a 100W ceiling and this is what you get.
Now the only thing left is the OCing ability. If this thing doesn't overclock, it is doomed. -
580M: 96GB/s -
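For anyone who wants to sanity-check those figures, the standard GDDR5 math works out exactly. A quick sketch (the 256-bit bus is the published spec for both cards; the 900MHz/750MHz clocks are the ones quoted above):

```python
# Back-of-the-envelope GDDR5 bandwidth check for the figures quoted above.
# bandwidth (GB/s) = memory clock (MHz) * 4 (GDDR5 transfers/clock) * bus width (bits) / 8

def gddr5_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int = 256) -> float:
    """Peak theoretical bandwidth in GB/s for a GDDR5 card."""
    effective_rate_mhz = mem_clock_mhz * 4          # GDDR5 is quad-pumped
    return effective_rate_mhz * bus_width_bits / 8 / 1000

gtx_580m = gddr5_bandwidth_gbs(750)   # 96.0 GB/s, matches the number above
gtx_680m = gddr5_bandwidth_gbs(900)   # 115.2 GB/s
print(f"580M: {gtx_580m:.1f} GB/s, 680M: {gtx_680m:.1f} GB/s "
      f"({(gtx_680m / gtx_580m - 1) * 100:.0f}% more)")
```

So the "3.6GHz effective" is just 900MHz quad-pumped, and the 20% figure checks out against the 580M's 96GB/s.
-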
Buy what you like and be happy with it. If Nvidia doesn't provide what you want, don't buy it. I see no point in raging over it. -
Lolz.
I'm sure the 680M will get better with drivers. But so will the 7970M.
Looks like a tie to me. Only the green card is more expensive. -
Meh, seems like the 480M part 2. Nvidia was too ambitious and tried to stuff too power-hungry a chip into a notebook. Though I guess they didn't really have a choice, considering there's nothing to fill the gap between the GT 640 and the GTX 670. Look for the 685M, folks.
-
-
If the 680M can play every game nearly 50% faster than the previous 580M as Nvidia claims, and plays every game at 1080p at high details, what is the problem? Because it doesn't beat the 7970M in some benchmark? And it's not even out yet...
Heck, I recently read about an unhappy customer who blamed Sager for their problems and multiple RMAs, and blamed a reseller for not replying to phone calls and emails, but the reseller has logs of the emails and calls to refute that claim. As I said, people lie on forums. It's always easier to blame someone else. -
So it seems slickdude may have been on the money re memory bandwidth. I was in agreement with the prediction but secretly hoping he was wrong. I wanted the 680M to be good.
It makes a heap of sense that the 680M does not exceed the 580M by all that much. Cloud... please tell me you're not going to give Nvidia's marketing benchmarks any credibility... because if you do, you're kind of losing any credibility yourself, along with those benchmarks.
The fact that Nvidia includes 4GB of memory screams to me, from a marketing perspective, that they've long been aware of the lack of performance.
Everyone knows bigger numbers sell better. They had to make it "better" at a glance to the regular consumer than the 7970M somehow. I see no other reason for them to waste money on memory that will never be used. I dare say, if the 680M were actually a lot faster, it might have come with 2GB?
-Tristan -
Seriously?
Do you guys really think a big GPU company would release a GPU that hasn't been tested? Nvidia have done the math. They have calculated however many polygons this GPU has to process and how much toll that will take on the memory. They have tested all the games out there with this GPU.
One thing is for certain: they don't just throw a flagship GPU out on the market without making sure it has enough memory bandwidth.
I will wait for Anandtech's review (or someone else's) to see what conclusion they reach. Something must be off with the Notebookcheck test -
-
-
SlickDude80 Notebook Prophet
What I'm thinking is that Nvidia just couldn't tweak it any further, given the 100W ceiling. It could be an issue with power, or heat, or whatever.
-
-
The 680M has to be bottlenecked by something. My chips are on the VRAM. -
Jen-Hsun Huang is seeing raging bulls.
-
Well, to be fair, the GTX 680M's performance compared to last gen is quite a step up. It's quite a feat when you think about it. It's just Nvidia's PR and its exorbitant price scheme that upset and disappoint people. Maybe after a few more reviews come out they will drop the price?
-
That being said, 4% worse can turn into 5% better if Nvidia has a driver update for Max Payne 3 (which I think runs a bit better on Nvidia cards in the first place). -
C'mon dude, it happens daily, and most companies never get in trouble. And when they do, they pay a few million (or a few hundred million if they're Apple) in fines and move on... and the whole ordeal ends up worthwhile for them anyway.
You've got to be under a rock in la-la land to think they wouldn't manipulate the results. Not to mention the lie could probably never be proven, as they can simply claim the conditions were different at the time. So the marketing department gives the thumbs up to publish bogus results. I've been seeing this practice since the AMD 386/486 CPU days in the 90s. And it has never stopped.
***
Slick/Blacky
I see what you're saying about the extra memory / power etc. And what you say makes sense for sure.
Still, marketing is often far more important than performance. Money is the bottom line, not FPS.
There is a number cruncher in EVERY big company who will ask: "Will we profit more by slightly downclocking the GPU to keep within 100W and adding superficial memory capacity, or by having a slightly higher clock with less memory?"
Often these companies do not give a damn what the bottom-line performance is. Only how many units sell. And usually, how many they sell has nothing to do with performance!
They can easily get around the slight loss of performance from downclocking by simply saying "But it MUST be the best, because it's Nvidia!!" and everyone will believe them and buy one. "And it has twice the memory too!"
And all the kids buy one.
Check out Cloudfire for example... He's Nvidia hooked! And no matter what, he will always back them up.
These companies do not, and never have, put the best combination of parts for the best performance as the highest priority.
The priority is always simple: what is the cheapest way to make a product that APPEARS most attractive to the MAJORITY of consumers? And guys, forum readers are definitely not the majority. The majority doesn't even know what a benchmark is. And appearing good is not a result of REAL benchmarks. Just good marketing.
-Tristan -
Maybe Nvidia wants to cripple this GPU so that in the future they can come up with a better one, aka the GTX 685M, that people "need" to have.
We are going to be on 28nm for a long time, so efficiency isn't going to rise much more on this process; maybe it's necessary to leave some headroom for future upgrades? -
It seems like the 680M is suffering from a bandwidth issue. It could be the GDDR5 used, the clocks, etc. To me it seems like it's choking itself.
Maybe Nvidia suspected that the card would be able to perform better with a higher amount of RAM (I'm not saying VRAM, as last time I checked, VRAM proper isn't used on consumer cards anymore; it's SDRAM. Some people may refer to the video card memory as VRAM though), thus increasing bandwidth.
It may be a driver issue (I'm 30/70 on this, as Nvidia makes one driver for a batch of cards compared to AMD/ATI. My GTX 260, yes it's old now, shares a driver with ~5 other cards).
All in all, AMD has won if these are the true performance results of the 680M. The fact that the 680M is ~$200 more than the 7970M for less-than-or-equal (<=) results is complete rubbish. There will be people buying this card because they prefer Nvidia, and so be it, BUT as a consumer you are feeding Nvidia's pockets with unwarranted profits. If people actually stopped buying Nvidia's cards at the premium they put on them, their prices would drop and THEN make sense.
I'm extremely happy with my 7970M, and even though there are still no official AMD WHQL drivers, this machine runs like a beast.
I was an ATI fanboy, and an Nvidia fanboy, but I've realised that wasting X amount of dollars on a PC is stupid. For me it's all about price/performance. -
SlickDude80 Notebook Prophet
Here is my hypothesis:
The 680M has turbo boost that can technically overclock the card by as much as +15%.
HOWEVER, this feature has to be implemented in the BIOS by the laptop manufacturer.
What I'm guessing is that for the PR slide tests, Nvidia boosted their cards the full 15%, or as high as the cards could go.
But Clevo did not implement this feature because of heat, card longevity etc., so we are seeing the honest result of a 680M that isn't boosted.
Again, this is only a best guess, but this hypothesis makes sense to me -
It looks like the Nvidia fans did not receive their Christmas present XD
-
The funny thing is, overclocking the memory on the desktop GTX 680 gives like a 1% or less performance boost, and I'm talking a 500MHz overclock. There's probably a threshold where performance dips greatly without the bandwidth.
-
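That threshold point is easy to picture with a toy bottleneck model. In the sketch below, all the per-MHz and per-GB/s constants are made up purely for illustration (only the clocks and bandwidth figures are the published desktop GTX 680 and 680M specs): FPS is capped by whichever of compute and bandwidth runs out first, so a memory OC only pays off on a card that is actually bandwidth-starved.

```python
# Toy bottleneck model with made-up constants: FPS is whichever of the
# compute and bandwidth "roofs" is lower, so a memory OC only helps a
# card that is actually bandwidth-bound.

def fps(core_clock_mhz: float, bandwidth_gbs: float) -> float:
    compute_limit = core_clock_mhz * 0.07    # fake FPS-per-MHz constant
    bandwidth_limit = bandwidth_gbs * 0.40   # fake FPS-per-(GB/s) constant
    return min(compute_limit, bandwidth_limit)

# Desktop GTX 680 (1006MHz, 192GB/s): compute-bound, so even a huge memory OC adds ~0%.
print(f"{fps(1006, 192):.1f} -> {fps(1006, 224):.1f}")   # 70.4 -> 70.4

# 680M (720MHz, 115GB/s): bandwidth-bound, so the same kind of OC actually pays off.
print(f"{fps(720, 115):.1f} -> {fps(720, 134):.1f}")     # 46.0 -> 50.4 (hits the compute roof)
```
-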
I choose to believe that the benchmarks provided by Nvidia are based on the GPU at max boost, which is then compared to a stock 7970M. Since their GPU Boost is a built-in feature which doesn't require much user-end adjustment, they have every right to base their results on it in the comparison. I'm sure it's still biased of course, but it also suggests that an overclocked 7970M could be the better performer in the end.
This would explain the different 680M results coming from NBC and the other review with the Alienware m17x. The Alienware is capable of boosting to higher results due to better ventilation.
So the ceiling with good ventilation could then be around 6282 in 3DMark11 with GPU Boost alone.
The 680M would then need to have OC headroom beyond the boost feature to beat a properly clocked 7970M.
Those are my beliefs so far. Pure speculation I know, but it makes sense to me.
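For what it's worth, the arithmetic behind that speculation is easy to run. A minimal sketch, where the only inputs are the +15% boost figure and the ~6282 score quoted above, and the linear back-calculation is my own simplification:

```python
# Quick sanity check on the boost speculation above, using only numbers
# from this thread: a +15% GPU Boost ceiling and the ~6282 3DMark11 score
# reported for the Alienware m17x. Assumes score scales linearly with boost.

BOOST = 1.15
boosted_ceiling = 6282                      # well-ventilated, full boost (per the post above)
implied_stock = boosted_ceiling / BOOST     # ~5463 for a 680M that never boosts

print(f"implied un-boosted score: {implied_stock:.0f}")
print(f"full boost would add ~{(BOOST - 1) * 100:.0f}% on top of that")
```

If the Clevo result sits near that implied un-boosted score, the "no boost in the BIOS" theory would line up.
-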