Source? Haven't found anything.
-
SlickDude80 Notebook Prophet
and I'll quote from another thread by nbr member Long:
The repeatable benches that we have done numerous times are:
Vantage
3dmark11
Unigine Heaven
The numbers are off the charts.
All I can give you right now are gameplay impressions... and the card makes my laptop feel like a gaming desktop. That is all.
-
Do you have any evidence why you think so? I mean other than you like green stuff. -
-
My sources are my eyes. I have seen Santa with my own eyes. Went to visit him in Finland. -
Slickdude, how well do you think BF3 will run on a 1440p monitor on ultra? Do you think it will still hold a constant 30+ fps without AA (you won't really need it at 1440p)? I want a 1440p or 1600p IPS or OLED 120Hz screen next year, hehe. I'll drop to high settings if I have to. These screens will be much cheaper next year, and I just got a good job offer, so I plan on getting one.
-
SlickDude80 Notebook Prophet
Yikes... 1440p? 2560 x 1440? I don't even want to take a stab at it. I think it would be very difficult for any laptop to maintain a constant 30fps at that resolution. You may be able to do it if you drop the IQ level.
-
Meaker@Sager Company Representative
You might want to look into a dual card machine for that sort of res.
-
Or play at lower details... I'm sure at medium-high details you will get your 40+ FPS easily.
-
@micah, with 1440p you won't need AA that much; after all, it matters most at lower resolutions (the human eye won't perceive the jaggies at that kind of resolution), so I think you will be alright considering the might of the 7970m
btw nice looking lappy buddy! enjoy the good times -
Slickdude, when you benchmarked BF3, the resolution you were using was 1920 x 1080, right?
The YouTube video used 1660 x w/e -
SlickDude80 Notebook Prophet
-
Well, according to this article, released just 2 hours ago, the GTX 680M is coming out on June 5th with 744 CUDA cores and 4096MB of GDDR5 memory at 100 watts.
It is 37% faster than the GTX 670M, and its 3DMark score is only 4905.
Not even a full GK104 chip.
If true, I wonder if this chip would even match up to HD7970M.
NVIDIA GeForce GTX 680M Detailed and Pictured | VideoCardz.com -
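As a quick sanity check on the article's two figures, here is a minimal sketch; treating the quoted "37% faster" as applying to the same 3DMark run as the 4905 score is my own assumption:

```python
# Sketch: back out the GTX 670M score implied by the rumor, assuming
# the "37% faster" claim refers to the same 3DMark run as the 4905 score.
rumored_680m_score = 4905
claimed_speedup = 1.37  # "37% faster than GTX 670M"

implied_670m_score = rumored_680m_score / claimed_speedup
print(f"Implied GTX 670M score: ~{implied_670m_score:.0f}")
```

That works out to roughly 3580; whether that matches real 670M results is another question.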
SlickDude80 Notebook Prophet
This rumor about a 4900 3DMark score has been around for months. For reference, the 7970m scores (depending on CPU) somewhere between 5500-5800 in 3DMark11 stock -
Those specs are exactly what I expected them to be. 4GB GDDR5 is serious overkill for that GPU tho.
Someone order this laptop asap please! Lets see the power consumption numbers!
http://www.monsternotebook.com.tr/P170EM-GTX680M-MONSTER®-Q61P170EM08.html
http://www.monsternotebook.com.tr/P150EM-GTX680M-MONSTER®-Q61P150EM08.html
Expensive ones: approx. 2100 EUR and 2000 EUR. -
I am also being pretty conservative with my estimates, as you can see just in the arena (which is by far not the most challenging scene) I am upwards of 45 FPS. In somewhere like the forest it will drop into the 30's.
Here are some witcher 2 screenshots I just took, settings used are included, I shrunk them to try and not destroy the forums, they were originally 1080p as shown on the settings menu:
My main point has always been that the 7970m is not SO MUCH BETTER that everyone should toss out their old cards and buy one today. It is a nice card, if you are getting a new computer right now it is a clear choice, but it doesn't do anything that a 580m can't do, there is no game (that I have seen) that is unplayable on the 580 and suddenly butter on the 7970m. -
If this is true we have to call 911 to Cloudfire's house. He's gonna have a heart attack. -
-
That 3DMark11 score is the 680M revision #1 overclocked. It was the 768-core version that was tested. Now they are suggesting that Nvidia went down to 744 cores and got even worse performance?
We all know that Nvidia isn't releasing a flagship GPU that is worse than the 7970M. If they did, and it consumed 100W like the 580M, that would be an epic fail. The only other reasonable explanation is that this 744-core part is the successor of the GTX 570M, consumes about 75W and scores around P4500++, and they release the GTX 685M in September. But this isn't their flagship GPU. No way
They are right about the date though. Computex in June like I said -
If true then the 680m is a fail; my 580m gets over 4000 with an easy OC, and a card that only adds a few hundred points is a total flop. It will be nice for me though, it totally validates my decision to wait for the 780m. -
@Yoda, when you said ultra I thought it included ubersampling; I get 30fps with the same settings. Did you just OC to 740, really? I game at stock.
Also, yeah, that 4905 figure has been around for about 3 months. I don't believe it is true; it was just a fanmade screenie to hype things up until AMD crushed it with 5.6k.
Also, how the hell can you disable 22 CUDA cores out of 768? They are aligned in groups, and I am sure those group sizes don't divide into that number.
EDIT: sorry, it was 744, so maybe it is possible to disable 24 cores, but I don't see it happening..
Btw, Yoda, do you think we will be able to keep our lappies for upgrade around 700m series? (Maybe PCI 2.0 speeds will seriously bottleneck this time?) -
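On the core-count question above, a minimal sketch can check the arithmetic. The cluster sizes are my own assumptions, based on common NVIDIA shader-cluster layouts (32 or 48 cores per Fermi SM, 192 per Kepler SMX):

```python
# Sketch: cores can only be disabled in whole shader clusters, so a
# rumored count is reachable only if the disabled portion divides evenly.
ASSUMED_CLUSTER_SIZES = [32, 48, 192]  # Fermi SM variants, Kepler SMX

def reachable(total_cores: int, rumored_cores: int, cluster: int) -> bool:
    disabled = total_cores - rumored_cores
    return disabled >= 0 and disabled % cluster == 0

for size in ASSUMED_CLUSTER_SIZES:
    print(f"cluster={size}: reachable={reachable(768, 744, size)}")
```

All three come out False (768 - 744 = 24 cores, and none of those cluster sizes divides 24), which lines up with the skepticism in the post.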
SlickDude80 Notebook Prophet
See, that is a little sneaky, yoda... you're trying to pass off your scores as if any 580m can do that, when in reality you are pretty close to the 580m's limit, and your card is crunching away to produce playability while the 7970m just strolls through the park to get what it gets. And why is depth of field disabled in Witcher... when I turn that off, I get another 5-7fps
I will overclock it to 990/1450 and see what kind of numbers i get. So far, everything reported is bone stock.
The bottom line... the 580m scores 3200-3400 in 3DMark11, the 7970m does 5500-5800. I'd say there is a huge difference, but you are blinded by your nvidia glasses -
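Taking only the score ranges quoted above, a quick sketch of the implied gap (the best-case/worst-case pairing is my own framing):

```python
# Sketch: implied percentage lead of the 7970m over the 580m, using
# the stock 3DMark11 ranges quoted in the post.
gtx580m_range = (3200, 3400)
hd7970m_range = (5500, 5800)

# Smallest gap: best 580m vs slowest 7970m; largest gap: the reverse.
low = hd7970m_range[0] / gtx580m_range[1] - 1
high = hd7970m_range[1] / gtx580m_range[0] - 1
print(f"7970m leads by roughly {low:.0%} to {high:.0%}")
```

That is roughly a 62% to 81% lead on those numbers, so "huge difference" holds either way.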
-
Skyrim with ENBseries can also be fluently played in 1080p, which was not possible on 6990M or 580M. Plus in Sniper Elite 2, maxed out, the 580M has a little lag spike when you zoom in with the snipes, whereas the 7970M is 60FPS fluid throughout. This is what I have noticed really (only games I have been playing quite intensively). For my situation, I like using mods and stuff, the upgrade was well worth it. Bear in mind that the 7970M will run cooler, consume less power, etc, on games that it easily maxes at 60FPS, meaning quieter fans, etc.
Anyways, that 680M article seems to be based on VERY old info... the very first leaked benchmark and the card pic that was posted a few days ago. -
Cinematic depth of field is off because I don't like mass blur, regular DOF both in gameplay and cinematics is on. That is why I posted my settings though, so you can match them if you desire.
And really I am blinded by the performance I get from my card. Yes, objectively the 7970m is a lot better, in real usage I just don't see the advantage FOR ME. I have also said many times I would not give up Nvidia features. I use adaptive vSync in every game I play, I often use ambient occlusion or FXAA, I love Nvidia exclusive features. I am not trying to hide that at all, it would take an awful lot to get me to go AMD, simply because of all the stuff I would give up to do it. -
I will buy the 680m just for the kicks, because I am an nvidia fan
-
744 cores doesn't make any sense. It's surely 768.
-
^^ at the bottom of manage 3d settings
-
I got a 120Hz screen. Is it really needed for me? What do you recommend? -
Hahaha. Interesting.
-
When I first started reading this thread, which I found as I was trying to figure out how to configure the P150EM I decided upon as my next notebook (after years of Dell machines, but that is fodder for another thread!), I was surprised at the passion people have for "their side" - i.e. NVIDIA vs. AMD.
Then I remembered way back when - yes, I'm an old fart - and the graphic card wars in the days of 3DFX.
Anyway, in the FWIW category, I worked closely with a variety of chip designers (NVIDIA, AMD, Intel, IBM, TI, Samsung, etc.) and fabs and foundries over the years in my role as a tech director for a company that provided "stuff" to the chip makers. I've been in the TSMC foundries in Taiwan many times, and I can tell you which French restaurant and wine the fab manager in northern Taiwan prefers, as well as his main goal for the year from his bosses (a 30% cut in overall cost per wafer). I've also dealt with product managers from NVIDIA and AMD and Intel (and other companies).
My speculation, and it is only speculation from past experiences:
1. NVIDIA is hugely important to TSMC. They have, in the front lobby, several plaques and the like from NVIDIA. While they make chips for a lot of people, NVIDIA has a place at the front of the line. And if NVIDIA tells TSMC, "we have a new chip design we need you to make, and we need it made immediately," TSMC will find a way to do it. The pressure they will put on their own people, and their suppliers, will be immense, but they will do it.
2. It would not surprise me at all if NVIDIA is actively trying to leak performance rumors around the 680m that will give people pause in terms of buying the 7970m - i.e. influence people to wait and see before buying the 7970m for fear the 680m will come out a month or two later and blow it away. Even if they don't have the chip completely designed yet. They know that if a lot of people buy the 7970m now, those people will be unlikely to buy the 680m a few months later, they are competing for that one pool of customers. So they are incentivized to try to keep as many people as possible from buying 7970ms right now.
3. There are jobs/careers at stake at NVIDIA right now that depend on not allowing the 7970m to keep the crown of king of the mobile GPUs.
For those of us who have no vested interest in either company, these are good times to be in the buyer's seat! -
^^ indeed! they are sweating their a$$ off to come up with better and better performance, consumers win!!
-
We don't win anything if the latest rumors are true and the GTX 680M is indeed weaker and more expensive than the 7970M.
-
That post regarding the 680M made me very sad...
Still, if the 680M is only a few weeks away as it suggests, it might be worthwhile for those torn between the two to wait and see what it can do.
Even if the 680M is, say, 10% slower than the 7970M, it would still be a worthwhile option for those who prefer nvidia drivers (as long as the price reflects this performance difference, which I somehow doubt it will...) -
There's still hope that it's on par or a bit above the 7970M. Perhaps those early drivers were (incredibly) buggy.
-
A slower 680m would be terrible. Late and slower? That would be a massive fail for Nvidia.
-
Isn't that picture used for the 3dmark score the old one that was leaked?
-
Forget about benchmarks, I want to see 7970M numbers for games, especially in Crossfire mode. $20 says AMD's crappy driver support will rear its ugly head as usual.
-
If you haven't noticed, CF works wonders now with the HD7K series. Often the scaling is 100%.
And what are these mysterious driver problems? I haven't encountered any with my HD7970. I think it's more like a user issue than an actual driver issue for many. -
Yields are getting better each week too, so these poor times will end sooner or later.
Nvidia is losing potential customers every day without the 680M, so I'm pretty sure they are doing everything they can to push it out to the market -
Yes. because NV simply never makes slower cards than AMD.
Oh wait... -
I'm entertaining the idea of giving AMD one last chance.
In regards to your 3rd point, I'd never want to work as a chip engineer.
I believe XFX dropped nVIDIA due to the pressure AFAIK. -
AMD 7970m vs GTX 680m
Discussion in 'Gaming (Software and Graphics Cards)' started by x32993x, Apr 20, 2012.