And let's be real. The GT 555M was not in any way an impressive chip.
Expecting them to put out something significantly faster than it is an extremely low bar to set.
-
I seriously hope you guys are right, I would love to see +60 fps on BF3 maxed, would be so sweet
BUT I still think it will be a 20% increase, as the way computing is going nowadays is smaller size / more power efficiency, not brute-force power notebooks. Unfortunately that is just us... we are living in a world where ultrabook sales will represent up to 50% of laptop sales in the next 2 years, right? -
60 FPS on BF3 maxed (full HD) is a dream for notebooks...
A DESKTOP 7970 has difficulty doing that, let alone a mobile chip!
Maybe with Xfire/SLI setups -
More efficiency can translate to brute power. It's just a different marketing standpoint really. I'd be surprised if they didn't release something at the top end which is a real power muncher just like the current generation since they know they can charge a large premium for it and that people will still buy it.
Think of it this way: 75-100 watt cards are getting to the border of what laptops can handle. If the new generation can give 580M performance at 40 watts rather than 100, then that's great. However, with the new tech you could equally double the number of cores and end up with an 80 watt card which is twice as powerful again and will still sell. It doesn't take much work for them to do something along those lines and they know people will buy it. The only reason I can see why they wouldn't would be if they're holding back tech for the next generation (which will likely be merely an improvement on Kepler, like the 500 series was with Fermi) so they can go "OMG GAIZ, YOU'RE OUT OF DATE NOW, MORE MONEY PLOX". -
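Purely as an illustration of that trade-off, here is a minimal perf-per-watt sketch in Python. The wattages and scaling factors are just the hypothetical round numbers from the post above, not real specs:

```python
# Rough perf-per-watt sketch of the argument above.
# All numbers are hypothetical figures from the post, not real specs.

GTX580M_PERF = 1.0   # take the 580M as the performance baseline
GTX580M_TDP = 100    # watts (approximate figure used in the post)

# Scenario 1: same performance at much lower power
kepler_low_power = {"perf": 1.0, "tdp": 40}

# Scenario 2: double the cores -> roughly double perf and power
kepler_big_chip = {"perf": 2.0, "tdp": 80}

for name, gpu in [("580M (Fermi)", {"perf": GTX580M_PERF, "tdp": GTX580M_TDP}),
                  ("Kepler @ 40 W", kepler_low_power),
                  ("Kepler @ 80 W", kepler_big_chip)]:
    print(f"{name}: perf/watt = {gpu['perf'] / gpu['tdp']:.4f}")
```

Both Kepler scenarios end up with the same perf/watt; the point is that the efficiency gain can be spent either on a cooler 40 W part or on a bigger ~80 W part that still fits the laptop power budget but doubles performance.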
Each and every one of us has different expectations, but for me this was more than enough. I was concerned that I wouldn't even be able to run it on high, considering it was developed for PCs.
I think a desktop 7970 should have no problem maintaining BF3 around 60fps, and playing the latest titles maxed out on notebooks is not a dream anymore IMO. Any increase on the new upcoming cards will be proof of that. -
Maxed out means everything set to the highest possible settings. If you set even a single item off or less than highest, then it is custom, not maxed. MSAA in BF3 at OFF/2x/4x gives very different results.
-
This is why I said Nvidia cannot pull off such a miracle: the 7970 is about 30% faster than the 6970, and GCN is also a very nice accomplishment (at least AMD went crazy about it). I highly doubt Nvidia can increase their performance by 50%+. If that happens, our next GPUs will kill blockbusters like Far Cry 3, Hitman Absolution, and whatever (still cannot believe it). Well, we will see soon enough -
After this last update I can max BF3 at 1080p with 4x AA on my 485M and get an average fps of about 33, with very few dips below 30.
-
Pinch of salt:
Nvidia's GK104 now scheduled for March -
Funny to read some say they max it BUT with this or that setting not on max... that's not maxed out then...
And minimum frames below 30 are just terrible to play with (RTS ok, but the FPS genre...)
Games that can't be maxed with current gen (unless you go for SLI/Xfire setups), all @1080p:
Witcher 2
Crysis 2 (including high-res + DX11 patch)
Metro 2033
I'm not saying it's a disaster that you can't run them at max @ 50-60 FPS. But it's still not possible. Don't think you have to defend your rig; it's great you guys have a GTX 580M or 6990M. I have a 5830 Mobility which is much weaker but I'm fine with it (I prefer mobility but still with great performance). It can still run any game out there, just not on max, but it still runs.
BF3 on a 7970: -
It's the 4xMSAA that kills the performance. But why would anyone use 4xMSAA in the first place? A much better and higher performing alternative is FXAA (preferably using an injector rather than the built in one) which gives better image quality and much higher framerates. I get 60+ FPS on my M18x with all the settings on Ultra + FXAA enabled @ 1080p via injector. It is an SLi setup but I think a 680M will achieve at least 70% of the performance of 580M SLi. -
I'm not sure, but can ATI users also use FXAA?
And like you said, you run SLI/Xfire, and I didn't say next gen would not run it maxed. But the current gen doesn't.
As you can see with the GTX 580M: still a few games that can't run maxed. Alan Wake for instance, The Witcher 2, Shogun 2, Crysis 1-2 (I think the Crysis 2 benchmark is without the DX11 patch?), Metro 2033 -
-
-
Read long2905's post (page 16) or kevin's post above; they understand the word max..
AA settings can be VERY important in some games. I understand that some games with 1xAA look as good as some with 8xAA, that's true.
But like mentioned above, what about tessellation, depth of field, etc.?
Just go to the link I posted a few hours earlier: even with a GTX 580M there are still like 4 or 5 AAA games that can't be maxed.
Seems like you don't want to see the facts. Don't get me wrong, it's not your fault mobility chips are weaker.
But please accept the facts -
You can "max" any game on any average GPU as long as you enjoy the slides, get it? I don't really understand you kind of people who understand what is meant but still talk about technicalities to come up with an argument, another fun thread.. I was precisely defending the fact that our mobile GPUs are not powerful enough to play games like Crysis 2 fluently at max settings about a week ago in this forum in another thread, but because I state the settings and fps I get and unfortunately use some kind of (English?) "max" phrase, I need to defend myself to you... biggest waste of time on the planet. Anyhow, trust me, I know what the word max means (math PhD major)
-
http://media.bestofmicro.com/B/Y/323134/original/image017.png
Then it must be clocked at 100% higher speeds than the 6990M... which will not be the case...
Why is it so hard to accept? I didn't ask for your framerate, I just stated the obvious, saying it cannot be maxed, up till today.
Max = max, not turning off or stepping down settings; so hard to understand?
And of course by maxed out I mean a PLAYABLE framerate (40 average), and you knew that I meant that.
If you play it that childish way I can also prove you wrong that not every card can even run the highest settings (non DX10/11, not enough memory, etc.) BUT THAT'S COMPLETELY BESIDE MY POINT... -
-
Also, here is your superclocked 6970M; watch it, a nice video:
6970m | Metro 2033 (Very High Settings) - YouTube
Oh btw, 25fps is playable for me; 40fps is your idea. And yes, for some of us I am sure even 15-20fps is nice; heck, we had a guy showing his slideshow a few months ago on YouTube, saying he enjoys it -
Luckily you read the description of the video:
-
Supposedly GK107 benchmarks. CPU was a 3960X@stock
3DMark 11, 06, 05, Vantage:
videos were deleted from youtube -.-
http://www.youtube.com/user/TheXtremeAnalyst?ob=0&feature=results_video#p/u/0/uUzZJdOlT5k
3DMark11: P3818 (HD7770=3530)
3DMark06: 23893 (720p, noAA)
3DMark05: 31734 (1024x768, noAA)
3DMarkVantage: P16079
RE5 1080p, 16xAA, DX10:
Variable benchmark: 105.6fps
http://www.youtube.com/user/TheXtremeAnalyst?ob=0&feature=results_video#p/u/0/uUzZJdOlT5k
A GTX470 paired with an i7-3960X renders 133fps, but only with 4xAA: -
Do you have a written source for it, gamba66?
-
It was posted on the Nvidia forums in the Kepler speculation thread!
There were about 5 videos from that user but his account is now closed.
I suspect the numbers are from the video descriptions, supposedly from the GK107 engineering sample, but not confirmed if real -
Oh look at this.....
It has to be Kepler, and not an overclocked GTX 570M, right?
Uh oh, suddenly I feel we've been bamboozled. -
lol that IS a 570M. A slightly overclocked one
-
But the GTX 660M is confirmed Kepler? Suddenly I have serious doubts about it not just being a 560M.
And what if the 675M is just a GTX 580M?
What if the whole 600M line is rebrands?
What if...
What if..... -
And B) the rest of the 600 series just has to be Kepler. We haven't heard anything about a mobile 700 series. The 400M series and 500M series were all Fermi. I can't understand the 600 also being Fermi. -
Okay, I know 100% that GTX 660M is Kepler. I just don't understand why Nvidia would need to reuse the GTX 570M.
-
Who said that they were? That GPU-Z screenshot is not fake?
Or maybe the 670M indeed is a rebadged 570M and the 675M is Kepler instead? *confused*
That does NOT make sense though, since there are new revisions of popular laptops coming out in April that will only have the 670M. Like the Asus G75. No way that Asus would just go for a rebadged Fermi?! -
Well, they did go from 460M -> 560M and that was a pretty small difference. But a picture from some random guy's Photobucket doesn't mean anything.
-
Result from 3DMark06
-
I NEED VANTAGE.
give it to me please
Oh, and the fact that it's being tested with a Sandy Bridge means it's in a current-gen machine. -
I really think these are fake; even such a doubter (me) cannot believe Nvidia will rebadge the 500M. BUT Kepler is about power efficiency, so maybe these are engineering samples clocked slower, and they will be somewhat similar in terms of number of shaders, just much more power efficient? Still cannot believe the GTX 670M has 336 shaders..
-
Well, if Nvidia rebadges the high-end and mid-range cards again (well, the 640M and 650M should be new cards, since the 555M is the 635M) we are going to be left with most of the 400M series AGAIN!
Let's see:
420M = 520M
520MX = new
425M = 525M
435M = 540M
435M = 555M = 445M = a new chip found in the LG P330
460M = 560M
470M = discontinued
570M = new
485M = 580M
It's terrible; at least the high end was last year's chips (not counting the 560M). Let's at least hope that the mid range doesn't suffer, and the high end gets some upgrades.
Do correct me if I'm wrong, but that is just gouging -
But isn't one of Kepler's notable features the lack of "hot clocks", i.e. shaders running at a higher frequency than the core?
That would confirm this as a Fermi chip. -
At the Alienware forum we have been looking forward to some crazy GTX 680M for a long while... hope it happens
-
Clue: Look at device id. -
Well, at least if that 3DMark score is correct we can upgrade the GPU in our current machines.
-
-
If you see something the rest of us haven't noticed, please enlighten the group, before it spreads.
To me, the device ID looks exactly like what they'd do for a rehash, same basic structure as every other Fermi card, with a small revision at the end.
I'm not yet claiming it's real, but it came from someone who has the new 15" Clevo in hand, for testing. -
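For anyone who wants to check a card's IDs themselves instead of trusting a screenshot, here is a minimal sketch that lists Nvidia PCI vendor/device IDs from Linux sysfs. 0x10de is Nvidia's PCI vendor ID; which device IDs correspond to rebadged Fermi vs. genuinely new Kepler silicon is exactly the open question here, so nothing chip-specific is hard-coded:

```python
# Minimal sketch: list PCI vendor/device IDs of Nvidia GPUs from Linux sysfs.
# The device ID is the same value GPU-Z reports and is what would distinguish
# a rebadged Fermi part from a new Kepler chip.
import glob

NVIDIA_VENDOR_ID = "0x10de"

for dev in glob.glob("/sys/bus/pci/devices/*"):
    try:
        with open(f"{dev}/vendor") as f:
            vendor = f.read().strip()
        if vendor != NVIDIA_VENDOR_ID:
            continue
        with open(f"{dev}/device") as f:
            device = f.read().strip()
        print(f"{dev.split('/')[-1]}: vendor={vendor} device={device}")
    except OSError:
        pass
```

GPU-Z's "Device ID" field shows the same vendor/device pair, which is what the screenshot in question reports.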
-
Maybe because of NDA? Or to get a better comparison for the GPU?
Remember that you can exchange cards with MXM slots, so it isn't that unlikely -
Ivy Bridge is supposed to use the same socket as Sandy Bridge, so it's conceivable that the P150EM could be configured with a 2720QM; especially if Ivy Bridge is being delayed.
-
Long ago Intel said Ivy Bridge wouldn't be compatible with any of the current mobile chipsets.
They may have the old motherboard in the new chassis. -
The old motherboard lacks the connector for backlit keyboard though.
-
Hmm, you are right there.
I think the precise thing for me to say is that we have no definitive proof that the test was not run with a regular P150HM, despite the fact that a picture of the new model was posted alongside the test results.
With the 670M being a rebrand and not Kepler, it would definitely plug in and work with the current chassis.
But Intel was 100% clear about Ivy and Sandy chipsets not being compatible, and has never released a statement saying they changed their minds. -
So I don't think it'd be too unlikely for a 2670QM to work in an HM77 system. -
No, VERY interesting, because that means I may be able to buy a completely empty P170EM shell, and then transfer all of my gear into it. -
Man, how am I supposed to find an M17x...R4? Barebone? =)
-
http://vr-zone.com/articles/nvidia-kepler-power-circuitry-revealed-300w-tdp-/15011.html
Bad news?
My guess: a 28nm rehash of the Fermi chip, but inefficient.
A fairly reliable source from semiaccurate:
http://semiaccurate.com/forums/showpost.php?p=153832&postcount=1180