That's an article from 2 years ago, and the chart in it is totally wrong. Kepler has less DP FP performance per watt than Fermi, not even close to the 2x/3x perf expected.
-
-
good find btw! +rep -
They have a date for the 680M as well. Awesome
"Delivery date: 07/13/2012"
One Gaming Notebook M73-2N by: One - ONE Computer Shop -
so what are the final rumors about the 680M? what's the rumored 3DMark11 score? (I am not following this thread well anymore)
-
weeelll..... approx. 200-300 $/€ more than the 7970M, approx. 10-15% faster than the 7970M (in Vantage), supposed to be presented soon during Computex, supposed to be available sometime in July/August.
did I miss something? -
10 points to jaybee
GTX 680M scored 14% more in Vantage than 7970M -
Cloud can you post a few screenies you found in the last couple days?
-
oh and we could pretty much say that a 7990M turned from "fabled" to "probable" sometime this fall, right?
-
@Cloudfire:
I read the original text (Google Translate sucks for someone who knows German) and yeah, very probably in a few days we will see the review, and it should arrive next month. But the whole discussion was about exactly that: one.de often lists things and can't keep the promised delivery date, so that's why I say "probably".
The 7970M is already a monster....what about the upcoming 7990M?
Hard times for Nvidia... -
Second of all, where on earth will AMD get the extra performance for the 7990M? The 7970M is already at 100W, the max consumption MXM can deliver. They can't just magically get more performance out of the same consumption. Well, there is one way, but it's through architecture improvement, like the jump from 460M to 560M (both Fermi), but AMD won't be out with the 8000M series until 2013...
AMD went all in with the 7970M and crushed everything previously released. Now they can just sit and pray that Nvidia won't release something that beats their flagship. But either way, the 7970M will sell like crazy because of the cheaper price.
You AMD fans keep finding excuses for everything -
dude, chill out
I'm aware of all these points already (the 100W limit being reached, etc.)
second, I'm neither a green nor a red team fanboi, thank you very much ^^
thirdly, apparently you don't read the sources you post here very carefully? You yourself just quoted the 7990M part here.
my guess is an OCed 7970M, and why not? seeing that it's easily OCeable up to 20%. Besides, the 7970M is rated at 100W, but that doesn't mean it consumes the full 100W. Lots of users have already posted that the 7970M consumes significantly less power than the 580M, although both cards are rated at 100W. Add to that the jump from 485M to 580M, which also took just 5 months (and also included no real arch improvement, just a simple 9% OC).
thus, a release in September or October wouldn't be that far-fetched
cheers -
SlickDude80 Notebook Prophet
Diablo3 initially used Havok, but it was replaced with a custom physics engine that works on all platforms
I think you are confusing Nvidia PhysX with game physics.
PhysX is Nvidia's implementation of physics, but it is proprietary to Nvidia cards... meaning hardware acceleration only works on Nvidia cards. Generally speaking, game devs like to use technology that doesn't require you to have specific hardware... but Nvidia has deep pockets and may "sponsor" a game to use it -
There is no way they will release a fully enabled 7870. AMD wasn't capable of releasing a fully enabled 6870 last generation either. The desktop 6870 was clocked at 900/1050; the 6990M, which was their greatest, was at 715/900. -
-
Really hoping that most major releases won't be exclusive one way or the other - especially GTA5!! -
-
What we need to worry about here is whether we will be stuck with a horrid console port on PC.
But for non-Nvidia hardware i recommend to turn on PhysX.
It's crippled when running on CPU. -
u mean turn off PhysX for non-Nvidia hardware? ^^
-
HaloGod2012 Notebook Virtuoso
hmm, so it looks like with my overclocks and scores my 7970m is up there with a stock gtx 680m
-
-
what's so depressing is that the 680M was going to be a low-end card before AMD's 7970M came out and Nvidia reshuffled their whole lineup for the 680M. There's a reason the GT 650M/660M Kepler parts are close to the 670M or 675M in performance: Nvidia was trying to make as much money as possible, while AMD did what Nvidia should have done and released a top-end card with real performance, instead of a 600M series with minimal performance increases.
-
-
What would have been depressing is if the 680M had been the weak P4900 like the first sample they had. Not much competition there to drive the performance up on future models. We should be glad they keep competing, like Joker says.
As for the previous Kepler models, Nvidia should have settled with the GT 650M. It seems like a sweet spot, both in performance (GTX 560M level) and in TDP at 45W (compared to the 560M with its 75W TDP). I feel that the GTX 660M was a half-as*ed attempt from Nvidia and didn't bring anything new to the table.
The rest of the 600M series, the 670M and 675M, is pathetic, and I wish Nvidia would stop lying to their customers about top models (renaming) and instead focus on bringing the price down for the 500M GPUs. Look at AMD. They do it the right way -
SlickDude80 Notebook Prophet
the 680m will cost a fortune and availability is still months away. Enjoy your 7970m. It is currently the fastest mobile vid card in the world
i will confirm or disprove the numbers when i bench the 680m -
You will buy that too?
-
SlickDude80 Notebook Prophet
the simple answer: yes i will.
the convoluted answer: if i can get one at a reasonable price, yes. What i hear is that they are still at the ES (engineering sample) stage. The early ES cards are essentially impossible to get unless they are stolen or you work for Nvidia engineering. When they get to the QS (qualification sample) stage and there are some of those cards floating around with vendors and laptop manufacturers, i will get my hands on one... but like i said earlier, only if the owners of said cards don't gouge on the price. I hate paying a fortune for something that they essentially got for free
-
Niiice. Who is telling you all of this? Hook me up with one please
-
relax guys :-s. It's just the wording that makes it seem (to me) like he was aiming for a high-end desktop card or something, which demeans the value of the 7970M.
And your last line doesn't help, slick -
hahaha, classic slick with his special connections
-
I'll get one as soon as it comes out. Dell has been pushing to replace my system, but I've been holding out for the 680M at the moment. I've already benched the 7970M and shown what it can do, which is amazing (6951 in 3DMark 11, tessellation on).
I have high hopes for the 680M, but I'm keeping it all reserved. I expect it to be good but not amazing, because it will be let down by price. Anyways, I have no intent to be dethroned from my Alienware pedestal
PS. I strongly believe that with those newer numbers, the 680M will be similar to the rumoured 660 Ti with 1152 shaders. I'll be really P'd off if that isn't the case and it's 768 instead. -
TypoCeption -
rofl, good one
-
To compete against the 7970M while using 768 Kepler shaders, Nvidia would have to clock the 680M to something like 950-1000MHz, and even higher to get it 10% faster. Given Kepler's architecture, this is possible. That, or it actually has more than 768 shaders. -
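As a back-of-envelope check (all concrete numbers here are assumptions for illustration, not confirmed 680M specs): if performance scales roughly with shader count × clock within the same architecture, you can estimate the clock a 768-shader part would need to match a hypothetical 1152-shader config:

```python
# Rough sketch: within one architecture (Kepler here), raw shader
# throughput scales approximately with shader_count * clock.
# The 1152-shader config and its 650 MHz clock are rumoured/assumed
# values, not confirmed specs.

def required_clock(ref_shaders, ref_clock_mhz, new_shaders):
    """Clock (MHz) a new_shaders config needs to match the
    reference config's raw shader throughput."""
    return ref_clock_mhz * ref_shaders / new_shaders

# If a rumoured 1152-shader 680M ran at ~650 MHz...
clock_768 = required_clock(1152, 650, 768)
print(f"A 768-shader config would need ~{clock_768:.0f} MHz")  # ~975 MHz
```

Under those assumptions the 768-shader part lands right in the 950-1000MHz range mentioned above, which is why the two rumours aren't necessarily contradictory.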
I'm also looking forward to see how the 680 overclocks, I'm willing to bet it has one of the lowest ceilings we have seen from a card yet as all this time has probably been spent trying to figure out ways to push performance out of a "low-end" card (which is slightly impressive I'll give them that). -
- given a clock somewhere around 950MHz, I would probably buy a P150EM barebone w/ the 680M and a cooling fan, and let the OC begin!
dear god please make this happen
imagine the power... I can't -
Kingpinzero ROUND ONE,FIGHT! You Win!
Guys, i understand defending each side, either green or red.
But i also want to stop a second and think about how the technology has evolved. Whether you get a 7970M or a 680M, those two cards have evolved to a point where a 5 fps difference doesn't matter anymore, because basically they can output almost any game at 60fps and beyond.
10-15% faster translates into a bunch of FPS. Now, when the difference hits 10-12fps, that's a thing to keep in mind.
If you got the 7970M, you got a beast of a card. If you get the 680M, it will probably be a bit faster, but it will be in the same range as the AMD beast, not a huge jump above it.
Then, as an Nvidia fan and user, i could say that their drivers are far better than AMD's, but some experienced AMD user could call BS on me because they could claim the same.
Either way, enjoy what you have, because there's so much power that a 5-10% speed bump is not that important. -
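The percentages-vs-real-frames point above is easy to make concrete. A quick sketch (the baseline FPS values are hypothetical, purely for illustration):

```python
# How a 10-15% GPU advantage translates into actual frames per second,
# at a few hypothetical baseline frame rates for the slower card.

def fps_gain(base_fps, pct_faster):
    """Extra FPS from a card that is pct_faster (e.g. 0.15 = 15%)."""
    return base_fps * pct_faster

for base in (30, 60, 90):
    for pct in (0.10, 0.15):
        print(f"{base} fps baseline, {pct:.0%} faster -> "
              f"+{fps_gain(base, pct):.1f} fps")
```

At a 60fps baseline, 10-15% is only 6-9 extra frames; double-digit FPS gaps only show up when the baseline is already high enough that they matter least.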
besides power numbers... nvidia has better color?
-
-
No, nVIDIA doesn't have better colours, nor does the GPU look better. -
-
Meaker@Sager Company Representative
There are only 2 sides, companies and consumers. Any other division is silly.
-
Kingpinzero ROUND ONE,FIGHT! You Win!
They kept doing this after each refresh. Maybe it's their policy to discontinue legacy GPUs very soon compared to Nvidia, or maybe it's a way of admitting their problematic driver development.
Cutting out old cards means having more resources to work on newer GPUs, but it also means never reaching the point where a driver can be called "mature" or "stable".
Months ago, with Rage and Skyrim, they basically acknowledged that one driver can't contain fixes for both games, lol. Either you use a driver for one game, or for the other. Then some tweaks came to light and things were sorted out by the users, and lately by AMD itself.
What i find funny is that they keep releasing the "fastest" GPU around; each year it's the same story.
Let's get this straight: the hardware is sublime. It has the numbers, the power, whatever. But it seriously lacks drivers.
Now, before someone starts flaming me, i just want to explain why i usually say it "lacks" drivers. Obviously it doesn't literally, because drivers keep being updated, but if you came from Nvidia (and i'm speaking about having Nvidia as a main brand for at least 10 years) you know what i mean.
Nvidia has issues as well. Their drivers used to kill GPUs. It doesn't happen always, but they can fix one thing and break another. Their GPUs used to die without reason (everyone remembers, right?), and some of them keep doing that - but - in the majority of cases, everything goes like: 1) get a good GPU, 2) install drivers, 3) play the game. In 99% of cases the drivers will perform exactly as they should, even with newer games that haven't been officially supported yet.
This rate of "success" is, by far, higher with Nvidia drivers than with AMD's.
My experience with AMD hasn't been great. The Crossfire desktop setup i had (5870s) was never even pushed to decent levels because of the worst drivers i've ever seen. Back then their habit was to release a driver update after... after who knows. I was left with no new drivers for what, 3 months, while Nvidia users with cheaper cards at the time were getting betas every 2 weeks. And what made me mad is that those cheaper cards used to beat my Crossfire (or single card) in some games where i was supposed to destroy them.
It's love and hate, i guess. I'm ready, god willing, to grab a 7970M and turn into a red team believer. But seriously, they need to get their things together. Denying their bad habits in driver development is like defying reality.
I wish every AMD fanboi had to use an Nvidia GPU of the same segment as theirs for AT LEAST 2 months. Then i would like to hear their POV, possibly unbiased. -
-
-
Kingpinzero ROUND ONE,FIGHT! You Win!
ATI drivers always gave me the best color range, more vivid and brilliant than Nvidia's, and I reached the point of doubting my own eyesight.
Then I did some tests on a calibrated monitor and I was impressed that my theories were right, although the gap wasn't as great as I had imagined.
It's strange to believe, since technically the color output should be the same on the same monitor; but then someone experienced in the movie/photo business began explaining things like color luminance, compression, video quality, signal quality, cable quality, distortion, artifacts and so on.
I was blown away by all the things that can make the same color, displayed in the same way/palette, look so different.
Dunno how it has been lately, because I have an Nvidia GPU right now, but I'll tell you: after every driver installation I go straight to the Digital Vibrance slider to get things looking at least not as WASHED out as they are at stock.
Just my 2 cents. -
Nothing against AMD, as they seem to have come around, but back in the day they had me hot. Maybe they've turned over a new leaf...
Edit: contradictory info here: http://forum.notebookreview.com/ali...e-cpu-heatsink-7970m-crossfire-upgrade-7.html -
For me, AMD wins this one hands down.
For the simple reason that they got their hardware to market before Nvidia did.
I'm no fanboy, although I do have a 5650 card in my current laptop. The reason? When visiting the States before moving here, my old laptop died, I had to get something semi-decent to game on, and Office Depot had an HP in stock with that card.
In January, I came into a position to buy a replacement laptop, and waited for Ivy Bridge to release, as it seemed sensible to get the latest tech. And AMD came out with the best card on the market, so rather than wait around another 6 months and pay about $200 more for a card that might (we still don't know for sure) beat the 7970M, I dropped my money on what was known and available at the time.
And I suspect I'm not the only one in this position. -
AMD 7970m vs GTX 680m
Discussion in 'Gaming (Software and Graphics Cards)' started by x32993x, Apr 20, 2012.