what are those extra features?
-
Karamazovmm Overthinking? Always!
-
What extra features?
Features that are proprietary such as CUDA?
Just so you know, both companies copy each other... just like everyone else. -
SlickDude80 Notebook Prophet
yoda is talking about PhysX, CUDA and 3D Vision. All proprietary and all in danger of going extinct in favor of open-source tech... well, maybe not 3D Vision, but most people don't like the FPS hit of 3D, especially on a laptop
-
-
Karamazovmm Overthinking? Always!
Thought as much, but those are non-issues for me and thus not an advantage
-
3d is stupid. I personally don't care for 3d. If I wanted things coming out at me I'd put my face in front of a pitching machine.
This 3D fad will die soon enough. -
SlickDude80 Notebook Prophet
-
2nd - Like you said, I also believe "Nvidia will hold back the 680m as long as needed to release a comparable product." The problem is, how long will they hold it back? Is it worth waiting? Can you wait? Is it worth waiting months and months for a card that will possibly be just 5%-10% more powerful and possibly more expensive, just for the sake of benchmarks? (cuz if your concern is gaming, the 7970m is already a monster).
Like I said: today, depending on what you want the GPU for, Nvidia might offer advantages for you with the CUDA cores, but that may change.
PhysX - You can count on your fingers the games that use it.
CUDA - If your concern is gaming, it won't make a difference.
3D Vision - Not enough knowledge to talk about it, but I believe most people won't use it, and for those who will, AMD's solution might be just as good for their real needs. -
A randomly thought up pricing question: if they have had to basically scrap the original plan for the 680m, respin the silicon and start from scratch, will nVidia NEED to pass some of that cost on to the customer?
I'm guessing that they might, as I'd imagine it's pretty expensive. I know they're known for being an expensive company anyway; might this mean they'll end up even less competitive on price? -
No, because it's not as if Nvidia is busting out brand new, custom silicon for each mobile card. All they would've done in "scrapping" the original 680M was change which desktop core it comes from, then adjust accordingly.
If it's ultra expensive, it'll be for the same reason it always has been, which is because the people making the decisions are as-.. er, I mean jerks. -
The only reason I opt for Nvidia over AMD at the moment is that I'm one of those guys who detests horizontal tearing with a passion.
The ONLY game I have found the Nvidia control panel's force-vsync option to fail with is Starcraft 2... which has native vsync support, so no s given.
On the other hand, I am yet to find a game in which AMD CCC's force vsync actually works... OH DEAR.
With my last AMD GPU, the 5870M, my saving grace was D3DOverrider. However, AMD GPU owners have been complaining that D3DOverrider does not work properly, if at all, with Windows 7 SP1 for some reason, and with no other decent method of forcing vsync in games with no proper/working native vsync support (Metro 2033, Dead Space 1 and 2 [native vsync at 30 fps, HURRR], Silent Hill 3, Gears of War, Crysis, and many others), I feel a little anxious about purchasing a 7970M.
However, if any AMD owners can confirm D3DOverrider works in Win 7 SP1, or have a viable alternative, I would be over the moon, as I'm ready to buy a 7970M right now.
Cue Slickdude: "Dan just wait!" -
I've been using D3DO with Windows 7 since the Beta, and I'm still using it today with SP1, in every game I play. Zero issues.
-
Seems like GTX 670 might end up as a notebook GPU after all if Nvidia wants to go that route.
The GTX 670 has about the same power consumption as the 7870, but completely destroys it.
GG 7970M. It was nice to see you.
Review of 670:
NVIDIA GeForce GTX 670 2 GB Review | techPowerUp
-
Desktop 670 > Desktop 7970
The 670 is a great card, there is no doubt about that.
In AnandTech's review - NVIDIA GeForce GTX 670 Review Feat. EVGA: Bringing GK104 Down To $400 -
the 670 did better than the 7970 in all the popular games (Skyrim, Batman, Battlefield, etc.).
The 670 is in fact better than the 680 once you overclock it (overclocked 670 > overclocked 680),
and at stock speeds there is no "real" difference between the 670 and 680.
AMD needs to drop the 7970's price to $399 and the 7950's to $299, otherwise they are getting slaughtered -
Yeah, it's so sick. And the best part is that it draws less power (under 50W less!!!) than the GTX 560 Ti, which the GTX 580M was based on. But the 670 is 100x better. Man, this could really happen
-
Now to see if Nvidia would actually implement it, and when, and for how much :-? If it's reasonable enough (~6-700) I'm in; otherwise I would sit this one out
-
I think you should take a look at average and maximum power consumption. That should be closest to the actual cooling needed, if we draw a parallel to notebook cards.
Vs the HD 7870 it uses 41W more on average and 18W more at maximum.
Also note that performance per watt is still 10-18% behind the HD 7870. So don't expect a P7000 miracle here. You will be disappointed.
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_670/29.html -
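That perf-per-watt claim is just arithmetic on review numbers; here is a minimal sketch of the calculation. The relative-performance figure and wattages below are illustrative assumptions for the shape of the argument, not TechPowerUp's exact data:

```python
# Rough perf-per-watt comparison sketch (illustrative numbers only):
# relative gaming performance divided by average gaming power draw.
def perf_per_watt(rel_perf, avg_watts):
    return rel_perf / avg_watts

# Assumed: 670 ~25% faster than the 7870 on average, but drawing
# ~41W more (hypothetical figures, not measured data).
ppw_7870 = perf_per_watt(1.00, 115)
ppw_670 = perf_per_watt(1.25, 156)

gap = (ppw_7870 - ppw_670) / ppw_7870  # fraction the 7870 leads by
print(f"7870 leads the 670 in perf/W by {gap:.0%}")
```

With these assumed inputs the gap lands in the high single digits; the 10-18% quoted above depends on which game and power measurement you pick.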
Meaker@Sager Company Representative
Nvidia are also struggling to get cores at the moment.
We will have to wait and see what they can actually produce. -
The 670 draws more than 100W less at max power consumption, yet it outperforms the 7970 in most games; the 670 is basically a 680 with ~5% less performance, and in some cases the shipped SC version performs better than the 680.
Anyone who downplays the 670's performance per watt basically loses any credibility.
If the 680m were a downclocked 670, it would easily outperform the 7970m and reach P7000; there is no question about it. The real question is how much and when. If Nvidia takes too long, AMD will have the 8970m out by then -
-
You compare a GPGPU monster card to Kepler? Seriously? You may do that on any desktop forum, but here we are trying to speculate on upcoming GTX 680M performance vs Pitcairn XT. The HD 7970 is a Tahiti core and has absolutely nothing to do with Pitcairn vs Kepler performance on notebooks.
It's like comparing a Boeing 747 to an F18, if that example makes more sense to you... -
Benchmark Results: Sandra 2012 And LuxMark 2.0 : GeForce GTX 690 Review: Testing Nvidia's Sexiest Graphics Card
GPGPU Monster? More like God.
Imagine the results in QuadCrossfire; all GPUs @ 1.2Ghz. -
Take a step back. You are misinterpreting things here.
His point is that the performance level of the desktop 670 is well above the desktop 7870, which is the basis of the 7970m.
The desktop 670 might very well fit into the mobile arena much like the desktop 7870 did (indeed the data shows the two are similar in actual power consumption).
Yes, much like the desktop 7870 was cut by 15% in clock rate, the desktop 670 will need some cuts to make the power cutoff... but from the looks of things it could theoretically work.
This speculation is well within the topic. -
Off-topic subjects ≠ interpretation
And back on-topic, I doubt the 680M will have 13xx CUDA cores.
It would need a beast of a cooler. -
-
-
The 580m is based on the 560 Ti, which runs hotter, uses more power, and is weaker in performance compared to the 670.
The Metro chart does not show everything:
http://images.anandtech.com/graphs/graph4135/35200.png
http://images.anandtech.com/graphs/graph5818/46462.png
Under max load the 670 is cooler than, or the same as, the 560 Ti, and both of these charts are for Nvidia reference designs.
You are ignoring the move to 28nm.
The reality is Nvidia cut a lot of power consumption from the 680 to the 670 while losing very little performance -
zero989: They are talking about the desktop version being made into a laptop card.
-
So to sum up all available data so far, it's definitely in the realm of possibility that the 680m will be based on the desktop 670. That would indeed be amazing and really pretty much destroy the 7970m. On the other hand, Nvidia is not really left with anything else to do, since they're so far behind AMD in their timetable. Considering the awesome OCability of the 7970m, one could speculate on an OCed version with a "7990m" stamp on it, thus closing the gap to Nvidia without robbing them of their obligatory 5% perf. advantage / 50% higher price disadvantage.
just my 2 €cents
Sent from my GT-I9001 using Tapatalk 2 -
I have no doubt that they CAN make a GTX 680M from the GTX 670.
BUT. Don't be naive here.
They need to cut clocks/voltage a lot, because the GTX 670 is still a hot card and consumes more power than the 7870 does. It's not a linear curve we are talking about here: GPUs are efficient over a fairly small range, and I'm not sure the clocks needed for 100W fit in that range.
Also, we have no idea how GK104 reacts to such low clocks.
Besides, the GPU voltage needs to be cut a lot as well. We don't know what the minimum voltage for it to run would be, or how it would scale with clocks.
It's possible the GPU doesn't like too-low voltages and crashes even at low clocks.
I'm still very doubtful that an underclocked GTX 670 will be the new GTX 680M. To me it seems too big a task to accomplish.
I think they will shut down some more cores, and perhaps then it's able to fit inside a 100W envelope. -
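The "not a linear curve" point can be made concrete with the standard first-order dynamic-power model, P ≈ f·V². This is a rough sketch only: it ignores static leakage, and the 15% cuts below are hypothetical numbers, not anything Nvidia has announced:

```python
# First-order dynamic power model: P scales with frequency times
# voltage squared. Ignores static/leakage power, so it overstates
# how much a downclock/undervolt actually saves in practice.
def scaled_power(p_base_watts, freq_ratio, volt_ratio):
    return p_base_watts * freq_ratio * volt_ratio ** 2

# Hypothetical: start from the 670's 170W TDP and cut both clocks
# and voltage by 15% to chase a ~100W mobile envelope.
p = scaled_power(170, 0.85, 0.85)
print(f"estimated power: {p:.0f}W")  # still just over 100W
```

Even under this optimistic model, a 15% cut to both clocks and voltage only gets you to roughly 104W, which is why the poster expects cores to be disabled as well.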
The only way for a 680M to receive 1344 CUDA cores is if the card were to run at 100W+, though I could be wrong. It would also be the greatest engineering feat in the history of engineering. It suggests that nVIDIA would somehow reprogram their power management so that the card would never exceed MXM 3.0(x) limitations, and perhaps underclock the card in extreme circumstances, as seen in OCCT?
I'm glad someone understands. -
I couldn't wait anymore on the 680m. AAFES had a deal yesterday for 25% off Alienware laptops. I got an M17x R4 with:
Alienware M17X R4 with Soft Touch
Operating System: Genuine Windows® 7 Professional, 64-bit
Processor: 3rd Generation Intel® Core i7-3820QM (8MB Cache, up to 3.7GHz w/ Turbo Boost 2.0)
Memory: 32GB Dual Channel DDR3 at 1600MHz (4 DIMMs)
Keyboard: English Keyboard
Display Panel: 17.3-inch WideFHD 1920 x 1080 60Hz WLED
Video Card: 2GB GDDR5 AMD Radeon HD 7970M
Hard Drive: 500GB 7,200 RPM Storage + 64GB mSATA Boot Drive
AlienFX: Mars Red
Adobe Acrobat SW: Adobe Acrobat X Reader
Hinge Up: Stealth Black with Soft Touch Finish
Audio: Creative Sound Blaster Recon3Di with THX TruStudio Pro Software
Optical Drive: Slot-Loading Dual Layer Blu-ray Reader (BR-ROM, DVD+-RW, CD-RW)
Wireless Networking: Intel® Advanced-N WiFi Link 6250 a/g/n 2x2 MIMO Technology with WiMax and Bluetooth 4.0
Adapter: Alienware M17x 240W A/C Adapter
Documentation: Alienware Documentation
Office Productivity Software: Microsoft® Office Home and Student 2010
Shipping Material: Black
Security Software: No Anti-Virus Software Selected
Additional Software: Additional Software
Primary Battery: 90WHr 9-Cell Primary Battery
Alien Wallpaper: Alien Red Glyphs
Hardware Support Services: 2 Year Basic Plan
$2145 OUT THE DOOR!!! I couldn't pass that up. It's going for over $3K now.
I still can't believe they don't have a Blu-ray burner for this thing -
wow, that indeed is an amazing deal, good job buddy!
-
This is what the 680M would likely be:
CUDA 960 or 1152
700-900MHz @ 0.9v?
256-bit GDDR5 128GB/s
TDP 100W
I believe it would still be faster than the 7970M.
Edit: nm the 7870 is too close to the 7950. -
HSN21, just give it up. There are way too many AMD-biased people in this forum who will never see the light of day even when it is right in front of their damn face.
This GPU completely devastates the HD 7870, which the 7970M is based on. If Nvidia decides to make a GTX 680M out of this baby, we should expect something like a downclocked 7970, since the GTX 670 trades blows with it.
BUT as a precaution, we don't know if Nvidia can scale this GPU down to a mobile version as well as AMD did without sacrificing more performance.
FACTS:
- GTX 580M is a downclocked GTX 560 Ti
- GTX 670 draws 56W less than GTX 560 Ti with the GPU at 100% load. Maximum is tested with Furmark.
- GTX 670 draws 7W more than the 7870 (which the 7970M is based on) under heavy gaming, i.e. Metro 2033.
- GTX 670 draws 4W less on average than GTX 560 Ti while playing Crysis 2 at 1080p, Extreme profile.
As for temperature, I really don't care, because Nvidia will downvolt this GPU to fit into the thermal envelope of notebooks anyway, and that will have *NO* performance hit. Anyhow:
- GTX 670 runs a measly 5 degrees warmer than GTX 560 Ti (the 670 playing Metro 2033, the GTX 560 Ti playing Crysis).
http://images.anandtech.com/graphs/graph4135/35199.png
http://images.anandtech.com/graphs/graph5818/46461.png
- TDP of the 670 is 170W; TDP of the GTX 560 Ti is 170W. They should get equally hot on average.
And Supranium, please, 99% of people in this forum wouldn't give a rat's a** about GPGPU performance. -
Fact: the power draw in various situations is proven to be higher than the 7870's.
And it's not bias, it's fact. I've owned more nVIDIA gpus than AMD as well.
My eyes are set on the GTX 685 (desktop) this year so, I don't see how there's any bias.
Also, the 3DM11 scores would be much higher than 6.5-7K if the 680M rev 2.0 had 1344 CUDA cores. -
Karamazovmm Overthinking? Always!
I'm hoping the 680m is a downclocked 670 and so forth; I do hope it shreds the 7970m. We have to see how the clocks are going to scale since it's a new arch; let's hope it ain't much of a hit.
-
-
Karamazovmm Overthinking? Always!
and perf per watt might not be everything, but the higher the TDP and power consumption are, the lower the clocks and voltage are going to have to be.
-
Double post.
-
http://www.techpowerup.com/reviews/ASUS/GeForce_GTX_670_Direct_Cu_II/27.html
The 670 is one of the coolest-running GPUs ever in its range; on most cards the fan runs at very low speeds. Just because they show 70C in benchmarks doesn't mean much, since these GPUs ARE programmed that way (70C is normal, so the fans don't need to kick in faster to cool it). A GPU with fans at 80% speed sitting at 78C is not identical in heat output to a GPU at 78C with 30% fan speed: manually boost the second card's fan speed to the level of the first and it will end up much cooler.
"And since the 7970m is based on the 7870, which does a better job, therefore a Kepler-based 680m should be weaker than the 7970m" -
we got it, and we laugh at it, because it's been disproven plenty of times and posted, yet you decided to ignore it. Look at the 560 Ti and 580m: performance per watt is not identical, and neither is it for the 670 and 680. You fail to realize your argument is baseless -
580m MATCHED the CUDA cores in the 560 Ti.
Are you saying the 680m will match the CUDA cores in the 670?
You seem to be ignoring that.
670 stock score is 9K in 3DM11
7870 stock score is 6.86K in 3DM11
The 680M will likely still be faster, but it won't be as closely based on the desktop 670 as the 580M was/is on the 560 Ti.
And your fan-% point doesn't help you, because cooling capacity is sized for total wattage. The HSF is not the same as the one on the 560 Ti. Complexity has increased with Kepler, though, and so its thermal & power management somehow shine without increasing fan speed. -
The GTX 670 has the exact same blower-type fan as the GTX 560 Ti. No vapour chamber there.
Do you really think the official TDP from Nvidia is a lie? The TDP of the GTX 670 is 170W; the TDP of the GTX 560 Ti is 170W. End of discussion. It is not much hotter, if at all. And just for fair comparison, the 7870 has a TDP of 175W, so it should run equally hot too. But like I said earlier, they will undervolt the desktop GPUs to make them cooler anyway, with no performance hit, so the heat is not so important imo.
http://wccftech.com/wp-content/uploads/2012/03/02a_800x445-635x353.jpg
Yes, the GTX 670 draws more power than the 7870, I am not arguing there, but listen to this:
The GTX 560 Ti, which the 580M is based on, draws 159W at peak, i.e. when gaming in extreme mode. The HD 6870, which the 6990M was based on, draws 128W on average. AMD only had to sacrifice 28W; Nvidia had to sacrifice 59W to fit into 100W. The GTX 580M STILL beats the 6990M, even though Nvidia had to sacrifice more.
Now we have the GTX 670, which the 680M would be based on, drawing 152W at peak. The HD 7870, which the 7970M is based on, draws 115W. AMD only had to sacrifice 15W this time, but Nvidia would have to sacrifice 52W.
Don't you think we should see some similarities this time as well?
BUT last time, with the 560 Ti and 6870 competing, there was very little performance difference between them. Now, with the GTX 670 and 7870 competing in notebooks, there is a HUGE performance difference between them. That is why I think Nvidia will crush the 7970M this time. -
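The "sacrifice" arithmetic above can be tabulated directly from the post's own figures. Note the 100W envelope is the thread's working assumption for mobile cards, not an official MXM number:

```python
# How much each desktop core had to be cut to fit the assumed
# 100W mobile envelope, using the wattage figures from the post.
MOBILE_ENVELOPE = 100  # W, the thread's assumed mobile power limit

cards = {
    "GTX 560 Ti (-> 580M)": 159,  # peak while gaming
    "HD 6870 (-> 6990M)": 128,    # average
    "GTX 670 (-> 680M?)": 152,    # peak
    "HD 7870 (-> 7970M)": 115,    # average
}

for name, watts in cards.items():
    cut = watts - MOBILE_ENVELOPE
    print(f"{name}: cut {cut}W ({cut / watts:.0%})")
```

This reproduces the post's 59W/28W and 52W/15W pairs, which is the whole basis of its "same situation as last generation" analogy.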
That's my ONLY question. A blower fan doesn't mean it's the same HSF. The fan on the 560 Ti is in the centre and exhausts into the case; the 670 exhausts outwards. You guys are really odd. The HSFs are different designs and have different cooling capacities.
-
1344 cores like the 670 model.
I sure hope so -
The 670 at load is less noisy than the 7870 at idle, and you still talk about heat? Zero reviews complained about the 670's temperature; the 670 is not struggling to keep the GPU below 80C even at minimum fan speed, unlike AMD cards.
Even if the 7970m were to run at 10C at load, in theory no one cares, since Nvidia only has to compete within the limit of roughly below 90C, not 10C. So again, all of your 7970m info is not relevant and pure nonsense.
You were proven wrong on the wattage issue, so you moved to "but the temperature!", and you were proven wrong on that too, so we await your next nonsense argument -
Then why does it score 6.5K-7K, unless we're getting a 680M rev 3.0?
If the 680M has 1344 CUDA cores, I will literally post in this thread apologizing, along with an admission that I'm wrong.
-
-
-
AMD 7970m vs GTX 680m
Discussion in 'Gaming (Software and Graphics Cards)' started by x32993x, Apr 20, 2012.