Just waiting for the Dual 680M SLI!
-
The 680 is stronger than the 7970, but you have to go way beyond the normal call of duty to prove that fact. This goes for Vantage and 3DMark11.
But again, it's just too much work to make that happen. The 7970 can jump out of the gate doing these things up to a point... then of course more in-depth means are needed to go farther.
As of right now they are not maxing out the 3GB cards in the desktop world, so 4GB might come into play long before the next cards roll in (speculation of course). -
Meaker@Sager Company Representative
Considering that even at 3-monitor resolutions SLI 680s with 4GB see little to no benefit, I doubt two 680Ms will at 1080p.
-
I knew it. Nvidia must have thought "oh s..., the 7970M is 30% more powerful", so they did the same to get a 6000-ish score. But now it should be 100W, after all Kepler is similar to AMD GCN in performance per watt.
Imagine if AMD weren't around; Nvidia would be selling a crappy P4600-P4900 GTX 680M at a ridiculous price. -
It is not 100W according to some tech heads in this forum who have studied the MXM card. At least that's what they guess.
-
Can you give me a link to this?
-
Magic has not yet supplanted physics.
You need to give up on your dream. The 75W-rated GPU is already here, and it's called the GTX 660M. They are not going to double its performance at identical consumption. -
I was just about to say... you can't just combine the picture of the early prototype with the supposed 6000+ score of the alleged newer version. That wouldn't make any sense, Cloud.
Sent from my GT-I9001 using Tapatalk 2 -
Yes it does make sense. What I've been trying to explain to you people is that the P4900 score is from the first 680M, NOT the one in the picture. So it might score P6000.
@Kevin: They can make a 75-80W card if they want. We had the 75W 560M and the bigger 80W 570M earlier. Well, not this big a jump in performance, but hey, it might happen. Anyway, I would be satisfied with P4900 too.
Plus an 80W 680M makes room for a 100W 685M.
Anyway, the people who know their stuff think it's 75-85W. Let's just wait and see if they are right or wrong.
-
Logic does not follow in the footsteps of your conclusions...
Nvidia is not going to magically pull out a 75W mobile card that is 15-20+% faster than the 7970M. I'm sorry, it is just not going to happen; even if they could, they would not be putting 4GB of RAM onto it.
We all want Nvidia to put out a monster mobile GPU, but looking at this from a non-biased, logical perspective, that just doesn't seem to be in the cards for Nvidia atm. -
Yes, they will probably release something that scores 4800-5200 and later this year release an updated model, like they did with the 480M and then the 485M.
-
So there is a chance there is going to be a 685M, and maybe a rival AMD 7990M, later this year?
-
Don't know... can say for sure though that something faster will be released at some point in the future.
It depends on whether it is needed. If the 480M wasn't crap, there wouldn't have been a 485M. If the 6970M wasn't slower than the 580M, arguably there wouldn't have been a 6990M, which offered similar power to the 580M at a much cheaper price. -
You could bet on this and win a LOT of money.
Kidding, but my real question is whether something better than the 7970M/680M is coming this year, because of course next year there is going to be something better. -
Hi guys, just found this on the pcspecialist.uk forum, which is a UK Sager reseller: "We are planning our Q3/Q4 schedule and as many of you are aware the 4GB Nvidia GeForce GTX 680M will be released during this period and we expect to receive stock from September onwards, according to the current schedule.
We are seeing pricing that is approximately £180 more than the 2GB HD 7970M but according to benchmark performance tests it will only be around 10% faster than the 2GB Radeon HD 7970M." -
That's just ridiculous...
-
4GB is just ridiculous.
-
Haha. At least it's going to be more powerful then.
ETA: September and onward? Happy waiting, Nvidia fanboys.
-
So it should be around the same time that AMD sorts out their drivers for the 7970M, then.
-
Seems their crazy marketing is working just fine!
And how is that extra RAM supposed to help performance??
-
You nvidia fanboys are ridiculous... The drivers are not that bad at all... You really need some new ammo
-
So let's say most people who want a 7970M have it by July 1; Nvidia fans will end up having to wait 2 to 3 months and pay more for a minimal performance increase.
Yeah, I think I am happy that I bought my notebook now. -
Didn't the leaked picture and the P4.9k score go hand in hand? So how can that already be the revised GPU model??
-
By not bottlenecking something like Crysis 2 + ultra-res texture pack + 4xAA/AF etc., which happens on 1.5GB and 2GB Nvidia cards that CAN run those visual settings. Not denying, though, that overspeccing on VRAM is rife, especially on the low-end cards which don't have the grunt to run settings that would fill it half full.
And Nvidia had a slightly faster card for much-more-than-slightly more $$$ with the 580M vs the 6970M, and does it far more obviously; that's what both sides do most of the time when they have "the fastest".
And can the AMD fanboys argue that AMD drivers *haven't* deserved their reputation? -
Meaker@Sager Company Representative
The 6990M is only slightly behind the 580M, to the point where the better dual-card scaling of the 6990M puts it on par with or slightly ahead of 580M SLI.
It's just that the 580M happens to have extra voltage options in the BIOS that let it pull ahead after tweaking. -
Maybe there is some new 23-inch gaming laptop with a 2560x1600 resolution coming and only Nvidia knows about it?
Or maybe one with 2 or even 3 screens? "lol" -
Meaker@Sager Company Representative
TBH, on 17 and 18 inch machines 2560x1600 would be great.
However, RAM would still not be an issue. -
A notebook with a 1600p resolution is just not possible at the moment, let alone a notebook with 2 or 3 screens.
Anyway, a 10% performance increase for £180? Pfft.
-
What would be the new standard in a few years?
2160p (1080p x 2), called Full 2HD?
Edit:
Answer:
http://en.wikipedia.org/wiki/2160p -
Kingpinzero ROUND ONE,FIGHT! You Win!
Remember also that although it's a "gimmick", as people say, Nvidia's own 3DTV Play and 3D Vision can be used with an HDMI output and a compatible HDMI 3D TV set.
That requires a lot of VRAM. Still, I think 4GB is a bit overkill, but if the 680M turns out to have the same power as a desktop 670, which is only marginally inferior to the 680, the big VRAM would help games such as The Witcher and Crysis even in the near future, plus their 3D Vision feature. Also Kepler can output to 3 monitors like Eyefinity, and along with Nvidia Surround you can output 3D as well. So it would be Nvidia 3D Surround.
And that requires a lot of VRAM as well.
As an example, two GTX 580s with at least 1.5GB of RAM are essential for an Nvidia Surround setup.
Even though Kepler doesn't need SLI for multi-monitor setups, it still requires a good amount of RAM. -
WOW!
When are we gonna see laptops with a higher resolution than 1080p?
@AhPek00:
Just think of a laptop built like a Nintendo 3DS.
@Kingpinzero:
GTX 670 desktop performance would be GREAT! -
Kingpinzero ROUND ONE,FIGHT! You Win!
Indeed it would. Even if Nvidia comes out with a GTX 660 Ti, which could be used as a base for our "probable" 680M, it would be enough for everything,
since mostly it would compete with the desktop 7870, which is what the 7970M is based on.
The current state of things on desktops shows that Nvidia has two cards which compete with the AMD 7970: the GTX 680 and 670.
I really hope they will come out with something decent. -
It's not; Cloud is just, how do you say... floating amongst the "clouds" (chuckles).
-
1600p would be nice. And dual-screen laptops... sure:
Lenovo ThinkPad W700ds -- A Review of the Lenovo ThinkPad W700ds
Dual-screen laptops by gScreen; the SpaceBook is a dual 17-inch notebook.
They are and have been around. -
Meaker@Sager Company Representative
2160p would be 4 times the pixels of 1080p. -
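(A quick back-of-the-envelope check of that pixel math, as a rough Python sketch; the 1920x1080 and 3840x2160 frame sizes assumed here are the standard 16:9 ones.)

```python
# Pixel counts for the two standard 16:9 frame sizes (illustrative arithmetic only).
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_2160p = 3840 * 2160   # 8,294,400 pixels

# Doubling both width and height quadruples the pixel count.
print(pixels_2160p / pixels_1080p)  # 4.0
```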
Meaker@Sager Company Representative
I missed this post, Cloud... What? You put 1W in, you get 1W out; law of conservation of energy. -
But not all of it is heat. So what Cloud is saying about a power/heat constant is plausible.
-
All of it IS heat. There is no macro level potential, kinetic, chemical, or nuclear energy created. The energy is completely dissipated in resistive heating as currents travel through traces and vias.
-
Ummm, are they really going to make this?
Would be epic.
-
No they didn't. It's all the sites that are slow and didn't pay attention to what was happening in the Chinese scene. I made the whole Kepler thread, I found a bunch of stuff. Believe me, I followed everything.
The P4900 score was leaked in early April on a Chinese forum. P4600 was the stock score, P4900 was the overclock score. They couldn't get a GPU-Z screenshot from it, but they noted that it was reported as GK106. Look up the "HURRAY, 600 series not only Fermi" and the M17xR4 speculation threads. I posted the scores there.
Then the 7970M was tested in this forum in late April, followed by the official launch.
Then, in early May, the photo with a guy holding the MXM 680M card was posted on the internet. It was someone in the barebone MSI community who got a hold of the MXM card. They reported it as a revised 680M, and it was now a GK104 680M. And the score I posted yesterday is from a Chinese forum that has seen the GK104 680M; it apparently broke P6000 in 3DMark11. Tech people in this forum have studied the MXM card and think it's 75-85W.
So the P4900 score and the MXM picture you see in all the 680M articles right now are not related at all. They are mixing a GK106 score with a GK104 picture.
This is what happened, and it's a complete mess since rumor after rumor has followed mobile Kepler. It's all guesswork, based on very little information.
----------------------------
And no, I'm not doing the whole heat/watt discussion again. We had that a month ago and I'm done with it. -
Chances are 4K resolution will come into play: 3840x2160 or 4096x2160, as we're already editing video at these resolutions today and there are lots of cameras and projectors that use them; it's just now coming to consumer displays.
http://en.wikipedia.org/wiki/4K_resolution -
Uh, no. Power (e.g. watts) = resistance × current². By virtue of being a closed circuit, there has to be a current flowing out of the CPU (otherwise the CPU wouldn't have power, because there would be no current flowing through it, i.e. an open circuit). And by virtue of the fact that CPUs/traces/wires/etc. are not made out of superconductors, there will be resistance. Therefore there is power leaving the CPU. So what you're saying is that the CPU turns all the input energy into heat, and then magically creates more energy to output. Which is clearly untrue.
The obvious explanation is that the CPU does not "use" all of the energy. Just like an electric motor does not turn all of the input energy into mechanical energy, a CPU does not turn all of the input energy into heat. The heat output will be slightly less than the power going in minus the power going out. And since there will always be power going out (by virtue of being a closed circuit, as explained above), the heat output will never be equal to the power going in.
Also, even if all the energy going into the CPU did get transformed into some other type of energy, there would be some kinetic energy, manifested as the physical degradation of the CPU. Also, there would be a small amount of EM radiation and sound. So either way, not all of the energy gets turned into heat.
Well, that's as far as I figure it at least. -
Yes - after four years of education in electrical engineering with an emphasis on high-speed circuits, I am fairly well aware of the closed-loop nature of circuits.
Power is at the same time defined by voltage × current (assuming zero phase difference between the two). The port at which current leaves the circuit has no energy with respect to the ground of the circuit. That is not to say that you cannot extract more energy if you connect that to a more negative potential, which will indeed cause more energy conversions. When a manufacturer says a chip consumes 100W, it does not mean that it will somehow have 100W pass through it and then return an amount to the network. That's simply how electrical power works.
Essentially all of the power loss in a chip is due to I²R heating, as you mentioned. I noted specifically that there is no MACRO kinetic energy (coherent, unidirectional motion of large masses). Random motion by individual atoms in a lattice structure such as that of silicon is characterized as thermal energy. Temperature is a function of the average kinetic energy possessed by the particles.
Furthermore, electrons traveling in a circuit are not like water rushing out of a pipe. There are two types of currents involved in silicon - diffusion and drift currents. In both cases, at the crystal-lattice level the electrons do not in any way travel straight. If you look up the figures for diffusion coefficients or electron mobility and factor in the typical carrier concentration differential and the electric fields established, the net directional velocities of electrons are on the order of at most centimeters per second. This is because electrons bounce back and forth within the structure, and the NET motion (e.g. 1 cm forward, 0.999 cm backward, 1 cm forward, 0.999 cm backward) is the "current" flow. It's not like 1 kg of water rushes out of a pipe at 1 m/s possessing 0.5 J of macro-level kinetic energy.
Lastly - yes, the device will cause time-varying EM fields due to switching currents. However, not all transistors switch at the same time, and not all currents flow in the same direction. The net EM field emitted by the chip is usually low in magnitude, as sub-components cancel. As well, in order for an actually significant amount of energy to be transmitted to the environment through electromagnetic coupling, you'd need a certain amount of inductive/capacitive effects between the chip and the environment, which you don't have. -
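(To put a toy number on that argument: a minimal sketch, with made-up values, modeling the chip's load as a few purely resistive branches across a supply. The delivered electrical power, V × I, and the summed I²R dissipation come out identical, i.e. essentially all of the consumed power ends up as heat.)

```python
# Toy model of resistive dissipation in a chip; the voltage and resistances are
# made-up illustrative values, not real GPU figures.
V = 1.0                      # supply voltage across the die (volts)
branches = [0.5, 2.0, 10.0]  # parallel resistive current paths (ohms)

currents = [V / R for R in branches]                             # Ohm's law per branch
power_in = V * sum(currents)                                     # delivered power: V * I_total
power_heat = sum(I * I * R for I, R in zip(currents, branches))  # I^2 * R per branch

print(power_in, power_heat)  # equal: the delivered electrical power is dissipated as heat
```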
I think you misunderstood my initial statement. My statement was that not all of the power going in will get turned into heat. What you seem to be saying is that the power that does get consumed gets turned into heat.
So using the previous example, if you put 1W in, you will get 1W out in some combination of energy, but not all of the 1W will be heat. -
LOL!
The design is just ridiculously funny! Thanks for the links.
-
R3d and min2209, I'm just gonna go ahead and give you two some rep. Nice round you had going there.
-
The W700ds may be funny, but that unit was a dream for content creators; too bad it never caught on more.
We still have one at work. -
Looks like the 680M will be more expensive than the 7970M, yes. At least in Turkey. Is $170 extra worth it?
MONSTER® Q61P170EM07 17.3"
3610QM, 8GB DDR3, 500GB HDD, 2GB 7970M
Price: 4507 Turkish Liras
Price: $2429
MONSTER® Q61P170EM07 17.3" - MONSTER® Notebook » 17.3" P170EM ( HD7970M )
MONSTER® Q61P170EM08 17.3"
3610QM, 8GB DDR3, 500GB HDD, 4GB 680M
Price: 4826 Turkish Liras
Price: $2600
MONSTER® Q61P170EM08 17.3" - MONSTER® Notebook » 17.3" P170EM ( GTX680M ) -
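(Rough arithmetic on those two listed system prices, nothing more; note the roughly 7% figure is for the whole notebook, not the GPU alone.)

```python
# Quoted full-system prices from the two listings above.
price_7970m_usd, price_680m_usd = 2429, 2600
price_7970m_try, price_680m_try = 4507, 4826

delta_usd = price_680m_usd - price_7970m_usd     # 171 USD premium for the 680M config
delta_try = price_680m_try - price_7970m_try     # 319 TRY premium
premium_pct = 100 * delta_usd / price_7970m_usd  # about 7% on the whole system

print(delta_usd, delta_try, round(premium_pct, 1))
```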
It is worth it for those of us whom I would not necessarily label as "Nvidia fanboys", but who have come to consider the level of compatibility and efficiency of Nvidia drivers as the standard.
I have nothing against ATI and recognise that the 5870M and 6990M trumped their Nvidia counterparts in terms of price/performance ratio; however, (relatively) poor driver support and the lack of proper vsync and triple-buffering support via CCC always led me to opt for the Nvidia counterpart.
Again, as I currently own a 580M system, although I have been salivating over the 7970M for a good few months now, I would gladly wait until September to pick up a card that may only be 10% faster at a 50% price increase (a 10% overall system price increase for me), but that also provides me with peace of mind. -
Well, you are not alone. There are many, many people who buy Nvidia year after year because they are satisfied with the driver support, Optimus, the stability of their GPUs and the extra performance over AMD. It's really bad though, because it sort of becomes a habit after a while. It's like with BIOS updates: why change when you have been satisfied with Nvidia for so many years? (For those who didn't catch what I meant: they say you shouldn't update the BIOS when there is nothing wrong with your machine.)
Plus not everyone (like myself) cares too much about price. At least not $170. That's not even half an SSD.
But I do really understand those on the opposite end: why pay extra for Nvidia when you are satisfied with AMD?