Hello everybody!
I plan on buying a new laptop and I can't decide between two options.
Should I buy:
1) Intel Core i7 3610QM processor together with NVidia GTX 670M graphics card
or
2) Intel Core i7 3720QM processor together with NVidia GT 650M graphics card.
I'd love to buy 3720 + GTX 670M, but this combo is a couple hundred bucks above my budget. The two options outlined above are about the same price, but I can't decide which one is a better choice.
I want to buy a laptop that will not become outdated for at least 3 years and, although I am buying it primarily for gaming, I won't play the most graphics hungry shooters. The laptop is expected to play Civ5 and GTA IV on high settings.
-
Meaker@Sager Company Representative
Well, the 3610QM + 670M combo is more powerful, but if you let us know your location and budget we could perhaps advise you on an even faster machine.
-
It depends on the resolution you aim to play at; the GT 650M kicks ass at 720p/768p, especially with a Core i7 3720QM.
-
Meaker@Sager Company Representative
After overclocking, though, the 670M will leave the 650M in the dust.
-
-
Meaker@Sager Company Representative
That's the base price of an Alienware M17x with an HD 7970M. I bet if you called them you could grab a 1080p display too for that much (or find a voucher).
-
There is no doubt that the 3610QM + GTX 670M is the better choice.
-
The 670M. However, if your budget is $1700, I would strongly suggest you start a thread in the "What notebook to buy" subforum.
-
Sager NP9150 / Clevo P150EM - Gaming Laptops
FORCE 16F3 / MSI 16F3 - XOTIC PC - FORCE 15.6" (GTX 670M) Custom Gaming Laptop -
-
Karamazovmm Overthinking? Always!
The Malibal Satori is the same model that ikas posted, just the 17" variant; it's called the NP9170.
I would leave everything at stock and upgrade the GPU to a 7970M. If you want, the price on the Crucial M4 is quite good there; I would get the 120GB version or even the 256GB, and just call it done.
$1700 is quite a high budget, so there is no need to settle for those GPUs.
You could also give AW a call and haggle; since a 1080p + 7970M M17x would go for $1850, you can try to shave off those $150. -
The 670M should take around 50W more for about a 20% maximum boost in performance over a 650M GDDR5, so roughly double the power consumption to perform better.
A GT 650M can be overclocked far more, while the 670M is already at its limit, and I doubt the power supply could handle it. An overclocked 670M would take around 250W versus around 150W for an overclocked 650M; the 650M is newer tech and should overclock better, as it is nowhere near the limit of 28nm.
At the end of the day, the 670M is like a Formula 1 car that is fastest around the track in qualifying but finishes last in the race because it takes too much energy to achieve that qualifying pace, while the 650M is like a car that is reasonably fast in qualifying but has twice the race pace of the 670M overall. That is a massive difference. You could have two people playing on 650M machines overclocked to the same performance as a 670M and together consume the same graphics power as one 670M. -
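For reference, a minimal sketch of the performance-per-watt arithmetic being claimed in the post above; the wattage figures and the 20% performance gap are the poster's assumptions (roughly 50W extra for roughly 20% more speed), not verified GPU specifications.

```python
# Rough perf-per-watt comparison using the figures claimed in the post above.
# Both power numbers and the 20% performance gap are assumptions from the post,
# not measured or official specifications.

gt650m_power_w = 45.0       # assumed GT 650M GDDR5 draw under load
gtx670m_power_w = 95.0      # assumed GTX 670M draw (~50 W more, per the post)
perf_650m = 1.00            # GT 650M performance normalized to 1.0
perf_670m = 1.20            # ~20% faster, per the post

ppw_650m = perf_650m / gt650m_power_w
ppw_670m = perf_670m / gtx670m_power_w
print(f"GT 650M : {ppw_650m:.4f} perf/W")
print(f"GTX 670M: {ppw_670m:.4f} perf/W")
print(f"Under these assumptions the 650M is {ppw_650m / ppw_670m:.1f}x more efficient")
```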
That formula 1 analogy just confused me. I also don't understand why half of your posts include a car analogy.
Either way, it looks like the OP is taking the wise advice of some of the previous posters by getting a Clevo P170EM with a 7970M. -
Review MSI GT70 Notebook - Notebookcheck.net Reviews
217.6W under extreme CPU and GPU load; with a 40nm part, the wattage will climb quickly if you overclock.
GT 650M:
Review Clevo W110ER Barebone Subnotebook - Notebookcheck.net Reviews
Alienware M14x R2 Review - Notebookcheck.net Reviews
Review Acer Aspire V3-771G Notebook - Notebookcheck.net Reviews
It normally takes around 112W in a stress test versus 138.6W. I know the 670M may take 180W in other machines, but the peak stress figure still shows how power hungry it can get with everything pushed: 217.6W.
It seems about right though, as the 670M takes about 20W more than a 7970M and the 7970M is twice as fast. -
-
Is NVIDIA crazy? £295 more for a 680M over a 7970M? You could have two 7970Ms for that much, which would perform basically twice as well.
-
Charles P. Jefferies Lead Moderator Super Moderator
Depends on how much you're willing to spend, but the ASUS G75VW might be within your budget. There are some lower-priced models in the $1k range.
Here's our review - one of the best mainstream gaming notebooks out there:
ASUS G75VW Review: The Best Mainstream Gaming Notebook Gets Better
The GTX 670M models might be out of your range though. The GTX 660M isn't as fast (15-25% behind the 670M) but it's a heck of a lot faster than the GT 650M. -
-
Meaker@Sager Company Representative
At 800/950 (core/memory), the 670M is going to leave the 650M far behind.
-
-
Meaker@Sager Company Representative
Every decently designed board has built-in overhead.
-
In car terms, the 670M is like a car that does 0-200 in 10 seconds and gets 10mpg, while the 650M does 0-180 in 10 seconds and gets 25mpg. Which one would you prefer? That's the question. -
Please just stop with the car analogies. Contrary to the purpose of analogies, they make it harder to understand what you are trying to say.
Ok, so the 650m gets a little more performance per watt. But who really cares? You have an expensive gaming rig and the purpose is to play games. Unless you plan on playing on battery, power consumption shouldn't matter at all. If you are dropping $1700 on a laptop, who cares about at most a $1 difference in electricity usage every month at the cost of a big loss of graphics performance? -
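To put a number on the "at most a $1 difference" electricity claim above, here is a quick back-of-the-envelope sketch; the 80W load difference, 4 hours of gaming per day, and $0.12/kWh rate are all illustrative assumptions, not figures taken as fact from the thread.

```python
# Back-of-the-envelope check of the "at most $1 a month" electricity claim.
# All inputs below are illustrative assumptions.

delta_watts = 80.0        # assumed extra draw of the faster GPU under load
hours_per_day = 4.0       # assumed daily gaming time
price_per_kwh = 0.12      # assumed electricity price in USD/kWh

monthly_kwh = delta_watts / 1000.0 * hours_per_day * 30
monthly_cost_usd = monthly_kwh * price_per_kwh
print(f"Extra energy: {monthly_kwh:.1f} kWh/month -> about ${monthly_cost_usd:.2f}/month")
```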
The GT 650M isn't a bad card by any means, but it definitely isn't stronger than the 670M. It's perfect for someone who wants a more portable computer with a longer-lasting battery but can still game relatively well without breaking the bank. For straight-up gaming, however, the 670M kills it, and is in turn killed by the 7970M. -
With such a budget, I would go for the 7970M.
It's roughly 50-70% better than the 675M (which is a renamed 580M; the 670M is a renamed 570M, if I'm not mistaken). -
A little? So 80W gaming versus 160W gaming is a little difference? There is a reason it gets 2x the performance per watt.
The 650M GDDR5 is virtually 10% slower than a 670M. Put it this way: anyone can overclock a 650M by 10%.
Alienware M14x R2 Review - Notebookcheck.net Reviews
In games the 650M GDDR5 takes 82.1W, versus the 670M which takes 120-130W in 3DMark06; with a 220W power adapter and a 3610QM CPU the 670M system takes 217.6W versus 138.6W for the 650M system, which is basically an 80W difference. I am sure the manufacturers using smaller power adapters are throttling the 670M; if you wanted to overclock the 670M and get better stress-test performance, you would need at least a 250W power adapter to have a chance.
Review MSI GT70 Notebook - Notebookcheck.net Reviews
It makes sense, as with Kepler a 20W card should be equal to a 40W card. I wouldn't be surprised if the GT 620M were a 17W part performing the same as the 35W 540M. -
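As a minimal sketch of the whole-system comparison above, here are the full-load figures the poster quotes from the Notebookcheck reviews (these are whole-notebook wall measurements as cited in the post, not isolated GPU draw):

```python
# Whole-system full-load power figures quoted in the post above
# (Notebookcheck measurements as cited by the poster, not GPU-only numbers).

load_650m_system_w = 138.6   # Alienware M14x R2 with GT 650M, full load
load_670m_system_w = 217.6   # MSI GT70 with GTX 670M, full load

delta_w = load_670m_system_w - load_650m_system_w
ratio = load_670m_system_w / load_650m_system_w
print(f"Difference at full load: {delta_w:.1f} W (the 'basically 80W' claim)")
print(f"The 670M system draws {ratio:.2f}x as much at the wall")
```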
The 670M is a stronger card than the 650M. Period. Nobody is disputing that the 670M is less efficient and draws more power. For straight-up gaming performance the 670M beats the 650M with no ifs, ands, or buts. Besides, who games heavily without being plugged in?
P.S. Thank you for not using a car analogy. I appreciate it. -
Well, I prefer, let's say, a 140W machine with latest-gen components over what would have taken 250W in 2008 or 200W in 2010.
-
Meaker@Sager Company Representative
The 650M is what, around 2200 points? The overclocked 670M can reach over 4000 points in 3DMark 11.
That's not some tiny difference. That puts them in different performance brackets. -
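For what it's worth, a quick sketch of the gap implied by those 3DMark 11 numbers; both point figures are the poster's rough estimates, not benchmark results verified here.

```python
# Ratio implied by the 3DMark 11 scores mentioned above (poster's estimates).
score_650m = 2200          # estimated GT 650M score, per the post
score_670m_oc = 4000       # estimated overclocked GTX 670M score, per the post
print(f"Overclocked 670M is about {score_670m_oc / score_650m:.1f}x the 650M's score")
```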
IMO you have to be crazy to get a 670M or 675M just because it outperforms a card that takes similar power to a GT 540M.
Google Translate
An 85.5W stress test is not bad, and see Notebookcheck user HTWingNut's power consumption measurements on the GDDR3 version:
http://i.imgur.com/BrWp7.jpg
I prefer that to a power-hungry monster in a laptop. A GT 650M is more of a low-profile card, while the GTX 670M is a power-hungry beast that needs lots of fans just to stay cool; good luck overclocking it, especially with power adapter constraints. Kepler being 2x as power efficient is massive.
Imagine: desktop PCs four years ago took 300W to do what a laptop now does at around 85W (or around 150W if it's 40nm Fermi).
Look at the 8800 Ultra taking 371W in 2007 with a Core 2 Duo X6800:
NVIDIA GeForce 8800 Ultra review
Now look at the 3DMark06 score: 11864 with an X6800 CPU:
NVIDIA GeForce 8800 Ultra review
The GT 650M scores 12557 in 3DMark06 with an i5 3210M CPU, which scores 3580; the X6800's CPU score is around 2608: Intel Core 2 Extreme X6800 & Core 2 Duo E6700 Processor Review - Page 9 | Sharky Extreme
So overall, a laptop that now takes 85.5W outperforms a five-year-old desktop by 30% on the CPU side and just about beats it on the GPU side, while that desktop takes 371W, which is over 4x as much electricity. When you think of it like that, it's amazing.
If people are desperate to get the top of the line of the old generation over a low-to-mid-range GPU of this generation like the 650M, then go ahead. Put it this way: if NVIDIA released a 75W card, it would easily beat the 675M by 30%. -
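A minimal sketch of the efficiency comparison in the post above, using the 3DMark06 scores and system power figures the poster quotes (2007 desktop with 8800 Ultra + X6800 versus 2012 laptop with GT 650M + i5-3210M); the numbers come from the post, not independent measurement.

```python
# Points-per-watt comparison using the figures quoted in the post above.
desktop_score, desktop_power_w = 11864, 371.0   # 8800 Ultra + X6800 system
laptop_score, laptop_power_w = 12557, 85.5      # GT 650M + i5-3210M system

print(f"2007 desktop: {desktop_score / desktop_power_w:.1f} 3DMark06 points/W")
print(f"2012 laptop : {laptop_score / laptop_power_w:.1f} 3DMark06 points/W")
print(f"Power ratio : {desktop_power_w / laptop_power_w:.1f}x (the 'over 4x' claim)")
```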
Actually, the GT 650M isn't that bad IF you get the GDDR5 model. Note that the chart on Notebookcheck exaggerates the difference a little, since the GT 650M and the GTX 660M have the exact same Kepler core; with GDDR5, the 650M essentially equals the 660M minus the 100MHz base clock.
I noticed these mobile Keplers can boost about 115MHz above the base clock, while MSI Afterburner can push them beyond the boost clock up to a certain limit.
The GDDR3 650M could reach a 1085MHz clock on stock voltage using MSI Afterburner (which is the limit Afterburner allows, by the way); I expect the GDDR5 model to reach at least 985MHz, accounting for the 100MHz base clock deficit.
You could get quite a beastly card with a 660M at 1085MHz clocks AND GDDR5.
According to Notebookcheck, the 660M is within spitting distance (2-3%) of the GTX 570M, so I would assume the 650M GDDR5 is about 10% behind due to the 100MHz lower clock speed.
I think the common overclock for the 570M is 800MHz? So that is a 39% overclock.
I believe 985MHz for the 650M is a 34% overclock, and if the GT 650M could reach 1085MHz it would equal the 660M.
Assuming good overclock scaling for each, I think the GT 650M would only be about 10% behind the 800MHz 570M judging by the numbers, and the 660M would be about equal, but you would enjoy a roughly 25W power advantage, which means it runs cooler.
Therefore, I don't think the GTX 570M is a good idea; it doesn't seem to fit either high performance or high efficiency. For this budget, I'd recommend going the whole stretch and getting the 7970M for absolute performance (the 680M is a little too pricey for my taste), e.g. in a Clevo P1510, or getting a 650M/660M in a smaller/thinner laptop for a more balanced machine, plus maybe a bit of pocket change for a nice SSD too. -
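To make the overclock percentages in the post above concrete, here is a small sketch; the base clocks (roughly 735MHz for the GDDR5 650M, i.e. 100MHz below the 660M, and roughly 575MHz for the GTX 570M) are inferred from the poster's own percentages and should be treated as assumptions.

```python
# Overclock-percentage arithmetic from the post above. Base clocks are inferred
# from the poster's figures and are assumptions, not verified specifications.

def oc_percent(target_mhz: float, base_mhz: float) -> float:
    """Overclock expressed as a percentage above the base clock."""
    return (target_mhz / base_mhz - 1.0) * 100.0

print(f"GTX 570M 575 -> 800 MHz : {oc_percent(800, 575):.0f}% OC")    # ~39%, as stated
print(f"GT 650M  735 -> 985 MHz : {oc_percent(985, 735):.0f}% OC")    # ~34%, as stated
print(f"GT 650M  735 -> 1085 MHz: {oc_percent(1085, 735):.0f}% OC")   # the Afterburner limit mentioned
```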
Meaker@Sager Company Representative
The 650M only gets close to the 570M because it is already using its turbo in those scores.
You are not going to get much more out of it, whereas the 570M/670M still have everything to give.
It will make the difference at 1080p.
Also, the difference between the 650M and 660M will be larger: not only is there a 100MHz base clock difference, but the increased TDP of the 660M will allow it to turbo higher and for longer. -
I am amazed at how deluded people are, especially someone with a 7970M. At the end of the day a 670M takes 30-40W more than your 7970M, and a GT 650M takes half what your 7970M does, so by your logic the 670M is an embarrassing rebrand of a card. In simple terms, you could run two laptops with dual-core i5s and GT 650M GDDR5 cards for less power than one 670M laptop. -
I don't even know where to begin.
Meh, I will go make myself a sandwich instead. -
Or, if you have been brainwashed by NVIDIA's rebranding: would you buy a GTX 570M, or even an overclocked 470M with slightly more shaders (still less than a 670M), over a GT 650M GDDR5? Because that's what a 670M is.
So let's say you were going into a shop to choose between two brand-new laptops, one with month-old technology and one with two-and-a-half-year-old technology. Would you prefer to buy outdated hardware just because it performs slightly better but needs a 220W PSU versus a 120W one?
Using my car analogy: say the 2010 model did 30mpg and the 2012 model did 70mpg and weighed less; you would choose the 30mpg one just because it was 10-20% faster. -
What reason is there to get the 650m? That it uses less power is pretty much it. If you don't plan on gaming on battery power, there is NO REASON to get the weaker card.
The 670m is quite a bit more powerful. This is a gaming laptop. The point is to play games. The 670m does this significantly better. And being more powerful, it will have a longer useful life. The one rule of buying a laptop to game on is to get the most powerful graphics card you can, as that is the limiting factor.
-
The GT 650M will have a longer life, as the 670M is power hungry and will more often than not fail. Using your logic, why not get a 7970M? After all, it's twice as fast as a 670M and takes around 20-30W less.
-
Ok, we get it. The 650m uses less power. Does it offer anything else over the 670m? No, it doesn't. So there really isn't any reason to get the 650m over the 670m.
-
There are new features:
What GK107 also brings to the table is NVIDIA Control Panel support for FXAA and TXAA, as well as NVIDIA's dedicated video encoding hardware NVENC. There's also DirectX 11.1 support and a notebook analog to the desktop's GPU Boost: the GK107 is able to dynamically increase its clock speed depending on the current thermal load within the notebook. We're not sure exactly how this is done, but it's essentially a GPU version of the Turbo Boost and Turbo Core technologies we've seen from Intel and AMD CPUs for some time. Finally, NVIDIA's Optimus graphics-switching technology makes its return.
NVIDIA was happy to announce a series of design wins with Ivy Bridge, but while we're privy to them we unfortunately can't share any details with you right now. We'll have to wait until Intel's Ivy Bridge embargo lifts for that information. Suffice it to say, expect to see a LOT of GeForce 600M GPUs on the market once Ivy Bridge is launched.
With that massive info dump out of the way, let's take a look at NVIDIA's 600M series proper.
AnandTech - NVIDIA's GeForce 600M Series: Mobile Kepler and Fermi Die Shrinks -
Power draw alone has nothing to do with a chip's longevity. If the chip is operating within its specified limits, it is no more prone to fail than an entry-level 520M, etc. Using your analogy of cars and computers, it's like saying a V8 is more prone to fail because it has two more cylinders than a V6. Not true.
In reality, it's the cooling of the chip and the voltage that matter most for chip life. I've seen 555Ms in the 100°C range while my 560M is at 80°C tops; if anything, the 555M is more prone to fail than my 560M because it isn't operating within its thermal limits, even though my 560M has a TDP 30W higher. The 560M is able to sustain higher clock speeds naturally because GPU cores are binned; the 555M is a defective 560M, just as the 650M is a defective 660M (both based on the same GK107 core). Your argument is very weak, especially considering notebook GPUs are generally not upgrade friendly. -
Xonar, the OP also said this:
I want to buy a laptop that will not become outdated for at least 3 years and, although I am buying it primarily for gaming, I won't play the most graphics hungry shooters. The laptop is expected to play Civ5 and GTA IV on high settings.
.......................................
Edit: these games should run fine as long as the CPU is fast.
The technology in a 675M is really already two and a half years old, so in three years' time it will basically be based on five-and-a-half-year-old technology.
In two years' time an entry-level GPU will most likely beat a GTX 675M, something like a Maxwell GT 820M or similar. It would be funny, as this roadmap shows 16 DP GFLOPS per watt versus around 2 for Fermi:
NVIDIA Exposes GPU Roadmap: Kepler Arrives 2011, Maxwell in 2013 - HotHardware
That's why I am waiting for Maxwell now. I know NVIDIA is maybe slightly too optimistic, but put it this way: once Maxwell comes out, the 675M will look the way a GeForce 6800 Ultra looks next to a 675M today. -
Nvidia's next big graphics chips explained | News | TechRadar
Interesting, although I believe it is now 2014:
The next architecture will be based on the 22nm Maxwell GPU, coming in 2013. "The improvement from Tesla [the chip Fermi replaced last year] to Maxwell should be about 40 times," Hsuang told TechRadar. "Fermi is about four times faster, relative to Tesla, so it's ten to 12 times faster from Fermi to Maxwell." -
1. Thank you for the quote, I almost forgot to read the FIRST POST of the thread and I forgot that my previous post was not at all in relation to the original post.
2. The 670M is a better GPU than the 650M. It is able to handle games at higher resolutions and higher FPS than the 650M, and therefore it will become outdated LATER than the 650M. I don't see how power efficiency matters one bit when GAMING PERFORMANCE is the value being compared.
3. What a terrible article to link. Kepler came out 1.5 years after that article stated it would and the article itself is from 2010. What makes you think Maxwell will be out in one year after the MASSIVE delay in Kepler? Keep waiting all you want, it's your choice. -
I really need another car analogy right about now because I'm trying to figure out which card is faster. A plethora of synthetic benchmarks and multitudes of gaming benchmarks with hard to understand frames per second comparisons are just too confusing right now. My brain can't handle this and could really use a car analogy. Oh please, one can only hope.
-
It's like driving a 2012 Ferrari when you could be driving a 2013 Honda!
-
What you have to remember is that a laptop is a portable thing. If the OP wants to game on battery, a GTX 670M laptop will just eat it up. I think the motherboards.org review of the 11.6" Clevo with the GT 650M said you could game for 4 hours. Anyway, my point is: what is the point of buying something based on basically two-and-a-half-year-old technology just for more performance?
It's the same as someone buying a QX6850 desktop over, say, an Ivy Bridge 17W ULV CPU just because the QX6850 beats it; it doesn't matter that the computer takes around 200W more to beat it by 20-30%, the QX6850 is faster so you should get the QX6850 desktop. That's basically what people are saying.
It's also like saying you would take a 130W Pentium 4 Extreme Edition over, say, a single-core Atom just because it performs slightly better than the 3.5W part. The new 32nm 3.5W dual-core Atoms beat the P4 Extreme Edition, so that's all I am saying: what's the point of getting outdated tech? The performance you think is fast is in fact virtually outdated hardware.
When I bought my laptop in January 2009, its hardware had been out for maybe 5 months; a 670M laptop is basically based on two-and-a-half-year-old technology. -
You can't game for four hours on a GT 650M unless you have a 160Wh battery. You might get 2 hours if you take measures like limiting the framerate, turning off boost and Hyper-Threading, and underclocking the GPU.
The GTX 670M is significantly faster than the 650M, period. Regardless of the tech used, it is a powerhouse. -
NVIDIA GT 650M Demo: 11" Gaming Laptop Running Battlefield 3 through 60" HDTV (AVADirect W110ER) - YouTube -
I know the 670M is faster; it would be embarrassing if it weren't. My point is, it's like telling someone (to take an extreme example) that 7950M GTX SLI is faster than an Intel HD 4000, so get the NVIDIA solution because it performs better. The point is that the 7950M GTX is 90nm versus the 22nm HD 4000: a massive difference in power consumption. Power consumption is a huge factor in a laptop, and electricity costs a lot more if you are running a 200W gaming laptop instead of a 100W one.
-
OMG, my head is about to explode. You know too much, nissan.