What program do you use to monitor your CPU usage during gaming? Should be interesting to see how stressed my CPU is during Crysis 1+2, Duke Nukem Forever, F.E.A.R. 3, Dungeon Siege III, etc...
-
Depending on which way you look at that figure, real usage could be 60%, since it is taking hyperthreading into account. In benchmarks, some games may show low CPU usage while gaining no benefit from the extra cores, so real usage is a lot higher than we think, in a sense. It all depends on whether a game performs better with the extra cores or not.
-
Hmmm, interesting info LaptopNut. I guess the best compromise would be to just monitor the wattage of the CPU, since it's directly linked to its load? Either that or (if possible) measure the load of all the transistors on the CPU die (including the L1-L3 caches). Otherwise you'd get skewed results whether you're monitoring real or virtual cores, don't you think?
-
I was just using Task Manager, but I will look at the individual core usage.
Edit: I attached it, and it looks like it's mostly using one core and doing other stuff on the other cores. Would it help performance-wise to disable hyperthreading, or not?
-
Use PlayClaw and enable its CPU and GPU overlays, then in settings enable per-core temps and load for the CPU, plus GPU utilization and frame buffer utilization. Then you'll see in real time what's being used. And for the person who said "in reality 60%", that's not really true; I've seen games take 80%+ on four cores while my recording program used the remaining cores without a hitch. It does make a nice difference.
-
I have observed a game reporting an average of 50% CPU usage in its benchmark that was reported as around 90% once hyperthreading had been disabled, yet both gave exactly the same gaming performance. It was in an ancient NBR thread and completely confused everyone until someone realised what was happening.
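To make that 50%-vs-90% discrepancy concrete, here is a pure-Python sketch with made-up numbers. It assumes a 4-core/8-thread CPU where logical cores are enumerated with hyperthread siblings adjacent (common on Windows, but an assumption here), and a game that fully loads one hardware thread per physical core:

```python
# Hypothetical per-logical-core loads (percent) on a 4C/8T CPU: the game
# saturates one thread per physical core, the HT siblings sit idle.
logical_loads = [100, 0, 100, 0, 100, 0, 100, 0]

# Task Manager-style figure: average over ALL logical cores.
reported = sum(logical_loads) / len(logical_loads)

# "Physical" view: a core is as busy as its busiest sibling thread.
# Assumes siblings are pairs (0,1), (2,3), ... in the enumeration.
pairs = zip(logical_loads[0::2], logical_loads[1::2])
physical = [max(a, b) for a, b in pairs]
real = sum(physical) / len(physical)

print(reported)  # 50.0  <- what the benchmark showed with HT on
print(real)      # 100.0 <- what it showed once HT was disabled
```

Same workload, same performance; only the denominator changes, which is exactly why the two benchmark runs matched.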
-
According to Soviet Sunrise, hyperthreading only makes a noticeable difference when your CPU is being pushed to its limits.
-
Have a 580M sli M18x on order. They are VERY orderable, fyi.
-
Larry@LPC-Digital Company Representative
The 580M SLI has been shipping for a few days now in the Sager NP7282 and Clevo x7200 R2 for a $650.00 upgrade price...
-
Right! The funny thing is that I installed PlayClaw just a few days ago; I just haven't really checked it out yet! ^^
-
Is the heatsink the same between the 485m and 580m?
-
There seems to be little difference in the power design.
-
hmmm....good question:
485M <--> 580M front
485M <--> 580M back
as far as the GPU core and memory chips are concerned, the layout is exactly the same, with the only difference being two additional voltage regulators (not sure, just speculating here) on the 580M. Are those included in the heatsink's coverage, thus having their own pads?
cheers
Edit: lol nice, we posted the same pics at the same time, you juuust beat me to it @kaltmond
-
Some tests and benchmarks on notebookcheck.com (in German; I guess the English version will be ready in a few days, but you can use Google Translate):
Test Nvidia GeForce GTX 580M Grafikkarte - Notebookcheck.com Tests -
Interesting. Pretty much confirms a ~10% upgrade over the 485M.
-
NotebookCheck.com has some concerns about the 580M overdriving the 180W power supply in the P150HM... I hope that's a bad 580M sample or a measurement error.
-
link? I can't find where it says that...
-
From Tirenz's link above (translated to English for you).
Read down to Energy Consumption -
Lol, for the 3DMark 11 test I get better results with my 485M; I get around P3580, and in Bad Company the 485M did about 3.5 fps better. I'm happy I didn't shell out $200 extra for the newest card. It also seems to actually be less efficient: according to the power consumption test, it was pulling 214 watts at one point. And Bartman, isn't that pretty much the same rig as you are getting?
-
Yes, I went for the 580M that just went into Phase 2 today. If it throttles under heavy load, I'll return it in a heartbeat.
-
Larry@LPC-Digital Company Representative
Please remember this: Some of you may remember when the 485M started shipping in January this year the early numbers for benches and such were not that impressive.
It wasn't until late February when the updated drivers came out designed for the 485M that really demonstrated what the card could do.
The same will be true for the 580M. The driver situation for this card at its release is not very good, just like it was for the 485M when it was released. There are no certified nVidia drivers for this card out yet. But stand by...
-
nVidia is really funny. Throwing out a card with no drivers... is like selling a car you can't get fuel for.
-
Yes, but also keep in mind that this new card is pretty much the exact same thing, just "power optimized". And according to their tests it sucks up a lot of juice. If the drivers improve the 580M, they will improve the 485M as well.
-
Larry@LPC-Digital Company Representative
There are really good certified drivers for the 485M, but not for the 580M. I would not expect the 485M to see any real benefit from the nVidia-certified drivers for the 580M. At least not much.
-
Phase 2? Is this like a batch 2? I'm getting the same spec as you, except with the 500GB Momentus XT hybrid SSD.
-
Sorry.. XoticPC's "Phase 2" - Sent for Custom Building / Testing / Burn In & Final QA. I've agonized over this purchase since January, and I would hate to be wrong about the 580M.
Good choice: I originally ordered the Hybrid SSD too, but changed my mind since I will move my Vertex2 to my P150HM. -
I sure hope the power usage figure is not right, and that the heat is not in the 90s as indicated. That's two different tests I think I have seen with temps in the 90s, and now learning it may use more power... that's crazy.
The 485M can be overclocked past the stock settings of the 580M, and it doesn't even get into the 90s and still uses less power than what this is showing for the 580M.
I have a 45W CPU too, so my poor 180W PSU may give out with that plus this card, and no one makes a better PSU without somehow making your own connector end for it.
I'm sure the drivers will improve its speed, but I don't see how they can help with heat and power, short of holding back some potential or something. -
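A rough power-budget sanity check, using only numbers that appeared in this thread (NotebookCheck's 214 W measured peak and a 180 W brick); these are the thread's figures, not official spec-sheet TDPs:

```python
# Rough power-budget check against the 180 W power supply.
# Figures come from this thread (a measured peak during the
# NotebookCheck test), not from official spec sheets.
psu_watts = 180      # rated output of the stock brick
measured_peak = 214  # peak system draw reported with the 580M

deficit = measured_peak - psu_watts
print(f"Peak draw exceeds the brick's rating by {deficit} W")  # 34 W
```

If peaks really exceed the brick's rating, the slack has to come from the battery or from the CPU/GPU throttling, which is exactly the worry being voiced above.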
What is surprising is that the voltage was significantly lowered as well.
-
Yeah, and I forgot about that. I dunno, I hope all the tests were done with engineering samples and not retail ones, and that the real thing is better. I just went through hell with my laptop order and will cry if I need to return it because of heat and power issues.
Nvidia lost some points in my book on this. Such a minor clock increase and lower voltage, yet it runs hot and uses more power... But I have seen some cases where if the voltage is too low it runs hotter. While more power normally means more heat, sometimes too little power can strain things and that makes more heat too, but I doubt that's the case here. -
The test also said that the 580M in the X7200 only consumes slightly more power than the 485M... *sigh*
I wonder how the card behaves in the 8170... -
Lower voltage doesn't always mean lower power consumption, but it does reduce the amount of heat produced by the chip, since dynamic power scales with the square of the voltage. I guess when nVidia power-optimized this chip, they made some changes to the transistors so that they can run stably at that lower voltage.
-
Power consumption = watts, and watts = volts × amps.
Scenario: same current in both cases (I know it's not this much, it's just an example):
485M: 100 W = 1.00 V × 100 A
580M: 87 W = 0.87 V × 100 A
So yes, according to the laws of physics it should be lower consumption; maybe not by that much, but it shouldn't be higher. -
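The volts-times-amps argument above actually understates the effect: dynamic power in a chip scales roughly as C·V²·f, so a voltage drop helps quadratically. A sketch with made-up numbers (the 0.87 voltage ratio echoes the example above; the 1.05 clock ratio is a hypothetical small clock bump, not an official spec):

```python
# Sketch: dynamic chip power follows roughly P = C * V^2 * f,
# so lowering voltage alone should cut power, not raise it.

def dynamic_power(v, f, c=1.0):
    """Relative dynamic power P = C * V^2 * f (arbitrary units)."""
    return c * v * v * f

# Hypothetical 485M baseline vs. a 580M at 0.87x voltage, 1.05x clock.
p_485 = dynamic_power(v=1.00, f=1.00)
p_580 = dynamic_power(v=0.87, f=1.05)

print(f"580M relative dynamic power: {p_580 / p_485:.2f}x")  # 0.79x
```

Which is why the higher measured draw in the review is so puzzling: by this first-order model, the 580M's numbers should have gone down, not up, so either leakage, the board design, or the measurement itself must account for the difference.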
Supposedly the AMD HD 6990M is 25% better than the 580M, so I guess it's 10% better than the 485M. Incoming 585M...
-
Confused.
I'm only taking it with a grain of salt atm. Though for all the GPU techies out there, looking at the specs of the 6990M... is 25% believable? -
The fight will never end
-
Larry@LPC-Digital Company Representative
No it is not.
-
slightly less than 20%(nvidia) balancing it out to slightly better than the 580m, or about the same as the 485m
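The percentage talk above is easy to mix up, because relative-performance claims multiply rather than add. A sketch using the thread's own figures (the ~10% 580M-over-485M gain from the review, AMD's claimed +25% over an *emulated* 580M) plus a purely hypothetical 20% haircut on AMD's claim:

```python
# Relative performance chains multiply, they don't add.
r_580_vs_485 = 1.10          # ~+10%, per the NotebookCheck review
r_6990_vs_580_claim = 1.25   # AMD's slide, vs. an *emulated* 580M

# Taking AMD's claim at face value:
r_6990_vs_485 = r_580_vs_485 * r_6990_vs_580_claim
print(round(r_6990_vs_485, 3))  # 1.375, i.e. ~+37% over the 485M

# If the claim were ~20% too optimistic (pure speculation):
r_discounted = r_580_vs_485 * (r_6990_vs_580_claim * 0.80)
print(round(r_discounted, 2))   # 1.1, i.e. back in 580M territory
```

So "25% over the 580M" would mean roughly 37% over the 485M, not 10%; the smaller figures only come out if you assume AMD's emulated baseline flattered the claim.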
-
The unpublished "fine print" for AMD's benchmark says that the GTX 580M was "emulated"...
-
yup, it was actually a desktop 560 Ti with lower clocks to emulate the 580m
-
That's all the 580M is, anyway.
I think both teams are done, until 28nm rolls out. What core could Nvidia use? Attempting to put the GTX 570 into play would result in another ultra-downclocked GTX 480M disaster, at best. The most logical move, would be to slash the price by $250, if not more. That alone would get me to go green again. -
I've thought about this. Technically Nvidia "may" come out with another 500M-series flagship, but AMD doesn't seem likely to. Why? Purely naming schemes. AMD's highest desktop GPU is the 6990, while Nvidia's is the GTX 590. Until AMD goes 7000-series on their desktops, the 6990M will remain top end, while Nvidia, by name, can still go higher.
-
I think extending the Fermi architecture one more time for fastest mobile graphics 'bragging rights' would be a terrible waste of resources on Nvidia's part. If the 6990M eclipses the 580M, that 'battle' is lost. They should focus on Kepler.
Nvidia doesn't move enough high-end mobile graphic chips to make this incremental R&D effort worth it. The Fermi horse is done... shoot it and send it to the glue factory. -
The battle is lost for Nvidia. The Radeon 6990M has been released and it looks like it beats the 580M in all tests. (I wish ATI would get and directly support a mature 3D solution like Nvidia's 3D Vision...)
-
LOL @ all you marketing donkeys taking the AMD slides as absolute proof and calling it a "test".
-
I am still an NV fan and I am not marketing for AMD. Look for yourself on notebookcheck.com and see the benchmarks before jumping to conclusions. Yes, it is a clear truth when you compare both cards: ATI is cheaper and faster. Although I am still dependent on Nvidia's CUDA for rendering, and I was so keen to try out 3D Vision for gaming, which has not even been certified for the 580M in either AW or Clevo laptops... damn.
-
Most likely the real performance of the mobile cards will be equal, but the 6990M is quite a bit cheaper.
-
Not all the eggs are hatched yet...
Even if the 6990m is indeed faster than the 580m
-AMD's tests might not give a real indication (emulated and not done by 3rd party... so may be extremely biased)
-Pricing is the main point. If all is as AMD predicts and Nvidia then drops the price, we have a great situation here. Even more fun will be the 570M price, which should be even lower. (Even Nvidia is aiming at the 6970 with the 570M, so it should be roughly the same price.)
-Actual implementation of the 6990m...
We simply don't know how well this will turn out. Heat and power consumption as well as manufacturing bugs or worse manufacturing process problems (creating rarity and thus higher prices) are all possible. Right now AMD's claims are little more than "on paper".
Don't get me wrong... if AMD really does pump out something CLOSE to what they claim (faster than the 580m or even 1% slower) at the price it is rumored to start at, AMD will hands-down win this battle and Nvidia will have nothing to do but pricewar.
This is all good news for us
-
The 6990M is USD 180 cheaper than the 580M per the AW site (and this is after a discount of around $100 off the 580M's price)...
-
AW's prices are somewhat odd and quite a bit different from the resellers'. Right now... I'm waiting for the reseller pricing.
-
Yes, AMD is cheaper and Nvidia cards are overpriced as hell. But the only benchmarks you will find on notebookcheck are still from a desktop GPU clocked down to what the 6990M is supposed to run at. I don't know about the rest of you guys, but I don't trust any of this until we have a real 6990M GPU running these tests.
Gtx 580m
Discussion in 'Sager and Clevo' started by materax, Jun 8, 2011.
