Here is the result I got with my i7-3820qm.
-
Thanks to the guy here who mentioned it, I have put a 40 fps limiter on my GT 650M via Nvidia Inspector. Result: from 90-100 W when playing BBC2, I now draw only 75 W, with lower temps and less noise of course. Then I also locked my 2720QM at 1596 MHz on all four cores. Result: only 60 W in BBC2. With such low power consumption I don't hear my laptop fan when playing BBC2; it spins at very low RPM, so there's almost no noise. That's fun.
-
Hello everyone,
I've taken some time to peruse this thread although it's a bit long to digest in its entirety. I just got my Sager NP6110 (specs attached below).
First thing I did was buy a 22" 1080P TV (I will eventually upgrade to a monitor with built in speakers, but until then this will do) and wireless keyboard/mouse.
Everything is great, except I am having an issue with the nVidia settings. When I try to access them it says:
nVidia Settings not available - You are not currently using a display attached to an nVidia GPU.
Can someone explain this to me?
I think not using the 650M may be hurting my gaming performance. Company of Heroes seems a bit choppy for a 2006 title :/
I get the impression Intel HD 4000 or something is running the show here. How do I get the machine and games to recognize the 650M? From what I can gather this means I'm running on an integrated Intel GPU - not what I paid for!
Thanks a bunch for your help!
Specs:
nVidia GeForce GT 650M 2GB DDR3 DX11 w/Optimus Technology
3rd Gen Intel Ivy Bridge Core i7-3610QM, 2.3-3.3GHz, 22nm, 6MB, 45W
8GB DDR3 1600 Dual-Channel (4G X 2) (Standard)
500GB 7200rpm 2.5-inch 9.5mm 16MB SATA II 3Gb/s (Standard) -
Have you installed both the Intel and Nvidia graphics drivers, along with the rest of the drivers from the disc? Missing drivers would be the first thing that could cause this. It could also be the case that your GPU is defective D: Check in Device Manager to see that both GPUs are present as well.
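If you'd rather cross-check from a command line, here's a rough sketch (assuming Python is installed; it just shells out to Windows' built-in wmic tool) that prints the display adapters Windows currently sees:
```python
# List the display adapters Windows reports, to confirm both the Intel HD 4000
# and the GT 650M show up. Assumes Windows with the built-in "wmic" tool and a
# standard Python install; this just wraps a wmic query.
import subprocess

output = subprocess.check_output(
    ["wmic", "path", "win32_VideoController", "get", "Name,Status"],
    universal_newlines=True,
)
print(output)
```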
-
Is a WEI score of 5.9 good for my primary hard drive? I have the 750 GB Momentus hybrid drive installed, so I was just wondering. Also, how do I even verify that drive is installed without opening the back cover?
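One way to check without opening the cover might be to ask Windows for the drive model directly. Here's a rough sketch (assuming Python is installed; it just calls Windows' built-in wmic tool) that prints the model and size of every physical drive the system sees, which should match the Momentus if it's really in there:
```python
# Print the model and size of each physical drive Windows reports, to confirm
# the 750 GB Momentus hybrid is present without opening the back cover.
# Assumes Windows with the built-in "wmic" tool and a standard Python install.
import subprocess

output = subprocess.check_output(
    ["wmic", "diskdrive", "get", "Model,Size"],
    universal_newlines=True,
)
print(output)
```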
-
Under Device Manager it lists the following:
Display adapters:
Intel HD Graphics 4000
Standard VGA Display Adapter
NOTE: The standard VGA Adapter one has a yellow exclamation point. Under device status it reads:
"Windows cannot load the device driver for this hardware because a previous instance of the device driver is still in memory. (Code 38)" -
Alright, what you need to do is hold the power button until it turns off, then boot into Safe Mode. Right-click the "Standard VGA Display Adapter" and click Uninstall, making sure the box to delete the driver software is checked. Then reboot and install the Nvidia drivers from the driver disc (or the website).
This is what I did when I accidentally updated my 650M drivers in Device Manager after installing modded drivers. Let me know how it goes! -
I just got mine today and am confused about how Turbo Boost is supposed to work. I've run the CPU test in Cinebench and the Windows Experience Index, and neither of them caused the CPU to boost to 3.1 GHz. I have the AC adapter plugged in, the power plan set to "High performance", and I even disabled the GT 650M in Device Manager for another Cinebench run. The whole time CPU-Z reported 2.5 GHz (I have the i5-3210M).
It's probably worth mentioning that I've also installed Prema's BIOS mod, but I haven't disabled turbo or really changed anything from the default values. Anyone know why the CPU isn't boosting?
Edit: This might have something to do with CPU-Z. I installed ThrottleStop, and if I run Prime95 with it active it shows the CPU boosting to 2.8 GHz if I force turbo (still not 3.1 GHz). While this is happening CPU-Z still says 2.5 GHz (the multiplier isn't displayed for some reason). -
Ahhh, thank you so much. I was worried it wasn't properly assembled or something! I skipped the first step and went straight to the driver disc, but that did the trick. Just as a side note, in those settings, do you recommend leaving it all on "auto-select", or do you "prefer" the 650M for 3D and/or PhysX?
I am no longer getting a warning about my settings in Company of Heroes! Yay!! -
I bought it from Eurocom; they shipped it with 4GB of DDR3-1600 and I added 4GB of DDR3-1333 from an old laptop. I plan on replacing that older stick soon, although there will be almost no performance gain.
-
Leave it on auto-select. The only thing you would generally change (besides advanced stuff like FPS limits that isn't in the control panel) is switching V-sync from application-controlled to Adaptive, but you would probably need to wait for a driver update to be able to do that. Otherwise leave it alone unless there's a problem. Enjoy!
-
Just got my W110ER in the mail yesterday! She is all set up with a fresh install of Windows 7 Enterprise x64, 16GB of RAM, and my 256GB C300 SSD. I had a bunch of old power supplies with replaceable tips and they all work fine with the laptop, even under load.
A few things I noticed right away. First, it does not support the Intel 6205 wireless card; I had to dig up an old Dell card to get wireless working. Second, the fan DOES spin all the time, but it's so slow and quiet it cannot be heard in a quiet room. Because the fan spins all the time at such a low speed, idle temps sit at 49-52°C! Load temps were good, but I found a trick that does not require any modding or propping up the laptop: take two pieces of tape and cover two rows of intake slots right below the fan, leaving the middle open. This forces the fan to draw air from the other vents, giving a cooler bottom panel and WAY cooler load/idle temps!! I played Just Cause 2 for 5 hours yesterday and it only touched 75°C for both CPU and GPU, on my lap! I call that amazing. And I idle around 12-17 watts, which is lower than what you guys are getting; I wonder if I have a different BIOS than what other people got in theirs. If anyone wants me to post anything, let me know.
Oh, by the way, Just Cause 2 at 1366x768 with everything on high gets me a constant 60 FPS.
Man, am I happy! And did I mention 8x AA?! This is a damn powerful card! This little beast beats my coworker's HUGE M6600.
-
Two quick questions:
1. My BIOS (ver. 01, EC 02) does not seem to have an option for an HDD password; is this true in your experience as well? Does ver. 03 / EC 03 provide one?
2. Password issue aside, is it worth upgrading to the most recent BIOS? Can I go back if I want to?
Thanks. -
I am at work at the moment, so I cannot answer your first question (I don't encrypt my data), but as to the second: if you want to customize a bit, I would highly recommend Prema's custom BIOS. Otherwise, updating your BIOS is almost always a good idea.
-
Power draw on this thing is great for such a powerhouse! I'm running Ubuntu 12.04, but no Nvidia/Bumblebee drivers yet.
i7-3720QM
16 GB RAM
Power draw while browsing the web over wifi (including this forum with animated ads):
LCD@low = 13 watts
LCD@high = 15 watts
Is there any way to install just the Nvidia drivers and use them full time (I don't need the ability to switch back and forth), or is Bumblebee the only way to use the 650M? -
There's only Bumblebee, since the physical display outputs are wired to the Intel GPU.
-
Where can I buy this?
-
Just got my W110ER today, and while it's a nice little notebook, I gotta say you guys weren't kidding about the screen: it's utter garbage. We really need better screen options on this thing.
-
I'd say my biggest surprise with this laptop is the quality and volume of the speakers for something this small. Really impressed with them; I didn't expect it at all.
-
IMO, the screen in my W110ER is okay; I have no problems with it at all (I got the InfoVision M116NWR1 R0 IVO0489 display, according to AIDA64). But maybe it's just me, coming from a Sony Vaio SA25.
-
I know the screen of my NP6110 will be worse than my ThinkPad W510's, but the NP6110 is only 1366x768, so it's okay if the screen isn't the best of all; at least it will be big enough to read.
I believe my "babybook" NP6110 will arrive here in a few hours. I am waiting for the postman.
BR -
Does anyone know of any good laptop sleeves and screen protectors? The reason I need a screen protector is that I'm currently overseas in Afghanistan; my last laptop's screen got scratched up from sand getting between the keyboard and the screen. Any help is appreciated!
-
Wow, that sounds about as bad as it could be... It can't be that bad, can it?
-
It's pretty bad. Terrible viewing angles. I wonder if the matte option is any better. -
My screen is actually amazing. I love the viewing angles, and the colors are great!
Monitor Name (Manuf): N116BGE-L41 CMO N116BGE-L41 -
I don't think it's too bad, but then again I use an external display for most of my stationary gaming. The speakers are really great, with good balance to the sound, but they are too dang quiet in my opinion; they're hard pressed to be audible with any kind of noise in the room. Again, not a big deal, since I use a headset for most of my stationary listening. After having it for several weeks I can confidently say that, coming from a laptop with all the bells and whistles, the only thing I miss is a backlit keyboard.
-
I've got a matte screen myself, and I don't really have any complaints about it. I don't know how fancy it is, but it certainly is a big improvement over my old netbook's matte screen. No bad angles here.
-
I am debating between the AUO matte screen option and a matte film/screen protector for the glossy screen: $10 vs. $100, and the characteristics may be 90% identical.
If this were an IPS matte panel it would be easier to justify, but unfortunately we have few options from the suppliers. -
If you let Windows install its default audio driver, the speakers are a lot louder.
-
You may be quite lucky with the Chimei screen, though I have yet to see a good 11.6" glossy.
Mine came with the IVO panel, and it's poor, but no worse than your typical 11.6" glossy.
But this was expected...
That being said, if anyone can get the matte panel, even for $100, I recommend it. The anti-gloss cover may be a frustrating experience. -
By the way, my W110ER cannot withstand more than 5 minutes of Prime95 + FurMark. After about 3 minutes it hits 95°C on the CPU and 90°C on the GPU and seems to stabilize for a minute, yet it soon climbs gradually again and within another minute or so shuts itself down.
I have the lowly dual-core Ivy @ 2.6 GHz, but it's true that it's a hot summer here. At idle it sits at 65°C / 60°C CPU/GPU. -
Some late night musings.
I read an article on Real World Tech about the effect of memory bandwidth on GPU performance. Apparently there is a formula which can roughly predict the ideal effect of extra bandwidth:
performance ratio ≈ (cube root of bandwidth A) / (cube root of bandwidth B)
However, the limitation of this equation is that it assumes performance scales constantly with bandwidth. In reality it's more of a diminishing-returns scenario: for a given level of GPU compute power, the benefit of extra bandwidth tapers off beyond a certain point, and that point is obviously higher for more powerful GPUs. This matters because increasing memory bandwidth is expensive, both financially and electrically, so the ideal is a good balance between manufacturing cost and the performance curve.
All the same, here's a fun mental exercise I did.
The GDDR3 650M has 28.8 GB/s of bandwidth.
The GDDR5 650M has 64 GB/s of bandwidth.
My assumption is that the GDDR5 650M, at 64 GB/s, is already well into diminishing returns, and that the ideal bandwidth is somewhere between 28.8 and 64 GB/s.
Now, imagine improving our 650M's bandwidth to 32 GB/s (i.e. GDDR3 at 2000 MHz effective memory speed):
(cube root of 32) / (cube root of 28.8) = 1.0357
So in an ideal situation we can expect a 3.57% overall performance improvement at the same core clock.
According to some rough 3DMark06 data from other users, the 950 MHz GDDR3 650M can score about 12,000 while the GDDR5 model scores about 14,000 (I think that user overclocked the GPU to 985 MHz).
Assuming similar core clocks, that is roughly a 16% difference in performance, which I will attribute to the bandwidth difference.
Now back to the equation:
(cube root of A) / (cube root of 28.8) = 1.16
Solving for A gives roughly 45 GB/s.
Therefore, the ideal bandwidth for the GT 650M is very roughly 45 GB/s. This is consistent with the BF3 test I did, where the performance gain per step of extra bandwidth started to drop beyond a +200 MHz memory overclock.
This is interesting because 45 GB/s sits smack bang in between the 28.8 GB/s of GDDR3 and the 64 GB/s of GDDR5.
I can now understand why NVIDIA created two versions, since the GDDR5 is overkill while the GDDR3 is woefully insufficient.
In terms of memory overclocking, the highest GDDR3 clock speed I know of is 1050 MHz, on the GTX 280.
I killed an N61Jq with a 25% memory overclock on its HD 5730M.
Applying that 25% figure here, the conservative maximum is a 1125 MHz memory clock; personally I wouldn't go beyond a 1075 MHz overclock, even assuming memory technology has improved since the GTX 280 came out.
A 1075 MHz memory clock yields a bandwidth of 34.4 GB/s:
(cube root of 45) / (cube root of 34.4) = 1.094
which means we would be at about 91% of the theoretical sweet spot (1/1.094 ≈ 0.914).
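For anyone who wants to check or play with these numbers, here is the same arithmetic as a quick Python sketch. To be clear, the 16% gap and the ~45 GB/s "sweet spot" are my own estimates from the 3DMark06 scores above, not official figures:
```python
# Quick sketch of the cube-root bandwidth arithmetic from the post above.
# The 16% observed gap and the ~45 GB/s sweet spot are estimates, not official specs.

def perf_ratio(bw_a, bw_b):
    """Ideal performance ratio predicted by the cube-root rule."""
    return (bw_a / bw_b) ** (1.0 / 3.0)

GDDR3_BW = 28.8  # GB/s, stock GDDR3 GT 650M
GDDR5_BW = 64.0  # GB/s, GDDR5 GT 650M

# Full GDDR5 vs GDDR3 gap under the rule: ~1.30, i.e. ~30% ideal difference
print(perf_ratio(GDDR5_BW, GDDR3_BW))

# 2000 MHz effective memory (32 GB/s) vs. stock: ~1.036, i.e. ~3.6% ideal gain
print(perf_ratio(32.0, GDDR3_BW))

# Invert the rule: what bandwidth would explain the ~16% observed gap?
ideal_bw = GDDR3_BW * 1.16 ** 3
print(ideal_bw)  # ~45 GB/s

# 1075 MHz GDDR3 on the 128-bit bus: 1075 * 2 (DDR) * 16 bytes/transfer = 34,400 MB/s
oc_bw = 1075 * 2 * 16 / 1000.0  # GB/s
print(oc_bw, perf_ratio(oc_bw, ideal_bw))  # 34.4 GB/s, ~0.914 of the sweet spot
```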
Thanks for reading and let me know what you guys think
-
Well, mine is junk: worst viewing angles ever, poor colors, etc.
The hardware ID it shows in Device Manager is CMO1113. -
"Ideal memory bandwidth" will depend on the application and the resolution. The 45gb/s might apply to 3dmark06, but it will change for other programs.
Can you link the page where you got the cube root forumla from? I've never seen that before, but I think there's a context where it has to be used. Using the forumla on the gddr3 vs gddr5 shows give as 30% performance difference, but as mentioned, it will depend on the program and res. e.g. the difference is is far smaller than 30% for 3dmark11 at 720p, but it's about 30% for metro 2033 at 900p. -
Is there any way to get a 1080p screen in one of these? The new Asus 11.6" ultrabook has a 1080p IPS panel, and it made me curious.
-
I am currently idling at 16.5-17 watts, and all I have installed is the modded drivers! With the drivers from the CD that came with my notebook, it was jumping wildly from 17 to 35+ W. I posted a few days ago that I was idling at 13-15 W, but that was with one screen; the current figure is with two 1920x1080 screens.
I can't wait to check the battery life now. Always remember never to leave anything open that polls the GPU; that will keep it from ever "sleeping" and it will idle around 25-30 watts!!
-
Hmm, where did you get the INF for the 650M? The one I found didn't include the 650M.
Also, it's kind of strange that the Nvidia drivers would make the power fluctuate wildly like that when the Nvidia GPU isn't even active. -
I think there is a difference between the GPU merely being underclocked/undervolted and it actually going dormant. I did the test before I updated and it was going crazy; I believe it was you who also commented on that same behavior, but I can't be sure. I am using my UPS to measure power on a particular outlet, which gives almost real-time wattage, and it was fluctuating once every 3 seconds: 17... 35, 17... 35, 17... (watts).
v302.59 Windows 7/Vista 64bit | NVIDIA OEM Mobile - 30x Series GeForce Driver release - LaptopVideo2Go Forums
This is the driver/INF combo I used. I just ran the installer and restarted. It has also helped my temps even more, since I think the GPU wasn't getting the proper sleep signal before; the driver was just underclocking and undervolting it.
EDIT: I took these readings with the battery out as well. With the battery in and fully charged it draws the same. I just wanted to be sure. -
I get 12-15 W when the GT 650M is "dormant", but I will try these drivers and see what happens.
Thanks for the link.
No way. It's only LVDS, not eDP, which is what we'd need to get 1080p or even 900p. -
No problem. The increased power draw on my machine is from the two screens, so I will try it standalone with just the laptop's screen. I still agree with others on this forum that we should be idling around 6-8 watts; I think a driver update will solve everything.
EDIT:
MUCH better temps. The GPU is at 43°C! Before, it was hovering around 53-57°C. I also opened GPU-Z, which bumped the clock speed up and raised the temperature, so I bet it's even lower once I close GPU-Z. This is with the intake vents taped, without raising the back of the laptop, running a single screen with Firefox open, a ton of apps and such... oh, and the stock BIOS.
-
LVDS is more than enough for 1080p, but both channels need to be physically wired up to reach that resolution. I don't know whether Clevo did that, but probably not.
-
Don't get your hopes up; Intel screwed things up with battery life. Not only is the base clock 1200 MHz vs. 800 MHz on Sandy Bridge, the voltage is also higher by some 0.1 V. The motherboard also has to be pretty beefed up to support the components, and that has an impact on idle consumption.
-
I chose a Sandy Bridge model for that reason.
-
I am with my Sager NP6110 now; it arrived this morning.
I like the machine, it is awesome: very small and easy to take with me. I like the screen; it's not the best in the world, but it is good, better than I imagined from reading the comments about it.
I like it.
[]'s -
Coming from the M11x, my screen is magnificent!
-
6-8 watts?! The lowest I can get is double that or more
I am on modded drivers as well -
I guess I'm just spoiled and I need a good screen. One thing that really pissed me off is that I can't just disable the HD 4000 and run only the 650M; this is incredibly stupid. I want to adjust the colors on the screen, but that only works through the Intel graphics control panel.
The way this whole Optimus thing is set up seems to be more trouble than it's worth.
-
That's the way Optimus works: all display output is routed through the IGP. Otherwise the dedicated card would need its own separate video output and would have to stay active all the time to drive the display. Personally, I much prefer manual switchable graphics: either one is on or off, none of this auto-switching nonsense.
-
So I was using my NP6110 today and it just randomly shut itself off. I was only surfing the web and using Spotify. Should I be concerned? Are there any ways I can test my laptop? I just got it about 1-2 weeks ago; I'm hoping it's not a lemon.
-
Anyone have a link to the latest Intel chipset drivers for the W110ER (from Intel, not Sager)?