Put simply, can Sager computers be set up with any mainstream Linux distribution? Maybe mainstream isn't the right word... with any distro that wouldn't require a massive amount of custom configuration or programming on my part (I don't know that much about Linux, other than that it should be lighter to run than Windows).
Another question I have is whether anyone knows if Optimus will be enabled on builds with the Nvidia GTX 680M. According to notebookcheck, the 680M will feature Optimus [notebookcheck link], but I don't know how official that is.
Much thanks,
~Tyranids
-
Yes, we offer Linux on our builds. However, remember that Optimus will never be officially supported on Linux by NVIDIA, so there are some software projects that attempt to help out.
-
I checked out your Ubuntu offerings and tried reading up a little more on dual-booting laptops. I guess it really comes down to power consumption. The reason I was wondering about Linux is that it's lighter on resource consumption, correct? But if Optimus isn't supported, then that would outweigh any gains from the OS difference.
I'm really trying to see how I could get the maximum battery life possible out of a 15.6" model with the GTX 680M (when that comes out), or 660M if the 680 isn't offered in any of the 15" laptops. -
Not true.
Nvidia says they will not support Optimus until PRIME lands in the kernel and drvscreen lands in X. -
What are you planning to do with that much graphics power? Are you looking to dual boot so you can run a low powered Linux OS for word processing and then boot into Windows for games or run Linux full time?
I believe the Bumblebee project allows you to use the on-board or dedicated graphics, but not have them dynamically switch (you would need to google that to confirm). -
ALLurGroceries Vegan Vermin Super Moderator
I think you are confusing nVidia with nouveau. Otherwise please cite a source. -
You don't want to run Linux if your concern is battery life. Although the Linux devs have made considerable strides in power management lately, it's still not up to snuff against Windows. I couldn't give specific numbers for my Sager, but on my netbook I get about 2.5-3 hours browsing and watching YouTube in Mint and up to 5 hours in Windows.
-
That is actually exactly what I was planning to do. The reason I'd take the 680M is that since I'd be buying a new device, I want the newest technology, and only the 660M and 680M are on the newer 28nm process. The 670M/675M are in between, but if the 680M wasn't >200 to upgrade to, I would probably take it; if not, I'd just go with the 660M. But yes, my main goal was to use a low-powered Linux OS for typing papers and such, and Windows to run games and other multimedia programs.
EDIT: Oh, I did not see this post. Thank you though. I hadn't really researched it, so I did not know about this. -
ALLurGroceries Vegan Vermin Super Moderator
Linux power consumption varies greatly between platforms. It will generally use slightly more power on a laptop than Windows, but with some tweaking the difference can be as little as 1 to 2 watts.
-
On the laptop in my specs, I get ~2.5 hrs with the 3.0.0-17 kernel. With the 3.2+ kernels, we are supposed to see increased battery life. I don't use this laptop very often on battery, but if I needed to squeeze out a little extra time, I'd do a couple of things...
Install powertop to see what is eating up most of my battery (i.e., what keeps waking the CPU out of its idle states)
Install cpufreq-utils to trim back my CPU speed to something more reasonable
Edit the grub boot params to include the following (rough walkthrough below):
pcie_aspm=force i915.i915_enable_rc6=1 i915.i915_enable_fbc=1
Finally, install the Jupiter applet to further control power management.
You could also code up a way to turn off these super powered GPUs on battery, especially if you're using a lightweight WM.
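For reference, here's roughly what that whole routine looks like on an Ubuntu-style install (the package names, paths, and commands below are the Ubuntu ones and are from memory, so double-check them on your distro):
sudo apt-get install powertop cpufrequtils
sudo powertop                  # see which devices/processes keep waking the CPU and draining the battery
sudo cpufreq-set -g powersave  # switch the CPU frequency governor to powersave (add -c <core> to do each core)
sudo nano /etc/default/grub    # add the params to the GRUB_CMDLINE_LINUX_DEFAULT= line
sudo update-grub               # regenerate the boot config so the new params take effect on the next reboot
After editing, the grub line would end up looking something like:
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash pcie_aspm=force i915.i915_enable_rc6=1 i915.i915_enable_fbc=1"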
I can't speak for the state of Bumblebee with Nvidia's most recent cards though. I've always stuck with ATI cards, accepting the heat issues, to circumvent having to deal with Optimus. -
If you get 2.5 hours without having to do any of that, then that is plenty. I'm planning to get an additional battery for whatever model I finally purchase, so 2.5 hours on one is fine.
If, however, I'd have to do all that... I can deal with the programs you suggested, but I wouldn't be able to code anything myself; I have minimal programming knowledge (and in Java, too, which isn't much help here).
Also, what do those boot parameters you mentioned do? I don't even know how to change anything related to booting up (other than basics like whether a password is required and which programs start).
Thank you all, this thread has helped me immensely. -
Optimus is not supported in Linux and is actually a large headache to deal with, especially if you hardly know anything about the OS.
I would try to get a laptop with one GPU or wait until it's well supported, which should take another 2-3 years. -
I'm not confusing the two.
Robert Morell and Pierre-Loup Griffais of Nvidia have both stated on mailing lists that if dma-buf remains EXPORT_SYMBOL_GPL, Nvidia will not be able to support it. -
Does Optimus not work well even in Windows? After the initial post it came out that Windows is often actually more power efficient, and if not, the difference would be <=10 watts.
My new plan has simplified to just getting a card with Optimus, running Windows, and on battery using only bare-minimum settings (reduced CPU speed, screen brightness way down, Wi-Fi off, Optimus on). -
Anthony@MALIBAL Company Representative
Optimus works fine in Windows, but with some caveats. It's software-based, so you may not get the best performance in games (if any) until a new driver is released. This is because of the way it offloads graphics to the dGPU, which then passes the output back through the iGPU. Drivers make a much bigger difference for Optimus systems versus discrete-only ones.
It's not that it's buggy; it's just that the best performance really depends on having good profiles set up and quick driver updates. -
Is there a way to disable it when plugged into power? Or have it only turn on when it detects a 3D application? I have only ever had ATI cards, so I really know almost nothing about the system.
-
Anthony@MALIBAL Company Representative
You can either force the dGPU to be enabled all the time, or let the system auto-sense. By default, it relies on the iGPU for everything for max power savings and only kicks in the dGPU when you run a profile that requests it. -
There is also a laptop_mode utility that allows you to fine-tune power-saving, sleep, and hibernation options. I have yet to get it working on my Sager NP6165/Clevo W150ER.
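On Ubuntu/Debian it's packaged as laptop-mode-tools; the basic setup is roughly this (the config path, option name, and service name are from memory, so verify them on your system):
sudo apt-get install laptop-mode-tools
sudo nano /etc/laptop-mode/laptop-mode.conf   # e.g. check that ENABLE_LAPTOP_MODE_ON_BATTERY=1
sudo service laptop-mode restart              # reload the service so the new settings apply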
I am about to try Bumblebee on my Optimus setup with the Nvidia 650M soon, and I'll report back.
So far I got the 3.2 kernel and the xorg-video-intel drivers to recognize the HD 4000 on my Ivy Bridge, but I get a kernel hang within a couple of minutes. The hardware is fine, as it works in Windows. Any ideas why this may be happening? -
What are you talking about? It took less than 5 minutes to get Optimus working in 12.04 using Bumblebee. The biggest difference was in the CPU temps and power draw, since I don't play modern games in Linux yet and didn't test any VMs either.
-
It's great to hear this. I am about to try installing Bumblebee.
I expect the temps will be a bit higher because the power-saving features are not working properly under Linux. In my case, my HD 4000 hangs the computer after a few minutes. Do you know why this may be happening?
I may not be using the latest xorg-video-intel driver though. My kernel is 3.2.x. -
sudo add-apt-repository ppa:bumblebee/stable
sudo apt-get update
sudo apt-get install bumblebee
That's it. But that was on a fresh install; you might want to purge out any other drivers first if it's not a clean install (example below).
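If it's not a clean install, something like this first, plus a quick test afterwards (the purge pattern and test program are just what I'd reach for, so adjust to whatever driver packages you actually have):
sudo apt-get purge "nvidia*"       # clear out leftover proprietary nvidia driver packages
sudo apt-get install mesa-utils    # provides glxgears for a quick test
optirun glxgears                   # runs glxgears on the Nvidia GPU through Bumblebee; run it without optirun to compare against the Intel GPU
-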
I started a thread on installing Linux on my Sager NP6165 / Clevo W150ER in here.
Still haven't installed Bumblebee because I'm dealing with a random hanging issue.