Hi. I've pretty much decided on getting the Sager NP8130, base configuration (specs further down). Problem is, I really want longer battery life than just a couple of hours or so (my current 4-year-old Toshiba has 2 hours, which I have found to be not enough for my needs). The NP5160 has Optimus (so probably a max battery life of 5 or 6 hours on integrated graphics) but then I have to live with the GT 540M which is WAY too weak for me. The GTX 460m is listed as Optimus capable on Nvidia's website, but the laptop itself doesn't mention it at all, and the owner's lounge here says the integrated graphics are disabled. Is there any way that I could manually re-enable the integrated graphics and/or Optimus? I wouldn't even mind having to reboot every time I wanted to switch.
And if I really wanted to, I could just take out the GTX 460m right? For example, I go to China every summer to visit relatives. The plane ride is VERY long and my laptop dies before the ride is halfway over. If this is possible, I would take out the GTX 460m, wrap it in bubble wrap, and my laptop would run off the intel graphics and have loooong battery life... right?
Display: 15.6" 1920 x 1080 LED Backlit Matte Display
Processor: i7-2630QM, 6MB L3 Cache, 2.0GHz
Memory: 8GB, 1333MHz DDR3
Graphics Card: NVIDIA® GeForce® GTX 460M 1.5GB GDDR5
Hard Drive: 500GB 7200rpm SATA II
Optical Drive: 8X Multi DVD+/-R/RW RAM Dual-Layer Drive
Wireless: Internal 802.11B/G/N LAN and Bluetooth Card
Cooling: Stock OEM Thermal Compound, CPU & GPU (I plan to take it apart and put some AC5 in there)
$1163 after Cash Discount
-
Most likely there isn't any way you can do that! It goes way deeper than just Optimus, which is only the software layer.
The BIOS as well as the hardware have to support it.
Forget about it. -
You really should post this in the Sager forum.
Anyway,
The 5160 has the GPU soldered onto the board, so it's fixed to whatever card the laptop ships with. Optimus is enabled on this laptop, so you can switch between the onboard IGP and the dedicated NVIDIA GT 540M.
As for the 8130, it is not Optimus enabled, and there is nothing you could do in the BIOS or software to change that. It is a hardware configuration issue. -
The point is: if it is not enabled by the manufacturer, there is nearly no chance of enabling it yourself, as you would need proper access to the BIOS, and possibly hardware that simply isn't built in!
I asked myself the very same question about my notebook, which runs a dedicated NVIDIA GPU but also has an integrated (Arrandale) GPU. It turned out there is nothing I can do. The first step would be hacking the BIOS to gain full access, which is nearly impossible, risky, and voids the warranty. Hybrid graphics would have been nice, but I don't want it THAT bad. -
Thread moved to Sager/Clevo subforum. You'll probably get more model-specific advice there.
-
I thought it was like SLI, which I read could be enabled through a few hacked drivers. Maybe I'm wrong about that too.
Though for the long stretches when I need battery life, not performance, if the GPU's not soldered in, can I just take it out? Or will that cause crashes / failure to boot / a 15-inch brick? -
Optimus has to be supported by the motherboard; there is no way you can enable it if Clevo did not implement it on this model's mainboard. Just forget about it.
If you need longer battery life, buy one or two spare batteries. Or, for the same price, you could just buy a netbook, which gives you about 10 hours of battery life, more than enough for all your needs. -
-
Every international plane ride I've been on in the past 3-4 years has had plugs by the seats even in coach... You should just change your airline!
On a side note, the 5160 is more than powerful enough to run SC2 on Medium, which is all you need; even the Sandy Bridge integrated graphics can run SC2 on Low... Gets you a lot more battery life that way too...
I'm having a very similar debate between the 5160 and the 8150 myself, so I know where you're coming from. What I really want is a solid quote on the price of the AMD 6970... -
-
That's rubbish. I used to play on my old Dell E1705 while travelling. Sure, it would only get me around 2 hours of gameplay, but that's certainly enough for most European flights.
And I am not talking vanilla stuff; I am talking, for instance, Tiberium Wars or other such games. Gaming on your laptop while on battery is why they are called gaming laptops.
I also played on my current laptop, but mostly older stuff. The last game I played on battery was Darksiders, but I only get around an hour of gameplay with my current laptop. -
As stated, Optimus requires motherboard support. Specifically, the way Optimus works is that all of the discrete card's output is funneled into the IGP, which then displays what the discrete card tells it to. This means the discrete card is not connected directly to the screen at all, and this is why it has to be done at the motherboard level; if the discrete card were connected "normally", with its own mux to the screen, Optimus couldn't work because there would be no way to channel the discrete card's output through the integrated graphics. And, of course, on top of this you need the appropriate drivers and BIOS support to let Optimus work.
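If it helps to picture the data flow, here's a tiny conceptual sketch in Python (purely my own illustration with made-up names; it's not how NVIDIA's driver actually works, just the shape of the render, copy, display arrangement):

# Purely illustrative model of the Optimus display path: the discrete GPU
# renders, a copy engine moves the finished frame into the IGP's
# framebuffer, and only the IGP is wired to the panel.

class DiscreteGPU:
    def render(self, scene):
        # Heavy lifting happens here on the dedicated card.
        return f"rendered({scene})"

class IntegratedGPU:
    def __init__(self):
        self.framebuffer = None

    def scan_out(self):
        # The panel is connected to the IGP alone; it just shows
        # whatever is sitting in its framebuffer.
        return self.framebuffer

def present_frame(dgpu, igp, scene):
    frame = dgpu.render(scene)      # discrete GPU does the rendering
    igp.framebuffer = frame         # "copy engine": dGPU -> IGP framebuffer
    return igp.scan_out()           # the screen only ever talks to the IGP

print(present_frame(DiscreteGPU(), IntegratedGPU(), "game frame"))
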
As for enabling SLI, it really depends on what you mean by that. After all, since notebooks are generally such closed systems in the first place, if a notebook is designed to be able to take 2 discrete graphics cards, it's designed to have SLI enabled; there's not much point in a notebook that can have 2 discrete graphics cards that doesn't let them both work.
And no, removing the GPU is usually a "big job", not something you can do on the fly (it's about the same level as replacing the CPU). It is theoretically possible that, if your notebook is designed with switchable graphics, you could remove the GPU and run it off the integrated graphics, but replacing/removing the GPU is generally going to be a half-hour to an hour job minimum, mostly thanks to the way heatsinks and GPUs are constructed in notebooks. It's not "plug and play" (well, unless we're talking a ViDock or something similar). -
What airline do you use? Mine requires adapters AND first class for electricity.
Not intending to belittle Europe though. (edited since it sounded snobby)
-
Even if Optimus can't be enabled on the P150/P151, a variety of threads/posts/emails like this are very important. The only way Sager and company are going to request that Clevo implement Optimus in a future model is if they know there is demand for it. Clamoring for Optimus support, no matter how futile in the current generation, is evidence of that demand. Personally, I think it's ridiculous it wasn't implemented from the beginning; that's why this topic keeps popping up everywhere. Intel's Sandy Bridge processors, chipsets, and Nvidia's high end GPUs could theoretically work together on a properly designed motherboard. The big players have done their part, therefore the responsibility rests solely with Clevo. Rationally, the only reason they wouldn't do it is if the return on such a development wouldn't justify the costs.
The fact that Optimus has been enabled on countless mid range laptops without full blown MXM 3.0b slots but with the same chipsets and processors proves that it can be done at relatively low cost and that there is demand for it in that segment. If the GTX 485m, MXM 3.0b by design, supports Optimus, then there's no reason why it can't be done on an MXM 3.0b board. The only question left is whether there is demand in the high-end/gaming segment. Why would there not be demand in this market? I'm not going to dig through the owners' thread to find the post, but a representative claimed they did not implement it because of Punkbuster problems. For a hardware company to downgrade the potential of its products to meet the demands of a generally disliked and decreasingly important program is pretty short-sighted IMO. If enough newer high end gaming laptops had Optimus support, Even Balance would be forced to patch their software or face fallout with game publishers. Software should never cause retrogression in hardware; if anything, it should push improvement in hardware. In this case we have software allegedly causing the intentional omission of a legitimate hardware advance.
There also seems to be this implicit assumption that gamers don't care about battery life. WE DO; it just hasn't been possible to have both until now. Now that it is possible, we find out it's simply been omitted. This is not just wishful thinking; this was an oversight, plain and simple.
I think Punkbuster was just an after-the-fact excuse. Someone paying $1500+ for a laptop is not going to care if they have to pay an extra $100 for Optimus, especially given the benefits. The demand for these laptops is already extremely high, but I know the OP, myself, and others have been holding off on a purchase because of this Optimus issue. In reality, we'll probably cave in, get our laptops, and 6 months from now a revision with a redesigned motherboard enabling Optimus will be released for the same price or less than what we paid. Granted, that's a fact of life when it comes to computer purchases, but it's usually because of substantial barriers to production that physically prevent better products from being released. This isn't a die shrink; it's routing video from a discrete GPU to an integrated GPU, something currently being done, but not on high end laptops. It's akin to Nvidia having a GT 425 with 3D Vision support and then releasing a GTX 580 several months later with it left out.
This laptop had the potential to be the best 15 inch laptop in all respects, bar none. An optional 120Hz 1080p panel could have been included; instead it's capped at 1366x768. Same goes for a backlit keyboard. Those are permissible omissions in my book, but in 2011, Optimus is not. I don't think you could find a more general purpose high end laptop on the market until Ivy Bridge. By doing this they would have widened their audience dramatically. Graphics professionals and wealthy individuals who don't necessarily care about gaming but like to have the best would suddenly see this laptop as an attractive option. A <7lb, 3D capable, 4+ hour battery life, comparatively low cost notebook that scores 15k in 3DMark Vantage... that would be unbelievable, and the worst part is there is NO reason why it couldn't have been done already. With Optimus it's not necessarily going to be an "easy" fix, but that's only because of poor design decisions initially.
In the meantime, Clevo is left with a glaring opening for competition from other, larger manufacturers like HP and Dell. All the while, they have vast potential demand that slips away in favor of other products that include these features individually. Not much of an issue right now considering the supply shortage, but once this initial rush is satisfied the potential losses will begin to mount.
************************************
TL;DR, I know, but it's simply bad business on Clevo's part and someone has to show them this. I encourage everyone who would like to see this feature implemented to let your Clevo vendor of choice know. Clevo may be making wide margins on these laptops, but the distributors are making less, so quantity is relatively more important to them. Vendor representatives, take this information to heart and let Clevo know that you care because your customers definitely do. -
I guess one reason why the implementation of Optimus is more expensive in an MXM 3.0b system is that you'd have to change the MXM slot, which destroys the advantage of using MXM in the first place. Apparently it's easier to accomplish if the GPU is mounted directly on the motherboard.
Anyway, Optimus should have been an option for this laptop; it would have been the first true gaming laptop: desktop-like performance, still portable due to comparatively low weight and size, and 5h+ battery life. Awesome! -
Is there any model from any manufacturer which has Optimus with a 460M or better? If this message can help convince one of them, I am waiting for such a laptop to be available and I'm ready to pay a little more for it: gamers also want battery life for internet and work sometimes.
Also, another question: I know there is Switchable Graphics for AMD cards. What's the difference from Optimus? -
Yesterday I found a really good deal on an HP. Specs:
2nd generation Intel(R) Core(TM) i7-2630QM (2 GHz, 6MB L3 Cache) with Turbo Boost up to 2.9 GHz
1GB ATI Mobility Radeon(TM) HD 6570 graphics [HDMI, VGA]
FREE Upgrade to 6GB DDR3 System Memory (2 Dimm)
640GB 7200RPM Hard Drive with HP ProtectSmart Hard Drive Protection
No Additional Office Software
No additional security software
6-Cell Lithium-Ion Battery (standard) - Up to 6.0 hours of battery life +++
15.6" diagonal High Definition HP BrightView LED Display (1366x768)
TouchScreen with HP TouchSmart's intuitive multi-touch applications (includes HP TrueVision Webcam)
SuperMulti 8X DVD+/-R/RW with Double Layer Support
Intel 802.11b/g/n WLAN
Backlit Keyboard with HP SimplePass Fingerprint Reader
$1024 (about $1100 after tax)
The graphics card sucks compared to the 480 AFAIK, but it is better than the 540 (also AFAIK), which supposedly will run StarCraft 2 on Medium quite well. And because the screen's 1366x768 (just remembered that my dad has a 23 inch 1080p monitor that he never uses; I might get it, and if so this point is moot), I might slide by on High. Also, what I really like is the backlit keyboard and touchscreen. The backlit keyboard is something I was really missing on the Sager, but decided that if necessary I could live without it. It's really funny that a gaming laptop company left out a backlit keyboard but this "business" laptop has one. The touchscreen is really nice because, after all, I don't live in a cave and send zerg rushes all day. However, I've read that Apple tried it on their MBPs and said it was "tiring" after a few minutes. Also, the battery life is supposed to be 6 hours on the ATI card, not the integrated Intel graphics.
Comments?
Sorry, this is getting WAY off topic. -
Oh, and in response to your previous question about pulling out a graphics card before a flight: the difficulty is in the way notebooks are generally constructed. A notebook GPU is not usually integrated with its heatsink; it's connected to a separate heatsink that either has its own fan, or shares a fan with the CPU. Additionally, many notebooks will not boot without a graphics card; unless the notebook is designed with (old-style) switchable graphics (and even then only possibly), it simply won't boot if the graphics card is missing. I'm going to link you to a teardown guide that was recently resurrected in the Hardware Forum here. Of particular note are the pictures; now this is a Toshiba, and it's a bit more fiddly than most, but most importantly, look at the last few pictures, where the CPU and graphics chip are exposed. You can see how first getting to the graphics chip is quite a feat, and removing the graphics chip could mean some difficulties in reassembling the notebook back into a working whole. And yes, I know this is a soldered-on GPU, but even with an MXM-based or otherwise removable one, the layout is often similar. Here is an Elitebook 8740w. After you scroll down past the screen and external images, Aikimox has a picture of under the keyboard, showing off the fan in the upper left, the CPU covered by the small heatsink in the lower left, and the GPU covered by the big heatsink to the right of the fan. Now, the 8740w has its GPU mounted in an MXM 3.0B slot, so it does, in fact, have a removable GPU, but again, you can see how removing the GPU in a way that would leave the notebook usable would be difficult (it's also worth noting that Aikimox's model uses a quad core Core i processor, and has no IGP, so even if you did remove the GPU, it wouldn't POST).
Oh, and the HP you mentioned... is that the dv6tqe? The Pavilion is _not_ a business model. It's one of HP's consumer models. HP business models are the Probooks and Elitebooks, as well as a few others. -
-
Why would Nvidia support Optimus on the 485m if it would be bottlenecked by the integrated GPU? I don't think Nvidia would risk an Optimus implementation on the 485m that significantly lowered its performance when in discrete GPU mode. This could potentially hurt 485m sales if uninformed customers saw the resulting poor benchmarks. Unless, of course, Optimus "support" itself is just a pure marketing ploy and there is nothing physically different between Nvidia GPUs with explicit support and those without. This would imply that Nvidia's part in the switching process is done through the drivers alone, which I don't think is the case. Not only that, but if there were a bottleneck issue, it's not going to be solved in the 485m's production lifetime. Ivy Bridge integrated GPUs are not going to be that much faster than Sandy Bridge's. After Ivy Bridge, Nvidia will have a new flagship mobile card on a 28nm process using substantially less power or having substantially more performance at the same TDP. The 485m becomes instantly obsolete; Nvidia had to have known this.
I understand there has to be a physical connection from the card to the integrated GPU and then to a video output. However, given that these cards are MXM 3.0, there is no physical connection to the card besides the MXM 3.0 slot itself, which would then interface with the board, possibly the IGP, and video out. Video output is therefore physically on the board rather than the card, so it's dependent on board design. It's also possible Optimus just can't work through MXM 3.0, but that would imply there are embedded 460 or 485m's, something I haven't seen.
Claiming the 485m (which I assume is MXM 3.0 only) has Optimus support when there is no way to actually implement it, bottleneck or not, is blatant false advertising that wouldn't have slipped past Nvidia's legal team.
I think the primary reason we haven't seen high-end cards with functional Optimus is because it wasn't practical before Sandy Bridge, but that's not to say it wasn't possible. A high-end GPU implies a relatively high-end CPU, often a quad core. In the Core 2 generation, integrated graphics was implemented through an Intel GPU on the motherboard. In the first generation i7, on-die GPUs were included for the first time, but only in dual cores and only for use with certain chipsets. However, ALL Sandy Bridge processors have on-die graphics. Likewise, both the HM65 and HM67 chipsets have native support for the use of these on-die GPUs.
From a design perspective, it wouldn't have been practical to design a board that could take an MXM 3.0 card and have integrated graphics before Sandy Bridge. Optimus itself is only a year old; most designs we're comparing the P150HM to were drawn up before Optimus was even possible. Therefore, there would be absolutely no reason to include support for both because you couldn't easily switch between integrated and discrete. Not only that, having a single laptop model designed for two separate motherboards, one with integrated GPU support and one without (for quad cores), was probably too expensive from a mass-production standpoint. In the past, Optimus has been seen in mid-range laptops with relatively limited CPU options, specifically dual cores that had on-die GPUs. The M11x is a perfect example of this. The original version, released at last year's CES, didn't have Optimus. It was redesigned for it after Optimus's launch in February and released 6 months later.
I do agree that the bottleneck theory is possible, I just don't think it's very likely considering that Nvidia's entire high-end philosophy is to have the best performance at all costs. Optimus-induced bottlenecks would threaten that goal in favor of battery life, something that would be secondary for Nvidia's high-end. My view is that this was a design oversight; the technology was all there - the universality of on-die GPUs and of chipset compatibility. -
I might be wrong, but it seems like the Alienware M17x R3 supports Optimus/PowerXpress. And it has a powerful graphics card. The only issue is that the European prices are plainly ridiculous, as in almost twice as high.
-
Desktop Replacement is the word, right? Or Portable Workstation?
-
Yes, as their primary purpose is gaming, they're designed to be tethered to a plug most of the time. The 2-hour battery life was never intended for gaming.
-
This could actually be a very clever marketing ploy on nVidia's part; if the GTX 485M does bottleneck through the IGP, smart manufacturers won't implement Optimus for it, because it adds to their costs, and there wouldn't be any advantage. Thus nVidia can tout the "ability" for the GTX 485M to support Optimus, knowing full well that manufacturers won't put it in because it'd lose them money with no advantages. And if someone does enable it, and the bottleneck becomes apparent, hey, it's not nVidia's fault that the IGP isn't powerful enough to keep up. All of the advantages, none of the blame!
Oh, and page 3 of that review supports my assertion that most of Optimus is software; the only listed required hardware component is the Optimus Copy Engine. This could simply be something they've added to all future GPU releases, though, whether or not they're "intended" to support Optimus.
Addendum - I can think of at least one very good reason to put the Optimus Copy Engine on all GPUs, even the high-end ones that wouldn't do well with Optimus; the way the Optimus Copy Engine works in divorcing the displaying from the rendering would seem to be perfect for enhancing SLI.
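As a rough sanity check on the bandwidth side, here are my own back-of-envelope numbers (assuming plain, uncompressed frame copies over the PCIe link and ignoring any contention on the IGP's memory controller):

# Rough estimate of the bandwidth needed just to copy finished 1080p
# frames from the discrete GPU into the IGP's framebuffer, compared with
# the theoretical throughput of a PCIe 2.0 x16 link.
width, height = 1920, 1080
bytes_per_pixel = 4            # uncompressed 32-bit RGBA
fps = 60

frame_mb = width * height * bytes_per_pixel / 1e6
copy_gbps = width * height * bytes_per_pixel * fps / 1e9
pcie2_x16_gbps = 8.0           # theoretical PCIe 2.0 x16, GB/s

print(f"Per-frame copy: {frame_mb:.1f} MB")
print(f"At {fps} fps: {copy_gbps:.2f} GB/s "
      f"({copy_gbps / pcie2_x16_gbps:.0%} of PCIe 2.0 x16)")

If those numbers are anywhere near the right ballpark, the raw copy bandwidth isn't the scary part; the real question is whether the IGP's display pipeline and memory can keep up.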
Oh, and I freely admit that a lot of my opinion is predicated on the idea that the Sandy Bridge GPU couldn't display the GTX 485M rendering quickly enough. I wouldn't mind being proven wrong on this front. -
-
Clevo, since it's on their website.
-
Add me to the list of people who really, really wanted Optimus on these models. I would have pulled the trigger a couple of weeks ago on an NP8130 or NP8150 if they'd been designed with it; there would have been literally no downside to owning the machine.
I want something I can game with when traveling & power is available, but that can do 4+ hours on battery for light tasks like web browsing and office apps.
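If any owners want to put up numbers, here's roughly the kind of rundown log I'd trust; a minimal Python sketch, assuming the third-party psutil package (nothing vendor-specific, just my own idea of a test):

# Crude rundown logger: note the charge level once a minute while doing
# light web/office work on battery, then eyeball the drain rate afterwards.
import csv
import time

import psutil  # third-party: pip install psutil

with open("battery_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_min", "percent"])
    start = time.time()
    while True:
        batt = psutil.sensors_battery()
        if batt is None or batt.power_plugged:
            break  # no battery info, or we plugged back in
        writer.writerow([round((time.time() - start) / 60), batt.percent])
        f.flush()
        time.sleep(60)
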
I've been trying to find out what kind of battery life either one of those models can get in the 460m/2630 configuration, at reasonable or full brightness for simple web browsing / office stuff. I haven't seen any tests of that yet (I've been following about 5 different threads), and though I'm sure they're nice machines I'm not going to buy without numbers. We'll see what the other major vendors come up with in the next month or two; the first laptop to pair Optimus with high end graphics is going to sell like crazy. -
-
Clevo IS more portable than Alienware, but it is also weaker (single card vs. SLI). It's part of the reason why I chose Clevo and not Alienware. At the time when I got my laptop, I could have bought a more powerful Alienware for the same price, but size, weight, looks and battery life made me go for Clevo.