The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Enable Optimus

    Discussion in 'Sager and Clevo' started by pianoplayer, Jan 26, 2011.

  1. pianoplayer

    pianoplayer Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    Hi. I've pretty much decided on getting the Sager NP8130, base configuration (specs further down). Problem is, I really want longer battery life than just a couple of hours or so (my current 4-year-old Toshiba gets 2 hours, which I have found to be not enough for my needs). The NP5160 has Optimus (so probably a max battery life of 5 or 6 hours on integrated graphics), but then I have to live with the GT 540M, which is WAY too weak for me. The GTX 460M is listed as Optimus-capable on Nvidia's website, but the laptop itself doesn't mention it at all, and the owner's lounge here says the integrated graphics are disabled. Is there any way that I could manually re-enable the integrated graphics and/or Optimus? I wouldn't even mind having to reboot every time I wanted to switch.

    And if I really wanted to, I could just take out the GTX 460M, right? For example, I go to China every summer to visit relatives. The plane ride is VERY long and my laptop dies before the ride is halfway over. If this is possible, I would take out the GTX 460M, wrap it in bubble wrap, and my laptop would run off the Intel graphics and have loooong battery life... right?

    Display: 15.6" 1920 x 1080 LED Backlit Matte Display
    Processor: i7-2630QM, 6MB L3 Cache, 2.0GHz
    Memory: 8GB, 1333MHz DDR3
    Graphics Card: NVIDIA® GeForce® GTX 460M 1.5GB GDDR5
    Hard Drive: 500GB 7200rpm SATA II
    Optical Drive: 8X Multi DVD+/-R/RW RAM Dual-Layer Drive
    Wireless: Internal 802.11B/G/N LAN and Bluetooth Card
    Cooling: Stock OEM Thermal Compound, CPU & GPU (I plan to take it apart and put some AC5 in there)
    $1163 after Cash Discount
     
  2. Gracy123

    Gracy123 Agrees to disagree

    Reputations:
    277
    Messages:
    2,080
    Likes Received:
    7
    Trophy Points:
    56
    Most likely there isn't any way you can do that! It goes way deeper than just Optimus, which is only software.

    The BIOS as well as the hardware have to support it.

    Forget about it.
     
  3. othonda

    othonda Notebook Deity

    Reputations:
    717
    Messages:
    798
    Likes Received:
    15
    Trophy Points:
    31
    You really should post this in the Sager forum.

    Anyway,

    The 5160 has its GPU soldered onto the board, so you're stuck with the card the laptop ships with. Optimus is enabled on that laptop, so you can switch between the onboard IGP and the discrete NVIDIA GT 540M.

    As far as the 8130 goes, it is not Optimus-enabled, and there is nothing you can do in the BIOS or in software to change that. It is a hardware configuration issue.
     
  4. Gracy123

    Gracy123 Agrees to disagree

    Reputations:
    277
    Messages:
    2,080
    Likes Received:
    7
    Trophy Points:
    56
    The point is: if it is not enabled by the manufacturer, there is nearly no chance of enabling it yourself, as you need full access to the BIOS as well as, possibly, hardware that simply isn't built in!

    I asked myself the very same question about my notebook, which runs a dedicated NVIDIA GPU but also has an integrated (Arrandale) GPU. Turned out there is nothing I can do. The first step would be hacking the BIOS to gain full access, which is nearly impossible, risky and voids the warranty. Hybrid graphics would have been nice, but I don't want it THAT bad.
     
  5. MidnightSun

    MidnightSun Emodicon

    Reputations:
    6,668
    Messages:
    8,224
    Likes Received:
    231
    Trophy Points:
    231
    Thread moved to Sager/Clevo subforum. You'll probably get more model-specific advice there.
     
  6. pianoplayer

    pianoplayer Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    Thanks.
    That sounds, well, nasty. I don't want it THAT bad either :p I thought it was like SLI, which I read could be enabled through a few hacked drivers. Maybe I'm wrong about that too :eek:

    Though for long periods of time when I need battery life, not performance, if the GPU's not soldered in can I just take it out? Or will that cause crashes / failing to boot / 15 inch brick?
     
  7. Daniel Hahn

    Daniel Hahn Notebook Evangelist

    Reputations:
    146
    Messages:
    664
    Likes Received:
    0
    Trophy Points:
    30
    Optimus has to be supported by the motherboard; there is no way you can enable it if Clevo did not implement it on this model's mainboard. Just forget about it.

    If you need longer battery life, buy one or two spare batteries. But for the same price you can just buy a netbook, which gives you about 10 hours of battery life, more than enough for all your needs ;)
     
  8. pianoplayer

    pianoplayer Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    Spare batteries are expensive, about $140 AFAIK. No netbook for me, because said needs include Starcraft 2. ;)
     
  9. mobiousblack

    mobiousblack Notebook Deity

    Reputations:
    387
    Messages:
    760
    Likes Received:
    93
    Trophy Points:
    41
    What about those new ones that are supposed to come out with Optimus enabled? They won't be as beefy as the 8130, though.
     
  10. Windkull

    Windkull Notebook Evangelist

    Reputations:
    21
    Messages:
    447
    Likes Received:
    0
    Trophy Points:
    30
    Every international plane ride I've been on in the past 3-4 years has had plugs by the seats even in coach... You should just change your airline!

    On a side note, the 5160 is more than powerful enough to run SC2 on Medium, which is all you need; even the Sandy Bridge integrated graphics can run SC2 on Low... and that gets you a lot more battery life too...

    I'm having a very similar debate to yours between the 5160 and the 8150, though, so I know where you're coming from. What I really want is a solid quote on the price of the AMD 6970...
     
  11. Daniel Hahn

    Daniel Hahn Notebook Evangelist

    Reputations:
    146
    Messages:
    664
    Likes Received:
    0
    Trophy Points:
    30
    Netbook in addition to your laptop, not instead. SC2 has nothing to do with it; you will not get any decent battery life out of any laptop while playing SC2 or any other game. In general you should not do any intensive gaming on battery...
     
  12. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    That's rubbish. I used to play on my old Dell E1705 while travelling. Sure, it would only get me around 2 hours of gameplay, but that's certainly enough for most European flights.

    And I am not talking vanilla stuff; I am talking, for instance, Tiberium Wars or other such games. Gaming on your laptop while on battery is why they are called gaming laptops.

    I also played on my current laptop, but mostly older stuff. The last game I've played on battery was Darksiders. But I only get around an hour of gameplay with my current laptop.
     
  13. Judicator

    Judicator Judged and found wanting.

    Reputations:
    1,098
    Messages:
    2,594
    Likes Received:
    19
    Trophy Points:
    56
    As stated, Optimus requires motherboard support; specifically, the way Optimus works is that all of the discrete card's output is funneled into the IGP, which then displays what the discrete card tells it to. This means that the discrete card is not connected directly to the screen at all, and this is why it has to be done at the motherboard level; if the discrete card were connected "normally", with its own mux to the screen, Optimus couldn't work because there would be no way to channel the discrete card's output through the integrated graphics. And, of course, on top of this you need the appropriate drivers and BIOS support to let Optimus work.

    As for enabling SLI, it really depends on what you mean by that. After all, since notebooks are generally such closed systems in the first place, if a notebook is designed to be able to take 2 discrete graphics cards, it's designed to have SLI enabled; there's not much point in a notebook that can have 2 discrete graphics cards that doesn't let them both work.

    And no, removing the GPU is usually a "big job", not something you can do on the fly (it's about the same level as replacing the CPU). It is theoretically possible that, if your notebook is designed with switchable graphics, you could remove the GPU and run it off the integrated graphics, but replacing/removing the GPU is generally going to be a half-hour to an hour job minimum, mostly thanks to the way heatsinks and GPUs are constructed in notebooks. It's not "plug and play" (well, unless we're talking a ViDock or something similar).
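
    For what it's worth, a quick way to check whether the IGP is even exposed to Windows on one of these machines is to list the display adapters the OS reports. A minimal sketch, assuming Python is installed and using the stock wmic tool that ships with Windows 7 (the adapter-name matching is illustrative only):

        import subprocess

        # List the display adapters Windows reports, to see whether an Intel
        # IGP shows up alongside the NVIDIA card at all. Uses the standard
        # WMI class Win32_VideoController via the built-in "wmic" tool.
        output = subprocess.check_output(
            ["wmic", "path", "Win32_VideoController", "get", "Name"],
            universal_newlines=True,
        )

        adapters = [line.strip() for line in output.splitlines()
                    if line.strip() and line.strip() != "Name"]

        print("Display adapters reported by Windows:")
        for name in adapters:
            print(" - " + name)

        # If no Intel adapter is listed, the IGP is disabled or not wired out,
        # and there is nothing for Optimus to switch to on this board.
        if not any("Intel" in name for name in adapters):
            print("No Intel IGP visible to the OS.")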
     
  14. pianoplayer

    pianoplayer Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    Those are the 5160, as I mentioned.
    Wow, I've always had the idea that Intel graphics REALLY suck. I thought they would sputter and die in Starcraft. That means Intel has gotten much better at making graphics, as my Toshiba from a few years ago stutters when running Halo 1, from 2001.
    What airline do you use? Mine requires adapters AND first class for electricity.
    I meant removing it at home, using it in-flight, and then putting it back upon arrival. It shouldn't be much harder than changing desktop GPUs, right? Because that takes about 10 seconds.
    Well, you fly within the same continent. I fly across one that's bigger than two Europes, and an ocean that's even bigger.
    Not intending to belittle Europe though. (edited since it sounded snobby)
    I think you misunderstood me. I don't need long gaming battery life; I need things to look good while gaming in general, and still have non-crap battery life doing word processing / internet. If I had to sacrifice one, I would probably sacrifice the second, because I have an iPad (probably going to sell that for a Xoom, but that's another story) and I could do work on that. A netbook would make me carry another computer with me.
     
  15. PCchief

    PCchief Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    Even if Optimus can't be enabled on the P150/P151, a variety of threads/posts/emails like this are very important. The only way Sager and company are going to request that Clevo implement Optimus in a future model is if they know there is demand for it. Clamoring for Optimus support, no matter how futile in the current generation, is evidence of that demand. Personally, I think it's ridiculous it wasn't implemented from the beginning; that's why this topic keeps popping up everywhere. Intel's Sandy Bridge processors, chipsets, and Nvidia's high-end GPUs could theoretically work together on a properly designed motherboard. The big players have done their part; therefore the responsibility rests solely with Clevo. Rationally, the only reason they wouldn't is if the return on such a development wouldn't justify the costs.


    The fact that Optimus has been enabled on countless mid-range laptops without full-blown MXM 3.0b slots but with the same chipsets and processors proves that it can be done at relatively low cost and that there is demand for it in that segment. If the GTX 485M, MXM 3.0b by design, supports Optimus, then there's no reason why it can't be done on an MXM 3.0b board. The only question left is whether there is demand in the high-end/gaming segment. Why would there not be demand in this market? I'm not going to dig through the owners' thread to find the post, but a representative claimed they did not implement it because of Punkbuster problems. For a hardware company to downgrade the potential of its products to meet the demands of a generally disliked and decreasingly important program is pretty short-sighted IMO. If enough newer high-end gaming laptops had Optimus support, Even Balance would be forced to patch their software or face fallout with game publishers. Software should never cause retrogression in hardware; if anything it should push improvement in hardware. In this case we have software allegedly causing the intentional omission of a legitimate hardware advance.


    There also seems to be this implicit assumption that gamers don't care about battery life. WE DO; it just hasn't been possible to have both until now. Now that it is possible, we find out it's simply been omitted. This is not just wishful thinking; this was an oversight, plain and simple.


    I think Punkbuster was just an after-the-fact excuse. Someone paying $1500+ for a laptop is not going to care if they have to pay an extra $100 for Optimus, especially given the benefits. The demand for these laptops is already extremely high, but I know the OP, myself, and others have been holding off on a purchase because of this Optimus issue. In reality, we'll probably cave in, get our laptops, and 6 months from now a revision with a redesigned motherboard enabling Optimus will be released for the same price or less than what we paid. Granted, that's a fact of life when it comes to computer purchases, but it's usually because of substantial barriers to production that physically prevent better products from being released. This isn't a die shrink; it's routing video from a discrete GPU to an integrated GPU, something currently being done, but not on high-end laptops. It's akin to Nvidia having a GT 425 with 3D Vision support and then releasing a GTX 580 several months later with it left out.


    This laptop had the potential to be the best 15-inch laptop in all respects, bar none. An optional 120Hz 1080p panel could have been included; instead it's capped at 1366x768. Same goes for a backlit keyboard. Those are permissible omissions in my book, but in 2011, Optimus is not. I don't think you could find a more general-purpose high-end laptop on the market until Ivy Bridge. By doing this they would have widened their audience dramatically. Graphics professionals and wealthy individuals who don't necessarily care about gaming but like to have the best would suddenly see this laptop as an attractive option. A <7 lb, 3D-capable, 4+ hour battery life, comparatively low-cost notebook that scores 15k in 3DMark Vantage... that would be unbelievable, and the worst part is there is NO reason why it couldn't have been done already. With Optimus it's not necessarily going to be an "easy" fix, but that's only because of poor design decisions initially.


    In the meantime, Clevo is left with a glaring opening for competition from other, larger manufacturers like HP and Dell. All the while, they have vast potential demand that slips away in favor of other products that include these features individually. Not much of an issue right now considering the supply shortage, but once this initial rush is satisfied the potential losses will begin to mount.


    ************************************


    TL;DR, I know, but it's simply bad business on Clevo's part and someone has to show them this. I encourage everyone who would like to see this feature implemented to let your Clevo vendor of choice know. Clevo may be making wide margins on these laptops, but the distributors are making less, so quantity is relatively more important to them. Vendor representatives, take this information to heart and let Clevo know that you care because your customers definitely do.
     
  16. Daniel Hahn

    Daniel Hahn Notebook Evangelist

    Reputations:
    146
    Messages:
    664
    Likes Received:
    0
    Trophy Points:
    30
    OK, I'll admit, the statement was too general, but think about it: your laptop has a SB quad core and the GTX 485M; Optimus or not, if you fire up a game Optimus will switch to the dedicated GTX 485M, which in any newer 3D game will drain your battery quickly. The whole idea of Optimus is driver-based GPU switching; therefore there is no need to implement any hardware switching, probably not even the option for manual software switching. So even Optimus would buy you no additional battery life if you play newer games on it.

    I guess one reason why the implementation of Optimus is more expensive in an MXM 3.0b system is that you'd have to change the MXM slot, which destroys the advantage of using MXM in the first place. Apparently it's easier to accomplish if the GPU is mounted directly on the motherboard.

    Anyway, Optimus should have been an option for this laptop; it would have been the first true gaming laptop: desktop-like performance, still portable due to comparatively low weight and size, and 5h+ battery life. Awesome!
     
  17. Kalimdra

    Kalimdra Newbie

    Reputations:
    0
    Messages:
    8
    Likes Received:
    0
    Trophy Points:
    5
    Well, that's exactly what I am looking for. I intend to buy a new laptop in the next month, and I don't want to sacrifice battery life for gaming performance. When Optimus was revealed it was the perfect solution, but why do they only put this on low-performance notebooks? I mean, those notebooks already have quite good battery life (even if more is always good). The GPUs that would profit the most from this technology are the high-performance ones (GTX 460M and higher), but no manufacturer offers this even though it is possible. Maybe there is some limitation I don't know about?

    Is there any model from any manufacturer which has Optimus with a 460M or better? If this message can help convince one of them, I am waiting for such a laptop to be available and I'm ready to pay a little more for it: gamers also want battery life for internet and work sometimes.


    Also, another question: I know Switchable Graphics exists for AMD cards. What's the difference compared to Optimus?
     
  18. pianoplayer

    pianoplayer Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    Absolutely right. I don't get why anyone would do this. Unless there is no motherboard that supports the GTX 480 AND Optimus (which I really doubt), they have pretty much no excuse to do this. I wouldn't mind it if they called the Optimus version something like NP8130-O, but just not having Optimus (or having it with a bad graphics card) is just unforgivable.

    Yesterday I found a really good deal on an HP. Specs:

    2nd generation Intel(R) Core(TM) i7-2630QM (2 GHz, 6MB L3 Cache) with Turbo Boost up to 2.9 GHz
    1GB ATI Mobility Radeon(TM) HD 6570 graphics [HDMI, VGA]
    FREE Upgrade to 6GB DDR3 System Memory (2 Dimm)
    640GB 7200RPM Hard Drive with HP ProtectSmart Hard Drive Protection
    No Additional Office Software
    No additional security software
    6-Cell Lithium-Ion Battery (standard) - Up to 6.0 hours of battery life +++
    15.6" diagonal High Definition HP BrightView LED Display (1366x768)
    TouchScreen with HP TouchSmart's intuitive multi-touch applications (includes HP TrueVision Webcam)
    SuperMulti 8X DVD+/-R/RW with Double Layer Support
    Intel 802.11b/g/n WLAN
    Backlit Keyboard with HP SimplePass Fingerprint Reader
    $1024 (about $1100 after tax)

    The graphics card sucks compared to the 480 AFAIK, but it is better than the 540 (also AFAIK), which supposedly will run Starcraft 2 on Medium quite well. And because the screen's 1366x768 (just remembered that my dad has a 23-inch 1080p monitor that he never uses; I might get it, and if so, this point is null) I might slide by on High. Also, what I really like is the backlit keyboard and touchscreen. The backlit keyboard is something that I was also really missing on the Sager, but decided that if necessary I could leave out. It's really funny that a gaming laptop company left out a backlit keyboard but this "business" laptop had one. The touchscreen is really nice because, after all, I don't live in a cave and send zerg rushes all day. However, I've read that Apple tried it on their MBPs and said it was "tiring" after a few minutes. Also, the battery life is supposed to be 6 hours on the ATI, not the integrated Intel graphics.

    Comments?

    Sorry, this is getting WAY off topic.
     
  19. Judicator

    Judicator Judged and found wanting.

    Reputations:
    1,098
    Messages:
    2,594
    Likes Received:
    19
    Trophy Points:
    56
    As I pointed out above in my post, I don't think we've _ever_ seen Optimus with a high-end discrete card; only mid-range. I suspect (no proof) that it may be a limitation of the IGPs to date; they can only take so much data from the discrete card, and would bottleneck anything higher than a mid-range card (as stated above, this is speculation; no proof). Previous iterations of switchable graphics (nVidia or AMD) involved, as I stated above, two different sets of muxes (multiplexers), one for the IGP and one for the discrete card; there would be a software or hardware switch that you would hit to change from one to the other (pretty much always involving a reboot). Optimus was revolutionary in that it was the first solution that let you change between IGP and discrete card "on the fly", without needing a reboot after each change. This is part of why the new AMD "Dynamic Switchable Graphics Technology" (which I don't think is actually out yet) holds so much promise; it offers the possibility of on-the-fly switching like Optimus does, for AMD users.

    Oh, and in response to your previous question about pulling out a graphics card before a flight: the difficulty there lies in the way notebooks are generally constructed. A notebook GPU is not usually integrated with its heatsink; it's connected to a separate heatsink that either has its own fan, or shares a fan with the CPU. Additionally, many notebooks will not boot without a graphics card; unless the notebook is designed with (old-style) switchable graphics (and even then only possibly), it simply won't boot if the graphics card is missing. I'm going to link you to a teardown guide that was recently resurrected in the Hardware Forum here. Of particular note are the pictures; now this is a Toshiba, and it's a bit more fiddly than most, but most importantly, look at the last few pictures, where the CPU and graphics chip are exposed. You can see how just getting to the graphics chip is quite a feat, and removing the graphics chip could mean some difficulties in reassembling the notebook back into a working whole. And yes, I know this is a soldered-on GPU, but even with an MXM-based or otherwise removable one, the layout is often similar. Here is an Elitebook 8740w. After you scroll down past the screen and external images, Aikimox has a picture of under the keyboard, showing the fan in the upper left, the CPU covered by the small heatsink in the lower left, and the GPU covered by the big heatsink to the right of the fan. Now, the 8740w has its GPU mounted in an MXM 3.0B slot, so it does, in fact, have a removable GPU, but again, you can see how removing the GPU in a way that would leave the notebook usable would be difficult (it's also worth noting that Aikimox's model uses a quad-core Core i processor and has no IGP, so even if you did remove the GPU, it wouldn't POST).

    Oh, and the HP you mentioned... is that the dv6tqe? The Pavilion is _not_ a business model. It's one of HP's consumer models. HP business models are the Probooks and Elitebooks, as well as a few others.
     
  20. pianoplayer

    pianoplayer Notebook Enthusiast

    Reputations:
    0
    Messages:
    15
    Likes Received:
    0
    Trophy Points:
    5
    Yes, it is the dv6tqe. Funny, somewhere on the website it said that it was a business model. Thought that was weird too. Should I buy it, or wait for another month or so?
     
  21. PCchief

    PCchief Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5

    Why would Nvidia support Optimus on the 485M if it would be bottlenecked by the integrated GPU? I don't think Nvidia would risk an Optimus implementation on the 485M that significantly lowered its performance when in discrete GPU mode. This could potentially hurt 485M sales if uninformed customers saw the resulting poor benchmarks. Unless, of course, Optimus "support" itself is just a pure marketing ploy and there is nothing physically different between Nvidia GPUs with explicit support and those without. This would imply that Nvidia's part in the switching process is done through the drivers alone, which I don't think is the case. Not only that, but if there was a bottleneck issue, it's not going to be solved in the 485M's production lifetime. Ivy Bridge integrated GPUs are not going to be that much faster than Sandy Bridge's. After Ivy Bridge, Nvidia will have a new flagship mobile card on a 28nm process using substantially less power or having substantially more performance at the same TDP. The 485M becomes instantly obsolete; Nvidia had to have known this.

    I understand there has to be a physical connection from the card to the integrated GPU and then to a video output. However, given that these cards are MXM 3.0, there is no physical connection to the card itself besides the MXM 3.0 slot, which then interfaces with the board, possibly the IGP, and video out. Video output is therefore physically on the board rather than the card, so it's dependent on board design. It's also possible Optimus just can't work through MXM 3.0, but that would imply there are embedded (non-MXM) 460Ms or 485Ms, something I haven't seen.

    Claiming the 485m (which I assume is MXM 3.0 only) has Optimus support when there is no way to actually implement it, bottleneck or not, is blatant false advertising that wouldn't have slipped past Nvidia's legal team.

    I think the primary reason we haven't seen high-end cards with functional Optimus is because it wasn't practical before Sandy Bridge, but that's not to say it wasn't possible. A high-end GPU implies a relatively high-end CPU, often a quad core. In the Core 2 generation, integrated graphics was implemented through an Intel GPU on the motherboard. In the first generation i7, on-die GPUs were included for the first time, but only in dual cores and only for use with certain chipsets. However, ALL Sandy Bridge processors have on-die graphics. Likewise, both the HM65 and HM67 chipsets have native support for the use of these on-die GPUs.

    From a design perspective, it wouldn't have been practical to design a board that could take an MXM 3.0 card and have integrated graphics before Sandy Bridge. Optimus itself is only a year old; most designs we're comparing the P150HM to were drawn up before Optimus was even possible. Therefore, there would be absolutely no reason to include support for both, because you couldn't easily switch between integrated and discrete. Not only that, having a single laptop model designed for two separate motherboards, one with integrated GPU support and one without (for quad cores), was probably too expensive from a mass-production standpoint. In the past, Optimus has been seen in mid-range laptops with relatively limited CPU options, specifically dual cores that had on-die GPUs. The M11x is a perfect example of this. The original version, released at last year's CES, didn't have Optimus. It was redesigned for it after Optimus's launch in February and released 6 months later.

    I do agree that the bottleneck theory is possible; I just don't think it's very likely, considering that Nvidia's entire high-end philosophy is to have the best performance at all costs. Optimus-induced bottlenecks would threaten that goal in favor of battery life, something that would be secondary for Nvidia's high end. My view is that this was a design oversight; the technology was all there: the universality of on-die GPUs and of chipset compatibility.
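
    Just to put some rough numbers on the bottleneck question (my own back-of-the-envelope arithmetic, not anything from Nvidia): as I understand it, Optimus copies finished frames from the discrete card into system memory for the IGP to display, so the interesting figure is how much data that copy actually is at 1080p/60. A quick calculation:

        # Rough arithmetic only: how much data does Optimus have to move per
        # second to push finished 1080p frames to the IGP at 60 fps?
        width, height = 1920, 1080        # panel resolution
        bytes_per_pixel = 4               # 32-bit RGBA framebuffer (assumption)
        fps = 60                          # panel refresh rate

        frame_mb = width * height * bytes_per_pixel / 1e6
        copy_gb_per_s = width * height * bytes_per_pixel * fps / 1e9
        pcie2_x16_gb_per_s = 8.0          # theoretical PCIe 2.0 x16, one direction

        print("Per-frame copy:        %.1f MB" % frame_mb)
        print("Copy traffic at 60fps: %.2f GB/s" % copy_gb_per_s)
        print("Share of PCIe 2.0 x16: %.0f%%" % (100 * copy_gb_per_s / pcie2_x16_gb_per_s))

    On those numbers (about 8 MB per frame, roughly 0.5 GB/s, a few percent of the PCIe 2.0 x16 link), raw frame-copy bandwidth alone doesn't obviously explain leaving Optimus off high-end boards; whether the IGP's display pipeline itself can keep up is a separate question.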
     
  22. belegdol

    belegdol Notebook Geek

    Reputations:
    4
    Messages:
    87
    Likes Received:
    5
    Trophy Points:
    16
    I might be wrong, but it seems like the Alienware M17x R3 supports Optimus/PowerXpress. And it has a powerful graphics card. The only issue is that the European prices are plainly ridiculous, as in almost twice as high.
     
  23. JasonNH

    JasonNH Notebook Evangelist

    Reputations:
    45
    Messages:
    319
    Likes Received:
    14
    Trophy Points:
    31
    I suppose it depends on what you define as powerful, but it is only available with the 460M, which is significantly behind the power of the 485M. As for AMD, the 6870 supports PowerXpress, and it seems we will find out soon if the 6970 does as well.
     
  24. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    High-end Clevos aren't classified as gaming laptops though.
     
  25. Patrck_744

    Patrck_744 Burgers!

    Reputations:
    447
    Messages:
    1,201
    Likes Received:
    41
    Trophy Points:
    66
    "Desktop replacement" is the word, right? Or "portable workstation"?
     
  26. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Yes; since their primary purpose is gaming, they're designed to be tethered to a plug most of the time. The 2-hour battery life was not intended for gaming.
     
  27. John@XoticPC

    John@XoticPC Notebook Enthusiast

    Reputations:
    0
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    I say "Desknote" for short.
     
  28. Sirisian

    Sirisian Notebook Enthusiast

    Reputations:
    0
    Messages:
    36
    Likes Received:
    0
    Trophy Points:
    15
    "Dragtop" is what my brother calls them. :D
     
  29. Judicator

    Judicator Judged and found wanting.

    Reputations:
    1,098
    Messages:
    2,594
    Likes Received:
    19
    Trophy Points:
    56
    I think this is the fundamental core of our disagreement. I don't think there _is_ a significant physical difference between nVidia GPUs with explicit Optimus support and those without. From my understanding of Optimus, apart from the Optimus Copy Engine, the differences are at the motherboard level (where they feed the data from the card to the integrated graphics), and in the drivers. So yes, I think that nVidia's part in the process is primarily only through drivers, apart from writing the specs on how to wire the motherboard linkup between the discrete card and the IGP. So basically (IMO), any driver that has support for Optimus and support for a given graphics card, given the motherboard connection, will work, even if it'll bottleneck.

    This could actually be a very clever marketing ploy on nVidia's part; if the GTX 485M does bottleneck through the IGP, smart manufacturers won't implement Optimus for it, because it adds to their costs, and there wouldn't be any advantage. Thus nVidia can tout the "ability" for the GTX 485M to support Optimus, knowing full well that manufacturers won't put it in because it'd lose them money with no advantages. And if someone does enable it, and the bottleneck becomes apparent, hey, it's not nVidia's fault that the IGP isn't powerful enough to keep up. All of the advantages, none of the blame!

    I think it's quite possible to implement Optimus through a MXM 3.0 slot. See above for my thinking as to why it's not done. And since I think it is possible to implement it (just not practical), it's not false advertising. It's just "useless" advertising, so to speak. It's like, oh, saying that a jet airplane has the fastest ground speed of any vehicle on 3 wheels... it may be totally true, but also completely beside the point (since the point of the plane is to fly, not taxi).

    The M17x R1 had integrated graphics and MXM 3.0B graphics cards. Not that Optimus existed at the time, but switchable graphics were quite possible before Optimus; just not common, due to the costs involved in the engineering (dual muxes, BIOS and driver support, etc.). And two motherboards for (essentially) the same model have been done numerous times before; there are many C2D designs that come in both integrated-GPU and discrete-card varieties (mostly older designs, admittedly). The original M11x didn't have a Core i CPU; it was a ULV C2D, and it _did_ come with switchable graphics (of the dual-mux variety), just not Optimus. Optimus was, by the way, implementable before Arrandale; the notebook cited in Anandtech's review here is a ULV C2D with the GeForce 210M operating through an Intel GMA 4500MHD IGP.

    Oh, and page 3 of that review supports my assertion that most of Optimus is software; the only listed required hardware component is the Optimus Copy Engine. This could simply be something they've added to all future GPU releases, though, whether or not they're "intended" to support Optimus.

    Addendum - I can think of at least one very good reason to put the Optimus Copy Engine on all GPUs, even the high-end ones that wouldn't do well with Optimus; the way the Optimus Copy Engine works in divorcing the displaying from the rendering would seem to be perfect for enhancing SLI.

    Oh, and I freely admit that a lot of my opinion is predicated on the idea that the Sandy Bridge GPU couldn't display the GTX 485M rendering quickly enough. I wouldn't mind being proven wrong on this front.
     
  30. PCchief

    PCchief Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    Fair enough. Hopefully someone will get Lucid's QuickSync discrete GPU workaround up and running on a desktop H67 and see how it affects a high-end GPU.

    True, "useless" advertising wouldn't surprise me.

    I understand switchable graphics was possible before Optimus; I never disputed that in your first post, I just said it wouldn't have been easy to switch without it. When I said one design for an integrated GPU and one without, I meant along with an MXM 3.0 slot. I'm not referring to laptops in general here. Optimus on quad-core systems was impossible before Sandy Bridge. In order to have Optimus available with a high-end card you have to have two designs, because otherwise the motherboard routing doesn't work with the quad core, since there's no IGP. Technically it could be done by regressing to a Core 2 Quad with an off-die IGP, but that's pretty counterproductive from a gaming standpoint. Having two boards is much more practical for a larger audience where one customer might not care about discrete graphics. The XPS advertises Optimus support, but if you select a quad core it forces you to select a different Nvidia GPU "without" Optimus, presumably embedded on a different motherboard. For boutique gaming/workstation laptop manufacturers it wouldn't have been cost-effective to offer two different motherboards for such a narrow audience with both variants wanting a discrete GPU, as opposed to laptops like the XPS where the motherboard feature set is much less specialized. As I said, my remarks were really geared towards laptops typically compared with the P150HM, namely the W860CU and M15x. These primary competitors were also designed for Clarksfield before Arrandale was available, pretty similar to the quad-core-only state of Sandy Bridge right now, the difference being that this time around the quad cores have IGPs.

    All I said was that originally it didn't have Optimus and then it was redesigned for Optimus with the on-die IGP in the i5/i7. I never said it didn't have switchable graphics or that it had Core i CPUs at launch.

    I didn't mean to suggest Optimus was only possible on Arrandale, but that it limited CPU selection significantly. "On-die" encompasses Sandy Bridge, Arrandale and Atom processors. ULV Core 2s are the exception that fit in an awkward position between dual-core Atoms and ULV i5/i7s. Generally though, there would be no reason to implement Optimus on the older Core 2 platform unless it was ULV-based, because in typical Intel fashion they replaced the entire price spectrum, not to mention the die shrink. Now that ULV i7s are out there's even less reason to use the ULV Core 2s except for the low end.

    Seems fairly likely now. I also reviewed the Nvidia Optimus whitepaper and it's pretty vague regarding the Copy Engine, spending only about 1 of its 21 pages on Nvidia-specific hardware.
     
  31. oan001

    oan001 Notebook Evangelist

    Reputations:
    256
    Messages:
    482
    Likes Received:
    0
    Trophy Points:
    30
    Out of curiosity, who makes these classifications?
     
  32. Patrck_744

    Patrck_744 Burgers!

    Reputations:
    447
    Messages:
    1,201
    Likes Received:
    41
    Trophy Points:
    66
    Clevo, since it's on their website.
     
  33. ckevin

    ckevin Notebook Enthusiast

    Reputations:
    0
    Messages:
    11
    Likes Received:
    0
    Trophy Points:
    5
    Add me to the list of people who really, really wanted Optimus on these models. I would have pulled the trigger a couple of weeks ago on an NP8130 or NP8150 if they'd been designed with it; there would have been literally no downside to owning the machine.

    I want something I can game with when traveling & power is available, but that can do 4+ hours on battery for light tasks like web browsing and office apps.

    I've been trying to find out what kind of battery life either one of those models can get in the 460M/2630 configuration, at reasonable or full brightness, for simple web browsing / office stuff. I haven't seen any tests of that yet (I've been following about 5 different threads), and though I'm sure they're nice machines, I'm not going to buy without numbers. We'll see what the other major vendors come up with in the next month or two; the first laptop to pair Optimus with high-end graphics is going to sell like crazy.
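
    Until someone posts real numbers, here's roughly how such a test could be run at home: unplug, do the light web/office workload, and log the battery percentage over time. A minimal sketch, assuming Python and the stock wmic tool on Windows; the property used is the standard WMI Win32_Battery EstimatedChargeRemaining:

        import subprocess
        import time

        def battery_percent():
            # Query the standard WMI class Win32_Battery for the remaining charge.
            out = subprocess.check_output(
                ["wmic", "path", "Win32_Battery", "get", "EstimatedChargeRemaining"],
                universal_newlines=True,
            )
            digits = [tok for tok in out.split() if tok.isdigit()]
            return int(digits[0]) if digits else None

        start = time.time()
        while True:
            pct = battery_percent()
            print("%6.1f min   %s%%" % ((time.time() - start) / 60.0, pct))
            if pct is not None and pct <= 5:
                break                  # stop before the machine shuts down
            time.sleep(300)            # sample every 5 minutes

    Extrapolating the slope of that log (percent per minute) gives a usable runtime estimate for the specific brightness and workload without having to drain the battery to zero.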
     
  34. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    The new M17x has switchable gfx.
     
  35. PCchief

    PCchief Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    I saw claims of 5 hrs with the 9-cell battery. My only problem with the M17x is the weight; at almost 12 lbs it's a good 5 lbs heavier than the P150HM.
     
  36. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    Clevo IS more portable than Alienware, but it is also weaker (single card vs. SLI). It's part of the reason why I chose Clevo and not Alienware. At the time when I got my laptop, I could have bought a more powerful Alienware for the same price, but size, weight, looks and battery life made me go for Clevo.