The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Clevo P170HM3

    Discussion in 'Sager and Clevo' started by hizzaah, Apr 5, 2011.

  1. evangelionpunk

    evangelionpunk Notebook Geek

    Reputations:
    10
    Messages:
    86
    Likes Received:
    30
    Trophy Points:
    26

    Sure thing.

    P170HM3
    - Intel Core i7-2720QM
    - Dual Momentus XTs in RAID 0, i.e. 2x 500GB (simply because the newest Vertex 3/M4/Corsair drives etc. are in short supply and quite expensive, and this setup gives excellent bang-for-buck performance)
    - 16GB Kingston HyperX 1600MHz
    - Blu-ray burner
    - Intel 6230 WiFi
    - GTX 485M

    That about covers it, I think...

    In case anyone is wondering, the 3D emitter is actually BUILT INTO the notebook. Also, this model has three or so more speakers built in than the standard P170HM.

    These are high-quality screens, and considering the cost of the screen upgrade on the standard P170HM, you may as well get the 3D model, which comes with the better, higher-quality 120Hz panel for slightly more dough. (At least then it's 3D, and you get the glasses and the extra speakers too.)
     
  2. Red Line

    Red Line Notebook Deity

    Reputations:
    1,109
    Messages:
    1,289
    Likes Received:
    141
    Trophy Points:
    81
    How's the screen? Matte and 120Hz should really make a difference, pal. I've never used a laptop with a 3D screen before, only a desktop LG W2363D... but it's such a pain going back to 60Hz monitors. Also, do you have any info on your 3D glasses, are they from a newer revision with 60-hour playback? Could you post a short video or a few pictures of your system? Nice build, BTW.
     
  3. evangelionpunk

    evangelionpunk Notebook Geek

    Reputations:
    10
    Messages:
    86
    Likes Received:
    30
    Trophy Points:
    26
    The screen is fantastic.

    It's full HD, matte, 120Hz 3D and identical quality-wise to those upgrade panels. That's why I think it's a shame to just upgrade your screen when you could get a much better deal by getting the 3D model for a little more.

    I don't really know which revision these glasses are, but I'd assume they're the newest out? I mean, the HM3 is barely out anywhere and this is built in, so I'd hope it's the latest, lol.
    I have used them for a long time and still haven't needed to recharge them yet :)

    Yes, I have no problem posting a few pictures or videos on YouTube etc. once I move into my new place with a proper broadband internet connection.
     
  4. bahnzii

    bahnzii Notebook Consultant

    Reputations:
    66
    Messages:
    108
    Likes Received:
    11
    Trophy Points:
    31
    Congrats :D

    Definitely post some gameplay vids in 3D; I'm curious to see how much difference the 485M makes versus the 460M in terms of 3D frame rate.
     
  5. Websurfer

    Websurfer Notebook Consultant

    Reputations:
    34
    Messages:
    174
    Likes Received:
    0
    Trophy Points:
    30
    Can someone confirm that the 3D screen option adds additional speakers compared to the non-3D model?
     
  6. evangelionpunk

    evangelionpunk Notebook Geek

    Reputations:
    10
    Messages:
    86
    Likes Received:
    30
    Trophy Points:
    26
    Will do, as soon as I get the RAM to work at 1600MHz, lol.


    I have Crysis on it at the moment, and once F.E.A.R. gets released I might have that too. Any others that might be worth it? (Preferably new.)
     
  7. evangelionpunk

    evangelionpunk Notebook Geek

    Reputations:
    10
    Messages:
    86
    Likes Received:
    30
    Trophy Points:
    26
    I can, because I have it, lol.

    Also, look at Clevo's website and the schematic outline; you'll see the extra speakers. Not a massive improvement, but a little bonus that allows the unit to be a little louder overall.
     
  8. bahnzii

    bahnzii Notebook Consultant

    Reputations:
    66
    Messages:
    108
    Likes Received:
    11
    Trophy Points:
    31
    The Witcher 2 looks like some good eye candy. Maybe DiRT 3 or Black Ops for some recent titles. You mentioned Crysis... Crysis 2, of course, is another one to humble the 'geek hardware' ego :D
     
  9. evangelionpunk

    evangelionpunk Notebook Geek

    Reputations:
    10
    Messages:
    86
    Likes Received:
    30
    Trophy Points:
    26
    Sorry, I meant Crysis 2 when I said Crysis.


    I am now ready to upload some videos and will do so very shortly, hold tight...


    Grrrrr, the upload takes too long with this stupid mobile internet. Guys, you'll have to wait till I get my apartment and ADSL2+.

    For now I can give you some numbers, using FRAPS in Crysis 2 just after the opening scene, when you're on top of the building looking over the park:

    All Extreme settings, v-sync on (stock GTX 485M):
      2D: ~60 fps @ 1024x768, ~36 fps @ 1920x1080
      3D: ~56 fps @ 1024x768, ~30 fps @ 1920x1080

    Same settings with the GTX 485M overclocked (699/1398/1600):
      2D: ~75 fps @ 1024x768, ~43 fps @ 1920x1080
      3D: ~60 fps @ 1024x768, ~39 fps @ 1920x1080

    Unigine with 3D and mostly high settings: ~415, or somewhere around there.
     
  10. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    With those 3D settings, I gotta ask: are you running the game at the screen's 120Hz refresh rate? Or is it still at 60Hz, with 120Hz only used by the monitor itself? I know that running the same game maxed out at 75Hz instead of 60Hz takes a performance hit.
     
  11. evangelionpunk

    evangelionpunk Notebook Geek

    Reputations:
    10
    Messages:
    86
    Likes Received:
    30
    Trophy Points:
    26
    120Hz refresh.
     
  12. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    So the game's running at 120Hz refresh rate and still getting such good framerates? Wow. That's amazing. More and more now I want a good P170HM with a 580M and a 3D screen... I'll be like AHHHHH *Angelic light throwing down on laptop*
     
  13. evangelionpunk

    evangelionpunk Notebook Geek

    Reputations:
    10
    Messages:
    86
    Likes Received:
    30
    Trophy Points:
    26
    Any stats on the 580M at all?

    I'm wondering if I should get it upgraded from the 485M to the 580M.
     
  14. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Following the trend from the 460M to the 560M, it should be ~15% faster with less heat and more overclocking room, totally replacing the 485M (at the same price).
     
  15. evangelionpunk

    evangelionpunk Notebook Geek

    Reputations:
    10
    Messages:
    86
    Likes Received:
    30
    Trophy Points:
    26
    Anyone want a brand new 485M, then?
     
  16. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Going from the 485M to the 580M will probably be marginally faster (like <10%) but more power efficient. The 460M and 560M are nearly identical performance-wise.

    IMHO not worth the premium, because the difference between what you'd get selling your 485M and the price of a 580M will probably be > $200.
     
  17. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    The 560M is ~15% faster than the 460M and costs the same. It has replaced the 460M in laptops, and the price hasn't so much as hiccuped. The 485M -> 580M should follow this trend.
     
  18. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    The 560M is exactly the same as the 460M except the core clock is 100MHz faster and the shader clock is bumped 200MHz more. You'd get the exact same performance as a stock 560M by overclocking a 460M (and my guess is 95% of 460Ms will do it). It's just a rebadge of the same chip with slightly higher clocks.

    Their chip yields probably improved, so they could reliably run the chips at the higher clocks.

    My point is that it's not worth the upgrade from the 485M to the 580M in any case.
     
  19. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    True, selling a 485M to buy a 580M may not be feasible unless you only lose about $30 or so in the exchange, but the 580M should cost the same as the 485M when configuring a laptop. Also, as I said earlier, the 560M uses a different, more power-optimized core than the 460M: it basically has better stock clocks, more overclocking headroom, and generates less heat. If the 580M does the same over the 485M and costs the same, why is it a bad idea to sell a 485M and upgrade, as long as you aren't losing a lot of money doing it?
     
  20. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    All I'm saying is that once the 580M does come out, chances are it will cost about the same new as the 485M (like the 460M vs. the 560M). So selling your 485M for within $100, heck even $200, of the 580M's price is not likely.
     
  21. evangelionpunk

    evangelionpunk Notebook Geek

    Reputations:
    10
    Messages:
    86
    Likes Received:
    30
    Trophy Points:
    26
    I've been reading that the 580M may allow Optimus technology to start kicking in, which means improved battery life; something I do care about.
     
  22. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    If you do get it with Optimus, you'll never be able to change/upgrade/replace the GPU in your machine. It'll require a motherboard replacement.
     
  23. Anthony@MALIBAL

    Anthony@MALIBAL Company Representative

    Reputations:
    616
    Messages:
    2,771
    Likes Received:
    3
    Trophy Points:
    56
    The 485M and 560M don't disallow Optimus by design; rather, Clevo chose not to implement it due to driver concerns. Optimus support still requires Clevo to decide to use it, even if the cards themselves allow it.
     
  24. evangelionpunk

    evangelionpunk Notebook Geek

    Reputations:
    10
    Messages:
    86
    Likes Received:
    30
    Trophy Points:
    26
    Oh... so Clevo decided to disable the function, as it were?

    So IF nVidia has improved things, Clevo might enable the function with the GTX 580M?

    Is it not enabled with the GTX 560M?
     
  25. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Read: cards are irreplaceable/unchangeable with Optimus enabled. Whole-motherboard swaps each time you upgrade/RMA are bad. Clevo no like bad.
     
  26. evangelionpunk

    evangelionpunk Notebook Geek

    Reputations:
    10
    Messages:
    86
    Likes Received:
    30
    Trophy Points:
    26
    :confused: ??
     
  27. Pman

    Pman Company Representative

    Reputations:
    327
    Messages:
    1,882
    Likes Received:
    0
    Trophy Points:
    55
    As far as I am aware, the P1X0 will never have Optimus enabled, regardless of which card is being used.

    Sorry guys
     
  28. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I've posted that information about four times already, and you keep asking about Optimus. If a machine has Optimus enabled, you cannot change your video card; it will require a motherboard replacement. That is bad business sense for the upgradeable Clevo machines, especially in a high-end notebook market such as ours, where, unlike Alienware, changing a video card doesn't void the warranty (only the new card itself is simply exempt from it). Optimus is probably not going to be enabled on any Clevo machine until the discrete GPU can be swapped out without replacing the motherboard AND while keeping Optimus active with the new nVidia GPU put in.
     
  29. bartman8888

    bartman8888 Notebook Geek

    Reputations:
    23
    Messages:
    90
    Likes Received:
    0
    Trophy Points:
    15
    D2 Ultima, thanks for answering the million-dollar question of why Clevo doesn't implement Optimus; you're the first to provide the technical reason. We were previously told it was an incompatibility with Linux (or a lack of Optimus support for Linux).

    Do you think Clevo could wire the IGP to the video out, put a soft switch in the BIOS, and still keep the machine upgradeable?
     
  30. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I don't know if that's Clevo's official reason, but the technological aspect is correct: you'd be unable to swap things out. From a business standpoint, however, it's just not feasible to implement it for anything beyond a 555M, which people don't particularly buy with plans of upgrading (it's considered a mid-range GPU by most; hell, my 280M still smokes it).

    As for the soft-switch thing, I honestly don't know. Technically, disabling the driver for the main GPU should allow a swap to a secondary GPU driver, and re-enabling it would swap back to the main GPU driver. There could easily be a hotkey to do this; the screen would just flash for a second when you did. Windows hasn't liked having two or more GPU drivers installed at once since Vista, though, and that carries over into Win 7. I don't know how they even get Optimus to work like that. If Microsoft wasn't so female-dog-like, I'm sure there'd be a better way.
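    A rough, purely hypothetical sketch of what that kind of soft switch could look like, driving Microsoft's devcon utility from a Python script (the hardware-ID patterns are just the standard PCI vendor IDs for nVidia and Intel; this only illustrates the idea, it doesn't give you Optimus, and it wouldn't help on a machine whose panel isn't wired to the iGPU):

        # Hypothetical driver "soft switch" using Microsoft's devcon tool (needs admin rights).
        # 10DE = nVidia, 8086 = Intel; sketch only, not anything Clevo actually ships.
        import subprocess

        def switch_to_igpu():
            subprocess.run(["devcon", "disable", "PCI\\VEN_10DE*"], check=True)  # park the dGPU driver
            subprocess.run(["devcon", "enable",  "PCI\\VEN_8086*"], check=True)  # hand the desktop to the iGPU

        def switch_to_dgpu():
            subprocess.run(["devcon", "disable", "PCI\\VEN_8086*"], check=True)
            subprocess.run(["devcon", "enable",  "PCI\\VEN_10DE*"], check=True)

        if __name__ == "__main__":
            switch_to_igpu()  # e.g. bind this to a hotkey; the screen would flash as the driver reloads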
     
  31. mythlogic

    mythlogic Company Representative

    Reputations:
    1,238
    Messages:
    2,021
    Likes Received:
    277
    Trophy Points:
    101
    Hey Guys,

    The entire new generation of nVIDIA hardware (5xx series) supports true Optimus implementations now. Basically, in an Optimus implementation the dGPU (nVIDIA) directly copies its output into the output buffer of the iGPU (Intel), and the Intel chip is what's hooked to the display connector. It requires Windows 7 because of the memory mapping and sharing implemented there, which lets one card DMA into the other to stream the output whenever the dGPU is in use. However, that means you LOSE all the extra neat features of a direct GPU output; things like 3D Vision have a much harder time working, if they work at all. You also lose an extra display connection (the nVIDIA card can natively support three outputs, while the Intel chip supports only two). And the memory-sharing technology doesn't work in Linux, because nVIDIA would have to implement it in the driver and they have already said no. So doing this on their high end, while not technically impossible, would require a motherboard re-work, and so obviously not this generation.

    The OTHER way to do it is switchable graphics with muxes (switches) that can be configured to change which display device drives the screen output. This has the benefit of being able to use JUST ONE device or the other, plus Linux compatibility; however, yes, it would be a hot-plug operation when you switched and wouldn't be seamless to the OS or to the user. And those muxes cost money to design and implement. Again, a motherboard rework would be required.

    Overall, yes, you can have Optimus AND MXM cards, BUT no major manufacturer has decided to implement it. Alienware is doing mux switching on the M17x R3 platform.
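    To make the difference concrete, here is a tiny conceptual sketch of the two approaches in Python (the class and method names are made up for illustration; this is nothing like real driver code):

        # Conceptual model only: "dGPU copies into the iGPU's buffer" (Optimus)
        # versus "a mux routes the panel to exactly one GPU" (switchable graphics).

        class Panel:
            def scan_out(self, frame):
                print("panel shows a frame rendered by", frame["renderer"])

        class IntelIGPU:
            """In an Optimus design the panel is wired only to the iGPU."""
            def __init__(self, panel):
                self.panel = panel
                self.framebuffer = {"renderer": "iGPU"}
            def present(self):
                self.panel.scan_out(self.framebuffer)

        class NvidiaDGPU:
            """The dGPU has no path to the panel; it copies finished frames across."""
            def render_into(self, igpu):
                igpu.framebuffer = {"renderer": "dGPU"}   # the DMA copy Optimus depends on
                igpu.present()

        # Optimus path: the dGPU renders, the iGPU scans out.
        panel = Panel()
        igpu = IntelIGPU(panel)
        NvidiaDGPU().render_into(igpu)

        # Mux path, by contrast: a hardware switch connects the panel directly to one GPU
        # at a time, so no copy is needed, but switching is a visible hot-plug event.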
     
  32. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Finally, a good, in-depth response. So you can't do mux switches by request, Mythlogic? I know you assemble your own machines. If you could, that'd be a lot of business from people who want the best of both worlds and have extra cash =3.
     
  33. mythlogic

    mythlogic Company Representative

    Reputations:
    1,238
    Messages:
    2,021
    Likes Received:
    277
    Trophy Points:
    101
    I wish... Trust me, if I could solder some muxes on and away we go, you'd better BELIEVE we would have advertised the ever-loving crud out of that :p

    But muxes are really just tiny little chips that you then have to run traces to on the motherboard and so on. It's just not something you can add on later. We wish they would just spend the extra $3 per motherboard to put them in, but I understand that $3 per board adds up over hundreds of thousands of boards.
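    (A quick back-of-the-envelope with a made-up volume: at, say, 200,000 boards a year, that $3 per board is $600,000 in added cost, even though no single buyer would ever notice it in the price of the machine.)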
     
  34. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    $3 a board? Just charge us $5 extra per board. Who'd notice the difference? Plus you'd still make a profit on it. XD

    For shame, Clevo.
     
  35. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Well, exactly. $3 a board? Come on. Comparable Alienware machines cost 10-20% more and people gobble them up. And they have switchable graphics.

    Do you really think people would flinch over an extra $5, heck, $10? I understand economies of scale; I worked in the auto industry as an engineer for over fifteen years, and saving $1 per car meant saving $500k a year. And if we could spend $3 on something that added a feature to the car, we would charge $50 for it and it wouldn't matter.
     
  36. i.blades.i

    i.blades.i Notebook Enthusiast

    Reputations:
    0
    Messages:
    12
    Likes Received:
    0
    Trophy Points:
    5
    Would it be possible to still use the functions of the iGPU, just not the display, for things like video encoding? There are a couple of programs here and there popping up with QuickSync support. Plus it's always a bummer to miss out on an extra GPGPU device; they do support some GPGPU stuff, don't they?

    Sorry to bump an old thread, but it was a good one; I got a better understanding of all that.

    And... wow, nVidia's answer to the Linux community: "No". That's the status quo for just about every video chip maker out there. Right now I'm pulling my hair out with the proprietary Imagination Tech PowerVR SGX on the TI 4460 SoC (PandaBoard). They are awful; they should advertise their product as 1080p playback (in theory)... but I digress.

    BTW, Intel are a bunch of sweethearts to the Linux community, so that's an exception. I'd love to see iGPUs catch up to dGPUs in speed (or at least get closer), but there's still the issue of sharing system memory... bleh.
     
  37. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,485
    Trophy Points:
    681
    Hey Myth, :)

    Check out what this guy has done:

    http://forum.notebookreview.com/har...dge-throttling-permanent-fix.html#post8197737

    He got the iGPU working by adding an Intel HD VBIOS option ROM to the BIOS via MMTool; it works by hard-coding whether to use the iGPU or the dGPU (since the BIOS has the switch hidden, just like ours). ;)
    All this on NON-OPTIMUS HARDWARE!
    Either Asus wastes a lot of money or nVidia simply lied to us...
     
  38. mythlogic

    mythlogic Company Representative

    Reputations:
    1,238
    Messages:
    2,021
    Likes Received:
    277
    Trophy Points:
    101
    Yeah, he's cheating. There are two models there, one with Optimus and one without. However, in his case they were all "Optimus" boards; a BIOS change just disabled it. We already have all these laptops with the Intel GPU running; the only problem is that its outputs aren't hooked up, so what happens is that Windows kicks in, tries to go Optimus, and outputs via the iGPU, which gets you nowhere :p. So it's not quite the same thing, but it is something we are still working on, to be able to do video encoding etc. on the iGPU.
     