The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    *** Official Clevo W110ER / Sager NP6110 Owner's Lounge ***

    Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by Ryan, Apr 7, 2012.

  1. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    I'll see if I can find those couple of articles again, but GDDR5 didn't seem to put off more heat, just more power draw, which is likely the reason for limiting its use in the W110ER. With the 35W CPU and 45W GPU the total system load puts it near the 90W design threshold, so that was likely the consideration.
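    To put rough numbers on that (nominal TDPs only; the "rest of system" figure below is a guess, not a measurement):

        # Back-of-the-envelope W110ER power budget using nominal TDPs.
        # "Rest of system" (screen, drives, RAM, losses) is an assumed 10W.
        ADAPTER_W = 90   # stock 90W adapter
        CPU_TDP_W = 35   # 35W dual-core the chassis was designed around
        GPU_TDP_W = 45   # GT 650M
        REST_W = 10      # assumption, not measured

        total = CPU_TDP_W + GPU_TDP_W + REST_W
        print(f"~{total}W of a {ADAPTER_W}W adapter ({total / ADAPTER_W:.0%})")
        # Swap in a 45W quad and the same sum lands at ~100W, which is why
        # the bundled adapter gets bumped to 120W for those configs.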
     
  2. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    I have a feeling that Clevo screwed up somewhere in the design process, which was why the GDDR3 version was used.

    I bet they originally wanted to keep everything under 90W with the 90W PSU, but then they had to offer the 3610QM due to supply issues and they had to raise the power envelope to 120W anyway.

    Either that or they wanted to keep the price as low as possible, which is why they went with GDDR3 and a low-end display.
     
  3. robomasterxp

    robomasterxp Notebook Guru

    Reputations:
    28
    Messages:
    50
    Likes Received:
    0
    Trophy Points:
    15
    wait, so if we use a 35W CPU instead of the 3610QM now, shouldn't the laptop be able to use GDDR5? :rolleyes:
     
  4. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    I'm not sure GDDR5 would even use that much more power. The gddr5 version of the 650m is underclocked, so that probably compensates for some of the extra power usage. I doubt the difference in power consumption is very big.

    Now that I think about it, gddr3 is probably just a way to save money because the performance difference is minimal at 768p.
     
  5. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Wait, what? Erm, if it's designed for a 35W CPU + 45W GPU and total power draw is near 90W (which it is), then adding an additional 5W+ load to the system would be too much. I'm sure the original design consideration was for a 90W system; it's just that the cooling and power system seem to manage the 45W CPUs just fine.

    And that is probably the real answer right there.

    But here is an example from desktop DDR3 vs GDDR5... (the only one I could find on short notice): http://www.tomshardware.com/reviews/radeon-hd-5550-radeon-hd-5570-gddr5,2704-14.html

    The 5550 DDR3 vs GDDR5 had 15W more power draw and the 5570 DDR3 vs GDDR5 had 19W more power draw. That's pretty significant considering these cards don't require a power connector, meaning they draw less than 75W total. But temperature-wise it was only a few deg C difference; granted that's GPU and not vRAM, but still something to consider.

    [images: charts from the linked Tom's Hardware review]
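    For scale against this machine's budget (those deltas are for whole desktop cards, so treat this as directional only):

        # How those review deltas compare to the W110ER's 90W adapter.
        # Whole-card desktop figures from the linked review, so only a
        # rough hint of what GDDR5 might have added here.
        deltas_w = {"HD 5550 GDDR5 vs DDR3": 15, "HD 5570 GDDR5 vs DDR3": 19}
        adapter_w = 90

        for card, delta in deltas_w.items():
            print(f"{card}: +{delta}W = {delta / adapter_w:.0%} of a 90W adapter")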
     
  6. plancy

    plancy Notebook Evangelist

    Reputations:
    56
    Messages:
    550
    Likes Received:
    0
    Trophy Points:
    30
    But I think that's because the GDDR5 versions are able to process more, contributing to the power draw...
     
  7. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    At peak load it's at peak load, whether it's crunching one calculation per second or a million. It may be doing more, but it's still drawing that much power. Notice the GDDR5 card also has half the memory of the DDR3 one.

    Regardless, it is what it is. I'm sure it was a matter of cost, performance, power, and heat consideration in the total system design.

    Well, in all honesty, Clevo spec'd it for a 35W CPU; Sager (and most Clevo builders) is the one that chose the 3610QM. Heck, you can even get a 3720QM.
     
  8. macuill

    macuill Notebook Consultant

    Reputations:
    18
    Messages:
    100
    Likes Received:
    0
    Trophy Points:
    30
    To sum it up:
    It's designed as an 11" gaming subnotebook.
    Designed for a 35W CPU. Designed for 1366x768 resolution. Designed for portability.

    If one takes that into consideration, there's no need for GDDR5 as there is no real benefit over GDDR3 at this resolution.
    And the possibility to put in a 45W CPU is a plus, but it was originally not intended.
     
  9. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,482
    Trophy Points:
    681
    LBO offers them with 3820QM... :)
     
  10. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
  11. Moobke

    Moobke Notebook Geek

    Reputations:
    0
    Messages:
    83
    Likes Received:
    0
    Trophy Points:
    15
    Small question: as I understand it, you can open the laptop by removing the battery.

    Now, if I were to buy a spare battery and take it with me, could I easily swap between batteries? Or does the bottom of the laptop always come off if you take out the battery?
     
  12. WCFire

    WCFire Notebook Evangelist

    Reputations:
    281
    Messages:
    331
    Likes Received:
    3
    Trophy Points:
    31
    HTWingNut did a video review where he opened up the computer and took out the battery at around 1 minute.
    NP6110 Awful Overview - YouTube

    The bottom of the laptop does not always come off when you take it out. I've seen pictures of units where the battery is detached with the bottom on and vice versa.
     
  13. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    There are two different latches to open the bottom. As I understood it, you have to take the battery out before opening the bottom, not the other way around.

    You can easily swap out the battery without opening the bottom ;)
    Also make sure you've got the machine plugged in; I'm quite sure you've then got the chance to swap batteries on the go while the laptop is on :p
     
  14. Moobke

    Moobke Notebook Geek

    Reputations:
    0
    Messages:
    83
    Likes Received:
    0
    Trophy Points:
    15
    Great, I guess I'll take a spare battery with me and then I won't have to worry about battery life any longer! :D
     
  15. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Yeah. You set the one side to unlock. Then you have to hold the toggle switch in the "unlock" position and slide the battery out, release the lock, then slide and hold the unlock again, and then you can remove the bottom panel.

    Also, if there's anyone familiar with ThrottleStop, I could use your help. Either it doesn't work with Ivy Bridge or I'm just not doing it right. I'd like to see if there's a way to lock in turbo as well as get to a lower power state.

    I was able to get the CPU to 800MHz at one point, but now can't recreate that.
     
  16. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,482
    Trophy Points:
    681
    Here with Ivy Bridge dual core:

    [images: ThrottleStop screenshots on an Ivy Bridge dual core]
     
  17. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    The CPU is probably throttling. According to Notebookcheck, a stock i5-2450M (also 2.4-3.1GHz) gets around 2.7 in Cinebench and 3400 for the 3DMark06 CPU score.

    Here's a review of another notebook with an i5-3210M. They didn't do the same benchmarks, but it seems to match up better with where the i5-3210M should perform.
     
  18. MexicanSnake

    MexicanSnake I'm back!

    Reputations:
    872
    Messages:
    1,244
    Likes Received:
    0
    Trophy Points:
    55
    Right, I feel the same way. With a 3610QM it gets around 4400 for the CPU score in 3DMark06, but according to Notebookcheck's tests the 3610QM should get something near 6000.
     
  19. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
  20. MexicanSnake

    MexicanSnake I'm back!

    Reputations:
    872
    Messages:
    1,244
    Likes Received:
    0
    Trophy Points:
    55
    Mmmmm, I'm not sure if these guys ran it at 720p, but they get 6000:

    Intel Core i7 3610QM Notebook Processor - Notebookcheck.net Tech

    We have to check with other systems to verify this.

    EDIT: Just saw your image... Mmmmm interesting...
     
  21. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Well, it appears that when the GT 650M is active, the CPU doesn't boost appropriately. I just ran at 1280x768 with the GT 650M on and got 4400 or so. Disable the GT 650M in Device Manager and it gets 6000+. So I guess it's not due to resolution. I know the final 3DMark score depends on resolution, but it doesn't make sense for the CPU score to, and apparently it doesn't.
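    If anyone wants to watch that happen outside of 3DMark, here's a minimal sketch that just logs the reported CPU clock once a second; run it with the GT 650M idle and again with it under load. It assumes Python with the psutil package installed, and it only shows the package-level clock the OS reports, not per-core turbo bins:

        # Log reported CPU clock and load once a second for 30 seconds.
        # Assumes psutil is installed (pip install psutil).
        import time
        import psutil

        for _ in range(30):
            freq = psutil.cpu_freq()                  # current/min/max in MHz
            load = psutil.cpu_percent(interval=None)  # utilization since last call
            print(f"{freq.current:7.0f} MHz @ {load:5.1f}% CPU load")
            time.sleep(1)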
     
  22. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    Important for those considering the lower-TDP chips like the 3612QM:

    You'll only notice the difference between the 3610QM and the 3612QM when the load runs over relatively long periods of time. The 35W TDP merely means that the temporary speed boost via Turbo would expire faster than on the 45W TDP chips.

    That's because Turbo Boost 2.0, introduced with the Sandy Bridge architecture, allows thermal and power headroom above TDP levels. So the 35W chip turns off Turbo sooner to get back to 35W TDP levels.

    Now, TDP covers both thermals and power. The chip could be running at -15C, but it'll still come back down to meet the 35W power usage limitation.

    Regarding power adapters: at IDF two years ago Intel mentioned a concept that allows the battery to aid the AC power adapter when system power usage exceeds the adapter's capability. They called it Hybrid Power Boost. Since the concept hasn't been implemented yet, the CPU gives up headroom so the GPU can boost.
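    A toy model of that turbo expiry, just to show why the 35W part backs off sooner. The 55W short-turbo limit and 28-second window below are made-up illustrative values, not the real fused limits of these chips, and the real algorithm uses a moving average rather than a simple bucket:

        # Toy "energy bucket" model of Turbo Boost 2.0: the chip may draw
        # above TDP (PL1) until a budget is spent, then falls back to TDP.
        # pl2_w and window_s are illustrative guesses, not Intel's values.
        def seconds_at_turbo(pl1_w, pl2_w=55.0, window_s=28.0):
            budget_j = pl1_w * window_s        # rough headroom in joules
            return budget_j / (pl2_w - pl1_w)  # extra watts burned while above TDP

        for chip, pl1 in [("3612QM (35W)", 35), ("3610QM (45W)", 45)]:
            print(f"{chip}: ~{seconds_at_turbo(pl1):.0f}s at 55W before backing off")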
     
  23. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    I don't see anything anywhere that says the system utilizes some sort of hybrid power boost. And TDP is Thermal Design Power. If temperature isn't a concern, then the limitation would be the voltage/power limits of the chip or board, and they could keep pushing up the speed as long as the FSB and/or multiplier is unlocked, the same way AMD did with their chips. TDP is taken out of the equation. The CPU/chipset could run to 100W if it were allowed to.

    TDP isn't based on power consumption; it's the thermal power the cooling has to handle. They are related, of course, since you really shouldn't need a system rated to cool 100W for a chip that produces less than 35W of heat.
     
  24. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    It doesn't, but it tells you that the reason for Turbo not working could be power limitations. Intel's future proposal to solve the problem is the Hybrid Power Boost. Also, revisit the Notebookcheck review. They note that the CPU having Turbo off when the GPU is being used is intentional, because it gives power budget to the GPU.

    TDP IS power consumption too. A 35W CPU at full load dissipates 35W. You can't have a CPU using 15W but dissipating 35W of heat. Of course the CPU won't use full power all the time.

    But the way I described it is how Turbo works. The 35W 3612QM will have its Turbo expire faster than the 45W 3610QM. Until that time, it can exceed TDP. Pre-Sandy Bridge systems didn't allow Turbo to exceed TDP limits; the speed gain was whatever headroom was left.
     
  25. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Intel's Turbo Boost is designed specifically so it isn't constrained by TDP. It does have its own internal parameters, such as peak temperature and/or rate of temperature increase, before it throttles though.

    But until I get confirmation of this "hybrid boost" I won't quite buy that as an answer. I think it's an artificial constraint, either intentional or unintentional, by Clevo.
     
  26. FNAKFHE

    FNAKFHE Notebook Consultant

    Reputations:
    40
    Messages:
    115
    Likes Received:
    0
    Trophy Points:
    30
    I don't get it. I understand your 'analytical' thinking, but I don't understand why you're talking about a 15.6". It's too big and not palm-able like your 11.6" or 14" M14xR2.
     
  27. neon10th

    neon10th Notebook Consultant

    Reputations:
    55
    Messages:
    223
    Likes Received:
    0
    Trophy Points:
    30
    According to a lot of people, the 14" M14xR2 is about the same size as a normal 15" laptop, including the NP6165.
     
  28. FNAKFHE

    FNAKFHE Notebook Consultant

    Reputations:
    40
    Messages:
    115
    Likes Received:
    0
    Trophy Points:
    30
    OK, fine.

    To the both of you:

    I don't want the 14-inch, which is almost the size of a 15".

    And I don't want the 15".

    I'm not gonna order anything until Sager puts out a 13".
     
  29. MexicanSnake

    MexicanSnake I'm back!

    Reputations:
    872
    Messages:
    1,244
    Likes Received:
    0
    Trophy Points:
    55
    Wow HTWingNut, THANKS for doing that! It seems that when the GT 650M is active the CPU offers less performance, but why? Perhaps it is due to thermal issues or some other power options, or maybe Clevo wanted it to be this way...

    PS: How do you power off the GT 650M and enable just the HD 4000 (only in Device Manager, or is there another way)? :confused: Pardon my ignorance...
     
  30. neon10th

    neon10th Notebook Consultant

    Reputations:
    55
    Messages:
    223
    Likes Received:
    0
    Trophy Points:
    30
    Sager may not do this for a while, so in the meantime I'm waiting for Gigabyte to release their U2442. You might like it because it's a really thin and light 14" notebook with a 1600x900 screen, a backlit keyboard, a Thunderbolt port, and a GT 640M (GDDR5), which can most likely be overclocked past the GT 650M (DDR3) since they are almost the same chip.
    Gigabyte U2442 Ultrabook hands-on (video) -- Engadget
     
  31. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    You don't understand. Sandy Bridge has two Turbo parameters: "Long" and "Short" Turbo. Long is the one that has existed since Nehalem; it's basically guaranteed because it does not try to exceed TDP. The Short Turbo, though, has a fixed duration. If it were purely temperature based (which is BS), then Short should never turn off with extreme cooling; however it does, because it's power-usage based as well (and that relates to thermals). And there's also a fixed timer for Short Turbo.

    As long as the system manufacturer designs CPU cooling that fits the TDP specs, the CPU will play within the limits and Turbo should always engage. However, people forget that Turbo, despite how it may seem initially, is an opportunistic boost, not a guaranteed one. That's especially true with Sandy Bridge's implementation. Of course, that does not mean every notebook manufacturer follows Intel's specs perfectly. Some overclock, some put a 45W chip in an 11-inch system and turn off Turbo.

    That's just repeating what I said. I never said Hybrid Boost exists; it was merely an example to show that not enough power is being provided to the system to allow the CPU and GPU to boost together. And on a gaming system, Clevo chose to boost the GPU.
     
  32. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    Where does it say that it's gddr5? Also it has a ULV processor. 900p and the backlit keyboard are interesting, though.
     
  33. FNAKFHE

    FNAKFHE Notebook Consultant

    Reputations:
    40
    Messages:
    115
    Likes Received:
    0
    Trophy Points:
    30
    Thanks for the link.

    But I don't see a Thunderbolt port in the images?

    Also, it's a nice, sexy look. Almost like a MacBook Pro.
     
  34. neon10th

    neon10th Notebook Consultant

    Reputations:
    55
    Messages:
    223
    Likes Received:
    0
    Trophy Points:
    30
    Hmm... I thought I read it had GDDR5 somewhere, but I just tried googling it and nothing came up. I guess it hasn't been confirmed yet, so we can only hope at this point. But either way there's Thunderbolt, so that could be great for future-proofing. However, there are two versions of the U2442: the U2442N (standard-voltage processor) and the U2442V (ULV processor).

    If you look at the 3rd image, the Thunderbolt port is directly to the right of the HDMI port.
    http://cdn.slashgear.com/wp-content/uploads/2012/03/U2442_1.jpg
     
  35. FNAKFHE

    FNAKFHE Notebook Consultant

    Reputations:
    40
    Messages:
    115
    Likes Received:
    0
    Trophy Points:
    30
    Yes, it's there. But there's still too big a bezel around the LCD, which means it will be almost as big as a 15". No thank you.

    I wish we had a strong Ivy Bridge CPU and a good GPU in a 13-inch frame.
     
  36. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,482
    Trophy Points:
    681
    Guys, when I come home I'll take a look at the BIOS again.
    But I can already tell you that we have a few options regarding TDP and how turbo is handled, which we haven't enabled in the betas yet.
    And as far as I remember "hybrid turbo boost" is set to off by default...
     
  37. FNAKFHE

    FNAKFHE Notebook Consultant

    Reputations:
    40
    Messages:
    115
    Likes Received:
    0
    Trophy Points:
    30
    You just made that up all on your own.

    Intel's website for Ivy Bridge CPUs says nothing about hybrid turbo, only Turbo Boost. But I think you're making up the hybrid part.

    In similar news, Ferrari is putting out a new hybrid Enzo; check the link.

    Ferrari chairman reveals hybrid Enzo will come this year -- Engadget
     
  38. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    You basically derailed this thread into oblivion haha

    He might be making it up, but why would he? Also take note, he uses quotation marks :p

    Btw, hybrid Enzo... what's next, a hybrid Lamborghini?
     
  39. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,482
    Trophy Points:
    681
    There is an option called "hybrid boost/power" or whatever, which is disabled... it may or may not refer to the CPU... it may very well be related to Optimus... whatever it is, the point being it is off, and the disabled turbo we are facing is Clevo's own doing.

    BTW: My favorite coffee is called ENZO, so hybrid ENZO, "yes please"!
     
  40. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    Huh. So it actually has Hybrid Turbo Boost, interesting. This would be the first time I've heard of that happening. For it to work you need power from both the battery and the AC adapter. Can't wait to see the results!
     
  41. FNAKFHE

    FNAKFHE Notebook Consultant

    Reputations:
    40
    Messages:
    115
    Likes Received:
    0
    Trophy Points:
    30
    Agreed, it's for the GPU, not the CPU. ....?????

    (notice my question marks)
     
  42. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,482
    Trophy Points:
    681
    Let the games begin:

    [image]
     
  43. Wavebuster

    Wavebuster Notebook Enthusiast

    Reputations:
    102
    Messages:
    35
    Likes Received:
    0
    Trophy Points:
    15
    Anyone think Mythlogic would install an i7-3820QM that I purchased separately, in addition to aftermarket RAM and an SSD, assuming I send the stuff to them? I was also hoping I could send my own 8GB USB flash drive for any important drivers and monitor calibration settings. As for the heat output of putting that in the computer... well, let me be the one to worry about that.
     
  44. macuill

    macuill Notebook Consultant

    Reputations:
    18
    Messages:
    100
    Likes Received:
    0
    Trophy Points:
    30
    Oh is there someone concerned about heat? :p
     
  45. Jonnyinter

    Jonnyinter Notebook Consultant

    Reputations:
    8
    Messages:
    101
    Likes Received:
    0
    Trophy Points:
    30
    Could someone who has this laptop please post some pictures or videos of it? I'm finding it incredibly difficult to find anything decent online. Thanks!
     
  46. macuill

    macuill Notebook Consultant

    Reputations:
    18
    Messages:
    100
    Likes Received:
    0
    Trophy Points:
    30
  47. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    LOL. That's exactly what I was going to do. Where did you find that fan? And how do you mount and power it?
     
  48. gamba66

    gamba66 Notebook Evangelist

    Reputations:
    63
    Messages:
    491
    Likes Received:
    3
    Trophy Points:
    31
    Hi,

    May I ask how you attach these extra heatsinks (or whatever you call them) and how much they cost? I would think about doing this to my P330 to improve its already good temperatures (but with a weaker 555M, not a 650M ^^) :) The fan, though, is a bit too much for me haha
     
  49. Clevernerd

    Clevernerd Notebook Consultant

    Reputations:
    123
    Messages:
    118
    Likes Received:
    0
    Trophy Points:
    30

    You'll need some copper RAM heatsinks plus SEKISUI 5760 thermal adhesive, found on eBay for example.
     
  50. mythlogic

    mythlogic Company Representative

    Reputations:
    1,238
    Messages:
    2,021
    Likes Received:
    277
    Trophy Points:
    101
    Yeah, we will... but if it burns up... we won't exactly warranty it :p
     