
    U30JC discussion thread

    Discussion in 'Asus' started by coriolis, Jan 10, 2010.

  1. bozeefus

    bozeefus Notebook Consultant

    Reputations:
    1
    Messages:
    136
    Likes Received:
    0
    Trophy Points:
    30
    nvm i am wrong
     
  2. Quatro

    Quatro Journeyman

    Reputations:
    536
    Messages:
    1,834
    Likes Received:
    6
    Trophy Points:
    56
    Lucky you! We'd all like that Canadian i5 model w/ BT.

    Does the Taiwan model come with a US/Canada QWERTY keyboard? Important as the UK English model has the symbol keys in very different locations, though both are QWERTY.

    Buying the Taiwan model means you'll need to depend on the ASUS warranty, since most likely you won't be returning it to Taiwan.

    The different OS levels may not mean much to you. Depends on your needs. I have Home Premium in one notebook & Professional in another & can't really tell the difference.

    Does the Canadian Win Pro come with a real DVD or is it a backup on the hidden partition on the HDD? A real DVD would have been worth something to me.

    Be sure the Taiwan seller will put in real Transcend RAM, TWO sticks of 2GB each, and that the speed is correct, not something slower.

    Bigger 500GB HDD is nice for the $27 more.

    Since the two are nearly the same price, if they indeed upgrade the RAM, the issues to me would be the warranty (are you happy to send it to ASUS if there's a problem?) and whether the keyboard layout is correct for your area.
     
  3. MrVibe

    MrVibe Notebook Consultant

    Reputations:
    37
    Messages:
    124
    Likes Received:
    0
    Trophy Points:
    30
    It also comes down to $790 after BCB for the U30Jc vs. $1049 for the MacBook Pro after the $100 rebate
     
  4. Stumme

    Stumme Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    Thanks for the quick response.

    Yes. It's exactly the same as the US keyboard except for some Chinese input method labels printed next to all the English characters/symbols. Makes each key look more crowded, but exotic?

    That's true. Haven't thought of that. But my current laptop is an ASUS M5 also bought in Taiwan (yes, it's practically antique these days), and I've never had to send it in for fixing in all 6 years of its operating life. Assuming the quality of ASUS's laptops hasn't dropped since then, it might not be a very big issue.

    What about the XP mode though? It sounds like it would be rather useful for legacy programs.

    This I don't actually know. Infonec doesn't mention it on their website, and I doubt anyone's going to answer the phone over the weekend. I seem to have come across someone who bought the model from them earlier in this or another thread; perhaps that person can enlighten us?

    I don't think the native 2G RAM from ASUS is Transcend's. Would that make a difference? Speed should be the same.

    It's a Seagate though. It has a bad rep among my Taiwanese friends who bought ASUS laptops with those drives in recent years. Is it true that Hitachi has better (cooler/quieter) drives these days? Enough to make up for the 180GB shortfall?
     
  5. Quatro

    Quatro Journeyman

    Reputations:
    536
    Messages:
    1,834
    Likes Received:
    6
    Trophy Points:
    56
    When I go to Bing CB, they offer 2% cashback at Newegg. Their U30Jc is $899.99 (+$13.18 S+H). $899.99-2% ($18)= $881.99+13.18=$895.17

    How are you getting $790? Are you seeing a different cash back %?

    Great if I'm messing up on the BCB page. $790 would be a great price.
     
  6. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    That was at TigerDirect, though the U30JC isn't available there anymore.
     
  7. fusoyaii

    fusoyaii Notebook Guru

    Reputations:
    149
    Messages:
    64
    Likes Received:
    0
    Trophy Points:
    15
    Does the BIOS allow you to enable Virtualization? I wanna upgrade to Win 7 Pro and use XP-mode.
     
  8. chris2k5

    chris2k5 Notebook Consultant

    Reputations:
    9
    Messages:
    297
    Likes Received:
    0
    Trophy Points:
    30
    Just ordered a U30JC from NewEgg.

    I hope the battery lives up to my expectations. I'd love to bring it to class and the library a lot.
     
  9. Quatro

    Quatro Journeyman

    Reputations:
    536
    Messages:
    1,834
    Likes Received:
    6
    Trophy Points:
    56
    What are your expectations for the U30Jc's battery?
     
  10. BlazingSkies

    BlazingSkies Notebook Consultant

    Reputations:
    0
    Messages:
    183
    Likes Received:
    0
    Trophy Points:
    30
    guys, PLEASE do me one favour? Can you try to play some PS2 games on PCSX2 (PS2 emulator) and see how it turns out? Anyone know? If it runs great then I'd jump on this! Please help, anyone!! I would appreciate it tenfold! :(
     
  11. PEEGGY

    PEEGGY Notebook Consultant

    Reputations:
    7
    Messages:
    133
    Likes Received:
    0
    Trophy Points:
    30
    @Stumme
    I would go for the X2C without a doubt: 4GB RAM is a worthwhile upgrade and 7 Pro would be nice. I'd advise against investing in huge HDDs since you can upgrade to an SSD anytime. But the i5-430M is not a huge improvement over the i3-350M; it just adds a small Turbo Boost (up to 2.5GHz) and eats up more battery, so the i3 at a lower price or the i5-520M for better performance would be the better options.
    Does anybody know if the U30JC ships with the i5-520M?

    @fusoyaii
    All the Core i CPUs support VT-x, but only the i5/i7 support directed I/O (VT-d)
     
  12. jstnwng

    jstnwng Notebook Enthusiast

    Reputations:
    0
    Messages:
    19
    Likes Received:
    0
    Trophy Points:
    5
    I think it will; I take it all the time and just leave the charger at home! Classes and the library don't require that much power since you'll probably just be taking notes and surfing the web. Easy to get 7+ hours. I don't know how much battery life having an SSD adds (does anyone have an estimate?) so I might be cheating, but I think I can get pretty darn close to 8 hours on the 2nd-to-lowest brightness, power settings at 'balanced', Wi-Fi on, and applications like Microsoft Office open.
     
  13. Lindsey

    Lindsey Notebook Enthusiast

    Reputations:
    5
    Messages:
    24
    Likes Received:
    0
    Trophy Points:
    5
    D*mn, Newegg is sold out again already. I wanted to run it by my spouse before hitting the button to buy and now I missed out. Guess that gives me a chance to wait out the Acer 3820tg or Asus ul30jt then...
     
  14. Stumme

    Stumme Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    Thanks for the alternate perspective. But the quote I got for that Taiwan model already includes the vendor's upgrade to 4GB (2x2GB) RAM, so the bigger HDD will actually come at the same price after a rebate. Will there really be a performance difference from the X2C's 1x4GB?

    This limited difference between i3 and i5 you mentioned, though... Does that mean one wouldn't notice a thing unless running some resource-demanding games? And how much battery time can we expect the i5 CPU to cost?
     
  15. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    Resource-demanding games are almost certainly going to be bottlenecked by the 310M rather than the CPU anyway.
     
  16. PEEGGY

    PEEGGY Notebook Consultant

    Reputations:
    7
    Messages:
    133
    Likes Received:
    0
    Trophy Points:
    30
    Then you'd have to choose the X2C if you work in business environments (domains and the like) where you'll need 7 Pro, or else decide whether you can live with the Chinese-printed characters.

    Nothing you'd notice, but 1x4GB is more expensive and leaves a slot free for a future upgrade.

    exactly

    I haven't measured that with the i5-430M, but I'd guess you can save around 20~30 minutes compared to the 520M. It depends on what you do and how many demanding apps you run on your PC.
     
  17. GENETX

    GENETX Notebook Geek

    Reputations:
    64
    Messages:
    92
    Likes Received:
    0
    Trophy Points:
    15
    Why would a faster processor consume more energy? They all run at a lower clock speed and voltage in energy-saving mode. These are the same for all processors in the series, so they will behave the same. Therefore, the energy consumption should be equal.
     
  18. PEEGGY

    PEEGGY Notebook Consultant

    Reputations:
    7
    Messages:
    133
    Likes Received:
    0
    Trophy Points:
    30
    @GENETX
    Higher clock speeds generate more heat and consume more energy, and you'll see that when you run demanding apps.
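
    For context, the usual rule of thumb is that dynamic CPU power scales roughly with frequency times voltage squared, so two chips idling at the same low clock and voltage draw about the same, and the faster one only costs extra when it's actually running at its higher clock under load. A toy Python sketch (the constant and the voltages are made up, just to show the scaling):

        # Rough CMOS rule of thumb: dynamic power ~ k * f * V^2 (illustrative numbers only).
        def dyn_power_watts(freq_ghz, volts, k=4.0):
            return k * freq_ghz * volts ** 2

        print(dyn_power_watts(1.2, 0.75))   # either chip idling at a low clock: same draw
        print(dyn_power_watts(2.26, 1.05))  # i3-350M at full clock (hypothetical voltage)
        print(dyn_power_watts(2.53, 1.10))  # i5-430M in turbo (hypothetical voltage): a bit more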
     
  19. chris2k5

    chris2k5 Notebook Consultant

    Reputations:
    9
    Messages:
    297
    Likes Received:
    0
    Trophy Points:
    30
    I expect a solid 4-5 hours with brightness at 3/4 or at least 1/2 setting, Wi-Fi on for web browsing, Microsoft Word to take notes, and a mix of both for library.

    4 hours is the bare minimum for me.

    I am basically looking for a Macbook Pro without the high price tag and no accidental damage warranty.
     
  20. mrPico

    mrPico Notebook Deity

    Reputations:
    297
    Messages:
    764
    Likes Received:
    0
    Trophy Points:
    30
    You should try calling GenTechPC or XoticPC and ask for stock. I bought one from GenTech PC a few weeks ago when every retailer was out of stock. I made a call and they told me they had stock so I was surprised and ordered right away. It was shipped to me in 3 business days.
     
  21. aray213

    aray213 Notebook Enthusiast

    Reputations:
    0
    Messages:
    13
    Likes Received:
    0
    Trophy Points:
    5
    If that is what you are looking for you will NOT be disappointed. I have been able to play DVDs for more than 4 hours with Wi-Fi on at about 70% brightness.
     
  22. Quatro

    Quatro Journeyman

    Reputations:
    536
    Messages:
    1,834
    Likes Received:
    6
    Trophy Points:
    56
    I get at least 7 hrs with that setup as long as I'm not on a web page that is using flash & is kicking on the NVIDIA GPU.

    As an example, before just now going online, I had the wifi OFF (it goes on in 2 sec literally and uses battery so I leave it off unless I'm online), brightness at 50% and was reading a downloaded book (Kindle for PC) with Word 2007 open and using "Power Saving" mode with the NVIDIA Control Panel set to "integrated graphics" as the default.

    My battery meter reports "84% 8 hrs 15 min remaining".
     
  23. bozeefus

    bozeefus Notebook Consultant

    Reputations:
    1
    Messages:
    136
    Likes Received:
    0
    Trophy Points:
    30
    I cannot wait to get mine!


    And regarding turning Optimus off: the only way, from what I know so far, is to uninstall the Nvidia drivers. If you do that, though, you won't have access to the 310M.
     
  24. hebron

    hebron Notebook Enthusiast

    Reputations:
    0
    Messages:
    29
    Likes Received:
    0
    Trophy Points:
    5
    @Quatro:

    the 310M just kicks in when it detects more graphics demand?


    In other words, can the Nvidia 310M be "turned ON/OFF" for some battery juice???

    Thanks!
     
  25. BlazingSkies

    BlazingSkies Notebook Consultant

    Reputations:
    0
    Messages:
    183
    Likes Received:
    0
    Trophy Points:
    30
    Can anyone test the laptop with PCSX2?
     
  26. MrVibe

    MrVibe Notebook Consultant

    Reputations:
    37
    Messages:
    124
    Likes Received:
    0
    Trophy Points:
    30
    Any recommendation for a low-profile, tightly fitting sleeve for this laptop? The Case Logic one made for the MacBook Pro doesn't look good to me

    thanks
     
  27. Quatro

    Quatro Journeyman

    Reputations:
    536
    Messages:
    1,834
    Likes Received:
    6
    Trophy Points:
    56
    Can the NVIDIA GPU be turned off in all situations? I'm not 100% certain. Certainly, you can designate (in NVIDIA control panel) certain apps to run ONLY in integrated or nvidia, but that doesn't mean it stays that way if Optimus thinks it knows better.

    Here's an interesting experiment I just did...

    I'll use CPU-Z, GPU-Z and Hardware Monitor to keep watch on power, CPU speed, wattage & GPU in use in real-time. (NOTE: If GPU-Z is turned on when only the Intel chip is on, it won't later see the NVIDIA GPU, so GPU-Z must then be restarted once the NVIDIA chip is running.)

    I'm:
    1) on battery
    2) Battery Saving mode (CPU-Z: 0.9-1.2GHz, HWM: 5.5W-9.5W)
    3) Default "integrated graphics" in NCP (Nvidia Control Panel)
    4) Firefox 3.5.9 set in NCP to run in integrated graphics

    Using GPU-Z to see what GPU is bearing load (and restarting GPU-Z after each step):
    1) Boot-up to desktop, GPU-Z shows "integrated graphics in use"
    2) Launch Firefox, GPU-Z shows "integrated graphics in use"
    3) Go to HULU, GPU-Z shows "integrated graphics in use"
    4) Play "Fringe" in standard 360p, GPU-Z shows "NVIDIA in use"
    5) Change "Fringe" to HD 480p, of course, GPU-Z shows "NVIDIA in use"

    When NVIDIA GPU is active:
    1) HW Monitor shows that the power draw from the CPU DROPS to 5.47W as the NVIDIA GPU kicks in (takes load from the CPU).
    2) CPU stays at 0.9GHZ since NVIDIA is doing the work.

    So clearly, Optimus still decides what it wants, giving my preference some weight, but not absolute control.

    But if I'm in Office apps, basic wireless with no ad or TV videos running, NVIDIA should stay off.
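
    For anyone who'd rather log this from a script than eyeball CPU-Z/GPU-Z/HWMonitor, here's a minimal Python sketch (assuming the third-party psutil package; not one of the tools used above) that records CPU clock and battery level over time so two scenarios can be compared afterwards:

        import time
        import psutil  # third-party: pip install psutil

        # Sample CPU frequency and battery charge every 5 seconds.
        while True:
            freq = psutil.cpu_freq()          # current/min/max in MHz
            batt = psutil.sensors_battery()   # percent, secsleft, power_plugged
            print(f"{time.strftime('%H:%M:%S')}  CPU {freq.current:.0f} MHz  "
                  f"battery {batt.percent:.0f}%")
            time.sleep(5)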
     
  28. BlazingSkies

    BlazingSkies Notebook Consultant

    Reputations:
    0
    Messages:
    183
    Likes Received:
    0
    Trophy Points:
    30
    Optimus is pretty much retar........ not letting you watch vids on the integrated GPU
     
  29. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    I think Optimus probably will listen to you for fullscreen apps, but Flash and the like are tricky, and it may not be easy to get it working the way you want.
    By the way, did you add the GPU selection to the context menu? That seems like it might be the most convenient way to micromanage Optimus in specific circumstances.
     
  30. bozeefus

    bozeefus Notebook Consultant

    Reputations:
    1
    Messages:
    136
    Likes Received:
    0
    Trophy Points:
    30
    Maybe power draw is lower when using the nvidia card? You can't always assume that the weaker gpu uses less power
     
  31. Quatro

    Quatro Journeyman

    Reputations:
    536
    Messages:
    1,834
    Likes Received:
    6
    Trophy Points:
    56
    Which is the context menu?

    Right-clicking on a shortcut & selecting the preferred GPU & then launching?

    And I can't check GPU-Z in full-screen HULU. When I tab over to GPU-Z, HULU immediately drops back to the small size. But if Optimus is choosing to use NVIDIA for the HULU movie at the small size, it would make sense to me that that's not going to change at full size.
     
  32. Quatro

    Quatro Journeyman

    Reputations:
    536
    Messages:
    1,834
    Likes Received:
    6
    Trophy Points:
    56
    I thought that HWMonitor was reading the power usage of just the CPU. Is it reading all power draw from the motherboard?

    So I thought the CPU was drawing less because the NVIDIA GPU was doing the work and drawing its own power (for which I had no measurement).

    Engineer help please.
     
  33. bozeefus

    bozeefus Notebook Consultant

    Reputations:
    1
    Messages:
    136
    Likes Received:
    0
    Trophy Points:
    30
    My comment was actually directed towards BlazingSkies, who commented that Optimus was retarded. Perhaps Optimus does know what's best when trying to maximize battery life.
     
  34. MasterEvilAce

    MasterEvilAce Notebook Guru

    Reputations:
    14
    Messages:
    71
    Likes Received:
    0
    Trophy Points:
    15
    I think what's possible is... that running Intel gfx @ 100% load can draw more power than running nvidia @ 10%...

    So if something is too demanding for the integrated graphics, perhaps it will be better kicking in the discrete.

    I don't have any numbers for this, just saying...
     
  35. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    Yeah, the context menu is what pops up when you right click.

    Fullscreen Hulu probably wouldn't use the IGP either. Still, I think the main issue is the way Flash integrates into the browser; most games would probably listen to your choice of GPU.
     
  36. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    I'd say that that's probably somewhat true, and in some situations this would also cause greater power consumption in the CPU, because the Intel IGP would not be able to take all the load off the CPU.
    These benchmarks support the idea that in some situations the discrete card actually consumes less power overall, since the Optimus system got slightly greater battery life in the test than an IGP-only system.
     
  37. bozeefus

    bozeefus Notebook Consultant

    Reputations:
    1
    Messages:
    136
    Likes Received:
    0
    Trophy Points:
    30
    yes i agree. you expressed my thoughts exactly :)
     
  38. BlazingSkies

    BlazingSkies Notebook Consultant

    Reputations:
    0
    Messages:
    183
    Likes Received:
    0
    Trophy Points:
    30
    Why can't Nvidia let users have a switch for ON/OFF... it'd be easier for EVERYONE. Choose when you want it on and off. It's not like it SPLITs the integ. and discrete at the same time... ex.

    You have word doc, movie, youtube on...
    word doc uses integ.
    movie uses discrete.
    youtube uses one of them?

    something like that. Otherwise Optimus seems like such a RETAR......ed idea -_- it'd be easier just to have an on/off switch... man, are computer users THAT LAZY?
     
  39. lowlymarine

    lowlymarine Notebook Deity

    Reputations:
    401
    Messages:
    1,422
    Likes Received:
    1
    Trophy Points:
    56
    The issue is that computers with switchable graphics like you describe typically require a reboot in order to change which GPU is in use (thus negatively impacting battery life since start-up/shutdown are very power-intensive), special drivers just for the hybrid graphics, or both. In theory, Optimus should remedy both of these issues (though it still requires special drivers for now) with little or no battery life impact.

    I also don't find it hard to believe that using the nVidia GPU is more power-efficient for Flash video, at least with Flash 10.1 installed. Remember, Flash 10.1 doesn't do hardware decoding on Intel cards, while the nVidia GPU can likely decode .flv with much lower CPU usage while in its lowest power state.
    And really, it's not really laziness so much as forgetfulness. I'd probably launch a game or movie and then remember I needed to flip the switch about 90% of the time. :p
     
  40. Quatro

    Quatro Journeyman

    Reputations:
    536
    Messages:
    1,834
    Likes Received:
    6
    Trophy Points:
    56
    Well, they do. It's called the VT series, as in UL30Vt, UL80Vt & UL50Vt. They have a physical button instead of Optimus.

    But I trust Optimus to do its job. And it seems to be doing it pretty well.
     
  41. Quatro

    Quatro Journeyman

    Reputations:
    536
    Messages:
    1,834
    Likes Received:
    6
    Trophy Points:
    56
    I still have a question about Hardware Monitor. Is its power drain figure:
    1) for the power the CPU is taking or
    2) for everything on the motherboard (including the NVIDIA GPU when it's on)?

    Because if it is for the power the CPU is taking, then of course the power wattage will drop as the discrete graphics takes the load off of the CPU's graphics ... but it would also mean I have no way to check how many watts the NVIDIA discrete chip is taking.
     
  42. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    HWMonitor's power consumption is likely for the whole Arrandale chip (cores and uncore), but I wouldn't consider the figure at all reliable.
     
  43. Quatro

    Quatro Journeyman

    Reputations:
    536
    Messages:
    1,834
    Likes Received:
    6
    Trophy Points:
    56
    Okay, but what I'm getting at is that means that if the power draw goes lower (5.5-9.5W) when the NVIDIA chip kicked in, that doesn't mean lower power consumption because I have no measurement on what wattage the discrete GPU is then drawing.
     
  44. Quatro

    Quatro Journeyman

    Reputations:
    536
    Messages:
    1,834
    Likes Received:
    6
    Trophy Points:
    56
    When the NVIDIA GPU is running, the power draw would be lower FROM THE CPU, but then the NVIDIA GPU would ALSO be drawing its own power, so it wouldn't necessarily be a power saving.
     
  45. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    Yes, but the point is that it's entirely possible that the Nvidia GPU will use less power than the Arrandale IGP+CPU would need to do the same task.
     
  46. Quatro

    Quatro Journeyman

    Reputations:
    536
    Messages:
    1,834
    Likes Received:
    6
    Trophy Points:
    56
    Is there any software monitor to see how much total power the CPU + on-die Intel GPU, or the CPU + Nvidia GPU, are drawing at any time?
     
  47. eugenes

    eugenes Notebook Evangelist

    Reputations:
    201
    Messages:
    405
    Likes Received:
    12
    Trophy Points:
    31
    Tweak 7/Tweak Vista does have a power tap that shows the drain on the battery, so that may be a way for you to estimate how much less or more power the Nvidia GPU is drawing.
     
  48. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    Yeah, the best way you've got of monitoring power usage is via the battery. Most computer components don't tend to have the right sensors for measuring power usage.
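
    On Windows the battery itself reports a discharge rate, so a small script can read the same kind of number a tool like Tweak 7 shows. A minimal Python sketch, assuming the third-party wmi package (this is the whole-system draw from the battery, not a per-GPU figure):

        import wmi  # third-party: pip install wmi (Windows only)

        # BatteryStatus in the root\wmi namespace reports DischargeRate in milliwatts.
        c = wmi.WMI(namespace="root\\wmi")
        for status in c.BatteryStatus():
            if status.Discharging:
                print(f"System draw from battery: {status.DischargeRate / 1000:.1f} W")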
     
  49. BlazingSkies

    BlazingSkies Notebook Consultant

    Reputations:
    0
    Messages:
    183
    Likes Received:
    0
    Trophy Points:
    30
    Hey Quatro
    If you have the resources can you test PCSX2 and tell me how it goes?
     
  50. GENETX

    GENETX Notebook Geek

    Reputations:
    64
    Messages:
    92
    Likes Received:
    0
    Trophy Points:
    15
    Measuring by software is a very bad idea imho. You don't have any information about the sensors, their accuracy and location. A real power meter on the power cable itself would be the best option. But in that case, it should be a good high-end one as well to get a good measurement.

    I don't think it makes much sense. The engineers at NVIDIA probably already tested this in their lab, creating a good algorithm to switch between integrated and dedicated. By switching manually, you wouldn't gain more than 15 minutes, I think. Most likely, you'd lose time.

    When I look at the TDP, the 210M (same core as the 310M) is listed at 14W at notebookcheck. The Intel IGP is said to use about 8W, but Intel uses a different TDP calculation, said to be about 75% of the absolute max, yielding roughly 10.7W max. It is also most likely that the max won't be reached for either. The Intel IGP will clock lower, while NVIDIA does the same with PowerMizer.

    As stated, the NVIDIA GPU will do some work the CPU usually does. Probably the difference between those combinations (CPU + Intel or CPU + NVIDIA) won't be much more than 1 or 2W during internet browsing and playing a flash movie here and there on the web...

    So when the laptop gets a battery life of 420 mins (7hrs) I can do some calculations. The battery has 61Wh capacity, which yields 219600J of energy.

    219600 / (420*60) = 8.714W dissipation for the system.

    So if I am correct, and you were to watch flash movies for 30 mins at +2W, this would give the following battery life:

    8.714 + 2 = 10.714W dissipation during 30 mins
    10.714 * 60 * 30 ≈ 19285J dissipation
    219600 - 19285 = 200315J left in the battery
    (200315 / 8.714) / 60 = 383 minutes runtime left
    383 + 30 = 413 minutes of total runtime (6:53)

    As you can see, these little improvements wouldn't make much difference. You won't watch flash movies all the time when you are working on the battery. You won't consume much more energy compared to the most optimal situation. Your real loss won't be much more than just a few minutes. It's not really worth tweaking.

    What is really worth looking at is undervolting your processor. That is AFAIK not possible for the Arrandale processors. I saw great improvements on the Core 2 Duo CULV notebooks, where users report getting an additional 30 mins of runtime.

    I can't find the voltages, but 5.5W @ 0.9GHz could be at 0.7V maybe. -0.1V isn't unusual in undervolting, yielding (0.6 / 0.7) * 5.5 = 4.7W dissipation for the CPU (assuming Quatro's measurements were for the CPU). But to be a bit safer on the calculations, I'll set this to 5W. This 0.5W lower dissipation will have a big impact:

    219600 / (8.714 - 0.5) / 60 = 445 minutes of runtime, that is an improvement of 25 minutes already.
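
    The same back-of-the-envelope math in a few lines of Python, with the figures above plugged in (the +2W flash penalty and the 0.5W undervolt saving are assumptions from the post, not measurements):

        # 61Wh battery and a 420-minute baseline runtime, as above.
        CAPACITY_J = 61 * 3600              # 219600 J
        BASE_W = CAPACITY_J / (420 * 60)    # ~8.714 W average system draw

        # 30 minutes of flash video at an assumed extra 2 W.
        flash_j = (BASE_W + 2.0) * 30 * 60
        remaining_min = (CAPACITY_J - flash_j) / BASE_W / 60
        print(int(remaining_min + 30))      # ~413 minutes total

        # An assumed 0.5 W saving from undervolting, over the whole runtime.
        print(int(CAPACITY_J / (BASE_W - 0.5) / 60))  # ~445 minutes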
     