nvm i am wrong
-
Does the Taiwan model come with a US/Canada QWERTY keyboard? Important as the UK English model has the symbol keys in very different locations, though both are QWERTY.
The Taiwan model means you'll need to depend on the ASUS warranty, since most likely you won't be returning it to Taiwan.
The different OS levels may not mean much to you. Depends on your needs. I have Home Premium on one notebook & Professional on another & can't really tell the difference.
Does the Canadian Win Pro come with a real DVD or is it a backup on the hidden partition on the HDD? A real DVD would have been worth something to me.
Be sure the Taiwan seller will put in real Transcend RAM, that it's TWO sticks of 2GB each, and that it runs at the correct speed, not something slower.
Bigger 500GB HDD is nice for the $27 more.
Since the two are nearly the same price, if they indeed upgrade the RAM, the issues to me would be the warranty (are you happy to send it to ASUS if there's a problem?) and whether the keyboard layout is correct for your area. -
-
Thanks for the quick response.
-
How are you getting $790? Are you seeing a different cash back %?
It'd be great if I'm just misreading the BCB page. $790 would be a great price. -
That was at TigerDirect, though the U30JC isn't available there anymore.
-
Does the BIOS allow you to enable Virtualization? I wanna upgrade to Win 7 Pro and use XP-mode.
-
Just ordered a U30JC from NewEgg.
I hope the battery lives up to my expectations. I'd love to bring it to class and the library a lot. -
-
guys, PLEASE do me one favour? can you try to play some ps2 games on pcsx2 (ps2 emulator) and see how it turns out? anyone know? if it runs great then i'd jump on this! please help anyone!! i would appreciate it tenfold!
-
@Stumme
I would go for the X2C with no doubt: 4GB RAM is a worthy upgrade and 7 Pro would be nice. I advise you not to invest in huge HDDs because you can upgrade to an SSD anytime. But the i5-430M is not a huge improvement over the i3-350M; it just adds a small turbo boost (up to 2.5GHz) and eats up more battery. So either the i3 for a lower price or the i5-520M for better performance would be a better option.
Does anybody know if U30JC ships with i5-520M?
@fusoyaii
All Core i CPUs support VT-x, but only the i5/i7 support directed I/O (VT-d) -
-
D*mn, Newegg is sold out again already. I wanted to run it by my spouse before hitting the button to buy and now I missed out. Guess that gives me a chance to wait out the Acer 3820tg or Asus ul30jt then...
-
This limited difference between i3 and i5 you mentioned, though... does that mean one wouldn't notice a thing unless running some resource-demanding games? And how much battery life can we expect the i5 CPU to cost? -
Resource-demanding games are almost certainly going to be bottlenecked by the 310M rather than the CPU anyway.
-
-
Why would a faster processor consume more energy? They all run at a lower clock speed and voltage in energy-saving mode. These are the same for all processors in the series, so they will behave the same. Therefore, the energy consumption would be equal.
-
@GENETX
Higher clock speeds generate more heat and consume more energy, and you'll see that when you run demanding apps.
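To put rough numbers on it: CMOS dynamic power scales roughly with C·V²·f, so in the throttled-down power-saving state both chips should indeed draw about the same, but the boost clock comes with extra voltage. A toy Python sketch (the voltages here are made up for illustration; Intel doesn't publish them):

    # Back-of-the-envelope CMOS dynamic power: P ~ C * V^2 * f
    def rel_power(volts, ghz):
        return volts ** 2 * ghz  # the capacitance C cancels out in the ratio

    throttled = rel_power(0.75, 1.2)  # both i3 and i5 idle down to this state
    boosted = rel_power(1.10, 2.5)    # i5 with turbo engaged under load

    print(f"boosted draw is ~{boosted / throttled:.1f}x the throttled draw")

So idle-to-idle they match, and the difference only shows up under sustained load. -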
4 hours is the bare minimum for me.
I am basically looking for a Macbook Pro without the high price tag and no accidental damage warranty. -
-
-
As an example, before just now going online, I had the wifi OFF (it goes on in 2 sec literally and uses battery so I leave it off unless I'm online), brightness at 50% and was reading a downloaded book (Kindle for PC) with Word 2007 open and using "Power Saving" mode with the NVIDIA Control Panel set to "integrated graphics" as the default.
My battery meter reports "84% 8 hrs 15 min remaining". -
I cannot wait to get mine!
And regarding turning Optimus off: the only way, from what I know so far, is to uninstall the Nvidia drivers. If you do that, though, you won't have access to the 310M. -
@Quatro:
does the 310M just kick in when it detects more graphics demand?
In other words, can the NVIDIA 310M be turned ON/OFF for some extra battery juice?
Thanks! -
Can anyone test the laptop with PCSX2?
-
Any recommendation for a low-profile, tightly fitting sleeve for this laptop? The Case Logic one made for the MacBook Pro doesn't look good to me.
thanks -
Here's an interesting experiment I just did...
I'll use CPU-Z, GPU-Z and Hardware Monitor to keep watch on CPU speed, power draw & the GPU in use in real time. (NOTE: If GPU-Z is started when only the Intel chip is active, it won't later see the NVIDIA GPU, so GPU-Z must be restarted once the NVIDIA chip is running.)
I'm:
1) on battery
2) Battery Saving mode (CPU-Z: 0.9-1.2GHz, HWM: 5.5W-9.5W)
3) Default "integrated graphics" in NCP (Nvidia Control Panel)
4) Firefox 3.5.9 set in NCP to run in integrated graphics
Using GPU-Z to see which GPU is bearing the load (and restarting GPU-Z after each step):
1) Boot-up to desktop, GPU-Z shows "integrated graphics in use"
2) Launch Firefox, GPU-Z shows "integrated graphics in use"
3) Go to HULU, GPU-Z shows "integrated graphics in use"
4) Play "Fringe" in standard 360p, GPU-Z shows " NVIDIA in use"
5) Change "Fringe" to HD 480p, of course, GPU-Z shows " NVIVDIA in use"
When NVIDIA GPU is active:
1) HW Monitor shows that the power draw from the CPU DROPS to 5.47W as the NVIDIA GPU kicks in (takes load from the CPU).
2) CPU stays at 0.9GHz since NVIDIA is doing the work.
So clearly, Optimus still decides what it wants, taking my preference into account but not giving me absolute control.
But if I'm in Office apps and basic wireless with no ads or TV videos running, NVIDIA should stay off. -
Optimus is pretty much retar........ for not letting you watch vids on the integrated GPU.
-
I think Optimus probably will listen to you for fullscreen apps, but Flash and the like are tricky and may not be easy to get working the way you want.
By the way, did you add the GPU selection to the context menu? That seems like it might be the most convenient way to micromanage Optimus in specific circumstances. -
-
Right-clicking on a shortcut, selecting the preferred GPU & then launching?
And I can't check GPU-Z in full-screen HULU. When I tab over to GPU-Z, HULU immediately drops back to the small size. But if Optimus is choosing to use NVIDIA for the HULU movie at the small size, it makes sense to me that that won't change at full size. -
So I thought the CPU was drawing less because the NVIDIA GPU was doing the work and drawing its own power (for which I had no measurement).
Engineer help please. -
-
I think what's possible is that running the Intel graphics @ 100% load can draw more power than running the NVIDIA @ 10%...
So if something is too demanding for the integrated graphics, perhaps it's better to kick in the discrete.
I don't have any numbers for this, just saying...
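Purely illustrative, since like I said I don't have real numbers, but the energy math is just watts times seconds (the wattages below are invented):

    # Invented wattages, only to show how offloading could win despite the higher TDP
    CLIP_MIN = 30  # half an hour of Flash video

    igp_path_w = 8.0 + 4.0  # hypothetical: IGP flat out + CPU doing the decode work
    gpu_path_w = 5.5 + 4.0  # hypothetical: near-idle CPU + 310M at low clocks

    for name, w in (("IGP path", igp_path_w), ("310M path", gpu_path_w)):
        print(f"{name}: {w:.1f}W -> {w * CLIP_MIN * 60:.0f}J for the clip")

If the real draws land anything like that, letting the 310M take the video is the cheaper option. -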
-
These benchmarks support the idea that in some situations the discrete card would consume more power, since the Optimus system actually got slightly greater battery life in the test than an IGP-only system. -
-
Why can't Nvidia let users have a switch for ON/OFF... it'd be easier for EVERYONE. Choose when you want it on and off. It's not like it SPLITs the integ. and discrete at the same time... ex.
You have word doc, movie, youtube on...
word doc uses integ.
movie uses discrete.
youtube uses one of them?
something like that. otherwise optimus seems like such a RETAR......ed idea -_- it'd be easier just to have an on/off switch... man, are computer users THAT LAZY? -
I also don't find it hard to believe that using the nVidia GPU is more power-efficient for Flash video, at least with Flash 10.1 installed. Remember, Flash 10.1 doesn't do hardware decoding on Intel graphics, while the nVidia GPU can likely decode .flv with much lower CPU usage while in its lowest power state.
And really, it's not laziness so much as forgetfulness. I'd probably launch a game or movie and then remember I needed to flip the switch about 90% of the time. -
But I trust Optimus to do its job. And it seems to be doing it pretty well. -
I still have a question about Hardware Monitor. Is its power drain figure:
1) for the power the CPU is taking or
2) for everything on the motherboard (including the NVIDIA GPU when it's on)?
Because if it is for the power the CPU is taking, then of course the wattage will drop as the discrete graphics takes the load off the CPU's graphics... but it would also mean I have no way to check how many watts the NVIDIA discrete chip is drawing. -
HWMonitor's power consumption is likely for the whole Arrandale chip (cores and uncore), but I wouldn't consider the figure at all reliable.
-
-
-
Yes, but the point is that it's entirely possible that the Nvidia GPU will use less power than the Arrandale IGP+CPU would use to do the same task.
-
-
Tweak 7/Tweak Vista does have a power readout that shows the drain on the battery, so that may be a way for you to estimate how much less or more power the Nvidia GPU is drawing.
-
Yeah, the best way you've got of monitoring power usage is via the battery. Most computer components don't tend to have the right sensors for measuring power usage.
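If you want to put a number on it yourself, a few lines of Python with the psutil library can average the drain from the OS battery counters (a rough sketch, assuming the 61Wh pack, that a battery is present, and that you stay unplugged for the whole sample):

    import time
    import psutil  # third-party library that exposes the OS battery readings

    PACK_WH = 61.0   # U30JC pack capacity
    SAMPLE_S = 600   # use a long window: the percent gauge moves in 1% steps

    start = psutil.sensors_battery().percent
    time.sleep(SAMPLE_S)
    end = psutil.sensors_battery().percent

    drained_wh = (start - end) / 100.0 * PACK_WH
    print(f"average draw: {drained_wh / (SAMPLE_S / 3600.0):.1f}W")

Run it once on the integrated graphics and once with the NVIDIA forced on, same workload, and the difference is your answer.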
-
Hey Quatro
If you have the resources, can you test PCSX2 and tell me how it goes? -
Measuring by software is a very bad idea imho. You don't have any information about the sensors, their accuracy and location. A real power meter on the power cable itself would be the best option, but in that case it should be a good high-end one as well to get a good measurement.
I don't think it makes much sense anyway. The engineers at NVIDIA probably already tested this in their lab and built a good algorithm to switch between integrated and dedicated. By switching manually, you wouldn't win more than 15 minutes, I think. Most likely, you would lose time.
When I look at the TDP, notebookcheck lists the 210M (same core as the 310M) at 14W. The Intel IGP is said to use about 8W, but Intel uses a different calculation for TDP, which should be 75% of the absolute max, yielding 10.7W max. It is also most likely that neither max will be reached: the Intel IGP will clock lower, while NVIDIA does the same with PowerMizer.
As stated, the NVIDIA GPU will take over some work the CPU usually does. Probably the difference between the two combinations (CPU + Intel or CPU + NVIDIA) won't be much more than 1 or 2W while browsing the internet and playing a flash movie here and there...
So when the laptop gets a battery life of 420 mins (7 hrs), I can do some calculations. The battery has 61Wh of capacity, which yields 219600J of energy.
219600 / (420*60) = 8.714W dissipation for the system.
So if I am correct and you watch flash movies for 30 mins @ +2W TDP, this would give the following battery life:
8.714 + 2 = 10.714W dissipation during those 30 mins
10.714 * 60 * 30 = 19285J dissipated
219600 - 19285 = 200315J left in the battery
(200315 / 8.714) / 60 = 383 minutes runtime left
383 + 30 = 413 minutes of total runtime (6:53)
As you can see, these little improvements don't make much difference. You won't be watching flash movies the whole time you're on battery, so you won't consume much more energy than in the most optimal situation. Your real loss won't be much more than a few minutes. It's not really worth tweaking.
What is really worth looking at is undervolting your processor. That is afaik not possible for the Arrandale processors, but I saw great improvements on the Core 2 Duo CULV notebooks, where users report getting an additional 30 mins of runtime.
I can't find the voltages, but 5.5W @ 0.9GHz could be at 0.7V maybe. -0.1V isn't unusual in undervolting, yielding (0.6 / 0.7) * 5.5 = 4.7W dissipation for the CPU (assuming Quatro's measurements were for the CPU). But to be a bit safer in the calculations, I'll set this to 5W. This 0.5W lower dissipation would have a huge impact:
219600 / (8.714 - 0.5) / 60 = 445 minutes of runtime; that is an improvement of 25 minutes already.
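For anyone who wants to redo the arithmetic with their own numbers, it fits in a few lines of Python (a sketch using the same 61Wh pack and 420-minute baseline from above):

    PACK_J = 61.0 * 3600               # 61Wh battery = 219600J
    BASE_MIN = 420                     # the observed 7-hour runtime
    base_w = PACK_J / (BASE_MIN * 60)  # ~8.714W average system dissipation

    # 30 minutes of flash video at +2W on top of the baseline
    flash_j = (base_w + 2.0) * 30 * 60
    total = (PACK_J - flash_j) / base_w / 60 + 30
    print(f"with 30 min of flash: {total:.0f} min")  # ~413 min (6:53)

    # undervolting: shave ~0.5W off the average system dissipation
    uv = PACK_J / (base_w - 0.5) / 60
    print(f"with a -0.5W undervolt: {uv:.0f} min")   # ~446 min, ~25 min gained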