Naw, you have nothing to worry about. As you can see from the benchmarks, even without the cooler it only hits 60°C, which is like an icebox in GPU terms (my old desktop X1800 XT regularly hit 85°C at full load - it's crazy that my laptop is slightly more powerful and runs much cooler). Clevo did an amazing job cooling this thing. The real advantage of the Zalman cooler is that it keeps the GPU fan from spinning up for that much longer while you're just surfing the web.
-
Agent CoolBlue:
Ah, I see. I hope my LapCool 2 is sufficient in terms of size then, lol... just a couple more days. I'm expecting it to ship on Monday, and with 3-day shipping I guess Thursday is the day I finally get to see this bad boy.
Tell me though, the highest-end computer/laptop I've ever used was, let's see... a 3.0 GHz Prescott (single-core), 1.5 GB of DDR RAM (probably under 667 MHz), and a 6800 XT.
I hope I see a huge improvement in graphics on Counter-Strike: Source, ^_^. -
Hehe, oh yeah. Just played a round in Office, 1920x1200, 4xAA, 16xAF, max settings. Averaging about 90 FPS.
-
Glad you got it working.
Mine still won't :/ The driver refuses to load in any compatibility mode; I'm guessing it's because I'm using 64-bit Vista. Oh well
-
Agent CoolBlue:
Just to clear up my doubts: does the laptop come with the OS and drivers pre-installed?
I'm a little wary of installing drivers myself, since some people here are having problems with theirs.
EDIT: And the NP5790 doesn't happen to support HDTV output, does it? I'd like to get an external monitor soon and watch some HDTV movies, ^_^. -
Would you mind testing Medieval II: Total War and providing benchmarks?
-
Yup, if you ordered your notebook with an OS, it will come to your door pre-installed, with the drivers already set up.
You can hook it up to an HDTV through a DVI or S-Video port assuming your HDTV has the port, and you have a DVI or S-video cable, respectively.
Stevek: If it has a demo, I'll have some benchmarks up probably sometime tonight.
-
Agent CoolBlue:
Oh... won't you lose quality if you connect via DVI or S-Video?
And is there any bloatware? -
DVI-D has no quality loss associated with it, to my knowledge. S-Video would have a massive quality loss; it can only output 480i.
-
No bloatware at all. That's the bonus of not buying from the majors like Dell, HP, Sony, etc.
-
Agent CoolBlue:
Ah yes, what a relief, thanks!
EDIT: Joga, you may be swamped with requests for your review, but I was just wondering if you could give a few short comments, or even a short section in your review, about the quality of the built-in webcam. A few people like me are a little bummed about not getting a 2.0 MP cam like the Sager NP2090's, but I'm hoping the 1.3 MP cam in the NP5790 is of higher quality than the NP2090's ^_^. -
Well, to be honest, I'm a bit disappointed in the webcam. The quality is alright - but it's the motion blur that kills it. Every time you move - at all - it will blur. Moving your head side to side, blinking, whatever.
Quality:
Example of motion blur (I barely moved my head):
I've never actually owned a webcam, so I don't know if this is normal for a webcam or not. In any case, it works and it gets the job done; I can honestly say it's the best webcam I've ever owned.
-
Agent CoolBlue:
Ah, much appreciated! But yes, it does get the job done. I don't think webcams are supposed to blur that badly, but I could be wrong... I hope I am, anyway, lol.
-
Most webcams are like most phone cams: good for stills, under par for motion capture.
I don't use my built-in camera at all... I've had it for over 1.5 years now.
I tested it initially with Eyeball Chat... it seemed fine, but the novelty wore off quickly for me. -
Updated the main review with the Medieval II: Total War benchmark.
Medieval II: Total War (Demo)
Benchmarked with FRAPS, using the introduction for the "Battle of Otumba" mission. It's important to note that real-time strategy games are usually limited by the CPU rather than the GPU, so if you configured your 5790 with a Core 2 Duo faster than my 2.0 GHz, you'll probably see higher framerates.
Settings: 1920x1200, 0xAA, 4xAF, all settings on "High".
Average: 18 FPS (perfectly playable for an RTS)
Image quality sample:
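For anyone curious how that average comes out of FRAPS: the benchmark log is just frame timing data, and the mean FPS is frames divided by elapsed time. Here is a minimal sketch; the filename and the assumption that the second CSV column is a cumulative per-frame timestamp in milliseconds are mine, not the exact FRAPS layout.

```python
# Minimal sketch: average FPS from a FRAPS-style frametimes log.
# Assumes the second column of the CSV is a cumulative timestamp in ms
# for each rendered frame (filename and column layout are assumptions).
import csv

def average_fps(path: str) -> float:
    timestamps_ms = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            timestamps_ms.append(float(row[1]))
    frames = len(timestamps_ms) - 1                       # frame intervals
    elapsed_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    return frames / elapsed_s

if __name__ == "__main__":
    print(f"Average: {average_fps('medieval2_frametimes.csv'):.1f} FPS")
```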
-
Thanks a lot, that screen looks great. I'm probably going to wait for the 8700 GT; hopefully I can get the same FPS at 1680x1050.
-
Wow, those are pretty good framerates for such a high resolution. I wonder how much faster it might run at WSXGA+ or WXGA+.
-
The blurring is pretty bad compared to the ASUS G1 webcam (also 1.3 MP). Neither is great quality, though not bad either, but there's a big difference in the way they handle motion. The NP5790 cam may produce a slightly better image, but the G1's doesn't blur at all unless there's some heavy movement.
-
I wonder if it's the camera, the sampling rate, or the software. I'm guessing the sampling rate of the software.
-
Thank you very much. Now I have to look into that whole CPU upgrade to 2.4 GHz and see if that will help out. I've been reading a lot about how 2.0 GHz is the sweet spot, but I'm not sure about all that, so I'll have to do some more reading.
-
Agent CoolBlue:
Yes, the 2.0 GHz is the sweet spot, if you want to call it that.
Why...?
The jump from 1.8 GHz to 2.0 GHz gives such a good performance boost because the 1.8 has 2 MB of cache while the 2.0 has 4 MB. But many people will tell you that you won't see that big of a boost from the 2.0 to the 2.2 or 2.4 in games. -
Regarding the webcam, does anyone know which company makes it for Clevo? Judging from Joga's sample pic, the still one looks very sharp and bright. The blurring with motion is a big drag though.
The latest Logitech webcams have the same problem. While the newest lenses are much improved over the old ones, the new drivers suck wind big time. Logitech's support forum is filled with thousands of complaints about images being too dark and blurring severely at the slightest movement. The updated drivers, downloadable from their website, aren't any better either. I ended up having to uninstall the new driver and use a really old version I borrowed from my brother to get rid of the motion blurring. -
Let me get this straight: the motion blur is a consequence of the driver, not the hardware?
-
Could be, but not necessarily.
-
I thought we had already gone over the fact that the boost this cache gives is pretty marginal for most applications. Really, it's just the sweet spot for the performance/price ratio. That's all.
-
2.2 would be the sweet spot, since it's not that much more than 2.0 and is a 10% increase in clock speed, but the 2.2 to 2.4 jump is much more expensive (usually $240 more).
-
Not quite. There are several things to consider, such as the optics themselves and CMOS vs. CCD sensors. But given equal lens quality and sensor type, the software makes all the difference!
-
I've been trying to find some stats on that, because I want to know if there's a real-world increase in performance between the T7300 (2.0 GHz) and the T7500 (2.2 GHz).
Any links to back up that 10% claim? I'd really appreciate it. TIA -
Well, 2.2 is a 10% increase in clock speed over 2.0.
You will notice it in any application that uses both cores to render something: video rendering, music, 3D animation, etc.
If you don't use any of those apps it may not make much difference, but still, the jump from 2.0 to 2.2 is just $90, while the jump from 2.2 to 2.4 is about $240. The first one makes sense; the second only if you don't care about cost. -
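To put quick numbers on that price/performance argument (using the $90 and $240 upgrade prices quoted above, and treating clock speed as a best-case stand-in for performance, which is an assumption):

```python
# Rough price-per-performance arithmetic for the CPU steps discussed above.
# Clock speed is used as a best-case proxy for the performance gain.
upgrades = [
    ("2.0 -> 2.2 GHz", 2.0, 2.2, 90),    # upgrade prices quoted in this thread
    ("2.2 -> 2.4 GHz", 2.2, 2.4, 240),
]

for name, base, new, price in upgrades:
    gain_pct = (new - base) / base * 100
    print(f"{name}: +{gain_pct:.1f}% clock for ${price} "
          f"(~${price / gain_pct:.0f} per percent)")

# Prints roughly:
#   2.0 -> 2.2 GHz: +10.0% clock for $90 (~$9 per percent)
#   2.2 -> 2.4 GHz: +9.1% clock for $240 (~$26 per percent)
```

So dollar for dollar, the first step costs roughly a third as much per percent of clock speed as the second.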
Well, the claim is based on the fact that the T7300 and the T7500 differ only by a 10% increase in clock speed, hence a 10% performance increase. Check out the benchmarks here.
-
Thanks, Odin. Even though it's only calculating Pi, I wonder how that translates into real-world scenarios, and whether it still comes out to a ~10% performance difference. They say in gaming it doesn't.
I know from back in the day that a 10% difference in clock speed didn't translate into a 10% difference in performance. Plus, I've upgraded CPUs in the past and the difference in clock speed never translated equivalently into real-world performance. But those were single cores. -
2.0, that's not Pi... that's a multi-threaded test of one of Newton's functions.
It's extremely accurate (compared to the old single-threaded Super Pi)... although as for real-world comparison, this test will show the difference in heavy CPU-intensive work (audio/video editing and conversion, DCC work, etc.). -
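For anyone wondering what that kind of multi-threaded workload looks like, here is a rough sketch only: Newton's method for square roots iterated many times, split across worker processes. It is not the benchmark's actual code, just the general shape of the work.

```python
# Sketch of a multi-core Newton's-method workload (not the benchmark itself).
# Each worker refines sqrt(n) via the iteration x_{k+1} = (x_k + n / x_k) / 2.
from concurrent.futures import ProcessPoolExecutor

def newton_sqrt_work(count: int) -> float:
    total = 0.0
    for n in range(2, count):
        x = float(n)
        for _ in range(50):            # fixed number of Newton iterations
            x = 0.5 * (x + n / x)
        total += x
    return total

if __name__ == "__main__":
    chunks = [200_000] * 2             # one chunk of work per core
    with ProcessPoolExecutor(max_workers=2) as pool:
        results = list(pool.map(newton_sqrt_work, chunks))
    print(f"checksum: {sum(results):.3f}")
```

On a dual-core chip the two chunks run in parallel, which is why a test like this scales with both clock speed and core count.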
Calling out to the Sager NP5790 (Clevo M570RU) owners:
- If you are using RivaTuner and get 0 degrees for the temp readings, which drivers are you using on your notebook?
- There has been success with RivaTuner temp monitoring on the new Clevo notebooks (with the 7950 GTX) using the 94.22 drivers... which are stock. -
RivaTuner shows 0 for the temperature for me using both the stock drivers and 165.01, which I installed last night to try out (obtained from http://www.laptopvideo2go.com). I should note I'm using Vista 64-bit.
I can try other drivers if you want; just let me know which ones.
-
Good to know. Thanks. So for gaming, word processing, trading, etc., it won't make a material difference.
-
For Vista (32-bit) users who do not see temp readings (from Joga):
Just right-click the RivaTuner shortcut, select Properties, and run it in compatibility mode for Windows XP SP2 (and probably run it as administrator, although I wouldn't know, since I already disabled UAC).
- (To access UAC) Control Panel -> User Accounts -> User Accounts (again) -> Turn User Account Control on or off
Conclusion: I believe Vista locks the system down so much that it won't allow low-level hardware access... especially with the dumb UAC... as well as requiring the user to have admin rights to do anything.
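Side note: if you want to confirm whether UAC is actually off without clicking through the Control Panel, the sketch below just reads the standard EnableLUA registry value (Windows-only, read-only; 1 = on, 0 = off in the usual interpretation).

```python
# Quick check of the UAC toggle via the well-known EnableLUA registry value.
# Windows-only sketch; reads the value, does not change anything.
import winreg

UAC_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, UAC_KEY) as key:
    enable_lua, _ = winreg.QueryValueEx(key, "EnableLUA")

print("UAC is", "ON" if enable_lua else "OFF")
```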
Well, in gaming it would help, especially in some of the newer games that use the CPU for physics calculations. -
I've had UAC off since I got the computer.
I tried the compatibility mode when Joga posted that, but it gives me a 'can't load rivatuner32.sys' error and won't load the driver into the program. -
It depends on the game. For most games (particularly graphics-intensive shooters) there wouldn't be any significant difference. On my old desktop I had an Opteron 165 (practically identical to an Athlon 64 X2, but more overclockable), which I overclocked to 2.5 GHz (from the stock 1.8 GHz). While a jump that big did have a bit of an impact on gaming performance (~5-10 FPS), a more moderate jump (like 2.25 up to 2.5) had a negligible impact (~1-2 FPS) in games like FEAR. A game like Supreme Commander on the other hand, is very CPU limited, and a 10% increase in CPU speed may indeed have close to a 10% performance increase. Shooters may begin to go this way too as games like Crysis have more advanced physics and AI, but also remember that games are just now beginning to utilize multiple cores, which should help offset any "deficiency" in raw CPU speed.
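A toy model of that CPU-limited vs. GPU-limited split, with made-up numbers purely for illustration: if each frame costs whichever of the CPU or GPU work takes longer, a 10% faster CPU does nothing while the GPU is the bottleneck, and gives roughly the full 10% once the CPU is.

```python
# Toy model (illustrative numbers only): a frame takes as long as the slower
# of the CPU and GPU work, so a faster CPU only helps when the CPU is the
# bottleneck.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound shooter: GPU needs 16 ms/frame, CPU only 8 ms/frame.
print(fps(8.0, 16.0), fps(8.0 / 1.1, 16.0))    # 62.5 -> 62.5 FPS (no change)

# CPU-bound RTS: CPU needs 50 ms/frame, GPU only 20 ms/frame.
print(fps(50.0, 20.0), fps(50.0 / 1.1, 20.0))  # 20.0 -> 22.0 FPS (~10% gain)
```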
-
I got a number of confirmations that Vista 64-bit has issues with RivaTuner; the fix described above is for the Vista 32-bit versions.
Lemme see if I can find a workaround. -
Cool, thanks.
It's not a huge deal or anything; the rest of the program seems to work fine, and it's the price I pay for using Vista 64-bit. But if there's a quick, dirty, and easy fix, then great! -
You can easily boost the FPS on the webcam, I found. The problem lies in how BisonCap determines what a good "auto" speed is.
Go to Options -> Video Capture Filter -> Camera Control tab
and uncheck "Auto".
30 FPS seems to make decent, freely flowing video, but I keep it around 70. I also noticed that the higher you go, the more likely the camera is to pick up details of objects in poorly lit areas. -
Agent CoolBlue:
What is BisonCap, exactly?
Also, does this boost in FPS apply to applications that actually use the webcam, such as MSN, AIM, Yahoo, etc.? Or does it only apply to BisonCap? -
BisonCap is a basic video capture utility. The boost applies only to BisonCap, but it at least shows that the camera is capable of better frame rates than initially expected. I haven't had a chance to test MSN or Yahoo webcam support yet, but I imagine the FPS will be largely determined by your network quality and how the respective program encodes the video. From personal experience, I have found MSN uses the best network/encoding code (in terms of webcam, anyway) and provides the smoothest video as a result. So you'll probably get the best results there.
---
BTW, is anyone else slightly annoyed by the vibrating sound the keyboard makes as one types? I haven't noticed anything similar on my previous Sony Vaio, but that's probably because that laptop was SIGNIFICANTLY louder than this lovely Sager. My portable USB hard drive is even many times louder than the combined fan noise of this laptop.
I'm also curious where exactly this advertised Audio DJ feature is. Seems like it was a marketing sham, just naming the possible capability of controlling the PC through a remote control and the infrared transceiver. Lame, if I'm not mistaken. At least everything else is sexy. -
Hmm, I don't see a framerate option, but I do see an "Exposure" slider (which I presume is what you're talking about). Basically what happens is that if you lower the exposure (i.e., increase the framerate) you get less blur, but the camera can't take in as much light per frame, so the picture is darker.
I'm in a fairly dark room, so the camera was slowing down the "shutter speed" to compensate for the darkness, and that's why the blurring was so bad.
I guess you can get around it by cranking up the camera's brightness, or just sitting in a brighter room.
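To put rough numbers on that trade-off (assuming the per-frame exposure can't exceed the frame interval, which is a simplification of how the sensor actually behaves):

```python
# Rough numbers for the exposure vs. framerate trade-off described above.
# Assumes per-frame exposure is capped by the frame interval (a simplification).
for fps in (15, 30, 70):
    max_exposure_ms = 1000.0 / fps
    print(f"{fps:>3} FPS -> at most ~{max_exposure_ms:.0f} ms of light per frame")

# 15 FPS -> ~67 ms, 30 FPS -> ~33 ms, 70 FPS -> ~14 ms per frame.
# Longer exposure means a brighter image but more motion blur; shorter, the reverse.
```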
Vibrating keyboard? I haven't noticed anything. It feels and sounds just like any other laptop keyboard I've tried. As for the AudioDJ, that's a typo on whatever site you read it on. The AudioDJ was on the old 5760, but it was taken out of the 5790. -
Actually, I had the reverse effect when increasing the framerate. I was in a dark room and could barely see anything through the cam, but when I bumped the framerate to 70 I could suddenly see a lot more clearly.
I'm definitely experiencing keyboard vibration. It seems to mostly come from the right side of the keyboard when I type. I guess I'll just have to learn to type with less force, or just appreciate the feedback sound =P
Nice typo, though, considering it's still up on the XoticPC site and the Sager site for this exact model. -
Does the Zalman cooler fit the entire notebook or is it a little bit smaller? I couldn't tell from the side view pic you posted.
Thanks for the updated numbers with the cooler. -
It's smaller; the 5790 hangs a bit off the cooler...
Side question:
How much of a difference does WUXGA vs. WSXGA+ make to battery life? -
Agent CoolBlue:
I haven't felt any keyboard vibration yet. And as for the webcam being blurry, I realized that it's only blurry if you move really fast. If you just move around normally it's fine. Just my 2 cents, thanks!
-
The notebook hangs over the left and right sides of the cooler (about 1 inch on the left and 3 inches on the right), but the feet sit completely on the cooler.
Pictures:
- The overhang
- Where the back feet sit on the cooler (the front feet sit directly behind the little plastic bar at the front)
Edit:
Good question. I wouldn't imagine it would make much of a difference. The main drain on the battery is the backlight, not the pixels themselves. I guess the video card might have to work a little harder to push a few more pixels on screen, but I doubt it would make any real difference in battery life unless you were gaming (which probably isn't a good idea if you're running on the battery anyway).
-
Agent CoolBlue:
Hey Joga, what are the usual idle and load temperatures for your hard drive? Mine floats around 55-63°C at idle/load, and it worries me. Should I be worried about this?
I have the Momentus 7200 RPM, if that helps at all, lol.