Sure thing.
P170HM3
- Intel i7-2720QM
- dual Momentus XTs in RAID 0, i.e. 2x500GB (simply because the newest Vertex 3 / M4 / Corsair etc. are in short supply and quite expensive, and this setup gives excellent bang-for-buck performance)
- 16GB Kingston HyperX 1600MHz
- Blu-ray burner
- Intel 6230 WiFi
- GTX 485M
That about covers it, I think...
In case anyone is wondering, the 3D emitter is actually BUILT INTO the laptop/notebook. ALSO, this model has about 3 more speakers built in than the standard P170HM.
And these are high quality screens; considering the cost of the screen upgrade on the standard 170, you may as well get the 3D model, which comes with the better, higher quality 120Hz screen for slightly more dough. (At least then it's 3D, and you get the glasses and the extra speakers too.)
-
-
How's the screen? Matte and 120Hz should really kick in, pal. I've never used a laptop with a 3D screen before, only a desktop LG W2363D... but it's such a pain to go back to 60Hz monitors. Also, do you have any info on your 3D glasses: are they from the newer revision with 60hr playback? Can you provide us with a short video or a few pictures of your system? Nice build, BTW.
-
The screen is fantastic.
It's full HD, matte, 120Hz 3D and identical quality-wise to those upgrades. That's why I think it's a shame to just upgrade your screen when you could get a much better deal by getting the 3D model for a little more.
Don't really know which revision these glasses are, but I'd assume perhaps the newest out? I mean, the HM3 is barely out anywhere and this is built in, so I'd hope it's the latest, lol.
I have played with them for a long time and still haven't needed to recharge them yet.
Yes, I have no problem posting a few pictures or videos on YouTube etc. once I move into my new place with a proper broadband internet connection. -
Congrats
Definitely post some gameplay vids in 3D as I am curious to see how much difference the 485 makes versus the 460 regarding 3D frame rate. -
-
I have Crysis on it at the moment, and once F.E.A.R. gets released I might have that too. Any others that might be worth it? (Preferably new.) -
Also look at Clevo's website and the schematic outline; you'll see the extra speakers. Not a massive improvement, but a little bonus, and it allows the unit to be a little louder overall. -
-
I am now ready to upload some videos and will do so very shortly. Hold tight...
Grrrrr, the upload takes too long with this stupid mobile internet. Guys, you'll have to wait till I get my apartment and ADSL2+.
For now I can give you some ideas:
Using Fraps in Crysis 2, just after the opening scene when you're on top of the building looking over the park:
- All extreme settings, V-sync on: ~60 fps @ 1024x768, 36 fps @ 1920x1080
- All extreme settings, V-sync on, 3D on: ~56 fps @ 1024x768, 30 fps @ 1920x1080
With the above settings and the GTX 485M overclocked (699/1398/1600):
- Non-3D: ~75 fps @ 1024x768, ~43 fps @ 1920x1080
- 3D: ~60 fps @ 1024x768, ~39 fps @ 1920x1080
Unigine with 3D and mostly high settings scored ~415, or somewhere around there. -
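For anyone who wants the 3D penalty as a percentage, here's a quick back-of-the-envelope script using the approximate FPS figures posted above (they're rough Fraps readings, not precise benchmarks, so treat the percentages the same way):

```python
# Quick sanity check of the 3D performance cost from the framerates above.
# All figures are the approximate FPS values posted, not exact measurements.
stock = {"1024x768": (60, 56), "1920x1080": (36, 30)}        # (non-3D, 3D)
overclocked = {"1024x768": (75, 60), "1920x1080": (43, 39)}  # GTX 485M @ 699/1398/1600

def cost_3d(fps_2d, fps_3d):
    """Percentage of framerate lost when enabling 3D."""
    return round(100 * (fps_2d - fps_3d) / fps_2d, 1)

for label, results in [("stock", stock), ("OC", overclocked)]:
    for res, (fps_2d, fps_3d) in results.items():
        print(f"{label} @ {res}: 3D costs ~{cost_3d(fps_2d, fps_3d)}%")
```

So roughly a 7-20% hit depending on resolution and clocks, which is pretty tame for stereoscopic rendering.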
-
120Hz refresh
-
So the game's running at 120Hz refresh rate and still getting such good framerates? Wow. That's amazing. More and more now I want a good P170HM with a 580M and a 3D screen... I'll be like AHHHHH *Angelic light throwing down on laptop*
-
Any stats on the 580s at all?
I'm wondering if I should get it upgraded from the 485 to the 580. -
-
Anyone want a brand new 485M, then?
-
Going from the 485M to the 580M will probably be marginally faster (like <10%) but more power efficient. The 460M and 560M are nearly identical performance-wise.
IMHO not worth the premium paid, because the gap between what you'd get selling your 485M and the price of a 580M will probably be > $200. -
-
Their chip yields probably improved, so they could reliably run chips at the higher clocks.
My point is that it's not worth upgrading from the 485M to the 580M in any case. -
Now, selling a 485M to buy a 580M may not be feasible unless you only pay about $30 or so in the exchange, this is true, but the 580M should cost the same as the 485M when configuring laptops. Also, as I said earlier, the 560M uses a different, more power-optimized core than the 460M: it basically has better stock clocks, more overclockability, and generates less heat. If the 580M does the same over the 485M and costs the same, why is it a bad idea to sell a 485M and upgrade, as long as you aren't losing a lot of money doing it?
-
All I'm saying is that once the 580M does come out, chances are it will cost about the same new as the 485M (like the 460M vs. the 560M). So the likelihood of selling your 485M within $100, heck even $200, of the 580M's price is not high.
-
I've been reading that the 580M may allow Optimus technology to start kicking in, which means improved battery life; something I do care about.
-
-
Anthony@MALIBAL Company Representative
-
Oh... so Clevo decided to disable the function, as it were?
So IF nVidia has improved things etc., Clevo might enable the function with the GTX 580M?
Is it not enabled with the GTX 560M? -
-
-
As far as I am aware, the P1X0 will never have Optimus enabled, regardless of which card is being used.
Sorry, guys -
-
D2 Ultima - thanks for answering the million dollar question why Clevo doesn't implement Optimus - you're the first to provide the technical reason. We were previously told it was an incompatibility with Linux (or lack of Optimus support for Linux).
Do you think Clevo could wire the IGP to the video out and put a soft switch in BIOS and still be upgradeable? -
As for the soft switch thing, I honestly don't know. Technically, disabling the driver for the main GPU should allow a swap to a secondary GPU driver; then re-enabling it would swap back to the main GPU driver. There could easily be a hotkey to do this; the screen would just flash for a second when you did. Windows hasn't liked having two or more GPU drivers installed at once since Vista, though, and that carries over into Win 7. I don't know how they even get Optimus to work like that. If Microsoft wasn't so female-dog-like, I'm sure there'd be a better way. -
Hey Guys,
The entire new generation of nVidia hardware (5xx series) supports true Optimus implementations now. Basically, in an Optimus implementation the dGPU (nVidia) directly copies its output into the output buffer of the iGPU (Intel), and the Intel chip is what's hooked to the display connector. It requires Windows 7 because of the memory mapping and sharing implemented there, which lets one card use DMA access to another to stream the output when the dGPU is in use. However, that means you LOSE all the extra neat features of a direct GPU output; things like 3D Vision have a much harder time working, if they work at all. You also lose an extra display connection (the nVidia chip can natively support 3 displays, while the Intel one supports only 2). And the memory sharing technology doesn't work in Linux, because it would take nVidia implementing it in the driver, and they have already said no. So doing this on their high end, while not technically impossible, would require a motherboard rework, and so obviously not this generation.
The OTHER way to do it is to use switchable graphics with muxes (switches) that can be configured to change which display device drives the screen. This has the benefit of being able to use JUST ONE video device or the other, plus Linux compatibility; however, yes, it would be a hotplug operation when you switched, and it wouldn't be seamless to the OS or to the user. And those muxes cost money to design and implement. Again, a motherboard rework would be required.
Overall, yes, you can have Optimus AND MXM cards, BUT no major manufacturer has decided to implement it. Alienware is doing mux switching on the M17x R3 platform. -
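To make the two approaches described above concrete, here's a toy Python model of the data flow. This is obviously not real driver or firmware code, and the class and buffer names are made up purely for illustration; it just shows why Optimus is seamless (the display always reads from the iGPU) while mux switching is a hotplug event:

```python
# Toy model of the two hybrid-graphics approaches described above.
# None of this is real driver code; the names are invented for illustration.

class OptimusPath:
    """Optimus: the dGPU renders, then the frame is DMA-copied into the
    iGPU's output buffer; only the iGPU is wired to the display connector."""
    def __init__(self):
        self.igpu_output_buffer = None

    def render_frame(self, scene):
        frame = f"dGPU-rendered({scene})"   # heavy rendering on the nVidia chip
        self.igpu_output_buffer = frame     # DMA copy into the Intel chip's buffer
        return self.igpu_output_buffer      # display always scans out the iGPU buffer

class MuxPath:
    """Mux switching: a hardware switch routes one GPU or the other directly
    to the panel; changing it is a hotplug event, not seamless to the OS."""
    def __init__(self):
        self.active_gpu = "iGPU"

    def switch(self, gpu):
        self.active_gpu = gpu               # display disconnects and reconnects here

    def render_frame(self, scene):
        return f"{self.active_gpu}-rendered({scene})"
```

In the Optimus model the dGPU never touches the display directly (hence no 3D Vision and fewer outputs), while the mux model gives each GPU a direct path at the cost of the extra switching hardware.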
Finally, a good, in-depth response. So you can't do mux switches by request, Mythlogic? I know you assemble your own machines. If you could, that'd be a lot of business for you from people who want the best of both worlds and have extra cash =3.
-
But muxes are really just tiny little chips that you then have to run traces to on the motherboard and so on. It's just not something you can add on later. We wish they would just spend the extra $3/motherboard to put them in, but I understand $3/board adds up over hundreds of thousands of boards. -
For shame, Clevo. -
Well, exactly. $3/board? Come on. Comparable Alienware machines cost 10-20% more and people gobble them up. And they have switchable graphics.
Do you really think people would flinch over an extra $5, heck $10? I understand economies of scale, as I worked in the auto industry as an engineer for over fifteen years, and saving $1 per car meant saving $500k a year. But if we could spend $3 on something that added a feature to the car, we would charge $50 for it and it wouldn't matter. -
Would it be possible to still use the functions of the iGPU, just not the display, as far as video encoding goes? There are a couple of programs popping up here and there with QuickSync support. Plus it's always a bummer to miss out on an extra GPGPU device. They do support some GPGPU stuff, don't they?
Sorry to bump up an old thread, but it was a good one; I got a better understanding of all that.
And wow, nVidia's answer to the Linux community: "No". That's the status quo for just about every video chip maker out there. Right now I'm pulling my hair out with the proprietary Imagination Tech PowerVR SGX on the TI 4460 SoC (PandaBoard). They are awful. They should advertise their product as 1080p playback (in theory)... but I digress.
BTW, Intel are a bunch of sweethearts to the Linux community, so that's an exception. I'd love to see iGPUs catch up to dGPUs in speed (or at least get closer), but there's still the issue of sharing system memory... bleh -
Hey Myth,
check out what this guy has done:
http://forum.notebookreview.com/har...dge-throttling-permanent-fix.html#post8197737
He got the iGPU working by implementing an Intel HD VBIOS option ROM into the BIOS via MMTool, and it works by hard-coding it to use either the iGPU or the dGPU (since the BIOS has the switch hidden, just like ours).
All this on NON-Optimus HARDWARE!
Either Asus wastes a lot of money or nVidia simply lied to us... -
, so it's not quite the same thing, but it's something we are still working on, to be able to do video encoding etc. on the iGPU.
Clevo P170HM3
Discussion in 'Sager and Clevo' started by hizzaah, Apr 5, 2011.