The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Dell XPS M1730 Owner's Lounge, *Part 3*

    Discussion in 'Dell XPS and Studio XPS' started by BatBoy, Oct 6, 2009.

  1. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    I'd say option 1 is correct.
    After the BIOS screen goes away, plug the AC in. The only downside is that there are no fans spinning, but they are managed by the A11 BIOS, assuming you're running A11 (with its better fan profiles).

    As for external connections, always go DVI-to-DVI, then use the headphone jack with a Y splitter for audio, or a direct jack-to-jack connection. Most TVs that have DVI should have a jack input as well.

    DVI-to-HDMI should work, but there's no HDCP with our GPUs. So the best option is DVI straight to the TV's DVI input, to enjoy the best picture quality. ;)
     
  2. Fortune7

    Fortune7 Notebook Guru

    Reputations:
    110
    Messages:
    59
    Likes Received:
    0
    Trophy Points:
    15
    Thanks - very helpful. I booted up without AC (low fans) and then ran a Prime95 stress test, but it failed after about a minute, due, no doubt, to heat; the fans didn't speed up during the stress test (?). I'm using the A11 BIOS. I probably shouldn't run high-stress applications without those fans at full blast.

    My TV in Brazil only has HDMI inputs; maybe DVI is a European thing. I haven't seen it on any TVs in Brazil, and here in the UK the TVs we have just have HDMI and SCART (which IS European!)

    Cheers, Andy
     
  3. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    DVI is an international standard AFAIK, more like the top-quality connection in the PC world (every desktop GPU has at least two DVI ports, for example).
    Dunno how it works in other countries, but the presence of a DVI connector on TVs naturally varies.
    At least I know that Samsung usually includes one, and LG as well; dunno about others.
    And yes, we have SCART in Europe; Brazil may be different (as they are NTSC instead of PAL). In fact, the SCART connector is a standard barely known outside the EU, where we used it a lot (since it's a better signal than S-Video because it is true RGB).
    IIRC, SCART was used in Japan as well, as the first true RGB connection, and it's backward compatible with composite and S-Video too.
    God, I've gone OT again, sorry.
    Anyway, try with a DVI-to-HDMI; it should work, minus the audio naturally. Technically HDMI carries the same digital DVI signal without a big loss of quality (well, that depends on a thousand factors tbh), so it's worth a try.
    Sorry for the OT, but when tech things are involved I go on fire :D
     
  4. taveston

    taveston Notebook Enthusiast

    Reputations:
    2
    Messages:
    44
    Likes Received:
    1
    Trophy Points:
    16
    Has a consensus been reached on whether it is a good idea to install the A11 bios?

    Kingpinzero - I would be happy to donate for a tweaking guide. Please ensure it caters for those of us still using 8800m SLI as well though.
     
  5. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    Well, it's a matter of taste. A11 turns the fans on as soon as the GPU reaches 50°C, which makes for an on/off behaviour that can be an issue.
    Cooling is improved, though.
    And thanks, the guide will cover every GPU.
     
  6. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Ok, time to test the setup and see how the RAID 1 array performs :D

    Hmm, is this right? Two WD Scorpio Black 320 GB 7200 rpm drives in RAID 1?

    [IMG]

    Seems kinda low, RAID 0 scores were much better.


    Here are the RAID 0 results. [IMG]
     
  7. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Come to think of it, RAID 1 is only for servers, backups and such. If I delete the array, will the information stay available? I'm too lazy to install everything again... :D

    I guess I should've read more about it.

    RAID 0 = fast reads and fast writes
    RAID 1 = somewhat fast reads, slow writes

    Waste of a disk.
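The read/write trade-off summarized above can be sketched with a toy throughput model. This is a rough illustration only; the per-disk speeds are assumed round numbers, not measurements of any particular drive:

```python
# Toy model of RAID 0 vs RAID 1 sequential throughput.
# Per-disk speeds are hypothetical, for illustration only.

DISK_READ = 90   # MB/s, assumed single-disk sequential read
DISK_WRITE = 85  # MB/s, assumed single-disk sequential write

def raid0(n_disks, per_disk_speed):
    # Striping splits data across all disks, so throughput scales
    # roughly with the disk count (ignoring controller overhead).
    return n_disks * per_disk_speed

def raid1_read(per_disk_speed, parallel_mirrors=False):
    # Some controllers read from both mirrors at once; many laptop
    # controllers simply read from one, giving single-disk speed.
    return 2 * per_disk_speed if parallel_mirrors else per_disk_speed

def raid1_write(per_disk_speed):
    # Every write must hit every mirror, so writes never exceed
    # single-disk speed.
    return per_disk_speed

print(raid0(2, DISK_READ))      # 180 -- striped read across two disks
print(raid1_read(DISK_READ))    # 90  -- mirrored read, conservative case
print(raid1_write(DISK_WRITE))  # 85  -- mirrored write
```

The model captures why RAID 1 benchmarks lower than RAID 0: reads may or may not benefit from the mirror, and writes never do.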
     
  8. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    In both cases it's a waste of disks, man. I told you, you don't need such speeds if you game on it.
    For a ~50 MB/s gain in read speed you lose a whole HDD? You call that a deal? No way ;) Plus the 7200.4 is fast as hell in single-drive mode.
    I advise you to try a single-disk solution; it's the best bang.
    Anyway, RAID 1 mirrors the data, and you can break the array if you need to.
    A fresh RAID 0 will last the same as your previous one, with exactly the same problems; it's not worth it, without doubt.
     
  9. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Weird thing now: after installing everything, including DX, .NET, Flash, codecs and such, YouTube videos lag like crazy. Opera hits 50% usage when watching, and Firefox's plugin-container.exe also uses half the CPU when watching videos...

    This is with the 265.90 Quadro driver, freshly installed...

    I'm not missing any drivers, I've installed every single device, yet YouTube lags like crazy :mad:

    Never mind; apparently I needed an older Flash Player version, now it's all good.
     
  10. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,240
    Messages:
    39,344
    Likes Received:
    70,689
    Trophy Points:
    931
    I agree with Kingpin... RAID 1 is wasted disk space. And I don't really think RAID 0 is worth messing with. All just a bunch of brouhaha to me.

    I'm basing the second comment on my experience this week, which I mentioned over here.

    eleron, if I were you I would just put my drives in normal AHCI SATA mode and be done with it. Use HDD0 for your OS, games and apps, and HDD1 for your data storage.
     
  11. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    I already have 2 external hdds of 1 TB each for storage, haha.

    Basically, now I have 2.3 TB of space instead of 2.6... but since you guys say the data stays even if I mess up the array, it's all good.

    OK, here's another weird part. In Nvidia Inspector, I've forced AA to 16x and there are only minor slowdowns, in the exact same places where I had slowdowns before. So it must be the CPU that's holding everything back, because I've eliminated every other cause...

    I feel really tempted to buy that Extreme CPU now... but if the XPS dies before the warranty runs out, whatever they provide as a replacement will probably not support it...
     
  12. cebalrai

    cebalrai Notebook Consultant

    Reputations:
    36
    Messages:
    152
    Likes Received:
    0
    Trophy Points:
    30


    I highly recommend a Seagate Momentus XT HDD. They have 4 GB of SSD memory, so your frequently used files load very quickly.
     
  13. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Hmm. Even with max AA the game still has some fuzzy corners and such, and it's nowhere near as realistic as... say, a PS3 game.

    Right now I really hate my XPS :(
    I've played around on a PS3 on an HD 30-inch TV and wow, did it look cool...

    So I went to 1440x900 and it's absolutely smooth. I'm convinced the CPU just can't keep up with the GPUs in SLI. There's no other explanation...
     
  14. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    A few thoughts, mate:

    - The PS3 version doesn't have the eye candy of the PC
    - The PS3 version runs with Medium details, excluding textures, which are high-res; this is what I benched so far
    - The PS3 version is locked at 30 fps, v-synced. If you want, you can get FPS_Limiter and lock the fps to 30. That's your smoothness.
    - The PS3 uses a 3-core PowerPC Cell CPU with 7 threads (SPUs), basically 7 micro cores, more like the Intel HT architecture in the i7
    - Even with my T9300, which is a GOOD CPU, I was limited in a lot of games. The X9000 changed the whole machine.
    - You're CPU-limited, and we already told you so. The more games come out, the more you'll be limited

    We already advised this, so again, think seriously about upgrading to the X9000; it's the best thing you can get and you'll be futureproof for a loooong time.
    A 6 MB cache, 3.8 GHz CPU is basically a speed demon, and that's what you need. NFS: HP does badly on low-end CPUs because it's a CPU-intensive game; even on my X9000 it stays above 80% usage, a sign of CPU bottleneck, but at least it doesn't go below 15 fps. An average of 50 fps maxed out (without the disable-high-shadow-rendering trick) is what I got. Not to mention other games, they do perfectly fine as well.
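The 30 fps lock that a tool like FPS_Limiter applies can be sketched as a simple frame-budget loop. This is a generic illustration of frame capping, not FPS_Limiter's actual mechanism:

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def run_capped(render_frame, n_frames):
    """Call render_frame n_frames times, sleeping out the remainder
    of each frame's budget so the loop never exceeds TARGET_FPS."""
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            # The frame finished early: wait instead of starting the
            # next one immediately. Even frame pacing is why a capped
            # 30 fps can feel smoother than a fluctuating 40 fps.
            time.sleep(FRAME_BUDGET - elapsed)
```

Capping trades peak frame rate for consistent frame times, which is the "smoothness" a console's locked 30 fps delivers.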
     
  15. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    I second that, but the way to put an end to his pain is to upgrade to the X9000. I posted a link from ShirleyCpu: great seller, crazy fast shipping and sweet items. Plus the cost is halved ($250), although the CPU is an ES. Totally WORTH it.
     
  16. Fortune7

    Fortune7 Notebook Guru

    Reputations:
    110
    Messages:
    59
    Likes Received:
    0
    Trophy Points:
    15
    Thanks to Kingpinzero's mini-guide for overclocking the 9800M GTX, I am getting good results in my limited few games (more on the way!). However, the 600/900/1500 settings are lost on shutdown. On reboot, the GPU reverts to its standard (?) 500/700/1250 settings. What can be done to make the overclocked settings stick? I have tried Nvidia Inspector and EVGA Precision, but neither of them holds the settings.
    Cheers, Andy
     
  17. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Yep, didn't even read this post when I edited my previous one...

    Stupid CPU is holding me back... :D
    I'm still debating whether to buy that CPU or not... I guess I'll flip a coin :D
     
  18. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    No problem,
    anyway, there are a few things you can do to keep OC clocks.
    First, with Afterburner:
    Fire it up, set your clocks, then right-click on one of the numbered profile buttons; the one you right-clicked should light up.
    Then check the box that says "apply these clocks at startup".
    Go into settings and configure the program to start with Windows, minimized.

    Or:

    Get the newer Nvidia Inspector 1.95, released two days ago.
    Click on the "show overclocking" button.
    Set your clocks, then right-click on "Create Clocks Shortcut" and select "Create a Startup Folder shortcut" (something along those lines).
    Done.

    Reboot and open GPU-Z to see whether the clocks are correctly applied. Note that you have to choose one method or the other. I advise Inspector as it's the easiest, although MSI's tool is more customization-friendly.

    @eleron: do not ponder further. Upgrade the beast and be done with it. :)
     
  19. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    I fully agree with KingPinZero.

    Firstly, the only racing game on the PS3 that looks better on a full 1080p 42-inch screen is GT5/Prologue, because it's designed from the ground up for PS3 hardware.

    However, the Cell is not a three-core system; it's one PPE with 7 active SPEs. These make it good not just for number-crunching but also for vector calculations. As a matter of fact, it can do the CPU part with just the PPE running at 3.2 GHz -- the SPEs actually help with the post-processing and features that just wouldn't be supported or manageable on the limited, outdated PS3 GPU. This is why we're able to see some amazing effects and lighting in PS3 exclusives.

    At the end of it all, though, the M1730 with a good CPU and even the 8800s would squash the PS3. I mean, I've tried games on the big screen and they look better, since the PC version supports a better resolution while most PS3 titles are limited to 720p.
     
  20. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    Good catch, Kade; I kind of confused the Cell with the Xbox 360 CPU, which is a 3-core (I was used to making comparisons when I was reviewing titles on both consoles for a magazine, back in the day).
    And yes, the RSX is loosely based on the GeForce 7600/7900, hence the limited bandwidth (25 GB/s compared to the 256 GB/s of the Xbox 360) and performance.
    Also, tbh, GT5 still can't render at full HD like we do on our PCs. GT5 is, in fact, 1440x1080 upscaled to 1080p.
    This makes the game lose its AA, while the 720p mode kills tearing and adds 4x MSAA, with a loss of detail (e.g. blurring) on textures.
    The real PS3 power, as you said, comes from the joint force of the SPEs in conjunction with the RSX - this gives outstanding games like Uncharted or KZ 2/3, but not without compromise. While the game on the surface looks amazing and plays even better, the general texture detail (or even the total lack of AA) is around the "medium" settings, and in 99% of situations the framerate is locked (or not v-synced at all) at 30 fps.
    Usually, due to the simpler architecture and the greater GPU processing power of the Xbox 360, the main differences between the two in multi-platform games are, in fact, AA enabled on the 360 with better shadow/texture resolution.
    In some games there's even more detail in the 360 version than the PS3's, and vice versa as well.
     
  21. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    For Killzone 3 and God of War III they used MLAA instead of MSAA on the PS3.

    By the way, what score do you get in 3DMark06 now, Eleron? You should get at least 12k with those GPUs and your CPU. If you don't get 12k, then something is wrong with the chipset drivers.
     
  22. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Again, valid points from both of you. In the last year they have found a means to implement MLAA using the PS3's SPEs, which has really got people talking. I mean, it might be weaker hardware, but they've found ways to exploit its limits to the fullest. Although it should be acknowledged that even the exclusive titles that look amazing come with a compromise. I've found that many of our mainstream FPS console ports look better maxed out on the PC than certain 'ground-breaking' PS3 exclusives. I mean, check out maxed-out, full-res Modern Warfare 2, as an example.

    Now we have Crysis 2 on the horizon, running on CryEngine 3.0, which has deferred shading - a very similar tech to that of Killzone 2 and 3. When people were actually trying to make Killzone 2 sound technically superior to Crysis, they couldn't stop going on and on about the 'multiple light sources' and how PC hardware couldn't pull it off. Now, thanks to this new engine, Crysis 2 has so many light sources and lens-flare effects that it even gives Killzone 2 and 3 a run for their money. In Crysis 2, even sparks give off lens flare and cast light. And here's the real kicker -- Crysis 2 actually runs great on weaker hardware because they've made the same efficient design choices that make the software run on weaker machines (aka consoles).
     
  23. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    I haven't tested yet; I'll run 3DMark06 now and see.
    Standard clocks, CPU at 2.6.

    Well, tomorrow I'll decide whether I'm getting the CPU or not. I've already shelled out $320 for the extended warranty...

    OK, just as I suspected, my XPS has never really been as good as the other ones I've seen here.

    I mean, just look at the results: I get a better score NON-OCed than at 600/900/1500. It's ridiculous. And I have the latest chipset driver...
    Standard clocks, CPU at 2.6
    [IMG]

    OCed:
    [IMG]
     

    Attached Files:

  24. hikkoo

    hikkoo Notebook Consultant

    Reputations:
    397
    Messages:
    222
    Likes Received:
    0
    Trophy Points:
    30
    eleron... about the SM2.0 differences: when I OC the GPUs, the SM2.0 score goes down and the SM3.0 score goes up. But your CPU score going down when OCing the GPUs is down to the OC tool. I tried many different OC tools and found the 6.03 Nvidia OC tool gave the highest CPU scores in 3DMark06 >> but the problem is it glitches up... lol, good for benching but no good for gaming. When benching with 6.03 I could only go up to the XP 197.16 driver (from the Nvidia site).

    Getting the best 3DMark06 scores is about finding the right combo >>> OC-tool version + GPU-driver version (lol, you can spend many hours on this).

    Which GPU driver do you use on XP... to get the total score in the pics? :)
     
  25. Bloodroses

    Bloodroses Notebook Evangelist

    Reputations:
    164
    Messages:
    316
    Likes Received:
    13
    Trophy Points:
    31
    I'd think twice about those HDDs; see the story below:


    Users frustrated with Seagate's next-gen hybrid drive - Computerworld

    I've seen these problems in person; it took me a while to track down what the problem was at the time.
     
  26. Bloodroses

    Bloodroses Notebook Evangelist

    Reputations:
    164
    Messages:
    316
    Likes Received:
    13
    Trophy Points:
    31
    I have to agree with your assessment on benching. Previously I got over 11k in 3DMark06. Now, with standard Dell drivers, SLI disabled, no OC, no ThrottleStop, etc., I'm getting 9371.

    Running it on an external monitor doesn't help, of course, and neither does running the test with other applications open, but the difference in performance/score is remarkable considering all the tweaking I had to do to get it over 11k.

    I wonder if the replacement 9800M GTs are any better than my old SLI pair; I'm just not willing to test them right now.

    I have to disable SLI when using an external monitor because I have problems outputting with SLI in games and DVDs. I had the same issue with the 8700Ms, both the prior SLI pair and this one. Anyone else experience this?
     
  27. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Kade, I wasn't that impressed with Killzone 2, actually; however, now that I have played Killzone 3 multiplayer I must say I am really impressed with the graphics, considering the hardware. Still, the game would be no problem at all for a PC to run, so those PS3 fanboys who said Killzone 2 wasn't possible on PC need to get their eyes checked :)
     
  28. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Hmm, Eleron, 11.5k in 3DMark isn't too shabby, actually. I got my 12k score with my old CPU at 2.6 GHz and my 8800M GTX SLI at 640/1600/950, so the GPUs were clocked quite a bit higher than your 9800M GTX SLI.

    Do you use XP, by the way, or Win 7? Win 7 gets lower scores in 3DMark06 than XP does.

    Messing around with the chipset drivers can yield good performance or really bad performance. I just messed around with mine and now score 200 points less than before. However, performance in games is the same.
     
  29. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    EVGA Precision, since I'm on the 265.90 Quadro driver...
    nTune doesn't work with them, or has issues.
    I do use an external monitor, true, but I didn't think that would affect the score...
    Still on XP, of course.

    It's funny how, when OCing the GPUs, the CPU score goes down, and so does the overall one...
    I think I'm going to keep trying chipset drivers for now. It might also have to do with the 3 MB cache the T8300 has... this CPU is a real lazy one, I tell you :smile:

    11626 is as high as it goes.
     
  30. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    Correct, but you have to take into account that Crysis 2 is a relatively new product, while Killzone 2 is almost two years old.

    Looking back at its release, there wasn't really a game with such features on PC; Crysis 1, for instance, used a totally different graphics approach and missed some of the KZ2 features.

    KZ3 tries to squeeze even more features out of the PS3, but it's a hard task, since Crysis 2 is coming out even on PS3 and it's clear that CryEngine 3 can do some amazing things, as you mentioned.

    Other games just use the hardware in a more intelligent way, like Uncharted 2, for example. Uncharted 3 will probably be outstanding, and it will use a lot of the PS3's processing power to keep things moving.

    @eleron: your scores are fine, mate. What holds you back in those benches is the awfully low CPU score. I consider our X9000 a low scorer as well, but your CPU manages to go even lower, lol. Switch to an X9000 and you'll hit 16k easily, which translates roughly into a huge performance gain in games.

    Remember that the CPU feeds the GPUs. An SLI setup needs a fast CPU to feed both cards and to make SLI scaling work properly. So far as we've observed, you're CPU-bottlenecked; therefore the performance with a single card is comparable to the same with SLI enabled.
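The "CPU feeds the GPUs" point can be made concrete with a toy frame-rate model. The scaling factor and fps numbers here are illustrative assumptions, not measurements:

```python
# Toy model of a CPU-bound SLI setup: the CPU must prepare every
# frame before the GPUs can draw it, so the delivered frame rate is
# capped by whichever side is slower.

def effective_fps(cpu_fps, single_gpu_fps, n_gpus=2, sli_scaling=0.9):
    # sli_scaling models imperfect multi-GPU scaling (assumed 90%
    # benefit from each extra card).
    gpu_fps = single_gpu_fps * (1 + (n_gpus - 1) * sli_scaling)
    return min(cpu_fps, gpu_fps)

# Slow CPU: adding a second GPU changes nothing.
print(effective_fps(cpu_fps=40, single_gpu_fps=45, n_gpus=1))  # 40
print(effective_fps(cpu_fps=40, single_gpu_fps=45, n_gpus=2))  # 40

# Faster CPU: the second GPU finally pays off.
print(effective_fps(cpu_fps=90, single_gpu_fps=45, n_gpus=2))
```

This is why a single card and SLI can benchmark almost identically on a slow CPU: the `min()` is taken on the CPU side either way.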
     
  31. cebalrai

    cebalrai Notebook Consultant

    Reputations:
    36
    Messages:
    152
    Likes Received:
    0
    Trophy Points:
    30
    So... funny question here. With my beast's RAID 0 configuration, where one of my HDDs was going bad, I decided to just do a clean install on a new Seagate Momentus XT drive. When I tried to enter my Windows key during the install, it wouldn't take it. This was the key from a Win7 upgrade (Family Pack). It's now my understanding that I would have had to reinstall Vista and then upgrade again for it to work. And I do not have a Vista 32-bit disc...

    So, out of desperation, I entered the key from an OEM copy I'm actively using for a home-built PC. To my surprise it took it, activated once I was online, and gave me the Genuine tag. It also installed all updates and Microsoft Security Essentials with no problems at all. I was really surprised this worked. For the past 24 hours everything has been smooth, with no messages or anything.

    Does this surprise anyone else? At this point, am I in the clear, or should I expect a message saying my key is no good?
     
  32. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Agreed again.

    Actually, going with what Magnus has stated, there were many games that could do what we're talking about vis-à-vis Killzone 2. I mean, OK, Magnus doesn't like Killzone 2, but consider Killzone 3 and Uncharted 1, 2, and 3: these games use a similar approach, and they look flashy. Now, the whole reason Crysis struggled with lighting was that it used forward rendering, which makes one massive light source much more expensive on the hardware. It's like an old saying I heard on one of these technical forums: "Having one massive light source casting countless shadows from various objects is much more taxing than having three hundred mini light sources casting just one or two shadows each." So you see, these console games, including Killzone 2, really aren't that great at pushing hardware. What they are doing, however, is making use of efficient techniques, and deferred rendering is one of those techniques. Killzone 2 and 3 don't even use full HDR, and yet we're fooled into believing they have the best lighting because of all the individual light sources and the lens flare. I remember one kid on GameSpot who had deluded himself into believing that PS3 games are the only games to use 128-bit HDR, which is a joke since he was using Killzone 2 as his key example and that game is actually using LDR.

    Crytek could've gone the Killzone 2 route, but with CryEngine 2.0 they were going for pure texture-driven, large-scale lighting. They realised that while this looked better, they couldn't capture that flashy look from the consoles without changing things up; this is why we have what we have today. I've seen the comparisons, and I'll be honest, Crysis 2 doesn't look as sharp or detailed as Crysis. It's a fact -- the foliage and detail in the first game is almost tenfold better. However, when it comes to all the fancy, flashy, Hollywood-esque flare and post-processing, Crysis 2 does look more... stylish, and more importantly, it looks great in motion.

    Also, there are games from 2006, on the PC, that used deferred rendering; S.T.A.L.K.E.R. could be used as an example. There was also Dead Space, which came out before Killzone 2, I believe, but this is where the fanboy rants blur the topic, because no one will care to consider a multi-platform title. They often argue that even the Xbox 360 can't handle deferred rendering, as though it's some niche feature that only special hardware can accomplish, when in reality it's an economic alternative to having a diverse array of dynamic lighting. So despite all these misconceptions, Dead Space used deferred rendering, and we can see how amazing the lighting is in that game. What's more, even a single 8800M GTX can run that title at over 60 FPS for the most part.

    It's just a matter of efficiency. Deferred rendering is great for a game where you want lots of light sources without taxing the hardware. Killzone 2 was developed on that premise because it helps cope with the PS3's limited hardware. Developers did the same with PC games and our hardware outshone the consoles. As Magnus said, these games would run easily on PC hardware if optimised properly, and they'd look better. Heh. Honestly, we all agree on this point anyway.
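The forward-vs-deferred cost argument above can be sketched as a back-of-the-envelope model. All the constants (fragment counts, per-light pixel coverage) are made-up illustrative numbers, not figures from any engine:

```python
# Rough shading-cost model. Units are arbitrary "shade operations";
# every constant here is an assumption for illustration only.

def forward_cost(n_fragments, n_lights):
    # Forward rendering evaluates every light for every fragment
    # drawn, so cost grows multiplicatively with the light count.
    return n_fragments * n_lights

def deferred_cost(n_fragments, n_lights, pixels_per_light):
    # Deferred rendering writes the G-buffer once (one pass over all
    # fragments), then shades each light only over the pixels it
    # actually covers on screen.
    gbuffer_pass = n_fragments
    lighting_pass = n_lights * pixels_per_light
    return gbuffer_pass + lighting_pass

# One screen's worth of fragments, 300 small lights each touching a
# few thousand pixels -- the "many mini light sources" scenario.
frags, lights, coverage = 1_000_000, 300, 2_000
print(forward_cost(frags, lights))             # 300000000
print(deferred_cost(frags, lights, coverage))  # 1600000
```

The two-orders-of-magnitude gap is the whole argument: many small lights are nearly free under deferred shading, which is why it suits limited hardware.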

    --

    And Eleron, I have to agree with the other guys on this subject. You need a good CPU. I wasn't fully appreciative of this fact until NickBarbs sold me his awesome X9000. Believe it or not, it was only after I passed the 3 GHz mark that I found my SLI bar scaling to almost maximum, and this matters for dual-GPU solutions. I can't get into the technical side of things, but to put it very simply and crudely: your CPU cycles provide the bandwidth for those two cards to interface and work together, and a lower frequency will only introduce that much more latency into the process. I would say you need at least 3 GHz to get the most out of dual 8800M GTX GPUs, and even more so with dual 9800M GTX units.
     
  33. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Well, I downloaded the Crysis 2 beta and wasn't impressed at all. It's definitely a console game, and Crysis 1 looks far better. It's only a beta, but even so, the sheer scale is smaller than Crysis 1. Crysis 2 just feels like a generic Modern Warfare 2 shooter.

    If we are talking graphics, then Killzone 3 looks much better than Crysis 2 does on the console. I just downloaded the Killzone 3 single-player demo, and the graphics are astounding, to say the least. I did like Killzone 2, though not as much as Killzone 3, which doesn't look as dark as Killzone 2 did :)

    Also, the funny thing is that Crytek tries to resemble Killzone: you now have in Crysis 2 the exact same cover system they use in Killzone 2, and you can slide exactly like in Killzone 2. I think Crytek tried too hard to mimic the Killzone franchise this time, although they definitely failed to match the lighting in Killzone 3 and the amount of pure detail in Killzone 3's levels.

    Nah, Crysis 2 is a no-go for me; just a plain boring game, nothing like the first.

    Another impressive thing about Killzone 3 is that it actually runs at 30 fps 98% of the time :)
     
  34. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    They failed in comparison to the lighting in Killzone 3? I don't know about that, since it's the exact same lighting. I don't know what more one could do with it; if you have some examples, I'll happily look into them, but as it stands, I see no real difference. In fact, I think the lighting looks a bit more realistic in Crysis 2. The point about detail I will accept, since it is apparent. Crysis 2 seems like a weak tribute to Killzone.

    And of course it's linear; so are Killzone 2 and 3 -- another reason why you're able to see so much at so little cost. Half of the eye candy in Killzone 3 is due to static effects and nice use of animation and depth of field. In fact, the latter actually detracts from detail, but it makes the game look better than it would without it.

    I don't find that impressive. I find it standard. If Crysis 2 doesn't match this in the final version, then I'll be really put off by the product. The PC version should run upward of 45 FPS at the heavy points with an SLI rig.

    And yes, the console versions won't look as good as Killzone 3, since it is designed from the ground up for the PS3 and PS3 only, with quite a lucrative budget.
     
  35. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Well, I'm biding my time for the next week or so.
    Then I'll decide if I'm getting that CPU or not. Thing is, if, say, something happens to the beast, whatever replacement system I get will be without X9000 support...

    Then again, I could sell it... Hmm. Tough decisions, decisions...


    EDIT: never mind, I'm getting it. Now I need some pointers on how to apply thermal paste, and which one?
     
  36. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    I applied AS5 to my X9000.

    Yes, the PC version of Crysis 2 will definitely look better. However, the beta at High settings didn't impress me at all, which, by the way, Crysis 1 did.
     
  37. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    You said you made a profile for it? Mind uploading it, just in case, for the future? :D
     
  38. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Eleron, you need not worry. The X9000 still has really good resale value. I would've sold you mine, but perhaps you should fish for a better deal.

    I'll be honest: I don't think it'll look any better than 'High' in the final product. Running it at 1200p will make it look nicer than the consoles; that is the 'Hardcore' setting in Crysis 2. Now, I agree with you yet again that textures and mapping looked MILES better in Crysis 1 on High, but the sunshafts weren't even present at that setting, which is where the lighting can be contended.

    On a side note, does Killzone do any better with its own textures? I played the demo on a smaller screen just to minimise the jaggies, and still, everything looks more or less flat and bland, especially the vegetation. Just like Killzone 2, the animations, depth of field, and lighting are what stand out most.
     
  39. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Texture-wise, Killzone 3 doesn't stand up, I agree. Well, it will be interesting to try Crysis 2 in DX11 and see how it looks.

    I hope there will be a Very High option in Crysis 2; in the beta, however, Hardcore mode is just High.

    Gamer is Low, Advanced is Medium, and Hardcore is High, so there might not be a Very High unless it is reserved specifically for DX11.
     
  40. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Couldn't they just make a very good 'Very High' mode with just DX9?
     
  41. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    $300 is what I'm getting it for on eBay...

    Unless I find a better deal in the next 2-3 days, that is.
     
  42. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    Great post, Kade; it's obvious that you're into technical things, maybe in a professional way, like I am.
    I could talk about these things for a month without getting bored at all ;)
    And by the way, I couldn't have explained CPU bottlenecking better than you did, although I've attempted to describe it in the easiest way I know.
    Great post, again ;) Thumbs up
     
  43. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Yup, you've got me convinced -- all of you, for that matter :)
     
  44. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Is that with shipping?

    --

    KingPinZero,

    Cheers for the very kind words. I do take considerable interest in graphical development techniques and have been on a personal quest to address the issues and misconceptions surrounding this generation of consoles. So naturally, I've developed a personal interest in exclusive titles that take advantage of specific hardware. Also, thanks for acknowledging my efforts to articulate the CPU bottleneck issue in the simplest of forms. I think it was much needed.

    Thanks again. Perhaps we'll all have a nice round of talks over the upcoming games.
     
  45. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Yep, shipping is free, since it's standard shipping from Italy.
     
  46. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Well, it seems that Advanced is Very High. Personally, it feels to me like this is definitely a console game from the ground up.
     
  47. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    Ah, so it's from my country? I hope it's good, then.
    Our postal service is actually a mess, so I really hope it's free, man, because otherwise it would cost you roughly $40 for shipping alone.
    DPD carries our parcels to the UK, while USPS takes them to the USA.
    And yes... I'm Italian, lol.
    Ciao :D
     
  48. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    Give it a whirl when the full DX11-featured version shows up on our monitors, Mag.
    I know the beta could have been better, but keep in mind it's a beta and isn't meant to be judged by gamers in this state.
    The eye candy will be there, along with a good story-driven plot, although the gameplay has been modified a bit to accommodate console players.

    Anyway, I preordered the Limited Edition from the EA Store for 29,90; that was a crazy deal, too good to let slip.
    Also, I'm not bothered by the COD-like multiplayer mode. The first one was good, but way too sprawling and big; it gets boring in a minute.
    And I guess my GTX 570 will choke and die on Hardcore settings in DX11. I guess Crysis 2 will be like another Metro 2033.

    Games like that don't even run decently on tri-CrossFire or quad-SLI setups... lol.
     
  49. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    Eleron, actually, I could've beaten their price, and I would've happily let mine go cheaper for your benefit. Unfortunately, I still need it a bit longer, so perhaps they're a good option for you -- the same people that gave Nick what eventually became my CPU.

    --

    And dudes, about Crysis 2: the engine actually has a lot of potential, but the game is scaled back. However, I think they will refine the game's features and bugs, and then we'll have a game that looks flashy and superficially impressive, like the console games, but with a slight edge. To be fair, I just won't compare this game to the original Crysis, since that was a virtual sandbox game built for the PC only, and this is... well, designed to compete with the console shooters, but with a bit of added story. Why not? It ain't all bad.
     
  50. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Dude, I'd rather deal with a fellow NBR user than some no-name seller, and if you say you can beat their price, shipped that is... I might consider it.
    I mean, the XPS I have I bought from NBR :)

    How long do you still need it, and what's the warranty status on it?

    Kingpin, that's cool. My Italian is a bit rusty, but I still know some words here and there.
    I studied Italian 11 years ago... but I haven't forgotten everything :p
     