Eambo's right.
In real games, there is very little difference between the 8800M GTX (9800M GT) and the 9800M GTX; the difference mostly shows up in benchmarks.
The clocks of the 9800M GT and 9800M GTX are identical, set at 500/1250/800; the main difference is the 2x16 extra shaders.
Otherwise, it seems each 9800M GTX runs about 10W hotter than an ordinary GT. In an SLI config, you're looking at an extra 20W of heat to deal with under full load.
http://www.lancelhoff.com/9800m-gt-vs-9800m-gtx-benchmark/
9800M GTX SLI cards for the M1730 are in pretty limited supply, so they carry a very hefty price.
Put simply, it's not really worth upgrading unless you have a serious amount of cash to burn on a rather marginal improvement.
-
-
AFAIK, there are no mobile cards with more than a 256-bit bus. Sadly.
-
Here's a nice read...
http://techreport.com/discussions.x/17067 -
Just heard on an Aussie forum that Dell Asia Pacific has removed the XPS M1730 from its website...
-
Windows 7 will be out Q1 next year *at the latest*. Personally, I can see an OEM copy in September and a full release before the end of the year. But anywho, back to the topic at hand - DirectX 11.
Windows 7 brings DirectX 11 with it, and I haven't seen any sign of cards supporting it yet. Has Nvidia said anything about mobile cards that will support this? -
Apparently DirectX 11 is backward compatible with DirectX 10 cards. Not sure though, and that may be subject to change.
-
I just installed 190.15 and found an interesting option in the Manage 3D settings section of the NVIDIA control panel.
-
SomeFormOFhuman has the dumbest username.
As for 9800M GTX upgrades - well, if you have spare cash, spend 'em. If not, stick with what you've got. Our 8800s/9800M GTs don't give significantly different performance vs. the 9800M GTX anyway. Unless they offer a GTX 280M, which I doubt. But even if they do, it ain't worth the upgrade either. -
Sadly, for the time being it looks like we're s%$t outta luck for fan control. Despite the best efforts of Franck (from HWMonitor) - he put a lot of work in - we have been unable to get GPU Fan 2 functioning properly. We had a big remote desktop debug session today.
Both the CPU fan and GPU Fan 1 work perfectly, but Fan 2 just does not want to stay on. Franck even tried re-setting the fan every second to see if that would work, but it still turned off in less than a quarter of a second.
We're both pretty sure there is a hidden Dell PWM command to switch off the auto control, but since Dell would never disclose this info it's going to be just too much work to get this to function.
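Purely for illustration, here's roughly what that brute-force "keep re-setting it" approach looks like as a Python sketch. The EC register offset, the duty-cycle value and the write_ec_register() helper are all made-up placeholders - the real Dell PWM interface is exactly the part nobody outside Dell has documented:

import time

# Sketch of the approach described above: keep re-asserting a manual fan
# duty cycle so the firmware's auto control can't take over.
# The offset and value are hypothetical placeholders, not real Dell EC data.
GPU_FAN2_OFFSET = 0x00   # hypothetical embedded-controller register
HALF_SPEED_PWM = 128     # hypothetical 50% duty cycle on a 0-255 scale

def write_ec_register(offset, value):
    # Placeholder: a real implementation needs a kernel-mode EC access driver
    # (the sort of low-level access HWMonitor uses internally).
    print("would write %#04x to EC register %#04x" % (value, offset))

while True:
    write_ec_register(GPU_FAN2_OFFSET, HALF_SPEED_PWM)
    time.sleep(1)  # Franck tried once per second; the auto control still won within ~0.25 s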
Just a note on how well it would work: keeping the CPU fan and GPU fan 1 at 50% (almost inaudible) yielded idle temps of sub-20C for the CPU and sub-40C (35 to 40C) for GPU 1. All this while GPU 2 was idling at 65-75C.
Couldn't test while gaming, because as soon as GPU 2 reached a certain temperature (I think around 75C) the auto control kicked in and reset all the fans.
Franck hasn't given up yet, and I think it may be worth it. I sent him an image of the Dell diagnostics program and he's going to try to debug/decompile some of the code and work out how Dell does the fan control in it.
I wish him luck, just sorry he's spent so much time on this -
GL Franck =-( That would be amazing. Hope you can get it working and finally get us the cooling we deserve =-D
-
It'd definitely be nice to be able to run the X9000 at 3.4GHz WITHOUT having the fans at full blast. With the fans on (like usual when the GPU hits 75C) and some undervolting, it should be plenty for cooling. But the max RPM noise is just too much, as much as I want to fully utilize my CPU's capability.
-
-
I score 11,000 in 3DMark06 with my T7700 and SLI. That's after I've tweaked the system. It also depends on what drivers you use.
-
Kade Storm The Devil's Advocate
The 280M should've at least come with a 512-bit memory bus to distinguish itself from the current generation of mobile cards.
BatBoy, thanks for linking the article to this forum. People need to read more of this stuff to get an idea of just how borderline the upgrades are getting this generation, and I'd consider these new and 'improved' cards to be downgrades more than upgrades. 128-bit? Oh, man. That brings back nightmares of the hell of running even old, obsolete games at resolutions above 1440x900.
As for DX 10.1 - yes, it has its uses. For example, under the DX 10.1 API, one could run deferred shading with multisample anti-aliasing at very little performance cost. Unfortunately, with our current DX 10 configs, games need a special code patch for multisampling if deferred shading is involved; only one or two games do this via patch, and those would be Far Cry 2 (for sure) and STALKER: Clear Sky (I think). Typically, the only kind of AA available with deferred shading on a normal DX 10.0 card is super-sampling, which we all know is very demanding in terms of performance. Now, how relevant deferred shading is (in my opinion it's very relevant, as it can make lighting look amazing without eating up system resources - see Killzone 2), and how important AA is under deferred shading - which Far Cry 2 already manages via a patch on standard DX 10.0 cards - is up for a lot of debate. I personally think DX 10.1 had some potential, but most of it is getting done with DX 10.0 now. As for DX11, it is backwards compatible, but that doesn't rule out the possibility of an improved API for cards that sport actual DX11 hardware - another thing nVidia seem to be lacking. -
Yeah, NV's mobile GPU offerings haven't evolved drastically since the 8800M GTX was introduced. For about two years they've been doing shameless renames and the like to marginally improve their lineup, partly because there was no threat of competition...
Though I'd be very happy with 256-bit GDDR5 RAM... -
What tweaks are those? You beat me by 500 points... -
-
Kade Storm The Devil's Advocate
And you're right about the RAM. I almost overlooked the GDDR5; at least they're showing ATi something now. Anyway, I'm not much of an expert or scholar on the subject, but wouldn't GDDR5 VRAM still make less of an impact than a proper 512-bit memory bus? Come to think of it, wouldn't something as fast as GDDR5 be best utilised by a wider bus anyway?
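Rough back-of-the-envelope here, with approximate clocks from memory rather than official specs, so treat the numbers as illustrative only: peak bandwidth is just bus width times the effective data rate, which is why 256-bit GDDR5 can match or beat 512-bit GDDR3.

# Illustrative peak memory bandwidth; the clocks below are rough figures, not official specs.
def bandwidth_gb_s(bus_width_bits, data_rate_gt_s):
    # bytes moved per transfer across the bus * transfers per second
    return bus_width_bits / 8 * data_rate_gt_s

print(bandwidth_gb_s(256, 1.6))  # ~51 GB/s: 256-bit GDDR3 at ~800 MHz (1.6 GT/s effective)
print(bandwidth_gb_s(512, 1.6))  # ~102 GB/s: the same GDDR3 on a 512-bit bus
print(bandwidth_gb_s(256, 3.6))  # ~115 GB/s: 256-bit GDDR5 at ~900 MHz (3.6 GT/s effective)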
More importantly, welcome back, Magnus. 11k is a pretty good score for a T7700, but it only shows me how useless my CPU is in real-time gaming. I just scored 11,920 in 3DMark06, using stock clocks with just a mild system clean-up, yet I know that Magnus' machine performs stutter-free and produces better frame rates in actual games like Crysis, GTA IV, Empire: Total War, etc.
On the topic of static benchmarks, I would like to know how you got such a high Vantage score, Quicklite. 11k? I'm barely at 9k, yet our 3DMark06 scores are so similar. Not to mention, so are our machines.
-
-
Does anyone here know if I can image my Raid 1 drives and push it to a bigger upgrade for my m1730? I really do not want to reinstall my system again.
-
-
Is there any way to mod the color of the keypad backlight?
Kinda getting bored of a single color.
Maybe swap in M17x's keypad? -
-
Hey guys, I'm trying Magnus' Crysis Warhead config ( http://forum.notebookreview.com/showthread.php?p=3954336#post3954336) with my 9800M GT SLI, but I think the shadows are messed up. The ground looks so dark that I can't see anything on the floor at all. I can play the original Crysis fine with Magnus' config file... any suggestions?
Oh I'm using the official 185.65 drivers. -
Well, Magnus' config was made before the 185.xx drivers came out, if I remember correctly.
-
Kade Storm The Devil's Advocate
r_ssao_darkening
With Magnus' setting, it's at 1.3. Personally, I like the high-contrast darkness, but that's a matter of taste. If the SSAO is too dark for you, then take it down to 1.0, or even 0.9, and that'll more or less solve your problem. So open the console using the '¬' key, and then enter the following command:
r_ssao_darkening 1.0 (or 0.85, if you want it lighter).
If this solution works, then simply open Magnus' config, and find the r_ssao_darkening command, and alter its value to your preferred level. -
-
Why don't you try putting a coloured piece of paper between the LEDs and the keyboard?
-
The surface is not flat and it's covered by another non-transparent foil, so only the raised area around the keys is exposed. It would take a lot of cutting and forming, not to mention that at certain angles you'd be able to see the paper when you look at the keyboard, which would look very cheap. However, based on your idea, it might be possible to use a thin layer of transparent paint. Just apply a bit to the LEDs; that could possibly work.
-
Kade Storm The Devil's Advocate
With these 185.65+ drivers, I noticed a general improvement in SLI performance. So I decided to do a 3DMark06 test at 1920x1200, and actually managed to stay above 11k. I think that ain't bad considering my score with the older drivers at the same resolution.
Driver: 185.85
Resolution: 1920x1200
-
-
Has anyone taken the PhysX unit out of their system? I'm curious to know if there were any temp drops or overall performance losses.
Any info appreciated. -
Kade Storm The Devil's Advocate
-
Wouldn't simply disabling it work? Or is it still running when it's disabled?
-
Kade Storm The Devil's Advocate
Well, technically, yes. Most likely, even running, this card should hardly produce much heat. It's only called into action when PhysX-specific software demands the hardware, unlike the GPUs, which still run at lower clocks during standard 2D PC use. I doubt that removing or disabling the card will have any significant impact on overall PC heat readings, but it would be interesting to get feedback from Eleron.
-
I can't tell if it led to a measurable temp drop. Of course I had one, but that was because of cleaning my fans and using AS5 on my CPU/GPUs.
For sure it'll give you no performance losses. I had it disabled for a long time and decided to remove it when I opened my system.
Look at it this way: if you don't need or use it, you can remove it. At least it's one less component in your system that could cause issues.
You want to keep a well-performing sports car as light as possible. -
Kade Storm The Devil's Advocate
Wait. Ernstig, was that you who 'retired' the card, or did Eleron also remove his card? Sorry about the confusion.
Anyway, could you do me a favour? I don't know if it's my GPUs or software, but my PhysX performance, as you know, is terrible with a certain game. Would it be possible for you to download the Cryostasis demo from nzone.com and give it a try with all settings at max and the resolution at 1280x800? It would really answer a lot of questions for me, because at the moment I have to rely on my Ageia card to make that game play smoothly with advanced PhysX effects, and that only leads me to believe that the PPU card can still help with very GPU-intensive games that also use intensive PhysX. -
It should help. When you use your graphics card for PhysX acceleration, it will take a small part of its available power to do the calculations. If you use a dedicated PhysX card like the Ageia, then all calculations are done there and the GPU is free from that load; it only has to render the picture. You will still feel a slight performance hit as the GPU needs to draw more particles, but it should be much less stressful than doing the physics calculations on top of that.
-
Kade Storm The Devil's Advocate
Well, one would think that in theory, wouldn't they? But the difference is almost shocking: in Cryostasis, I'm talking under 20 FPS for GeForce vs. a solid 35+ with Ageia. But this is a game that -really- uses and pushes PhysX; it was, after all, billed as nVidia's primary DX10 PhysX tech demo. On the GPU side, the graphics aren't amazing, but the particles and lighting are another story, seeing as the game uses geometry shaders and other unique DX10 features.
Still, I'd like to see another person run this game smoothly using just GeForce PhysX. I think it dragged on my system because my system is badly managed at the moment. This is why I ask that someone else run the demo, and I recommend a lower resolution for the tests so the GPUs are not under stress from the graphics and have the headroom to manage the PhysX tasks. If the same result comes out, then I guess the Ageia was giving me an edge. -
-
From what I've read, it all depends on the game. In some games, PhysX does work well and increases FPS.
-
On the other hand, even if PhysX puts additional stress on the GPU, it might also compensate by being faster? After all, in the case of the Ageia, information has to travel to the PPU, be processed, and then travel back to the GPU (not sure if the CPU is also involved in the process, making it even more complex), so there will be some losses, while GPU-integrated PhysX calculates everything on the fly right in the core. -
Kade Storm The Devil's Advocate
Most of the so-called PhysX drivers are supposed to accommodate Ageia cards as well. Essentially, according to nVidia, any PhysX driver will work just as well with both the GeForce and the Ageia PPU.
Now, if you're comparing the power of the PPU card vs. a standard 8-series GPU, the GPU will win out - it has more going for it - but that would only be valid if the GPU were working as a sole, dedicated PPU. I've tried this approach as well by disabling SLI, but the performance is still on the weaker side. I don't think it works out the same way on our laptops, as both GPUs are on the same board/slot or something.
On the subject of interface and latency between the PPU and GPU, this would be very little; most GPUs since the 6-series have been really fast. The theory is interesting, but we know that desktop enthusiasts now use older 8-series cards as dedicated PPUs, and that works out much better than having a single 280 handle both. -
-
BTW, I think it was me. I retired my card about 2 weeks ago.
-
Just picked up an M1730 from Dell Outlet:
T8300, 9800M GT SLI, 256GB Ultra Performance SSD, 4GB RAM.
Just under $1,700 shipped to HI.
Does anyone have any experience with the Ultra Performance SSD? I'm curious about its performance compared to 7,200 RPM drives.
Thx -
-
Congrats on your system.
I have no experience with SSD yet. SSD is still expensive for me.
For sure it will outperform a 7200 drive. Even 2 drives in RAID 0.
Maybe someone has experience who changed a harddrive for SSD? -
Very nice price for that. Apparently the SSDs are much faster at booting than the 7,200 RPM drives, from what I've seen on this thread anywho.
-
-
OK, so either my video cards are dead or Vista Service Pack 2 has killed them (or both).
I updated to SP2 (64-bit) today via windows update. Upon restarting, everything seemed fine. Then I started up Sins of a Solar Empire. I noticed some tearing and BOOM - BSOD. I restarted, and as Vista was signing in, I got another BSOD.
I managed to boot into safe mode and uninstall the nvidia drivers (I was on 185.85). I then ran DriverSweeper and CCleaner. I booted into Vista and installed the latest nvidia drivers (downloaded from their site). Upon reboot, I had another BSOD.
I had to leave to go to work, so when I get home I'm going to try to uninstall Service Pack 2. Can you guys think of anything else I could do?
At this point, I'm assuming that if I can't uninstall SP2, or if uninstalling it doesn't solve the problem (after reinstalling the Nvidia drivers), it's time to call Dell tech support and get replacement graphics cards.