I'll never be able to pass myself off as any kind of expert here (compared with you guys!) but as for re-seating heatsinks with paste, I have no doubt I can do a better job than a Dell engineer who's paid by the hour, on a schedule, and most importantly, always working on 'someone else's' machine. Re-seating heatsinks with paste is not difficult, but it requires patience and time, which I'm sure are in short supply in a business / repair workshop environment. OK, I'm an engineer working on aircraft systems and instrument panels, but until I did the heatsinks on my previous setup (8700 GPUs and 2.2GHz CPU), I'd never done it before. Well, I'd never had a machine that needed this much attention!
Of course I re-pasted my new X9000 CPU but didn't (yet) open the 9800M GTX GPU unit, as I'd void any warranty if I stuffed it up. However, when the time comes to re-do the paste on the GPUs, I'll bet I can achieve a better job than what the factory did.
Eleron, it really pains me to think that you have to send in your unit - and be without it - while a busy Dell technician does all kinds of horror to your beloved machine. For me, if I definitely need new parts, the warranty's the way to go, but if I suspect a build issue like heatsink conductivity, I would do it myself and KNOW it's been done right.
I do however fully understand the non-engineer's approach to getting inside machines (like, don't!), but with all the internet guidance and videos around, it's really not as difficult as it might seem. The main things are not having any parts left over at the end of the job, and being careful not to strip screw threads - screws only need to be just-tight and the loctite does the rest.
And please, no offence intended (Eleron) if you're happy to do this stuff but prefer to have it done under warranty....
Cheers, Andy
-
Well, I think I've opened up my laptop at least 10 times in the past 2 years, and hell knows how many before that. So it's not about opening it, but more like the timeframe of recent events.
Since the GPUs were replaced under warranty not more than 2 months ago, and I have no indication whether it's a sensor failure or what I suspect - a thermal grease mishap - I didn't want to take any chances.
After that, the X9000 is to be installed... hopefully at the end of the week the ordeal will be finished. -
Unfortunately this is a problem that has spanned two machines and four graphics cards and many drivers. The issue is also present in XP, Vista and Windows 7.
I don't have any other external monitors to try it on. The two monitors I have are Dell U2410 and Dell 2407wfp-hc.
While it's nice that you don't have this problem, with the history I have with the XPS M1730 it's apparently one of its many issues.
Witcher looks awesome btw, how did you like it?
Having replaced the thermal paste in two machines, I can say it definitely is much better than the stock thermal pads that Dell puts in. My temps dropped significantly after applying Arctic Silver to the CPU and GPUs.
After the last replacement I am not touching the inside of the machine because I might sell it or pursue legal action. I think I might try the attorney general once again to update them. -
Eleron, I believe that you're taking prudent action. It's likely something that recently occurred with all the technicians who have worked on your machine. Dell should fix this issue to your satisfaction after all the problems you have had.
---------
BTW I am curious as to how many of you have at home warranty and were told to send your machine to the depot and have had to argue with them to send parts or a tech to your home? -
Well, since Dell doesn't have a regular depot in my country, I work with a partner company that gets the parts and does the job. So far, I can't complain - I mean, they did come through; it was Dell that stalled and delayed replacement parts or sent the wrong ones. Now, if it turns out that they used a crappy thermal paste, then there will be blood.
-
I was curious and hooked my wife's old HP monitor up to my beast, and again there was no problem. I'm just using the Nvidia drivers OTB, occasionally forcing v-sync if I play an older game. I've never tried XP, but I've used Vista 32 and Win7 64-bit without a hitch. Sorry you've had such bad luck, that sucks.
The Witcher is great! And it's a good game for the Beast since running it on max settings gives a smooth experience but feels like it does push the hardware a bit even with my 8800 GTXs running SLI. It uses Nvidia's PhysX as well so you can send monsters flying and skidding across the ground.
The preview for the Witcher 2 looks amazing.
-
A quick question about Nvidia drivers: on Kingpin's excellent advice I'm using 260.99 WHQL but I see on nvidia's site this driver is for XP and I'm running Win7 x64. The driver they recommend for win7 is 266.58 WHQL so what's the difference?
I'll quickly add that since upgrading to the 9800M GTX and X9000 I've been very happy, no un-forced BSOD or stability errors, so I'm not necessarily looking to upset my system for another 0.5% performance! I'm just trying to look for some logic in the decisions.....
Cheers, Andy -
Kingpinzero ROUND ONE,FIGHT! You Win!
Yo Andy,
Well, my advice was to use the Quadro 265.90, not 260.99 - although 260.99 is good for older cards as well... but the Quadros are in another league.
They're faster and more scalable, they work fine with most games, and they have the highest benchmark scores among the newer 26x.xx drivers.
Now - I didn't test the latest stable Quadros, which are 267.05 (there is a 267.17 as well but they suck afaik), so if you want to see some real performance gain, try either Quadro 265.90 or 267.05, both available on LaptopVideo2Go. You'll need a modded INF, which is on the same page as the drivers once you're ready to download them.
Report back
@Mag: great score dude, I'm able to break 10k with the GTX, but I suppose that's due to more CUDA cores. So the question: which drivers and which chipset drivers are you using? Turns out that the latest Intel chipset driver is mainly the older one we used on XP, but with a few upgraded components. -
Kade Storm The Devil's Advocate
Regarding the banter about the G92s' 'problem': all of 'em are rumoured to carry the potential for this defect in varying degrees, including our 55nm GTX 260M/280M units -- the latter tend to last because they're cooler by design and run in cooler notebooks that prevent those higher temperatures during thermal cycling.
[Rumor] More confirmation; NVIDIA G92/G92b bad Icrontic Tech -
electrosoft Perpetualist Matrixist
Rumored and discounted, unless there was another source I missed? (Could happen.) There was never any real, tangible follow-up except The Inquirer's normal mix of fact and speculation along with yellow journalism.
With that being said, the poor circulation, poor thermal application, and sometimes poor die contact would kill just about anything in the M1730 long term. Repasting/copper shims on your cards + rear elevation is like night and day temps wise.
But many times it is too little, too late when you realize your card(s) have been cooking for a long time. -
I ran into a little problem when I was swapping hard drives in my 1730.
Where the ribbon off the hard drives connects to the motherboard there is a little plastic clip/hinge that swings down to hold the ribbon in place.
While I was doing this, the plastic hinge came off. When I tried to put it back onto the holder, one of the nipples on the side broke, so now I can't reinstall it.
So my question is: does anyone know where I can get another one?
I'm not sure where to look or even what to call it.
Should I just tape the ribbon on? -
Sorry, Kingpin, I now realise that in my rush to test my new 9800M GTX card I must have forgotten to update the driver! (I really am quite the muppet sometimes.) I now have 265.90 and the modded INF - it seems to be OK, but I'm now at work so I'll have to wait to see any games performance increase. But wait, perhaps I should see a better 3DMark06 score now? When I leave here I'll run it. Can I ask what's the difference with a 'Quadro' driver?
Another newbie question for you; what about the BIOS for the graphics card? Are there published updates and (if so) is it worth the effort of updating? Mine is 62.92.7E.00.0D. Mean anything?
Cheers, Andy -
OK, I re-tested 3DMark06 and got 14416 for my setup, after the video driver change to 265.90. Before (260.99), I got 14474, so not much difference - a bit less in fact. Would this be expected or is there somewhere else I should look for a bigger difference? I made sure of the following before the benchmark test:
1. SLI turned on
2. Throttlestop is still giving me 3.8GHz on the CPU
3. Nvidia Inspector confirms 600/900/1500 for the GPUs
4. As few background applications as possible, but I didn't go through the process list.
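For anyone curious how I'd judge whether a delta like 14474 vs 14416 means anything, here's a little Python sketch of the sanity check (the ~1% run-to-run noise band is just my own rule of thumb, not an official figure):

```python
# Quick sanity check: is a benchmark score change meaningful,
# or within run-to-run noise? The 1% noise threshold here is
# an assumed rule of thumb, not an official figure.

def percent_change(before: float, after: float) -> float:
    """Signed percentage change from 'before' to 'after'."""
    return (after - before) / before * 100.0

def within_noise(before: float, after: float, noise_pct: float = 1.0) -> bool:
    """True if the change is smaller than the assumed noise band."""
    return abs(percent_change(before, after)) < noise_pct

# 260.99 -> 265.90 driver change on this system:
delta = percent_change(14474, 14416)
print(f"{delta:.2f}% change")       # about -0.40%
print(within_noise(14474, 14416))   # True: effectively the same score
```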
Cheers, Andy -
For 3.8 and a GPU OC you should get a little more than that..
But then again, Win7 is not amazing for that... so it looks ok. -
Hey guys,
My Zalman NC2000 has one of its fans making an awful noise, so I thought it was oil time, but unlike the usual small fans I have, there's no way to lubricate the Zalman's fans (none that I could figure out, at least), so I disconnected the loud one.
I'm thinking about buying a new cooler and saw the Zalman NC3000, and wanted to know your thoughts on it compared to the NC2000 - and of course whether there's a solution to the noisy fan!
Take care guys
-
Kingpinzero ROUND ONE,FIGHT! You Win!
The score is perfectly in line, but don't forget that a 3DMark06 score isn't indicative of much. Look at desktop GPUs: often you can achieve a higher 3DMark06 score while having bad performance in some games; this happens all the time.
So your real improvement will be seen only with in-game benchmarks, as they are the ones that actually count (real game performance). I personally left 3DMark06 behind a while ago.
265.90q are the best in these benchmarks:
Dirt 2
F1 2010
Mafia 2
Lost Planet 2
HAWX 2
Any game with built in benchmark should suffice to make a real direct comparison.
About the difference between the Quadros and ForceWare: the Quadros have higher IQ (image quality) and use full-range component colour instead of the standard RGB colour mode.
Besides that, their architecture is more oriented to quality, since Quadro cards are made for CAD programs and professional design applications. Their optimizations turn out to work beautifully on any GeForce card, improving the quality to a higher standard than the ForceWare drivers while matching or besting their performance.
So far 265.90q is considered one of the best drivers ever made, and under XP it does GOOD. Under 7 it does ok too, much better than other official ForceWares.
About your score: well, with a heavily tweaked 7 x64 I was able to achieve 15,295, but I really forgot what tweaks I did - there were too many.
I'm not even sure I was using the Quadros, but I'm almost certain I was, though.
The OS itself can make quite a difference based on its state.
If you're a bit below 15k you're perfectly in range.
If you have time, try to test the newer betas, 267.05 (Quadro) or the official 267.24. I heard that under 7 with older cards they're fine.
Dunno mate, I'm still using the NC2000 and it's going strong, though. But I'm seriously doubting it's worth having when my fans are at full bore all the time. The problem with the NC2000 is that it sucks up too much dust and messes up my bottom panel often, creating small dust agglomerates where it shouldn't. And temps are the same without it, so I've kinda left it aside for now.
I heard good things about Notepal, you may want to check it out. -
Kade Storm The Devil's Advocate
Of course it's rumoured, but to discount it on grounds of the obvious -- that there wasn't any tangible follow-up -- would be to ignore the cases seen on this forum and widespread amongst the M1730 community. Hell, had I been on any other forum, I would've panned it as a rumour and gone about 'discounting'. However, having been on here, and having had cards die out within the window of a few weeks, has completely changed my view. It would be grossly naive to assume that Nvidia would not fight such rumours and keep a tight lid on their image and liability in the matter. We do have a concluded lawsuit, which implies acknowledgement of something being wrong on their part; it also further implies that they fought to keep the time period for units sold that would be covered under the suit as narrow as they possibly could. Over and above all this, I certainly don't need a tangible follow-up to contradict the massive migration of M1730 owners from this place to the Alienware forums.
Yes, one can take measures to avoid this, but having to jump through all these hoops -- just to keep the system's crucial feature from breaking down -- creates work for the consumer and does indicate issues with the part and/or product. . . nor does it prevent the inevitable. -
Thanks again, Kingpin - all very helpful. In fact, I went back over your earlier mini-guide and guess what? I hadn't changed the Power Management setting to 'prefer maximum performance', so after doing that, I switched on Game Booster to drop out some unwanted 'baggage' while I ran 3DMark06 again, and got 14,600 (best yet for me on Win7 x64), so I'm a Happy Chappie now. I think I can see why you don't really favour Game Booster, as it decides for you how to configure the setup for a game, but for me it's a quick-and-dirty way to get a little more performance. No doubt more can be gained by spending time clearing out more background processes, but I'm using my beast mainly for work and then doing a bit of Flight Sim X or GTA4 after hours, so (I guess) I can't really set the machine up purely for gaming and still expect (for example) file searches to be as quick.
Any thoughts on my previous question about video BIOS? Is it important to update occasionally??
Cheers, Andy -
Hey again,
Thanks for the info, I just bought a Notepal X2.
Plus, I noticed that an A11 BIOS update came out. I just updated, and wow, the cards stay cool, at least for now - that's great news.
On another note, I saw many people with signatures saying 3.8GHz with the X9000 CPU.
After some digging it seems the program used is ThrottleStop - what settings do you use for the 3.8 config?
When overclocking from the BIOS to 3.2GHz the fans stay on like crazy; I imagine it must be even worse with the 3.8 overclock, right? How is it bearable to you?
Thanks again
-
I doubt you can buy that anywhere. Try calling Dell to see if they sell it. I know what part you are speaking of and I believe it is a proprietary item for the M1730 HDD cable.
I would be very surprised if tape will work out, as the connection needs to be snug at all times. If you cannot purchase that from Dell, you might try using the broken retainer with an adhesive like Super-Glue where the nipple broke off, but even that would be "iffy". -
I just stuck the ribbon connector in the slot without the lock and it works fine for now. I will have to try and find some tape just to hold it in place. I don't travel with it, so it doesn't get moved around that much.
-
Kingpinzero ROUND ONE,FIGHT! You Win!
For OCing through TS there's a few steps a few pages back, as I helped Fortune7 achieve the same.
About the fan noise, it doesn't change whether you're at 3.2 or 3.8GHz - above 3.2 in the BIOS the fans are at full bore.
It's a matter of making a habit out of the noise, especially when you're at 3.8GHz with both GPUs overclocked. Temps need to be monitored.
-
I'll check that out, but at the moment a simple OC from 2.8 to 3.0 gave me a neat performance boost, the CPU becoming less of a bottleneck for high-end games. And that's without a GPU OC either.
-
Try that at 3.8GHz instead and the bottleneck is gone completely
@Kingpinzero I tried the older 197.25 driver in Win 7 to achieve that benchmark score on the GPUs. However, I reinstalled the Quadros again and can't get the same GPU score as I did with 197.25. I will try different chipset drivers again and see where I land.
Funny thing though: with those 197.25 drivers, Bulletstorm ran even better in Win 7 than in XP. I was at 60 fps 90% of the time in Win 7.
I'll see if I can replicate those framerates with the Quadros. -
Kade Storm The Devil's Advocate
Sixty frames per second ninety percent of the time? For Bulletstorm?! That's pretty steady and hardcore, dude. Even my system's struggling to sustain that much performance. Then again, I am forced to remain on the Verde drivers -- currently the 267.24 beta -- as I've become too lazy to use Nhancer with the older drivers and create new profiles for games that are only offered SLI support on the Verde series. Kinda' hate this situation, since some of the older drivers are indeed better (yes, I think I am paraphrasing BatBoy's signature). -
electrosoft Perpetualist Matrixist
With the known defective Nvidia series chips, there were clearly enough defective models to warrant a massive repair/replace program. In regards to the later cores, outside of rumors, all that has been offered is speculation.
Now, you can't assume the same is true of the latter G9X chips. There are many probable reasons, but to claim the subsequent chip(s) are potentially defective on the same scale as the known recalled chips is speculation, and like I said, without a tangible follow-up (i.e. proof) similar to the listed and known defective models, you can't jump on a comparative bandwagon and claim the same status.
But like I said, with the M1730's poor ventilation and poor GPU/SLI contact, along with Dell's classic poor thermal compound application, you had users running their systems for months (weeks?) or years with insane amounts of heat building up; until A11, the cards idled where they should have been under load. Even at idle with A11, the cards still hit 58°C, and the fan pretty much cycles up and down constantly as the cards definitely want to get hotter. Elevate the rear, and temps drop and stay in the low 50s at idle.
Even with the copper mods, running FurMark in SLI still pushed the cards to 90+ on a flat surface with the fans at full throttle. As soon as I elevated the unit, the temps immediately dropped to ~80 running FurMark and never break 70 while gaming. I'm sure if I ponied up ~$14 for a new bottom panel and went dremel-slap-happy I could drop them some more, as it looks like at least 50% of the air flow through the bottom casing is compromised. I ended up reverting to A10, as with the copper shims and new compound, along with rear elevation (or unrestricted air flow), temps are the same. Should users have to jump through hoops to address Dell's cyclical shortcomings (heat issues -> dying GPUs, dying MBs, faulty A0-A2 PSUs, etc.)? No, but you can't inherently blame the later cards themselves on the same level as the known defective models.
I would argue that Dell's poor engineering exacerbated the defective nature of the known Nvidia cards and even in later Nvidia models, just like with many GPUs, shortened their lifespan by varying degrees based on poor thermal management from top to bottom. Not that Nvidia is innocent, but many of the heat issues could have and should have been addressed by Dell at inception or sooner during the follow up.
Not that Dell is alone when it comes to shoddy air flow. When it comes to many (not all) gaming laptops and even some normal laptops, providing unrestricted air flow has and continues to do wonders for internal temps (as evident by the mass of laptop elevation devices along with cooling pads compensating for the lack of rear elevation). -
Apparently, the service depot got the 'parts', whatever that means, and supposedly by tomorrow I should have the Beast back, and can report results with the new CPU...
Fingers crossed. -
When you guys talk about Nhancer, is it the same thing the Nvidia profiles offer with the latest drivers?
I noticed some of my games only used 10-15% of the second card's potential, so I used "force alternate frame rendering 2" and got a nice 15-20 fps boost in Bulletstorm, going up to 45-50 on the 9800M GTX and the 3.0 CPU. "Force alternate frame rendering 1" did the complete opposite and lost 20 fps from normal lol.
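For anyone wondering why "force alternate frame rendering 2" helps: AFR just hands alternate frames to each GPU, so per-GPU work is roughly halved. This toy Python sketch (illustrative only - not real driver code, and the fps numbers are made up) shows the idea:

```python
# Toy illustration of alternate frame rendering (AFR): each GPU
# renders every other frame, so with two GPUs the per-GPU load is
# roughly halved. Not real driver code; numbers are made up.

def assign_frames_afr(num_frames: int, num_gpus: int = 2):
    """Round-robin frame assignment, as AFR conceptually does."""
    schedule = {gpu: [] for gpu in range(num_gpus)}
    for frame in range(num_frames):
        schedule[frame % num_gpus].append(frame)
    return schedule

sched = assign_frames_afr(8)
print(sched[0])  # [0, 2, 4, 6] -> GPU 0 renders the even frames
print(sched[1])  # [1, 3, 5, 7] -> GPU 1 renders the odd frames

# If one GPU alone manages ~25 fps, two GPUs with perfect AFR
# scaling approach ~50 fps; real scaling is lower due to overhead.
```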
Can't get myself to OC higher - I couldn't bear the sound of the fans at max, and plus I've got no warranty, so I want my system to live as long as possible.
On a side note, I installed the A11 BIOS yesterday and my cards are kept cool: 51°C average and 70°C max with the cards OC'd (600/1500/900 with EVGA) playing Metro 2033.
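If anyone wants to compare numbers the same way, here's a tiny Python sketch of how I'd summarise a temperature log (it assumes you've exported a plain list of °C readings, e.g. from a monitoring tool's log-to-file option - the readings below are made up):

```python
# Minimal sketch for summarising a GPU temperature log, assuming
# the readings have been exported as a plain list of Celsius
# values. The sample numbers below are invented for illustration.

def summarise_temps(readings):
    """Return (average, max) of the temperature readings."""
    if not readings:
        raise ValueError("no readings")
    return sum(readings) / len(readings), max(readings)

# Example readings over a gaming session (made-up numbers):
temps = [49, 51, 50, 53, 55, 70, 66, 52, 48, 56]
avg, peak = summarise_temps(temps)
print(f"avg {avg:.1f} C, max {peak} C")  # avg 55.0 C, max 70 C
```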
What are your results? -
@KadeStorm, well I really like Nhancer, which is why I am on older drivers. The problem is that it doesn't support newer drivers, sadly. However, I can use Nvidia Inspector, though it isn't as good as Nhancer in my opinion. For instance, you can't set any SLI bits and test bit by bit like I do in Nhancer.
Also, my 197.15 drivers are rock stable; it is with those drivers that I get such good performance in Bulletstorm in Win XP. -
Kingpinzero ROUND ONE,FIGHT! You Win!
Just turned on my XPS again after a loooong time, and did a few tests with newer drivers.
Definitely thumbs up for the latest betas on XP, 267.24. I lost a few fps in Dirt 2 (56.4 vs 60 average, 48.3 vs 50 minimum, compared to 265.90) but I got a higher score in 3DMark06 - 16,606 - which is good.
In Crysis 2, SLI is working, but anything above Gamer drops the framerate to 28-30 fps, at least in both Advanced and Hardcore; the framerate is the same on both.
Gamer should run smoothly enough tho, but I guess the XPS is getting old.
At least I know that Crysis 2 will be a no-go.
Whereas I was able to hit 34fps on Very High in Crysis 1, I'm not able to sustain the High setting in the second episode.
Oh well, that's fine though. I guess this machine can keep me entertained for a bit more.
-
Kade Storm The Devil's Advocate
Right on, Magnus. I just don't have the time to fully apply myself to Nhancer. I mean, with guys like you, driver updates are virtually pointless since you can create your own SLi profiles for the games using Nhancer. This does take time and effort, not to mention, reverse engineering the profiles in Verde drivers using Nvidia Inspector.
And KingPinZero, give that beast some more love, bro. Lawl. I am surprised that you're only getting 28-30?! I am getting over 40 on Hardcore with my 280s, and my settings are terrible at this time. By the way, I have heard rumours circulating that these console-ish specs force MSAA on the Hardcore setting, so you'll have to find an external way to disable it in order to boost your frame rates at native resolution.
I genuinely do not believe Crysis 2 is as demanding, since my system shows the opposite results. I think you've got to tweak the system. -
Kade Storm The Devil's Advocate
I am not quoting the rest of what you've said because it's kinda' verging on the obvious, much like your prior post and I am not interested in disputing facts. In fact, I actually agree with you, for the most part.
You do mention a good point: tangible proof. While I agree with you, and do believe there's a possibility that the newer chips are fine, I will ask: why not disprove our suspicions? Why is the burden on those who are speculating from prior experience, where the duck test applies? If it gives errors like the prior chips, messes up like the prior chips, burns like the prior chips, then it is most likely carrying the shortcomings of the prior chips. Why not have Nvidia disprove the claim? Although, again, I don't fully think Nvidia are at fault either, because Dell takes part of the blame.
And speaking of companies not owning up and misplaced burden of proof: Dell's released botched products time and time again, the M17x R1 being another great example; they never officially accepted that there's any kind of fault with the system, even though it reeks of a product that was rushed out of the production line without a proper QA protocol. Sure, they replaced units, but they never officially acknowledged the matter. . . like, say, Intel did with the Sandy Bridge recall. So where does the burden of proof stand? Over the heads of those who are making educated speculations? Or the company that is quietly and understandably minimising its liability, just as any organisation would in a free-market environment?
Sure we don't have the absolute proof. But really, have they disproven anything? -
The service depot called and said that tomorrow at around 4pm I should have the laptop back; they also mentioned they've installed the X9000 and it runs ok.
Oh boy oh boy
-
Hmm, Kingpin, I will give the Crysis 2 demo a try. I can say that in the beta I managed to get SLI working, although with heavy screen flickering. The framerate was pretty good on Advanced mode for me in the beta, although unplayable due to the SLI flickering. It was well above 30 fps, I can say.
By the way, try copying my Crysis config over to the demo and see if that helps you a bit. At least in the beta it makes a small difference.
I am downloading the real Crysis 2 demo now and will see where I land. I'll see if I can tweak an SLI profile for the demo that runs pretty well for us
-
Kade Storm The Devil's Advocate
Congratulations, dude.
Question, however. . . I hope THEM putting in the X9000 didn't jeopardize your warranty in any way. I thought we're usually supposed to keep our original CPUs in the system for official inspections, to promote the idea that the system wasn't opened.
I agree with your findings, Magnus. Advanced and Hardcore come close in features, and I've found similar framerates. Not to mention, even at sub 30s, this game looks and runs smoother because the rate doesn't fluctuate all over the map. -
Nah - as I said, this is a third-party service, and they understand the whole warranty issue better than a multinational corporation.
Besides, I know the technicians there by name already, since I've been over there at least a dozen times so far...
Also, there was some incentive for them, cause you know, nothing's for free in the end... -
Just came back from a quick test of crysis 2 demo.
Even when boosting the CPU to 3.4 via the BIOS and overclocking the GPUs, I get an average of 20 fps on either Gamer or Advanced.
My GPUs hate overclocking and usually lock up after 30 minutes or so.
SLI is not really working: 99% load on GPU 1 and 25% on GPU 2, CPU at 100%.
Playing with the SLI settings in the Nvidia profile only made things worse, so I'm keeping those on default.
Guess a new driver with an adapted SLI profile should come out soon.
Edit: snooped around on the net and downloaded an SLI profile for the demo, and guess what - 36 fps average on Gamer and 31 on Advanced in the "Pier" map, which is more demanding than the "Skyline" map.
CPU was clocked back to 3.0ghz, will try it with 3.4 -
They upgraded my CPU to an X9000.
After more problems they replaced my machine with an M17x R2, and since I said I'd had an upgrade to the X9000, they gave me an i7 940XM in my replacement R2. So hopefully that will happen for you
-
Kingpinzero ROUND ONE,FIGHT! You Win!
About the Crysis 2 demo, guys: I think there's not much to do on the system itself; we need to see how we can tweak the game.
Take into account that I'm getting 28-34 fps on Hardcore on Skyline, which I think is proof that this machine can kick some!
I didn't try Gamer - probably I will later today; I guess I can do better framewise.
And Kade, the GTX 280Ms are on another planet. What is helping you is the quad core and 128 shaders, which are a step above the XPS's potential.
@mag: the flicker happens with older drivers. Using the latest betas with the new SLI profile update from Nvidia, the flicker is gone and SLI is scaling properly. I'll try your tweak asap. -
If I get one more issue with the laptop, be sure that I'm not taking anything back other than a new model - like an AW, that is.
-
Nah, it seems the demo doesn't use the autoexec.cfg and such yet. Also, you can't access the console in the demo as you could in the beta. I think with the real game released we'll be able to tweak it.
-
How is the KM650 9800M GTX model? Cause that's what I've got now...
+X9000 replaced for free
-
Magnus' info about Nhancer and later Nvidia drivers is well timed for me! I'm trying to improve performance in Flight Simulator X (yeah, I know it's old by now) and I found a web guide which refers to Nhancer settings, so I downloaded it yesterday and began to set it up. Then back here I get the message that Nhancer doesn't work with later drivers - I'm using 265.90q.
With my Win7 x64 setup I'm now not sure how to proceed; do I go back to an older driver so Nhancer works, or forget Nhancer, stay with 265.90q, and try other ways to improve FSX performance?
Cheers, Andy -
Kingpinzero ROUND ONE,FIGHT! You Win!
If they gave you some SLI settings to use with Nhancer, or some driver settings to tweak, you can do the same with Nvidia Inspector. Just use the same values in the per-game profile (e.g. Flight Simulator X) and you're sorted.
@eleron: the K650 is the latest 9800M GTX revision, so you're doing perfectly fine. Now let us know how the beast does
-
@Fortune7
Actually, the funny thing is that I tried those latest beta drivers for XP and Nhancer worked with them. That is strange, but Nhancer worked. However, I had to revert to my favourite 197.15, since the framerate was worse with the latest betas compared to 197.15.
For instance, Bulletstorm ran worse even when I used the same SLI profile as I have for the 197.15 drivers; the average was lower. Sure, the 3DMark score was better, but not the in-game performance. -
Actually, the beast is AMAZING. Not a single game hasn't benefited from the new CPU.
And mind you, it's only at 3.4 at the moment.
The greatest improvements were of course in NFS and GTA IV...
-
Yeah the X9000 gives a nice boost
-
Are you guys using earplugs when going 3.2+ on the overclock? Cause I don't know if these count as issues, but to me several points are raised:
1. the noise, obviously
2. fans running non-stop at max means they will have a much shorter lifespan
3. overclocking will wear the CPU out early, if it doesn't die suddenly from a defect
Are you guys all on warranty? Which would explain why you wouldn't care much -
SomeFormOFhuman has the dumbest username.
Noise is one thing, but 2 and 3 aren't completely true.
You shouldn't worry about it. In fact, it's a bonus when the fans run at full blast - it keeps your CPU and GPU cooler, increasing their lifespan by a lot. We're talking about a 15-20°C (or even more) improvement.
I've been running mine for 2.5 years and nothing has happened; if anything, its performance improves year after year, as I upgrade hard disks and clean the insides every 3 months from dust to keep up its optimum performance.
PS: my M1730 has not been turned off for 2 months now, with the CPU running at 3.8GHz and the fans running continuously.
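To put rough numbers on the "cooler components last longer" point, here's a back-of-envelope Python sketch using the common rule of thumb that electronics lifespan roughly doubles for every 10°C drop in operating temperature (a crude Arrhenius-style heuristic, not a guarantee):

```python
# Back-of-envelope for "cooler = longer life", using the common
# rule of thumb that semiconductor lifespan roughly doubles per
# 10 C drop in operating temperature. This is a crude heuristic
# derived from the Arrhenius model, not a precise prediction.

def relative_lifespan(temp_drop_c: float) -> float:
    """Lifespan multiplier for a given temperature drop in Celsius."""
    return 2.0 ** (temp_drop_c / 10.0)

print(relative_lifespan(15))  # ~2.8x for a 15 C drop
print(relative_lifespan(20))  # 4.0x for a 20 C drop
```

So a 15-20°C improvement from the fans running flat out plausibly more than offsets the extra wear on the fans themselves.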
Dell XPS M1730 Owner's Lounge, *Part 3*
Discussion in 'Dell XPS and Studio XPS' started by BatBoy, Oct 6, 2009.