I currently still have warranty on my laptop; I bought a three-year extended warranty. So if I complain about the heat problem, they might upgrade my laptop free of charge???
-
They'll try to fix it first. If that fails about 3-5 times in a row, you can push for a different system. Since they don't have any M1730s any more, it'll be some other laptop.
-
Nice, might as well try my luck
-
-
My laptop has started to act up.
More and more recently, it locks up when trying to view movies and becomes utterly unresponsive, so that I have to force a shutdown.
Also, in some games, I'm getting nasty artifacts, which leads me to believe the GPUs will soon die.
Either that, or the ES CPU is a pain in the behind
Anyway, temps always stay below 78C, might go ahead and OC.
Which drivers are overclockable in the 25x.x series, or more recent ones, and with what tool, nTune? -
Kade Storm The Devil's Advocate
I think that artefacts are a sign of GPU trouble. Good luck if you choose to pursue this further, Eleron.
-
Pray it's not, but Kade sounds right. So much for our new, improved Gen2 9800M GTXs. Eleron, I notice 3.6 in your sig > do you also get artifacts at 2.8 (1.2V)?
RivaTuner, nTune and the System Performance tool 6.06 are old, so give MSI Afterburner (which has ongoing support) a go. I tried Afterburner v2.2.0 beta 9 and it works fine with the 285 drivers... haven't tried it with the 290.36 beta yet. -
Kade Storm The Devil's Advocate
Nice diagnostic approach, Hikkoo. He should try with lower clocks and voltage.
I know a few things about those second generation 9800m GPUs, as you know; I even asked you for the BIOS to the older cards. They're stable, but susceptible to much the same issues.
On the upside, Eleron should get a good replacement if he qualifies his argument for one. BatBoy says M17X. I don't know if they have any of those dual-GPU R1s or R2s anymore. If he can get a good R2, that'd be great. I know the R3 wouldn't be appropriate if he argues for a high-end dual GPU rig. You never know, a possible M18X might be on the table. What can I say? I'm an optimist. -
Played the whole HL2 series a couple of weeks ago, and there were parallel horizontal lines of messed-up graphics at some points.
OK, will OC with those, because to make Heroes VI playable I had to lower the settings to medium and the resolution to 1440x900, which is ridiculous. I'm guessing there's no SLI support for it.
285 + RivaTuner, you say? Will try that, definitely -
Up to this point, I've had:
-1 battery replaced;
-1 power adapter replaced;
-1 motherboard replaced;
-3 GPU sets replaced (one for being an 8700M GT SLI set; the other two 8800M GTX sets simply started messing everything up: locking, giving a lot of nv4_disp errors, the usual).
It's been close to a year since the 9800M GTXs have run at full speed, so it might just be their time to go.
I'm not taking anything less than an equivalent replacement this time. Nothing below GTX 260s... -
Kade Storm The Devil's Advocate
If you want duals, your minimum aim would be an M17X R1 with dual 260s, or the R2. I just don't know if Dell's got any of these in stock. The R3 is a different beast since it's more of a single-GPU laptop with the 3D gimmick. You can get it with the AMD/ATi 6990 or the Nvidia 580, which are behemoths in their own right, but there's no dual-GPU support. I feel there are certain advantages to having weaker dual GPUs as opposed to a very powerful single GPU, since even a nice laptop with a 485m will fall behind SLi 9800m GT/X in certain areas. I know that resolution and bandwidth will be limited. So. . . yeah.
Be positive, bro. We're all here, and support and advice is always available. I am sure that if this is played properly, Dell will cooperate nicely with you to reach a solution that leaves you happy. -
Hope the 580M's GF114 turns out to be reliable -
Kade Storm The Devil's Advocate
Hikkoo, it's all good, my br0ski! I couldn't have learnt much about 'em without you in the picture; you're our lead 9800m GTX expert by virtue of being one of the few pushing these cards to new levels and keeping the M1730 relevant, along with the likes of Ernstig, SomeFormOfHuman, Eleron and Magnus.
At the risk of turning this venture of understanding our machines into a sentimental analogy, I'll draw a picture. In the realm of the XPS M1730 crowd, we're like one happy yet dysfunctional family. Just to name a few. . . Eleron's the fun uncle who is into cool cars. Magnus is the cool uncle who does all the nifty stuff with this machine when it comes to optimising even the latest of games. You're the Fear Factor enthusiast uncle, "moar powah", doing stuff with these machines that would be deemed unsafe by NASA standards. Ernstig's the 'it'll all be okay' uncle, taking each issue one warranty-stride at a time. TimeWriter is the compromising uncle. DDT5 is the 'safety first, boy' uncle. SomeFormOfHuman is the Singapore uncle, who's got info on all the secret goods regarding these machines and more. Hankaaron. . . let's just call him Uncle Hefner. BatBoy's the rational and responsible face-palm uncle who must keep us and the rest of the dominion of notebook nerds in check -- I feel for him. I am the crazy unsuccessful rockstar-wannabe uncle who turns up drunk and high, only to elicit a face-palm of epic disappointment from BatBoy.
Yeah. It's all good.
Anyway, I will add the following to this G92 discussion. Now, I don't know the details or the exact facts, but I think it goes beyond the veil of these chips having that solder problem. I recall some vague claims that while the G92b had the same problem, some of the newer revisions had it resolved. People were sceptical of these stories, as was I a year back.
Taking my anecdotal account into consideration, since I did own two M17X R1 units with 280s, there were never any artefacts, even with extreme overclocking and voltage modifications. Having interacted with folks who have owned machines running the newer G92 cards, I felt the same opinion being mirrored; issues of this nature weren't prevalent. So in my experience and opinion, all cards from the 260m onwards are more or less reliable when it comes to hardware consistency.
Of course, this might seem pointless now, since all of this technology is, by modern standards, antiquated. So yeah, let's root for the 580m, even though it seems like I won't get the opportunity to touch that thing for a good few years. -
Woow, Kade, cool surprise... can't beat or OC that.
Suppose the sad failure rate of GPUs is where 'dysfunctional' originated from > many bitter thanks to Nvidia for that.
Think I now need to cheer myself up and do some benching -
Thanks Kade, and I hope it gets resolved for you, Eleron. When we talk about the XPS M1730, it reminds me of my smartphone with an Nvidia Tegra 2 GPU in it. It still beats the likes of the Galaxy S2 in gaming out there; then again, of course, my phone is as tweaked as my XPS M1730
-
Kade Storm The Devil's Advocate
Right on, Hikkoo, but like I said. . . remain positive. You've still got a great machine that can do plenty of things today's consoles couldn't dream of accomplishing.
And Magnus, buddy. . . I saw your mobile videos on YouTube. Nice stuff. I agree with your comparison. However, to be honest, I am more in the Xperia Play camp, since we have the dedicated controls, and the Adreno 205 is practically as good as the Tegra 2 so long as the dual-channel RAM is doing its job. I think dual-core phones such as the S2 have their place with CPU-intensive stuff, but mobile gaming, it seems, is still very much oriented towards a single core and a good GPU.
As always, it comes down to those who're willing to squeeze the value out of their product or investment. I passed my R1 on to an old friend, and I'm still ripping off tips from Magnus to keep his machine up to pace, since he's managing to run plenty of new games in their grand glory. In my opinion, nothing beats that feeling of stability and knowing that one is able to maximise one's machine, as opposed to the fun of dealing with much more powerful hardware that has yet to be mastered.
I was reading another thread on here about a related subject, where someone found that their older Intel HD graphics performance improved significantly thanks to newer drivers. That's what I like about older systems: once the novelty of the hardware wears off, the challenge and opportunity fall on the user to tweak and maximise the full potential of that hardware. This is where driver maturity can really help older hardware, at a time when drivers are primarily designed to achieve stability with the newer stuff. -
Does anyone have trouble switching on their M1730? (I am on my 2nd motherboard replacement.) Often I have to remove the battery, hold down the power button and re-insert the battery; sometimes it switches on, sometimes it won't, and sometimes if I hold the power button down for more than a minute it switches on.
This is happening more frequently, and it happened a lot even before the motherboard replacements. Is it a common problem, or just a sign that my motherboard is dying? -
@nzaptx It shouldn't be a common problem. So this problem happened on all 3 mobos? Has Dell ever done a keyboard replacement? Just a thought > have you noticed whether it randomly switches from AC power to battery mode in the OS?
Kade, it sure is challenging; the more effort put in, the more reward.
Fraps in Heroes VI, attacking 2 timber wolves > with SLI enabled @ 3.2GHz, in-game settings all on high, 1680x1050: max = 26fps
So with SLI disabled I get max = 51fps, wth!
Eleron, ditch SLI and see if fps doubles? Hope the Beast is still kicking
-
Kade Storm The Devil's Advocate
Sorry to hear that, Hikkoo. It's one of those things, I guess. Certain turn-based strategy games tend to have that CPU factor, where a single GPU with a fast CPU is actually better. Less detail and fewer polygons, but more variables and calculations taking place. I had the same problem with World in Conflict when I first got it. . . it actually ran smoother with a single GPU but a quad-core CPU.
In the end, these things can be fine-tuned. As more drivers come out, older profiles become more stable and, eventually, more refined. I am sure you'll come up with something down the line. Do share it with us when you do. . . heh. -
Kade Storm, I even ran World in Conflict at 1920x1200 on my old XPS M170 at mostly high settings, with framerates from 17-60 fps; 17 fps with heavy explosions.
That laptop was as heavily tweaked as my XPS M1730 -
Good one, Kade, I should definitely write that down somewhere, it was hilarious
Thanks guys, will try all your suggestions.
Thing is, with nTune I can OC it at 600/900/1500; anything above that leads to a BSOD. I tried 1550, which should be fine afaik, but it crashed. Definitely improved, though.
I lowered the resolution to WSXGA+ (1680x1050) and the fps count increased. Also, I might have some creepy spyware around: when I do a clean boot with no Internet connection, the game runs smoother.
I decided what I want for Xmas: a steering wheel and pedals. It's high time I played racing games properly.
And Kade, at this point I'll take even a single GPU if it's new-gen, because we have to acknowledge that any new laptop pretty much beats the 4-year-old XPS. It's still my favourite laptop, and for as long as it works, it will be my main beast. -
Well, I reverted to a slightly older game, Test Drive Unlimited 2. With a custom SLI profile, the game now runs at 1920x1200, everything on very high, at a butter-smooth framerate.
Saints Row: The Third runs pretty smoothly too now, with a custom SLI profile -
Well... !@#$, it has been more than a month...
I am still under warranty, living in Europe. Went to the Dell partner's office... ordered replacement GPUs... again... and nothing... the order is pending... nothing is happening and the M1730 is sitting dead... C'mon, I understand a week or two, even three, but more than a month... without any progress... it is killing me... -
Call and ask them for service. I am in Europe too, and the most I've waited was 2 weeks for servicing, and that was with parts coming from China.
Magnus, would you mind uploading all the profiles? Might get back onto some old games myself...
Still on XP?
If so, any other tricks would be awesome. -
Kade Storm The Devil's Advocate
Anyway, on the subject of replacements, your personal contentment is always the best way to go -- we get too caught up aiming unreasonably high.
In your case, your biggest plus would be a next-gen CPU. I am just going to be objective: this is a numbers game, and these days I'm most interested in net performance results. Speaking strictly from the empirical figures, in terms of GPU grunt, a 580m GTX -- the very best, with the AMD 6990m on its heels -- is going to be just about equal to 9800m GTX SLi. Maybe you'll lose 2 FPS on very high in Crysis compared to your present SLi configuration. So the net result is still a positive if you get a machine with a single 580m or 6990m, because you'll have a much better CPU platform with a single GPU that can match and even somewhat exceed our last generation of SLi machines, without the dual-graphics configuration bugs. This level of performance is still plenty good. Naturally, system replacements take some time and some back-and-forth discussion between the two parties, so you can still try for a dual-GPU system if you have a change of position on the matter.
Magnus, I think the companies selling SLi laptops (and Nvidia) should pay you royalties for how much rage you can squeeze from the cards. Please, do share the configurations -- I'd like to pass 'em on to my buddy because. . . I still game on his system from time to time. And cheers for sharing the feedback on World in Conflict. I actually got that one running nicely on a Toshiba X205-SLi with 8600m GT SLi -- nice laptop.
Greenfield, I would highly advise some patience at this point, since it's that time of the year when things will be busy, especially here in Europe. Furthermore, most of their stock of replacement cards is. . . well. . . refurbished replacements from days of old. They'll probably try to squeeze something out of their inventory, but the chances are somewhat slim. Give them until the end of the year, and then calmly press for a more urgent resolution. Cite the possibility that the non-functional laptop is negatively affecting your productivity and that it's no longer reasonable, in the good faith of the NBD warranty, that you should wait any further. Perhaps if you give them this leeway and then appeal to their reasonable side, they'll move your case to escalations and offer you a system replacement. -
-
Yes, I will have to check all the profiles and upload those that I think work best.
Will do that soon.
-
Awesome. Seems to me that you're the only one who can turn a slower system (spec-wise) into a faster one, compared to other configs.
Like, your system is faster than mine, even though I have a somewhat better SLI setup. Although it's OCed at only 600/900/1500; any higher and it BSODs... -
Uhm, quick question.
I have a Panasonic Viera LCD TV that features HDMI / Composite / D-Sub / USB / SCART / RF connectors, and what I want to do is connect to the TV with both audio and video.
Now, video is easy with a DVI-to-HDMI adapter, but what about audio? The XPS has 2 audio outputs and a 7.1 onboard system, afaik, so how do I port the audio to the TV? What kind of cable do I need? -
Kade Storm The Devil's Advocate
I am not advising, but simply reiterating what's been seen in in-game performance. Yes, I am saying that in some cases that kind of performance disparity, or lack thereof, will be seen, but in other cases quite the opposite will be and should be seen. It's already been documented by many websites, including the infamous notebookcheck.net and Tom's Hardware. I think you're actually taking performance differences a bit too literally. Theoretically speaking, a 580m GTX is very powerful, but performance is still performance in real gaming scenarios, not considering the newer features the card brings to the table in comparison to its older counterparts. For example, one can sustain much more performance without taking a hit from higher levels of anti-aliasing with many of these newer cards. So of course, some of that power just won't be matched by the outdated hardware. So I am in both camps, in the sense that cards like the 580m will break leagues when certain features are invoked, but in other cases they might not, when compared to the SLi rig.
Take the example of people tweaking their systems with SLI 9800m GTX: they were scoring around 22 FPS on average and are now getting about 24-27 in Crysis under very high [1][2]. The 580m GTX scored about the same under this setting -- some tests are a couple of frames higher and some a couple lower. . . it really depends on other factors, and as we can see, the story completely changes and the 580m becomes rock-solid when higher anti-aliasing is invoked [3][4]. Please do keep in mind, I am speaking simply in terms of peak performance under actual gaming conditions. Synthetic benchmarks and genuine future features (DX11 et al.) are another story.
As for your personal experience, I am glad you enjoy the leap. I am presently using a next-gen i7 laptop with the 485m GTX, and I have to say that my experience speaks otherwise; the 485m GTX runs great, and it's nicely optimised, but it doesn't in any way leave these people's cards in the dust from an iso-setup point of view. So the 460m wouldn't even be a match -- not even close -- it's in around the same league as a single AMD 5870m, which, while very fast and with a much more efficient architecture, still doesn't match. My M1730 with 9800m GTX SLi did match performance in older shader-intensive games when both cards were running at their best. My 485m GTX averaged around 26 FPS in the Crysis very high benchmark at 1920x1200 with no AA; my barely overclocked 9800m GTX SLi (600/1500/850) achieved up to 28 on average under the same settings and driver. I still think the 485m GTX is substantially better; I can get much more anti-aliasing out of it at higher resolutions. However, when average high-bandwidth performance is considered without some of those finer features, it still roughly matches the SLI setup. Of course, this all invariably comes down to how well our machines respond to our choice of drivers, our optimisation attempts, and the general state of operating system efficiency.
On your experience again: perhaps your SLi system wasn't doing as well because of other factors? I am sure you know that 9800m GT SLi is very limited by virtue of the 512MB VRAM. I can verify this because I upgraded from 8800m GTX to 9800m GTX and eventually to 280m GTX, and the performance gain in high-end GPU-intensive games was significant precisely because some of the games were exceeding the memory capacity of the cards. We already know that Metro 2033 on very high exceeds the 512MB budget, no matter how fast the card. Of course, this is just ONE of many potential confounding factors.
Final verdict, from the actual data from established websites, variable benchmarks, and a variety of anecdotal accounts: I have presently concluded -- and previously stated -- that in some cases the 580m GTX will be matched, and in others it will be better and even achieve certain things that are just not possible with the older cards. Superior and newer technology is generally the way to go, even if the direct performance advantage isn't blatantly evident.
References:
[1] http://forum.notebookreview.com/sager-clevo/314966-9800m-gtx-sli-benchmarks-complete.html
[2] http://www.notebookcheck.net/NVIDIA-GeForce-9800M-GTX-SLI.22440.0.html
[3] Benchmark Results: Crysis : GeForce GTX 580M SLI Vs. Radeon HD 6990M CrossFire
[4] http://www.notebookcheck.net/NVIDIA-GeForce-GTX-580M.56636.0.html -
I was just basing my opinion on what I'd seen for myself from games and benchmarks run on both my own laptop and laptops belonging to my friends. I also use the notebookcheck site as a good way of comparing GPUs, and that site shows the 580m GTX able to obtain almost double the score that 9800M GTX SLI can obtain in 3DMark06 ( http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html). Obviously this will also be dependent on drivers and the CPU, but then you can't buy a laptop with a 9800M GTX and an i7 anyway.
Of course, overclocking can also improve the performance of the 9800M GTX, but then you can overclock the 580M too. As you pointed out yourself, the 580M has the added benefits of DirectX 11 and other newer technologies.
The other thing which needs to be remembered is that SLI solutions do not come without issues. If you have 2 GPUs which together can equal the performance of one particular single GPU, the single-GPU solution is always the better option: less heat, less power, and better compatibility with games, etc. (I have had enough SLI setups to know how annoying it can be to get certain games to work in SLI).
But anyway, after reading your last post, I see where you are coming from, and that's fair enough. In terms of raw power they may be slightly comparable, even in some games such as Crysis -- but Crysis, although overly demanding, is still quite well optimized compared to a lot of the console ports out there, which surely would fare better with a new single-GPU solution. So I would still have to say that for real-world use, the 580M is a significantly better GPU.
Anyway, I think I'm going off-topic from what you originally posted, which you have made clear and backed up in your most recent post. Thanks -
-
Something along these lines may work http://www.amazon.com/3-5mm-Plated-...1?s=electronics&ie=UTF8&qid=1324144553&sr=1-1 or, alternatively, do you have any kind of external speakers/home cinema with the TV? Because then you could use one of those 3.5mm-to-RCA audio adapters, which would give you sound.
-
Seems like exactly what I need. I could always go for external speakers, but I'd much rather keep the TV setup intact and just port the image and sound to it.
Thanks -
Interesting, Kade and XPS_m1710.
I wonder how to fairly compare one GPU with another > you'd need some type of constant machine that can run each of these GPUs.
Damn pity I can't put a single GTX 580M in my Beast to find out the real difference.
The NBR link's claim of Crysis being 100% better with SLI sounds insane; have to myth-bust that.
Ran some Crysis (patch 6115) GPU benchmarks: DX10, 64-bit, VeryHighSpec, HDR (MGPU), DevMode, Streaming, level=island.
Win7 64, 290.36 beta, Gen2 9800M GTX SLI at default clocks, @3.4GHz, 1920x1200, VHigh, NoAA, NoVsync:
SLI disabled: avg = 11.7 fps
SLI enabled: avg = 22.4 fps
About 92% better... near enough confirmed (quick arithmetic check below).
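As a quick sanity check on that figure, a minimal sketch using only the two averages posted above:

```python
# Sanity-check the SLI scaling figure from the Crysis runs above.
single_gpu_avg = 11.7   # avg fps, SLI disabled
sli_avg = 22.4          # avg fps, SLI enabled

uplift_pct = (sli_avg / single_gpu_avg - 1) * 100
print(f"SLI uplift: {uplift_pct:.1f}%")   # -> 91.5%, i.e. "about 92% better"
```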
Had a bench problem with clean installs of Crysis on Win7 64 > the benchmark .bat always kept running in Custom mode for High or VHigh.
Noticed Texture Quality sometimes gets reset to Custom.
On High or VHigh, the in-game console shows r_DynTexAtlasCloudsMaxSize=24, but it should be 32.
Was able to fix VeryHighSpec mode by creating an autoexec.cfg file with sys_spec_Texture = 4, then adding the cfg file to the Crysis folder (sketch below).
(With the texture fix > Custom keeps appearing at the loading screen, then changes to VeryHighSpec at the very start of the first bench run (looped:0).)
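For reference, a minimal autoexec.cfg along those lines. The sys_spec_Texture line is what the post describes; pinning r_DynTexAtlasCloudsMaxSize as well is an assumption, added only because the post notes the value should be 32:

```
-- autoexec.cfg, dropped into the Crysis install folder
sys_spec_Texture = 4              -- pin texture quality so it stays on VeryHighSpec
r_DynTexAtlasCloudsMaxSize = 32   -- assumption: force the value the post says it should be
```
-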
Hikkoo, might as well ask you too: is there anything specific that you do to tweak your system?
My 9800M GTXs cannot go over 600/900/1500 without BSODing, and the CPU is at 3.6, but apart from that I'm not doing anything specific. So any tips would be appreciated. -
For everyday use, the usual tweaks:
> I have a two-single-hard-drive setup: Win7 64 on a 320GB Hitachi, XP Pro 32 on the other Hitachi. Both got a few BlackViper tweaks, and neither OS has Intel Matrix Storage Manager installed; it's not needed for a single-drive setup (of course you must install Matrix Storage Manager if you have a RAID setup). No Dell MediaDirect installed, and antivirus disabled when gaming offline or benching.
Eleron, which XP GPU driver are you using? Want to try a 3DMark06 benchy at 3.6 on default clocks and at 600/900/1500; interested to see if we get similar scores. Also, which GPU OC tool version are you using when you get that BSOD? I'll test on my XP.
I have my Beast raised at least 20mm with plenty of space at the rear, about 30cm from the back wall. What about yours: raised, or do you use a laptop cooler? -
Kade Storm The Devil's Advocate
XPS_m1710, I think there's certainly some ground for agreement. Where we possibly deviate is on the factor of your experience, and I've always agreed with the idea that dual-card solutions aren't for everybody. This is one of the reasons why I am least interested in synthetic benchmarks. My original M1710 with the 7950go GTX scored much lower than my Toshiba X205-SLi (8600m GT SLi), but in-game performance was almost the inverse, because I had still not managed to tweak the system properly. Despite all of this, when it came to older, stable, PC-exclusive titles, the machines were relatively matched, with the X205 only edging ahead in some of the newer shader-intensive titles, largely because vertex shaders and pipelines were radically transformed from the 8 series onwards due to unification and higher counts. So again, if you can achieve almost the same with a single card. . . you're in a much better place. If you can do even better. . . well. . . even better then! I still primarily rely on the philosophy of a single-GPU approach, while reserving the secondary card for the real grinding software.
By the way, it's good to keep tabs on notebookcheck. In fact, my initial statement did take notebookcheck figures into account; it was their tests of that particular game, Crysis -- which I still consider THE benchmark -- that showed these results. I simply cite them a bit cautiously around here, since some people take exception to the idea and often perceive it as a wholehearted gospel endorsement of notebookcheck.net, which is clearly not my aim.
Anyway. . .
Hikkoo, nice run. . . and with newer drivers. I know that Crysis performance has dropped with the new drivers. The SLi Alienware machine that's now with my friend lost a good 3-5 FPS on average with the newer 280.xx nVidia drivers, but the average is still pretty consistent. Still, you've managed to demonstrate an almost 100% increase from dual-card scaling, which should hopefully reinforce the idea that if people fully explore and exploit their solutions, then a near-100% boost isn't entirely impossible. It just takes more work.
Eleron, you must keep the system nicely cooled, which I think you're already doing. Now, the newer 9800m GTX cards in the M1730 are a mixed bag, unfortunately. Some can overclock a bit higher and remain stable, while others cannot. I think 600/1500/900 is very modest, actually. I remember Hikkoo churning out some insane clocks on his older cards. Hell, even I got revision 2 cards and was able to make it to 650/1625/960. Of course, I also experimented with over-volting the cards, because driver crashing -- in my case -- correlated with lower voltage at higher clocks. I had to increase the voltage by a small margin to get stable performance at higher clocks. I can't be certain that this is your problem. -
At the moment I'm on 285.66. Beast raised about 50mm at 3 points, so there's airflow beneath all of it.
I'll do another 3 runs and post the best score
So: 3.6, 600/900/1500, OCed with MSI Afterburner.
-
-
Heh, how about that. The overall score non-OCed is higher than OCed, although some of the sub-scores are higher. I can't make anything of it.
-
Woow... something's not right when OCing the GPUs... lol, 3DMark says 3.9 GHz.
Try uninstalling MSI AB and running at 3.6 to rule out an MSI problem; possibly the BSODs did something to the GPU OC tool or to the OS.
Weird, the 285.66 INF file has been removed at laptopvideo2go???? So I tried 290.36 and replaced the INF file, got an error saying "Required files are missing", lol. Tried Have-Disk but it screwed up XP, so I reinstalled.
Beta 267.24 from NV, TS-3.6_1.375v-clkmod ticked:
500/1250/799: 3DMark06 = 15862, SM2 = 7401, SM3 = 7699, CPU = 3332
600/1500/900: 3DMark06 = 16199, SM2 = 7369, SM3 = 8311, CPU = 3267
Odd, I can't get GPU-Z to show SLI as enabled.
3DMark06 problems at 650/1620/960 or 644/1610/950 > 3DMark freezes or glitches with lag... have noticed Gen2 isn't as good as Gen1 for OCing. Kade, maybe NV snuck a restriction into Gen2? But you're probably right about voltage. -
Yea, I know, that's really odd.
I have always had weird results and performance issues when comparing with similar systems. I guess I'm missing something that others are doing by reflex.
Oh well, at least gaming has never been an issue; the Beast handles everything without problems.
I'm surprised how smoothly COD MW3 runs maxed out. -
Cool... are you OCing the GPUs for MW3?
Remember, a fresh OS install helps get higher, more consistent scores. Noticed something in the OC pic: it appears the MSI AB clock values were adjusted while 3DMark was open, which could be the cause of the low GPU OC score. I reboot for each bench, wait till the hard disk blue light goes quiet (off), make sure nothing is running in Windows Task Manager > "Applications" except the OC tools, adjust the OC clock values... and always open 3DMark last.
CAS4 RAM gives better scores; if you have CAS5, knock about 80 points off my total scores. Also need to take into account that my fresh XP helps toward a higher score.
Okay, we don't have the same driver, but I do think your no-OC score looks mighty good.
Tried older GPU-Z versions, still won't show SLI as enabled -
Nope, running MW3 on stock.
I've noticed that OCing is not necessary.
Also, just got Serious Sam 3 for a little while; man, this game is a killer, both in looks and demands. Hit 91C on the GPUs, so I did a fan cleanup. Again, it tops out at 75C
Scratch that, it's only 69C when OCed
-
Greetings folks,
Newbie to the forum, Newbie to XPS, Ancient in the IT world.
My new "Beast"
Dell XPS m1730
CPU - T9300 2.5GHz
VIDEO - Dual Nvidia 8800M GTX SLI (512MB each - 1024MB total)
HDD - 2x 200GB - RAID 0
Memory - 4GB
Red inserts.
MAN do I love this machine......!!!! Boy Howdy!
So, I found it listed on my local CL site for $600 FIRM. Advertised as 3GB memory, 512MB video (2x256) and two 200GB HDDs. Around here that usually means some defects to be overcome, usually about $200-500 worth.
This unit was purchased originally by a firm to do "graphics calculations"; after about 30 days they found that their IT group could not get it "fast enough", so they shelved it for 18 months, then decided to "reduce clutter" and sold it off.
There is not a single mark on this unit. Period.
I bought it from them and did the following:
Wiped the HDDs blank
Set them up as RAID 0 (they were originally separate)
Installed Windows 7 Home Premium x64 (it was Vista x32)
Got ALL of the drivers up and running... sheesh...
Then I ran the "Windows Experience Index":
Processor = 6.1
Memory (RAM) = 6.1
Graphics = 7.0
Gaming Graphics = 7.0
Primary HDD = 5.9
WOW!!!
I am now selling my gaming/CAD desktop, my basic desktop, AND my older DV9000 laptop. No need anymore.
So I called (chatted with) Dell to try to find out how to get copies of the original "accessories kit" (disks, manuals, cables, etc.), and they said that I could buy a set, or, since I had done the ownership transfer, they could (for a fee) put me back on warranty until March of 2013, and then the kit would be free!!!!
Seriously? that is just too cool!
I am so sold. -
Hey everyone, I just bought an M1730 last week (I've wanted one since 2008, but ended up buying an XPS 630 back then). It is a great laptop, but I have a few questions. What is the ideal temp for the GPUs while idle and while playing a game, for example Call of Duty 4? And would it be worth it to buy the Nvidia 9800M GTX cards and the Intel X9000 processor? For the money I'd pay for the upgrades, I could sell the computer and just go buy a newer Alienware laptop. But if it is really worth it, I wouldn't mind keeping it.
-
-
Question about quad CPUs...
I have read OVER AND OVER that you cannot put a Penryn quad in an M1730.
I just finished reading two sites where folks "claim" to be running quads. One says he is using a Q6600 that was installed when the machine was serviced (I suspect an E6600 would be more believable); the other, a Q9650.
Can someone verify that this CAN or CANNOT be done?
I have emailed both bloggers asking for clarification, but I do not really expect replies. -
sgheeter, cc203 welcome to The Lounge
http://forum.notebookreview.com/hardware-components-aftermarket-upgrades/393027-pll-pinmod-overclocking-methods-examples.html
Expand No. 4 > "Is it possible to use a 1066MHz FSB Penryn CPU on an 800MHz FSB Santa Rosa platform".
Remember, Denz101 in the Part 3 XPS M1730 Owners Lounge posted that he installed an X9100 but it didn't work; he never mentioned doing any mods. A BIOS mod would be needed to recognise the CPU, but our PLL may not support it.
sgheeter, you could ask about quads in the above link.
Ideal temps? Depends: which BIOS do you have, A10 or A11? Which GPU and CPU? The setup of the Beast is so important > is it raised off the benchtop, sitting flush on the benchtop, or on a cooling pad? Which NV driver are you using? Room temp? Are you 100% certain the internals are cleaned of dust?...
The main thing to be concerned about when gaming is that the temp doesn't consistently stay over 85C. It would be nice if you could open a temp utility on the desktop, then open a game and play for an hour or two, close the game, and after 20 minutes of idle look at the readings and post the temps here: value = ? min = ? max = ? That would be so helpful > including specs, and especially which Dell BIOS version you have. Even a temp utility pic would be more helpful. (A quick logging sketch follows below.)
I use CPUID Hardware Monitor (free) because it's light and seems to show consistent temps; been using it for nearly 3 years in my Beast with the X9000 and fans on 100% speed. Never liked the fluctuating fan speeds, so I've nearly always set them to 100% in the A10 BIOS... didn't at the very first, wasn't aware of the weaknesses of the Beast that have workarounds, except for hardware issues.
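On that note, a minimal temperature-logging sketch, assuming the third-party `wmi` Python package and a board that exposes ACPI thermal zones. It reads motherboard thermal zones only, not per-GPU sensors, so a tool like CPUID Hardware Monitor remains the better option for GPU temps:

```python
# Minimal temperature logger (assumptions: Windows, `pip install wmi`,
# and a board that exposes the MSAcpi_ThermalZoneTemperature WMI class).
# Note: this reads ACPI thermal zones, not the per-GPU sensors.
import time
import wmi

w = wmi.WMI(namespace="root\\wmi")

for _ in range(120):  # one reading per minute, ~2 hours of gaming
    for zone in w.MSAcpi_ThermalZoneTemperature():
        celsius = zone.CurrentTemperature / 10.0 - 273.15  # stored as tenths of a Kelvin
        print(f"{time.strftime('%H:%M:%S')}  {zone.InstanceName}: {celsius:.1f}C")
    time.sleep(60)
```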