The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    XPS 16 GPU overclocking

    Discussion in 'Dell XPS and Studio XPS' started by gpig, Sep 17, 2010.

  1. gpig

    gpig Notebook Deity

    Reputations:
    82
    Messages:
    885
    Likes Received:
    0
    Trophy Points:
    30
    In one of the other overclocking threads we saw that the trial version of 3dmark uses a different (non-configurable) resolution for the 900p screen compared to the 1920*1080 (at least for 1 user), which makes a very significant difference in the number of marks. If I recall, both screens had the same horizontal resolution as the default in 3dmark but the 900p screen had a lower default vertical resolution. Something like 1280*1024 vs 1280*800.
     
  2. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    Well I proved myself wrong. I ran a test to see if memory speed is indeed limited in relation to core speed.

    I ran 4 runs of 3dmark.

    My first run was with a mild 700 core speed and a ratio-correct 861 mem speed, which netted me 8517 points. Then I upped the memory speed to my stable max of 1100, which boosted my score to 9437. Then I ran a faster core speed of 800 with the more limited 861 mem speed from my first run, and only got 8837 points. This isn't a perfect comparison, as the difference in core speed was only 100MHz, where the difference in mem speed was 240MHz.

    So long story short, boosting mem speed actually gained me just as many points as boosting core speed, and the ratio of the two is irrelevant. Lesson learned by me :)
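
    Working the numbers from those runs as a rough per-MHz comparison (treating each score delta as if only the one clock change mattered, which is an approximation since the runs weren't perfectly controlled):

        # Rough per-MHz payoff from the three 3DMark06 runs above.
        baseline = {"core": 700, "mem": 861, "score": 8517}
        mem_run  = {"core": 700, "mem": 1100, "score": 9437}   # memory raised only
        core_run = {"core": 800, "mem": 861,  "score": 8837}   # core raised only

        mem_gain  = (mem_run["score"]  - baseline["score"]) / (mem_run["mem"]   - baseline["mem"])
        core_gain = (core_run["score"] - baseline["score"]) / (core_run["core"] - baseline["core"])

        print(f"memory: {mem_gain:.2f} points per MHz")   # ~3.85
        print(f"core:   {core_gain:.2f} points per MHz")  # ~3.20

    So on those numbers, each extra MHz of memory was worth at least as much as each extra MHz of core.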

    I know what you mean. I've been working in 25MHz blocks so far. I'm sure I could get a bit higher, but it really is a lot of work, lol.
     
  3. jvilla

    jvilla Notebook Enthusiast

    Reputations:
    10
    Messages:
    43
    Likes Received:
    0
    Trophy Points:
    15
    I run 780/910 on my 4670 using RivaTuner. Any higher on the RAM and it starts to stutter.
     
  4. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    Try running 3dmark as well. I was able to get speeds around 875 stable in furmark, but anything much over 825 crashes instantly in 3dmark. I'm just glad that vista/windows 7 can recover from driver crashes in a matter of seconds. If I had to bluescreen and reboot every time my OC failed, like when running XP, this would really lose a lot of its appeal, lol.
     
  5. daver160

    daver160 Notebook Deity

    Reputations:
    148
    Messages:
    766
    Likes Received:
    5
    Trophy Points:
    31
    Sorry for my ignorance in this kind of stuff (I've never overclocked a GPU, just my desktop's proc + mem), but is there any particular gain for video playback from OC-ing the GPU?

    For example, I've started watching BD-R movies, and when I jump to a particular position in the movie, I get a couple seconds of graphic/video artifacting before the video renders properly again (sound is fine and in sync). Would doing a small bit of overclocking speed up, or altogether eliminate, the graphical/video artifact recovery time?

    Just wondering. It's not a problem for me, but after all this discussion about overclocking, it's got me curious about this! Thanks!
     
  6. @Prime

    @Prime Notebook Enthusiast

    Reputations:
    3
    Messages:
    22
    Likes Received:
    0
    Trophy Points:
    5
    I see a lot of talk here about speed, but what about temps? I don't know a whole lot about overclocking on GPUs, but on CPUs, if you keep your temps in the high 80s all the time under load, you are shaving years off the life of your processor. I would assume the same holds true for GPUs. From what I have read, if you are keeping your GPU above 85 C all the time under load, you are asking for trouble.
     
  7. Gloomy

    Gloomy Notebook Evangelist

    Reputations:
    33
    Messages:
    422
    Likes Received:
    0
    Trophy Points:
    30
    Temperature is not a problem... I have a tower fan I sidle up to my desk that blasts air directly into the fan. I've spent about 3 or 4 hours gaming today and the laptop's cooling solution did not switch on.

    And the CPU temp capped at 70C... GPU at 60.

    I think I'm going to catch pneumonia though. It's damn cold in here ;_; (I need a proper laptop cooler)

    Anyway, seeker, you should use 1280x1024 (somehow) to benchmark. I can't change it on mine since I ain't payin for it.

    [3DMark screenshot]

    My CPU is better than yours by 100 points bro

    [CPU benchmark screenshot]
     
  8. lee_what2004

    lee_what2004 Wee...

    Reputations:
    822
    Messages:
    1,137
    Likes Received:
    14
    Trophy Points:
    56
    SXPS 1640 with P8700 and HD4670 (835/920) @1280x768
    [3DMark06 screenshot]
     
  9. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    Not really. Your artifact problems are much more likely to be caused by your player software. A hardware BD player doesn't have nearly the same amount of power as a stock clocked XPS16 does, so overclocking probably won't make a difference.
     
  10. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    Even at my higher overclocks and running Furmark + Prime95, my GPU temp never exceeds 79C, even without external cooling.
     
  11. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    It won't let me select 1280x1024. I can run at native 1600x900 though.

    I think you're thinking of 3DMark Vantage. It gives me an error and then forces me to run at a scaled 1280x1024; maybe that's the problem you're thinking of. Because of it, it doesn't give me an actual Vantage score, just the GPU score. I'm not going to actually pay for Vantage (I wouldn't pay for 06 either, but a copy came free with my desktop motherboard), so I can't really play around with it.

    In other news, I've been able to upgrade my max OC to 830/1115 (before, I was using 25MHz steps and found 825/1100 in only an hour; this time I used 5MHz and then 1MHz steps and tested for several hours) and it's 99% stable. I'll give more details tomorrow, it's time for me to call it quits today.
     
  12. Gloomy

    Gloomy Notebook Evangelist

    Reputations:
    33
    Messages:
    422
    Likes Received:
    0
    Trophy Points:
    30
    Just out of curiosity... the 5730 in this laptop uses GDDR3 right? And the 4670 uses DDR3.
     
  13. chrusti

    chrusti Notebook Evangelist

    Reputations:
    36
    Messages:
    607
    Likes Received:
    44
    Trophy Points:
    41
    That's not true, both use GDDR3.
     
  14. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    Yeah, they're both spec'd for either DDR3 or GDDR3, but I'm pretty sure that in the Dells they only used GDDR3.
     
  15. gpig

    gpig Notebook Deity

    Reputations:
    82
    Messages:
    885
    Likes Received:
    0
    Trophy Points:
    30
    What temperature differences are you guys seeing between the normal clocks and the OCs when running your benchmarks?
     
  16. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    Not much, maybe a 3-4C difference.
     
  17. LegendOfDellda

    LegendOfDellda Notebook Enthusiast

    Reputations:
    0
    Messages:
    12
    Likes Received:
    0
    Trophy Points:
    5
    As an experiment, I decided to turn off ATI PowerPlay and set the clocks with RivaTuner to the normal ATI PowerPlay values (675/800 for the 4670). Performance (as well as temperatures) was way down (roughly a 30% performance reduction and an 18C temperature reduction). Anyone know what else PowerPlay does besides changing the clocks? Does it literally use more power, and what else does it do that changes performance?
     
  18. chrusti

    chrusti Notebook Evangelist

    Reputations:
    36
    Messages:
    607
    Likes Received:
    44
    Trophy Points:
    41
    +1



    10char
     
    Last edited by a moderator: May 8, 2015
  19. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    Well, after some work I was able to test an 830/1105 overclock (down a bit from the 1115 I claimed last night...) as 100% stable. To determine max core speed, I started with my previous max of 825/1100 and ticked the 825 up by 1MHz at a time, running 3dmark06 each step. I got up to 835, where it was consistently stable, and 836, where it would crash every time. I then took 5MHz off for a safety margin, and ended up with 830.

    Then for max mem speed, I ran Furmark at 1050x850 res on extreme burning mode, with the ATITool artifact scanner running at the same time. I know I mentioned earlier that ATITool made no difference, but I must have been doing it wrong. Apologies to DanyR, ATITool will show artifacts long before FurMark will.

    I started at 1135 and slowly worked my way down in 1MHz increments. If you're more than 10MHz over your stable max, the artifacts will appear within 30sec. As you get closer, it takes longer to see them. You're within 2-3MHz by the time it takes over 2 min for them to show. I ended up at 1120 with 15 minutes of no artifacts, and once again dropped 5MHz for a safety margin (which is where I came up with the 1115 of yesterday).

    I then ran 3dmark06 looped 3 times, including 'force full precision', 1600x900 res, 2xAA, bilinear filtering, and all the feature tests that aren't included in the free version. It ran without any problems.

    Today, to verify, I ran Furmark and ATITool as stated above, plus Prime95 on top of it. I ended up having to drop the mem clocks down to 1110 for max stable, then 1105 for wiggle room. Unlike the previous test, where I ran 3x Prime95 threads, this time I only ran 2x to avoid a CPU throttling bottleneck, eventually dropping to 1x.
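
    If anyone wants to script the same kind of sweep, here's a minimal sketch of the procedure described above (step the core up 1MHz at a time until a run crashes, step the memory down from a known-bad value until the artifact scan comes back clean, then knock 5MHz off each for margin). The helper functions are hypothetical stand-ins, not real tool APIs; the hardware limits are simulated so the logic can be run and inspected:

        # Sketch of the search above, with the hardware/benchmark calls simulated.
        # In real use, set_gpu_clocks() would drive your overclocking tool and the
        # two test functions would run 3dmark06 / the artifact scanner; here they
        # just compare against pretend limits.
        SAFETY_MARGIN_MHZ = 5
        PRETEND_MAX_CORE = 835      # simulation only, not a real measurement
        PRETEND_MAX_MEM = 1120

        def set_gpu_clocks(core_mhz, mem_mhz):
            pass                     # placeholder for RivaTuner / AMD GPU Clock Tool

        def benchmark_passes(core_mhz, mem_mhz):
            set_gpu_clocks(core_mhz, mem_mhz)
            return core_mhz <= PRETEND_MAX_CORE     # "crashes" above the limit

        def artifact_free(core_mhz, mem_mhz, seconds):
            set_gpu_clocks(core_mhz, mem_mhz)
            return mem_mhz <= PRETEND_MAX_MEM       # artifacts above the limit

        def find_max_core(start, mem, step=1):
            core = start
            while benchmark_passes(core + step, mem):   # step up until it crashes
                core += step
            return core - SAFETY_MARGIN_MHZ             # back off for safety

        def find_max_mem(core, start, step=1):
            mem = start                                  # start above the stable point
            while not artifact_free(core, mem, 900):     # step down until clean
                mem -= step
            return mem - SAFETY_MARGIN_MHZ

        core = find_max_core(825, 1100)   # -> 830 with the pretend limits
        mem = find_max_mem(core, 1135)    # -> 1115
        print(core, mem)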

    I verified 830/1105 to be 100% stable and artifact free. I'll probably stick to 750/1000 (a 10% drop from max) for everyday non-benchmarking purposes though.

    Throughout the tests, my GPU never topped 80C. When running Furmark without the Prime95, it never topped 76C, and it never topped 67C on just 3DMark06. I think the overheating CPU overwhelmed the cooling solution and heated up the GPU higher than it would normally go by itself.

    On a side note, I'm now convinced that the CPU throttling on my 1645 is power related, not heat related, even with the 130W adapter. When running the benchmarks with 3x 100% threads (1x Furmark, 2x Prime95), it would throttle after about 30min, hitting CPU temps of 85C with a 13x multiplier. When dropping down to 2x threads (1x Furmark, 1x Prime95), my multiplier jumped up to 16x and my temps shot up immediately. For the cores running active 100% threads, the temps rose almost instantly to 86C (beyond what I thought was the throttling threshold) and eventually climbed to 90C; however, there was no throttling, and the multiplier never dropped from 16x on the active threads. This leads me to believe that throttling on the 1645 is caused by an inadequate power supply when running 3 or more of the quad cores plus the GPU at 100%, and not by temps. However, it could also be based on average temps of all the cores together, and not necessarily max temp on only 2x of the 4x cores, but that will require more testing.
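
    If anyone wants to collect the same kind of evidence (clock speed vs. temperature at different thread counts), the simplest thing is to log both at an interval while the stress test runs and compare afterwards. Back then you'd typically do this with HWMonitor or ThrottleStop logs; below is a rough sketch assuming Python with the third-party psutil library (its temperature readout is only available on some platforms, notably not on Windows):

        # Log CPU frequency, load and (where supported) temperatures every few
        # seconds while Prime95/Furmark run, to see whether clocks drop as temps
        # climb or only when the total load/power goes up.
        import csv, time
        import psutil   # third-party package, assumed installed

        with open("throttle_log.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["time_s", "cpu_freq_mhz", "cpu_util_pct", "temps_c"])
            start = time.time()
            for _ in range(600):                          # ~30 min at 3 s intervals
                freq = psutil.cpu_freq()                  # may be None on some systems
                temps = getattr(psutil, "sensors_temperatures", lambda: {})()
                writer.writerow([
                    round(time.time() - start, 1),
                    freq.current if freq else "",
                    psutil.cpu_percent(interval=None),
                    {name: [t.current for t in entries] for name, entries in temps.items()},
                ])
                time.sleep(3)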
     
  20. LegendOfDellda

    LegendOfDellda Notebook Enthusiast

    Reputations:
    0
    Messages:
    12
    Likes Received:
    0
    Trophy Points:
    5
    No offense, but it seems odd that you went through calculating all that to the nearest MHz and then randomly decided to drop down by 80/105.
     
  21. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    I wanted to know what my max was, but I don't necessarily want to run at max all the time. I figured I have 3 steps now, stock for 80% of the time, 750/1000 for 15% of the time for games that need a little boost, and 830/1105 for the 5% of the time that I'm either trying a new benchmark, or playing a game that just won't run smoothly otherwise. Call me conservative :p

    My new 3dmark score is 10088, which really isn't that much of a boost from my previous score considering all the work I put into it, but it's something :). Plus, I killed a few hours having what a geek (or laptop enthusiast...) can call fun.
     
  22. gpig

    gpig Notebook Deity

    Reputations:
    82
    Messages:
    885
    Likes Received:
    0
    Trophy Points:
    30
    I decided to mess around on "my friend's" XPS 1645 since I don't want to void "my" warranty. "His" XPS 1645 happens to have the same specs as mine (4670). Here are my observations.

    PERFORMANCE
    There is a direct relationship between fps and clock (core and memory) speeds in everything I tested. Raise both speeds 5% and there will be a 5% increase in fps; raise them 10% and you will get 10% more fps. This is probably because the GPU is the bottleneck in everything I tested. Just raising one value seems to still help out most of the time. The maximum values I could hit were around 850/960. I could probably go more on the core clock but Rivatuner had a (!) symbol right at 850 so I decided not to.
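
    Put another way, when the GPU is the bottleneck you can estimate the payoff of an overclock straight from the clock ratio. A tiny illustration of that observation (the fps figures are made up for the example, not measurements):

        # If the GPU is the bottleneck, fps scales roughly with the clock ratio.
        def estimated_fps(base_fps, base_clock_mhz, new_clock_mhz):
            return base_fps * (new_clock_mhz / base_clock_mhz)

        print(estimated_fps(45, 675, 709))   # ~5% core bump  -> ~47 fps
        print(estimated_fps(45, 675, 743))   # ~10% core bump -> ~49.5 fps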

    POWER CONSUMPTION
    I tried measuring the power consumption of the overclock while running Furmark. I need to re-do this sometime; one attempt showed a 1 watt difference, and another attempt showed a 7 watt difference. I'm not sure which is right, if either.

    TEMPERATURES
    Starcraft II, 675/800, max GPU 89C
    Starcraft II, 800/950, max GPU 89C
    I'll test more some other time.
     
  23. I Never Relax

    I Never Relax Notebook Consultant

    Reputations:
    21
    Messages:
    201
    Likes Received:
    0
    Trophy Points:
    30
    Can anyone help me? I have a Mobility Radeon 4670. Latest drivers and ATI CCC installed. Intel P8700 C2D.

    I tried using the AMD GPU Clock Tool and messed around with a couple settings, and when they didn't work, I hit "restore default clocks" (300/795) and saved it. But when I try to play games, the clock speed wouldn't go higher than 300 MHz.

    I'm pretty sure the default settings were 675/795, so I set it to that. Still the clock won't go higher than 675 while playing games (I think I've hit around 1150-1350 under full load). Is there any way to return the clock speeds back to normal?
     
  24. MadAlienFreak

    MadAlienFreak Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    5
    Will this totally brick the system if the overclocking fails...?
     
  25. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    Open up the tool, click 'restore default clocks', click OK. It should go back to normal. 300/795 is the 'powerplay' clock set, what the GPU downclocks itself to when idle to save power. It will automatically go back to the default clocks (675/whatever) when the GPU is under load, like playing a game. As soon as the game is over, it will clock back down to 300.

    To make sure that it's working like it should, run GPU-Z. Go to the sensor tab and look at what freq it tells you. Then start a game (leaving GPU-Z open), play a few min, and exit. Go back to GPU-Z. You should see that while the game was running, the GPU clocks jumped up to normal.
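
    If you'd rather watch it from a script than eyeball GPU-Z, the check is just "poll the current core clock, note the idle value, start a game, poll again". A minimal sketch follows; read_core_clock_mhz() is a hypothetical hook, since how you actually read the value depends on your monitoring tool (e.g. parsing a GPU-Z sensor log):

        # Poll the GPU core clock to confirm PowerPlay steps between the ~300 MHz
        # idle state and the full 675 MHz 3D state under load.
        import time

        def read_core_clock_mhz():
            # HYPOTHETICAL helper: wire this up to whatever exposes the clock
            # (a GPU-Z log file, a vendor tool, etc.).
            raise NotImplementedError

        def watch(seconds=120, interval=2):
            seen = []
            for _ in range(seconds // interval):
                mhz = read_core_clock_mhz()
                seen.append(mhz)
                print(f"core clock: {mhz} MHz")
                time.sleep(interval)
            # Expect ~300 at idle and ~675 while the game is running.
            print("min/max over the window:", min(seen), max(seen))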
     
  26. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    No, as long as you don't start messing with the voltages (which you can't do on a laptop GPU anyway), you'll be fine. When the overclock fails, it resets itself, which may crash your game. The screen will flash a few times, the overclock will be removed, and Windows will tell you it recovered from a driver crash. No permanent damage, just turn down the clocks and try again.

    If you start seeing colored spots appearing randomly across the screen, your mem clocks are too high. Go back and turn them down until the spots disappear. Once again no permanent damage.

    Monitor your temps, make sure they don't go over 90, and you'll have no problems.
     
  27. jvilla

    jvilla Notebook Enthusiast

    Reputations:
    10
    Messages:
    43
    Likes Received:
    0
    Trophy Points:
    15
    What BIOS is the best for overclocking the GPU so it doesn't throttle? A09?
     
  28. jvilla

    jvilla Notebook Enthusiast

    Reputations:
    10
    Messages:
    43
    Likes Received:
    0
    Trophy Points:
    15
    I Never Relax, I use RivaTuner, but you have to mod the config file so it recognizes the 4670 GPU.
     
  29. I Never Relax

    I Never Relax Notebook Consultant

    Reputations:
    21
    Messages:
    201
    Likes Received:
    0
    Trophy Points:
    30
    I set the clocks back to default PP settings, however when I play games (I always play with the 90W adapter in, plus I have all the performance settings turned on when on the adapter), the GPU clock will jump from 675 MHz to 300 and back, even at low temps (50-60 C). I'm pretty sure this is "throttling" of sorts and I've never had this happen before. I could normally play ME 2 max settings at a stable frame rate for hours, but now I can barely play for a few minutes.

    Another thing I noticed is in GPU-Z, the default clock is set to 400 MHz. I'm not sure if this is normal or not.

    I've also tried OC'ing with RivaTuner, but it never gave me any problems.

    I monitor the clock speeds with CPU-Z.

    Is there any fix for this?
     
  30. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    That's strange. The clock speed shouldn't be dropping while the game is playing. It may actually be throttling... I don't know why it didn't start until after you used the clock tool though.

    You could try uninstalling the clock tool and the drivers, then re-installing the drivers. I'm pretty sure that the clock tool / RivaTuner just changes the driver parameters and doesn't do anything to the hardware. Resetting the drivers should fix this if it is indeed the problem. If not, I'd have to suggest getting Dell to upgrade you to a 130W adapter (which you really should have if you're trying to overclock anyway).
     
  31. I Never Relax

    I Never Relax Notebook Consultant

    Reputations:
    21
    Messages:
    201
    Likes Received:
    0
    Trophy Points:
    30
    I uninstalled the AMD Clock Tool. Don't think that helped, but am I supposed to uninstall the display driver via Device Manager? And if I do, can I just re-install right after that?
     
  32. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    Go to Programs and Features in the control panel and remove it there, then remove it in the device manager. Restart your computer. Re-install the drivers (probably a good idea to download them beforehand). Restart again. Test to see if the problem's fixed. I think there's also a tool to ensure that they're completely removed, but you shouldn't need it. If you want to use it though, just google something like 'ati driver remover'.
     
  33. I Never Relax

    I Never Relax Notebook Consultant

    Reputations:
    21
    Messages:
    201
    Likes Received:
    0
    Trophy Points:
    30
    * Update *

    This is going to sound weird as hell, but....

    I rolled back from the latest driver (8.762) to the previous one (8.753). This didn't help, still had the same problems as before.

    I then had Device Manager "search automatically for updated driver software" (it always re-installs the oldest version, 8.631).

    I go to CPU-Z, and now the GPU clock sits at 879 MHz. Max is 1350 when playing games. Idling temps are around 50 C. Max I've seen is around 72 C (I have a Zalman NC2K cooler + 9 cell battery). Also, for some reason, ATI CCC got downgraded to 10.7.

    Funny thing is, the GPU-Z readings are totally different. Default clock is still 400, GPU clock is 675, and Mem clock is 800 MHz. Sensors list Core clock as 220 MHz and Mem clock at 300 MHz.

    In short, ? Everything works now, but should I take a chance and update back to the latest driver software (the one where I was trying to OC and stuff)?
     
  34. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    You can if you want, and if it screws it up just roll back again :). If you're happy though there's no reason you can't stay where you are. There wasn't much difference between 10.7 and 10.8, plus 10.10 is going to be out in about a month, you could just wait for that.
     
  35. MadAlienFreak

    MadAlienFreak Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    5
    Wow!! It worked!!!!


     
  36. jvilla

    jvilla Notebook Enthusiast

    Reputations:
    10
    Messages:
    43
    Likes Received:
    0
    Trophy Points:
    15
    What program are you guys using to run the GPU at max? Because in StarCraft 2 I am only getting 67-68C max with a 4670 at 800/950.
     
  37. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
  38. gpig

    gpig Notebook Deity

    Reputations:
    82
    Messages:
    885
    Likes Received:
    0
    Trophy Points:
    30
    Most of the time, for me, running a game will cause a slightly higher GPU temp than Furmark alone, since a game stresses the CPU and GPU, and they share a cooling system.
     
  39. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    Or if you really want to have fun, run Furmark and Prime95 at the same time; if that doesn't stress your system, nothing will!
     
  40. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
  41. rob7779

    rob7779 Notebook Enthusiast

    Reputations:
    0
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    OK, my 2 cents. The Studio XPS 1640, or any XPS 16, is not designed for gaming, and the ATI graphics cards have also been admitted to not be for gaming, as the fps is all over the place. The cooling vents are also totally insufficient for the job. I am just not impressed with it. The palmrest also gets so hot from the HDD. If you're looking at getting a new laptop and want good gaming performance, go for Alienware. It's designed specifically for gaming. Get a refurbished one if you can't afford them new; they are just as good as the new ones at half the price. With the ATI 4670 I can play on high graphics, but if you reduce the graphics you get no increase in FPS, so weird. Gaming performance is brilliant, but not for hours of gaming.

    Another mod for cooling is to have a notebook cooler like a Coolermaster or Zalman NC2000, and then take the bottom cover off the XPS 1640, which will significantly increase cooling. Do this at your own risk. I will test this when my mom has given her laptop back so I get my XPS back.

    With the Zalman NC2000 at idle I have a temp decrease of 7C across the board, hard drive 2C. People think this is not a lot, but when you're running at 85C, dropping to 78 is a huge difference. Also, your laptop will last longer at the cooler temps.
     
  42. gpig

    gpig Notebook Deity

    Reputations:
    82
    Messages:
    885
    Likes Received:
    0
    Trophy Points:
    30
    We all know the SXPS 16 was never a 'gaming laptop.' But it is quite capable- I can run Left 4 Dead 2 at 1920*1080 on maximum settings across the board (besides AA) with an average of 45 fps without OC, so who cares if it's a 'gaming laptop'? And I have the older 4670 card.

    I recently re-pasted with AS5, and now I'm able to run Prime95+Furmark without overheating, so no game can make it overheat either.
     
  43. rob7779

    rob7779 Notebook Enthusiast

    Reputations:
    0
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    Yeah, I am just stating that for those who want one for gaming. I would highly recommend against it and rather get an Alienware. Look, it is seriously impressive: I ran Call of Duty 6 at Full HD with high settings getting 40-60fps, which is amazing, but the fps spikes are so irritating, like a yo-yo. I have the results from some cooling experiments with the cover and thermal paste. This was all with the Zalman NC2000 laptop cooler. Without the cooler, the notebook fan would come on (with the laptop on a table) when it reached 56C, then it dropped to 50C; with the cooler, the fan does not come on.
    All temps in C:

             1    2    3    4    5
    TZ00    75   77   74   51   49
    TZ01    50   55   52   42   41
    TZ02    69   75   73   46   49
    CPU     70   72   69   40   43
    GPU     78   83   81   46   50
    HDD     37   41   37   36   36

    1: gaming with back cover on
    2: gaming with back cover & gold thermal paste
    3: gaming without back cover & gold thermal paste
    4: idle without cover
    5: idle with cover
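
    For what it's worth, here are the cover-on vs. cover-off deltas from that table worked out (columns 2 and 3, both gaming with the gold paste):

        # Deltas from the table above: gaming w/ gold paste, cover on (col 2)
        # vs cover off (col 3). All values in C.
        cover_on  = {"TZ00": 77, "TZ01": 55, "TZ02": 75, "CPU": 72, "GPU": 83, "HDD": 41}
        cover_off = {"TZ00": 74, "TZ01": 52, "TZ02": 73, "CPU": 69, "GPU": 81, "HDD": 37}

        for sensor in cover_on:
            print(f"{sensor}: {cover_on[sensor] - cover_off[sensor]}C cooler with the cover off")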

    Used League of Legends, which is pretty GPU and CPU intensive, at 1920x1080 with all graphics on high.
    Interesting to note that the HDD, which sits in the open, obviously runs cooler with the cover off, as there is no ventilation for it with the back cover on. My thermal paste didn't make too much of a difference, but it must be noted that I have read it takes a few days to reach its peak. The Dell thermal paste was a gray, hard compound which must have been terrible. I noticed the northbridge had a blue pad on it connecting it to the copper heat pipe. Can one use a copper kit to improve cooling on it, and what would the performance difference be?
     
  44. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    I don't get it. Why revive a thread that's been dead for over a month just to say that the XPS16 isn't a gaming laptop, and that an Alienware is? We all already know that. Besides, you can't even buy the XPS16 new anymore. Not to mention that the 5730 they replaced the 4670 with doesn't have the same heat problems.
     
  45. daver160

    daver160 Notebook Deity

    Reputations:
    148
    Messages:
    766
    Likes Received:
    5
    Trophy Points:
    31
    and from his signature, it's a dual core SXPS with RGB too.

    maybe this thread needs to be closed.
     
  46. seeker_moc

    seeker_moc Notebook Virtuoso

    Reputations:
    354
    Messages:
    2,141
    Likes Received:
    21
    Trophy Points:
    56
    Closing the thread completely might be a bit harsh. There's nothing wrong with bumping an old thread; there's lots of good info in here. Also, there's still potential for discussion in case anybody has something new. It's just that this dude makes his first post on the entire website in this thread and says nothing new that hadn't already been discussed throughout the thread, which he probably didn't even bother to read.
     
  47. GogolMogol

    GogolMogol Notebook Enthusiast

    Reputations:
    0
    Messages:
    39
    Likes Received:
    0
    Trophy Points:
    15
    I was seriously considering overclocking my 1645 for some time; you guys talked me into it :)
     
  48. reas_seammes

    reas_seammes Notebook Geek

    Reputations:
    5
    Messages:
    86
    Likes Received:
    2
    Trophy Points:
    16
    Hey, I overclocked mine to 800/1000! Will keep it at that, no further freq increases. Will do benchmarking and post results soon!
    Thanks guys!
     
  49. jdmik

    jdmik Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    5
    I tried to OC my GPU (4670), but the screen went black already at 725/925, possibly lower too. Is it the ATI driver overriding the clock or something?

    Info: 1640 (as I've understood it, I can't run a custom BIOS with the 1640?), P8600 CPU, 4GB RAM, A14 BIOS, 90W AC, and I also undervolt the CPU to get a cooler laptop (performance on demand).
     
  50. gpig

    gpig Notebook Deity

    Reputations:
    82
    Messages:
    885
    Likes Received:
    0
    Trophy Points:
    30
    Try bumping the Core clock and Memory clock up by 15 or so at a time from the default, separately. The 925 in your 725/925 is actually quite high- I was only getting around there (945ish) before I re-pasted. The people getting up to 1000 for the memory clock have 5730 cards which seem to overclock a bit better.
     