The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    m1730 Processor Upgrades

    Discussion in 'Dell XPS and Studio XPS' started by sunnah, Nov 2, 2009.

  1. sunnah

    sunnah Notebook Geek

    Reputations:
    6
    Messages:
    75
    Likes Received:
    0
    Trophy Points:
    15
    I just bought the Intel X9000 processor and was wondering if the processor alone will improve my graphics performance a little without getting the 8800m GTX card. I really do not want to pay for a card that can cost as much as a new laptop; I got the processor for under 500 dollars. Right now I have Batman: Arkham Asylum installed and it plays well, but there is room for improvement in response time. With the existing graphics card the game runs with only a slight lag, but nothing that takes away from it. I am sure any difference would be even more noticeable in online play.

    BTW: does anyone have video or photo tutorials on the X9000 processor replacement?
     
  2. Photolysis

    Photolysis Notebook Consultant

    Reputations:
    56
    Messages:
    206
    Likes Received:
    1
    Trophy Points:
    31
    Unless for some reason your processor is slowing the game down (for example, performing physics calculations), upgrading it will have essentially no impact.

    Even if the program did use CPU power for rendering, it would have almost no impact. With my graphics cards completely fried, my X9000 under WinXP gets about 5 frames per second at most on the desktop, i.e. in 2D.

    If you want better graphics, upgrading the GPU is almost always the first thing you should do. It's true that there can be other bottlenecks, but these should be apparent anyway.
     
  3. sunnah

    sunnah Notebook Geek

    Reputations:
    6
    Messages:
    75
    Likes Received:
    0
    Trophy Points:
    15
    Thanks. Before I open it up I wanted to know.
     
  4. Commander Wolf

    Commander Wolf can i haz broadwell?

    Reputations:
    2,962
    Messages:
    8,231
    Likes Received:
    63
    Trophy Points:
    216
  5. spradhan01

    spradhan01 Notebook Virtuoso

    Reputations:
    1,392
    Messages:
    3,599
    Likes Received:
    5
    Trophy Points:
    106
    Any video link about that?
     
  6. Smooth_J

    Smooth_J Notebook Deity

    Reputations:
    500
    Messages:
    806
    Likes Received:
    3
    Trophy Points:
    31
    I did this myself a while back, and I took pictures at some point. Be prepared to have a big table handy, because you will surely need it.

    [image]
     
  7. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    I have to step in and say that the above comments regarding 'no net improvement' and so on are correct, but only in a very general sense. When it comes to SLi/Xfire laptops, such opinions become inaccurate in my book.

    Look up the workings of SLi, and the first thing you will learn is that SLi scalability is highly contingent on two things: CPU power and resolution.

    Just put an SLi 9800m GT system with a T8300 against a similar system with a 3.4 GHz X9000 and test the following games (watch how the 3.4 GHz machine provides a much more fluid, stable experience with SLi):
    - Crysis
    - GTA IV
    - Dragon Age: Origins
    - Lost Planet DX10
    - S.T.A.L.K.E.R. Clear Sky
    - Pretty much any other game where you see the SLi bar 'fluctuate'.



    I am serious about this subject. Too many people condemn CPUs when it comes to the graphics debate, and this becomes even more misleading when you are talking about 'dual video card solutions', which rely heavily on CPU bandwidth to scale properly. A majority of SLi laptops with 256-bit GPUs aren't even letting the GPUs run at their proper performance levels, because the sheer power of those cards is bottlenecked by latency and sludge from the slower CPU. A machine with a fully overclocked X9100 and two 9800m GTS cards could compete with, and possibly beat, the best gaming laptop at the moment.

    I have upgraded from a T9500 (the next CPU below the X9000 on the older platforms) to an X9000, which I now run overclocked at 3.2-3.4 GHz. I can say that all the games that once used to stutter don't stutter nearly as much as they used to, and most of the DX10-intensive games now hold a stable frame rate, where my older CPU caused the SLi performance to collapse many times.

    Batman: Arkham Asylum, eh? The bit where you glide down on the baddie and smash his face into the floor tiles: my old T9500 + 8800m GTX SLi configuration used to drop frames all the way into the 15-and-below range during that scene. With my current CPU, it sustains itself around 28-30 even at the most intense physics spots, and in that same scene it only dropped to 25. Why? Because 'PhysX' isn't only GPU-driven; it also relies on CPU performance. Nvidia's whole concept of 'PhysX' is tailored around delegating calculations and suitable tasks to both the GPU and the CPU; it is -not- a GPU-only concept.
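    The CPU-bottleneck argument above can be sketched as a toy frame-time model: each frame, the CPU prepares work and the GPU(s) render it, and whichever stage is slower sets the frame rate. All numbers below (cpu_work, gpu_ms, the 0.9 SLi efficiency) are made-up illustrative values, not measurements from any real M1730.

```python
# Toy frame-time model (an illustration, not a benchmark). With SLi
# (alternate-frame rendering), effective GPU time per frame shrinks,
# but the frame rate is capped by the slower of the CPU and GPU stages.

def fps(cpu_ghz, gpu_count, cpu_work=30.0, gpu_ms=20.0, sli_efficiency=0.9):
    """cpu_work: CPU work per frame, in ms at 1 GHz (assumed value).
    gpu_ms: single-GPU render time per frame in ms (assumed value).
    Returns modeled frames per second."""
    cpu_ms = cpu_work / cpu_ghz                        # faster clock -> less CPU time per frame
    gpu_eff_ms = gpu_ms / (1 + (gpu_count - 1) * sli_efficiency)
    return 1000.0 / max(cpu_ms, gpu_eff_ms)            # slower stage dominates

# A 2.4 GHz CPU leaves two GPUs waiting (CPU-bound), while 3.4 GHz
# lets the SLi pair scale to its GPU-limited ceiling.
print(fps(2.4, 2))   # 80.0  (CPU-bound: 12.5 ms CPU vs ~10.5 ms GPU)
print(fps(3.4, 2))   # 95.0  (GPU-bound: ~8.8 ms CPU vs ~10.5 ms GPU)
```

    In this model, adding a second GPU to the slow-CPU machine barely helps, which matches the 'SLi needs CPU power to scale' point.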
     
  8. Slammin

    Slammin Notebook Consultant

    Reputations:
    125
    Messages:
    281
    Likes Received:
    0
    Trophy Points:
    30
    Ditto Kade +rep if it would let me.

    This is coming up in so many threads and forums lately, and the whole 'the GPU is the only culprit because it's a game and therefore graphics-intensive' argument is rubbish. I have a similarly specced M1730 (see my sig). During intense scenes in Fallout 3 I was getting stuttering, and when I had a quick squiz at my Logitech screen, which component was going through the roof? The CPU! It has always been the bottleneck on my system, and if it were still alive it would continue to be the bottleneck.

    Games these days rely so much more on all the components, not just the GPU. Yes, the GPU is important, but if your CPU isn't up to scratch then a high-powered GPU is worthless.
     
  9. Kade Storm

    Kade Storm The Devil's Advocate

    Reputations:
    1,596
    Messages:
    1,860
    Likes Received:
    202
    Trophy Points:
    81
    I repped you for reinforcing a good point. Thanks. I almost forgot about Fallout 3, which is one intense monster if run at maximum settings.

    It's a poor understanding of game architecture that is to blame when people downplay CPUs.

    You can't take the world's greatest GPU, pair it with a crap processor, and expect perfect results. With SLi, you actually rely on CPU bandwidth and frequency to help the dual solution work most efficiently.

    This is why SLi/Xfire earned a bad reputation, one that still lingers all over the internet. The solution came too early and, as a result, was badly supported by drivers and held back by weak CPU hardware. What's the next logical conclusion people would draw? That SLi is crud, and even at its best yields maybe a 50% increase in performance. All this because their graphics were not scaling properly.

    Then came the point where people started using stronger CPUs, and other enthusiasts began to overclock their configurations. What was the result? Something much better, along with a newfound appreciation of the true strengths and weaknesses of dual-VGA solutions.

    With regards to this thread:
    SLi, by its very design, requires CPU power. The faster your CPU, the 'better' your SLi configuration will perform.

    Many newer titles, especially ones built on sheer scale and populated A.I., eat up CPU resources. It is also becoming pretty evident that more and more new games are starting to employ CPU-hungry engines. So that potential extra 500+ MHz of CPU power can really come in handy.