The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    CPU and GPU relationship

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by KidProdigy, May 19, 2008.

  1. KidProdigy

    KidProdigy Notebook Consultant

    Reputations:
    2
    Messages:
    280
    Likes Received:
    0
    Trophy Points:
    30
    So I know that many people have asked this question, including myself, and I'm hoping to clear this up a little. The question is, "Will such-and-such processor give better performance in gaming than this other processor? And if so, how much?" or something of that sort. Well anyway, for those of you who want to know the answer to this question, read this: http://www.tomshardware.com/reviews/cpu-gpu-upgrade,1928.html


    As for those of you who are too lazy, I'll give you a general idea. Basically, it all depends on how good your GPU is. I know that people have said before that a 2.0 GHz dual core is fine for even an 8800M GTX. Well, apparently those people were only partially right, as it really depends on the game. For example, CoD4 played at 1920x1200 with 4xAA showed an insignificant difference between 1.8 GHz and 2.8 GHz. However, Crysis showed a difference at 1920x1200 between every processor speed, so I assume Crysis is just more processor-demanding or something. Of course, the difference isn't as large as the one between two GPUs, but it's still significant enough to show that a CPU upgrade might just increase FPS in a game if your GPU is already fast.

    Well anyway, there you have it. Read the article for yourself to learn more =]
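
    A rough way to picture the relationship, as a sketch rather than anything from the article: the CPU and GPU work on a frame largely in parallel, so the frame rate is capped by whichever side takes longer per frame. The numbers below are made up purely for illustration, and the fps() helper is hypothetical, not from any benchmark tool.

    def fps(cpu_ms_per_frame, gpu_ms_per_frame):
        # Frame rate is limited by the slower of the two per-frame costs.
        return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

    # GPU-bound case (think CoD4 at 1920x1200 with 4xAA): the GPU is the cap,
    # so a faster CPU barely moves the needle.
    print(fps(cpu_ms_per_frame=12, gpu_ms_per_frame=25))  # ~40 FPS
    print(fps(cpu_ms_per_frame=8,  gpu_ms_per_frame=25))  # still ~40 FPS after a CPU upgrade

    # CPU-bound case (think a physics-heavy game like Crysis): now the CPU is
    # the cap, and the same CPU upgrade does raise the frame rate.
    print(fps(cpu_ms_per_frame=22, gpu_ms_per_frame=18))  # ~45 FPS
    print(fps(cpu_ms_per_frame=14, gpu_ms_per_frame=18))  # ~55 FPS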
     
  2. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466
    Not bad, but it doesn't really explain the actual relationship (the bottleneck) that well, IMO. A lot of information there, mostly benchmarks, so I will definitely read it all later.

    I have something I can link on the actual relationship tho if I can manage to find it....

    Edit: here it is
    http://www.techwrighter.com/index.p...ask=view&id=69&Itemid=27&limit=1&limitstart=0

    I do not like this version of the article too much. I wrote a guide on this on the widescreengamingforum.com site, and it had such good feedback that I was asked for permission to publish it on a website. I gave permission, but when he edited it to his taste, some of what I said was given a new meaning and he kind of took what I said too literally. However, I can't find my original post for the life of me.
     
  3. Nirvana

    Nirvana Notebook Prophet

    Reputations:
    2,200
    Messages:
    5,426
    Likes Received:
    0
    Trophy Points:
    0
    I think the reason why Crysis demands more from the processor is that the game involves a lot of physics calculation; a PhysX card might help.
     
  4. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466
    Found it. Here is my original post:

     
  5. powerpack

    powerpack Notebook Prophet

    Reputations:
    7,101
    Messages:
    5,757
    Likes Received:
    0
    Trophy Points:
    0
    To the OP: and this is some kind of revelation? Because of what? All of this is already known, so what is the point? PhysX is crap. With that said, the rest of us already know the issue/situation. Thanks, glad you expanded on it.
     
  6. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466
    I heard that the whole PhysX thing was just a "scam" of sorts. They purposely (or accidentally) programmed the game in a way that, without the unit present, it ran absolutely horribly.

    I mean, at first there was that one game they said couldn't even run without it, but hackers found a way to make the game run without it and it did. Yes, it ran like crap, but it ran, proving the PhysX card was not needed.

    The actual power in the PhysX card is so small that any current dual-core CPU and modern GPU should be able to pull that extra weight with no issues at all.

    I dunno, it's just not my cup o' tea. There is enough stuff to spend money on in a computer already, so I am not making a PhysX card one of them.
     
  7. sirmetman

    sirmetman Notebook Virtuoso

    Reputations:
    679
    Messages:
    3,291
    Likes Received:
    0
    Trophy Points:
    105
    Well, of course you can emulate the PhysX hardware in software, and of course it will run way slower than with the hardware. You could emulate an 8800M GTX in software too, and it too would run ridiculously poorly. It's not a "scam"; it's a hardware requirement to get a certain degree of performance. Keep in mind PhysX was announced before multi-core consumer CPUs were in wide use. The general consensus seems to be that having multiple cores mitigates the benefits of having PhysX. I kinda feel bad for PhysX; it was a good idea, they just happened to come out with it at just about the worst time they could.
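
    As a concrete sketch of the "multiple cores can absorb the physics work" point, and not anything taken from an actual engine or the PhysX SDK: a simple particle integration step is embarrassingly parallel, so it can be split across CPU cores. The particle count, time step, and two-worker split below are made-up illustrative values.

    from concurrent.futures import ProcessPoolExecutor

    DT = 1.0 / 60.0     # one 60 Hz physics step
    GRAVITY = -9.81

    def integrate_chunk(chunk):
        # Advance a chunk of (x, y, vx, vy) particles by one Euler step.
        out = []
        for x, y, vx, vy in chunk:
            vy += GRAVITY * DT
            out.append((x + vx * DT, y + vy * DT, vx, vy))
        return out

    def step(particles, workers=2):
        # Split the particle list across CPU cores and integrate the chunks in parallel.
        size = max(1, len(particles) // workers)
        chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
        with ProcessPoolExecutor(max_workers=workers) as pool:
            results = pool.map(integrate_chunk, chunks)
        return [p for chunk in results for p in chunk]

    if __name__ == "__main__":
        particles = [(0.0, 10.0, 1.0, 0.0)] * 10_000
        particles = step(particles, workers=2)  # a second core pulls the "extra weight"
        print(particles[0])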
     
  8. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466
    It was not a good idea, unless you mean a good idea for them to make a lot of money off of nothing. Given the specs of a PhysX card, it should have been selling for $30, not the ridiculous amount they were asking for.
     
  9. sirmetman

    sirmetman Notebook Virtuoso

    Reputations:
    679
    Messages:
    3,291
    Likes Received:
    0
    Trophy Points:
    105
    First gen hardware almost always underwhelms. It was a good idea though, overpriced or not. The idea of a dedicated device for managing physics had the potential to be as revolutionary as dedicated graphics. There's a difference between the potential of the idea and the implementation.

    That said, I bet most of the cost wasn't for just the hardware. They were probably covering the cost of R&D and API/software package development.
     
  10. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    Since Nvidia acquired Ageia, it has been rumored that the 8-series cards could do virtually what the PhysX card can with just a software update...
     
  11. sirmetman

    sirmetman Notebook Virtuoso

    Reputations:
    679
    Messages:
    3,291
    Likes Received:
    0
    Trophy Points:
    105
    That doesn't surprise me. The thing is, we are talking about Ageia and the idea of a physics card as one and the same. I think Ageia was a failed venture, but the idea of a dedicated physics card is not a bad one.