The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    quadcore vs dualcore BFBC2

    Discussion in 'Gaming (Software and Graphics Cards)' started by jacob808, Feb 16, 2011.

  1. jacob808

    jacob808 Notebook Deity

    Reputations:
    52
    Messages:
    1,002
    Likes Received:
    0
    Trophy Points:
    55
    I can't seem to find any benchmarks or performance comparisons showing how much difference the CPU makes in Battlefield: Bad Company 2, although I've read that DICE made some significant optimizations to enhance the game's performance on dual-core CPUs in the R3 patch.

    I own the i5 460M and was wondering how much of an advantage the i7 740QM has over mine. In fact, in general, how much of an advantage does a quad core have over a dual core in BFBC2 after the patch was released?
     
  2. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    Well, DICE basically made a console port, enabling only the DX10/11 render path for the PC version.
    The game mainly uses 3 cores: 2 for graphics/physics and one for the sound backend.
    The performance gain can be measured in average fps and minimum fps. Based on my tests (going from an E8500 to a QX9650 and then to an i5-2500) the gain is around 10-15 fps. In some instances I got a gain of around 40 fps, but that's due to the Sandy Bridge CPU.
    With the old QX9650 I got a better minimum fps, with a 10 fps gain and a general smoothness in the spots where the fps usually dropped.
    So I think it's well worth the upgrade.
     
  3. Mastershroom

    Mastershroom wat

    Reputations:
    3,833
    Messages:
    8,209
    Likes Received:
    16
    Trophy Points:
    206
    I don't have any solid research on hand to show you, but last month I upgraded from a Core 2 Duo 3.0GHz to an AMD Phenom II x4 965 3.4GHz, and the difference in BC2 is extremely obvious. With the dual-core, I would get between 30 and 40 fps on low settings at 1680x1050 on most maps. After upgrading to quad core (same memory, graphics card, hard drive, OS, etc.), my frames are between 50 and 60 at the same resolution and medium settings. BC2 just loves a good quad core.
    Wat. BC2 uses DX9.
     
  4. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    Consoles only run at roughly DX9.5, which means multiplatform games are designed with that in mind rather than as optimized DX10/11 PC versions. This is why graphics are still primarily DX9: because of consoles. Anything else is merely added on, an afterthought; there are more consumers on consoles than on PC, so there's still more money in DX9 than in DX11. That said, BF3 may be the turning point, being a DX11-only game. Let's also see how TESV turns out on PC vs. console; it may be the reason to reinvest in PC gaming again, depending on the success of the PC version of BF3.
     
  5. Star Forge

    Star Forge Quaggan's Creed Redux!

    Reputations:
    1,676
    Messages:
    2,700
    Likes Received:
    10
    Trophy Points:
    56
    BF 3 is DX 10 and 11. Also, even though DICE SAYS that the PC would only get DX 10/11 support, I am sure there is a hidden DX 9 mode somewhere that can be exploited by the people on Windows XP.

    TES V is probably going to be console-oriented. I am sure, given BethSoft's history, that they will give consoles a bigger push than PCs. I know it!
     
  6. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206
    BC2 ran like crap on my desktop 3.6GHz Core 2 Duo. So from personal experience, I'd say quad-core for sure.
     
  7. Mjolner

    Mjolner Notebook Evangelist

    Reputations:
    323
    Messages:
    590
    Likes Received:
    1
    Trophy Points:
    31
    My laptop has a Core i5 520M and it runs BFBC2 just fine. My system is GPU-bottlenecked, so any slowdowns I have are usually a result of my 5650M, not my CPU. My desktop with a quad-core Q6600 seems to run at a similar level of performance (but at much higher settings), so it seems the new dual cores can run it about as well as the older quad cores can. I usually get at least 45 FPS on full 32-player servers except on very small maps when there are a lot of particle effects going on.
    A friend of mine has a laptop with an i7 720QM, and BFBC2 doesn't seem to play any better on his laptop; it also has a 5650M and is also GPU-bottlenecked.
     
  8. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    I didn't hear that BF3 was getting DX10, just DX11, and DX10 is pretty much just DX9 anyway. TES4 was designed with PCs first, I thought; it ran and looked way better on the PC than either the 360 or PS3 versions, a lot of people had to upgrade their cards just to run it, and the texture mods really breathe life into it. Regardless, I agree with you that it most likely will be a port, but that doesn't mean they can't really beef up the PC side; Bethesda's been pumping out hit after hit.
     
  9. jacob808

    jacob808 Notebook Deity

    Reputations:
    52
    Messages:
    1,002
    Likes Received:
    0
    Trophy Points:
    55
    Thanks for the input, guys. Does anyone have any links to benchmarks with graphs? You know, like the kind you find on Tom's Hardware. I'd like to see in detail what the numbers are, and also what kind of performance increase came from the R3 patch that supposedly gave a significant boost to dual-core CPUs.
     
  10. Star Forge

    Star Forge Quaggan's Creed Redux!

    Reputations:
    1,676
    Messages:
    2,700
    Likes Received:
    10
    Trophy Points:
    56
    Eh, Fallout 3 and New Vegas didn't improve that much graphically IMHO. Also, I ran vanilla Oblivion on Medium-High settings on an X600XT back then! lolz.
     
  11. stevenxowens792

    stevenxowens792 Notebook Virtuoso

    Reputations:
    952
    Messages:
    2,040
    Likes Received:
    0
    Trophy Points:
    0
    Why do we keep getting into these debates over and over? BF3 is DX10 and 11. Confirmed. I don't care what they provide for the console, because the dev teams have confirmed they are paying special attention to the PC release, and I am holding them to that.

    @Jacob - I did some benchmarks jumping from an i3-350M (2.26GHz) to an i7-720QM (1.6-1.73GHz) and I got about a 15-20 fps increase. I have helped the m11x folks optimize heavily for BFBC2, and I have spent a lot of time benchmarking.

    External website tests confirmed that a high-speed dual core is fine for BFBC2. However, if you want all details enabled, then a quad or higher is recommended. The problem lots of folks run into is that we install so many games but we don't work to optimize our machines. We don't clean up our startup items, we don't turn off processes we don't need, and we don't clean up our registry. In my opinion, if you are gaming on a notebook you should be able to look at your performance monitor in taskmgr. If it's at anything above 0 percent at idle, without mouse movement, you have work to do. Feel free to PM me if you want to chat more about your situation.
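    If you want to sanity-check the idle load outside of taskmgr, here is a rough Python sketch using the third-party psutil package (just my own example, assuming Python and psutil are installed; any sampling method will do):

        # Sample overall CPU usage for a few seconds while the machine sits idle.
        # psutil is a third-party package: pip install psutil
        import psutil

        samples = [psutil.cpu_percent(interval=1) for _ in range(5)]
        print("CPU usage over 5 idle seconds:", samples)
        print("Average: %.1f%%" % (sum(samples) / len(samples)))
        # If this sits noticeably above zero with nothing running and no mouse
        # movement, background processes and startup items are eating cycles
        # that the game could be using.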

    Best Wishes,

    StevenX
     
  12. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    I'm just going off an old assumption; I remember watching YouTube videos where people were like "man, Oblivion beats the snot out of my setup, time to upgrade!" But don't forget this is a new engine for TESV, so who knows, maybe they added some DX11 support and it really abuses the PC's memory bandwidth.

    @stevenxowens792 Regardless of what the devs say, it's DICE and EA, a huge company that only wants your money. So regardless, will it be that big of a difference over the console versions, enough to want to spend a huge lump on a PC build? Hardly. Guaranteed, no matter how mind-blowing the tessellation is. And DX10 is pretty much just a refresh of DX9, proven time and time again.
     
  13. Star Forge

    Star Forge Quaggan's Creed Redux!

    Reputations:
    1,676
    Messages:
    2,700
    Likes Received:
    10
    Trophy Points:
    56
    Now you want me to seriously go quad! You sure have a convincing case there...

    Considering the recent track record of Bethesda Softworks, I doubt they will spend the time or energy developing DX 10 for PCs on their new engine. Even Relic halted DX 10 integration for DoW II, even though they had successfully implemented it for CoH. Also, Ubisoft gave up on attempting DX 10 for AC2, when they had done it for AC1.

    The thing is, many game studios still don't feel like giving PC versions special treatment. They don't see the incentive as worth expending the resources for it. In a sense, it is nice to see DICE trying to go back to their original PC roots. However, DICE's optimization and track record of making good FPSes has been lackluster ever since BF 2 (mostly because they thought it was a GREAT idea to fire the BF 2 development team one week before BF 2 was launched). So I have to reserve any judgement until I see in-game demos. FrostBite in BC 2 is impressive, but it still has its share of bugs and missed optimizations that could have improved performance.

    I want to also note that at the time of Oblivion, the X360 had just launched, and the X360 had, in a sense, somewhat superior hardware to the average system at the time. Therefore, an upgrade was quite necessary. Now, as we all know, PCs are trumping consoles quite significantly, but the gaming industry has also shifted gears from PC-favoring to console-favoring.

    Oh how times have changed for everyone in this world... :rolleyes:
     
  14. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    I agree, I'm on your side, seriously. There's nothing wrong with it, but one can hope they spend a little time. I'm more consoles than PC, owning both a PS3 and a PC, but certain games I will get on PC, like BF3 and TESV. Regardless, even if they do spend time making the PC version better, it won't be much better, just a tad bit. Not enough for casual gamers to be OMG over it, but a little bit of my hardcore side wishes there was just one game that sparks the next gen and gets us off the DX9 slump.
     
  15. Star Forge

    Star Forge Quaggan's Creed Redux!

    Reputations:
    1,676
    Messages:
    2,700
    Likes Received:
    10
    Trophy Points:
    56
    Weirdly, I like DX 9 a lot. I really haven't found DX 10 or 11 significantly better. I have ranted about this in one of the two Crysis 2 threads, but I care less about lighting and shadows. DX 10 and 11 mostly improved lighting and shadows, and they force hardware to burn loads of power to render what I call superfluous fluff in games. What excites me is textures and the resolution of textures. To be honest with you, DX 10 and 11 have seen minimal improvement in textures over DX 9. I found the evolution from DX 8 to DX 9 very significant and noticeable, and it justified the generation-spanning upgrades to PCs. DX 9 to DX 10/11, however, has really disappointed me. I don't see anything much better for my gaming immersion in 10/11 than in 9. Anything significant, that is...
     
  16. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    I believe I agree with you on that front as well. In the other Crysis thread I posted this image, File:Unreal Engine Comparison.jpg - Wikipedia, the free encyclopedia, and like you said, DX11 really doesn't bring much to the table. Tessellation..... Yay! But I think GPU developers need to start looking for the next thing: infinite textures, a billion polys, where the experience would be more like looking out a window in your house rather than looking at an image on a screen. And I'm sure somewhere in some secret lab someone's looking at ridiculous graphics running in real time.
     
  17. jacob808

    jacob808 Notebook Deity

    Reputations:
    52
    Messages:
    1,002
    Likes Received:
    0
    Trophy Points:
    55
    Thanks man. I don't have the option of upgrading to a quad core, but I really was interested in seeing how efficient the R3 dual-core optimization patch made dual-core CPUs: numbers detailing what the advantage of the quads over the duals was before the patch, and then what the difference is now with the optimized dual-core patch applied vs. the performance of quads. Does that make sense?
     
  18. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
    Well, it depends on the game.... I have seen Metro 2033 look quite dead and bleak in DX9, whereas DX10 looks much more atmospheric.

    I also notice a difference between DX9 and DX10 in BFBC2, but it is not nearly as noticeable as with Metro 2033.

    Most games use DX10 in a horrible way though, being a detriment to performance while overall looking the same. DX9 is more than capable of great visuals.

    I think DX10 never took off and DX11 just arrived. DX9 has been around a long time with its revisions.
     
  19. Syberia

    Syberia Notebook Deity

    Reputations:
    596
    Messages:
    1,611
    Likes Received:
    1
    Trophy Points:
    56
    Before I had the desktop and laptop in my sig (and the only computer that was "mine" had integrated graphics), I'd use my wife's desktop to play BC2. Even on her 3.4 GHz overclocked E3200, I only got between 30 and 40 fps, with dips into the 20s in heavy action areas, though only for a couple of seconds. This was with both CPU cores constantly pegged at 90-100%, so the bottleneck was obvious.

    The desktop I built for myself has twice as many cores so that's not a fair comparison, but I did briefly own a dual-core Y460 laptop with the i5 460. That one ran the game as well as the GPU would allow, reaching a VSync-limited 60 fps in many places. Given the vastly different performance of those two machines, I can reasonably conclude that BC2 wants more than two cores, even if the extra ones are hyper-threaded.
     
  20. Mjolner

    Mjolner Notebook Evangelist

    Reputations:
    323
    Messages:
    590
    Likes Received:
    1
    Trophy Points:
    31
    There is definitely something about the new dual-core i5 CPUs that avoids the performance drops you see in BFBC2 with Core 2 Duos. I am not sure if it is the hyperthreading, but that might be a contributing factor. My CPU is usually at 90%+ usage when playing BC2 (with the clock constantly at 2.7 GHz in turbo mode), but I can still get at least 50 FPS most of the time.

    My desktop, on the other hand, played BFBC2 horribly at its stock 2.4 GHz clock even though it has a quad-core CPU (which is why it is overclocked to 3.2 GHz). However, I think that was partially because of an issue, only recently fixed, involving low GPU usage with LGA 775 CPUs and Nvidia GPUs; I would have only 30 percent usage on each of my cards when a lot of stuff started happening, but one of the newer Nvidia drivers fixed it and the cards are now usually at 80-90 percent each and rarely drop below 60%. Before that fix, though, I was getting smoother gameplay at certain points on my Y460 than on my desktop that scores 13,000+ in 3DMark Vantage.

    The main reason I got an i5 was for switchable graphics, but now with the Sandy Bridge CPUs you can get that on a quad-core i7, so it might be worth looking into one of those for maximum future proofing (is that a word?).
     
  21. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    Uhm, nope.
    If you're not used to PC gaming tweaks, that's what you get: the standard DX9 render path.
    Open the config file or use the BC2 configuration tool to enable DX10 or DX11.
    The game switches to DX11 automatically when compatible hardware is found, enabling terrain tessellation.
    I think everyone knows this by now.
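    For anyone who hasn't touched it before: the renderer switch lives in the game's settings.ini (under Documents\BFBC2 on most installs, if I remember right, with a DxVersion key that takes Auto/9/10/11 - double-check your own file). A rough Python sketch of flipping it, purely as an illustration:

        # Rough sketch: rewrite the DxVersion line in BFBC2's settings.ini.
        # Assumptions (from memory): file at Documents\BFBC2\settings.ini,
        # key named DxVersion with values Auto/9/10/11. Verify against your copy.
        from pathlib import Path

        SETTINGS = Path.home() / "Documents" / "BFBC2" / "settings.ini"

        def set_dx_version(value: str) -> None:
            # Replace the DxVersion line, leave everything else untouched.
            lines = SETTINGS.read_text().splitlines()
            out = [f"DxVersion={value}" if line.strip().lower().startswith("dxversion")
                   else line for line in lines]
            SETTINGS.write_text("\n".join(out) + "\n")

        set_dx_version("Auto")  # "Auto" lets the game pick DX11 on capable hardware

    Back up the file first. Setting it to 9 forces the DX9 path, which is what some people run for the extra fps.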
     
  22. Star Forge

    Star Forge Quaggan's Creed Redux!

    Reputations:
    1,676
    Messages:
    2,700
    Likes Received:
    10
    Trophy Points:
    56
    However, if you put the Generation 1 Core i7 quads into the comparison, most barely hit over 2 GHz. So yes, you get 4 dedicated cores processing the game, but each core is running at a slow speed, so in that sense it is still a bottleneck that is only slightly better than a high-clock-speed Arrandale. In other words, it is not as much better as you might think. In order to fully run BC 2 at maximum power CPU-wise, you need a quad core that is running over 2.4 GHz constantly on all 4 cores to get the effect. Am I right?
     
  23. Syberia

    Syberia Notebook Deity

    Reputations:
    596
    Messages:
    1,611
    Likes Received:
    1
    Trophy Points:
    56
    My i7 740qm runs the game flawlessly with no CPU-based slowdowns that I can notice. FRAPS puts me somewhere between 40 and 60 fps most of the time, except during large explosions which I'm sure are GPU-bound, not CPU-bound. I'd say that I don't notice any difference between the lower-clocked i7 and the higher-clocked i5 in BC2.
     
  24. Star Forge

    Star Forge Quaggan's Creed Redux!

    Reputations:
    1,676
    Messages:
    2,700
    Likes Received:
    10
    Trophy Points:
    56
    Hmmm... Now I kind of regret getting the i5. I was always under the impression from people that the i5 would be cooler and faster in most games, as it is a high-clock-speed dual core with HT. However, it looks like that is not the case with some of the newer games coming out; by the looks of it, a slower-clocked i7-740QM is still the better option.
     
  25. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    The same applies to desktop i3s and first-gen i5s that are dual cores.
    Without doubt their clock-for-clock performance is higher than the best Core 2 Duo Intel ever released, the E8xxx series, but it's still a dual core.
    The fact is that not all programs/games support HT; most of them don't.
    On the developer side it is less problematic to develop a game that uses 3 or 4 physical cores than to create a game engine/routine that uses 4 or 6 virtual threads (HT).
    A lot of people with desktop i7s (1st gen), for example, are forced to disable HT when overclocking, as it gives less performance compared to having 4 (or 6) physical cores always active.
    Intel also seems to have partially de-emphasized the HT approach, which started back in the Pentium 4 era, in favor of Turbo Boost.
    CPU load spread across 4 physical cores is still better than across 6 virtual ones, as the game and OS need to be optimized to use them, which is an expensive investment.
    A few years ago quads weren't advised for PC gaming as games were not optimized for them; times have changed now that developers are using console SKUs to port games directly to PC, so 3 or more physical cores are needed.
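    If you want to see what your own chip actually exposes, here's a quick Python sketch (just an illustration; psutil is a third-party module):

        # Show physical cores vs. logical (HT) threads on the current machine.
        # psutil is a third-party package: pip install psutil
        import os
        import psutil

        logical = os.cpu_count()                    # threads the OS can schedule on
        physical = psutil.cpu_count(logical=False)  # actual physical cores

        print("Physical cores:", physical, "| logical threads:", logical)
        if physical and logical and logical > physical:
            print("Hyper-Threading/SMT is enabled.")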
     
  26. Star Forge

    Star Forge Quaggan's Creed Redux!

    Reputations:
    1,676
    Messages:
    2,700
    Likes Received:
    10
    Trophy Points:
    56
    Hmmmm... I guess it is time to do a swap then. :p Fortunately all the games I have been playing so far haven't been stressing the dual core too much. However, I have a feeling that might change soon.

    PS: To your comment about BC 2 being only DX 10 and DX 11: you are wrong. BC 2 supports XP and DX 9 graphics cards. Its minimum is, weirdly enough, a 7600 GT.

    http://en.wikipedia.org/wiki/Battlefield:_Bad_Company_2
     
  27. Syberia

    Syberia Notebook Deity

    Reputations:
    596
    Messages:
    1,611
    Likes Received:
    1
    Trophy Points:
    56
    I don't see any reason not to go with the quad, especially in my situation where battery life is rarely, if ever, a concern. I got the laptop to use around the house and to take to class once a week, so as long as it lasts 2 hours when not gaming I'm good there. And with the (wonderful) little invention called turbo boost, the quad cores will perform just like the higher-clocked dual cores when they need to.

    For instance, my i5 460 ran at 2.53 GHz on 2 cores and 2.8 GHz on one core. My i7 740 runs at the same 2.53 GHz on 2 cores and 2.93 GHz on one. And when something wants to use more than two cores, it's probably going to be optimized well enough to split the load so that even though individually they're clocked lower, the extra cores will still give a performance boost. This was my reasoning when ultimately deciding to go with a quad, and I have not regretted it yet.


    I have BC2 running in DX9 on my laptop to boost graphics performance. You have to manually enable it in the config file, but it's an option and it's supported.
     
  28. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    No, I'm not wrong, man.
    Please read carefully, I wasn't born yesterday :) I said that the only thing DICE did in porting the game to PC was to enable the DX10/11 render path.
    That implies that the game can run in DX9 mode when the hardware it finds does not support the other render paths.
    Maybe I didn't explain myself well, but I don't see anywhere in my posts saying that BC2 supports ONLY DX10/11. I just pointed out the fact that it supports DX11 as well on compatible hardware.

    To quote myself:
    Which means "as far as PC-exclusive optimizations and features are intended" / "added features".

    As for your upgrade, well, you need to. Newer console ports are strictly multi-core optimized, NFS: Hot Pursuit and Black Ops for example.
     
  29. maksin01

    maksin01 Notebook Deity

    Reputations:
    446
    Messages:
    1,203
    Likes Received:
    0
    Trophy Points:
    55
    +1
    Same here with my notebook. :D
     
  30. Mastershroom

    Mastershroom wat

    Reputations:
    3,833
    Messages:
    8,209
    Likes Received:
    16
    Trophy Points:
    206
    From your initial post, "enabling only the DX10/11 render path for the PC version", it sounded like you meant there was no DX9 capability at all.
     
  31. djboz

    djboz Notebook Consultant

    Reputations:
    11
    Messages:
    128
    Likes Received:
    11
    Trophy Points:
    31
  32. Kingpinzero

    Kingpinzero ROUND ONE,FIGHT! You Win!

    Reputations:
    1,439
    Messages:
    2,332
    Likes Received:
    6
    Trophy Points:
    55
    Well, my bad then. I was pointing out what they added compared to the consoles. Still, it's a poor port.
     
  33. Star Forge

    Star Forge Quaggan's Creed Redux!

    Reputations:
    1,676
    Messages:
    2,700
    Likes Received:
    10
    Trophy Points:
    56
    Mastershroom and I both misread it, and when I posted my reply it was awfully late for me, so errors do happen.

    Also, good to know that stuff about quads. Times have changed.
     
  34. Mastershroom

    Mastershroom wat

    Reputations:
    3,833
    Messages:
    8,209
    Likes Received:
    16
    Trophy Points:
    206
    I think it could have easily been much worse. The game runs great if you back it with decent hardware, and the mechanics feel pretty natural, not like mouse control is obviously forced.
     
  35. Syberia

    Syberia Notebook Deity

    Reputations:
    596
    Messages:
    1,611
    Likes Received:
    1
    Trophy Points:
    56
    Those benchmarks, especially the second set, do not coincide with my personal observations, where a 2.5 GHz i5 massively outperformed a 3.4 GHz Core 2 Duo. I know the architecture changes alone can't be that significant, so the only other thing I can think of that could cause such a difference in performance would be the presence of additional cores via hyperthreading.

    Also, the fact that the benchmark was run at 1920x1200 with high settings, 4xAA, and 8xAF might have something to do with the i5, i7, and overclocked i7 runs being within 2 fps of each other. If it were a true test of CPU performance, they should have run it at 800x600 with everything on low or turned off. The fact that measured GPU usage was over 90% in all three tests points to that being the bottleneck, not the CPU.
     
  36. jacob808

    jacob808 Notebook Deity

    Reputations:
    52
    Messages:
    1,002
    Likes Received:
    0
    Trophy Points:
    55
    Oh wow! Haha, this is exactly the sort of thing I was looking for when I posted originally! I know a lot of other peeps went off topic in this thread, but thanks for finding this. It's also dated Oct. 2010, so it's after the R3 patch, meaning it's showing the optimized performance on dual cores, and seeing the results, DICE did an impressive job. Even though we don't have a reference to compare against dual-core performance before the optimization. lol

    Thanks again, and if anyone else finds more benchmark graphs similar to this one, post 'em please. I'm very curious about this.