The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled together by NBR forum users between January 20 and January 31, 2022, in an effort to preserve the valuable technical information that had been posted on the forums. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    HD3000 performance: 1600 vs 1866 mem

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Khenglish, Aug 24, 2011.

  1. Khenglish

    Khenglish Notebook Deity

    Reputations:
    799
    Messages:
    1,127
    Likes Received:
    979
    Trophy Points:
    131
    I mostly use an eGPU, but wanted to see what effect memory performance has on the HD 3000 for LAN parties and such. I recorded benchmarks at 1600 MHz CAS 9, then flashed the SPD to 1866 MHz CAS 10.5 with Thaiphoon Burner. Since the HD 3000 shares system memory, I expected a significant performance increase. Below are my results:

    Benchmark                  1600      1866      Gain

    3DMark Vantage graphics    1623      1674      3.1%
    3DMark06                   4930      5006      1.5%
    RE5 (fps)                  31.5      33.5      6.3%
    DMC4 (fps)                 40.3      42.8      6.2%
    3DMark03                   12302     12803     4.1%
    (why 3DMark03? still GPU limited!)

    So a 16.7% memory bandwidth increase resulted in a 4.2% average performance increase. I also played around with only one stick of 1600 MHz, which has half the bandwidth of two sticks. Going from one stick to two I generally saw a 25% performance increase. Going off my other results, I speculate that 1333 to 1600 is a 5-6% performance increase.

    So the main conclusion is that the HD 3000 scales pretty poorly with increased memory bandwidth. As long as you are running dual channel, you'll see pretty much all your HD 3000 has to offer.

    The secondary conclusion is that my memory kinda sucks for not even being able to do CAS 10 at 1866 MHz at 1.5V.
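    The bandwidth and gain figures above can be sanity-checked with a quick calculation. A sketch, assuming theoretical peak DDR3 bandwidth of transfer rate × 8 bytes per 64-bit channel:

```python
# Theoretical peak DDR3 bandwidth: MT/s * 8 bytes per transfer per 64-bit channel
def peak_bandwidth_gbs(mt_per_s, channels=2):
    return mt_per_s * 8 * channels / 1000  # GB/s

bw_1600 = peak_bandwidth_gbs(1600)       # 25.6 GB/s dual channel
bw_1866 = peak_bandwidth_gbs(1866)       # ~29.9 GB/s
bw_gain = (bw_1866 / bw_1600 - 1) * 100  # ~16.6% more raw bandwidth

# Per-benchmark and average gains from the table above
scores_1600 = [1623, 4930, 31.5, 40.3, 12302]
scores_1866 = [1674, 5006, 33.5, 42.8, 12803]
gains = [(b / a - 1) * 100 for a, b in zip(scores_1600, scores_1866)]
avg_gain = sum(gains) / len(gains)       # ~4.3%, close to the 4.2% quoted
```

    Setting channels=1 halves the peak bandwidth, which lines up with the single-stick observation above.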
     
  2. GamingACU

    GamingACU Notebook Deity

    Reputations:
    388
    Messages:
    1,456
    Likes Received:
    6
    Trophy Points:
    56
    Do you think there would be any performance gain on a dedicated GPU going from 1333 MHz to 1866 MHz?
     
  3. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    get a real GPU bro -_-
     
  4. GamingACU

    GamingACU Notebook Deity

    Reputations:
    388
    Messages:
    1,456
    Likes Received:
    6
    Trophy Points:
    56
    says the guy with the macbook?
     
  5. Khenglish

    Khenglish Notebook Deity

    Reputations:
    799
    Messages:
    1,127
    Likes Received:
    979
    Trophy Points:
    131
    No. Sandy Bridge has been shown not to scale much with faster memory, especially in games. Check out this link.
     
  6. Hungry Man

    Hungry Man Notebook Virtuoso

    Reputations:
    661
    Messages:
    2,348
    Likes Received:
    0
    Trophy Points:
    55
    I'd rather see game FPS than a benchmark.
     
  7. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    indeed, as the X220 users have done.

    there is quite a difference going from 1066 to 1866 using the iGPU on those systems
     
  8. Krane

    Krane Notebook Prophet

    Reputations:
    706
    Messages:
    4,653
    Likes Received:
    108
    Trophy Points:
    131
    Benchmarks are standardized tests. Kept within the comparative scope, they are the best relative judge of performance.
     
  9. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    so far from reality
     
  10. hockeymass

    hockeymass that one guy

    Reputations:
    1,450
    Messages:
    3,669
    Likes Received:
    85
    Trophy Points:
    116
    I would doubt it, because dGPUs don't use UMA and GDDR5 is faster than DDR3 anyway.
     
  11. jeremyshaw

    jeremyshaw Big time Idiot

    Reputations:
    791
    Messages:
    3,210
    Likes Received:
    231
    Trophy Points:
    131
    Well, in more CPU-bound games, like SC2, there would be a much greater gain (an X220 Lenovo owner tested that and posted it in the Lenovo section somewhere...)
     
  12. fred2028

    fred2028 Sexy member

    Reputations:
    196
    Messages:
    2,205
    Likes Received:
    1
    Trophy Points:
    56
    Lmao ahaha
     
  13. funky monk

    funky monk Notebook Deity

    Reputations:
    233
    Messages:
    1,485
    Likes Received:
    1
    Trophy Points:
    55
    If you somehow managed to run out of dedicated VRAM, then memory speed might make a difference. The chance of that happening is very slim, though.
     
  14. Krane

    Krane Notebook Prophet

    Reputations:
    706
    Messages:
    4,653
    Likes Received:
    108
    Trophy Points:
    131
    Feel free to elaborate.
     
  15. fred2028

    fred2028 Sexy member

    Reputations:
    196
    Messages:
    2,205
    Likes Received:
    1
    Trophy Points:
    56
    I doubt the HD 3000 has dedicated RAM
     
  16. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    companies can rig the tests. ex: Nvidia

    they aren't related to game performance, just bragging rights

    they haven't a use

    they don't give meaningful info
     
  17. funky monk

    funky monk Notebook Deity

    Reputations:
    233
    Messages:
    1,485
    Likes Received:
    1
    Trophy Points:
    55
    I know the HD 3000 doesn't, but someone asked whether it would make any difference to a dedicated GPU. Since most dedicated GPUs (sensibly) still have the capability to use system RAM if they run out of their own, it is a situation where it could make a difference.

    The difference will most likely be a normal slideshow versus a slightly faster slideshow, since you've run out of real VRAM, but eh.
     
  18. talin

    talin Notebook Prophet

    Reputations:
    4,694
    Messages:
    5,343
    Likes Received:
    2
    Trophy Points:
    205
    Synthetic benchmarks are rarely indicative of real-world performance. Such is the case with 3DMark, where everything is preloaded into RAM and then frames are rendered (actual pre-rendered frames, not on-the-fly as in games).
    That is why games are usually best to benchmark with.
     
  19. jeremyshaw

    jeremyshaw Big time Idiot

    Reputations:
    791
    Messages:
    3,210
    Likes Received:
    231
    Trophy Points:
    131
    and not the timedemo runthroughs that almost always have nothing to do with the actual game (e.g., the Crysis island demo, the Metro 2033 tunnel benchmark)
     
  20. Generic User #2

    Generic User #2 Notebook Deity

    Reputations:
    179
    Messages:
    846
    Likes Received:
    0
    Trophy Points:
    30
    ya....if your dGPU needs to use system RAM, you're screwed. no question about it.
     
  21. wrathchild_67

    wrathchild_67 Newbie

    Reputations:
    0
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    Except that two of the benchmarks he used, DMC4 and RE5, actually do demonstrate in-game performance, and they also happen to show the largest gains. To be fair, Capcom has optimized their PC games very well from DMC4 onward.

    Going from 1333 to 1866 might be worth the 10-15% fps boost if it were only $75 for 8GB.
     
  22. Qing Dao

    Qing Dao Notebook Deity

    Reputations:
    1,600
    Messages:
    1,771
    Likes Received:
    304
    Trophy Points:
    101
    Yes, you are correct. Displaying min, max, and average game framerates from a standard sample is much more useful than any synthetic benchmark.
     
  23. Generic User #2

    Generic User #2 Notebook Deity

    Reputations:
    179
    Messages:
    846
    Likes Received:
    0
    Trophy Points:
    30
    that's not enough....they should be stating how much time the game spends at any given framerate


    if a game runs at 1 frame/second for 0.1 seconds, that's a hell of a lot different than the game running at 1 frame/second 90% of the time.
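    The point about how long a game spends at each framerate is what frame-time percentile reporting measures. A minimal sketch, with a made-up frame-time capture for illustration:

```python
# Average FPS hides stutter; frame-time percentiles expose it.
# Frame times are in milliseconds.
def nearest_rank_percentile(values, p):
    ordered = sorted(values)
    idx = min(len(ordered) - 1, int(p / 100 * len(ordered)))
    return ordered[idx]

def frame_time_report(frame_times_ms):
    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    p99_ms = nearest_rank_percentile(frame_times_ms, 99)  # slowest 1% of frames
    return avg_fps, 1000 / p99_ms

# Hypothetical run: 95 smooth 16 ms frames (~62 FPS) plus 5 nasty 100 ms stutters
times = [16.0] * 95 + [100.0] * 5
avg_fps, p99_fps = frame_time_report(times)
# avg_fps comes out near 49.5 and looks playable, but the 99th-percentile
# frame time corresponds to a 10 FPS stutter
```

    Two captures with identical average FPS can have wildly different 99th-percentile frame times, which is exactly the distinction being argued for here.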
     
  24. gdansk

    gdansk Notebook Deity

    Reputations:
    325
    Messages:
    728
    Likes Received:
    42
    Trophy Points:
    41
    What are you trying to say? The entire point of 3DMark is to use real-time rendering technologies (such as Direct3D and OpenGL). If the frames were pre-rendered, it would be a film/video. The only things pre-rendered are textures (i.e. someone made them in Photoshop), which remains the case for games*. Just because it doesn't react to your input doesn't mean it isn't generated on the fly. A demo is a perfect example of this.

    * There are a few games that use procedural generation to create textures, but this is an exception to the rule.