The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled together by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    HD 4000 vs HD 3000 (Ivy Bridge tested)

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Cloudfire, Feb 20, 2012.

  1. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
  2. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    Love it! Now I'm just going to wait for the new MBAs to come out
     
  3. rflcptr

    rflcptr Notebook Consultant

    Reputations:
    49
    Messages:
    232
    Likes Received:
    0
    Trophy Points:
    30
    That's quite good.
     
  4. wild05kid05

    wild05kid05 Cook Free or Die

    Reputations:
    410
    Messages:
    1,183
    Likes Received:
    0
    Trophy Points:
    55
    Haswell will double that!
     
  5. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    Not bad... It's like going from "Worst" to just "Bad". :)
     
  6. Mr. Wonderful

    Mr. Wonderful Notebook Evangelist

    Reputations:
    10
    Messages:
    449
    Likes Received:
    6
    Trophy Points:
    31
    It's going to be glorious! We're getting really close to not needing dedicated GPUs. There's only the fact that AMD and Nvidia specialize in doing GPUs (and therefore provide better drivers, etc.) that is holding me back.

    So what Nvidia/AMD GPU does this ballpark around then? Also, the Starcraft II and Far Cry scores are still a little off-putting. A two year old game simply becoming "playable" isn't something to go crazy over.
     
  7. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Great info, thanks +1!

    Assuming that's desktop HD 4000, mobile might be slightly reduced performance?
     
  8. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    GPUs still totally own the Intel integrated solutions. The only on-die graphics that even start to have real performance are in the AMD A-Series of processors.

    Intel's Larrabee Platform was supposed to be the great equalizer.. but it went the way of the Dodo and never saw the light of day.
     
  9. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Yeah, even the A8's 6620G is a good performer, but it still leaves a lot to be desired if you want to run at any amount of detail or higher than 720p resolution. I think the AMD Trinity APUs will improve on that and be more of a true low-to-mid-end GPU. This HD 4000 is barely better than the mobile 6620G in the current-gen Llanos, which the Trinity GPU should trump by a solid 40-50%. Not to mention driver support. Intel just doesn't give the IGP the attention it deserves, and they should.
     
  10. ntrain96

    ntrain96 Notebook Evangelist

    Reputations:
    17
    Messages:
    564
    Likes Received:
    0
    Trophy Points:
    30
    OK, so in reality Ivy Bridge is only useful for those who actually use the integrated GPU (aka HD 3000/4000, etc.), correct? What does Ivy Bridge offer to those with high-end graphics solutions other than a die-size and power reduction? And from what I understand these new CPUs will be backwards compatible with all Sandy Bridge laptops via a FW update, correct?
     
  11. Star Forge

    Star Forge Quaggan's Creed Redux!

    Reputations:
    1,676
    Messages:
    2,700
    Likes Received:
    10
    Trophy Points:
    56
    Not shabby. If Intel can work on drivers and continue to improve their iGPU solutions, it won't be long before nVidia and ATI are no longer needed. A true performance CPU/GPU AIO solution for everything? I might be interested in that.

    For now, I will still hold on to using dGPU solutions for the drivers and higher-end performance, but Intel might be up there in a matter of years at this rate.
     
  12. ivan_cro

    ivan_cro Notebook Consultant

    Reputations:
    23
    Messages:
    121
    Likes Received:
    0
    Trophy Points:
    30
    way lower power consumption under load (less heat/ more battery life for notebooks, and higher overclocking for desktops), and a true random number generator :)
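    For reference, a minimal C sketch of what that hardware RNG looks like from software: Ivy Bridge exposes it through the RDRAND instruction, reachable via the _rdrand32_step intrinsic from immintrin.h. It assumes a CPU with RDRAND support and a compiler flag such as -mrdrnd (GCC/Clang); illustrative only, not a proper entropy-handling routine.

    #include <stdio.h>
    #include <immintrin.h>

    int main(void)
    {
        unsigned int value;
        /* _rdrand32_step returns 1 on success, 0 if no random value was ready. */
        if (_rdrand32_step(&value))
            printf("hardware random value: %u\n", value);
        else
            printf("RDRAND returned no value; retry or fall back\n");
        return 0;
    }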
     
  13. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Don't get your hopes up. The fact that the desktop HD 4000 is only marginally faster than AMD's mobile Llano 6620G doesn't impress me much, and Intel's history of driver updates is lackluster.
     
  14. calc_yolatuh

    calc_yolatuh Notebook Evangelist

    Reputations:
    153
    Messages:
    309
    Likes Received:
    1
    Trophy Points:
    31
    I don't entirely buy those numbers. Was SF IV run on Low or High? (It changes between the two sheets, but the FPS are the same.)

    The synthetic scores are great, but I think the actual game scores are worse than the 6620G's. There were no screenshots, which are important because the HD 3000 had plenty of graphical problems when tested before.

    The page claims most Ivy Bridge parts will have the lesser HD 2500, but there are no clear numbers yet. They did not supply any details about system wattage.

    The thing you are looking for here is the improved transcoder. That should be common to all the new chips. Intel may have improved quality, speed, and/or color reproduction, the latter being most important to me.

    I really hope the same guys benchmark against an AMD desktop A6 or mobile A8, same games and settings.
     
  15. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Actually they should benchmark against Trinity since they should be released in about the same time frame. I'll have to bench mine against what they did and compare results with the 6620G. But the problem is I don't own DiRT 3, and I'm not sure how they tested Starcraft 2 since there's no standard benchmark.
     
  16. calc_yolatuh

    calc_yolatuh Notebook Evangelist

    Reputations:
    153
    Messages:
    309
    Likes Received:
    1
    Trophy Points:
    31
    That is why I take issue with their numbers: they're kinda scattershot and not properly documented. They also supposedly tested with 1333 RAM, though at that speed there should not be much improvement with 1600 or faster.

    If AMD delivers, even the new desktop A6 should be faster by some margin. The A10 should flatten the 3750K by a large margin. I'm most curious about AMD's claims that the 17W package will have a 50-60% faster GPU than IB-ULV, and the 25W part 160%+ faster. We're just nervously counting down to a major comparison.
     
  17. calc_yolatuh

    calc_yolatuh Notebook Evangelist

    Reputations:
    153
    Messages:
    309
    Likes Received:
    1
    Trophy Points:
    31
    Dammit, 3570K. Mixed up my prime numbers.

    If we're lucky there will be a large head-to-head when the wall comes down. HT, your numbers have been nice because we can see the game settings that achieved them. Thanks again.

    The Dirt 3 number has me suspicious. The game defaulted to "optimal" for me, 1024x768 with settings all over the place, including MSAA and post FX turned on. I upped it to 1366x768 and left the rest alone; it gave me 45 avg / 35 min FPS with 6750M/6620G CrossFire and all clocks at stock (Catalyst 11.11c still). Visually smooth with no jitter. If I force everything to Medium and disable MSAA it should run much faster. Will try it.

    The Far Cry 2 benchmark at stock settings (everything at High and using DirectX 10), same CrossFire and stock clocks: my FPS ran as high as 120 during the flyby and more like 30-something for the combat demo.

    I will re-run and post a few numbers in HT's 6620G thread with appropriate screenshots.
     
  18. Mr. Wonderful

    Mr. Wonderful Notebook Evangelist

    Reputations:
    10
    Messages:
    449
    Likes Received:
    6
    Trophy Points:
    31
    Right. My point was more that for most people's usage, the reasons to get a dedicated GPU are slowly becoming fewer and fewer.

    Also, I don't think it would make a lot of sense to compare it to a 6620G, as the CPUs in that comparison would be vastly different, skewing the results.
     
  19. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    What would skew the results? Game performance is game performance whether it comes from the CPU or the GPU. So far, with a much more powerful DESKTOP GPU, the performance is only marginally better than the current-gen MOBILE AMD GPU.
     
  20. Vect

    Vect Notebook Evangelist

    Reputations:
    428
    Messages:
    409
    Likes Received:
    12
    Trophy Points:
    31

    That's called an "overclocked AMD A4-A8".
     
  21. Mechanized Menace

    Mechanized Menace Lost in the MYST

    Reputations:
    1,370
    Messages:
    3,110
    Likes Received:
    63
    Trophy Points:
    116
    Intel should just buy NVidia. :p
     
  22. somedood

    somedood Notebook Enthusiast

    Reputations:
    36
    Messages:
    16
    Likes Received:
    0
    Trophy Points:
    5
    This is good news for when I buy a laptop for my wife, who does light gaming, but I'm curious to see if the results can be duplicated. I remember the Sandy Bridge IGP was supposed to be awesome, but that hasn't really been the case.
     
  23. calc_yolatuh

    calc_yolatuh Notebook Evangelist

    Reputations:
    153
    Messages:
    309
    Likes Received:
    1
    Trophy Points:
    31
    If your wife needs good battery life, a modest CPU, and light gaming, just get an AMD laptop. Cheap and not very difficult to squeeze more out of it. However, if she needs a super CPU or continually uploads to youtube, a good i7 may be the better choice.

    Look at the links in HTWingnut's sig. The 6620G is AMD integrated graphics in A8 series mobile APUs. It will play Dirt 3, Skyrim, etc....
     
  24. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Yes HTWingNut, the IGP inside the desktop CPUs is clocked a little higher, so the performance is a bit higher. However, the percentage boost in IGP performance on the desktop CPUs should be about the same as on the notebook IGPs.

    Oh well, since nobody else bothers I might as well just do it myself :p

    The average FPS increase with HD 4000 compared to HD 3000 is +56%. Mind you, these tests are from a 2820QM and some of these games are probably a little CPU bound, which means the calculations are a bit off. A 3720QM or better should score higher in these games. And most importantly, the scores from the first post in this thread are from an engineering sample. Take the following comparison with an enormous bucket of salt. Just me being a little bit bored :p

    HD 4000 scores 54.8 FPS, 13.9% better than the Llano
    HD 4000 scores 38.5 FPS, 3.2% better than the Llano
    HD 4000 scores 27.0 FPS, Llano scores 5.9% better
    HD 4000 scores 12.3 FPS, Llano scores 33.3% better
    HD 4000 scores 79.6 FPS, 16.9% better than the Llano
    HD 4000 scores 38.0 FPS, Llano scores 13.7% better
    HD 4000 scores 82.4 FPS, 23.0% better than the Llano
    HD 4000 scores 60.2 FPS, 23.9% better than the Llano
    HD 4000 scores 27.9 FPS, Llano scores 22.6% better
    HD 4000 scores 24.5 FPS, Llano scores 24.1% better
    HD 4000 scores 28.5 FPS, same performance
    HD 4000 scores 24.3 FPS, Llano scores 10.7% better
    HD 4000 scores 103.2 FPS, 30.7% better than the Llano
    HD 4000 scores 27.6 FPS, Llano scores 17.1% better

    [per-game benchmark charts from the AnandTech Llano review not preserved in the archive]

    AnandTech - The AMD Llano Notebook Review: Competing in the Mobile Market
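    As a quick illustration of the arithmetic behind the percentages above, here is a minimal C sketch that turns a pair of average-FPS numbers into an "X% better" figure. The sample values are hypothetical, chosen only to show the calculation, not taken from any review.

    #include <stdio.h>

    /* Relative difference between two average-FPS numbers, as a percentage. */
    static double percent_faster(double a, double b)
    {
        return (a / b - 1.0) * 100.0;   /* how much faster a is than b */
    }

    int main(void)
    {
        double hd4000 = 54.8, llano = 48.1;   /* hypothetical FPS pair */
        printf("HD 4000 is %.1f%% faster\n", percent_faster(hd4000, llano));
        return 0;
    }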
     
  25. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    Sounds like Intel is now on par with the current 6620G in the A8. That really is great news for integrated graphics users, people with low-end machines, or those stuck with business laptops. It will open up a lot of games to people who wouldn't normally have access on their machines.

    Now, the HD 4000: are there several variants of it, or a standard version across all platforms and CPUs?
     
  26. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    If Intel can work on their drivers though, then it's quite possible they could squeeze out more performance... unless of course, they already did.
     
  27. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Thanks Cloudfire, for doing that +1. (wups, I'm outta rep atm)

    I didn't even see that article in my searches. My googling needs some tuning up. :p

    The trend there is that the Llano does better with added detail. HD 4000 is definitely a leap ahead of the HD 3000, although Intel needs to keep on the ball with drivers. Not to mention the Trinity GPU supposedly has a 50%+ GPU performance improvement at 1366x768, although I don't have anything to back that up. I hope AMD releases review samples sometime soon.
     
  28. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    The latest Intel drivers gave me some much-needed improvement in FPS and stability.

    I had to use older drivers to run ME and ME2 without locking up.
     
  29. MAA83

    MAA83 Notebook Evangelist

    Reputations:
    794
    Messages:
    604
    Likes Received:
    3
    Trophy Points:
    31
    I like this trend. Hopefully one day discrete GPUs will be limited to high-end/enthusiast gaming rigs, and on-die graphics will eat up the midrange cards just as they (almost) have with entry-level GPUs!
     
  30. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    They've been saying that IGPs will replace mid-range GPUs for a while. The Llano is the closest an IGP has come, but it's still not quite there for the low end. Again, driver support is critical. Intel will never get there until they put more effort into regular driver releases. What they need to do is add even a small amount of dedicated GDDR5 and it might have a fighting chance. Even 256MB of GDDR5 as a "buffer" alongside the system RAM would help things immensely. There have been rumors that AMD will be doing that, but still nothing solid anywhere. Trinity has been so hush-hush it's getting annoying. I hope they're just waiting to release a phenomenal product instead of trying to hide issues.
     
  31. RainMan_

    RainMan_ Notebook Evangelist

    Reputations:
    180
    Messages:
    396
    Likes Received:
    11
    Trophy Points:
    31
    Still not meant for gaming (obviously).
    Sadly some people are being defrauded and told that this is a gaming GPU.

    My friend went to buy a laptop last week, and the seller told him that the Intel HD 3000 is a 1.6GB graphics card and can play any game easily.

    People need to know that graphics card performance doesn't depend at all on the memory amount!
     
  32. MAA83

    MAA83 Notebook Evangelist

    Reputations:
    794
    Messages:
    604
    Likes Received:
    3
    Trophy Points:
    31
    To be fair though, the current crop of truly on-die GPUs has only been around for 2 years or so. I don't think Intel was really trying to compete with dGPU solutions with GMA and Extreme Graphics; those were targeted at the equivalent nForce and ATI integrated northbridges. They've come a decent way in the 2 years since Clarkdale/Arrandale introduced Intel HD. And they seem like the company that would take a slow and steady approach to chip away (no pun intended) at entry-level/midrange dGPU turf, rather than spearhead it. But they've negated the need for entry-level cards: why get a GT 510/520 for the desktop, or a laptop with a 520M, when there's an HD 3000?
     
  33. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Right. And you CAN game with these, as clearly shown; I just wouldn't make it your dedicated gaming machine. For an occasional romp in Skyrim or playing classic titles, it will work wonders. Once they're able to make any new-release title playable, i.e. 720p at over 30 FPS, then they'll have made great strides.

    In any case I'm impressed with the improvement over HD 3000. And HD 3000 is a great improvement over the 4500MHD that was so prevalent prior to that. Kudos to Intel. Hope they can keep the momentum going.
     
  34. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    That is, if you can. Try playing Rise of Nations with an HD 3000; compatibility is growing, but it is still an issue.
     
  35. Mr. Wonderful

    Mr. Wonderful Notebook Evangelist

    Reputations:
    10
    Messages:
    449
    Likes Received:
    6
    Trophy Points:
    31
    Yeah, as others have said, the one thing that would hold me back when choosing between a mid-range Intel GPU and a dedicated GPU is the drivers. Though, as also mentioned, a few months ago Intel released a new driver for the HD 3000 that massively improved performance.

    You also have Intel's weird decisions, like not supporting OpenCL on the HD 3000 and only supporting OpenGL 3.0 (even though Apple has been able to get the HD 3000 to support 3.2).
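    For anyone wanting to check that sort of support themselves, here is a minimal C sketch that enumerates OpenCL platforms and counts GPU devices using the standard clGetPlatformIDs / clGetDeviceIDs calls. On an HD 3000 with the stock Intel drivers of the time, the expectation would be zero GPU devices reported; link against the OpenCL ICD (e.g. -lOpenCL). This is a generic query, not an Intel-specific tool.

    #include <stdio.h>
    #include <CL/cl.h>

    int main(void)
    {
        cl_platform_id platforms[8];
        cl_uint nplat = 0;
        if (clGetPlatformIDs(8, platforms, &nplat) != CL_SUCCESS || nplat == 0) {
            printf("no OpenCL platforms found\n");
            return 1;
        }
        for (cl_uint i = 0; i < nplat; i++) {
            cl_uint ndev = 0;
            /* Count GPU devices only; a CPU-only OpenCL stack still shows a platform. */
            if (clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_GPU, 0, NULL, &ndev) != CL_SUCCESS)
                ndev = 0;
            printf("platform %u: %u GPU device(s)\n", (unsigned)i, (unsigned)ndev);
        }
        return 0;
    }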
     
  36. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    You make a good point. It's all about resolution. I paid a whopping $29 for an AMD 5450 card for my kids' PC. It's a 1.6GHz dual-core "Celery" with 2GB. They have a 17" LCD with a 1024x768 native resolution.

    That thing is a beast for the games they play. They just finished Dragon Age: Origins with a solid 30fps frame rate and decent eye candy. Dragon Age II runs equally well. Modern Warfare 2 runs well enough on lower settings. Deus Ex: HR runs well. They have also played through several other major, newer titles with no problems at all.

    Running the highest settings is not possible in several of them, but most of the games look great and run great at Medium to High settings.

    That is on a GPU with only 80 shaders running at 675MHz on DDR3 (1GB). Most smaller laptop displays are only 720p, which is only around 20% more pixels than the 1024x768 display.
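    Since the whole argument hinges on pixel counts, here is a minimal C sketch of that comparison. "720p" on laptops can mean 1280x720 or the more common 1366x768, so both are computed; the resolutions themselves are the only inputs, nothing here is measured.

    #include <stdio.h>

    int main(void)
    {
        double base = 1024.0 * 768.0;   /* the 1024x768 panel used as the baseline */
        printf("1280x720 has %+.0f%% more pixels than 1024x768\n",
               (1280.0 * 720.0 / base - 1.0) * 100.0);
        printf("1366x768 has %+.0f%% more pixels than 1024x768\n",
               (1366.0 * 768.0 / base - 1.0) * 100.0);
        return 0;
    }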
     
  37. calc_yolatuh

    calc_yolatuh Notebook Evangelist

    Reputations:
    153
    Messages:
    309
    Likes Received:
    1
    Trophy Points:
    31
    Worst bait n switch I've seen was a Cyberpower Gamer Ultra...with gt520 "multimedia player" card. Runs a few games okay, but really neither Gamer nor Ultra....just a gimmie til some kid saves enough allowance to buy a card off the shelf at Best Buy...

    I take issue with those numbers from Anandtech. Several of the games are practically guaranteed to have been limited by the 3500M's low base clock. After a few hours with K10Stat you should have an OC in the 2.2-2.5GHz range, which will definitely improve a few of those titles. AMD drivers have improved since then; in fairness, the Intel drivers probably have too (but I have no direct exposure).

    I think Shogun 2's lowest quality setting was rewritten entirely since that review, explicitly to improve Sandy Bridge performance. Lots of goodies in 5GB of patches, y'know. However, DX11 performance is key; if the HD 4000 can manage that, it will make Shogun 2 look much better than the HD 3000 does.

    Like I said, try to draw some comparison from HT's up-to-date 6620G numbers with a fast CPU. I still need to get some stuff up, there's a storm coming so I'll have the time for it. I can provide numbers for the following games:

    Mafia 2
    Far Cry 2
    Shogun 2
    Dirt 3

    I intend to check my clocks again and profile 1.5GHz vs 2.3+, though for giggles I may sanity-check with forced low clocks.
     
  38. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Yeah, it seems that the Llano does indeed do a bit better than the HD 4000 at higher settings in the games. If there is one thing AMD knows, it is GPUs. Right now it is kinda like AMD is one generation ahead of Intel at graphics and Intel is one generation ahead in raw CPU performance. Kinda funny really :)

    I don't doubt at all that a faster CPU than the 3500M will give better results than what Anandtech got. There are like 3 (?) higher-clocked Llanos with +10W TDP. Which is why I said take the calculations with lots of salt. Many of the games are probably a bit CPU bound too. And then we have the improved drivers from AMD, which makes it highly relevant to see how much better the Llano is now compared to launch, though in all "fairness" it's a bit unfair to compare, since the HD 4000 too will improve with newer drivers. :)

    I am super excited to see the official results from the top reviewers once Intel releases the damn Ivy Bridge, and equally eager about the Trinity APU that AMD has promised us so much about. I am just hoping that they don't pull a Bulldozer on us, for those who know what that means (lots of promises but poor execution). Especially for those who suffer through the months after Ivy is released, waiting to see how Trinity will be :)

    @TheBluePill: While the desktop CPUs have two different IGP versions, the HD 2500 with 8 EUs and the HD 4000 with 16 EUs, plus a bunch of different clocks, mobile CPUs will only have the HD 4000, with minor differences in IGP clocks.
    See here: http://vr-zone.com/articles/intel-s-mobile-ivy-bridge-cpu-line-up-revealed/14148.html
    http://vr-zone.com/articles/intel-s-core-i7-3612qm-verified-as-35w-part/14229.html
     
  39. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Yeah, I'm really afraid of a Bulldozer mobile fiasco, but I doubt it will happen. I think AMD learned a hard lesson. If Intel could get their pricing much lower it'd be a more attractive option for me. But as it stands their quad-core stuff especially is horribly expensive. Same for nVidia mobile GPUs.
     
  40. calc_yolatuh

    calc_yolatuh Notebook Evangelist

    Reputations:
    153
    Messages:
    309
    Likes Received:
    1
    Trophy Points:
    31
    It would really help if these idiots documented their benchmark settings. I am definitely more than 5 FPS above Anandtech's Shogun 2 "low" iGPU result (~85+).

    Far Cry 2 has such huge variance between different benchmarks. If they did the flyby, the 6620G is way ahead of the HD 4000 even now. With a combined average the two are closer together.

    Mafia 2 I need to recheck, and the Dirt 3 presets are fiddly. I know I need to bench it on Ultra Low, but it looks so pretty and smooth on Medium.

    I continue to think that engineering-sample preview is full of s***; the two charts have sloppy English and list different quality settings with identical FPS. This gives me no confidence that they knew how to take pictures/notes and check their numbers before publishing.

    Many games list at least Max FPS and Average FPS, and the chart file should include Minimum. But it looks like they use only Max even when the Average or Minimum are better indicators of playability.
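    To make that point concrete, here is a minimal C sketch of how min/avg/max FPS fall out of per-frame render times. The frame times are made-up milliseconds, only there to show why the slowest frame (the minimum FPS) says more about playability than the peak.

    #include <stdio.h>

    int main(void)
    {
        /* Hypothetical per-frame render times in milliseconds. */
        double frame_ms[] = { 18.0, 22.5, 41.0, 19.3, 25.1, 17.8 };
        int n = sizeof frame_ms / sizeof frame_ms[0];
        double total = 0.0, worst = frame_ms[0], best = frame_ms[0];
        for (int i = 0; i < n; i++) {
            total += frame_ms[i];
            if (frame_ms[i] > worst) worst = frame_ms[i];
            if (frame_ms[i] < best)  best  = frame_ms[i];
        }
        /* Slowest frame -> minimum FPS, fastest frame -> maximum FPS. */
        printf("min %.1f / avg %.1f / max %.1f FPS\n",
               1000.0 / worst, 1000.0 * n / total, 1000.0 / best);
        return 0;
    }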
     
  41. Syndil001

    Syndil001 Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    The HD 3000 is integrated into high-spec Core i5 and i7 2nd-generation Sandy Bridge CPUs, and the newer and faster (by around 20%) HD 4000 is found in the latest high-spec Core i5 and i7 3rd-gen Ivy Bridge CPUs; you can see the detailed comparison here.
    Get the HD 4000 graphics, as the CPU will be much faster and far more power efficient too (important for laptops running on battery).