Here are some goodies for those of us waiting for Ivy Bridge. Reviewers have tested the i5 3570K (Ivy Bridge) against the i5 2500K (Sandy Bridge). Both setups use only the IGP with no dGPU, and both CPUs are clocked at the same 3.3GHz frequency.
Impressive?
Source: http://en.expreview.com/2012/02/19/ivy-bridge-core-i5-3570k-engineering-sample-test/21214.html
-
Karamazovmm Overthinking? Always!
love it! Now I'm just going to wait for the new MBAs to come out
-
That's quite good.
-
Haswell will double that!
-
TheBluePill Notebook Nobel Laureate
Not bad... It's like going from "Worst" to just "Bad".
-
So what Nvidia/AMD GPU does this ballpark around, then? Also, the Starcraft II and Far Cry scores are still a little off-putting. A two-year-old game simply becoming "playable" isn't something to go crazy over. -
Great info, thanks +1!
Assuming that's the desktop HD 4000, mobile might have slightly reduced performance? -
TheBluePill Notebook Nobel Laureate
Intel's Larrabee platform was supposed to be the great equalizer... but it went the way of the dodo and never saw the light of day. -
Yeah, even the A8's 6620G is a good performer, but it still leaves a lot to be desired if you want to run at any amount of detail or at higher than 720p resolution. I think the AMD Trinity APUs will improve on that and be more of a true low-to-mid-end GPU. This HD 4000 is barely better than the mobile 6620G in the current-gen Llanos, which the Trinity GPU should trump by a solid 40-50%. Not to mention driver support: Intel just doesn't give the IGP the attention it deserves, and they should.
-
OK, so in reality Ivy Bridge is only useful for those who actually use the integrated GPU (aka HD 3000/4000, etc.), correct? What does Ivy Bridge offer to those with high-end graphics solutions other than a die-size and power reduction? And from what I understand, these new CPUs will be backwards compatible with all Sandy Bridge laptops via a FW update, correct?
-
Star Forge Quaggan's Creed Redux!
Not shabby. If Intel can work on drivers and continue to improve their iGPU solutions, it will be no time before nVidia and ATI are no longer needed. A true performance CPU/GPU AIO solution for everything? I might be interested in that.
For now, I will still hold on to dGPU solutions for the drivers and higher-end performance, but Intel might be up there in a matter of years at this rate. -
-
Don't get your hopes up. The desktop HD 4000 being only marginally faster than AMD's mobile Llano 6620G doesn't impress me much, and Intel's history of driver updates is lackluster.
-
I don't entirely buy those numbers. Was SF IV run on Low or High? (It changes between the two sheets, but the FPS are the same.)
The synthetic scores are great, but I think the actual game scores are worse than the 6620G's. There were no screenshots, which are important because the HD 3000 had plenty of graphical problems when tested before.
The page claims most Ivy Bridge parts will have a lesser HD 2500, but there are no clear numbers yet. They also did not supply any details about system wattage.
The thing to look for here is the improved transcoder, which should be common to all the new chips. Possibly Intel has improved quality, speed, and/or color reproduction, the last being most important to me.
I really hope the same guys benchmark against an AMD desktop A6 or mobile A8, same games and settings. -
Actually, they should benchmark against Trinity, since the two should be released in about the same time frame. I'll have to bench mine against what they did and compare the results with the 6620G. Problem is, I don't own DiRT 3, and I'm not sure how they tested Starcraft 2 since there's no standard benchmark.
-
That is why I take issue with their numbers: they're kinda scattershot and not properly documented. They also supposedly tested with 1333 RAM, though going from that speed to 1600 or faster should not bring much improvement.
If AMD delivers, even the new desktop A6 should be faster by some margin, and the A10 should flatten the 3750K by a large margin. I'm most curious about AMD's claims that the 17W package will have a 50-60% faster GPU than IB ULV, and the 25W one 160%+ faster. We're just nervously counting down to a major comparison. -
Dammit, 3570K. Mixed up my prime numbers.
If we're lucky, there will be a large head-to-head when the wall comes down. HT, your numbers have been nice because we can see the game settings that achieved them. Thanks again.
The Dirt 3 number has me suspicious. The game defaulted to "optimal" for me: 1024x768 with settings all over the place, including MSAA and post-FX turned on. I upped it to 1366x768 and left the rest alone, and it gave me 45 avg / 35 min fps with 6750M/6620G CrossFire and all clocks at stock (still Catalyst 11.11c). Visually smooth with no jitter. If I force everything to Medium and disable MSAA, it should run much faster. Will try it.
The Far Cry 2 benchmark at stock settings (everything on High, using DirectX 10), same CrossFire and stock clocks: my FPS ran as high as 120 during the flyby and more like 30-something in the combat demo.
I will re-run and post a few numbers in HT's 6620G thread with appropriate screenshots. -
Also, I don't think it would make a lot of sense to compare it to a 6620G, as the CPUs in that comparison would be vastly different, skewing the results. -
-
Mechanized Menace Lost in the MYST
Intel should just buy NVidia.
-
This is good news for when I buy a laptop for my wife, who does light gaming, but I'm curious to see if the results can be duplicated. I remember the Sandy Bridge IGP was supposed to be awesome, but that hasn't really been the case.
-
If your wife needs good battery life, a modest CPU, and light gaming, just get an AMD laptop. Cheap, and it's not very difficult to squeeze more out of it. However, if she needs a super CPU or continually uploads to YouTube, a good i7 may be the better choice.
Look at the links in HTWingnut's sig. The 6620G is AMD's integrated graphics in the A8-series mobile APUs. It will play Dirt 3, Skyrim, etc. -
Yes HTWingNut, the IGPs inside the desktop CPUs are clocked a little higher, so the performance is a bit higher. However, the percentage boost in IGP performance for desktop CPUs should be the same as for the notebook IGPs.
Oh well, since nobody else bothers, I might as well just do it myself
The average FPS increase with the HD 4000 compared to the HD 3000 is +56%. Mind you, these tests are from a 2820QM, and some of these games are probably a little CPU bound, which means the calculations are a bit off. A 3720QM and up should score better in these games. And most importantly, the scores from the first post in this thread are from an engineering sample. So take the following comparison with an enormous bucket of salt. Just me being a little bit bored
HD 4000 scores 54.8FPS. 13.9% better than the Llano
HD 4000 scores 38.5FPS. 3.2% better than the Llano
HD 4000 scores 27.0FPS. Llano scores 5.9% better
HD 4000 scores 12.3FPS. Llano scores 33.3% better
HD 4000 scores 79.6FPS. 16.9% better than the Llano
HD 4000 scores 38.0FPS. Llano scores 13.7% better
HD 4000 scores 82.4FPS. 23.0% better than the Llano
HD 4000 scores 60.2FPS. 23.9% better than the Llano
HD 4000 scores 27.9FPS. Llano scores 22.6% better
HD 4000 scores 24.5FPS. Llano scores 24.1% better
HD 4000 scores 28.5FPS. Same performance as the Llano
HD 4000 scores 24.3FPS. Llano scores 10.7% better
HD 4000 scores 103.2FPS. 30.7% better than the Llano
HD 4000 scores 27.6FPS. Llano scores 17.1% better
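If anyone wants to check my math or redo it with their own numbers, this is all the "X% better" lines above boil down to (a quick Python sketch; the 48.1 Llano figure passed in is just back-derived from the first entry for illustration, not real review data):

    # Sketch of the arithmetic behind the comparison lines above.
    def verdict(hd4000_fps, llano_fps):
        """Compare two average FPS numbers and phrase the result like the list above."""
        if hd4000_fps >= llano_fps:
            gain = (hd4000_fps / llano_fps - 1) * 100
            return f"HD 4000 scores {hd4000_fps}FPS. {gain:.1f}% better than the Llano"
        gain = (llano_fps / hd4000_fps - 1) * 100
        return f"HD 4000 scores {hd4000_fps}FPS. Llano scores {gain:.1f}% better"

    print(verdict(54.8, 48.1))  # matches the first entry: 13.9% better than the Llano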
AnandTech - The AMD Llano Notebook Review: Competing in the Mobile Market -
TheBluePill Notebook Nobel Laureate
Now, the HD 4000: are there several variants of it, or a standard version across all platforms and CPUs? -
If Intel can work on their drivers, though, then it's quite possible they could squeeze out more performance... unless, of course, they already did.
-
Thanks Cloudfire for doing that, +1! (wups, I'm outta rep atm)
I didn't even see that article in my searches. My googling needs some tuning up.
The trend there is that the Llano does better with added detail. The HD 4000 is definitely a leap ahead of the HD 3000, although Intel needs to stay on the ball with drivers. Not to mention the Trinity GPU supposedly has a 50%+ GPU performance improvement at 1366x768, although I don't have anything to back that up. I hope AMD releases review samples sometime soon. -
Karamazovmm Overthinking? Always!
The latest Intel drivers gave me some much-needed improvements in fps and stability.
I had to use older drivers to run ME and ME2 (Mass Effect) without locking up. -
I like this trend. Hopefully one day discrete GPUs will be limited to high-end/enthusiast gaming rigs, and on-die graphics will eat up the midrange cards just as they (almost) have with entry-level GPUs!
-
They've been saying that IGPs will replace mid-range GPUs for a while. The Llano is the closest an IGP has come, but it's still not quite there even for the low end. Again, driver support is critical; Intel will never get there until they put more effort into regular driver releases. What they need to do is add even a small amount of dedicated GDDR5, and then it might have a fighting chance. Even 256MB of GDDR5 as a "buffer" in front of the system RAM would help things immensely. There have been rumors that AMD will do that, but still nothing solid anywhere. Trinity has been so hush-hush it's getting annoying. I hope they're just waiting to release a phenomenal product instead of trying to hide issues.
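To put rough numbers on why that buffer would matter (all figures here are ballpark assumptions for illustration, not vendor specs):

    # Back-of-the-envelope memory bandwidth math for the GDDR5 "buffer" idea.
    # Peak bandwidth (GB/s) = transfers per second * bytes per transfer.
    def bandwidth_gbps(transfers_mt_s, bus_width_bits):
        return transfers_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

    shared_ddr3 = bandwidth_gbps(1333, 128)  # dual-channel DDR3-1333, shared with the CPU
    gddr5_side = bandwidth_gbps(4000, 64)    # hypothetical 256MB GDDR5 sideport at 4GT/s
    print(f"Shared DDR3: {shared_ddr3:.1f}GB/s, GDDR5 buffer: {gddr5_side:.1f}GB/s")

Even a narrow 64-bit GDDR5 sideport would add roughly 32GB/s of dedicated bandwidth on top of the ~21GB/s the IGP currently has to share with the CPU.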
-
Still not meant for gaming (obviously).
Sadly, some people are being defrauded and told that this is a gaming GPU.
My friend went to buy a laptop last week, and the seller told him that this Intel HD 3000 is a 1.6GB graphics card that can play any game easily.
People need to know that a graphics card's performance doesn't depend on the memory amount at all! -
To be fair, though, the current crop of truly on-die GPUs have only been around for two years or so. I don't think Intel was really trying to compete with dGPU solutions with GMA and Extreme Graphics; those were targeted at the equivalent nForce and ATI integrated northbridges. They've come a decent way in the two years since Clarkdale/Arrandale introduced Intel HD. And they seem like the company that would take a slow-and-steady approach to chip away (no pun intended) at entry-level/midrange dGPU turf rather than spearhead it. But they've already negated the need for entry-level cards: why get a GT 510/520 for the desktop, or a laptop with a 520M, when there's an HD 3000?
-
Right. And you CAN game with these, as clearly shown; I just wouldn't make one your dedicated gaming machine. For an occasional romp in Skyrim or playing classic titles, it will work wonders. Once they're able to make any new release playable, i.e. 720p at over 30FPS, then they'll have made great strides.
In any case, I'm impressed with the improvement over the HD 3000. And the HD 3000 was a great improvement over the 4500MHD that was so prevalent before it. Kudos to Intel. Hope they can keep the momentum going. -
You also have Intel's weird decisions, like not supporting OpenCL on the HD 3000, and only supporting OpenGL 3.0 (even though Apple has been able to get the HD 3000 to support 3.2). -
TheBluePill Notebook Nobel Laureate
That thing is a beast for the games they play. They just finished Dragon Age: Origins at a solid 30fps frame rate with decent eye candy. Dragon Age II runs equally well. Modern Warfare 2 runs well enough on lower settings. Deus Ex: HR runs well. They have also played through several other major, newer titles with no problems at all.
Running the highest settings is not possible in several of them, but most of the games look great and run great at Medium to High settings.
That is on a GPU with only 80 shaders running at 675MHz on DDR3 (1GB). Most smaller laptop displays are only 720p, which is only around 17% more pixels than a 1024x768 display (quick math below). -
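The pixel math, for anyone who wants to check it:

    # Pixel-count arithmetic for the resolution claim above.
    hd_720p = 1280 * 720   # 921,600 pixels
    xga = 1024 * 768       # 786,432 pixels
    print(f"720p has {(hd_720p / xga - 1) * 100:.0f}% more pixels than 1024x768")  # ~17%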
The worst bait-and-switch I've seen was a CyberPower Gamer Ultra... with a GT 520 "multimedia player" card. Runs a few games okay, but really neither Gamer nor Ultra... just a gimme until some kid saves enough allowance to buy a card off the shelf at Best Buy...
I take issue with those numbers from Anandtech. Several of the games are practically guaranteed to have been limited by the 3500M's low base clock. After a few hours with K10Stat you should have an OC in the 2.2-2.5GHz range, which will definitely improve a few of those titles. AMD drivers have improved since then; in fairness, the Intel drivers probably have too (but I have no direct exposure).
I think Shogun 2's lowest quality setting was rewritten entirely since that review, explicitly to improve Sandy Bridge performance. Lots of goodies in 5GB of patches, y'know. However, DX11 performance is key; if the HD 4000 can manage that, it will make Shogun 2 look much better than the HD 3000 does.
Like I said, try to draw some comparison from HT's up-to-date 6620G numbers with a fast CPU. I still need to get some stuff up; there's a storm coming, so I'll have the time for it. I can provide numbers for the following games:
Mafia 2
Far Cry 2
Shogun 2
Dirt 3
I intend to check my clocks again and profile 1.5GHz vs 2.3GHz+, though for giggles I may sanity-check with forced low clocks. -
Yeah, it seems the Llano does indeed do a bit better than the HD 4000 at higher settings in these games. If there is one thing AMD knows, it is GPUs. Right now it is kinda like AMD is one generation ahead of Intel at graphics and Intel is one generation ahead in raw CPU performance. Kinda funny, really
I don't doubt at all that a faster CPU than the 3500M will give better results than what Anandtech got. There are like 3 (?) higher-clocked Llanos with +10W TDP. Which is why I said to take the calculations with lots of salt. Many of the games are probably a bit CPU bound too. And then we have the improved drivers from AMD, which makes it highly relevant to see how much better the Llano is now compared to launch, but in all "fairness" a bit unfair to compare, since the HD 4000 will improve with newer drivers too.
I am super excited to see the official results from the top reviewers once Intel releases the damn Ivy Bridge, and equally eager about the Trinity APU that AMD has promised us so much about. I am just hoping they don't pull a Bulldozer on us, for those who know what that was (lots of promises but poor execution). Especially for those who suffer through the months after Ivy is released, waiting to see how Trinity will turn out
@TheBluePill: While the desktop parts have several different IGP versions, the HD 2500 with 8 EUs and the HD 4000 with 16 EUs, plus a bunch of different clocks, mobile CPUs will only have the HD 4000, with minor differences in IGP clocks
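Some napkin math on what those EU counts mean (the clocks here are assumed for illustration; real turbo behavior varies by SKU):

    # Crude relative-throughput estimate: EU count * clock, ignoring everything else.
    def raw_throughput(eus, clock_mhz):
        return eus * clock_mhz

    hd2500 = raw_throughput(8, 1150)   # desktop HD 2500, assumed max turbo clock
    hd4000 = raw_throughput(16, 1150)  # desktop HD 4000, assumed max turbo clock
    print(f"HD 4000 vs HD 2500 on paper: {hd4000 / hd2500:.1f}x")  # ~2x, before drivers etc.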
See here: http://vr-zone.com/articles/intel-s-mobile-ivy-bridge-cpu-line-up-revealed/14148.html
http://vr-zone.com/articles/intel-s-core-i7-3612qm-verified-as-35w-part/14229.html -
Yeah, I'm really afraid of a mobile Bulldozer fiasco, but I doubt it; I think AMD learned a hard lesson. If Intel could get their pricing much lower, it'd be a more attractive option for me. But as it stands, their quad-core stuff especially is horribly expensive. Same for nVidia mobile GPUs.
-
It would really help if these idiots documented their benchmark settings. I am definitely more than 5fps above Anandtech's Shogun 2 "low" iGPU result (~85+).
Far Cry 2 has huge variance between its different benchmarks. If they ran the flyby, the 6620G is way ahead of the HD 4000 even now. With a combined average, the two are closer together.
Mafia 2 I need to recheck, and the Dirt 3 presets are fiddly. I know I need to bench it on Ultra Low, but it looks so pretty and smooth on Medium.
I am still inclined to think that engineering-sample preview is full of s***: the two charts have sloppy English and list different quality settings with identical FPS. This gives me no confidence that they knew how to take pictures/notes and check their numbers before publishing.
Many games list at least Max FPS and Average FPS, and the benchmark's log file should include the Minimum. But it looks like they used only Max, even though Average and Minimum are better indicators of playability (quick sketch below). -
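Pulling all three numbers out of a frame-time log is trivial; a sketch, with placeholder data:

    # Turn a list of per-frame times (ms) into Min/Avg/Max FPS.
    # Average and Minimum say far more about playability than Max does.
    frame_times_ms = [16.7, 18.2, 33.4, 17.0, 25.1]  # placeholder log data

    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    min_fps = 1000.0 / max(frame_times_ms)  # slowest frame -> minimum FPS
    max_fps = 1000.0 / min(frame_times_ms)  # fastest frame -> maximum FPS
    print(f"Min {min_fps:.1f} / Avg {avg_fps:.1f} / Max {max_fps:.1f} FPS")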
The HD 3000 is integrated into high-spec Core i5 and i7 2nd-generation Sandy Bridge CPUs, and the newer and faster (by around 20%) HD 4000 is found in the latest high-spec Core i5 and i7 3rd-gen Ivy Bridge CPUs; you can see the detailed comparison here.
Get the HD 4000 graphics, as the CPU will be much faster and far more power efficient too (important for laptops running on battery).