Ever since this video:
Intel Haswell integrated graphics are on par with GeForce GT 650M - TechSpot
which claims that Haswell's integrated graphics are on par with the Nvidia GeForce GT 650M,
I've been intrigued by how good this iGPU really is.
CeBIT this week has shown some Haswell prototype notebooks; perhaps we could get actual 3DMark scores?
Does anyone think it will really be on the level of the 650M... or even the 640M?
-
Karamazovmm Overthinking? Always!
-
I found a Haswell 3DMark11 benchmark (for the GT3e) at 1500.
The Nvidia GeForce 640M is around 1700,
and the 650M around 2000.
Note this Haswell result is from a desktop version. -
My guess is it will be on par with the mobile AMD Trinity 7660G, which is roughly 630M territory, and it will be restricted to 720p for bandwidth reasons. The 7660G scores about 1200 in 3DMark11, which is not indicative of actual in-game performance. With a dedicated 630M you should be able to play a game like Battlefield 3 at 720p low at 30-40 FPS; on the 7660G, the shared resources and bandwidth limitations make it dip to 15 FPS and average about 25.
Reminds me of this:
<iframe width='560' height="315" src="http://www.youtube.com/embed/35qYpRxj4XU?list=PLY9F-MGmttjT-tP2NHncCDSiOEK_v35SJ" frameborder='0' allowfullscreen></iframe> -
I think you guys are being too conservative about the computing power of this new unit; I think you will be surprised.
-
Karamazovmm Overthinking? Always!
-
It better not be as fast as my 650M. I know it'd be for the greater good of PC gaming, but for purely selfish reasons I'd rather not have my system overtaken by integrated GPUs just yet; it would hurt my pride too much.
I can't see it being as fast as the 650M anyway. For a start, the Intel drivers are far inferior to Nvidia's, and then there's the fact that it's sharing its bandwidth with the rest of the system. -
I'm saying it will be on par with, or a fraction higher than, the HD 7660G. I don't see Intel pulling ahead of AMD on this one. My HD 7660G would pull 1300-1400 all day long, so I'm not seeing this super GPU they are hyping up. Not to mention that the RAM will bottleneck the living crap out of it. So it looks like they caught up to Trinity, and good for them, it took them long enough; soon new APUs will be out again and Intel will be playing the catch-up game yet again.
They won the CPU race. However, they won't win the iGPU race... ever.
Also, don't worry, your GT 650M will still kill Haswell any day of the week. Play games on an iGPU and then on the GT 650M and you can see that iGPUs take hits way harder at higher resolutions and settings than your GT 650M ever will. Example:
WoW ran at 720p on ultra in towns and 5-mans. In 10-mans I had to turn it down a little, with shadows on low and SSAO disabled. In 25-mans I had to go all the way down to "good," and some fights would dip into the teens. In 40-mans, even with everything turned down as low as I could, I would get about 15 FPS.
Also, let's not look past the fact that Haswell has USB 3.0 issues now. -
Karamazovmm Overthinking? Always!
1) It's entirely possible that AMD is going to lose the iGPU race again. The problem is that you are thinking only of the PC market (laptop and desktop). There are several different form factors that make up your personal computing environment, and one thing is for sure: miniaturization is coming, and it is going to happen.
What I mean is that the future is about making CPUs that are just as powerful while using as little energy as possible and dissipating even less heat, in a much more cramped space than we have now. That is the real race to be won, and it's one where AMD has the lead, not with Trinity but with Jaguar. Intel, to try to avert that win, is pushing its main lineup down several levels of power consumption and dissipation.
2) One of the main reasons Trinity lacks consistent performance is the lack of dedicated RAM: DDR3 is just too slow and lacks the bandwidth we get from GDDR5. One way to overcome that problem is exactly what Intel did with the GT3e: a very wide bus with very fast embedded RAM. It still isn't enough to compete with higher-end GPUs, but from the leaks we can expect something along the lines of 64 MB to 128 MB, which should hopefully cover what we are missing here. I actually hope Intel uses more of that eDRAM in future designs.
3) What's their USB 3.0 problem? On what chipset, or what MCD?
Putting 1 and 2 together, Intel's strategy is pretty clear: factor in the typical consumption of tablets, PCs, and other devices, and you get a great need for CPU and GPU speed at low power. It's simple: they saw the need and are going after it. -
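The DDR3-versus-GDDR5 bandwidth gap behind point 2 can be put into rough numbers. A minimal Python sketch, using typical published figures (dual-channel DDR3-1600 shared with the CPU, versus 128-bit GDDR5 as on a mid-range mobile dGPU) rather than anything measured in this thread:

```python
def peak_bandwidth_gb_s(bus_width_bits, transfer_rate_mt_s):
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (transfers per second)."""
    return bus_width_bits / 8 * transfer_rate_mt_s * 1e6 / 1e9

# Dual-channel DDR3-1600: two 64-bit channels = 128-bit effective bus,
# and the iGPU shares this with the CPU.
ddr3 = peak_bandwidth_gb_s(128, 1600)

# GDDR5 at 4000 MT/s on a 128-bit bus, dedicated to the GPU.
gddr5 = peak_bandwidth_gb_s(128, 4000)

print(f"DDR3-1600 dual channel: {ddr3:.1f} GB/s")   # 25.6 GB/s
print(f"GDDR5 128-bit:          {gddr5:.1f} GB/s")  # 64.0 GB/s
```

Even before the CPU takes its share, the iGPU is working with well under half the bandwidth of a comparable dedicated card, which is exactly the gap the GT3e's eDRAM is meant to paper over.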
I sure know the feeling!
-
Karamazovmm Overthinking? Always!
-
Well, that's the entire problem. Very few thin-and-light laptops, let alone handheld devices, use AMD APUs. I'm hoping the PS4 will change all that. It's hard to even find a 14-inch laptop with a 35W AMD Trinity APU, when in the same chassis they'll fit a 35W Intel CPU plus a 45W dedicated GPU. Really? The A10-4600M should easily fit inside a 13-inch thin form factor; it runs cool and is only 35W.
AMD spanks Intel when it comes to graphics. But AMD screwed the pooch when they limited the CPU and GPU by TDP alone: even though the chips run cool, they still throttle. My A10-4600M doesn't break 70C, yet it throttles because the TDP limit kicks in. At least with Llano you could push the APU to 90C before it started to throttle. With Kaveri it sounds like they are not going to limit frequency by TDP alone (why did you change it in the first place, AMD?). -
Karamazovmm Overthinking? Always!
But yes, I know. I have been following the AMD notebooks thread and it's abysmal: the quality, the choice of materials, the choice of parts, the design. Everything on offer is either cheap stuff that will break down, or average stuff that is only cheap because it really lacks power.
AMD should have released a 45W part; it's strange that they didn't. Maybe it's a stability problem? Something is really wrong for them not to have 45W parts in their lineup.
So in the end, AMD is directing its efforts wrongly. Kaveri is still not power-efficient enough for tablets and other smaller devices, and I hope they are not following Intel on this one. Atom is a failure, so "we are going to use our main lineup on tablets to kill ARM"? Really, that surely must work. -
-
Karamazovmm Overthinking? Always!
-
My A10-4600M has never throttled, and overall I've been impressed with it.
-
I never tested it using anything less than 4 cores, but sometimes I did see it boost to 2.6GHz running 4 cores. For some odd reason it really depended on the game; I only saw it do that a few times, in World of Warcraft and nothing else I tested. But I never saw it drop under 2.4GHz unless I set it to. -
Well, then it's throttling. It should boost to 3.2GHz but it doesn't exceed 2.4GHz, same as mine. And if I OC the GPU, it can affect CPU performance due to shared resources.
-
I was under the impression that it was 3.2GHz for one thread, 2.7GHz for two threads, and 2.4GHz for four threads. So I don't really consider it throttling if it only does 2.4GHz on four threads; it's working how it was meant to work. However, I do wish it would run at 3.2GHz. -
I guess I should have said it throttles with an OC'd GPU, although in some games you still get an FPS improvement. But the CPU won't boost past 2.6GHz at best even single-threaded, and usually maxes at 2.4GHz regardless. If you can get more than 2.4GHz consistently, please let me know how you did it and I'll try to recreate it.
Usually boost sits at the higher end of the rated max even with all cores enabled: more like 3.2GHz for a single core, 3.1GHz for two cores, 2.9GHz for three or four cores. 2.4GHz is not a boost; it's a slight bump. -
Intel’s Next-gen Haswell Graphics to Enable Rendering Techniques That Other GPUs Can’t Handle « Ultrabook News and the Ultrabook Database
Posted on 31 March 2013 By Ben Lang
At GDC 2013, Intel announced that their next-gen Haswell graphics will support a new bit of tech called PixelSync, as well as DirectX 11.1. PixelSync on Haswell graphics enables two rendering techniques, AVSM and AOIT, which can cost a standard GPU up to 80% of its performance but run smoothly on Haswell's integrated graphics. The developers behind Grid 2 are using Haswell's new tech to bring graphically rich games to the masses.
According to Intel, PixelSync “allows the programmer control over the ordering operations across pixel pipes.” Ordered pixels mean predictable rendering behavior, which “allows programmers to properly composite partially transparent pixels without the need for an expensive sorting operation”. Another technique enabled by Haswell is InstantAccess, which allows the CPU to write directly to GPU memory.
Combined, PixelSync and InstantAccess allow for high-performance Adaptive Volumetric Shadow Mapping (AVSM), something that game developers have wanted for a long time, according to Intel. With AVSM, volumetric effects like smoke can be properly lit in real time without a huge drain on performance. Intel told us that attempting AVSM without their Haswell tech could cost a game 80% of its performance; with Haswell, that comes down to 5-10%. The techniques enabled by Haswell graphics are also useful for rendering hair, windows, foliage, fences, and other complex geometry.
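The ordering problem PixelSync addresses is easy to demonstrate: alpha “over” compositing is not commutative, so overlapping semi-transparent fragments produce a different final color depending on the order they are drawn, which is why GPUs normally have to sort them. A minimal Python illustration (of the math only, not Intel's API):

```python
def over(src_rgb, src_a, dst_rgb):
    """Composite one semi-transparent fragment over the current pixel color
    using the standard alpha 'over' operator: out = a*src + (1-a)*dst."""
    return tuple(src_a * s + (1 - src_a) * d for s, d in zip(src_rgb, dst_rgb))

background = (0.0, 0.0, 0.0)
red_glass  = ((1.0, 0.0, 0.0), 0.5)  # 50% opaque red fragment
blue_glass = ((0.0, 0.0, 1.0), 0.5)  # 50% opaque blue fragment

# Red drawn first, then blue on top...
a = over(*blue_glass, over(*red_glass, background))
# ...versus blue drawn first, then red on top.
b = over(*red_glass, over(*blue_glass, background))

print(a)  # (0.25, 0.0, 0.5)
print(b)  # (0.5, 0.0, 0.25)
```

The two orders give visibly different colors, so without a guaranteed fragment order (or an expensive sort) transparent overlaps render unpredictably; that guaranteed order is what PixelSync provides for AOIT and AVSM.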
Legendary racing game developer Codemasters showed us AVSM put to practical use in the upcoming Grid 2. With AVSM, smoke trails from drifting vehicles achieved much greater depth thanks to proper shading, and looked more realistic than with AVSM disabled.
Grid 2 also puts to use another technique achievable with PixelSync and InstantAccess, called Adaptive Order Independent Transparency, which can “correctly render overlapping semi-transparent objects without having to sort them before they are being rendered,” according to Codemasters.
In practical terms, this allowed the Grid 2 team to render more realistic overlapping foliage which is useful for tracks with dense tree coverage.
We saw and played Grid 2, using AVSM and AOIT, running on next-gen Haswell graphics at 30 FPS or more on medium settings at 1920×1080:
GDC 2013: Intel's Next-gen Haswell Graphics Running Grid 2 with AVSM and AOIT - YouTube
Haswell Integrated Graphics thread
Discussion in 'Gaming (Software and Graphics Cards)' started by Goren, Mar 6, 2013.