Technology@Intel An Update On Our Graphics-related Programs
-
Too bad. I was really hoping Intel would lead the way on alternative rendering methods like ray tracing and voxel-based 3D for games. I'd read some interesting blog posts from Carmack and was thinking Intel's new graphics hardware might make his wishes for id Tech 6 a reality. I was also hoping Intel's acquisition of Havok was an indication of some exciting Intel graphics solutions to come.
-
Unfortunately, it means Intel will go back to "concentrating" on their integrated graphics, which is no longer the only option considering what Nvidia and ATI have to offer.
-
They made the right call -- it's too late to get into the discrete GPU market. Intel's on-package IGPs are about halfway to being good enough, and AMD's Fusion will almost certainly be a lot better. With any luck, the era of discrete graphics cards will finally end in the next few years.
-
Intel IGPs aren't half bad; they do what they were designed for very well, imo.
However, I like having discrete graphics cards: you can replace them and swap them between machines. Think about it: when an IGP goes out of date (things in the gaming world go out of date VERY quickly), you have to buy a whole new mobo. If you're a gamer with a decent mobo like an Asus Rampage, that means spending an extra £200 each time you need to upgrade.
-
mobius1aic Notebook Deity NBR Reviewer
Intel's latest HD Graphics, integrated onto i3 and i5 packages, are pretty decent, and with proper drivers they could match a Radeon 3200.
-
That said, even with an IGP potentially that strong, I will always want a discrete GPU to go with it... running in Hybrid CrossFire.
-
thinkpad knows best Notebook Deity
-
Uh, yeah. Discrete will never go away, period. It's nice to have decent-performing integrated GPUs, but discrete will always be over ten times the performance.
-
If the integrated variety becomes good enough, the discrete parts will fall by the wayside. They'll still be purchased by the "60 FPS in Crysis III or bust" crowd, but that market isn't large enough for a company like Intel to get into. The way the gaming industry currently stands, they just have to match the consoles, and that's not a very high standard.
-
thinkpad knows best Notebook Deity
I think that "era" must have been an illusion; even then the market share was about 70-90% Intel, with the rest split among the major dedicated card companies. The consoles are not a high standard to beat at all; they're getting terribly outdated, with those crappy 7900 GTX-based chips in them, relying on raw processing power to make up for it.
-
Honestly, I couldn't care less what the graphics card market does, provided the cards still perform well and I can yank them out of one computer and put them in another; that's all that really matters to me.
-
mobius1aic Notebook Deity NBR Reviewer
As nice as the whole GPU-on-CPU-die package sounds, there's no telling how they would be sold. The combinations of low- to high-end CPUs with GPUs could be endless, though it would be nice for DAAMIT to settle on a single GPU design until the next CPU generation. A quad core plus a ~400 SP GPU would be a very good starting point: relatively good graphics performance with a great CPU could provide very good video/audio decode, or GPGPU physics while a discrete board handles graphics output. There is a lot of potential, but it comes with various needs: chip yields, power requirements, and, I think most importantly, memory bandwidth. Current dual-channel DDR3-1333 provides 21 GB/s, and that would not be enough to get the most out of a quad core plus a 400 SP ATi GPU part. I think 40 GB/s would be the bare minimum I'd want. A 400 SP part like the desktop 5570 or the laptop 5650 gets around 25 GB/s of memory bandwidth on its 128-bit GDDR3 or DDR3 bus, which is performance-limiting in many cases. The more bandwidth the better.
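The bandwidth figures in the post above follow from simple arithmetic: peak theoretical bandwidth is the effective transfer rate times the bus width in bytes times the channel count. A minimal sketch (the 1600 MT/s rate assumed for the 128-bit card is an illustrative guess, not a quoted spec):

```python
def peak_bandwidth_gb_s(mt_per_s, bus_width_bits=64, channels=1):
    """Peak theoretical memory bandwidth in GB/s.

    mt_per_s: effective transfer rate in megatransfers per second.
    bus_width_bits: width of one channel (DDR3 channels are 64-bit).
    channels: number of channels running in parallel.
    """
    return mt_per_s * 1e6 * (bus_width_bits / 8) * channels / 1e9

# Dual-channel DDR3-1333: two 64-bit channels at 1333 MT/s.
print(peak_bandwidth_gb_s(1333, channels=2))          # ~21.3 GB/s

# A 128-bit graphics card bus, assuming ~1600 MT/s effective DDR3/GDDR3.
print(peak_bandwidth_gb_s(1600, bus_width_bits=128))  # 25.6 GB/s
```

The first result matches the 21 GB/s figure quoted for dual-channel DDR3-1333, and the second lands near the ~25 GB/s cited for 128-bit cards like the 5570/5650.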
-
That isn't surprising... it's been looking like a failure ever since I heard of it. Good thing Intel killed it instead of wasting more money. Anyway, Intel GPUs have always been crap.
-
Uh, Larrabee was supposed to compete with CUDA and Stream/OpenCL or whatever they have -- pretty much GPGPU stuff. Not sure whether this is a good move for Intel; maybe they've perfected parallelism on the CPU front. But they'd better get into massively parallel computing some way, somehow, or they'll lose server market share to DAAMIT or Nvidia in the not-too-distant future.
Intel ends its plans for Larrabee as a discrete GPU
Discussion in 'Gaming (Software and Graphics Cards)' started by Phinagle, May 26, 2010.