What's with all this Larrabee talk we keep hearing? What are we in for? Intel or AMD?
Well, it's hard to say.
However, we have some exciting things coming in the future. We all know Intel has been winning the race in the CPU world so far. But how long will that last? It doesn't seem too long ago that AMD's chips were the best to game on.
That has all changed in the last couple of years, and I've had a pretty good sense of where these companies are headed. I think AMD may once again become the winner with the release of Bulldozer, which everyone has high expectations for.
We don't know much about it yet, except that it's a whole new design offering impressive performance per watt and up to 16 cores. Set for release in 2011, we can only wait. One thing is for sure: if it really delivers on performance per watt and AMD lives up to the hype, this could make one hell of a mobile CPU line. Another thing to keep an eye out for is "Deneb," an AMD CPU meant to counter the i7; it's supposed to make an appearance in the near future.
For the last year or so we have been hearing small blips about Larrabee, Intel's first dedicated line of GPUs. It was a shock at first when the whole thing seemed like a giant CPU: Intel's take on the GPU is to use raw processing power and lots of cores. But don't get lost, it's not an i7 crammed into a PCI Express slot; it is much different. Larrabee's x86 cores use a much simpler design, and each core also contains a 512-bit vector processing unit able to process 16 single-precision floating-point numbers at a time, four times the width of the 128-bit SSE units on normal x86 processors.
Larrabee will also include very little specialized graphics hardware. Instead of relying on fixed-function rendering hardware, Larrabee's renderer is implemented in software, so it can be easily modified.
I don't think Larrabee will wipe the battlefield clean, but I do think it will be a new approach: having that much general processing power in a GPU makes it ideal for computation and physics, as well as movie encoding and more. Larrabee might just end up bee-ing (lol) what Ageia was for GeForce, except on a far larger scale. Larrabee's gaming performance is unknown to a degree, but it seems able to scale well and run a game at a solid framerate, as you can see below.
[Image missing: chart of Larrabee game-performance scaling]
It seems to take more of a "how much do I need to run this?" approach rather than just saying "I'M GONNA GIVE 100%! ERRRRG!"
Larrabee, to me, may just end up being a different approach to GPUs. ATI focuses on fast memory (GDDR5) and tons of stream processors, while Nvidia holds back on stream processors and memory but pushes clock speeds higher. As for Intel? Maybe their thing will be to bust out raw processing power and forget everything else. I just wonder how Larrabee will do in texture-intensive games.
Ever since shaders started becoming a big part of gaming, we knew the future was slowly going to change. Around Pixel Shader 2.0 I started to realize that all these neat effects performed on the GPU offer a huge amount of flexibility going forward. Today, shaders are one of the main parts of a game. Just look at Crysis: turn shaders to Low and everything else to Enthusiast, and it won't look nearly as nice as shaders on Very High with everything else on Low. Shaders are calculations performed on the GPU, which is another reason Larrabee may end up putting up a strong fight against Nvidia and ATI.
The future is headed back to software. 3D rendering is here to stay, but it will only be a base. We can expect rasterized 3D to make up the world, but when it comes down to it, shaders, voxels, and post-processing will be doing most of the detailed work.
Games are like magic: a bunch of tricks. When it comes down to it, it's just numbers and calculations. No living world, no actual living city. It's all an act around you.
As for consoles, I doubt the PS4 will stick with a disc format. Developers have hinted at the PS4 using downloadable games and content, though I think they will still sell games in stores. USB 3.0 is coming, and boy do I like USB.
Let me know your thoughts: will Intel become a monopoly? Will Larrabee take the cake? Is the gaming industry becoming too advanced for its own good?
-
As long as they come up with very, very cheap video cards that can rock the hell out of any game, I do not mind.
But in my opinion, GPU technology is starting to show its limits. Before, every new generation of video cards used to almost double the performance: 6800 Go vs 7900 Go vs 8800M and so on... it was always a massive increase. But then, 8800M to 9800M, and 9800M to 280M: instead of a 70% performance increase we got 20%, or even less.
I think this is also one of the main reasons there has now been a rush to shrink process nodes (the nm race). The technology is reaching its limits and that... is worrying. -
-
On AMD's side, their 4xxx generation IS a huge leap from the last, and in a few months DX11 cards will be released with another new architecture, which should bring another big leap in performance. -
-
And everybody wrote ATI off... well, guess what: they're back with a vengeance.
-
What I think we need to see is a multi-core GPU... not dual GPUs on one PCB, but dual core. Maybe that would eliminate SLI scaling issues too: two cores producing frames together as a whole, rather than SLI rendering them separately. What developers also need is another language to program in; C++ is too hard for coding games. I've done it here at the University of Michigan and it's HORRIBLE. Wasn't MS working on a new language for game developers?
-
Isn't 16 cores the most anything can use right now anyway?
-
Red_Dragon Notebook Nobel Laureate
Would have liked to see Crysis on that chart
-
Two words:
Neural Implants -
As for GPUs, they are all heading in their own directions, but we're at a difficult point in the CG world. Games are slowly trying to make this drastic change, and the future is undetermined. But we developers have an idea where it's going, and it's not the easiest thing to accomplish. We've reached a point where it's like, what's next? And we're rushing to pump out new ideas. Carmack is looking into some crazy things with voxels, something we will start to see in games again. I'm not talking about Worms or 2D platformers; voxels have amazing capabilities and performance pluses. They self-occlude, and they offer a fantastic level of detail and real-time editability. Crysis uses voxels for some of its terrain features, but that's nothing compared to where things may be headed. That's why you're hearing about Larrabee not only performing traditional methods such as rasterization, but also newer, more programmable pipelines and renderers that give developers more freedom in what they want to do. -
Tbh, I hope Larrabee isn't as good as it makes itself out to bee (lol), or at least that Nvidia/ATI can match or exceed what it can do, because the thing I would hate most is for computer hardware to become monopolized. -
Sword and Scales Notebook Consultant
Related. -
What we are headed for is more unoptimised coding and higher recommended specifications.
-
Future Of Gaming - What are we headed for?
Discussion in 'Gaming (Software and Graphics Cards)' started by MonkeyMhz, Mar 27, 2009.