The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Future Of Gaming - What are we headed for?

    Discussion in 'Gaming (Software and Graphics Cards)' started by MonkeyMhz, Mar 27, 2009.

  1. MonkeyMhz

    MonkeyMhz Notebook Evangelist

    Reputations:
    68
    Messages:
    303
    Likes Received:
    0
    Trophy Points:
    30
    What's with all this Larrabee we hear of? What are we in for? Intel or AMD?

    Well, it's hard to say.

    However, we have some exciting things coming in the future. We all know that Intel has been winning the race so far in the CPU world. But how long will that last? It doesn't seem that long ago that AMD chips were the best to game on.

    But that has all changed in the last couple of years, and I've had a pretty good sense of where these companies are headed. I think AMD may once again become the winner with the release of Bulldozer, which everyone has high expectations for.

    We don't know much about it yet, except that it's a whole new design offering amazing performance per watt and up to 16 cores. It's set for release in 2011, so all we can do is wait. One thing is for sure: if it does offer amazing performance per watt and AMD lives up to the hype, it could make one hell of a mobile CPU line. Another thing to keep an eye out for is "Deneb", an AMD CPU meant to counter the i7; it's supposed to make an appearance in the near future.

    For the last year or so we have been hearing small blips about Larrabee, Intel's first dedicated line of GPUs. It was a shock at first, because the whole thing just seemed like a giant CPU. Intel's take on the GPU is to use pure processing power and lots of cores. But don't get lost: it's not an i7 crammed into the PCI Express slot; it is much different. Larrabee's x86 cores use a much simpler design, and each core also contains a 512-bit vector processing unit that can process 16 single-precision floating-point numbers at a time, 4x as many as the SSE units on normal x86 processors.
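    To put that width in perspective, here's a rough C++ sketch of what "16 floats per instruction" means. This is just me illustrating the idea in plain C++, not real Larrabee code (the function names are made up), and the inner 16-lane loop stands in for what the hardware would finish in a single vector operation:

    // Plain C++ sketch of the SIMD width difference. Not real Larrabee code.
    // SSE registers are 128 bits wide:  128 / 32 = 4 floats per instruction.
    // Larrabee's vector unit is 512 bits: 512 / 32 = 16 floats per instruction.
    #include <cstddef>

    // Scalar version: one multiply-add per loop iteration.
    void scale_add_scalar(float* out, const float* a, const float* b,
                          float k, std::size_t n) {
        for (std::size_t i = 0; i < n; ++i)
            out[i] = a[i] * k + b[i];
    }

    // What a 16-wide vector unit effectively does: the inner loop below is
    // what the hardware would complete in a single instruction.
    // (Leftover elements past the last multiple of 16 would be handled separately.)
    void scale_add_16wide(float* out, const float* a, const float* b,
                          float k, std::size_t n) {
        for (std::size_t i = 0; i + 16 <= n; i += 16)
            for (std::size_t lane = 0; lane < 16; ++lane)
                out[i + lane] = a[i + lane] * k + b[i + lane];
    }

    Same math either way, but the second version chews through 16 values per step, which is where the "pure processing power" angle comes from.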

    Larrabee will also include very little specialized graphics hardware. Instead of being locked into a traditional hardware rendering approach, Larrabee's renderer is implemented in software, so it can be easily modified.

    I don't think Larrabee will wipe the battlefield, but I do think it will be a new approach. Having that much processing power in a GPU makes it ideal for general calculations and physics, as well as movie encoding and more. Larrabee might just end up bee-ing (lol) what Ageia was for GeForce, except on a far larger scale. Larrabee's gaming performance is unknown to a degree, but it seems to scale well and run a game at a solid fps, as you can see below.

    [Image no longer available: chart showing game frame rates scaling with Larrabee core count]

    It seems to take more of a "how much do I need to run this?" approach rather than just saying "I'M GONNA GIVE 100% ERRRRG!".

    To me, Larrabee may just end up being a different approach to GPUs. ATI focuses on fast memory (GDDR5) and tons of stream processors, while Nvidia holds back on stream processors and memory but pushes higher clock speeds. As for Intel? Maybe their thing will be to bust out raw processing power and forget everything else. I just wonder how Larrabee will do in texture-intensive games.

    Ever since shaders started becoming a big part of gaming, we knew the future was slowly going to change. Around Pixel Shader 2.0 I started to realize that all these neat effects performed on the GPU offer a huge amount of flexibility going forward. Today, shaders are one of the main parts of a game. Just look at Crysis: turn shaders to Low and everything else to Enthusiast, and it won't look nearly as nice as shaders on Very High with everything else on Low. Shaders are just calculations performed on the GPU, which is another reason why Larrabee may end up putting up a strong fight against Nvidia and ATI.
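    When I say shaders are just calculations, I mean something like this: a tiny software version of a diffuse-lighting "pixel shader" in plain C++. A real game would write it in HLSL or GLSL and the GPU would run it for millions of pixels in parallel; the names here are just made up for the example:

    #include <algorithm>
    #include <cstddef>
    #include <vector>

    struct Vec3 { float x, y, z; };

    static float dot(const Vec3& a, const Vec3& b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // The per-pixel calculation: a simple N.L diffuse term times the surface color.
    Vec3 diffuse_shader(const Vec3& albedo, const Vec3& normal, const Vec3& light_dir) {
        float ndotl = std::max(0.0f, dot(normal, light_dir));
        Vec3 shaded = { albedo.x * ndotl, albedo.y * ndotl, albedo.z * ndotl };
        return shaded;
    }

    // The "GPU" part is just running that little function over every pixel.
    void shade_buffer(std::vector<Vec3>& pixels, const std::vector<Vec3>& normals,
                      const Vec3& albedo, const Vec3& light_dir) {
        for (std::size_t i = 0; i < pixels.size(); ++i)
            pixels[i] = diffuse_shader(albedo, normals[i], light_dir);
    }

    Crank the math inside diffuse_shader up to parallax mapping, soft shadows, and so on, and you get the Very High vs Low difference I'm talking about. The hardware doesn't change; only the calculation does.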

    The future is headed back to software. 3D rendering is here to stay, but it will only be a base. We can still expect 3D rendering to make up the world, but when it comes down to it, shaders, voxels, and post-processing will be doing most of the detailed work.

    Games are like magic. It's a bunch of tricks. When it comes down to it, it's just numbers and calculations. No living world. No actual living city. It's all an act around you.

    As for consoles, I doubt the PS4 will rely on the disc format. Developers have hinted at the PS4 being built around downloadable games/content, though I think they will still sell games in stores. USB 3.0 is coming, and boy do I like USB.

    Let me know your thoughts: will Intel become a monopoly? Will Larrabee take the cake? Is the gaming industry becoming too advanced for its own good?
     
  2. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    As long as they come up with very, very cheap video cards that can rock the hell out of any game, I don't mind :).

    But in my opinion, GPU technology is starting to show its limits. Every new generation of video cards used to nearly double performance: 6800 Go vs 7900 Go vs 8800M and so on. It was always a massive increase, but then, going from the 8800M to the 9800M, and from the 9800M to the 280M, instead of a 70% performance increase we got 20%, or even less.

    I think this is also one of the main reasons there has now been a sort of rush to shrink the process node (the nm). The technology is reaching its limits, and that... is worrying.
     
  3. CA36GTP

    CA36GTP Notebook Evangelist

    Reputations:
    15
    Messages:
    454
    Likes Received:
    0
    Trophy Points:
    30
    Athlon 2500+ overclocked to 3.2GHz FTW!
     
  4. unknown555525

    unknown555525 rawr

    Reputations:
    451
    Messages:
    1,630
    Likes Received:
    0
    Trophy Points:
    55
    This has nothing to do with a limit in technology. It's because nVidia is being cheap: instead of spending more money on R&D, they keep recycling the previous generation of cores on a new fabrication process so they can raise the clocks an extra 10-20%, and that's why we only get minimal increases from them. They can do this because they had no competition from AMD, and everyone was still buying.

    On AMD's side, their 4xxx generation IS a huge leap from the last, and in a few months DX11 cards will be released with another new architecture, which should bring another huge leap in performance.
     
  5. avanish11

    avanish11 Panda! ^_^

    Reputations:
    956
    Messages:
    896
    Likes Received:
    0
    Trophy Points:
    30
    Exactly. It is pure greed on Nvidia's part. People have been content with having the same tech rehashed and made slightly faster for the past 2 years. Not anymore. I think that AMD's 4xxx series is a huge step in the right direction.
     
  6. The_Moo™

    The_Moo™ Here we go again.....

    Reputations:
    3,973
    Messages:
    13,930
    Likes Received:
    0
    Trophy Points:
    455
    And everybody crushed ATI... well, guess what, they're back with a vengeance.
     
  7. HaloGod2007

    HaloGod2007 Notebook Virtuoso

    Reputations:
    461
    Messages:
    2,465
    Likes Received:
    0
    Trophy Points:
    0
    What I think we need to see is a multi-core GPU... not dual GPUs on one PCB, but dual core... maybe that will eliminate SLI scaling issues too: have two cores putting out frames as a whole rather than SLI, which renders them separately. What developers also need is another language to program in... C++ is too hard to code games in. I've done it here at the University of Michigan and it's HORRIBLE... wasn't MS working on a new language for game developers?
     
  8. Krazzy

    Krazzy Notebook Consultant

    Reputations:
    31
    Messages:
    110
    Likes Received:
    0
    Trophy Points:
    30
    Isn't 16 the most cores anything can use right now anyway?
     
  9. Red_Dragon

    Red_Dragon Notebook Nobel Laureate

    Reputations:
    2,017
    Messages:
    7,251
    Likes Received:
    0
    Trophy Points:
    205
    Would have liked to see Crysis on that chart ;)
     
  10. Signal2Noise

    Signal2Noise Über-geek.

    Reputations:
    445
    Messages:
    1,970
    Likes Received:
    0
    Trophy Points:
    55
    Two words:

    Neural Implants
     
  11. MonkeyMhz

    MonkeyMhz Notebook Evangelist

    Reputations:
    68
    Messages:
    303
    Likes Received:
    0
    Trophy Points:
    30
    I would have to disagree with you on the C++. I've been programming C/C++ for a while and I can whip a game together pretty quickly. C/C++ is fast, efficient, and multi-platform. Of course another, higher-level language would be easier, but when it comes down to it, C++ is the industry standard, and there's a reason why. MS has C#, but that's unappealing to many commercial developers, though for PC it's a pretty good choice. I wouldn't count on seeing big-budget games switch to any other language any time soon. If anything happens down the road, it will just be more advanced middleware, or perhaps a re-evaluation of the C++ libraries. And if we do eventually switch to another language, I'm pretty sure it will be similar to C/C++.

    As for GPUs, they are all heading in their own directions, but we're in a difficult time in the CG world. Games are slowly trying to make a drastic change, and the future is undetermined. But we developers have an idea where it's going, and it's not the easiest thing to accomplish. We've reached a point where it's like, what's next? And we're rushing to pump out new ideas. Carmack is looking into some crazy things with voxels, something we will start to see in games again. I'm not talking about Worms or 2D platformers; voxels have amazing capabilities and performance upsides. They self-occlude, they offer a fantastic level of detail, and they're editable in real time (rough sketch below). Crysis uses voxels for some of its terrain features, but that's nothing compared to where it may be headed in the future. That's why you're hearing about Larrabee not only performing traditional methods such as rasterization, but also newer, more programmable pipelines and renderers that give developers more freedom in what they want to do.
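    Here's the rough sketch I mentioned, just to show why voxels are attractive. It's a toy C++ voxel grid (my own made-up VoxelGrid class, nothing like what Carmack or the Crysis terrain system actually uses): editing the world in real time is literally one array write, and a cell that's completely boxed in by solid neighbours occludes itself for free.

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    class VoxelGrid {
    public:
        VoxelGrid(int w, int h, int d)
            : w_(w), h_(h), d_(d), cells_(std::size_t(w) * h * d, 0) {}

        // Real-time editing: carving a hole or adding material is a single write.
        void set(int x, int y, int z, std::uint8_t material) {
            cells_[index(x, y, z)] = material;
        }

        // Out-of-bounds cells count as empty so the grid edges work out.
        std::uint8_t get(int x, int y, int z) const {
            if (x < 0 || y < 0 || z < 0 || x >= w_ || y >= h_ || z >= d_) return 0;
            return cells_[index(x, y, z)];
        }

        // Self-occlusion: a cell surrounded by solid neighbours can never be seen,
        // so the renderer can skip it entirely.
        bool buried(int x, int y, int z) const {
            return get(x - 1, y, z) && get(x + 1, y, z) && get(x, y - 1, z) &&
                   get(x, y + 1, z) && get(x, y, z - 1) && get(x, y, z + 1);
        }

    private:
        std::size_t index(int x, int y, int z) const {
            return (std::size_t(z) * h_ + y) * w_ + x;
        }

        int w_, h_, d_;
        std::vector<std::uint8_t> cells_;
    };

    A real engine would store this sparsely (octrees and so on) instead of a flat array, but the basic appeal is the same: edit anywhere, skip whatever is buried.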
     
  12. MonkeyMhz

    MonkeyMhz Notebook Evangelist

    Reputations:
    68
    Messages:
    303
    Likes Received:
    0
    Trophy Points:
    30
    Remember, it's not a CPU; Larrabee is a GPU. It can have as many cores as it wants. The PC just views it as a device you chuck work at to get it processed.

    Tbh, I hope Larrabee isn't as good as it makes itself out to bee (lol), or at least that Nvidia/ATI can match or exceed what Larrabee can do, because the thing I would hate most is for computer hardware to become monopolized.
     
  13. Sword and Scales

    Sword and Scales Notebook Consultant

    Reputations:
    55
    Messages:
    228
    Likes Received:
    0
    Trophy Points:
    30
    One word:

    Related.
     
  14. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    Eh, but it's not like you'd think; it's tied to facial muscle movement, not true neural triggers to the device.
     
  15. Sword and Scales

    Sword and Scales Notebook Consultant

    Reputations:
    55
    Messages:
    228
    Likes Received:
    0
    Trophy Points:
    30
    I know, I was just saying that we're moving in that direction.
     
  16. LaptopNut

    LaptopNut Notebook Virtuoso

    Reputations:
    1,610
    Messages:
    3,745
    Likes Received:
    92
    Trophy Points:
    116
    What we are headed for is more unoptimised coding and higher recommended specifications.
     
  17. avanish11

    avanish11 Panda! ^_^

    Reputations:
    956
    Messages:
    896
    Likes Received:
    0
    Trophy Points:
    30
    That's what Stream Processors and Shaders are...