The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Rumors of Next Gen Consoles, Already Outdated

    Discussion in 'Gaming (Software and Graphics Cards)' started by Zymphad, Nov 23, 2011.

  1. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    There are rumors of next-gen consoles, since a few Sony developers like Square Enix and 360 developers like Ubisoft Montreal are reportedly already developing for them.

    Both sides have reportedly been given off-the-shelf hardware equivalent to what the consoles are expected to be. This is pure speculation on my part, but right out of the gate in 2012, laptops running the 28nm parts will already be outpacing the next-gen consoles. The next-gen consoles seem to be targeting the performance of Epic Games' Unreal Engine with DX11. My guess is that means the target is likely 720p with tessellation, or something like an Nvidia GTS 450 or AMD HD5750, with a CPU probably like a first-gen quad-core i5. Development at both Sony and Microsoft seems to indicate this is a hasty project. The Nintendo Wii U threw them for a loop, so they are looking to release something just slightly better, since they don't have years to spare. They want to release as close to the Wii U launch as possible so they don't lose their market share. To me that means the hardware will already be a few generations behind PC at release. These consoles are not being designed around future tech, but old tech.
    - The Nintendo Wii U is expected to have a 3-core CPU with something like an AMD HD4650. MS and Sony probably just want to be one generation ahead of Nintendo. Nintendo should have gone big rather than slightly better than the current consoles. Huge mistake on their part.
    - I think this is reasonable speculation, since even a GTS 450 with a first-gen quad-core i5 is a massive upgrade over the previous hardware from the 360 and PS3, which are running on 512 MB of RAM and a GPU roughly equivalent to an Nvidia 7700.
    - And remember, they will probably be selling these new consoles somewhere around $400-500, so they can't equip them with GTX 570s or whatever; it's too expensive. Even an HD5770/GTX 460 would, I think, be too costly for a console.

    But this will still probably be a boost relative to PC, since Microsoft doesn't seem to want to address the issues plaguing PC game development: API overhead and limited draw calls.
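
    To put a rough number on the draw-call overhead complaint (these are toy numbers I'm making up purely to illustrate the idea, not measurements of any real API): each draw call costs some fixed CPU time in the API/driver layer, so thousands of small draws can eat the frame budget before the GPU even gets busy.

    Code:
    #include <cstdio>

    // Toy model of per-frame CPU cost from draw-call submission overhead.
    // All overhead figures below are assumptions for illustration only.
    int main() {
        const double usPerCallPC      = 30.0;    // assumed per-call API/driver overhead on PC (microseconds)
        const double usPerCallConsole = 3.0;     // assumed overhead with a thin, to-the-metal console API
        const double frameBudgetUs    = 16667.0; // one 60 FPS frame, in microseconds
        const int    drawCounts[]     = {500, 2000, 10000};

        for (int calls : drawCounts) {
            double pc      = calls * usPerCallPC;
            double console = calls * usPerCallConsole;
            std::printf("%5d draws: PC submit ~%.1f ms (%.0f%% of a 60 FPS frame), console ~%.1f ms\n",
                        calls, pc / 1000.0, 100.0 * pc / frameBudgetUs, console / 1000.0);
        }
        return 0;
    }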

    What do you think?
     
  2. Eldaren

    Eldaren Notebook Evangelist

    Reputations:
    132
    Messages:
    514
    Likes Received:
    13
    Trophy Points:
    31
    Well when the PS3 and Xbox 360 first came out, both companies were taking a loss in terms of hardware costs. They may shoot for the same thing for the next gen too. I believe they made up for this loss in game sales, app stores, and as manufacturing got cheaper. So they may end up putting slightly better hardware in there. But who knows, you could be right.
     
  3. DEagleson

    DEagleson Gamer extraordinaire

    Reputations:
    2,529
    Messages:
    3,107
    Likes Received:
    30
    Trophy Points:
    116
    Nintendo always try to make sure their hardware brings in money (aka prints money) as soon as it reaches the market.
    My guess is that the Wii U will use something similar to the ATI HD 4770 (40nm), since it's pretty inexpensive and actually has a lot of performance.

    But there's no reason to flame the Wii U; just like on PC, all the games for it will be crappy ports from Xbox and PlayStation.
    I buy Nintendo for its first-party games, and those never go multiplatform.
     
  4. DR650SE

    DR650SE The Whiskey Barracuda

    Reputations:
    7,383
    Messages:
    8,222
    Likes Received:
    182
    Trophy Points:
    231
    I don't have any consoles at the moment, but as long as the ports can run well on my M17x R2 in the future, I'd be happy with whatever they release :eek:
     
  5. alexUW

    alexUW Notebook Virtuoso

    Reputations:
    1,524
    Messages:
    2,666
    Likes Received:
    2
    Trophy Points:
    56
    What Eldaren said. I think when the PS3 was launched, SONY was losing $100 per console [maybe more or less].

    Point being, next gen consoles could launch at a price of $400-$500 with an actual cost $100-$200 higher; but maybe expect some expensive initial games :p
     
  6. GamingACU

    GamingACU Notebook Deity

    Reputations:
    388
    Messages:
    1,456
    Likes Received:
    6
    Trophy Points:
    56
    I wouldn't be surprised. I mean, the PS3 has what, a 7900 GT equivalent? I believe the 8800 GTX was already out at the time of the PS3's launch, and that's a much better card.
     
  7. ratchetnclank

    ratchetnclank Notebook Deity

    Reputations:
    1,084
    Messages:
    1,506
    Likes Received:
    900
    Trophy Points:
    131
    They will most likely have great hardware for the time. The Cell was awesome for its time and XDR RAM has a very fast pipeline. The GPU was lackluster, but the PS3 was originally designed without one, so they tacked on whatever was cheap.
     
  8. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    Even if it did have an HD 5870, which is an average card in PC land, that's still well over 4x more powerful than what's inside the PS3. Equip it with 1 GB of RAM, load up the Cell with 1 GB of RAM, and you have a new console. Don't even bother including Blu-ray; it would be a waste of money and space. They should design with the intent of digital distribution.

    Just upgrading the console's GPU would help gaming out as a whole. We've been playing with DX9 for too long, and it's time to get developers to move on to the much more efficient DX11 code structure.
     
  9. Pitabred

    Pitabred Linux geek con rat flail!

    Reputations:
    3,300
    Messages:
    7,115
    Likes Received:
    3
    Trophy Points:
    206
    A 5870 is WELL above average in PC land. Even a 5870M is faster than what most people have. The vast majority of machines ship with mid- to low-end cards, and often integrated graphics. There's a reason AMD has their Vision branding these days.

    Also note that a 5870 is a 200W card all by itself. The entire PS3 consumes between 90 and 120W (the original fat PS3s are around 200W). There's a power envelope you have to reasonably develop within, as well as durability, with game consoles. They're likely going to be looking at a cut-down 6870 or 6950 with custom modifications.
     
  10. GamingACU

    GamingACU Notebook Deity

    Reputations:
    388
    Messages:
    1,456
    Likes Received:
    6
    Trophy Points:
    56
    I'd like to see a PS4 with 7990s in CF and dual Cell CPUs.
     
  11. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Doubtful. I think they are looking at something around HD5750/HD5770 performance. Also, the rumor is that both Sony and Microsoft are going with AMD for their next gen. Looks like AMD will have all 3 consoles.

    I still see cost as the biggest factor. I just don't see a console being sold with a price tag high enough for an HD6870, let alone an HD6950.
     
  12. KernalPanic

    KernalPanic White Knight

    Reputations:
    2,125
    Messages:
    1,934
    Likes Received:
    130
    Trophy Points:
    81
    Understand that even if they ship with 5770-class hardware, this is a MASSIVE step up and games in general will be better.

    Imagine how much better Skyrim would be graphically if they had a console with a 5770 as the baseline instead of an Xbox 360 with a nerfed Nvidia 7800 GT.
     
  13. Pitabred

    Pitabred Linux geek con rat flail!

    Reputations:
    3,300
    Messages:
    7,115
    Likes Received:
    3
    Trophy Points:
    206
    The console makers are already likely paying for custom modified chips. It doesn't matter which architecture they base it on; that's already in the cost. A 6950 costs AMD just about as much per unit to make as a 6870 does, or even a 5770 (depending on the process size). It's all the other RAM and components that go with the card that cost more. You pay extra for performance on the leading-edge cards, but the per-unit costs aren't significantly different from the lower-end cards. I would bet that the console manufacturers use a VLIW4 Cayman derivative of some sort, rather than the VLIW5 architecture of Barts or Juniper, which is basically legacy at this point as far as AMD is concerned. The main holdup is the power envelope they need to run in, which is what will determine the selection and performance IMO.

    As for why new consoles are going AMD, it's probably because AMD still does better power/performance than Nvidia, even if Nvidia wins in many specific instances. Console games have the benefit of being developed close to the metal, so you could probably squeeze more performance out of AMD than you can Nvidia, especially if you take current consoles into account (how much better the same games perform on the 360 than on the PS3, on average).

    The nerfed 7800 GT is in the PS3. The Xbox 360 has roughly an X1950 XT-class GPU.
     
  14. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
    The PS3 cost $800 for Sony to make and they sold it for $600. I hope they do that again. When I first got my PS3 I was like, damn... a supercomputer :eek:
     
  15. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Also, you need to take into account when the consoles will be released. End of 2013, likely? I mean, an HD 5770 in two years' time will be very old school, even though it is mid-grade now. If they use a 28nm process they can cut down power consumption considerably with great performance, and I don't think a 6970 is unreasonable by the time the consoles are released. By then we'll have 8900 / 790 series GPUs from AMD and nVidia that will put the 6970 to shame.

    And part of what cost Sony so much was the incorporation of the Blu-ray player, which was completely new tech at the time. This time around that won't be a factor, and that should drop their manufacturing costs considerably.
     
  16. funky monk

    funky monk Notebook Deity

    Reputations:
    233
    Messages:
    1,485
    Likes Received:
    1
    Trophy Points:
    55
    Sony made a loss on the PS3 because it meant they could force Blu-ray into the market, meaning a long-term source of income for them since they get money for every single disc manufactured.

    As for the processor, I would expect the next PlayStation to carry on with the Cell, maybe with a die shrink though.
     
  17. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    I also expect Sony to carry on with the Cell, unless they drop backwards compatibility. Since they did it once with PS2-to-PS3 compatibility, they might do it again, but I doubt it. Consoles will always be outdated due to their business model/life cycle. On the other hand, it allows for some impressive performance (for the console, don't compare to PC games :p) due to the unchanging architecture.
     
  18. Syberia

    Syberia Notebook Deity

    Reputations:
    596
    Messages:
    1,611
    Likes Received:
    1
    Trophy Points:
    56
    Pfft, when the PS4/720 come out, I'll toss in another 6950 and call it a day :)
     
  19. shinakuma9

    shinakuma9 Notebook Deity

    Reputations:
    172
    Messages:
    1,512
    Likes Received:
    0
    Trophy Points:
    55
  20. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    If it's released at the end of 2013 it probably won't be a 5770. The previous consoles at least used tech that was modern at the time. The PS3 was released with an Nvidia 7800-variant custom chip right about when the 8800 was launching. The Xbox used an X1900-variant custom chip from the same time frame as the Nvidia GPU (the tech was released early for the Xbox).

    If the console was being released today, you'd probably see some custom chip based off of either the AMD 6800 or Nvidia 560.
     
  21. daranik

    daranik Notebook Deity

    Reputations:
    57
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    30
    The Cell is a nice piece of CPU, that's for sure, but again, how dated will it be in two years' time? Intel is already talking about huge core counts. I think the thing that will be a little disappointing is that these next-gen consoles will pop out and last through the early stages of quantum computing, I bet. Then again, let's say these consoles come out in 2013 and last roughly to 2017-2020; that puts them right in time for quantum, that's if quantum can be figured out by then.

    Oh, and let's not forget about this little gem: http://www.youtube.com/watch?v=00gAbgBu8R4
    This guy's already gotten some government grants and supposedly has tons of buyers knocking on his door for his engine.
     
  22. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    I don't think anyone is knocking down their doors. It's a lot of hype on YouTube, and that's about it. Government grant? Who cares; it's Australia, and there isn't much new graphics tech worth noting coming from Australia. By the way, do you know why the majority of games recently have been developed in Montreal? The province of Quebec pretty much subsidizes any game development. Deus Ex is Montreal. Nearly all Ubisoft games are Montreal. EA also has large gaming studios in Montreal. Visceral Games, Eidos, Ubisoft, BioWare, just to name a few. And I'm pretty sure BioShock Infinite is being developed in Montreal as well.

    Game/Graphics companies in Montreal, Quebec. Yay for government subsidies. Because games are so vital and important...
    A2M
    Akoha (@Akoha)
    Airborne Mobile (@airbornemobile)
    Anomalik
    Audiokinetic
    Autodesk (@Autodesk_ME)
    Babel Media (@babelmedia)
    Bioware Montreal (@biofeed)
    Bug Tracker (@bugtrackergroup)
    CAE
    Compulsion Games
    Cyanide Studio
    Crankshaft Games
    (@crankshaftgames)
    Darwin Dimensions
    D-BOX Technologies
    Di-O-Matic (@diomatic)
    DTI Software
    EA Montreal (Visceral ) (@eamontreal)
    EA Mobile (@EAmobile)
    Frima Studio
    Funcom (@funcom)
    Eidos (@eidosmontreal)
    Enzyme
    Feeling Software
    Fugitive Interactive
    Gameloft (@gameloft)
    Gamerizon (@gamerizon)
    GRIP Entertainment
    Hybride
    Immersion
    Javaground
    Kutoka
    Ingenio/Loto-Quebec
    Ludia
    lvl studio
    Loopycube (@loopycube)
    Matrox
    MindHabits
    Mistic Software
    Modus FX (@modusfx)
    Monde Media
    NDi Media
    North Side
    ODD1
    Ovolo Games (@ovologames)
    Polytron (@polytron)
    Presagis (@presagis)
    Q8ISMobile
    Quazal
    Sarbakan
    Sensio
    Side City
    Skyriser Media (@skyrisermedia)
    Softimage (@AdskSoftimage)
    Strategy First
    TakeOff Creative Services House
    Toon Boom (@toonboom)
    Trapdoor Inc.
    Tribal Nova
    Triotech Amusement
    Ubisoft (@ubisoft)
    VMC Game Labs (@VMCGameLabs)
    Wave Generation
    Warner Brothers Interactive
    Xaracom
    Xtreme Prototypes
    Yu Centrik
    - Getting a government grant for graphics and gaming is not a big deal. Just move to Montreal, Quebec. Voila.

    This news has been around for years and nothing ever came of it.

    Intel's Larrabee would have been perfect for this, but Intel scrapped it. Maybe will see it again in the future.

    1). No one is going to invest in this unless it's profitable. And the amount needed to invest in this, I think, can only be covered by a cross-platform AAA title. Gaming is about cross-platform; that engine is not cross-platform, or at least doesn't seem to be. If they want more coverage, they'll have to prove it can run a game level at a smooth 30 FPS on the PS3, 360 and PC.

    2). In all this time, two years now, this group has failed to address what graphics in games is about. It's about shaders; it's about post-processing. All they have gone on about is polygons. What really crushes game systems today is not polygons. Our GPUs are polygon monsters. It's the shaders that matter. That's what is crushing computers: the lighting effects, the shadows, the blurs, the snow effects, rain effects, transparency, etc.

    3). So in closing, for what this company has demonstrated so far, the UE3 is still better, by leagues.

    For example: Skyrim is not loved because of its textures. That would be funny, since Skyrim's textures are actually poor in comparison to other games out there (Witcher 2, Crysis, etc). It's not the polygons that are wowing the gamers.

    - Skyrim is very atmospheric.
    - Smoke, wind, fire, snow, all look great.
    - The lighting effects, the shadows reflecting in the dungeons are all great.
    - It's the special effects from the weapons, the blood, the gore, explosions.
    - NOT the polygons.

    When this company can prove their shadows, effects, full scene lighting, high definition radiosity, transparency and other things that make BF3, Skyrim, Witcher 2, Crysis 2 look so amazing, then I'll be amazed. Because right now it's not the textures, the polygons that are crushing my GPU, it's the shaders, everything else.

    The people running around saying this will make GPUs obsolete are, well, morons. This engine is not voxel/ray tracing; it's not something general processing units can't handle. Fermi and the new Graphics Core Next from AMD will be fine with this. General processing cores, I think, would be faster at it than a CPU. They claim their engine is a search engine, looking for points in the unlimited cloud; that seems like computation that over a thousand of these general processing units can crush. AMD's new GCN is also supposed to increase FP64 speed and throughput, possibly beyond Intel's CPUs. That, and AMD is still pushing the idea of a combined CPU/GPU anyway, and Intel is moving in that direction as well; the GPU is not leaving anytime soon.
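
    If you're wondering what "searching for points" could even look like, here is a generic sparse-octree lookup sketch. To be clear, this is just my guess at the general flavor of the thing, not anything Euclideon has actually published; all the names and structure are made up:

    Code:
    #include <array>
    #include <cstdio>
    #include <memory>

    // Minimal sparse octree over the unit cube; purely illustrative.
    struct Node {
        std::array<std::unique_ptr<Node>, 8> child; // octants allocated only where points exist
    };

    struct Vec3 { double x, y, z; };

    // Pick which octant of the cell [lo, lo+size)^3 the point falls in,
    // and shrink the cell to that octant.
    static int descend(Vec3 p, Vec3& lo, double& size) {
        size *= 0.5;
        int ix = p.x >= lo.x + size, iy = p.y >= lo.y + size, iz = p.z >= lo.z + size;
        lo = {lo.x + ix * size, lo.y + iy * size, lo.z + iz * size};
        return ix | (iy << 1) | (iz << 2);
    }

    // Insert a point by subdividing 'depth' levels, creating nodes along the way.
    void insert(Node& root, Vec3 p, int depth) {
        Node* n = &root;
        Vec3 lo{0, 0, 0};
        double size = 1.0;
        for (int d = 0; d < depth; ++d) {
            int idx = descend(p, lo, size);
            if (!n->child[idx]) n->child[idx] = std::make_unique<Node>();
            n = n->child[idx].get();
        }
    }

    // The "search": walk toward a query position. Cost is O(depth), independent
    // of how many points are stored, which is roughly the property the pitch
    // leans on. Lighting and shading all of those points is a separate problem.
    bool lookup(const Node& root, Vec3 p, int depth) {
        const Node* n = &root;
        Vec3 lo{0, 0, 0};
        double size = 1.0;
        for (int d = 0; d < depth; ++d) {
            int idx = descend(p, lo, size);
            if (!n->child[idx]) return false; // nothing stored in this region
            n = n->child[idx].get();
        }
        return true;
    }

    int main() {
        Node root;
        insert(root, {0.3, 0.7, 0.2}, 10);
        std::printf("near the stored point: %d\n", lookup(root, {0.3001, 0.7001, 0.2001}, 10));
        std::printf("empty region:          %d\n", lookup(root, {0.9, 0.1, 0.9}, 10));
        return 0;
    }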
     
  23. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    1) Keep in mind they haven't proven anything, they just have a video. However, there's nothing about this that inherently makes it not cross-platform. How did you get that idea?

    2) Agree
     
  24. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    It's not inherent, but they haven't demonstrated it. I said they have to prove it is. When UE3 and Crytek and CD Projekt's RED Engine were drawing awe, it was because they were demonstrating how awesome everything looked on an Xbox 360. That's why UE3 is the #1 engine being developed on, and it's most certainly why CryEngine 3 is cross-platform. In terms of licensing success, CryEngine 2 was a failure, being PC only.

    The other issue I have with these demonstrations is that everything is static.

    To wow me, they will have to demonstrate real-time lighting and shadow effects on moving, fully detailed, "unlimited detail" objects. They have to show me perfect lighting and shadows as I launch a spell causing the ground to shake and rocks to fall, with ragdoll enemies rolling down the hill. Even better would be rain effects on all the rocks, grass, etc. in this unlimited detail. These are things that CD Projekt's RED engine already does and that the UE3 DX11 engine does. And when I have rain effects, full-scene ambient occlusion, lighting and shadows, sun rays, etc., it just destroys my GPU.

    I don't think it's a scam, as others have claimed. I just haven't seen anything I'm impressed by yet. Everything he's shown and demonstrated looks very bland and boring.

    I think Unlimited Detail's fans haven't considered this: if you have unlimited detail, unlimited points, you have to make sure lighting, shadows, etc. are computed on all of those unlimited points.... OMG, I just picture 0.01 FPS. If they can do it, then I'll be on the bandwagon and be amazed.
     
  25. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Euclideon & Unlimited Detail - Bruce Dell Interview - YouTube

    He demonstrates on an Asus G73SW, which has a powerful Intel Sandy Bridge CPU. He gets 15-20 FPS on static scenery: nothing moving, no shaders, no lighting, etc.

    I'm not impressed. Gamers like to play at a smoother 30-60 FPS with SSAO, HDR, 4096-resolution shadows, etc.
     
  26. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    They haven't demonstrated much of anything as hard fact, but you're comparing apples and oranges here. UE3, CryEngine 2/3, CD Projekt all make GAME engines. Some game engines are not cross platform because they focus on specific features offered by a specific platform.

    This is just a renderer. All of the work that goes into it is math. There are no platform specific features to focus on (touchscreen, accelerometer, wiimote, 3D screen, etc) - none of that applies to this type of work. Again, it's JUST the rendering engine, not a game engine.

    It's not their job to prove to us that it's cross platform before they release any details on their work. They haven't even confirmed whether they will be running it on a GPU or CPU. If you haven't decided whether or not you are going to execute your code on a GPU or CPU, you are absolutely cross platform.

    If we want to get technical, we would say that the work they are doing is portable. It's all written in C/C++ (guaranteed). Cross-platform technically means available on multiple platforms. It may not ever make it to a console, but it's also likely that it won't ever make it off the ground at all. It's not finished technology, and it's not on ANY platform right now, so I just don't see the need for them to *prove* that it's multiplatform. That doesn't even make sense.

    ---

    I agree that their tech demo is underwhelming. Good graphics is so much more complex than high polygon counts. It honestly has more to do with animation than anything else. Good animation mandates good physics and good interactivity. It's more appealing to watch a blob animate well than a perfect human walk awkwardly and choppily.
     
  27. Phistachio

    Phistachio A. Scriabin

    Reputations:
    1,930
    Messages:
    2,588
    Likes Received:
    145
    Trophy Points:
    81
    Well...

    Just to get it right: to run Unreal Engine 3.97 maxed out at 60 FPS, you need AT LEAST 3x GTX 580. And even in the presentation, there were dips into the 50s, and rarely into the 40s. So an AMD Radeon HD 5750 is not even enough to run it at minimum settings at 720p.

    The consoles will probably be on par with computers. MS and Sony aren't going to screw up this time. I'm quite sure a minimum of a 5870 or equivalent performance is to be expected.
     
  28. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    We can agree to disagree then. I think a 5870 currently will blow away anything the consoles have. The 5870 consumed about 280-300 watts of power under load, and it cost a hefty sum and still does.

    And I do not believe even for a second that next-gen consoles will be running UE3 DX11 maxed out. There is a huge difference between a tech demo and what is actually viable for gaming. No one is playing the Heaven benchmark, so I really don't care how many GTX 580s were used for that tech demo.

    I still believe these consoles are coming out rushed. The Nintendo Wii U, I think, took Microsoft by surprise. And when Sony got wind that Microsoft had developers working on next gen, they rushed theirs as well. Both companies wanted the 360 and PS3 to last 9 years, and 2012 is not 9 years. For both companies I think it took almost 3-4 years to recoup the development and manufacturing costs of the 360 and PS3. So I see these next-gen consoles being low on development time and cost: something AMD can have ready and available, like maybe a quad-core/6-core based on AMD's Phenom II tech paired with an HD5750 or so. Or, if the dual-GPU rumor is true, a CFX HD5650.

    And consoles work so much differently than PCs do. You can't judge the capabilities of consoles based on a tech demo on PC. The 360 and PS3 can do nearly 10x more draw calls right now than a PC can, due to PC limitations from DirectX, API overhead and operating system layers.
     
  29. Phistachio

    Phistachio A. Scriabin

    Reputations:
    1,930
    Messages:
    2,588
    Likes Received:
    145
    Trophy Points:
    81
    2GB of RAM? Only?

    Again, big mistake. All modern games now require at LEAST 2GB, and once you subtract the DDR3 RAM that is going to be dedicated to other HW parts, it'll leave around 1.5GB or less. That amount is going to be quite a bottleneck for devs.

    But, of course, all of that can change.

    At least the CPU seems to be nice, but the PS3's CPU will probably be better, because I'm sensing the CPU of the XBOX will be AMD, and AMD isn't very good at CPUs. But if they put an AMD GPU in there, they can use Fusion and increase the bandwidth, which is a good plus.
     
  30. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Again, you are thinking of how PCs work, and that's the biggest problem in your speculation. Consoles do not function the same way PCs do, and they do not suffer the limitations that PCs do. Yeah right, AMD isn't good for CPUs; that's why the most powerful CPU right now is from AMD... And the most overclocked, benchmarked CPU is also AMD. AMD has both crowns right now.
    - In terms of multi-threaded processing, the AMD Bulldozer is better than the Intel Sandy Bridge. It just happens that we don't actually use a lot of multi-threaded software to take advantage of what Bulldozer offers. Not even Windows 7. Windows 7 is better optimized for Intel than it is for AMD. AMD claims that Windows 8 should resolve that issue.
    - Because console game development is low-level, direct-to-metal, they won't have any of these issues. An AMD CPU for console gaming will destroy anything from Intel on PC gaming.

    And actually, despite what a lot of people in this thread think, the PS3's Cell processor isn't that great at all. It's pretty awesome on paper, but in practice it's not. Sony's PS4 definitely will not use a Cell processor again. No one liked developing for the PS3. The biggest problem for the PS3 was the Cell processor.
     
  31. Phistachio

    Phistachio A. Scriabin

    Reputations:
    1,930
    Messages:
    2,588
    Likes Received:
    145
    Trophy Points:
    81
    Indeed. UE3.97 level of graphics in games is still a good 5 years away from us.

    That's the issue: they rush things out. If Sony hadn't rushed, and had been careful and smart, they'd have crushed the Xbox from the beginning.

    If I'm not mistaken, draw calls are basically how much can be rendered and processed in a scene?
     
  32. Phistachio

    Phistachio A. Scriabin

    Reputations:
    1,930
    Messages:
    2,588
    Likes Received:
    145
    Trophy Points:
    81
    Really? Is it the FX CPU? Sorry, I've just been busy lately and haven't kept up with the latest news :eek:

    Well, the Cell is good on paper, that's true, but when a team is dedicated to using it wisely (Naughty Dog and Guerrilla Games, to name a couple), it can turn out to be a top-end CPU.
     
  33. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    The new AMD Opteron CPU beats Intel's Xeon handily right now. But Bulldozer and Sandy Bridge are about on par head to head, as they each have their strengths and weaknesses. Intel SB is better for gaming, but not by much. For multi-threaded computation, Bulldozer is better, though. Game developers want multi-threaded operations right now: more draw calls, more stuff rendered on the screen, more AI operations. The AMD CPUs would be great for that. But more than anything, I think MS and Sony want something cost-efficient. AMD offers both CPU and GPU at a good price, and I don't think Sony wants to deal with the development and manufacturing costs of the Cell processor again.

    The companies that did well with Sony's Cell processor ONLY made PS3 games, as far as I know. For everyone else it was a pain in the butt.
    - Also, those companies are subsidiaries of Sony...

    I still think cost will be the biggest factor here. Consoles are popular because anyone with $300-350 can have a gaming machine. If you had $600 you would build a PC. You can build a nice HTPC with the Black Friday and holiday deals on NewEgg and other sites. Something like a Phenom II with an HD6870 is still much better than my gaming laptop. Given the current economy, I don't think MS and Sony will be charging more than $350-400 at most.

    How do you sell an HD5870 + 6-core Phenom II with 2GB of RAM for $450? That's a massive loss, I would think.
     
  34. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    See, that's where I think you are wrong. Microsoft and Sony have already given developers hardware equivalent to their next-gen consoles NOW: off-the-shelf hardware anyone can buy at NewEgg. Whatever tech/hardware the game developers are working with is already one generation old. Game development takes a long time. That means, I think, MS has had developers working in secret for over a year now, on tech that is over a year old.
    - But an HD5770 with to-the-metal game development is probably a lot more powerful than what we'd expect on PC. See, I think that's the problem here: we are measuring performance based on PC performance. Look at what game developers can do on a souped-up ATI X1900. It's so old that it's an ATI, not an AMD! X1900... that's how old the 360 GPU is. An HD5770 with 2GB of available RAM is already like 10x more powerful :D
     
  35. Phistachio

    Phistachio A. Scriabin

    Reputations:
    1,930
    Messages:
    2,588
    Likes Received:
    145
    Trophy Points:
    81
    All of what you said is very true, +rep.

    But still, they'll want the consoles to last at least 9 years again, I believe, so they'll want something powerful: pretty much what you said about the GPU, but a tad more powerful on the CPU side.

    They have to avoid Sony's mistake: the semi-proprietary CPU. If they avoid such mistakes, they're good to go.
     
  36. naticus

    naticus Notebook Deity

    Reputations:
    630
    Messages:
    1,767
    Likes Received:
    0
    Trophy Points:
    55
    The rumor is that Xbox devkits have been released to game developers. If this is true, then the next-gen consoles will be working with current tech -- probably the cheapest hardware available as well.

    There is a good in-depth conversation on this topic at the [H] forums with some very good info, if anyone wants to dig a bit deeper into the subject.

    IMO they will use current tech: a cheap AMD quad-core, a mid-range AMD/nVidia desktop card, and no more than 2GB of DDR3 RAM. And all of the console gamers will go "WOWW.Best.Graphics.EVAAAA."
     
  37. GamingACU

    GamingACU Notebook Deity

    Reputations:
    388
    Messages:
    1,456
    Likes Received:
    6
    Trophy Points:
    56
    The PS3 was $600 when it came out; there's no reason to think that the PS4 is going to be cheaper than the PS3 was at release... Both your numbers and logic are flawed.
     
  38. Phistachio

    Phistachio A. Scriabin

    Reputations:
    1,930
    Messages:
    2,588
    Likes Received:
    145
    Trophy Points:
    81
    I would disagree.

    Back in the day, the economic crisis wasn't as bad as it is today. Sony and MS know that if they want to reach the biggest audience possible, they'll need to lower the price so nearly everyone is able to afford one.
     
  39. GamingACU

    GamingACU Notebook Deity

    Reputations:
    388
    Messages:
    1,456
    Likes Received:
    6
    Trophy Points:
    56
    If you haven't noticed, prices only seem to get higher in our economy, not lower. Games are now more expensive, consoles continually get more expensive, same with CPUs and GPUs. Nothing is getting cheaper unless it's outdated.
     
  40. alexUW

    alexUW Notebook Virtuoso

    Reputations:
    1,524
    Messages:
    2,666
    Likes Received:
    2
    Trophy Points:
    56
    TVs are cheaper :)
     
  41. ccohen322

    ccohen322 Notebook Enthusiast

    Reputations:
    0
    Messages:
    20
    Likes Received:
    0
    Trophy Points:
    5
    Seems a little early IMO, but maybe they're trying to get it out ASAP following the Wii U.
     
  42. funky monk

    funky monk Notebook Deity

    Reputations:
    233
    Messages:
    1,485
    Likes Received:
    1
    Trophy Points:
    55
    With console development being very low-level and optimised for a specific system, it is true that you can squeeze more out of the equivalent hardware in a console than in a PC; that's just how it works. So in that respect a 5770 would probably end up being pretty respectable. Whether it could stand up to an SLI gaming rig I don't know, but it would be pretty competent.

    Also, draw calls are a limitation on PC, but you just design your game to work around it and use other merits to your advantage. I agree that it would be better if we didn't have that limitation, since it would allow us to do more, but at this very moment in time it's not as much of an issue. There was a thread about this earlier; it's true that consoles can make far more draw calls, but no one in their right mind would say that consoles can compare to a high-end PC anymore. How much of an improvement we would see on PC if they fixed it is anyone's guess.
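
    To make the "work around it" point concrete: the usual trick is to stop issuing one draw call per object and instead group identical objects into one instanced/batched submission. This is just a toy sketch of the bookkeeping, not any particular engine's or API's code; all the names are made up:

    Code:
    #include <cstdio>
    #include <map>
    #include <string>
    #include <utility>
    #include <vector>

    // Toy illustration of draw-call batching: one call per (mesh, material)
    // group instead of one call per object.
    struct Object { std::string mesh; std::string material; float x, y, z; };

    int main() {
        std::vector<Object> scene;
        for (int i = 0; i < 5000; ++i) scene.push_back({"rock",  "stone", float(i), 0.f, 0.f});
        for (int i = 0; i < 3000; ++i) scene.push_back({"tree",  "bark",  float(i), 0.f, 1.f});
        for (int i = 0; i < 200;  ++i) scene.push_back({"house", "wood",  float(i), 0.f, 2.f});

        // Naive approach: one draw call per object.
        std::printf("naive draw calls:     %zu\n", scene.size());

        // Batched approach: bucket by mesh+material, then issue one instanced call
        // per bucket, with the per-object transforms uploaded in a single buffer.
        std::map<std::pair<std::string, std::string>, std::vector<const Object*>> buckets;
        for (const Object& o : scene) buckets[std::make_pair(o.mesh, o.material)].push_back(&o);
        std::printf("instanced draw calls: %zu (one per mesh/material group)\n", buckets.size());
        return 0;
    }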

    As for the console hardware, I'm still predicting that Sony will continue to use the Cell. If they do as others have predicted and ditch it for something like an AMD Opteron, then I'd be over the moon, since it would make consoles a lot more like PCs in their operation, which would in turn probably mean that we get higher-quality ports (not to imply that I approve of console ports).

    I know this was a few pages ago... but was that troll-y post a troll? Current AMD tech is on about the same level as socket 1155 CPUs in multitasking; socket 2011 CPUs are out now with six cores, which would walk all over a Bulldozer chip, not to mention the octa-core release which is planned soon. However, I won't deny that AMD are masters of overclocking and that the liquid helium stuff they do is pretty cool.
     
  43. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    Here's what I speculate:

    CPU: custom AMD octo-core x86-64 or ppc processor - $100
    GPU: cut down version of a 6870 - $100
    Memory: 4 GB very fast memory shared by the GPU and the CPU. - $40
    Hard Drive: 60 GB or whatever - $25
    Other components (motherboard, 802.11, controller, bluetooth, psu, case, etc): $100

    Cost (of parts only) at launch: ~$365

    Street price $399. They'll roughly break even right out of the gate, rather than sell at a loss. This would be an epically powerful machine. There is a definite performance benefit from optimizing your software to run on a specific platform. This would run circles around any single-GPU PC available today (for games).
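
    Quick sanity check on those numbers (they're rough guesses, so treat the total the same way):

    Code:
    #include <cstdio>

    int main() {
        // Rough bill-of-materials guesses from the post above, in USD.
        int cpu = 100, gpu = 100, memory = 40, hdd = 25, other = 100;
        int parts  = cpu + gpu + memory + hdd + other;
        int street = 399;
        std::printf("parts total: $%d, street price: $%d, margin: ~$%d per unit\n",
                    parts, street, street - parts);
        return 0;
    }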
     
  44. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    I doubt it's based on the 5770; I'm not sure where that info surfaced anyhow. They'd be silly not to utilize 28nm tech for better overall heat, power, and performance. And just because dev kits came out doesn't mean the hardware has to be something like an HD 5770, or that the hardware is even finalized yet. Perhaps they are using current tech to test and develop against as a performance target. But that in no way means they have to use tech that exists today for the final spec. The Xbox 360 was something like a 7900 GT, which was released to the PC sometime around the same time as the Xbox 360 was released.

    Heck maybe this time around it will be based on the AMD 7900! How ironic.
     
  45. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    They don't even need to do that. With a large budget, you can produce test chips long before they are released to consumers.
     
  46. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Very true. It would be nice to see what they have in the works, even if just a basic overview. But if they had test chips out, you'd think they'd need to be close to production fabrication, because the design is complete and it's just a matter of testing and validating it before full-blown production. Maybe the newer consoles are closer to release than we think. Perhaps Q3/Q4 2012?
     
  47. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    I guess that would be the absolute earliest, but it's definitely possible. Wii U is coming out mid 2012. Sort of crazy.
     
  48. usapatriot

    usapatriot Notebook Nobel Laureate

    Reputations:
    3,266
    Messages:
    7,360
    Likes Received:
    14
    Trophy Points:
    206
    I think quad-core will be the standard for next-gen consoles, along with a <$399 price point for the basic system.