There are rumors of next-gen consoles, since a few Sony-side developers like Square Enix and 360-side developers like Ubisoft Montreal are reportedly already developing for them.
Both sides have been given off-the-shelf hardware equivalent to what the consoles are expected to be. This is pure speculation on my part, but right out of the gate in 2012, laptops running 28nm parts will already be outpacing the next-gen consoles. The consoles seem to be targeting the performance of Epic Games' Unreal Engine with DX11. My guess is that means the target is likely 720p with tessellation, so something like an Nvidia GTS 450 or AMD HD 5750, with a CPU along the lines of a first-gen quad-core i5. Development at both Sony and Microsoft seems to indicate this is a hasty project. The Nintendo Wii U threw them for a spin, so they are looking to release something just slightly better, since they don't have years to spend. They want to launch as close to the Wii U's release as possible so they don't lose market share. To me, that means the hardware will already be a few generations behind PC at release. These consoles are not being designed around future tech, but old tech.
- The Nintendo Wii U is expected to have a 3-core CPU with something like an AMD HD 4650. MS and Sony probably just want to be one generation ahead of Nintendo. Nintendo should have gone big rather than slightly better than the current consoles. Huge mistake on their part.
- I think this is reasonable speculation, since even a GTS 450 with a first-gen quad-core i5 is a massive upgrade over the previous hardware: the 360 and PS3 are running on 512MB of RAM and a GPU equivalent to an Nvidia 7700.
- And remember, they will probably be selling these new consoles somewhere around $400-500, so they can't equip them with GTX 570s or whatever; it's too expensive. Even an HD 5770/GTX 460 would, I think, be too costly for a console.
But this will still probably be a boost over PC, since Microsoft doesn't seem to want to address the issues plaguing PC game development: API overhead and limited draw calls.
What do you think?
-
Well, when the PS3 and Xbox 360 first came out, both companies were taking a loss on hardware costs, and they may shoot for the same thing next gen too. I believe they made up for the loss through game sales, app stores, and manufacturing getting cheaper over time. So they may end up putting slightly better hardware in there. But who knows, you could be right.
-
Nintendo always tries to make sure its hardware prints money as soon as it reaches the market.
My guess is that the Wii U will use something similar to the ATI HD 4770 (40nm), since it's pretty inexpensive and actually gets a lot of performance.
But there's no reason to flame the Wii U; just like on PC, all its games will be crappy ports from Xbox and PlayStation.
I buy Nintendo for its first-party games, and those never go multiplatform. -
I don't have any consoles at the moment, but as long as the ports can run well on my M17x R2 in the future, I'd be happy with whatever they release
-
What Eldaren said. I think when the PS3 launched, Sony was losing about $100 per console [maybe more, maybe less].
Point being, next-gen consoles could launch at a price of $400-500 with an actual cost $100-200 higher; but maybe expect some expensive initial games. -
I wouldn't be surprised. I mean, the PS3 has what, a 7900 GT equivalent? I believe the 8800 GTX was already out at the time of the PS3's launch, and that's a much better card.
-
ratchetnclank Notebook Deity
They will most likely have great hardware for the time. The Cell was awesome for its time, and XDR RAM has a very fast pipeline. The GPU was lackluster, but the PS3 was originally designed without one, so they tacked on whatever was cheap.
-
Even if it had an HD 5870, which is an average card in PC land, that's still well over 4x more powerful than what's inside the PS3. Equip it with 1GB of RAM, load up the Cell with 1GB of RAM, and you have a new console. Don't even bother including Blu-ray; it would be a waste of money and space. They should design with digital distribution in mind.
Just upgrading the console's GPU would help gaming as a whole. We've been stuck with DX9 for too long, and it's time to get developers to move on to the much more efficient DX11 code structure. -
A 5870 is WELL above average in PC land. Even a 5870M is faster than what most people have. The vast majority of machines ship with mid- to low-end cards, and often integrated graphics. There's a reason AMD has its Vision branding.
Also note that a 5870 is a 200W card all by itself. The entire PS3 consumes between 90 and 120W (the original fat PS3s are around 200W). With game consoles there's a power envelope you have to design within, as well as durability to consider. They're likely looking at something in the cut-down 6870 or 6950 range with custom modifications. -
I'd like to see a PS4 with 7990s in CrossFire and dual Cell CPUs.
-
I still see cost as the biggest factor. I just don't see a console selling at a price tag high enough for an HD 6870, let alone an HD 6950. -
Understand that even if they ship with HD 5770 hardware, that is a MASSIVE step up, and games in general will be better.
Imagine how much better Skyrim would look if the baseline were a console with a 5770 instead of an Xbox 360 with a nerfed Nvidia 7800 GT. -
The console makers are already likely paying for custom modified chips, so whichever architecture they base it on, that's already in the cost. A 6950 costs AMD just about as much per unit to make as a 6870, or even a 5770 (depending on the process size); it's the RAM and other components around the chip that cost more. The leading-edge cards charge extra for performance, but the per-unit costs aren't significantly different from the lower-end cards. I would bet the console manufacturers use a VLIW4 Cayman derivative of some sort rather than the VLIW5 architecture of Barts or Juniper, which is basically legacy at this point as far as AMD cares. The main holdup is the power envelope they need to run in, which is what will determine the selection and performance, IMO.
As for why new consoles are going AMD, it's probably because AMD still does better power/performance than Nvidia, even if Nvidia wins in many specific instances. Console games have the benefit of being developed close to the metal, so you could probably squeeze more performance out of AMD than you can Nvidia, especially if you take current consoles into account (how much better the same games perform on the 360 than on the PS3, on average).
-
The PS3 cost Sony $800 to make, and they sold it for $600. I hope they do that again. When I first got my PS3, I was like, damn... a supercomputer.
-
Also, take into account when the consoles will actually be released: end of 2013, likely? An HD 5770 in two years' time will be very old school, even though it's mid-grade now. If they use a 28nm process they can cut power consumption considerably while keeping great performance, and I don't think a 6970 is unreasonable by the time the consoles are released. By then we'll have 8900/790-series GPUs from AMD and Nvidia that will put the 6970 to shame.
And part of what cost Sony so much was incorporating the Blu-ray player, which was completely new tech at the time. This time around that won't be a factor, which should drop their manufacturing costs considerably. -
Sony took a loss on the PS3 because it let them force Blu-ray into the market, meaning a long-term source of income for them, since they get money for every single disc manufactured.
As for the processor, I would expect the PS4 to carry on with the Cell, maybe with a die shrink though. -
I also expect Sony to carry on with the Cell, unless they drop backwards compatibility. Since they did it once with PS2-to-PS3 compatibility, they might do it again, but I doubt it. Consoles will always be outdated due to their business model/life cycle. On the other hand, it allows for some impressive performance (for a console; don't compare to PC games) due to the unchanging architecture.
-
Pfft, when the PS4/720 come out, I'll toss in another 6950 and call it a day
-
Actually, there's a rumor going around that the next Xbox will have a 6-core CPU and 2GB of RAM or something, with details to be shown at CES. I don't even know how that's possible.
Source:
Report: Next-gen Xbox details at CES, hex-core CPU inside | VG247 -
masterchef341 The guy from The Notebook
If the console were being released today, you'd probably see some custom chip based on either the AMD 6800 series or the Nvidia 560. -
The Cell is a nice piece of CPU, that's for sure, but again, how dated will it be in two years' time? Intel is already talking about huge core counts. I think the slightly disappointing part is that these next-gen consoles will have to last through the early stages of quantum computing, I bet. Then again, say these consoles come out in 2013 and last roughly until 2017-2020; that puts them right in time for quantum, if quantum computing can be figured out by then.
Oh, and let's not forget about this little gem: http://www.youtube.com/watch?v=00gAbgBu8R4
This guy's already gotten some government grants and has tons of buyers knocking on his door for his engine. -
Game/Graphics companies in Montreal, Quebec. Yay for government subsidies. Because games are so vital and important...
A2M
Akoha (@Akoha)
Airborne Mobile (@airbornemobile)
Anomalik
Audiokinetic
Autodesk (@Autodesk_ME)
Babel Media (@babelmedia)
Bioware Montreal (@biofeed)
Bug Tracker (@bugtrackergroup)
CAE
Compulsion Games
Cyanide Studio
Crankshaft Games
(@crankshaftgames)
Darwin Dimensions
D-BOX Technologies
Di-O-Matic (@diomatic)
DTI Software
EA Montreal (Visceral ) (@eamontreal)
EA Mobile (@EAmobile)
Frima Studio
Funcom (@funcom)
Eidos (@eidosmontreal)
Enzyme
Feeling Software
Fugitive Interactive
Gameloft (@gameloft)
Gamerizon (@gamerizon)
GRIP Entertainment
Hybride
Immersion
Javaground
Kutoka
Ingenio/Loto-Quebec
Ludia
lvl studio
Loopycube (@loopycube)
Matrox
MindHabits
Mistic Software
Modus FX (@modusfx)
Monde Media
NDi Media
North Side
ODD1
Ovolo Games (@ovologames)
Polytron (@polytron)
Presagis (@presagis)
Q8ISMobile
Quazal
Sarbakan
Sensio
Side City
Skyriser Media (@skyrisermedia)
Softimage (@AdskSoftimage)
Strategy First
TakeOff Creative Services House
Toon Boom (@toonboom)
Trapdoor Inc.
Tribal Nova
Triotech Amusement
Ubisoft (@ubisoft)
VMC Game Labs (@VMCGameLabs)
Wave Generation
Warner Brothers Interactive
Xaracom
Xtreme Prototypes
Yu Centrik
This news has been around for years and nothing ever came of it.
Intel's Larrabee would have been perfect for this, but Intel scrapped it. Maybe we'll see it again in the future.
1) No one is going to invest in this unless it's profitable, and the amount needed I think can only be covered by a cross-platform AAA title. Gaming is about cross-platform, and that engine is not cross-platform, or at least doesn't seem to be. If they want more coverage, they'll have to prove it can run a game level at a smooth 30 FPS on the PS3, 360, and PC.
2) In all this time, two years now, this group has failed to address what graphics in games is really about: shaders and post-processing. All they have gone on about is polygons. What really crushes game systems today is not polygons; our GPUs are polygon monsters. It's the shaders that matter. That's what is crushing computers: the lighting effects, the shadows, the blurs, the snow and rain effects, transparency, etc.
3) So in closing, for what this company has demonstrated so far, UE3 is still better, by leagues.
For example: Skyrim is not loved because of its textures. That would be funny, since Skyrim's textures are actually poor in comparison to other games out there (The Witcher 2, Crysis, etc.). It's not the polygons that are wowing gamers.
- Skyrim is very atmospheric.
- Smoke, wind, fire, snow, all look great.
- The lighting effects, the shadows reflecting in the dungeons are all great.
- It's the special effects from the weapons, the blood, the gore, explosions.
- NOT the polygons.
When this company can match the shadows, effects, full-scene lighting, high-definition radiosity, transparency, and the other things that make BF3, Skyrim, The Witcher 2, and Crysis 2 look so amazing, then I'll be amazed. Because right now it's not the textures or the polygons that are crushing my GPU; it's the shaders and everything else.
The people running around saying this will make GPUs obsolete are, well, wrong. This engine is not voxel/ray tracing; it's not something general processing units can't handle. Fermi and AMD's new Graphics Core Next will be fine with it. General processing cores, I think, would be faster than a CPU here. They claim their engine is a search engine, looking for points in the unlimited point cloud; that seems like computation that over a thousand of these general processing units could crush. AMD's new GCN is also supposed to increase FP64 speed, possibly beyond Intel's CPUs. And AMD is still pushing the idea of a combined CPU/GPU anyway, with Intel moving in that direction as well; the GPU is not going anywhere anytime soon. -
masterchef341 The guy from The Notebook
2) Agreed. -
The other issue I have with these demonstrations is that everything is static.
To wow me, they will have to demonstrate real-time lighting and shadow effects on moving, fully detailed, unlimited-detail objects. They have to show me lighting and shadows staying perfect as I launch a spell that shakes the ground and sends rocks falling, with ragdoll enemies rolling down the hill. Even better would be rain effects on all the rocks, grass, etc., at this unlimited detail, because these are things CD Projekt's RED Engine already does, and things the UE3 DX11 engine does. And when I have rain effects, full-scene ambient occlusion, lighting and shadows, sun rays, etc., it just destroys my GPU.
I don't think it's a scam, as others have claimed. I just haven't seen anything I'm impressed by yet. Everything he's shown and demonstrated looks very bland and boring.
I think Unlimited Detail's fans haven't considered this: if you have unlimited detail, unlimited points, then lighting, shadows, etc. have to be computed for all those unlimited points... I just picture 0.01 FPS. If they can do it, then I'll be on the bandwagon and be amazed. -
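That worry can be put into rough numbers. Here's a back-of-envelope sketch, where the per-point shading cost is an invented, illustrative figure (not anything Euclideon has published): if lighting work scales with the number of shaded points, frame time grows linearly with point count.

```python
# Toy estimate: achievable FPS if every visible point needs shading work.
# The nanosecond cost per point is a made-up illustrative number.

def est_fps(visible_points, ns_per_point):
    """FPS if shading each point costs ns_per_point nanoseconds."""
    frame_ms = visible_points * ns_per_point / 1e6  # ns -> ms
    return 1000.0 / frame_ms

# One shaded point per 720p pixel at 5 ns each:
print(round(est_fps(1280 * 720, 5), 1))       # 217.0
# Ten shaded samples per pixel (soft shadows, AO, etc.):
print(round(est_fps(1280 * 720 * 10, 5), 1))  # 21.7
```

The point of the sketch: shading a fixed number of samples per pixel is fine; the "0.01 FPS" fear only materializes if lighting work multiplies per point instead of staying bounded per pixel.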
Euclideon & Unlimited Detail - Bruce Dell Interview - YouTube
He demonstrates on an Asus G73SW, which has a powerful Intel Sandy Bridge CPU. He gets 15-20 FPS on static scenery: nothing moving, no shaders, no lighting, etc.
I'm not impressed. Gamers like to play at a smoother 30-60 FPS with SSAO, HDR, 4096-resolution shadows, etc. -
masterchef341 The guy from The Notebook
They haven't demonstrated much of anything as hard fact, but you're comparing apples and oranges here. UE3, CryEngine 2/3, and CD Projekt's engine are GAME engines. Some game engines are not cross-platform because they focus on specific features offered by a specific platform.
This is just a renderer. All the work that goes into it is math. There are no platform-specific features to focus on (touchscreen, accelerometer, Wiimote, 3D screen, etc.); none of that applies to this type of work. Again, it's JUST the rendering engine, not a game engine.
It's not their job to prove to us that it's cross-platform before they release any details on their work. They haven't even confirmed whether they will run it on a GPU or a CPU. If you haven't decided whether your code will execute on a GPU or a CPU, you are absolutely cross-platform.
If we want to get technical, we would say the work they are doing is portable. It's all written in C/C++ (almost guaranteed). Cross-platform technically means available on multiple platforms. It may never make it to a console, but it's also likely it won't ever get off the ground at all. It's not finished technology, and it's not on ANY platform right now, so I just don't see the need for them to *prove* it's multiplatform. That doesn't even make sense.
---
I agree that their tech demo is underwhelming. Good graphics is about so much more than high polygon counts. It honestly has more to do with animation than anything else, and good animation demands good physics and good interactivity. It's more appealing to watch a blob animate well than a perfect human walk awkwardly and choppily. -
Well...
Just to get it right: to run Unreal Engine 3.97 maxed out at 60 FPS, you need AT LEAST 3x GTX 580, and even in the presentation there were dips into the 50s, and occasionally the 40s. So an AMD Radeon HD 5750 is not even enough to run it on MINIMUM at 720p.
The consoles will probably be on par with computers. MS and Sony aren't going to screw up this time. I'm quite sure a minimum of a 5870 or equivalent performance is to be expected. -
And I do not believe for even a second that next-gen consoles will be running UE3 DX11 on max. There is a huge difference between a tech demo and what is actually viable for gaming. No one is playing the Heaven benchmark, so I really don't care how many GTX 580s were used for that tech demo.
I still believe these consoles are being rushed out. The Nintendo Wii U, I think, took Microsoft by surprise, and when Sony heard in the wind that Microsoft had developers working on next gen, they rushed theirs as well. Both companies wanted the 360 and PS3 to last 9 years, and 2012 is not 9 years. For both companies, I think it took almost 3-4 years to recoup development and manufacturing costs on the 360 and PS3. So I see these next-gen consoles as low investments in both time and cost: something AMD can have ready and available, like maybe a quad- or 6-core based on AMD's Phenom II tech paired with an HD 5750 or so. Or, if the dual-GPU rumor is true, a CrossFire HD 5650.
And consoles work very differently from PCs. You can't judge the capabilities of consoles based on a tech demo on PC. The 360 and PS3 can issue nearly 10x more draw calls right now than a PC can, due to PC limitations from DirectX, API overhead, and operating-system layers. -
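The shape of that draw-call gap can be sketched with simple frame-budget math. The per-call CPU costs below are hypothetical round numbers I picked for illustration, not measured DirectX figures:

```python
# How many draw calls fit in one frame if each call costs a fixed
# amount of CPU time? Per-call costs are invented round numbers.

def max_draw_calls(fps_target, us_per_call):
    """Draw calls per frame, given a fixed per-call CPU cost in microseconds."""
    frame_us = 1_000_000 / fps_target
    return int(frame_us // us_per_call)

print(max_draw_calls(60, 40))  # through a thick PC API layer: 416 calls/frame
print(max_draw_calls(60, 4))   # "to the metal" on a console:  4166 calls/frame
```

With these invented costs the console gets roughly 10x the calls per frame, which is the shape of the claim above: lower fixed overhead per call, not faster hardware, is what opens up the budget.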
Again, big mistake. All modern games now require at LEAST 2GB, and once you subtract the DDR3 RAM dedicated to other hardware, that leaves around 1.5GB or less. That amount is going to be quite a bottleneck for devs.
But, of course, all of that could change.
At least the CPU seems nice, though the next PlayStation's CPU will probably be better, because I'm sensing the Xbox's CPU will be AMD, and AMD isn't very strong on CPUs. But if they put an AMD GPU in there, they can use Fusion and increase the bandwidth, which is a good plus. -
- In terms of multi-threaded processing, the AMD Bulldozer is better than Intel's Sandy Bridge. It just happens that we don't actually use a lot of multi-threaded software that takes advantage of what Bulldozer offers; not even Windows 7, which is better optimized for Intel than for AMD. AMD claims Windows 8 should resolve that issue.
- Because console game development is direct-to-the-metal, they won't have any of these issues, and an AMD CPU for console gaming will destroy anything from Intel on PC gaming.
And actually, despite what a lot of people in this thread think, the PS3's Cell processor isn't that great at all. It's pretty awesome on paper, but in practice it's not. Sony's PS4 definitely will not use a Cell processor again. No one liked developing for the PS3, and the biggest problem was the Cell. -
That's the issue: they rush things out. If Sony hadn't rushed, and had been careful and smart, they'd have crushed the Xbox from the beginning.
If I'm not mistaken, draw calls are a measure of how much can be rendered and processed in a scene? -
Well, the Cell is good on paper, that's true, but when a team is dedicated to using it wisely (Naughty Dog and Guerrilla Games, to name a couple), it can turn out to be a top-end CPU. -
The companies that did well with Sony's Cell processor ONLY made PS3 games, as far as I know. For everyone else it was a pain in the butt.
- Also, these companies are subsidiaries of Sony...
I still think cost will be the biggest factor here. Consoles are popular because anyone with $300-350 can have a gaming machine; if you had $600 you would build a PC. You can build a nice HTPC with the Black Friday and holiday deals on Newegg and other sites, and something like a Phenom II with an HD 6870 is still much better than my gaming laptop. Given the current economy, I don't think MS and Sony will charge more than $350-400 at most.
How do you sell an HD 5870 plus a 6-core Phenom II with 2GB of RAM for $450? That's a massive loss, I would think. -
- But an HD 5770 with to-the-metal game development is probably a lot more powerful than we expect based on PC performance. See, I think that's the problem here: we are measuring performance by PC standards. Look at what game developers can already do on a souped-up ATI X1900; it's so old that it's an ATi, not an AMD! That's how old the 360's GPU is. An HD 5770 with 2GB of available RAM is already something like 10x more powerful. -
But still, they'll want these to last at least 9 years again, I believe, so they'll want something powerful: pretty much what you said about the GPU, but a tad more powerful on the CPU side.
They have to avoid Sony's mistake: the semi-proprietary CPU. If they avoid such mistakes, they're good to go. -
Rumor is that Xbox dev kits have been released to game developers. If this is true, then the next-gen consoles will be working with current tech -- probably the cheapest hardware available as well.
There is a good in-depth conversation on this topic at the [H] forums with some very good info, if anyone wants to dig a bit deeper into the subject.
IMO they will use current tech: a cheap AMD quad-core, a mid-range AMD/Nvidia desktop card, and no more than 2GB of DDR3 RAM. And all the console gamers will go "WOWW. Best. Graphics. EVAAAA." -
-
Back in the day, the economic crisis wasn't as big as it is today. Sony and MS know that if they want to reach the biggest audience possible, they'll need to lower the price so nearly everyone can afford one. -
-
-
Seems a little early, IMO, but maybe they're trying to get it out ASAP following the Wii U.
-
With console development being very low-level and optimized for a specific system, it is true that you can squeeze more out of equivalent hardware in a console than in a PC; that's just how it works. So in that respect a 5770 would probably end up being pretty respectable. Whether it could stand up to an SLI gaming rig I don't know, but it would be pretty competent.
Also, draw calls are a limitation on PC, but you just design your game to work around it and use other merits to your advantage. I agree it would be better if we didn't have that limitation, since it would allow us to do more, but at this moment in time it's not as much of an issue. There was a thread about this earlier; it's true that consoles can make far more draw calls, but no one in their right mind would say that consoles compare to a high-end PC anymore. How much of an improvement we would see on PC if they fixed it is anyone's guess.
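The "work around it" idea is usually batching: merge objects that share state so they go out in fewer draw calls. Here's a toy sketch of the grouping step; the scene data and function names are hypothetical, not any real engine's API:

```python
from collections import defaultdict

def batch_by_material(objects):
    """Group meshes by material so each group can be issued as one draw call."""
    batches = defaultdict(list)
    for obj in objects:
        batches[obj["material"]].append(obj["mesh"])
    return batches

# Hypothetical scene: three objects, two shared materials.
scene = [
    {"mesh": "rock_a", "material": "stone"},
    {"mesh": "rock_b", "material": "stone"},
    {"mesh": "tree_a", "material": "bark"},
]
batches = batch_by_material(scene)
print(f"{len(scene)} naive draw calls -> {len(batches)} batched")  # 3 -> 2
```

Real engines do this with shared vertex buffers and instancing rather than Python dicts, but the principle is the same: the fewer state changes per frame, the less the PC's per-call overhead matters.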
As for the console hardware, I'm still predicting that Sony will continue to use the Cell. If they do as others have predicted and ditch it for something like an AMD Opteron, I'd be over the moon, since it would make consoles operate a lot more like PCs, which would in turn probably mean higher-quality ports (not to imply that I approve of console ports).
-
masterchef341 The guy from The Notebook
Here's what I speculate:
CPU: custom AMD octo-core x86-64 or ppc processor - $100
GPU: cut down version of a 6870 - $100
Memory: 4 GB very fast memory shared by the GPU and the CPU. - $40
Hard Drive: 60 GB or whatever - $25
Other components (motherboard, 802.11, controller, bluetooth, psu, case, etc): $100
Cost (of parts only) at launch: ~$365
Street price $399. They'll break even right out of the gate rather than sell at a loss. This would be an epically powerful machine; there is a definite performance benefit to optimizing your software for a specific platform. This would run circles around any single-GPU PC available today (for games). -
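Summing that speculated bill of materials (note the listed parts actually total $365, leaving a thin margin against the $399 street price):

```python
# Parts and prices copied from the speculation above.
parts = {
    "CPU (custom AMD octo-core)": 100,
    "GPU (cut-down 6870)": 100,
    "4 GB shared memory": 40,
    "60 GB hard drive": 25,
    "other (board, wifi, controller, PSU, case)": 100,
}
total = sum(parts.values())
margin = 399 - total  # street price minus parts cost
print(total, margin)  # 365 34
```

That $34 margin is parts-only; assembly, shipping, and retail cut would eat into it, so "break even" is the optimistic reading.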
I doubt it's based on the 5770; not sure where that info surfaced anyhow. They'd be silly not to utilize 28nm tech for better overall heat, power, and performance. And just because dev kits came out doesn't mean the final hardware has to be something like an HD 5770, or that the hardware is even finalized yet. Perhaps they are using current tech to test and develop against as a performance target, but that in no way means the final spec has to use tech that exists today. The Xbox 360's GPU was something like a 7900 GT, which came to the PC around the same time the Xbox 360 itself was released.
Heck, maybe this time around it will be based on the AMD 7900! How ironic. -
usapatriot Notebook Nobel Laureate
I think quad-core will be the standard for next-gen consoles, along with a <$399 price point for the basic system.
Rumors of Next Gen Consoles, Already Outdated
Discussion in 'Gaming (Software and Graphics Cards)' started by Zymphad, Nov 23, 2011.