The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Terrible AMD CPU may finally force devs to use proper Multi-Threading in Games!

    Discussion in 'Gaming (Software and Graphics Cards)' started by Zymphad, Oct 6, 2013.

  1. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Ever since the 360 and PS3 have been around, I believe we have all suffered from games not using all that the PC has to offer, namely multi-threading. That goes for both CPU and GPU: the CPU work of feeding the GPU is still single-threaded.

    The APU used by the consoles, based on Kabini/Jaguar, whatever, has already drawn complaints from devs. It's weak, super weak compared to the GCN GPU. Per core, it's pathetic. To actually get anything done, devs will have to really make sure everything is multi-threaded to spread the work among the 8 cores.
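    The "spread the work among the 8 cores" idea can be sketched roughly as follows (an illustrative Python stand-in, not any engine's actual code; a real engine would use a native job system, and in CPython the GIL means compute-heavy work needs processes rather than threads for true parallelism):

```python
from concurrent.futures import ThreadPoolExecutor

N_CORES = 8  # the console APU's core count

def simulate_entities(chunk):
    # Stand-in for per-entity game work (AI, physics, etc.)
    return sum(e * e for e in chunk)

def update_world(entities):
    # Partition the entities into one chunk per core, fan the
    # chunks out to a worker pool, then combine the results.
    chunks = [entities[i::N_CORES] for i in range(N_CORES)]
    with ThreadPoolExecutor(max_workers=N_CORES) as pool:
        return sum(pool.map(simulate_entities, chunks))

entities = list(range(10_000))
assert update_world(entities) == sum(e * e for e in entities)
```

    The point the post makes is that on weak per-core hardware, everything that can be carved up this way has to be, or the spare cores sit idle.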

    I wonder if NBR has thought about this. It means games will run even better on PC, for those of us with i7 quads with HT, which destroy the far more powerful 8-core AMD chips used in desktops. Last I checked, even our mobile i7 quads are faster than AMD's desktop 8-cores, which in turn are way more powerful than what is being used in the consoles.

    This also means PhysX will finally be multi-threaded. With the consoles using such weak CPUs, there is no way Nvidia could encourage games to use PhysX unless they make sure PhysX is thoroughly multi-threaded now. At least that's my hope.
     
  2. MegaBUD

    MegaBUD Notebook Evangelist

    Reputations:
    45
    Messages:
    670
    Likes Received:
    3
    Trophy Points:
    31
    You don't go with AMD for raw speed... you go with AMD for the price/performance ratio.

    Also the easy overclocking.

    Maybe one day video games will use more than 4 GB of RAM...
     
  3. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    As weak as the Jaguar cores may be, the CPU component of X-Box One and PS4 is still about 10x faster/more powerful compared to the previous generation if I'm not mistaken.
    Granted, I would have preferred AMD use similarly clocked Steamroller cores instead (which would likely be far better), but for consoles, this will do.
    Also, consoles do come equipped with fully enabled HSA (like Kaveri will have).

    I think the main point here is that the consoles got a significant hardware boost for one thing (especially on the GPU), and those games will be coded for x86-64 and multi-core CPUs (and probably to specifically take advantage of AMD hardware).

    Also, the CPU is not really as important as it once was. More and more things are delegated to the GPU.
    Games (for the most part) rely on the GPU, not the CPU.
    The differences in architecture between consoles and PCs created relatively lousy ports in the past... but now things will be much easier to work with, and with HSA in the picture (which seamlessly makes life easier for programmers), AMD hardware could effectively trounce anything Intel has to offer several times over on brute force alone (when it comes to HSA at the very least - or so goes the theory).

    And with the incoming possibility of Mantle... well, I would say that Nvidia's PhysX doesn't really stand a chance. PhysX is proprietary, whereas AMD uses open software - and with consoles taking advantage of what AMD has to offer, Nvidia would probably have to make serious adjustments.
     
  4. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    People keep saying Mantle is "open," but I seriously couldn't find anywhere that this is confirmed. I watched the entire AMD GPU14 live stream and read through all of DICE's slides on Mantle, but neither mentioned it being open anywhere; they just talked about Mantle being for GCN and improving performance on GCN.
     
  5. Benmaui

    Benmaui Notebook Evangelist

    Reputations:
    153
    Messages:
    577
    Likes Received:
    169
    Trophy Points:
    56
  6. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    I'm getting conflicting info on whether Mantle is open or not. I've read articles that totally contradict each other. I really hope it's open. I would consider that a genuine step forward for PC gaming.
     
  7. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    AMD confirmed that the PS4 and XBone do not have hUMA... The CPU just plain sucks. HSA is just another name for APU/Fusion, which still sucks.
    - I also doubt that the AMD CPU is a significant upgrade over the PS3's Cell.

    Mantle is AMD proprietary... and it's only available on certain GCN cards. And Mantle is a step backwards from the straight-to-the-metal coding PS3/360 devs were using before. I see no reason for a dev to use Mantle except to get paid by AMD to do so. And if Mantle gets in the way of DX 11.2 in any way, Microsoft will destroy it immediately, since that's Microsoft's ticket to getting PC gamers to upgrade to W8. XBone games use DX 11.2 features that are only available on W8.
    - I hope Mantle dies in a fantastic huge fire.

    This thread is about the good news that came from MS and Sony agreeing to use such incredibly bad AMD CPUs.
    - You can read a blog from Planetside 2's devs, who are porting the game to the next-gen consoles. They go into detail about how piss-poor the AMD CPUs are and how it's forcing them to make everything in the game highly threaded to compensate. Threading wasn't enough; the work had to be distributed across all 8 cores to compensate.
    - This is good news for us PC gamers, I think, and the fact that Nvidia is still committing PhysX to the next-gen consoles means we will finally have highly threaded PhysX.

    As for the CPU not being as important today, that's complete bollocks. With a better CPU, devs can do a lot more with AI and physics, which is what will make games better, not better graphics. And the CPU is the limit for games like GTA V, for example. Even in single-player games like BioShock Infinite, the devs said the biggest limitation for their game was CPU and AI, not the 3D graphics. The number of objects on screen, the number of cars, the AI, and the number of actions they can take are all limited by the CPU. That's what limited games on consoles and then carried over to PC.

    With 8 cores and highly threaded, I expect a lot more from future games.
    - And of course the faster your GPU, the faster CPU you need.
     
    carage and HopelesslyFaithful like this.
  8. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    The PS3 Cell sucked, hard. Getting it to perform properly was difficult, and even then it only did better on certain tasks. The Jaguar cores are 8 actual processors, which will be much easier to schedule for.
     
    HopelesslyFaithful likes this.
  9. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    Yep. I wouldn't even say the Cell is much better than the Xbox 360 CPU. Jaguar is a huge upgrade.

    Also, Mantle is a direct to the metal API, so it's not a step down from what console devs were using before, it's the same thing. And it's a step up performance-wise for PC developers that choose to use it.
     
  10. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Evidence please that PS4 and XBone do not have Huma. If I remember correctly, this here ( AMD: PlayStation 4 supports hUMA, Xbox One does not - NeoGAF) states that PS4 will have hUMA, but the XBone will instead use an alternative.

    As for HSA... it's a full-blown integration where copying data back and forth between the CPU and GPU is no longer necessary, so while it is a replacement for Fusion, I wouldn't say it 'sucks'.
    Emotional much?
     
    HopelesslyFaithful likes this.
  11. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    If you read more, you'll see AMD replied saying that no such tech from Desktop Kaveri will be available on the consoles. Makes sense since Kaveri for PC won't even be released until Q2 2014 or later and these console APUs have to have been in manufacturing for months now in order to have enough stock for release day.
    - That Neogaf rumor bs is old and it's Neogaf. I wouldn't trust anything posted on Neogaf, that forum should be renamed, TrollsRUs.

    All the sites that say the PS4 will have hUMA unsurprisingly did not get any verification from AMD. They just assumed that since Kaveri may use hUMA, so will the PS4. This also assumes that the PS4 and XBone APUs are based on Kaveri, but AMD has denied both. The consoles are NOT using Kaveri; it's a previous generation. And the PS4 hUMA claim was inaccurate, just a rumor.
    - It's too bad tech sites have the worst journalists and no regard for fact checking. It's weird, too, that reputable news outlets which devote some space to gaming/new tech don't fact-check either. In general it seems being a journalist doesn't require you to fact-check anymore. "Don't publish if you don't have verification" isn't something that's approved of on the internet... Morons.

    Suffice to say, I think there are a lot of rumor mills and marketing buffoons hyping the consoles as much as they can. As more and more details emerge about the APU these consoles are using, more people are realizing how weak and terrible they are.

    I think any of us who have become accustomed to using Intel CPU and Nvidia hardware would only describe AMD APU as terrible.
     
  12. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    If I remember correctly, the AMD CPUs inside the consoles do not have hUMA; the reason people thought they did is that they have their own proprietary fixes that work similarly. The PS4 uses a bit flag to determine the origin of the information, and the ability of both the CPU and GPU to access the whole memory pool gives the PS4 a similar hUMA-like arrangement.

    And you are wrong, they are not weak and terrible. The CPUs inside these things are simply low-power designs. They are at mobile Core i3 and ULV Core i5 performance levels in multi-threading. Sure, they are weak compared to our notebook and desktop CPUs, but those also use significantly more power and are much more expensive. And they are still a performance increase over the previous gen.

    When programming to the metal, you remove significant CPU overhead, which allows far more draw calls.

    I think it's ignorance of the complete portfolio, and of what certain products are aimed at, that lets you call AMD APUs terrible. If you only see things in black and white, where the bare minimum is a Core i7 and a higher-end Nvidia GPU, obviously everything else won't cut it. That doesn't mean it's terrible, just that you won't accept anything below high-end performance.

    I wouldn't mind consoles using Jaguar cores, but I would at least have hoped for them to be clocked at 2.5 GHz or so, so that they offered a sizable improvement instead of just the bare minimum.
     
    HopelesslyFaithful likes this.
  13. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    They can't increase the clock speed because of TDP restrictions. And I personally think it was very stupid from a performance standpoint to use 8 Jaguar cores at such a low clock speed. Jaguar is an incredibly weak architecture meant for tablets and other super-low-power devices. I don't understand why they couldn't have used a quad-core part at twice the clock speed from the much better Piledriver architecture, which also powers the desktop Vishera FX-series CPUs. Something like the fastest Trinity and Richland A10 APUs fits in the same 30-35 W TDP envelope as the octa-core Jaguar but offers vastly improved CPU performance.
     
  14. sniffin

    sniffin Notebook Evangelist

    Reputations:
    68
    Messages:
    429
    Likes Received:
    256
    Trophy Points:
    76
    Saying that Jaguar sucks is a little ridiculous. In a gaming PC it does, but consoles don't have to deal with huge amounts of overhead.

    Xenon and Cell are dinosaurs that don't even compare to Jaguar, and yet they play the same games 2013 gaming PCs equipped with the latest Intel tech do, albeit with lower detail. Close to the metal programming makes this possible. PCs have mountains of overhead to deal with that consoles just don't.

    It's the price PCs pay for having interchangeable software/hardware. It's open but it's also extremely inefficient and hardware won't ever be properly utilized.

    Jaguar is restricted to 2 GHz because of the physical layout of the chip. High-frequency designs require more area; Jaguar cores are very tiny, so the kind of frequencies you see on Intel's and AMD's bigger cores aren't possible.
     
  15. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    TDP is not an issue here. Tweaking the architecture to fit the needs of the client is rather easy to do; the main thing here is cost. Jaguar CPUs are certainly made for low-power devices, but by saying "tablet" devices (it is also for notebooks and netbooks) you are misjudging the actual performance of these chips. This is not an "iPad" CPU. There is no tablet CPU right now on the same level unless you count the top-end tablets that use ULV Core i5s. Using low clock speeds with multiple cores is better when you have multi-threading in mind instead of raw muscle.

    I do agree, and I wish they had gone with a more powerful CPU variant such as Steamroller, but even the 2 GHz limit, instead of 1.6 GHz, would have been a benefit. At any rate, there isn't much you can ask from a $399 machine. Hopefully this will mean more focus on multi-threading so we can better utilize the CPUs.
     
  16. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Of course TDP is an issue. It has been a big issue from a design standpoint from day one. Have you seen the massive fan inside the Xbox One as well as the much-smaller power brick and much-lower power consumption compared to the first Xbox 360 model? It's clear Microsoft made low-noise, low-heat, and longevity a major design focus with the new console. They don't want a repeat of RRoD this time around.

    And of course I was referring to x86-based tablets, not ARM-based ones like the iPad. It's not an apples-to-apples comparison. And your assertion that no tablet CPU can compete with the Jaguar except for the ULV Core i5 in the high-end devices like Surface Pro is false. AnandTech already reviewed the quad-core AMD A4-5000 Kabini APU, which is near-identical to what is in the consoles except it has one module instead of two and slightly-lower clocks. So the CPU power of the next-gen consoles is a known quantity at this point, and it is seriously unimpressive in my eyes. The i5-3317U in the Surface Pro obviously smacks it around, but even the Atom Z3770 is faster while consuming less power:

    AnandTech Portal | The AMD Kabini Review: A4-5000 APU Tested

    AnandTech | The Bay Trail Preview: Intel Atom Z3770 Tested

    You take the 15 W A4 APU, increase clock speed from 1.5 GHz to 1.6-1.75 GHz or whatever it is on the consoles, add another quad-core module, and you've essentially got the 30-35 W part in the next-gen consoles. Per AnandTech:

    So to answer your question, that's why it isn't 2 GHz.

    Also, no matter how much multi-threaded optimization the game devs do, the poor single-threaded performance of the Jaguar will still come back to haunt them. Games won't even be able to utilize and scale up to all eight of the CPU cores, because as much as half of them (one module) will be reserved for system overhead like the OS (two OSes in Microsoft's case). This has already been confirmed by M$ and Sony, and it is what the first paragraph of the above quote is referring to. So weak cores, plus not even being able to use all the available cores and CPU power for the game, will kill performance. Now it's just time to kick back and watch what the magic of console optimization does to alleviate these obvious hardware shortcomings.
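    The arithmetic behind "weak cores plus reserved cores" can be made concrete with a toy model (the numbers here are illustrative assumptions, not published figures):

```python
def game_cpu_budget(total_cores, reserved_cores, per_core_perf):
    # CPU throughput left for the game, in arbitrary per-core units,
    # assuming perfect scaling across the remaining cores.
    return (total_cores - reserved_cores) * per_core_perf

# 8 weak Jaguar cores with 2 reserved for the OS vs. a hypothetical
# 4-core part with 2.5x the per-core performance and nothing reserved:
assert game_cpu_budget(8, 2, 1.0) == 6.0
assert game_cpu_budget(4, 0, 2.5) == 10.0
```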
     
  17. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Again, TDP is not an issue. These machines have a much lower TDP than the PS3 had. They designed the cooling solution based on the final TDP, not the other way around. The Xbox One is almost the size of a small desktop and bigger/thicker than my laptop, which has a massively higher TDP and a much smaller fan. Just compare the Xbox One to the PS4 and you will see how nearly identical TDPs get very different cooling solutions and sizes. TDP can be easily managed at such low numbers.

    The Xbox 360 had woeful cooling and issues that they are trying to completely eradicate with the Xbox One, hence the massive cooling solution. Its manufacturing was lackluster and its cooling solution inadequate, on top of a higher TDP than they have now.

    As per BOTH your links: if you look at the single-threaded and multi-threaded scores, then add twice the cores and 10% or more clock speed (the final clock hasn't been confirmed; it could be as much as +25% per core), it does beat the i5 and the other ultra-low-power solutions in multi-threading, while obviously lagging in single-core performance.

    General multitasking will be fine. Regular programs, the running OS, and popular software will run just fine, as they already run on lower-end hardware. Gaming can be heavily optimized, and to make up for the lackluster single-core performance, developers will have to offload tasks to the GPU cores and to the rest of the CPU cores. There is a lot of room for improvement in multi-threaded software; before, we at least had decent IPC. Now the challenge is to actually make it work on fully multi-core systems that lack punch per individual core.

    Again I am not saying these CPUs are not weak. They are, but they are not nearly as terrible as many make them out to be, as they are still adequate for gaming when you code around their obvious limitations. For example, most popular MMOs right now, like Tera, rely heavily on single core performance. Such games would absolutely not run as they are on the consoles. They would need to write the code again and make it completely multithreaded in order to get acceptable performance.

    We simply can't compare them to the current CPUs we have, because they are of course not in the same league. This isn't only console optimization, but a forced, no-choice, multi-threaded approach that will most likely hinder launch games, though later on we will be more than fine. The PS4 will not run current existing games; they will develop new games that will of course need a new approach. Any game right now that relies on CPU power will not run on the PS4 or Xbox One. But that's not the point.

    In the end, I do hope that the extra focus on multithreading will also make our current CPUs shine. But that's another case entirely.
     
    HopelesslyFaithful likes this.
  18. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    So what if the 8-core Jaguar from 2013, intended to last 10 years in the next-gen console cycle, ends up beating, by not even a huge margin, one of the slowest ULV Core i5s from 2012 in multi-threaded performance? It still gets destroyed in single-threaded grunt. And based on what I've already said about cores being reserved for the OS, the total CPU power available for games, and hence your argument about multi-threading, will be less than the sum of its parts.

    To be honest, all of this is nothing to be proud of and we can talk about it all day long but it doesn't change the fact that there is a serious imbalance going on here and they are severely gimping the consoles from the outset. Think MSI GX60/GX70 level of imbalance but worse.

    This is in stark contrast to when the Xbox 360 came out and performed right up there with the high-end desktop PCs of the day. And PC hardware hasn't gotten any more expensive than before; if anything, it's cheaper now. Meanwhile here we are with the XBONE and PS4 getting a laptop CPU and GPU. M$ and Sony could still make a huge profit playing the loss-leader strategy again, because this next console cycle will surely last longer than the previous one. They wouldn't even need to raise the current console prices by much, or at all, to put in hardware that's a lot more powerful. Don't get me wrong, I'm not a console gamer and I don't care about console gaming at all, but this just feels like a cop-out.
     
  19. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    It's not even as unbalanced as you may think. Consoles can be programmed to utilize the hardware differently because they are closed hardware, something that is not normally controlled on PC. While the CPU is weak, you do not need an extremely beefy CPU for a great many tasks in a typical game. I keep repeating that you shouldn't be comparing this to current CPUs because that's not even the purpose. As a dedicated gaming machine, they simply aimed for cheap but capable hardware in general.

    You might see the GX laptops as terribly unbalanced, but that's a biased view, because they have a powerful GPU with a weaker CPU. High-end Core i7 computers are unbalanced in the sense that the CPU is hardly utilized while the GPU is taxed completely, limiting performance on the GPU end. The GPU is the main thing for graphics, which is why we favor it so much more. Yet as soon as a game taxes the CPU we call it badly optimized, because it shouldn't rely on or use that much CPU. And still we demand absolute powerhouse CPUs for consoles that might not even need them.

    You talk about imbalance from a PC standpoint, not from the intended use of the device. This console won't be programmed like a PC. It won't play PC games. It won't be open and upgradeable like a PC.

    Whether it will last the intended 10 years or whatever is a completely different matter. Will it last? Yes it will. Will it look awesome or play awesome in 10 years? Everything points to no. Single-threaded performance is meaningless if what you want to achieve can be done multi-threaded. We already know the IPC leaves a lot to be desired, but we also know that's not the be-all and end-all of gaming performance, especially when you can code directly to the resources you have. Even if 2 cores are set aside for other OS tasks, 6 cores can still be used.

    Now keep in mind the PS4 and Xbox One are not high-end devices for gaming. They are simply the strongest consoles yet. When the Xbox 360 came out, it used a high-end PC GPU with a decent CPU. PC hardware has EXPONENTIALLY increased in performance, complexity, and TDP since then. PC GPUs or CPUs can individually consume more power than an entire Xbox 360. We can afford this with proper cooling, ventilation, power, etc., but it gets difficult for a console to keep costs and noise acceptable. Especially price, mostly because they want profit.

    Sony and MS just went with the bare minimum that gave a considerable increase in performance over the previous gen, because they can no longer compete with PCs at an affordable price. As you mentioned, PCs became cheaper, which is why the PS4 and Xbox are cheap and pack a punch for the price. But we use machines that cost several times more, with several times the performance capability. Sadly, console gamers don't demand as much as we PC gamers do. Hell, the reason I am a PC gamer too is that I like performance, graphics, power, everything. But I also like a lot of the games released for consoles. Despite limited hardware, with the proper resources and creativity you can achieve a lot.

    All I am expecting from the PS4 is something they already achieved, of sorts, with the PS Vita: a seamless experience. The UI doesn't take a long time to use and everything feels fast and responsive. I can't stand the loading of the UI on the PS3 and Xbox 360. The time it takes to quit something. Simple tasks take forever. The limited memory doesn't allow for proper multitasking. The new consoles fix that. They will just run into the issue of not being nearly powerful enough compared to a PC.

    Hell, if you consider that the Xbox One has something below an HD 7790, why would you pair it with a powerful CPU to begin with?
     
  20. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I don't know, man. You say CPU power is unimportant for games, but I have to strongly disagree. The CPU is what feeds the GPU, and it definitely needs to be fast enough to prepare frames well in advance of the GPU rendering them. Otherwise you run into a bottleneck where the GPU is always waiting on the CPU. This is why the MSI GX series is unbalanced: the GPU is just so much faster relative to the CPU. It's not a biased view, because all the objective data points to this.

    You'd be surprised how much work, even rendering, is done on the CPU in a modern game. So your assertion that i7 laptops are unbalanced in that there is too much CPU power for the GPU is false. If you played the BF4 Beta, you saw how it consistently used 80-90% of overall CPU power on an i7. It didn't matter how much you turned down the settings; it always used that much of the CPU. The Xbox 360 and PS3 are only 24-player, so the processing requirement is much lower, but next-gen will be full-fledged 64-player, and you have to wonder what that'll do to the weak Jaguar, even with optimizations. There are certain types of games that will just be hard on the CPU no matter what.
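    The feeding argument reduces to a simple throughput model: in a pipelined renderer, frame rate is capped by the slower of the two stages. A sketch with made-up timings:

```python
def pipelined_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    # With the CPU preparing frame N+1 while the GPU renders frame N,
    # throughput is set by the slower stage, not the sum of both.
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# A GPU good for 125 FPS (8 ms/frame) behind a CPU that needs
# 20 ms/frame is capped at the CPU's rate:
assert pipelined_fps(20.0, 8.0) == 50.0
```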
     
  21. TheBlackIdentity

    TheBlackIdentity Notebook Evangelist

    Reputations:
    532
    Messages:
    421
    Likes Received:
    1
    Trophy Points:
    31
    CPU power is extremely important for games. I learned that the hard way. I listened to all these morons up here who say it's unimportant, and then I got 15-20 FPS in older modded games on my 2670QM simply because it couldn't keep up with the single-threaded performance requirements of the given game. In newer titles it's not as important, but it's still great to have a strong CPU. You ALWAYS want to be GPU-bottlenecked, as that's the most important thing for gaming performance. Having a strong CPU that can fully load the GPU in all scenarios comes right after that on the priority list.
     
  22. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    It all depends on the game. What games drop to 15-20 FPS on a 2670QM? Sure, if you pair it with a GTX 780M it might be problematic. But if you keep a similar-generation CPU and GPU together, there usually won't be a problem.

    And on the topic of single-threaded vs. multi-threaded: not everything can be multi-threaded. Some things just need single-core grunt. Multi-threading has its advantages, primarily with graphics, where most operations are predictable and can fairly easily be scheduled across lots of threads. Many other operations aren't nearly as effective when multi-threaded and waste a lot of CPU cycles waiting for other threads to complete. So it will have to be a highly engineered balance of resources. You don't want to waste ticks waiting on another thread for AI when they could be used to optimize FPS, etc.
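    The limit this describes is Amdahl's law: the serial fraction of the work bounds the speedup no matter how many cores you add. A quick sketch (the 70% figure is an arbitrary example, not a measured number):

```python
def amdahl_speedup(parallel_fraction, cores):
    # Amdahl's law: the serial portion (1 - p) caps overall speedup.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A game loop that is 70% parallelizable gains only ~2.6x on
# 8 cores; even perfectly parallel code tops out at 8x.
assert round(amdahl_speedup(0.70, 8), 2) == 2.58
assert amdahl_speedup(1.0, 8) == 8.0
```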

    I am still pretty discouraged by the meager specs of these consoles, though. Nobody is willing to take any risk anymore. But honestly, I feel the bigger risk is that people are going to see neither a huge improvement in graphics fidelity nor in gameplay over previous-gen games, and definitely nothing beyond what any PC, even a current high-end laptop, can offer. I'm OK with it if they come out with some pretty awesome and unique games and not the formulaic crap that's been released year after year (i.e. CoD).

    With 4k screens right around the corner, do they really feel that these consoles will satisfy users running games at 720p? Some games will manage 1080p, but most, likely not, let alone anything having a chance of running at 4k. Maybe if they refresh the cycle to once every 5 years then perhaps it's not a bad thing.
     
  23. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    I agree that the specs are rather disappointing in the end, especially, as you mentioned, with 4K resolutions around the corner. Compared to the previous gen, though? It's already a massive jump, thanks to the previous gen being seriously underpowered by now. But it is disappointing that they don't even pack enough punch to make 1080p a standard, particularly the Xbox One.

    I really don't see a 10 year cycle as they keep mentioning. It will last much less than that.

    I really hope we see some new good ideas for gaming. I won't expect anything else because I already have a gaming computer for that.
     
  24. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I concur. I think many people will be disappointed when they realize that native-1080p and High/Ultra settings is still a pipe dream with the "next-gen" consoles. And with the way Microsoft is pushing all these non-gaming related services it's clear that gaming isn't even the highest priority this time around. The new consoles are really supposed to be jack-of-all-trades media center set-top boxes to serve all your connected entertainment and social networking needs in the living room. In essence, Microsoft's and Sony's own bid to take over the living room and compete with Valve, Apple, and Google.
     
  25. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Also, even if these consoles manage to push 1080p, they are only targeting 30 FPS in most titles. The worst thing is that, unlike with previous consoles, devs already know how to maximize most of the resources available to them, so it's unlikely things are going to improve much over time.
     
  26. Benmaui

    Benmaui Notebook Evangelist

    Reputations:
    153
    Messages:
    577
    Likes Received:
    169
    Trophy Points:
    56
    Indeed this should be their plan, though I doubt they will understand this until the next next gen. As things are now, they know most people will buy the PS4 and Xbox One for as long as there is nothing new to replace them.
    I can definitely see the PC gamer market getting bigger in the years to come because of this.
     
  27. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    Considering how Sony is pushing their brand lately as a service, like PS3 gaming offered via Gaikai, I doubt we will see a PS5; instead we will have a service to play on the system of our choice...

    Oh well.
     
  28. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    With internet infrastructure being so patchy around the world, I don't see Gaikai as a good option for PS5 games. Perhaps for older and less demanding games, to get around backwards-compatibility problems. The idea is there, but it could isolate too much of the market even 5-10 years from now. I'm making this point with Australia's internet roadmap in mind - absolutely dismal.
     
  29. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,547
    Messages:
    6,410
    Likes Received:
    4,085
    Trophy Points:
    431
    We definitely need some years to improve it. But hey, even here in Mexico I have a 200 Mb/s symmetrical connection, so there is hope for the rest of the world! :)
     
  30. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Then you have no idea how bad the situation is in even "developed" nations like Canada and Australia/New Zealand. We have quite a ways to go in some parts of the world, because they're decades behind at this point, and these aren't even third-world countries we're talking about.
     
    TBoneSan likes this.
  31. ajnindlo

    ajnindlo Notebook Deity

    Reputations:
    265
    Messages:
    1,357
    Likes Received:
    87
    Trophy Points:
    66
    One thing to point out: consoles are designed for the living room. They might be thrown in an AV rack, or a box under the TV, or just laid on carpeting. My point is that they have to run cooler since they might be partially enclosed. They also need to be quieter, because people would complain about a loud fan. So the size of the unit, where it might be put, and noise all create limitations on what they can do.

    I think both MS and Sony learned a lesson from the red rings of death. That cost MS a lot, in money and publicity.
     
  32. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,049
    Trophy Points:
    431
    All this arguing.. These consoles are going to perform swimmingly. Rest assured.
     
  33. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Even in the USA there are still lots of places with dial-up, old slow ADSL, or limited-bandwidth satellite. All this online activation segregates the crowd to begin with, let alone streaming and downloading games.
     
  34. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
    On the architecture we're currently insisting on using, that's of course the case. Still, if game developers were on top of their game, they would be able to create schedulers that would ensure execution and response time, instead of always pushing towards the bandwidth limits during bursts of loading, or creating context switches that require instant object creation when that's not actually possible, etc. There are lots of things that could be done more cleverly on a completely linear architecture like the one the x86 instruction set is created to exploit.

    Not that it's going to be worth it to actually create clever code like that on the "new" consoles. Since the result will essentially be exactly the same as with the other approach. That's what the hardware is designed for. I.e., the only reason to go through a very long process that in the end would likely impose a couple of new limitations on resource use, etc., would be to lower the minimum specs for a playable game a little bit.

    So it's not going to be worth it, until we actually sit on a system with an integrated bus and multiple programmable processing elements. ..which of course we essentially did, when the Cell processor became the first system of that kind that could be purchased for money.

    But hey - who cares, right? It only had 512 MB of RAM, after all. And a maximum clock speed that was slightly lower than an Intel system's. So clearly it sucked, and had no value to anyone.

    Seriously, though - it's going to be years until another system on that level is designed. And then we're going to run into lawsuits with AMD, Intel, Nvidia and Microsoft over everything from where higher-level instruction set logic would be placed, to chip design, to patent infringements. So basically, we're going to be stuck with this crap forever.
     
  35. ajnindlo

    ajnindlo Notebook Deity

    Reputations:
    265
    Messages:
    1,357
    Likes Received:
    87
    Trophy Points:
    66
    Speaking as a programmer, it is not so easy to make games multi-threaded. Multicore CPUs have been out for a long time. Most of those programmers are very smart. They have picked the low-hanging fruit, and even stood on the shoulders of others to get higher fruit. I am pretty sure those professional game programmers are on top of their game. Of course, a lot more man-hours can always tweak a little bit more performance, but cost versus payback drops considerably after a few passes.
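    A rough illustration of why the remaining fruit hangs high: per-entity work with no shared state parallelizes almost for free, and that is exactly the part that was picked long ago. This is a hypothetical Python sketch, not code from any real engine:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def update_entity(pos, vel, dt):
        # Independent per-entity math is the easy, long-picked fruit:
        # no shared writes, so workers never contend.
        return pos + vel * dt

    def update_all(entities, dt, workers=8):
        # Fan the independent updates out across a worker pool. The hard
        # part (collision resolution, AI reading other entities' state)
        # is deliberately absent: that is where ordering and locking
        # creep in and the payback starts shrinking.
        with ThreadPoolExecutor(max_workers=workers) as pool:
            return list(pool.map(lambda e: update_entity(e[0], e[1], dt),
                                 entities))

    print(update_all([(0.0, 1.0), (5.0, -2.0)], 0.5))  # [0.5, 4.0]
    ```

    The moment entities interact - physics, pathfinding around each other - the updates stop being independent, and that is where the cost-versus-payback curve described above kicks in.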
     
  36. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
    Sure, sure. And teams of programmers happily work together with budgets on clock cycles. Happens practically all the time. ... Anyway - with current tech and x86, no one can rationally argue for putting more hours into coding, or for stalling production elsewhere in the developer house based on some technical fancy someone might have. It's not going to pay off, either monetarily or otherwise. So it's not done.

    And that's also the reason why programmers on projects say they want a faster x86 system along with a faster graphics card, when they are asked what they wish for in the future. Because that's the easiest way to do it in the short term. And the coding guy can be put to tweaking "performance" out of direct addressing hacks.

    Another thing - the way multicore on an AMD or Intel config is set up, adding cores to increase the number of threads isn't going to work. Stores are not big enough, and cores cannot context-shift fast enough independently. Meanwhile, ensuring running time on an eight-core AMD system is difficult, since response differs depending on the load config. I.e., all the Sony and MS talk about multitasking is based on non-technical people spouting hopeful things about the cheapest and simplest system they could make.
     
  37. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    Anyway, my point wasn't about how bad the AMD APUs are. The point is, BECAUSE the AMD APUs are such garbage, the console developers are now being FORCED TO make their games highly threaded and use all 8 cores of the AMD garbage. That's good for us, since most of us are running Intel quads with Hyper-Threading.
    - And my drivel isn't just me spouting what I want. This is based on what game developers are saying now. The AMD APU is too damn weak, and to compensate for how bad it is, they have to make their games as threaded as possible to use all 8 cores. Otherwise they won't be able to do what they want. The 2 GHz limit on a weak CPU is a huge problem otherwise.

    I don't get why some in this thread are so hung up on my opinion of how garbage the AMD APU is instead of being excited that games will finally be super threaded, which just means more awesome for us. Dunno about you, but I'm happy that my CPU is running at 4.1 GHz across all 4 cores. 8 logical cores at 4.1 GHz? Massively threaded games? Yes please, I want lots of that.

    I wouldn't bet on it. My prediction: within a year, most games pushing the latest graphics developments will be at 720p, barely keeping up at 30 FPS. If a game is running at 1080p, then it will be like Titanfall, using the fastest engine possible and relying on artists and texture detail rather than utilizing any of the new tech to maximize performance.
    - This AMD APU, rumored to be based on Jaguar, is already OLD even for AMD...

    Hopefully what will perform swimmingly is console games that support 8 cores/8 logical cores on PC!
     
  38. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    It was claimed that one of the launch titles, Ryse, would run at 1080p; however, this was later confirmed to be reduced to 900p.

    :rolleyes: The Xbox 360 upscales games to 1080p as well, and they don't quite look "amazing" on a 1080p display. In a few years, when 4K displays are less expensive and more widely available, I imagine this console hardware will be upscaling everything other than Virtua Tennis, etc.
     
  39. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    The difference is the Xbox 360 is upscaling from sub-720p resolutions to 1080p, so 900p will definitely look better. But I agree, this is pretty lame for "next-gen."
     
  40. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    That's the sweetness. Even at release, AMD's crap hardware won't allow them to run at 720p, even with all the DX 11.2 updates to improve texture rendering and memory management...
    - HILARIOUS.
     
  41. Beowulf112

    Beowulf112 Notebook Geek

    Reputations:
    0
    Messages:
    88
    Likes Received:
    7
    Trophy Points:
    16
    AMD is... look at the MSI GX70 and GX60... terrible. I'm lucky I picked my Acer with the 760M.
     
  42. ajnindlo

    ajnindlo Notebook Deity

    Reputations:
    265
    Messages:
    1,357
    Likes Received:
    87
    Trophy Points:
    66
    Isn't focusing on a game's resolution - 720p/900p/1080p - similar to focusing on camera resolution, or CPU frequency, etc.? I am sure we can all easily think of games that looked awesome at 720p, and others that looked terrible at 1080p. I am not saying the same game.

    My point is, I thought we were smarter than this. You don't judge a game on its resolution.
     
  43. darkydark

    darkydark Notebook Evangelist

    Reputations:
    143
    Messages:
    671
    Likes Received:
    93
    Trophy Points:
    41
    I understand there are sceptics here. But I don't think you realise how powerful optimisations can be. How the OS handles system resources is the only thing you should be concerned with. The hardware is more than adequate for 1080p gaming, and your view from a PC point of view cannot be applied here, as the PC uses way too many resources due to bad optimisation. Closed systems can offer much, much more to developers if they can control the hardware the way they want - not the way that is unoptimised, resource-hungry and full of limitations like the PC platform is. The platform diversity the PC has comes at a price, a price consoles do not have to pay.

    Sent from my HUAWEI Y300-0100 using Tapatalk
     
  44. senshin

    senshin Notebook Evangelist

    Reputations:
    124
    Messages:
    311
    Likes Received:
    11
    Trophy Points:
    31
    I think the CPU will perform well.
    Games are not really multi-threaded now, or not heavily anyway.

    This is gonna change; in my eyes the good part is that in the next few years my i7 will be utilized a lot better.

    Maybe the CPU ain't the best there, but if developers can get 8 cores fully working in-game, I think you can get some nice physics we have never seen yet.
    And the RAM upgrade is also gonna do a lot; the consoles before this were really RAM-starved.

    And resolution is not important to me; the more important factor is how the game looks, whatever the resolution is, although 720p is a minimum I think.
     
  45. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    It may be "OK" for a 720p game to run at 1080p. But a 720p game running on a 4K screen? It'd be like running a 640x480 game full screen on a 1080p monitor. No matter what you do it will look like crap.
     
  46. moviemarketing

    moviemarketing Milk Drinker

    Reputations:
    1,036
    Messages:
    4,247
    Likes Received:
    881
    Trophy Points:
    181
    Once you become accustomed to a higher resolution display, dropping down to a lower resolution for watching media content or playing games starts to look very bad.

    720p is only about 900k pixels per frame, while 4K resolution is over 8 million pixels per frame; you would be stretching an image comprising roughly 1/9 of the rendered pixel count to fit the display. Unless you are watching a really tiny display from a huge distance away, it's going to look pretty bad.
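    The arithmetic above is easy to check (taking "4K" to mean 3840x2160 UHD, the consumer standard; DCI 4K is slightly wider):

    ```python
    def frame_pixels(width, height):
        # Total pixels rendered per frame at a given resolution.
        return width * height

    p720 = frame_pixels(1280, 720)     # 921,600 - the "about 900k"
    p4k = frame_pixels(3840, 2160)     # 8,294,400 - the "over 8 million"
    print(p720, p4k, p4k // p720)      # 921600 8294400 9
    ```

    So a 720p frame carries exactly 1/9 of the pixels of a UHD frame: every rendered pixel gets smeared across a 3x3 block of the display.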

    I'm not aware of any games that would look better at 720p than 1080p. With the exception of the Dark Souls PC port, one of the benefits of choosing PC instead of console as our platform is that we are able to run games at any resolution we choose, adapting to fit the native resolution of our displays or to improve framerate, etc.

    Console players have been restricted to rendering a 720p image stretched out to fit their 1080p TVs for the vast majority of Xbox and PS3 games for the last several years, and it looks like the next generation of consoles will be similarly restricted when the market transitions to 4k displays.
     
  47. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,049
    Trophy Points:
    431
    Yes let's spend $899 on the next gen consoles.. I can't believe what I'm hearing. And this coming from a person that's been PC gaming since 1980.. I never even had a console outside of the 2600 until the ps2.

    The next gen consoles are plenty fast enough. It will be fine.
     
  48. nipsen

    nipsen Notebook Ditty

    Reputations:
    694
    Messages:
    1,686
    Likes Received:
    131
    Trophy Points:
    81
    No. What happens is that with unpredictable loads on the different cores, the running time becomes variable. The cores are also not fast enough, and don't have large enough stores, to actually complete asynchronous operations in main memory. This is the case with Intel quads as well. So for desktop use, and for completing operations that don't require set running times, many cores are useful. But for optimizing games or graphics operations, running physics, etc., it's practically useless.
    No, it's not. It's more than adequate to run any current game, and the graphics card can buff the details and post-processing massively higher.

    What is still the case, though, is that all the talk from "developers" in reality comes from publishers talking up the abilities of their hardware. For example, someone could have let on to one of the braindead Sony dudes that their OS makes use of all 8 cores at all times while running a game. And they would then produce, practically on automatic, "8 cores makes games better!". That's just how their heads are wired; they don't really think about what they're saying.

    But it doesn't make a multicore system any more necessary for playing "next gen" games. Why people believe that, or seriously think an x86 system is going to be a step up from something like the Cell-processor, is beyond me.
    Well, "super-threaded" is not a thing. And they're not going to be that either. And an APU/Radeon is the best cost/performance proposition by far, from every possible perspective, whether it's component cost or build/cooling solution/durability. Also, like I said, you have people running around playing games in Windows on Alienware laptops that effectively throttle the processors at 2.2 GHz. No one notices that, just as an example. 2 GHz is more than good enough; it's not going to be an issue.
     
  49. ajnindlo

    ajnindlo Notebook Deity

    Reputations:
    265
    Messages:
    1,357
    Likes Received:
    87
    Trophy Points:
    66
    I didn't say the same game at 720p looks better than the same game at 1080p. I am talking about games made for 720p or 1080p, and among those there are ones that look better at 720p. And that aside, if the frame rate is not high enough, then 720p will be better.
     
  50. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    We're not talking about the story or innovation of games... This is a hardware thread. We're talking about making games more optimized for current hardware. Geez, get off your soapbox and consider the context.
    - Next-gen consoles were supposed to allow for next-gen gaming performance. If they can't run at 720p, then that's very bad, sad news.

    I thought you were smarter than this. -- Note the sarcasm

    To be nice, we all in this thread will forget that you tried to compare a weak 2 GHz rubbish APU to an Intel i7 powering an Alienware. And Alienware CPUs are not throttled at 2 GHz. Good gracious, mine is running at 4.1 GHz on ALL FOUR CORES.

    Being unable to run a game at 720p in 2013 is not adequate; that's trash. You can get that kind of performance from an ultrabook running an Ivy Bridge i5 with HD 4000.

    Since you haven't demonstrated at all that you are developing games for these low-powered consoles, I'm going to take the Sony game devs' word about the inadequacies of the rubbish APU, and the evidence of the low resolutions of announced release games, over yours.
     