Apparently, if it were all about bandwidth, then a GTX 280M with roughly 60 GB/s of memory bandwidth should totally crush the HD 4870M with around 35 GB/s. Yet the performance advantage, if any, always went to the ATi card.
More marketing hype: numbers presented in very selective ways to create an impression that hardly holds water.
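As a rough sanity check on those figures, peak memory bandwidth is just bus width times effective data rate. A minimal sketch (the 280M numbers below are the commonly quoted spec-sheet values, assumed here for illustration):

```python
def bandwidth_gbps(bus_width_bits, data_rate_mtps):
    # GB/s = (bus width in bits / 8 bytes) * effective transfers per second
    return bus_width_bits / 8 * data_rate_mtps / 1000

# GTX 280M: 256-bit GDDR3 at ~1900 MT/s effective
print(bandwidth_gbps(256, 1900))  # -> 60.8, matching the ~60 GB/s quoted above
```

The same formula applied to the 4870M's quoted ~35 GB/s shows how little raw bandwidth alone predicts: the slower-bandwidth card still won.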
-
Kade Storm The Devil's Advocate
-
failwheeldrive Notebook Deity
-
TheBlackIdentity Notebook Evangelist
-
Not quite a 7970M, but not far from it either. The PS4 GPU has about 1.84 teraflops; the 7970M has 2.2 teraflops. That is computational throughput, but since both are AMD parts it should translate roughly into the relative performance of the GPU cores. But yeah, I'm with you: on a console this hardware will deliver much more than we can get out of a 7970M or 680M, because everything is optimized for that exact hardware. That was my point in my previous post; just look at the PS3's hardware and what developers still managed to squeeze out of it. In recent weeks, several game studios have welcomed Sony's choice to include 8GB of GDDR5, because they now get to include more of the graphical features they had to ditch with the PS3 and its 256MB of video RAM.
Here is a recent article from a game developer. He says "most PCs". With the right optimizations it will beat midrange GPUs for sure, including the 680M, I think. I could be wrong, but that's how I see it. I hope to see my 680M beaten, though, since I'm getting a PS4 when it's released. I'm also thinking about building a desktop with a GTX Titan.
-
But whatever, it is what it is. I could be wrong, like I said. The PS4 is a huge jump from the PS3, and that is all that matters.
failwheeldrive Notebook Deity
-
People also need to remember that the PS4 will likely be locked to 30fps at 1080p... and if the PS3 is anything to go by, consoles often miss out on a few eye-candy options compared to PC in order to keep framerates playable. Think about it: the 680M can already play almost every game on the market at 30fps or better, with all settings maxed and AA enabled.
-
This tweet from John Carmack has been making the rounds:
-
I still see it now at uni. A guy writes a piece of code that is arguably both pretty and valiant, and it runs fast; it even runs faster than the original version, which was slightly less pretty and valiant. But he never tried to abstract the code into separate, more easily built parts that could later be optimized to run in concert.
Here's one example. The guy is a million times smarter than me; he can code things as easily as I scribble nonsense on a piece of paper. But he just couldn't see that waiting on a specific instruction, halting program execution every single time that routine had to complete, was less efficient in terms of user response than designing the code to run that routine only during free cycles, while using "approximated" data that in practice would never be more out of date than what the original routine produced.
It just didn't compute for him. He had to have the result from a "fresh" run, because that's the way it had always been done. I demoed it and proved that the indirect approach had better execution times on average and kept user response consistent 100% of the time. But he didn't get it, and just said the hardware should be faster and then everything would work, etc.
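The scheme described above can be sketched roughly like this (all names are illustrative, not from any real codebase): requests are served immediately from a cached, slightly stale value, while the expensive routine refreshes the cache in the background during free cycles.

```python
import threading
import time

class CachedValue:
    """Serve an 'approximated' (cached) result instead of blocking on a fresh run."""
    def __init__(self, compute):
        self._compute = compute          # the expensive routine
        self._value = compute()          # seed the cache with one fresh run
        self._lock = threading.Lock()

    def get(self):
        # Never blocks on the expensive routine: returns the latest cached result.
        with self._lock:
            return self._value

    def refresh(self):
        # Run during "free cycles", e.g. from a background thread or idle callback.
        fresh = self._compute()
        with self._lock:
            self._value = fresh

def expensive_routine():
    time.sleep(0.05)                     # stand-in for a slow computation
    return time.monotonic()

cache = CachedValue(expensive_routine)
worker = threading.Thread(target=cache.refresh)   # background refresh
worker.start()
print(cache.get())                       # responds immediately with slightly stale data
worker.join()
```

The blocking version would call `expensive_routine()` on every request; here each `get()` returns instantly, at the cost of data that is at most one refresh interval old.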
That's the kind of thing you see in the industry now. We keep imagining that programmers and game developers are people who sit at the cutting edge and enjoy inventing things on the very latest hardware. But the ones who actually get paid don't do that.
Although I'll still probably buy Doom 4... lol. (Wow, has it really been almost 9 years since Doom 3 was released? WTH?)
Come on, why all this hate for Carmack? Maybe his most recent efforts haven't exactly knocked it out of the park, but are people forgetting that he made PC gaming and graphics what they are today? And I just love the armchair experts on here who act like they know more than him. You guys couldn't bow down low enough to kiss his feet.
-
Seriously, though. Do you think it's a coincidence that conventions are kept? It's not because users want it, but because developers resist change. People know they would have to learn new things extremely fast to keep up, so they don't. Instead, you prototype a solution from something you've already made, and then you convince your customer that they want that product.
Rinse and repeat, and you end up with conventions so strong that they just can't be changed. Anyone who challenges them is typically branded insane, or accused of wanting to destroy the business and cost people their jobs, etc. If I had, say, 15 dollars for every time I heard a variation on the phrase "But the market isn't ready for X, so it doesn't matter how good the solution is", I would never have to work again.
-
The Quake 2 engine, and Quake before it, were extremely well done, though. Making the transition from "2.5D" with slightly rotating sprites to proper geometry-based rendering was also id and their teams. The way they promoted 3D cards early on, and built engine-specific paths to take advantage of them, was great work.
That Carmack is as wrong as a reverse-strung 6-string fiddle about current use of parallelism in hardware, and about where hardware needs to go to break any new ceilings, doesn't change what they did before.
...or that Psygnosis, Remedy and Future Crew did much more important work when it came to using actual 3D contexts. But hey, who cares; it's not like the games industry hands out awards for "most technically interesting graphics context and node-construction graph", is it?
It's like hating on George Lucas because he sold Star Wars to Disney, or because you thought the prequel trilogy sucked. You still can't deny that Star Wars changed not just film but the entire world.
failwheeldrive Notebook Deity
-
Kade Storm The Devil's Advocate
Heh. Since there's talk about this guy now, I'll say that I don't hate Carmack. I just think that he can be wrong, many times over. I also don't think he's an untouchable legend because of something great he did in the '90s. He's already established himself as a bit of a mouthpiece for certain parties over the last decade, and that's been baseless and off-putting. Besides, I don't take any individual's word, however grand and great, as gospel. Argumentum ad verecundiam? No thanks. No man, being, or myth is beyond critical reproach, Carmack included.
-
According to Sony's PS4 specs, the GPU will have 1.84 TFLOPS of computing power (marginally more than a stock 7850, which has about 1.76 TFLOPS). So yes, the GPU is essentially a 7850; however, they don't mention anything about ROPs, texture units, shader cores, etc. I assume those would be more or less the same as the 7850's.
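Those TFLOPS figures fall out of the standard peak-throughput formula: shader count × clock × 2 FLOPs per cycle (one fused multiply-add). The shader counts and clocks below are the commonly reported ones, used here only as a sanity check:

```python
def tflops(shaders, clock_mhz):
    # Peak single-precision throughput: 2 FLOPs (one FMA) per shader per cycle.
    return shaders * clock_mhz * 2 / 1e6

print(tflops(1152, 800))  # PS4 GPU: 18 CUs x 64 shaders at 800 MHz -> ~1.84 TFLOPS
print(tflops(1024, 860))  # Radeon HD 7850: 16 CUs at 860 MHz       -> ~1.76 TFLOPS
```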
The PS4 will also have 8GB of GDDR5 memory, shared with the GPU. In answer to the thread question: YES, I think a 680M would be better than the PS4's GPU. Couple a 4GB GDDR5 680M with a Core i7-3920XM, 32GB of RAM and an SSD, and that should demolish a PS4.
One last thing to remember is the PRICE OF GAMES! Sony can claim no price increases all they want; once PS4 games hit the shelves, AAA titles are going to cost anywhere between $60 and $70 each.
If you have a PC, most games on Steam range between $40 and $60 (max), and Steam usually runs specials in the weeks before a game launches.
Let's not forget the seasonal Steam sales either; you can pick up good games dirt cheap.
A simple calculation:
If you buy 30 games for PS4 at an average of $65 each = 30 x 65 = $1950
and if you buy the same 30 games for PC at an average of $55 each = 30 x 55 = $1650
With the $300 difference, you could buy a GeForce GTX 660 Ti for your PC.
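The calculation above, spelled out (the per-game averages are the poster's estimates, not fixed prices):

```python
PS4_AVG, PC_AVG, GAMES = 65, 55, 30   # assumed average prices and library size

ps4_total = PS4_AVG * GAMES            # 30 x 65 = 1950
pc_total = PC_AVG * GAMES              # 30 x 55 = 1650
savings = ps4_total - pc_total
print(savings)                         # -> 300, roughly a GTX 660 Ti at the time
```

Put differently, at a $10 savings per game, the GPU upgrade pays for itself after about 30 titles.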
-
Even with gaming consoles having the advantage of 'fire up and go' gameplay without a potentially heavy upfront investment, they have their limits just like PCs, but developers have found ways of working around them. One such method is taking advantage of the GPU's hardware scaler. Just because you fire up the PS3/Xbox 360 set to 1080p on your big-screen TV does not mean the game is rendering at that resolution 100% of the time. Gran Turismo 5 is known to shift between 1080p in menus and car galleries and 720p in actual racing. The Xbox's big bad Forza 4 runs at a constant 720p with heavy post-process AA. For those of you who think you are playing your favorite games at 1080p the way a PC can: do the research, and I guarantee that almost all, if not all, of your games are actually running at 720p or below standard HD resolution. The low PPI of a large TV and light anti-aliasing do wonders.
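A toy sketch of that dynamic-resolution trick (all numbers and names illustrative, not from any real engine): if a frame blows the 30 fps budget, render fewer pixels next frame and let the hardware scaler stretch the result to the 1080p output, which is why the TV still reports a 1080p signal.

```python
FRAME_BUDGET_MS = 33.3                 # 30 fps target
RESOLUTIONS = [(1280, 720), (1600, 900), (1920, 1080)]

def adjust_resolution(level, last_frame_ms):
    if last_frame_ms > FRAME_BUDGET_MS and level > 0:
        return level - 1               # over budget: drop a resolution notch
    if last_frame_ms < 0.8 * FRAME_BUDGET_MS and level < len(RESOLUTIONS) - 1:
        return level + 1               # comfortable headroom: restore quality
    return level                       # otherwise hold steady

level = 2                              # start at native 1080p
level = adjust_resolution(level, 41.0) # a heavy frame -> render at 1600x900 next
print(RESOLUTIONS[level])
```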
As far as the OP's question goes, the GTX 680M can certainly compete with the PS4's GPU. In its factory state, its raw performance can match or even slightly exceed the PS4's.
-
So funny to witness people so attached to their computers. You guys try too hard.
A year from now, our 680Ms and Ivy Bridge processors will look like crap next to notebooks with Maxwell, Broadwell and SATA Express SSDs, not to mention the desktops that will be out by then. Heck, just look at the ones out now; they are in a totally different league. We are little boys by comparison.
You guys think your little notebook with a 1080p 60Hz display, a single SSD, a downclocked midrange GPU and a TDP-locked CPU is so good. It's decent hardware, but nothing more.
Our notebooks are OK, but they're still a long way from top-end hardware, so I'm not sure why everyone is so defensive about a notebook that may be beaten by a new console PS4 owners are "stuck with" for 5-8 years. It's just silly watching people get so riled up about it.
It is what it is
I'm gonna build myself a desktop with Titan SLI or a GTX 790 when Haswell is out. It's going to be a silent build, and I'm gonna pair it with a 1600p 120Hz screen. It will eat all of our notebooks for dinner.
I'm still going for a notebook with a GTX 780M and Haswell for LAN parties and traveling.
Kade Storm The Devil's Advocate
I actually agree with the above. I don't think anyone's disputing the benefits of fixed hardware and the optimization that follows; it's the claimed size of the gain. Is he right in saying that it'll be twice as powerful? It's the inflated scale of his claim, along with the certitude, that warrants questions.
Also keep in mind that this two-fold yield wasn't the case last generation, not even close. There was an advantage, but it wasn't the 100% increase Carmack is claiming/predicting for these new consoles. So I guess the logic he employs is sound, but the actual efficiency gain he predicts, twice the performance of equivalent PC hardware, seems categorically hyperbolic.
-
One thing for sure, the PS4 will crush your pathetic little HP notebook. 6770M hahaha, probably just as weak as the PS3 GPU. :hi2:
-
The question in the topic title is moot. They are two different platforms, and each will have exclusive games. Buy both a gaming PC and a PS4, then just choose which cross-platform titles you buy for which system. If you can only afford one, choose wisely based on your gaming style. Pretty simple.
-
hockeymass gets personal and calls it trolling because he needs to defend notebooks. I mean, come on, it's just a toy.
Kade Storm The Devil's Advocate
-
failwheeldrive Notebook Deity
Hey guyz, I swear that I'm actually 28 and that x 32993x isn't my birthday! I'm super sereal guyz! I'm not 19! I'm way moar mature than u babiez!
-
I think we should be glad that there is so much technology available to us as consumers and that we get all these choices. I will always have a mid-to-high-end PC and a console. To me, certain games are meant to be played on consoles with a controller and others are meant for PC. For example, IMHO racing and sports games such as FIFA should be played on a console, whereas something like Diablo or StarCraft belongs only on a PC. Having said that, I think the move to x86 architecture means console exclusives might start making PC appearances... we'll see how it goes, I guess.
-
TheBlackIdentity Notebook Evangelist
-
In the end, consoles are accessible gaming machines with less hassle, but limited options.
I like the PS4, by the way. I think it's very well designed. I'm a hardware enthusiast, and I was happy to see its specs and, more importantly, how it will all translate into a better user experience. I will buy it, and since I have a 150Mb connection, I can make full use of most of its features.
https://www.youtube.com/watch?v=c7PKpGyjqBw
This guy is running on a 7800 GT (the same class of GPU as the PS3's).
https://www.youtube.com/watch?v=bPIuTOm6flc
The same guy running Skyrim at 1280x1024, a bit higher than the PS3's resolution.
https://www.youtube.com/watch?v=abGW1bk1nmM
A bundle of videos of people running a 7800 GT with Fraps recording.
Without Fraps, frame rates should be pretty much the same. To be honest, I'm taking what everyone says with a grain of salt; until I have the console myself, I'll reserve judgment. Businesses and their partners will lie to make the product look good. The PS3 was supposed to have native 1080p for all games and 2 HDMI ports; what we got were sub-HD games at 30 frames. The Cell was supposed to be revolutionary, and instead multiplatform games got inferior ports. I'm not knocking the console, as I've had every PlayStation to date aside from the Vita, but I'm keeping my "I'll believe it when I see it" attitude here.
-
I think the actual equation would be "E = γPS4", because the relativistic energy formula is E = γmc^2. The implication is that the PS4 moves so quickly that γ approaches infinity, so the energy of the PS4 approaches infinity.
Sorry, I had to spoil the joke.
Can a 680M compete with the PS4?
Discussion in 'Gaming (Software and Graphics Cards)' started by fluent, Feb 22, 2013.