Many people have probably heard of SemiAccurate and its author Charlie Demerjian, probably the biggest Nvidia hater there is. His first in-depth story about Kepler can be read here. Long story short: "Nvidia and Kepler suck."
He hated Nvidia GPUs so much that a while back he actually rented an electron microscope to look for defects in the soldering on Nvidia GPUs. Read it here lol
BUT today he wrote a new article claiming he has seen the upcoming GK104 Kepler GPUs (to replace the desktop GTX 560 Ti), and to everyone's shock, he declares Kepler the clear winner. So with this coming from a true Nvidia hater, there must be some truth to it!
Here is the recent article:
Nvidia Kepler vs AMD GCN has a clear winner | SemiAccurate
-
I'm pretty sure this is one of the reasons Apple and many other companies are switching back to nVidia.
-
There are zero details in that article. Nothing. It's all just "nVidia is better." What?
One place AMD will beat nVidia is cost. You take cost/performance and AMD will win 9/10 times. Especially in the mobile market. You need only look at the Llano and HD 6970m/6990m to see that. -
Whoa, calm down a little, HT. You don't know that AMD mobile GPUs will be cheaper than Nvidia's this gen. The desktop 7970's price has gone up compared to the 6970, and mobile GPUs could follow. A lot of AMD fans were not happy about this.
Charlie has tweeted that one of the reasons he declared Kepler the "winner" is what is going to happen on the financial side. Read here. Perhaps there is change in the air?
And I am allowed to post rumours, ty -
I also think Kepler will beat GCN, because AMD just rushed 28nm to be first. In the Nvidia/AMD competition the follower always comes out better; tell me one card that didn't beat the one it followed.
-
We'll see. Although in the mobile segment, nVidia coupled with Intel will be priced much higher than any AMD solution. Intel is just expensive, so nVidia would have to cut their costs by a good 20-25% for an Intel/nVidia combo to be cost-competitive with AMD.
-
Interesting... I wonder if Nvidia wins in power efficiency.
-
i mean seriously, a new-gen card that is not much faster than the previous gen at a $550 price tag. when the 5870 came out it was nearly double the speed of the 4870 and only had a price tag of $450. what happened to the good ol' amd.
back on topic, kepler is almost guaranteed to be faster than the 7970, and anyone who reads about hardware should know this. it might come with a ridiculous $800 price tag though, since amd priced the 7970 at a stupid $550. -
Power efficiency is certainly one of their goals with Kepler. According to the slide, Kepler offers almost 3x better performance/watt compared to Fermi.
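For reference, performance-per-watt is just a performance score divided by power draw, so a "3x" claim like that slide's works out as in this rough sketch (all numbers are hypothetical placeholders, not actual Fermi/Kepler figures):
```python
# Hypothetical numbers only: this just illustrates how a "3x perf/watt"
# figure is derived, not actual Fermi/Kepler measurements.

def perf_per_watt(performance, watts):
    """Performance per watt: higher is better."""
    return performance / watts

fermi = perf_per_watt(performance=100, watts=250)   # assumed baseline score / TDP
kepler = perf_per_watt(performance=180, watts=150)  # assumed successor score / TDP

print(f"Kepler vs Fermi perf/watt: {kepler / fermi:.1f}x")  # 3.0x with these numbers
```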
-
I was never very impressed with Fermi anyway. Expensive, big, and only marginally more performance than the HD 6000 series, which was much cheaper.
I am hoping Nvidia is forgetting about Fermi and starting fresh with Kepler.
So far it looks promising, especially for mobile GPUs. Who knows though, could be Nvidia doing their marketing BS
-
Plus the new architecture needs drivers to really shine, and it overclocks like crazy... Reaching the performance of an HD 6990 is no small feat. -
the top-end fermi wasn't great on dollar/performance. the 460 and 570 were pretty decent till prices rose somehow. my friend bought a 570 a year ago for $260 and they are going for $320 atm....
-
Nvidia has held the performance crown in every generation since the Geforce 6000 series launched in 2004. The problem with Nvidia is that their top of the line cards have also been far more expensive than ATI/AMD in every generation since the Geforce 8000 series launched in 2007.
I don't see either of these extremely longstanding trends changing with GCN vs Kepler. If AMD's asking for $550 for the 7970, Nvidia will simply launch Kepler at $600+. -
and now amd manages to screw things up even more with their 7970 pricing at $550. its performance increase is not even close to previous "real" generation upgrades, yet it's priced higher than previous flagship cards. i've bought every single-GPU flagship card up to the 5870 series and probably will never buy one again, since i game much less now and the price/performance is ridiculous even for midrange cards. -
masterchef341 The guy from The Notebook
I would say neither company holds an all-encompassing performance crown, especially not since the GeForce 6 / ATI x800 series.
-
i got into hardware in the x800 and 6xxx age with my best friend. he got the x800xtpe while i got the 6800gt. there most definitely have been performance crowns since that age, most notably the 7800gtx and the 5870.
-
On the desktop front, there's no reason to buy anything better than midrange nowadays unless you have a 2560x1600 monitor - the crappy console ports that try to pass for PC games these days simply can't make use of it. -
the x800xtpe was pretty widely available. when we could get them in our tiny lil city of ottawa, it means it was pretty widely available.
the 4870 came earlier and was cheaper than the gtx 260 216-core.... and the gtx 480 was released long after the 5870. i took over my friend's crossfire 5870s when the gtx 480s came out....
the gtx 570 was for sale at $260 a year ago and atm it's $320...... i guess supply and demand made it that way.
i didn't play crysis until i got a 5870 because all previous-gen cards couldn't even sustain 50+ fps on decent settings. and atm buying a high-end gpu seems very stupid due to the crazy prices. $550 should buy a top-end gpu that isn't only 30% faster than the previous gen. -
masterchef341 The guy from The Notebook
-
From the HD 2000 series onwards, the Radeon cards have not held the crown in single GPU, only in dual-GPU cards. -
AlwaysSearching Notebook Evangelist
AMD wins on price/performance. It doesn't matter that they have increased their price on the new cards; I for one will bet it will still be a more attractive option than the comparable nVidia card.
If I can save 10%+ ($150+ on a $1500 laptop) for almost the same gpu performance, then I will. If the savings is closer to 5%, then nVidia will pick up sales. -
masterchef341 The guy from The Notebook
sort of depends on how close the performance is, as well. maybe you'd accept a 5% difference for a 10% cost savings, but maybe 10 or 15% would be too much. other people have other thresholds. hence, the market.
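to make that threshold idea concrete, here's a toy sketch (hypothetical numbers and a made-up threshold ratio, not market data or a real buying rule):
```python
# Toy model of the buying threshold described above. All numbers are
# hypothetical; the threshold ratio is a made-up knob, not market data.

def accept_cheaper_card(price_saving_pct, perf_deficit_pct, deficit_per_saving=0.5):
    """Accept the cheaper card if its performance deficit is small enough
    relative to the money saved; each buyer sets their own ratio."""
    return perf_deficit_pct <= price_saving_pct * deficit_per_saving

# A buyer tolerating a 5% slower GPU for a 10% cheaper laptop:
print(accept_cheaper_card(price_saving_pct=10, perf_deficit_pct=5))   # True
# ...but not one that is 15% slower for the same saving:
print(accept_cheaper_card(price_saving_pct=10, perf_deficit_pct=15))  # False
```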
-
wake me when ATI has proper 3D support.
-
Perhaps this is what Charlie had in mind: the GTX 660 will be equal to or faster than the 7950/7970. Gotta love rumours/hype
-
AlwaysSearching Notebook Evangelist
Well, one thing is for sure: AMD is already selling their next gen.
We are still guessing what nvidia's will really be and, more importantly, when.
I wouldn't be surprised if 4-6 months from now we are still wondering. -
^agreed
surround gaming... now THAT is where it's at (provided the game actually uses the monitors properly).
Nvidia will have their own triple-header system (their version of Eyefinity)? on mobile as well? -
@Generic User #2: it's called 3D Vision Surround... -
-
It doesn't work for those with astigmatism, those with variability in vision between the two eyes, those with vision in only one eye, a lazy eye, crossed eyes, etc., and most 3D glasses don't work well for people who wear corrective lenses. It tricks your vision into seeing what it normally doesn't, and it can give people headaches and nausea.
Certain movies, like Avatar, that were designed exclusively with 3D in mind, using special camera equipment developed "to get it right", have a place, but 3D should not be and is not the norm. For other movies that just add 3D, it's usually non-value-added IMHO. -
-
Also (admittedly not an objective testimony), the majority of people who shared their 3D experiences with me did complain about headaches among other things. As for me, I wear glasses and I was born with an eye condition that makes it hard for me to distinguish depth; most of the time I can't tell the difference between 3D on and off. It's nice that we are making progress in this field, but it remains a gimmick for me and for many people. -
A lot of this Kepler hype just sounds like Nvidia trying to stall, because they're so late to the party. I'd be trying to keep people from buying a 7900-series AMD card too.
-
masterchef341 The guy from The Notebook
sarge: the vast majority of us aren't buying into the 3D screens just yet, and for good reason: it's gimmicky and doesn't work very well. i don't need to have my eyes checked; pop-out 3D in video games is just pretty pathetic overall. It's also cheesy in film. Moreover, it strains the eyes, can cause headaches, and degrades image quality.
-
I doubt that scene will change anytime soon. I wouldn't be surprised if the GTX 660 and the HD 7950 have about the same performance. -
I have a 3D monitor, and while playing games in 3D is cool and all, it is nothing more than a cool gimmick. The novelty kinda fades after a while.
-
I'm sorry that 3D doesn't work for you and your medical conditions, but it works fine for me. I experience neither headaches nor nausea nor increased eye strain even after many hours of 3D gaming. The effect is immersive and looks great. I actually prefer lowering some graphics details (if needed) and playing in 3D to playing in 2D with better graphics.
Calling it a gimmick is easy, but can anyone actually back up that claim?
I have noticed a tendency for the 3D sceptics to say that they personally can't enjoy 3D because of headaches/nausea/some medical condition. Well, guess what: that's not the case for everyone. It is not the norm. -
That's fine, but it doesn't work for everyone.
It is a gimmick, plain and simple; call it a luxury if you wish. 3D, virtual surround, and other such devices have come and gone over the years without ever becoming mainstream. There is nothing value-added. Until they can offer 3D without glasses and without requiring an extremely narrow field of view, it will not become mainstream. Red/green and red/cyan glasses have been around for decades yet never really were more than "fun". Phase-shift 3D and even shutter 3D have been around for decades too.
The only reason they're pushing it? Money. It's harder to attract people to movies or to buying new home theater equipment, so they resort to gimmicks. And TV or movie viewing that requires peripherals will not catch on at home either. It's hard enough keeping track of your remote controls, let alone glasses that you will NEED to watch 3D. I remember going to Las Vegas ten years ago and they had virtual boxing, virtual everything, and it was pretty good technology. I thought we would have had something like that at home by now, but nope, it doesn't exist, for a reason. It's a niche, a luxury, or a gimmick, however you want to call it.
I like 3D. It's just that this push for it is quite annoying. I loved Avatar and saw it in 3D and 2D, and I have to say I wouldn't watch Avatar any other way but 3D. But for other movies, even games, it's ok, but the effect wears off quickly. I have to take off my glasses every 20 minutes for a good 30-60 seconds if I watch a 3D movie to avoid headaches and nausea. It's more common than you think. The nausea and headaches mostly come from motion sickness, and that affects 33% of people. I've known lots of people who enjoy 3D movies but say they have to remove the glasses periodically as well, or close their eyes.
http://www.pcworld.com/article/247739/why_3d_tv_isnt_cool_at_ces_this_year.html -
Megacharge Custom User Title
-
I am with the rest of the guys here though: take this with a big grain of salt. I'll believe it when I see it. -
ratchetnclank Notebook Deity
I can't see 3D personally, except the very prominent parts, so I couldn't say whether it adds to immersion, but in my eyes it's a waste. -
Always been an Nvidia fan here
-
I've got two 1GB cards in SLI and they outperform a GTX 580; great little cards. -
masterchef341 The guy from The Notebook
just to confirm: the gtx 560 ti matches up with the AMD 6950. The gtx 570 matches up with the AMD 6970.
-
Supposedly, driver updates from AMD over the course of 2011 have made the 6950 significantly faster while GTX 560 Ti performance has stayed more or less constant, so the 6950 is now the faster card. -
masterchef341 The guy from The Notebook
it's certainly possible. I don't know. the 6950 is still not as fast as the gtx 570.
-
Well GK104 is rumored to be priced at $300. Could be good news for mobile GPU prices if this is true.
-
What? That doesn't sound right. A $300 GPU outperforming a $600 GPU???
-
AlwaysSearching Notebook Evangelist
That would be around the right price if the GK104 is more of a mid-level card.