The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Nvidia Kepler vs AMD GCN (winner announced)

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Jan 19, 2012.

  1. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
Many people have probably heard about SemiAccurate and its author Charlie Demerjian, probably the biggest Nvidia hater there is. His first in-depth story about Kepler can be read here. Long story short: "Nvidia and Kepler suck."
He hated Nvidia GPUs so much that a while back he actually rented an electron microscope to find defects in the soldering on Nvidia GPUs. Read it here lol

BUT today he wrote a new article claiming he has seen the upcoming GK104 Kepler GPUs (to replace the desktop GTX 560 Ti), and to everyone's shock, he announces Kepler as the clear winner. Coming from a true Nvidia hater, there must be some truth to it :D

    Here is the recent article:
    Nvidia Kepler vs AMD GCN has a clear winner | SemiAccurate
     
  2. crpcookie

    crpcookie Notebook Geek

    Reputations:
    0
    Messages:
    98
    Likes Received:
    0
    Trophy Points:
    15
    I'm pretty sure this is one of the reasons Apple and many other companies are switching back to nVidia.
     
  3. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
There are zero details in that article. Nothing. It's all just "nVidia is better". What?

One place AMD will beat nVidia is cost. Take cost/performance and AMD will win 9 times out of 10, especially in the mobile market. You need only look at Llano and the HD 6970M/6990M to see that.
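The cost/performance argument above boils down to a simple ratio. A minimal sketch, using made-up placeholder prices and frame rates (not real benchmark numbers for any of the cards mentioned):

```python
# Toy price/performance comparison. All prices and fps figures below are
# placeholders for illustration, not actual benchmarks or street prices.
cards = {
    "Card A (cheaper)": {"price_usd": 400, "avg_fps": 55},
    "Card B (pricier)": {"price_usd": 500, "avg_fps": 58},
}

for name, c in cards.items():
    # Higher fps-per-dollar means better value for the money.
    print(f"{name}: {c['avg_fps'] / c['price_usd']:.4f} fps per dollar")
```

With these invented numbers, the cheaper card wins on value (0.1375 vs 0.1160 fps per dollar) even though the pricier one is faster in absolute terms, which is exactly the shape of the claim being made.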
     
  4. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
Whoa, calm down a little HT. You don't know that AMD mobile GPUs will be cheaper than Nvidia's this gen. The desktop 7970's price has gone up compared to the 6970's, and mobile GPUs could follow. A lot of AMD fans were not happy about this.

Charlie has tweeted that one of the reasons he announced Kepler as the "winner" is what is going to happen on the financial side. Read here. Perhaps there is change in the air?

And I am allowed to post rumours, ty
     
  5. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
I also think Kepler will beat GCN, because AMD just rushed 28nm to be first. The follower is always better off in the Nvidia/AMD competition; name one card that didn't beat its predecessor.
     
  6. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    We'll see. Although on the mobile segment, nVidia coupled with Intel will price it much higher than any AMD solution. Intel is just expensive, so nVidia would have to cut their costs by a good 20-25% for an Intel/nVidia combo to be cost competitive with AMD.
     
  7. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    Interesting... I wonder if Nvidia wins in power efficiency.
     
  8. aintz

    aintz Notebook Evangelist

    Reputations:
    34
    Messages:
    588
    Likes Received:
    3
    Trophy Points:
    31
Except for the fact that with the new 7xxx series AMD is raising the cost/performance ratio, LOL.

I mean, seriously: a new-gen card that is not much faster than the previous gen, at a $550 price tag. When the 5870 came out it was nearly double the speed of the 4870 and only had a price tag of $450. What happened to the good ol' AMD?


Back on topic: Kepler is almost guaranteed to be faster than the 7970, and anyone who reads about hardware should know this. It might come with a ridiculous $800 price tag though, since AMD priced the 7970 at a stupid $550.
     
  9. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
Power efficiency is certainly one of their goals with Kepler. According to the slides, Kepler offers almost 3x better performance/watt compared to Fermi.

[Attached Nvidia slides - images not preserved in the archive]
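The performance/watt claim is just a ratio of throughput to power draw. A quick illustration with invented numbers chosen to land near the quoted ~3x (these are not measurements from the slides):

```python
# Hypothetical perf-per-watt comparison; scores and wattages are invented.
fermi  = {"score": 1000, "watts": 250}
kepler = {"score": 1700, "watts": 140}

fermi_ppw  = fermi["score"] / fermi["watts"]    # 4.0 points per watt
kepler_ppw = kepler["score"] / kepler["watts"]  # ~12.14 points per watt

print(f"Kepler is {kepler_ppw / fermi_ppw:.1f}x Fermi in perf/watt")
```

Note that the gain can come from either side of the fraction: a higher score, a lower power draw, or (as in this toy example) both at once.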
     
  10. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
I was never very impressed with Fermi anyway. Expensive, big, and only marginally faster than the HD 6000 series, which was much cheaper.

    I am hoping Nvidia is forgetting about Fermi and starting fresh with Kepler.
    So far it looks promising especially for mobile GPUs. Who knows though, could be Nvidia doing their marketing BS :p

    http://www.fudzilla.com/games/item/25508-kepler-has-more-design-wins-than-fermi
     
  11. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
The performance is certainly there. Depending on the game it even smokes the GTX 580. The problem is that since there is no competition, they are pricing it according to today's solutions: since it is clearly superior to the GTX 580, it costs a bit more.

Plus the new architecture needs drivers to really shine, and it overclocks like crazy... Reaching the performance of an HD 6990 is no small feat.
     
  12. aintz

    aintz Notebook Evangelist

    Reputations:
    34
    Messages:
    588
    Likes Received:
    3
    Trophy Points:
    31
The top-end Fermi wasn't great on dollar/performance. The 460 and 570 were pretty decent till prices rose somehow. My friend bought a 570 a year ago for $260 and they are going for $320 atm...
     
  13. Peon

    Peon Notebook Virtuoso

    Reputations:
    406
    Messages:
    2,007
    Likes Received:
    128
    Trophy Points:
    81
    Nvidia has held the performance crown in every generation since the Geforce 6000 series launched in 2004. The problem with Nvidia is that their top of the line cards have also been far more expensive than ATI/AMD in every generation since the Geforce 8000 series launched in 2007.

    I don't see either of these extremely longstanding trends changing with GCN vs Kepler. If AMD's asking for $550 for the 7970, Nvidia will simply launch Kepler at $600+.
     
  14. aintz

    aintz Notebook Evangelist

    Reputations:
    34
    Messages:
    588
    Likes Received:
    3
    Trophy Points:
    31
Not really, they've been moving back and forth. The X800 XT PE ($650) beat the 6800 Ultra ($620); the 7800 GTX ($499) killed, and so did the 8800 GTX ($499); the HD 4870 ($449) and 5870 ($449) dominated. The GTX 480 ($499) was faster than the 5870, but it came much later and had huge power draw and heat. Then came the filler generation, the 6xxx and 5xx; initially the prices were reasonable, but then prices actually went up.

And now AMD manages to screw things up even more with their 7970 pricing at $550. Its performance increase is not even close to previous "real" generation upgrades, yet it's priced higher than previous flagship cards. I've bought every single-GPU flagship card up to the 5870 and probably will never buy one again, since I game much less now and the price/performance is ridiculous even for mid-range cards.
     
  15. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    I would say neither company holds an all-encompassing performance crown, especially not since the GeForce 6 / ATI x800 series.
     
  16. aintz

    aintz Notebook Evangelist

    Reputations:
    34
    Messages:
    588
    Likes Received:
    3
    Trophy Points:
    31
I got into hardware in the X800 and 6xxx age with my best friend. He got the X800 XT PE while I got the 6800 GT. There most definitely have been performance crowns since that age, most notably the 7800 GTX and 5870.
     
  17. Peon

    Peon Notebook Virtuoso

    Reputations:
    406
    Messages:
    2,007
    Likes Received:
    128
    Trophy Points:
    81
    There was a 6800 Ultra Extreme. Of course, much like the X800XT PE, it was more or less vaporware, being virtually impossible to find for sale.

    IIRC, the HD 4870 was only on par with a GTX 260 core 216... It was certainly nowhere near the level of performance of a GTX 280 or GTX 285. The rest I agree with, though I have to say the 5870's performance crown was merely temporary, and from this thread it sounds like GCN vs Kepler will be a repeat of that scenario.

The GTX 580 was always expensive. Then again, neither AMD's nor Nvidia's pricing reflected the fact that what they were selling was basically year-old technology, so everything above the 6950 1 GB/GTX 560 Ti level felt overpriced.

    I haven't bought a flagship card ever since the 8800 GTX - that was around when Crysis came out, and Crysis was the last game that managed to bring the fastest available PC hardware to its knees.

On the desktop front, there's no reason to buy anything better than midrange nowadays unless you have a 2560x1600 monitor - the crappy console ports that try to pass for PC games these days simply can't make use of it.
     
  18. aintz

    aintz Notebook Evangelist

    Reputations:
    34
    Messages:
    588
    Likes Received:
    3
    Trophy Points:
    31
The X800 XT PE was pretty widely available. When we could get them in our ty lil city of Ottawa, that means it was pretty widely available.

The 4870 came earlier and was cheaper than the GTX 260 216-core... and the GTX 480 was released long after the 5870. I took over my friend's CrossFire 5870s when the GTX 480s came out...

The GTX 570 was for sale at $260 a year ago and atm it's $320... I guess supply and demand made it that way.

I didn't play Crysis until I got a 5870, because all previous-gen cards couldn't even sustain 50+ fps on decent settings. And atm buying a high-end GPU seems very stupid due to the crazy prices. $550 should buy a top-end GPU that isn't only 30% faster than the previous gen.
     
  19. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    that's my point. neither company held that crown the entire time.
     
  20. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131
Lol? The X800 XT PE vs. the 6800 was even, except the 6800s had SM 3.0. And the X1900 XT demolished the 7900 GTX. Nvidia has held the crown consecutively since the 8800 series.
     
  21. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
This is correct, the X1900 XT was the best deal back then. The problem was that it launched much later than the GeForce 7800 series, so they were still going back and forth.

From the HD 2000 series onwards, the Radeon cards have not had the crown in single GPUs, only in dual-GPU cards.
     
  22. AlwaysSearching

    AlwaysSearching Notebook Evangelist

    Reputations:
    164
    Messages:
    334
    Likes Received:
    15
    Trophy Points:
    31
AMD wins on price/performance.

It doesn't matter that they have increased their prices on the new cards; I for one will bet it will still be a more attractive option than the comparable nVidia card.

If I can save 10%+ ($150+ on a $1,500 laptop) for almost the same GPU performance, then I will. If the savings is closer to 5%, then nVidia will pick up sales.
     
  23. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    sort of depends on how close the performance is, as well. maybe you'd accept a 5% difference for a 10% cost savings, but maybe 10 or 15% would be too much. other people have other thresholds. hence, the market.
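The per-buyer threshold idea above can be sketched as a tiny decision rule; the function name and all the numbers are made up for illustration:

```python
def prefer_cheaper(perf_gap_pct: float, savings_pct: float,
                   gap_tolerance_pct: float) -> bool:
    """Pick the cheaper card only when there is a real saving and the
    performance gap stays within this buyer's personal tolerance."""
    return savings_pct > 0 and perf_gap_pct <= gap_tolerance_pct

# Two buyers, same pair of cards (5% slower for 10% cheaper),
# but with different tolerances - hence, the market:
print(prefer_cheaper(5, 10, gap_tolerance_pct=5))  # tolerant buyer -> True
print(prefer_cheaper(5, 10, gap_tolerance_pct=3))  # demanding buyer -> False
```

The point is that neither buyer is wrong; the same price/performance trade lands on different sides of different people's thresholds.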
     
  24. sarge_

    sarge_ Notebook Deity

    Reputations:
    288
    Messages:
    896
    Likes Received:
    1
    Trophy Points:
    31
    wake me when ATI has proper 3D support.
     
  25. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Perhaps this was what Charlie had in mind. GTX 660 will be equal or faster than 7950/7970. Gotta love rumours/hype :)

    Mid-Range NVIDIA Kepler to Challenge AMD 7900 High-End Cards - Softpedia

    http://www.legitreviews.com/news/12324/
     
  26. AlwaysSearching

    AlwaysSearching Notebook Evangelist

    Reputations:
    164
    Messages:
    334
    Likes Received:
    15
    Trophy Points:
    31
Well, one thing is for sure: AMD is already selling their next gen.

We are still guessing what Nvidia's will really be and, more importantly, when.

I wouldn't be surprised if 4-6 months from now we are still wondering.
     
  27. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Wake me when 3D is more than a gimmick that doesn't make me vomit after 20 minutes.
     
  28. Generic User #2

    Generic User #2 Notebook Deity

    Reputations:
    179
    Messages:
    846
    Likes Received:
    0
    Trophy Points:
    30
^agreed

Surround gaming... now THAT is where it's at (provided the game actually uses the monitors properly).

Will Nvidia have their own triple-display system (their version of Eyefinity)? On mobile as well?
     
  29. sarge_

    sarge_ Notebook Deity

    Reputations:
    288
    Messages:
    896
    Likes Received:
    1
    Trophy Points:
    31
    Wake up and get your eyes checked. Works and looks great for the majority.


@Generic User #2: it's called 3D Vision Surround...
     
  30. rschauby

    rschauby Superfluously Redundant

    Reputations:
    865
    Messages:
    1,560
    Likes Received:
    0
    Trophy Points:
    55
    Not even close. There is a reason 3D is not catching on.
     
  31. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
It doesn't work for those with astigmatism or variability in vision between the two eyes; most 3D glasses don't work well for those who wear corrective lenses; and it doesn't work for those with vision in only one eye, lazy eye, crossed eyes, etc. It's tricking your vision into seeing what it normally doesn't, and it can give people headaches and nausea.

Certain movies, like Avatar, that were designed exclusively with 3D in mind - where they developed special camera equipment "to get it right" - have a place, but they should not be and are not the norm. For other movies that just add 3D, it's usually non-value-added IMHO.
     
  32. Peon

    Peon Notebook Virtuoso

    Reputations:
    406
    Messages:
    2,007
    Likes Received:
    128
    Trophy Points:
    81
    Agreed. I don't think 3D will really catch on until hologram projectors replace screens.
     
  33. Botsu

    Botsu Notebook Evangelist

    Reputations:
    105
    Messages:
    624
    Likes Received:
    0
    Trophy Points:
    30
Yeah right. Go tell that to Nintendo. Oh well.
Also (admittedly not an objective testimony), the majority of people who shared their experience with 3D with me did complain about headaches, among other things. As for me, I wear glasses and I was born with an eye condition that makes it hard to perceive depth; most of the time I can't tell the difference between 3D on/off. It's nice that we are making progress in this field, but it remains a gimmick for me and for many people.
     
  34. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
A lot of this Kepler hype just sounds like Nvidia trying to stall, because they're so late to the party. I'd be trying to keep people from buying a 7900-series AMD card too.
     
  35. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
sarge: the vast majority of us aren't buying into 3D screens just yet, and for good reason: it's gimmicky and doesn't work very well. I don't need to have my eyes checked; pop-out 3D in video games is just pretty pathetic overall. It's also cheesy in film. Moreover, it strains the eyes, can cause headaches, and degrades image quality.
     
  36. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
Uhhh, that's exactly how things have been for a while, even in the current generation. The GTX 560 Ti is on par with the HD 6950, and the HD 6970 trades blows with the GTX 570.

I doubt that will change anytime soon. I wouldn't be surprised if the GTX 660 and the HD 7950 have about the same performance.
     
  37. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
I have a 3D monitor, and while playing games in 3D is cool and all, it is nothing more than a cool gimmick. The novelty kinda fades after a while.
     
  38. sarge_

    sarge_ Notebook Deity

    Reputations:
    288
    Messages:
    896
    Likes Received:
    1
    Trophy Points:
    31
I'm sorry that 3D doesn't work for you and your medical conditions, but it works fine for me. I experience neither headaches nor nausea nor increased eye strain, even after many hours of 3D gaming. The effect is immersive and looks great. I actually prefer lowering some graphics details (if needed) and playing in 3D to playing in 2D with better graphics.

That's no different than saying "DirectX 11 is just a gimmick".
Calling it a gimmick is easy, but can anyone actually back up their claim?

I have noticed a tendency for 3D sceptics to say that they personally can't enjoy 3D because of headaches/nausea/some medical condition. Well, guess what, that's not the case for everyone. It is not the norm.
     
  39. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    That's fine, but doesn't work for everyone.

It is a gimmick, plain and simple; call it a luxury if you wish. 3D, virtual surround, and other such technologies have come and gone over the years, never surfacing as mainstream. There is no added value. It won't become mainstream until they can offer 3D without glasses and without requiring an extremely narrow viewing angle. Red/green or red/cyan glasses have been around for decades yet never really were more than "fun". Phase-shift 3D and even shutter 3D have been around for decades too.

The only reason they're pushing it? Money. It's harder to attract users to movies or to buying new home theater equipment, so they resort to gimmicks. And TV or movie viewing that requires peripherals will not catch on at home either: it's hard enough keeping track of your remote controls, let alone glasses that you NEED to watch 3D. I remember going to Las Vegas ten years ago and they had virtual boxing, virtual everything, and it was pretty good technology. I thought we would have had something like that at home by now, but nope, it doesn't exist, for a reason. It's a niche, a luxury, or a gimmick, whatever you want to call it.

I like 3D. It's just that this push for it is quite annoying. I loved Avatar, saw it in 3D and 2D, and have to say I wouldn't watch Avatar any other way but 3D. But for other movies, even games, it's OK, but the effect wears off quickly. If I watch a 3D movie I have to take off my glasses every 20 minutes for a good 30-60 seconds to avoid headaches and nausea. It's more common than you think: the nausea and headaches mostly come from motion sickness, and that affects 33% of people. I've known lots of people who enjoy 3D movies but say they have to remove the glasses periodically as well, or close their eyes.

    http://www.pcworld.com/article/247739/why_3d_tv_isnt_cool_at_ces_this_year.html
     
  40. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
Agreed on the 3D front. All I want is 120 Hz; I personally don't give a crap about 3D on my monitor. The 120 Hz is the real "star" of 3D screens IMO, and the only reason I will get one.
     
  41. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
But they say the 560 Ti is faster than the 6970. That has never been the case, has it?

I am with the rest of the guys here though: take this with a big grain of salt. Believe it when you see it. :)
     
  42. ratchetnclank

    ratchetnclank Notebook Deity

    Reputations:
    1,084
    Messages:
    1,506
    Likes Received:
    900
    Trophy Points:
    131
DirectX 11 is a set of APIs. How can that be a gimmick?

I can't see 3D personally, except for the very prominent parts, so I couldn't say whether it adds to immersion, but in my eyes it's a waste.
     
  43. junglebungle

    junglebungle Notebook Evangelist

    Reputations:
    263
    Messages:
    499
    Likes Received:
    0
    Trophy Points:
    30
    Always been an Nvidia fan here :)
     
  44. junglebungle

    junglebungle Notebook Evangelist

    Reputations:
    263
    Messages:
    499
    Likes Received:
    0
    Trophy Points:
    30
That's never been the case; it's just about on par with a 6950, a 2GB version at that.

I've got two 1GB cards in SLI and they outperform a GTX 580. Great little cards.
     
  45. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
Just to confirm: the GTX 560 Ti matches up with the AMD 6950, and the GTX 570 with the AMD 6970.
     
  46. Peon

    Peon Notebook Virtuoso

    Reputations:
    406
    Messages:
    2,007
    Likes Received:
    128
    Trophy Points:
    81
    More or less.

    Supposedly, driver updates from AMD over the course of 2011 have made the 6950 significantly faster while GTX 560 Ti performance has stayed more or less constant, so the 6950 is now the faster card.
     
  47. masterchef341

    masterchef341 The guy from The Notebook

    Reputations:
    3,047
    Messages:
    8,636
    Likes Received:
    4
    Trophy Points:
    206
    it's certainly possible. I don't know. the 6950 is still not as fast as the gtx 570.
     
  48. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    Well GK104 is rumored to be priced at $300. Could be good news for mobile GPU prices if this is true.
     
  49. Alias

    Alias Notebook Deity

    Reputations:
    78
    Messages:
    714
    Likes Received:
    143
    Trophy Points:
    56
What? That doesn't sound right. A $300 GPU outperforming a $600 GPU???
     
  50. AlwaysSearching

    AlwaysSearching Notebook Evangelist

    Reputations:
    164
    Messages:
    334
    Likes Received:
    15
    Trophy Points:
    31
That would be around the right price if the GK104 is more of a mid-level card.
     