The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia announces GTX580: +20% over GTX480

    Discussion in 'Gaming (Software and Graphics Cards)' started by Lozz, Oct 21, 2010.

  1. Lozz

    Lozz Top Overpriced Dell

    Reputations:
    536
    Messages:
    2,087
    Likes Received:
    22
    Trophy Points:
    56
    Source: Digitimes

    This is the first time I've heard of the GTX500 series; looks like we're set for a good ole fashioned graphics card hoedown.
     
  2. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    There were some reports of the GTX 580 a few days ago, but we're still awaiting confirmation on the specs. Both the AMD 6xxx series and the nVidia 5xx series are just incremental steps up from the current generation.
     
  3. sean473

    sean473 Notebook Prophet

    Reputations:
    613
    Messages:
    6,705
    Likes Received:
    0
    Trophy Points:
    0
    Do I see cards with 1000W power draw under load? I can't imagine how much more power these cards will consume... and not to mention how hot they'll get... maybe 100C++ under load?
     
  4. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,194
    Likes Received:
    17,902
    Trophy Points:
    931
    No, I think now that Fermi is king of the compute market they are taking the same approach as the GTX460 and scaling it up.
     
  5. Lozz

    Lozz Top Overpriced Dell

    Reputations:
    536
    Messages:
    2,087
    Likes Received:
    22
    Trophy Points:
    56
    I would bet money it's GF104 based, which means 25% less power consumption, and it would also be cheaper per chip.
    http://www.electronista.com/articles/10/06/18/gf104.could.replace.most.nvidia.hardware/

    When it goes to market, I would expect the same power consumption worst case, but with any luck 10-15% better at least.
     
  6. sean473

    sean473 Notebook Prophet

    Reputations:
    613
    Messages:
    6,705
    Likes Received:
    0
    Trophy Points:
    0
    Really? I doubt it... like 70C idle and 94C under load for the GTX480...

    Look at this... sums up my opinion of Fermi.

    YouTube - Hitler reacts to Nvidia Fermi Benchmarks
     
  7. wikoogle

    wikoogle Notebook Consultant

    Reputations:
    56
    Messages:
    119
    Likes Received:
    0
    Trophy Points:
    30
    What the hell is up with all these graphics fanboys?

    Support whoever actually innovates in the industry and pushes it forward. It doesn't matter what field it is. Innovation is a good thing, and one we should support.

    So in short, I don't give a crap about either the ATI 6XXX series or the GTX580. Both will be on par with each other in performance and price, and both are minute incremental upgrades over the current graphics cards that don't warrant upgrading. What excites me is whatever the hell Maxwell and Kepler are, and whatever ATI is doing to compete with them. Whichever of these actually pushes the industry forward is what I'm interested in.
     
  8. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Less power and cheaper per chip than GF100, but still higher power consumption and more expensive per chip compared to AMD's offering.

    Per watt, the new HD6800s outperform the GF104-based GTX 460 by up to 15%, while using only about 5% more silicon than GF106 to produce.
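
    (For clarity, perf/watt here means something like average benchmark frame rate divided by measured board power; with illustrative numbers only, a card doing 60 FPS at 150W gives 60 / 150 = 0.40 FPS per watt, while 60 FPS at 170W gives about 0.35, roughly a 13% edge for the first card.)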
     
  9. Lozz

    Lozz Top Overpriced Dell

    Reputations:
    536
    Messages:
    2,087
    Likes Received:
    22
    Trophy Points:
    56
    I'm not sure what you're suggesting; GF104 is cooler and uses less power. What does the current GF100 design have to do with the GTX580 if it's GF104 based?
    No thanks.
    If all they did was copy and paste GF104 into a 512 shader design, it would probably be about as bad as GF100 is on power/heat, since they'd be adding a ton of shaders and a lot of frequency. I can't imagine that's *all* they would do given how much bad press they've taken over those disadvantages, however.

    http://www.bit-tech.net/hardware/graphics/2010/10/22/ati-radeon-hd-6870-review/9

    I see up to a 9% difference, and people who complain about power consumption with their 1KW PSUs make me laugh.
     
  10. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Today was the hard launch of the desktop cards, but some shops jumped the gun and have had them available for pre-order since early this week.


    Are you going to make a 4th edit? (I guess you were)

    That's total power consumption not perf/watt.

    http://www.techpowerup.com/reviews/HIS/Radeon_HD_6850/29.html
     
  11. Lozz

    Lozz Top Overpriced Dell

    Reputations:
    536
    Messages:
    2,087
    Likes Received:
    22
    Trophy Points:
    56
    Just noticed; shows how much I keep up with things.
     
  12. wikoogle

    wikoogle Notebook Consultant

    Reputations:
    56
    Messages:
    119
    Likes Received:
    0
    Trophy Points:
    30
    For a long time now, I've preferred ATI cards.

    But I'm growing fond of Nvidia precisely because of Fermi.

    Say what you want, but at least Fermi is innovative. Expensive, yes; buggy, yes; risky, yes. Every single significant innovation in history was expensive, buggy, and risky at first. That's a fact of life.

    But we should encourage innovation, because it pushes technology forward. Kepler and Maxwell also seem like they will be very innovative.

    Whereas I haven't seen much innovation coming from ATI as of late.

    I'm not looking to buy a card anytime soon, so I can wait for the bugs to be worked out; I can wait for Kepler. But I hope that ATI has some awesome innovations of their own that they're planning to implement in their next-gen cards. And I'm fairly confident that they do and are just being secretive about it; otherwise, they will lose out in the long term.
     
  13. Ruckus

    Ruckus Notebook Deity

    Reputations:
    363
    Messages:
    832
    Likes Received:
    1
    Trophy Points:
    0
    There is absolutely nothing innovative about FERMI that AMD doesn't already have. They are not being secretive, and if you want to know more about their awesome innovations... try reading what AMD Stream, now renamed Accelerated Parallel Processing, can do. AMD staff are among the most visible online, with everything from new videos constantly posted on YouTube, to use of Twitter, to constantly communicating with the tech sites. You just gotta read up if you're interested.

    With Catalyst 10.10, AMD has already begun releasing drivers packaged with OpenCL.

    AMD APP will be used in their Fusion processors, with a Phenom II + HD5xxx in one. This will allow a netbook with a combined CPU/GPU TDP of 25 watts to have all the GPGPU support you would want in a netbook, even playing DX11 games with tessellation on.

    AMD APP with an HD5970 was proven to be faster at some calculations than even a $10,000 Tesla card. Password recovery involves massive amounts of calculations.

    AMD is just as innovative as Nvidia. Just because one company doesn't market their junk by forcing every reviewer to regurgitate their marketing blurb continuously doesn't make them more innovative.

    An AMD FirePro is 80% of a FERMI Quadro and actually faster at real-time rendering: $1,500 vs $5,000. Two FirePros, still $2,000 less than one Quadro, blow it away.

    Two FirePros for less than one FERMI Quadro = AWESOME 12-DISPLAY EYEFINITY setup!

    And how can you forget that with FERMI, Nvidia just runs around screaming that tessellation is awesome? AMD has had tessellation units since the HD3xxx cards, and they first implemented it in the Xbox 360 nearly half a decade ago...
     
  14. Abula

    Abula Puro Chapin

    Reputations:
    1,115
    Messages:
    3,252
    Likes Received:
    13
    Trophy Points:
    106
    Do you own one, or have you seen it in action? Or are you just repeating what you read? In my case, my GTX480 idles as low as my i7 920 (both air cooled), and gaming in SC2 for more than 4 hours it doesn't even break 65C. In FurMark it does reach 85C, but no game has ever passed 75C as far as I remember. I'm not trying to make Fermi look great, as I still think they could have done better, but it's not as bad as half the people on the net make it out to be.

     
  15. FXi

    FXi Notebook Deity

    Reputations:
    345
    Messages:
    1,054
    Likes Received:
    130
    Trophy Points:
    81
    FirePros have bugs in workstation apps. Half the price doesn't make them worth the bother.
     
  16. Lozz

    Lozz Top Overpriced Dell

    Reputations:
    536
    Messages:
    2,087
    Likes Received:
    22
    Trophy Points:
    56
    Right, the next time I need to break the password on an iPhone, I'll be sure to pick up a 5970 :rolleyes: Why don't you post some benchmarks that involve the actual applications the Tesla is designed for, instead of some neat application that has no enterprise value? I would also suggest next time pulling a benchmark from some place other than the seller's own product description.
    http://igogadget.com/2010/03/16/recover-wi-fi-and-iphone-passwords-using-ati-video-cards/
    Quadros cost more because of the drivers and software designed for them. When you spend 32 hours rendering 10 minutes of CGI only to have it crash because of bugged drivers on your FirePro, you'll wish you had spent more on the Quadros. Businesses buy... get this... workstation graphics cards, not individual users.

    And Optimus doesn't? AMD will have a packaging advantage (and therefore a power consumption advantage), but that's one of the perks of owning both the processor and GPU divisions within the same parent company.
     
  17. Ruckus

    Ruckus Notebook Deity

    Reputations:
    363
    Messages:
    832
    Likes Received:
    1
    Trophy Points:
    0
    Can never argue with Nvidia fanboys, lol. Yay, Quadros are godly, FirePros are bugged and terrible, nVidia is amazing. Whatever.
     
  18. Lozz

    Lozz Top Overpriced Dell

    Reputations:
    536
    Messages:
    2,087
    Likes Received:
    22
    Trophy Points:
    56
    I can't argue with people who are ignorant of real-world situations and wish to base their arguments on irrelevant data, so we're even.
     
  19. Ruckus

    Ruckus Notebook Deity

    Reputations:
    363
    Messages:
    832
    Likes Received:
    1
    Trophy Points:
    0
    Can't argue with people who aren't even on topic. My response was about innovation. No point in arguing with someone who doesn't even respond to the topic. Good grief. We're not even, not even close, not even on topic.

    And in real-world situations, the data I presented wasn't irrelevant. Password recovery may not be relevant for you, but for much of the corporate world and anyone interested in security, it's huge. Gimme a break; just because it doesn't suit your needs for processing 32 hours of CGI doesn't mean it's not relevant.

    I must remember: only what you deem important is relevant, and everything else is just irrelevant data. Thank goodness for the ignore feature. I don't mind having a discussion, but this is nothing close to a conversation at all, just plain insulting and derogatory.
     
  20. mew1838

    mew1838 Team Teal

    Reputations:
    294
    Messages:
    1,231
    Likes Received:
    5
    Trophy Points:
    56
    Why do people keep assuming that the GTX 480M runs hot? It is definitely cooler than my 5870M. Now that card runs hot, considering it uses less wattage.
     
  21. hakira

    hakira <3 xkcd

    Reputations:
    957
    Messages:
    1,286
    Likes Received:
    0
    Trophy Points:
    55
    Ironically, such recovery algorithms are almost always run under Linux, where ATI cards have inferior support to NV. That's actually one of the few instances where CUDA has a major advantage over Stream.

    Anyway, the 6970 will probably have "20%" on the 5870, and the cycle continues. Shame the 5xx Fermi won't be any cheaper, though.
     
  22. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
    Okay people, FERMI does not run hot just because. How hot it runs depends on the cooling system; FERMI simply draws much more power. That's it.

    As for innovation, I have to give it to AMD. If you study anything regarding the management of technology, you learn how the word innovation is properly applied. Innovation is not only about inventing or pushing forward; it's about making money, proven on the market.

    A new technology can't be innovative if it hasn't been successful on the market.

    Anyway, a 20% increase over the 480 is nothing, and it's probably in line with the supposed increase to 512 cores.

    We will have to wait until next-gen cards for massive increases.

    Now keep on fighting. I'll go back to playing games.
     
  23. Greg

    Greg Notebook Nobel Laureate

    Reputations:
    7,857
    Messages:
    16,212
    Likes Received:
    58
    Trophy Points:
    466
    No. The PCIe spec caps a card at 300W of total board power, which is all the slot plus a 6-pin and an 8-pin connector are rated to deliver.
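
    For reference, the power budget arithmetic under the PCIe spec works out as:

        75W (slot) + 75W (one 6-pin) + 150W (one 8-pin) = 300W

    so a 1000W card would be more than triple what the spec allows for a single board.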

    Also note that silicon junction temperatures top out at about 105C. No card can run much hotter than that without almost immediately damaging itself.

    Please.

    The GTX 580 is supposed to be based on GF104, which is a refinement of the Fermi architecture (GF106 and GF108 were cut-down versions of GF104) that was designed to use less power and be more efficient. GF110 is in the works as well, but there is not much information about it yet.

    The GTX 480 is the GF100. Bottom line is that nVidia screwed up, did not listen to TSMC's recommendations on chip design, and then had to ramp up the power levels in order to get a chip that worked. Thus it is hot, inefficient, and power hungry, because that is what they had to do to even get a product out the door. It's called an engineering miscalculation.

    GF104 was the fix, or at least a significant enough fix that a GTX 460 is on par with or more powerful than a GTX 275 while drawing even less power.

    Not entirely sure about GF106 or GF108, but the bottom line is that they are an improvement over what they replaced while drawing around the same amount of power.
     
  24. wikoogle

    wikoogle Notebook Consultant

    Reputations:
    56
    Messages:
    119
    Likes Received:
    0
    Trophy Points:
    30
    Sorry, I completely disagree. How innovative a product is has almost no bearing on its ability to succeed. Statistically, the most innovative products are the ones most likely to fail.

    Tried and true designs are more likely to succeed. Radical alterations to what people are used to are far more likely to never see wide adoption. Yet, without such departures, things would never get better.

    Some of the greatest innovations of all time failed, often because they were ahead of their time.

    And the technologies and ideas they pioneered often end up succeeding much much later...

    Examples...

    Betamax

    The Segway

    LaserDisc

    The Newton

    And every single thing on this list...

    Nine Technologies Ahead of Their Time - PCWorld
     
  25. Greg

    Greg Notebook Nobel Laureate

    Reputations:
    7,857
    Messages:
    16,212
    Likes Received:
    58
    Trophy Points:
    466
    Actually, there is. The refined versions of Fermi have a superscalar architecture: dual schedulers, and each scheduler can issue up to two instructions to exploit ILP. ATI does not have that yet. Is it a big deal? Maybe not.
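
    As a rough illustration of the kind of ILP a dual-issue scheduler can exploit, here's a toy CUDA kernel (hypothetical names; a sketch, not anything vendor-specific) where consecutive operations are independent:

        __global__ void ilpKernel(const float *a, const float *b, float *out)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            // x and y do not depend on each other, so a superscalar
            // scheduler is free to issue both multiplies together.
            float x = a[i] * 2.0f;
            float y = b[i] * 3.0f;
            // Only this final add depends on both results.
            out[i] = x + y;
        }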

    They've also got 64KB "cache" per SM (48KB L1 cache with 16KB shared memory, or 16KB/48KB). ATI does not have anything that configurable. This, for high performance computing, is a big deal to me.
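
    On the CUDA side, that per-SM split is selected per kernel through the runtime API. A minimal sketch, assuming a Fermi-era toolkit (kernel and buffer names here are hypothetical):

        #include <cuda_runtime.h>

        // Hypothetical kernel that leans on shared memory.
        __global__ void tileKernel(const float *in, float *out)
        {
            extern __shared__ float tile[];      // lives in the shared-memory side of the 64KB
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            tile[threadIdx.x] = in[i] * 2.0f;    // stand-in for real shared-memory work
            __syncthreads();
            out[i] = tile[threadIdx.x];
        }

        int main()
        {
            const int n = 256;
            float *in, *out;
            cudaMalloc(&in,  n * sizeof(float));
            cudaMalloc(&out, n * sizeof(float));

            // Each Fermi SM has 64KB of on-chip memory, split either
            // 48KB shared / 16KB L1 or 16KB shared / 48KB L1. Request the
            // shared-heavy split for this kernel (cudaFuncCachePreferL1
            // would request the L1-heavy split instead).
            cudaFuncSetCacheConfig(tileKernel, cudaFuncCachePreferShared);

            tileKernel<<<1, n, n * sizeof(float)>>>(in, out);
            cudaDeviceSynchronize();

            cudaFree(in);
            cudaFree(out);
            return 0;
        }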

    nVidia also has ECC support built in to their higher end products, and the double precision performance is at least as good as ATI's.

    I cannot remember the GFLOPS ratings for ATI's cards right now but the Tesla units perform at a 1:2 ratio for double:single precision floating point math. It was 1:8 for last gen's nVidia cards.
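
    As a sanity check on those ratios, the Tesla C2050 numbers quoted later in this thread line up, assuming 448 CUDA cores at a ~1.15GHz shader clock and one fused multiply-add (2 FLOPs) per core per cycle:

        single precision: 448 cores x 1.15GHz x 2 FLOPs ≈ 1.03 TFLOPS
        double precision at 1:2: 1.03 TFLOPS / 2 ≈ 515 GFLOPS
        at last gen's 1:8 ratio, the same chip would manage only ~129 GFLOPS double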

    ...and the GTX480 is proven to be faster than $10,000 Tesla cards too. The 480 is clocked higher but has no ECC support, only a single DMA controller, and a 1:8 ratio on FP performance. If you don't need those features, the GTX480 makes more sense financially, as does the 5870/5970.

    You buy $10,000 Tesla cards because you want ultimate reliability. Memory runs at about 1.1GHz instead of ~1.4GHz to ensure stability for longer periods of time (what GTX480 card is rated to run 24-7 for 3-5 years straight?), cores are downclocked slightly to again ensure longer term reliability, ECC support (needed for real scientific computing), two DMA controllers (more efficient memory transfers while the GPU is busy), and that 1:2 ratio on FP performance (again). Those $10,000 cards are still cheaper than building full blown high performance clusters...*sometimes*.

    This is definitely true. They're both good competitors. nVidia just slipped up. ATI is continuously slipping with their drivers right now...hoping that gets fixed by the 6xxx series.

    Not 100% sure on specs but you'll want to look at reliability numbers, double precision throughput, ECC support, and more to determine if one card is priced "too high" relative to the other.

    ...again note that other factors drive the price here.

    Tessellation is awesome when they can do it without the massive performance losses that you could see with ATI products for a while. Not sure if that was driver related, and not sure if that has been fixed, though.
     
  26. Ruckus

    Ruckus Notebook Deity

    Reputations:
    363
    Messages:
    832
    Likes Received:
    1
    Trophy Points:
    0
    Wow, learned a ton from your post, Greg. I will admit there are definitely advantages to FERMI, but... of course I know that at a budget price of $1,500 it likely won't have ECC memory modules. But for $1,500: 147GB/s memory bandwidth, exceeding even the Quadro 6000, plus 2.4 TFLOPS single precision and 532 GFLOPS double precision; that is just as innovative from AMD, in their own manner. The Tesla C2050 is 1.03 TFLOPS single precision and 515 GFLOPS double precision. Can't comment on AMD drivers, but it seems AMD is well aware of their reputation, and hopefully that issue will no longer be a deciding point for purchasing. And I'm sure when demand for FirePro grows, ECC memory will be added, if AMD chooses to compete at $5,000-$10,000 for their GPU.

    In my defense on the innovation point: providing that kind of performance for $1,500, for anyone whose data may not be as critical as yours and for whom a system without ECC memory would suffice, is very innovative. Something Nvidia is unable or uninterested in providing.

    Anyway, sounds like you do some serious computing, Ross!
     
  27. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
    Which is why I stated that you need to STUDY technology management, or properly study the branches dedicated to studying technology.

    The media and almost everyone else use the word innovative for just about everything. The amount of technology, inventions, and patents in the works that take radical new approaches is so overwhelmingly huge that the word innovative would lose its entire meaning, which is why it is usually reserved for ideas with actual market potential.

    If you have a great idea but it tanked with no success on the market, it was probably a great invention ahead of its time, but it wasn't innovative, as it never reached its potential.

    Design and innovation management - Guiding principles of good design
     
  28. IKAS V

    IKAS V Notebook Prophet

    Reputations:
    1,073
    Messages:
    6,171
    Likes Received:
    535
    Trophy Points:
    281
    All I want to know is: when is this GPU going to be released?
     
  29. Harleyquin07

    Harleyquin07 エミヤ

    Reputations:
    603
    Messages:
    3,376
    Likes Received:
    78
    Trophy Points:
    116
    Is notebookcheck correct when they claim a 480m SLI setup actually consumes less power than the equivalent Crossfire 5870m? Funny since the single cards are the exact opposite in power consumption ratings.

    Back on topic: Early days yet, it sounds good that Nvidia are announcing the counter to the AMD 6xxx series, but saying it's 20% better than the 480m right off the bat smacks of hyperbole.
     
  30. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    In the mobile market, GF110 will almost definitely never be released.

    It's purely an HPC/desktop GPU, and even chopped down and underclocked like the GF100-based GTX 480M, Nvidia is better off rebranding GF104 as a mobile GTX 580M... especially when you consider Nvidia is making little to no profit on each GF104 in the desktop market but thinks it can charge a $395 premium over GF106 for scrap GF104 parts as the GTX 470M.

    The rumored GF112/114 parts have a somewhat better chance of making it to some kind of mobile form, but so far the rumors on them suggest only an extra 64 bits on the memory bus over a fully enabled GF104. That's only an improvement for the mobile market if downclocked GF104s are bandwidth limited.
     
  31. rschauby

    rschauby Superfluously Redundant

    Reputations:
    865
    Messages:
    1,560
    Likes Received:
    0
    Trophy Points:
    55
    I consider this announcement of the 580 a 100% diversionary tactic. Nvidia isn't even close to a new lineup; they just finished their 450 release within the past few weeks. AMD has had their 5xxx series out for over a year; it's mature and in need of a refresh. Now that AMD is ready to move on to the next generation, Nvidia wants to keep themselves relevant.

    {edit} What am I saying!!! We all know Nvidia never comes out with new products more than twice a decade. I'm sure they already have the 580 & 680 series stickers ready to slap on their current products to keep things "fresh" as they concentrate on the Tegra platform.
     
  32. Lozz

    Lozz Top Overpriced Dell

    Reputations:
    536
    Messages:
    2,087
    Likes Received:
    22
    Trophy Points:
    56
    Nvidia is doing what any relevant company that has reached its saturation point in a market would do: diversifying. That's why it's going after phones, deeper into Tegra, and netbooks with Optimus. AMD did the same thing by buying ATI, and neither has had any major architecture change since G80 or R600, btw. What's shocking is that ATI isn't totally and completely blowing Nvidia away more than it is; they have a lot more chip-making power, they have a huge parent company backing them by name now, and they're not managing anything better than what they've been doing for the last decade. Contrary to your narrow point of view, shareholders don't care if PC nerds want Nvidia to just make graphics cards; they want growth, and the best way to do that is to venture into new markets.
    You are absolutely wrong. Last I checked, Nvidia still had >45% discrete graphics card market share. In fact, it was only this past June that ATI finally eked out 51% in the discrete market. You're kidding yourself if you think the 6000 series is anything more than a fly on Nvidia's shoulder. Nvidia will not suddenly roll over and die in the period between the 6000 cards becoming available and the time it takes to get the 580 out the door.
     
  33. skoreanime

    skoreanime Notebook Consultant

    Reputations:
    40
    Messages:
    148
    Likes Received:
    1
    Trophy Points:
    31
    QFT.

    Unlike AMD/ATI, Nvidia has been diversifying into different market sectors that will grow their overall lineup and, in the long run, make them a better, bigger company. AMD bought ATI for a reason: they wanted to diversify into a new sector instead of relying on their rivals for GPU solutions.

    Nvidia is no different. Their road map is to capture the professional market even more and make a statement in the expanding, competitive cell phone market. Also, we know they're contending against Intel in the wireless display market as well... that should be fun to watch.
     
  34. GapItLykAMaori

    GapItLykAMaori Notebook Evangelist

    Reputations:
    28
    Messages:
    385
    Likes Received:
    0
    Trophy Points:
    30
    >.> Kepler is going to be a more efficient architecture; however, I still have my doubts because we are nearing the limits of 40nm transistors. Even though Nvidia has pretty much failed this gen, it's because of the change to their "inefficient" architecture. I suspect ATI is slightly scared to move on to a brand new one because the effects could be devastating to their revenue, as displayed by Nvidia. Since the RV670, ATI has just been increasing stream processors and tweaking the overall architecture to give more performance per watt; however, they are going to hit a wall if they continue this way. When that will happen, we do not know.

    Yeah, smart move for both companies. At the time they were both underdogs, and by working together/sharing tech they became much stronger, and AMD managed to pay off their huge debts this year :)
     
  35. rschauby

    rschauby Superfluously Redundant

    Reputations:
    865
    Messages:
    1,560
    Likes Received:
    0
    Trophy Points:
    55
    I'm not a bondholder, I'm a customer of GPUs, so obviously my narrow point of view keeps me from caring about the company as a whole. I'm not concerned about the long-term business strategies of Nvidia and its diversification into other new products and markets. Also, I wouldn't call ATI going from 23% market share in 2007 to the current 51% a "fail"... but we all have our own opinions on how to define success. Also, LOL at you trying to paint Nvidia as the underdog.

    45% market share isn't bad, unless you only have one other competitor in your market and used to have 65%.
    Nvidia has watched their overall market share decline over the past few years, all while still holding the larger share of the professional/business (Quadro/FirePro) market segment. ATI has made all its gains on the Radeon product. Sure, Nvidia will come back, but you can't deny the history of the past 3 years. Nvidia sat on their hands counting on the G80/G92 to carry them for the next 3 years until they released their savior: Fermi. And you're right, ATI didn't change much... but they didn't have to.
     
  36. skoreanime

    skoreanime Notebook Consultant

    Reputations:
    40
    Messages:
    148
    Likes Received:
    1
    Trophy Points:
    31
    Naw, AMD is still very much in the red. It's going to take a while to pay off that $5.4 billion buyout of ATI.
     
  37. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    While the GF100 was not a great chip, their 2nd generation GF104 (GTX 460) is dominating the current mid-range GPU market and is very efficient.
     
  38. rschauby

    rschauby Superfluously Redundant

    Reputations:
    865
    Messages:
    1,560
    Likes Received:
    0
    Trophy Points:
    55
    Which is precisely the reason AMD rushed out their 6850/6870 - to directly compete with the 460. The 460 is a huge success for Nvidia, and its biggest/only success for Fermi thus far.
     
  39. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
    Considering that the HD6850/70s are already out, they are technically the current best mid-range GPUs. That and the sudden price drop of the GTX470 make the current mid-range rather exciting.

    Suddenly the GTX460 doesn't seem that awesome anymore, huh? Let's see a GTX 560 come around quickly to lower prices even more!
     
  40. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    The GTX 460 is still the best deal; there have been great deals from $100-200, while the HD 6850 is new and hence still $200+.
     
  41. skoreanime

    skoreanime Notebook Consultant

    Reputations:
    40
    Messages:
    148
    Likes Received:
    1
    Trophy Points:
    31
    Really thankful for AMD releasing the 6xxx lineup.

    Means there will be plenty of used GTX 460s on ebay soon enough. I'll get one for dirt cheap to slap in SLI with my current one and enjoy a nice jump in performance.

    <3 competition. Everyone wins.
     
  42. GapItLykAMaori

    GapItLykAMaori Notebook Evangelist

    Reputations:
    28
    Messages:
    385
    Likes Received:
    0
    Trophy Points:
    30
    Nvidia could have made the GF104 a lot better; however, that would have meant it competing with their own products and hence, in the long run, killing themselves. I see this chip being rebranded and reused a lot, like the G92 :p

    Are you sure? I read somewhere that they had paid off most of their debts and it was not a concern anymore.
     
  43. skoreanime

    skoreanime Notebook Consultant

    Reputations:
    40
    Messages:
    148
    Likes Received:
    1
    Trophy Points:
    31
    AMD Reports Third Quarter Results | techPowerUp

    $5.4 billion will take a company like AMD a while to recover from.

    People assume just because AMD did very well with the 5xxx series that they're suddenly in the clear. Not even close.
     
  44. Lozz

    Lozz Top Overpriced Dell

    Reputations:
    536
    Messages:
    2,087
    Likes Received:
    22
    Trophy Points:
    56
    It would be bad if the PC market sector were still expanding, but it's shrinking by the day. I'm sure you're aware that the vast majority of games today are played on consoles? The heyday of PC gaming is over, and it only gets worse from here, especially since 5-year-old consoles still manage better eye candy than most PCs. It doesn't make any sense to hold 65% of a shrinking market.
    It will take a long time, but the $1.45 billion payout by Intel, and the fact that they're actually posting profits now that they've offloaded GlobalFoundries, means AMD is healthier and more stable than it's been in a very long time. It's not like a credit card loan; as long as AMD keeps on chugging, it's not a huge problem.

    The payout by Intel let them pay off debts from GF and some other things, but that's it. Some debt for a company like AMD is good; it improves their credit rating and allows them to borrow larger amounts.
     
  45. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
  46. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    How do you know it's a rebrand? If you mean reusing the architecture, ATI is using the same architecture for their 6xxx series, though they have refined their shader complexity; nVidia did something similar going from GF100 to GF104.
     
  47. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Because it takes a lot longer than a month and a half to make those kinds of arch changes.
     
  48. Lozz

    Lozz Top Overpriced Dell

    Reputations:
    536
    Messages:
    2,087
    Likes Received:
    22
    Trophy Points:
    56
    GF104 is the arch change.
     
  49. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    No, G92 -> GF100 is the arch change; GF104 is a refined version of GF100.
     
  50. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    GF104 was in the works way before GF100 was released... and if there are new arch changes coming, they would have had to be taped out shortly after GF104 started its production. All that really means is that not only did Nvidia release a bad Fermi with GF100 knowing they had better tech already set to launch a couple of months later, but they also launched GF104, GF106, and GF108 knowing they would be replaced a couple of months later too.

    The more likely scenario is that these are respins of the existing arch to fix yield problems, so they get more functional 384-core GF104s and 512-core GF100s that could also likely hold higher clocks more effectively. All silicon goes through respins, and it's nothing that deserves a new generation of model numbers. At most it deserves a b-suffix at the end of the GPU codename: GF100b, GF104b, etc.
     