The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

Concerning the next series of graphics cards coming up (600M/7000M series)

    Discussion in 'Gaming (Software and Graphics Cards)' started by KaWiCH, Jan 7, 2012.

  1. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
^^ Good info, thanks! However, as I expected, it has the same number of CUDA cores. One more thing: it has lower clocks than the 550M. What does this mean? Are they really just going for heat and performance/watt?
     
  2. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I think it could mean that Nvidia actually was telling us the truth earlier: that their Kepler GPUs will give us better performance/watt compared with Fermi :)

    This ancient picture:
[image]
     
  3. hydra

    hydra Breaks Laptops

    Reputations:
    285
    Messages:
    2,834
    Likes Received:
    3
    Trophy Points:
    56
Well, I feel much better about future console releases and about maxing out those badly written ports ;)
     
  4. midzi

    midzi Notebook Guru

    Reputations:
    111
    Messages:
    72
    Likes Received:
    0
    Trophy Points:
    15
GT 630M = GT 650M?

    Looks like a GPU-Z error. I haven't got another explanation.
     
  5. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I'm going to have to wait until it's confirmed that those chips are or aren't Kepler parts.

    For all I know, we're looking at GT 525Ms or something.
     
  6. funky monk

    funky monk Notebook Deity

    Reputations:
    233
    Messages:
    1,485
    Likes Received:
    1
    Trophy Points:
    55
Isn't Kepler just a shrink of current Fermi tech anyway?
     
  7. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
Here is something REALLY interesting about Kepler. I'm just quoting small pieces of a very long, informative article, but these are the most important bits. :)

    Looks like Nvidia is pulling out the big guns for PhysX
    Physics hardware makes Kepler/GK104 fast | SemiAccurate
     
  8. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
  9. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
Nice! Though I think that article is too detailed to be BS (or not?). Should we expect the GTX 680M to be a downclocked GTX 660? And yeah, it kinda sorta bashes AMD hard.
     
  10. wild05kid05

    wild05kid05 Cook Free or Die

    Reputations:
    410
    Messages:
    1,183
    Likes Received:
    0
    Trophy Points:
    55
  11. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
I don't want to be a jerk but I can't help it: optimization, not optimisation... (sorry, but math major :))

BTW, thanks for the news about the 600 specs :) However, it looks way too optimistic; I'm a little pessimistic about this.
     
  12. ryzeki

    ryzeki Super Moderator Super Moderator

    Reputations:
    6,552
    Messages:
    6,410
    Likes Received:
    4,087
    Trophy Points:
    431
  13. jamsquid

    jamsquid Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    0
    Trophy Points:
    5
My computer broke down, and this is probably the worst timing imaginable... I really want to wait for the new Ivy Bridge/GPU updates.

Do you think that when the new hardware comes out the price will be higher than the current models', or will it just replace the current models without an increase in price?
     
  14. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Hmm, well I was thinking the 680M would have 512 sp, so the lineup could possibly be:

    680M = GTX 660 - 512sp, 256-bit
    675M = GTX 650 Ti - 448sp, 224-bit
    670M = GTX 650 - 256sp, 192-bit
    660M = GTX 640 - 192sp, 128-bit

    Man, the 670M and 660M would be looking so disappointing.

    670M definitely wouldn't be toasting the 580M, unless Kepler has super shaders.

    I don't know, this seems pretty fishy.

    Ideally, I'd like to see the GTX 680M come from the 660 Ti (but make it 256-bit), so each GPU could be moved up one notch, meaning the 670M would come from the GTX 650 instead.

Definitely more expensive than what's out now, depending on the manufacturer.
     
  15. Zero989

    Zero989 Notebook Virtuoso

    Reputations:
    910
    Messages:
    2,836
    Likes Received:
    583
    Trophy Points:
    131

512sp would be amazing for a TDP limit of 100W.

I doubt it'll happen. And all the charts suggest that the GTX 680/675 have a 256-bit memory interface, unless Nvidia changes that at the last second.
     
  16. wild05kid05

    wild05kid05 Cook Free or Die

    Reputations:
    410
    Messages:
    1,183
    Likes Received:
    0
    Trophy Points:
    55
1.75 GB already makes it BS.
     
  17. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
Why? 7 chips of 256MB each on 7 × 32-bit channels gives exactly 1.75GB on a 224-bit bus, so it is consistent with the specs shown.

@Kevin: You really think that the GTX 660M will be like the GTX 640? Isn't the GTX 650 more plausible? 192-bit like the 560M, and 256 cores instead of 192? That would have been amazing :D It is rumored that the cores will be more effective than Fermi's too, so keep that in mind.
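A quick sanity check of that memory math (a minimal sketch; the 7-chip layout is the rumored configuration, not a confirmed spec):

```python
# Rumored config: 7 GDDR5 chips, each 256 MB, each on its own 32-bit channel.
chips = 7
mb_per_chip = 256
bits_per_channel = 32

vram_gb = chips * mb_per_chip / 1024   # -> 1.75
bus_width = chips * bits_per_channel   # -> 224

print(f"VRAM: {vram_gb} GB, bus width: {bus_width}-bit")
```

So 1.75GB and 224-bit are two views of the same 7-channel layout rather than two independently suspicious numbers.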
     
  18. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Yes, it being the GTX 650 makes much more sense, from technical and PR perspectives.
     
  19. Devenox

    Devenox Notebook Evangelist

    Reputations:
    65
    Messages:
    417
    Likes Received:
    9
    Trophy Points:
    31
This is so fake...
Even if the rough specs (memory width, amount of VRAM, shaders) were right, it's impossible that the final CLOCKS of the core and memory would already be known this far before a release. Even Nvidia doesn't know the final clocks yet, nor the price.

Also, 224-bit is VERY unlikely; cards have always been built from 64-bit memory modules.
     
  20. wild05kid05

    wild05kid05 Cook Free or Die

    Reputations:
    410
    Messages:
    1,183
    Likes Received:
    0
    Trophy Points:
    55
Nvidia's PR team is working real hard.
     
  21. midzi

    midzi Notebook Guru

    Reputations:
    111
    Messages:
    72
    Likes Received:
    0
    Trophy Points:
    15
Fake and BS: two words.

Nevertheless, it's a really possible scenario that the GTX 660M will be equal to the GTX 560M. New process, lower thermals, but the same overall performance.
     
  22. Devenox

    Devenox Notebook Evangelist

    Reputations:
    65
    Messages:
    417
    Likes Received:
    9
    Trophy Points:
    31
That doesn't matter; of course it will have a lower TDP for the performance, because of the 28nm process.

What you're doing is gambling. I could also make a table with specs. Would you believe me? Because that's the same thing that happened here.
     
  23. superman3486

    superman3486 Notebook Consultant

    Reputations:
    83
    Messages:
    195
    Likes Received:
    18
    Trophy Points:
    31
Also take into account that the green team's 3xxM series was very quickly phased out, as it was more of a gap filler, and there was no real architectural difference between the GTX 260M and the GTS 360M found in the ASUS series of laptops. Perhaps Nvidia will pass on any desktop/mobile 6xx series; it's all speculation :)
     
  24. Star Forge

    Star Forge Quaggan's Creed Redux!

    Reputations:
    1,676
    Messages:
    2,700
    Likes Received:
    10
    Trophy Points:
    56
That is possibly true. Unless Kepler is going to be branded as the 6xx series, I don't think they want to pull another 9xxx, 2xx, and 3xx like they did with Fermi... However, the question that remains, for laptops at least, is whether nVidia is going to keep milking Fermi for now or start using Kepler for its mobile solutions.
     
  25. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Nvidia has confirmed multiple times that all of the high-end 600M chips are Kepler.
     
  26. Devenox

    Devenox Notebook Evangelist

    Reputations:
    65
    Messages:
    417
    Likes Received:
    9
    Trophy Points:
    31
  27. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    640 and 512 are shader counts I'd expect of the 7600 series, not 7700.

In fact, even the previous article, which VR-Zone refers to in that one, says the 7770 has 896 SP or 14 CUs, while many other sources have also reported the 7770 and 7750 as having 896 and 832 SPs, respectively.
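For reference, GCN compute units each contain 64 stream processors, so the CU and shader counts in those reports can be cross-checked (a tiny sketch; the part names are the rumored ones from the article, not confirmed SKUs):

```python
# Each GCN compute unit (CU) contains 64 stream processors (SPs).
SHADERS_PER_CU = 64

for name, cus in [("rumored 7770", 14), ("rumored 7750", 13)]:
    print(f"{name}: {cus} CUs -> {cus * SHADERS_PER_CU} SPs")
```

The 14 CU / 896 SP and 13 CU / 832 SP pairs are internally consistent, which at least suggests the leak wasn't made up from whole cloth.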

    Seems like some wires were crossed, and they're actually talking about the 7670 and 7650.

    I expect the desktop 7700 to become the 7800M, 7800 to become the 7900M, and it will not shock me if the aforementioned 7600 replaces the rebranded 6770M (which is currently being passed off as the 7600M) with true next-gen parts.
     
  28. daveh98

    daveh98 P4P King

    Reputations:
    1,075
    Messages:
    1,500
    Likes Received:
    145
    Trophy Points:
    81
Just for some perspective: if you need a new computer, now is a great time to buy. SB processors are already stupidly fast, and the current top-tier GPUs can eat up anything thrown at them. Technology moves so fast that it is very easy to take an "I will wait for the most opportune time to buy" attitude, but that time rarely comes along.

I would say we are in that opportune time right now, because new consoles are still a few years off, probably due to the bad economy. If new consoles arrive in late 2013, then even the most tricked-out laptop you can buy at the END of this year won't run the majority of console ports. Just look at some of the PC requirements needed to run console ports. Sure, we may play at higher resolutions, but overall it's just poor optimization.

    I have found with every new console release, only the absolute best desktops that are available around the time of the console launch have any longevity. The reality is that the money is being made on the consoles and then porting to the PC as an afterthought. So whether you have 5870s/6990s/7990s...it won't make a difference in a few years. Right now if you have a very powerful notebook, you should be set until the next consoles arrive.

    That is just my take on it. My next purchase will be about a year or so after the consoles hit (unless there are some amazing PC exclusives that come out). I got great return on investment with my m15x with the 8800GTX and will likely get another couple years out of my current rig.
     
  29. Devenox

    Devenox Notebook Evangelist

    Reputations:
    65
    Messages:
    417
    Likes Received:
    9
    Trophy Points:
    31
^^
What you said made sense in 2005/6, when the Xbox 360 and PS3 were aiming for high-end GPUs inside. This time, (current desktop) midrange parts will be used:
a 6670 for the Xbox 720/next.
I know ports are extremely poor, but 6670 performance is already outclassed by current high-end notebooks. So in 2013/2014, 6670 performance will be outclassed by midrange notebooks.

Remember there will be a wide range of hardware for next-gen consoles:
Nintendo Wii U (low end)
Xbox 720 (midrange)
PS4 (unknown, but probably between midrange and high end)
     
  30. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
My take is that once the new tech comes out, people will try to get their hands on the new stuff, so there will be a price drop for current tech.
     
  31. daveh98

    daveh98 P4P King

    Reputations:
    1,075
    Messages:
    1,500
    Likes Received:
    145
    Trophy Points:
    81
Oh, maybe I am way out of the loop; you are correct that I was going on the past regarding consoles. Are the specs leaked for the next Xbox? If it is using a midrange desktop GPU from 2012, I will be rather disappointed. I liked how they usually were right on the cutting edge of current technology when released. What are the details of the new Xbox?

edit: I did research the webs, and I will be very shocked if they use a GPU like that. October/November of 2013, and they are using a GPU that is midrange at best from 2011? I think it's FUD, but time will tell. Sure, there are a lot of console-specific optimizations and you don't need a bleeding-edge card, but that just seems silly. Plus, it's not even on the newer die shrink, which would reduce heat and improve efficiency and power... it doesn't make sense. I guess maybe I could keep my 580M SLI for like 10 years then.
     
  32. lozanogo

    lozanogo Notebook Deity

    Reputations:
    196
    Messages:
    1,841
    Likes Received:
    0
    Trophy Points:
    55
I think it is too early to speculate about the hardware of the next-gen consoles (save the Wii U, which apparently aims for current-gen console graphics).
     
  33. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    The Wii U is actually going to be something like 3 times more powerful than the 360 and PS3.

    If the 6670 rumors are true, the NeXtbox is in the range of 6 times the current HD consoles.

    Sony will probably survey this and go for 8 or 9 times the current gen, if they stick to their status quo.
     
  34. Devenox

    Devenox Notebook Evangelist

    Reputations:
    65
    Messages:
    417
    Likes Received:
    9
    Trophy Points:
    31
Yeah, I know 3 or 6 times the current speed sounds impressive,
but we already had that power in 2010 with a 5870 or GTX 285.
Also keep in mind that x times faster doesn't translate into x times BETTER.
It can just calculate more, but if they are going to use tessellation, the 6670 is a weak choice.
And the Nintendo Wii U uses a 4000 series card, so that's even DX10.
If they were to release a new console every 4 years, it would be OK, but not for 8-10 years like now...
I know some people say tessellation is not that impressive, but it IS. It's just not well implemented (yet).
But I think it's a rather big (visual) difference if you watch demos like 3DMark 11 or the Heaven benchmark.
     
  35. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
If this becomes the newest trend, RIP console gaming :rolleyes:

[images]
     
  36. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
I wouldn't be surprised if MS's next Xbox uses an "M" variant of that GPU. The thermal and power consumption issues they deal with on consoles have caused quite a few problems for some time. Running more effective, or even passive, cooling and low-wattage chipsets will probably be a big advantage for MS.
     
  37. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30

The Xbox 360's Xenos processor was essentially a Radeon X1800 (released Oct 2005, a month or so before the X360). So yes, they did use the top-tier graphics processor of the day in the 360.

However, the next Xbox really only has to shoot for 1080p playback at 30+ fps. So much has advanced in the last 7 years that even an upper-midrange part like the 6770 would perform VERY well. Really, they just want the DirectX feature update and enough muscle to power that 1080p resolution.
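Rough pixel-throughput framing for that 1080p target (a back-of-envelope sketch; the 720p/30 baseline is the typical current-gen target, assumed here for comparison):

```python
# Pixels per second that the GPU has to shade at each target.
def pixel_rate(width, height, fps):
    return width * height * fps

baseline = pixel_rate(1280, 720, 30)   # common current-gen target
goal = pixel_rate(1920, 1080, 30)      # the 1080p/30 goal above

print(f"720p30:  {baseline:,} px/s")
print(f"1080p30: {goal:,} px/s ({goal / baseline:.2f}x the baseline)")
```

1080p30 is only 2.25x the raw pixel rate of 720p30, which is why a midrange 2012-era part clears that bar comfortably.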

I think this is a good thing, as MS is actually still concerned with their "Games for Windows" program, and the ability to easily port titles and keep them playable there is valuable. One of MS's biggest advantages over OSX is the game market, so cutting that down would be unwise.
     
  38. lozanogo

    lozanogo Notebook Deity

    Reputations:
    196
    Messages:
    1,841
    Likes Received:
    0
    Trophy Points:
    55
The DX versions are not the main issue. The 360 had a DX variant that can officially be catalogued as DX 9.5 (DX 9 + some features of DX 10). I hope MS will use a higher-end card for their next console; otherwise, graphical stagnation will come fast...
     
  39. long2905

    long2905 Notebook Virtuoso

    Reputations:
    2,443
    Messages:
    2,314
    Likes Received:
    114
    Trophy Points:
    81
I call bogus on this one. M$ has their hands full with Kinect already (the rumored 2.0).
     
  40. Wallzii

    Wallzii Notebook Consultant

    Reputations:
    92
    Messages:
    236
    Likes Received:
    3
    Trophy Points:
    31
    Sorry guys, double-post. Anyway, moooooving on....
     
  41. Wallzii

    Wallzii Notebook Consultant

    Reputations:
    92
    Messages:
    236
    Likes Received:
    3
    Trophy Points:
    31
    To be honest guys, anyone who thinks that the next generation Xbox is going to utilize 2011 hardware for their GPU core is seriously misinformed. This isn't a winning solution for Microsoft, and isn't going to generate their gaming division any economic surplus in the long run.

    We need to think about this from a business perspective first and foremost, as that is what Microsoft, a company, is running. What keeps any company or business generating revenue and investors is profit, pure and simple. Investors don't give a sh*t about the power of the hardware or how many frames a game is going to run at, again, they care about a constant revenue stream, profitability, and assurance that their investment decision was sound and protected.

Why is this important in this conversation, or even relevant? Simple: Microsoft, or any other major console manufacturer for that matter, doesn't make its core revenue from hardware sales. On the contrary, as history dictates, they take a loss over time on their hardware placement strategy to supply a market for their real money maker: royalties (i.e., software, peripherals, etcetera).

With that in mind, think about releasing a system with outdated hardware that is already over a generation old. Yes, it is more "powerful" than the previous console hardware out there, but how long will it provide developers a stable platform to produce the content they want to deliver to their audience, compared to the competition? How long will this hardware platform continue to deliver the results studios demand before a newer generation is called for to step things up to the next level?

Research and development isn't cheap, and you can bet that the amount of money Microsoft, Nintendo, and Sony threw into the R&D of the current consoles was extremely high, bordering on excessive, and something never before seen in this industry, even considering inflation. With this in mind, it isn't sound business from an investment point of view to release something that isn't going to offer a life cycle on par with or greater than the previous product in the same line.

    The next Xbox will not launch with this dated hardware, nor will the next Playstation. The Wii U can get away with offering slightly-better-than current-gen hardware (and I say this lightly as nothing is finalized) because the current install base is invested in how they interact with their device rather than the polygons it can push. Throw a couple million more of those polygons at someone with a more intuitive interaction interface and you can bet they will be back for more classic Nintendo titles, time and time again. This becomes especially true now that the install base for an HDTV in the average home is much higher than it was six years ago to take advantage of the mainstream market that Nintendo initially targeted and continues to capitalize on.

This next generation will be something special visually, and the hardware used won't be anything we have as of February 2012. Whether the gameplay and design will match is another story, however. This is not to say that the current top-end GPU will be less powerful in raw performance than what the next-gen consoles offer, but at that same performance level there will be a smaller die size, more efficient production, and better heat management and energy consumption. Again, all things that drive profit margins and make board directors keep their jobs when investors come a-knockin'.
     
  42. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
I agree with your statements that MS is going to do what they can to recoup their money as early in the console's life as possible. It is a for-profit business, and they do make money on the back end. I think taking a year or two of losses on hardware is becoming less palatable, especially since Nintendo proved that selling lower-powered hardware is viable and profitable.

Microsoft has actually improved upon the Xenos processor since its inception, with die shrinks from 90nm to 65nm to 45nm over the years. So even though the Xbox platform is fairly static, the actual hardware has improved somewhat in efficiency and cost. (The smaller the die, the more dies per wafer and the lower the cost.)
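That die-size point can be made concrete with the standard dies-per-wafer approximation (a simplified sketch that ignores yield; the die areas are hypothetical illustration values, not real Xenos figures):

```python
import math

# Standard approximation: usable wafer area over die area,
# minus a correction for partial dies lost at the wafer edge.
def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Hypothetical die areas for the same design at two process nodes.
for node, area_mm2 in [("90nm", 180), ("45nm", 50)]:
    print(f"{node}: ~{dies_per_wafer(300, area_mm2)} dies per 300mm wafer")
```

Shrinking the same design from a hypothetical 180mm² to 50mm² roughly quadruples the dies per wafer, which is where the cost reduction comes from.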

If I had to guess, I would say that MS will outfit the next Xbox with a 28nm part. That suggests a minimum of a 77xx-class or better GPU. To further lower costs, I could see MS even going with an all-in-one APU. Perhaps they will have a custom variant of a Fusion-style processor.

No matter whether they go integrated or discrete, the next Xbox will be a powerhouse, even if it is "only" a 7750. They have really worked the current Xenos to a level most people would never have thought possible 7 years ago. With a huge boost in power from a more modern GPU, they should have no problem getting another 8 years of life out of it.

It really comes down to consumer expectations meeting return-on-investment potential. Could MS put a 7970 in the next Xbox? Sure, but it would probably cost $700 at launch. I am putting my bet on a relative of the 77xxM part as the heart of the next box. They could still hit the $299 price point they seek and get a solid machine out the door that will wow people until 2020.
     
  43. Wallzii

    Wallzii Notebook Consultant

    Reputations:
    92
    Messages:
    236
    Likes Received:
    3
    Trophy Points:
    31
I'm on the same train of thought as you, sir; you've just nailed it down in finer detail regarding the hardware and expectations thereof. I couldn't agree with what you've said more. The thought of these midrange 6xxx series ATI cards is purely asinine, to those who have initiated such rumours. I can definitely see a mid-to-high-range 7xxx series equivalent as a starting point in the next Xbox, but without knowing the real-world performance of this hardware, I can't yet suggest the M-series as the clear winner here.
     
  44. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Yeah probably. That Xbox 720 pic is just a rumor.
    Wii U is real though. Looks really awful. :p
     
  45. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    Free bump. Any new info?
     
  46. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
Do you think Sony will release concurrently with Microsoft, since MS beat Sony to market by a year with the last release? Or perhaps Sony would prefer a year's lag? I don't think it would hurt either of them either way.
     
  47. kurtcocaine

    kurtcocaine Notebook Evangelist

    Reputations:
    243
    Messages:
    655
    Likes Received:
    0
    Trophy Points:
    30
  48. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
It's amazing that a 128-bit card with 640 GCN shaders is faster than the stock 6970M in the 3DMark 11 GPU score.

I wish people would stop saying the next mobile cards won't be a significant leap.
     
  49. gamba66

    gamba66 Notebook Evangelist

    Reputations:
    63
    Messages:
    491
    Likes Received:
    3
    Trophy Points:
    31
It's not that amazing, since the 6970M is a downclocked Barts XT chip, which is used in the 6870, and after all we are talking about the 7770 desktop part, so it's not fair to compare it to the mobile cards... also, the 7770 has a TDP of 85 watts.

In my opinion, the 128-bit vs. 192-bit and so on doesn't matter at all; it's no handicap for these new cards, because they use a new architecture...

My biggest hope for Kepler and AMD's 28nm GPUs is that smaller laptops can now have more powerful GPUs, maybe even some 13.3-inch laptops :) But please, with Nvidia Optimus.
     
  50. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
First off, the 6970M is a downclocked 6850. There's no reason it can't be compared to the desktop chips within this context. So how is it not impressive? They've improved the architecture to the point where a 640-shader, 128-bit card outpaces the previous gen's 960-shader, 256-bit part, and it does so at a much lower TDP.
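The bandwidth picture makes that architectural gain even starker (a quick sketch; the memory data rates below are the commonly listed reference specs, so treat them as assumptions):

```python
# GDDR5 bandwidth = bus width in bytes * effective data rate.
def bandwidth_gb_s(bus_bits, data_rate_mtps):
    return bus_bits / 8 * data_rate_mtps / 1000

# Assumed reference memory specs: 7770 at 4500 MT/s on a 128-bit bus,
# 6970M at 3600 MT/s on a 256-bit bus.
print(f"HD 7770:  {bandwidth_gb_s(128, 4500):.1f} GB/s")
print(f"HD 6970M: {bandwidth_gb_s(256, 3600):.1f} GB/s")
```

If those figures hold, the 7770 wins the GPU score with only about 60% of the 6970M's memory bandwidth, which says a lot about the new architecture's efficiency.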

That is impressively efficient, at least. AMD could probably get that down to 60W for a mobile derivative, and it'll be hugely overclockable.

    But I'm more interested in the 7800 series and what it means for the 7900M.
     