The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Upcoming GPU Info Thread - 6xxM / 7xxxM Discussions!

    Discussion in 'Sager and Clevo' started by Blacky, Feb 16, 2012.

  1. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Maybe that's why the new Alienwares get the 7970M and why Sager is discontinuing the 6970M/6990M. New GPU coming soon. Kind of a no-brainer to get the 7970M if they offer the 675M also. :)
     
  2. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    You mean the 680M, right? The 7970M is supposed to be stronger than the 675M/580M :)
     
  3. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Huh? I don't understand what you mean.

    If I would pick the 680M instead of the 7970M? I don't know, I want to see the performance of both first. The 7970M will be a lot better than the 675M/580M for sure :)

    GTX 680M at the end of June and the 7970M in May. Ivy Bridge in late April. When Ivy is released, there are only a couple of weeks to wait for the 7970M, or 1.5 months for the 680M. Sounds good. :)
     
  4. mmarchid

    mmarchid Notebook Evangelist

    Reputations:
    133
    Messages:
    496
    Likes Received:
    0
    Trophy Points:
    30
  5. Altair4

    Altair4 Notebook Consultant

    Reputations:
    12
    Messages:
    130
    Likes Received:
    3
    Trophy Points:
    31
    New 18" P370EM SLI GTX670M/GTX675M - earliest possible release in June?
     
  6. Gear332

    Gear332 Notebook Evangelist

    Reputations:
    181
    Messages:
    496
    Likes Received:
    68
    Trophy Points:
    41
    No, the 670M and 675M should be out next month.
     
  7. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Nevermind. Wrong post
     
  8. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,485
    Trophy Points:
    681
    No, that says that the new 17-inch P370EM with SLI has no release date yet and we won't see it until June for sure.
    Because that's when the Clevo release timeline with proper dates ended at the time of his post. ;)
     
  9. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Let's hope so, there sure is a hell of a lot of confusion around these forums nowadays with all the pre-releases and rebrandings going on :p
     
  10. hotblack_desiato

    hotblack_desiato Notebook Consultant

    Reputations:
    34
    Messages:
    124
    Likes Received:
    0
    Trophy Points:
    30
    So I really haven't been following this subject very much, but could anyone summarize the speculation on the relative performance of the 7970M vs the 6990M?
     
  11. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    We haven't really got any information regarding the 7970M... It would be cool if it was indeed stronger than the 6990M; then we would at least have some kind of expectation for the 7990M :) but so far, nothing...
     
  12. Aikimox

    Aikimox Weihenstephaner!

    Reputations:
    5,955
    Messages:
    10,196
    Likes Received:
    91
    Trophy Points:
    466
    I'd expect a performance boost anywhere in the 20-30% range for the 7970M over the 6990M. Look at the (desktop) HD 6870 vs HD 7850 in terms of performance and TDP.

    Here you go, check the power consumption, idle and load temps and do your math. If AMD uses the same TDP for the 7970M as was used for the 6990M, they can use higher relative clocks (less downclock) and thus gain even more performance.
     
  13. GTRagnarok

    GTRagnarok Notebook Evangelist

    Reputations:
    556
    Messages:
    542
    Likes Received:
    45
    Trophy Points:
    41
    If the 7970M is based on the desktop 7850, it'll be about 20% faster than the 6990M. A 7990M based on the desktop 7870 would be about 50% faster than the 6990M.
     
  14. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Hey, what happened to your wall of text? Oh well, yes I agree, 40-50%, that is at least my lowest expectation! Wouldn't see anything else being possible :)
     
  15. hizzaah

    hizzaah Notebook Virtuoso

    Reputations:
    1,672
    Messages:
    2,418
    Likes Received:
    289
    Trophy Points:
    101
    I was going to ask the same thing lol. That was some solid speculation based on plausible math he had there.
     
  16. GTRagnarok

    GTRagnarok Notebook Evangelist

    Reputations:
    556
    Messages:
    542
    Likes Received:
    45
    Trophy Points:
    41
    Lol. I thought it was too long.

    Basically, I was looking at 3DMark Vantage scores to estimate the performance. The 6900M's are about 20% slower than the desktop 6800's. Since the 6800's and 7800's have about the same power consumption, we might expect the same performance difference between the 7900M's and 7800's.

    The 7850 gets about 19800 marks, so the 7970M would get about 16500 or 20% higher than the 6990M. The 7870 gets about 24200, so the 7990M would get about 20200 or 50% higher than the 6990M.

    The numbers for the 7800's are from the very first drivers, so it's the baseline and can only go up. I hope AMD follows this course of action.


    On Nvidia's side, the 580M was based on the 560 Ti. The 580M is about 40% less powerful. The new desktop 680 is a smaller chip than the 560 Ti and consumes a little bit more power, so the 680M could be based off the 680. Since the 680 consumes a bit more power than the 560 Ti, let's say the 680M will be 45% less powerful than the 680 rather than 40%.

    The 680 gets 29860 marks from one review. That means a 680M based on it would get about 20600. That's very close to the 7990M's estimate. Looks like AMD and Nvidia will be neck and neck on the mobile side again.
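The estimation method above amounts to a simple scaling model, which can be sketched in a few lines of Python. This is purely illustrative: the desktop Vantage scores are the ones quoted in the post, but the 6990M baseline score and the exact mobile penalty are assumed values chosen to line up with the post's rounded figures.

```python
# Back-of-envelope sketch of the estimation method described above.
# Assumed numbers: desktop 3DMark Vantage scores as quoted in the post;
# the 6990M baseline (13700) and the 17% mobile penalty are illustrative.

def mobile_estimate(desktop_score, penalty=0.17):
    """Model a mobile part as a penalized version of its desktop donor."""
    return desktop_score * (1 - penalty)

score_7850 = 19800   # desktop HD 7850 (quoted)
score_7870 = 24200   # desktop HD 7870 (quoted)
score_6990m = 13700  # assumed HD 6990M baseline

est_7970m = mobile_estimate(score_7850)
est_7990m = mobile_estimate(score_7870)

print(round(est_7970m))                       # 16434
print(round(est_7970m / score_6990m - 1, 2))  # 0.2  (~20% over the 6990M)
print(round(est_7990m / score_6990m - 1, 2))  # 0.47 (~50% over the 6990M)
```

With those assumptions the model reproduces the post's roughly 20% and 50% gains over the 6990M; different baseline scores or penalties would shift the estimates accordingly.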
     
  17. E.Blar

    E.Blar Notebook Deity

    Reputations:
    193
    Messages:
    743
    Likes Received:
    0
    Trophy Points:
    30
    Nothing's too long for me, lol. I once tried to solve a system of five equations (five variables) by hand, no calculator, using substitution and elimination. It took like 3 hours. But hey, you could get more precise numbers for the 680M if you used the actual TDP ratio from the 560/680 instead of "a bit".
    Also, we should be comparing these to the 6990M and 580M on their earliest beta drivers, because the present ones have had a long time to mature.
    Nice new avatar, by the way :)
     
  18. Aikimox

    Aikimox Weihenstephaner!

    Reputations:
    5,955
    Messages:
    10,196
    Likes Received:
    91
    Trophy Points:
    466
    You are kidding, right? The 560 Ti was a 170W card and was heavily downclocked and crippled to fit the 100W mobile envelope. The 680 is a 195W card and would have to be cut in half. So, no. The 680M will be based on the GTX 660/Ti version, not the 680. It's simply not cost-effective for Nvidia to trim the 680 (the most expensive of their line at the moment).

    On the AMD side, the 7850 is 130W and will be easily "trimmed" for the mobile sector.
     
  19. DGDXGDG

    DGDXGDG Notebook Deity

    Reputations:
    737
    Messages:
    787
    Likes Received:
    6
    Trophy Points:
    31
    No, "now" I can say the 680M will be based on GK104 or 28nm Fermi.
    Why? The 680M won't be based on GK107, and there's no GK105/106 yet, right? Mobile cards are always slower than desktop ones, so we'd have to see a GK105/106 or a 28nm Fermi on desktop first, and the big-die (384-bit? 512-bit?) GK100/110 real flagship is coming.
    Of course, whether a GK105/106 or 28nm Fermi is coming soon, "now" we don't know...
     
  20. Aikimox

    Aikimox Weihenstephaner!

    Reputations:
    5,955
    Messages:
    10,196
    Likes Received:
    91
    Trophy Points:
    466
    Well, if the 680M is still Fermi... doesn't look too good, IMHO. I was thinking the 660 Ti would appear together with the 680M (in June?).
     
  21. GTRagnarok

    GTRagnarok Notebook Evangelist

    Reputations:
    556
    Messages:
    542
    Likes Received:
    45
    Trophy Points:
    41
    The 680 is clocked 22% (or 29% if you consider its turbo boost) higher than the 560 Ti with only 15% higher TDP. If it was clocked at the 560 Ti's 822MHz rather than 1006MHz (or 1058MHz), it would probably be less than 170W. Then, it would just have to be downclocked like the 580M.

    I'm saying the 680M could be based on the 680. If not, they might have some trouble.

    Thanks :)
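The clock-vs-TDP argument above can be checked with first-order arithmetic, assuming power scales roughly linearly with core clock. That assumption is a simplification (voltage also drops at lower clocks, which would reduce power further), so treat this as a rough sketch, not a real power model:

```python
# First-order sketch: scale the GTX 680's 195W board power down to the
# 560 Ti's 822MHz clock, assuming power is roughly linear in clock.
# (Real power also depends on voltage, so this is only illustrative.)

tdp_680 = 195    # W, GTX 680 board power
clk_680 = 1006   # MHz, GTX 680 base clock
clk_560ti = 822  # MHz, GTX 560 Ti core clock

est_w = tdp_680 * clk_560ti / clk_680
print(round(est_w))  # 159 -> comfortably under the 560 Ti's 170W
```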
     
  22. DGDXGDG

    DGDXGDG Notebook Deity

    Reputations:
    737
    Messages:
    787
    Likes Received:
    6
    Trophy Points:
    31
    Right, the GTX 680 is designed for 195W, so the power circuitry won't let power draw exceed 195W...
    See? Even running FurMark at 100% load, the GTX 680 still clocks at about 1000MHz overall (depending on the OC potential of the card you get).

    And we know a laptop GPU's performance per watt is usually better than a desktop GPU's :)
     
  23. Aikimox

    Aikimox Weihenstephaner!

    Reputations:
    5,955
    Messages:
    10,196
    Likes Received:
    91
    Trophy Points:
    466
    Don't forget, while it could technically be possible to trim a desktop 680, there's a very important factor to consider: price. It's way cheaper to slightly downclock the 660 Ti than the 680. Would you be willing to pay $1000+ for such a card?
     
  24. GTRagnarok

    GTRagnarok Notebook Evangelist

    Reputations:
    556
    Messages:
    542
    Likes Received:
    45
    Trophy Points:
    41
    But we don't really know how much it costs Nvidia. They've only priced the 680 as they have because it performs well against AMD's 7970. Nvidia pays for each silicon wafer it gets. The 680 chip is 22% smaller than the 560 Ti, and smaller chips mean more working chips per wafer.

    Traditionally, Nvidia has demanded a premium for their top-performing GPU even if it's only a little better than AMD's. Nvidia could have easily priced the 680 at $550-600, but they're able to price it at $500. Nvidia's top GPU is faster but cheaper than AMD's? That's not very like them at all. That tells me the 680 could have been priced much cheaper, but since AMD had already set the price, there was no need to price it much lower and lose profits. Had AMD priced the 7970 at $400, the 680 would have been equally lower in price.
     
  25. Red Line

    Red Line Notebook Deity

    Reputations:
    1,109
    Messages:
    1,289
    Likes Received:
    141
    Trophy Points:
    81
    It shouldn't be Fermi! Desktop mid-range GPUs like the GTX 670/GTX 660 are out in the upcoming weeks (by the end of April for sure). So the 680M could possibly be based on a lower-performance core.

    GTRagnarok
    Exactly... really interested in the release prices of the AMD/Nvidia flagship mobile GPUs as well. AMD was more of a best price/performance ratio card while Nvidia got the pure performance crown. Looking at the desktop segment, it's the first time in... lots of years... that AMD has managed to build a top card which is priced higher than Nvidia's: $550 for the HD 7970 and $500 for the GTX 680.

    Don't feel like paying $500+ for either the HD 7970M or the GTX 680M...
     
  26. aduy

    aduy Keeping it cool since 93'

    Reputations:
    317
    Messages:
    1,474
    Likes Received:
    0
    Trophy Points:
    55
    Basically, if the GTX 680M is based on GK104, it will be a downclocked GTX 670 Ti, which will have a TDP of less than that of the GTX 560 Ti, if I do declare... and such. Period, end of discussion.
     
  27. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    Agreed. GPU numbers posted by the manufacturers are like Mileage Life Ratings on tires. :)
     
  28. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    A - Downclock $500 GTX 680 by 50% then charge $800 per GTX 680M

    B - Downclock $300 GTX 660 Ti by 15% then charge $800 per GTX 680M

    Why would any business choose option A?

    Not quite.

    [IMG]

    Source. Methodology is explained in detail.

    This was backed up by the results at Hardware Canucks, who had this to say:

    Results (just do the math):
    [IMG]

    My only concern is that AMD has gone back to targeting something pathetic, like an artificial 75W ceiling.
     
  29. SlickDragon

    SlickDragon Notebook Consultant

    Reputations:
    80
    Messages:
    143
    Likes Received:
    88
    Trophy Points:
    41
    Beautiful charts and information, Kevin, so +rep! Edit: gotta spread the love around, will hit you back later ;)

    This bodes well for the 7970M being a full or nearly full-powered 7850, which should outright demolish a 560 Ti and make a mockery of a 580M/675M. The 7990M being a slightly downclocked 7870 does seem to indicate it would have enough juice left over to beat a 570 at stock... which is mind-blowing! Also, as these desktop chips have already been released, it bodes well for an imminent mobile adaptation, despite nearly no leaks indicating that it's in motion.

    Regarding the 680M, I extremely doubt it will be based off the mid-range 680; it being based off the 660 Ti is more feasible. Since those desktop chips won't even be mass-released any time in the foreseeable future, I highly doubt we will be seeing legit high-end Kepler mobile chips any time soon.
     
  30. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    That would be something for sure. If they can cram full 7870 power into a mobile GPU, they are going to totally kill it this generation.
     
  31. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Yes yes, that's exactly my position. If AMD still pushes the 100W barrier, imagine a 15% downclocked 7870 sitting on your lap, or in your briefcase.

    Will they blow this opportunity?
     
  32. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    I don't think they can afford to blow it. They had a minor hit with the 6990M, and they know they can rake in the cash if they can put real desktop power in a single mobile GPU package. AMD knows that mobile is the future, even more so than Nvidia.

    I see the race for mobile power happening just like the race to multi-core. The paradigm is shifting rapidly. It used to be MHz, MHz, MHz, then it became cores, cores, cores! Almost overnight. Now it's low power, power!

    They just have to, for the emerging market.

    How many average Joes do you know who go out and buy a desktop any more? Most people are grabbing a laptop or a tablet. Even people looking for the "Family PC" are buying laptops now too. Don't get me wrong, there is still a market for the desktop, but it's slowing.

    All of the players know it. That's why Intel has shifted to laptop processors that keep up with their desktop counterparts. Development has slowed greatly in desktop speed and power so the mobile parts can catch up. That's what Ivy Bridge was all about. I figure Intel will merge the mobile and desktop lines in the next tick cycle into one unified low-power lineup.

    Same with AMD: Llano and Trinity are the obvious paradigm shift in strategy. They see the writing on the wall and they are responding with the only ace in the hole they have, a solid GPU product. All rolled in with a CPU and low, low power.

    Not to get too far off topic... but AMD needs to be seen as a graphics powerhouse in the market to give credence to the IGP solutions they are forging ahead with. That's why I think they will go for the gold this round, to give them the prominence they need when it comes time to sell APUs.
     
  33. ivansf

    ivansf Notebook Enthusiast

    Reputations:
    15
    Messages:
    31
    Likes Received:
    0
    Trophy Points:
    15
    I was wondering: now that we know about the 670M and 675M cards, I went back to the config I thought about buying a few weeks ago, and the 6990M still seems like a really good option.

    Let's say you can upgrade to a 675M for $350 and to a 6990M for $100... Would the 675M be worth it?
     
  34. b0b1man

    b0b1man Notebook Deity

    Reputations:
    597
    Messages:
    1,092
    Likes Received:
    29
    Trophy Points:
    66
    If you like Optimus, then yes. If not, get the AMD.
     
  35. b0b1man

    b0b1man Notebook Deity

    Reputations:
    597
    Messages:
    1,092
    Likes Received:
    29
    Trophy Points:
    66
    If AMD matches Nvidia in performance (like they did with the 6990M vs the 580M), then I'm gonna go with AMD again. I support them for their lower prices and the good performance they deliver.
     
  36. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I'm still not sure what the actual power consumption of the 7850 is. Anandtech says the GPU is "capped" at 150W due to PowerTune keeping it at bay. Other sites say a max power consumption of 250W. Techpowerup says max power consumption is 101W.

    :confused: :confused:
     
  37. GTRagnarok

    GTRagnarok Notebook Evangelist

    Reputations:
    556
    Messages:
    542
    Likes Received:
    45
    Trophy Points:
    41
    250W is for the entire system.
     
  38. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    It depends on who is measuring and how.

    Many sites use a meter like a Kill-A-Watt: they measure the reference system under load, then put the card in, measure under load again, and subtract the difference.

    There is also a reference spec for running the entire card in the system: memory, cooling and all.

    Finally, there is the power required for the GPU itself.

    The actual processor TDP is probably the best way to measure it in relation to porting the GPU to other platforms.
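The wall-meter subtraction method described above can be sketched in a few lines. The readings and the ~85% PSU efficiency here are hypothetical numbers for illustration, not measurements from any review:

```python
# Sketch of the "Kill-A-Watt subtraction" method described above:
# measure whole-system wall power without the card loaded and again
# with it under load, subtract, then correct for PSU efficiency
# (wall watts overstate the DC watts the card actually draws).

def card_power_estimate(wall_load_w, wall_baseline_w, psu_efficiency=0.85):
    """Rough DC power attributed to the card from two wall readings."""
    return (wall_load_w - wall_baseline_w) * psu_efficiency

# hypothetical wall readings, in watts
print(round(card_power_estimate(250, 130)))  # 102
```

This also shows why the method is imprecise: the result swings with the assumed PSU efficiency and with how repeatable the baseline load is, which is why dedicated rigs like TPU's give cleaner numbers.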
     
  39. Heihachi_1337

    Heihachi_1337 Notebook Deity

    Reputations:
    618
    Messages:
    985
    Likes Received:
    0
    Trophy Points:
    30
  40. Ryan

    Ryan NBR Moderator

    Reputations:
    2,320
    Messages:
    2,512
    Likes Received:
    17
    Trophy Points:
    56
    I'm going to prevent the deluge of these threads and merge all gfx card discussions to the GFX card discussion thread we already have.
     
  41. BenWah

    BenWah Notebook Consultant

    Reputations:
    119
    Messages:
    289
    Likes Received:
    0
    Trophy Points:
    30
    You should allow at least one for Sager here, in my opinion.
     
  42. Ryan

    Ryan NBR Moderator

    Reputations:
    2,320
    Messages:
    2,512
    Likes Received:
    17
    Trophy Points:
    56
    Of course, this is our own thread!


    Sent from my iPhone with Tapatalk
     
  43. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Techpowerup and Hardware Canucks are the two sites which tested with the proper methodology.

    I'll trust what TPU did with their $2,000 piece of equipment (Keithley Integra 2700) over any site which just hooked up a Kill-A-Watt and did not get proper readings.

    Seriously, read how they got their numbers.

    So if they say the 7850 is right around 100W, I'm inclined to trust them.
     
  44. E.Blar

    E.Blar Notebook Deity

    Reputations:
    193
    Messages:
    743
    Likes Received:
    0
    Trophy Points:
    30
    Actually, the 675M will be way cheaper than that. Seeing as the card itself costs around $300, the upgrade from the 670M should be $100 tops.

    Didn't you already do this a few pages back :confused:
     
  45. Ryan

    Ryan NBR Moderator

    Reputations:
    2,320
    Messages:
    2,512
    Likes Received:
    17
    Trophy Points:
    56
    I merged a few threads onto this one. I actually left that note to the previous thread but it got merged together so it seems like a repetition. Sorry about the confusion, GPU discussions for Sager/Clevo will all come merged to this main thread.
     
  46. E.Blar

    E.Blar Notebook Deity

    Reputations:
    193
    Messages:
    743
    Likes Received:
    0
    Trophy Points:
    30
    Aaaaaah, I see. So much thread-merging going on that it's harder to track the threads than it is to track the things my HDD does that it should not do. For starters, it seems to forget where things like my virtual memory are located on the drive... leading to looooooong seek times that randomly freeze my computer for ~3 minutes, often...
     
  47. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Indeed.. I'm really confused right now, random replies coming up here and there :p
     
  48. midzi

    midzi Notebook Guru

    Reputations:
    111
    Messages:
    72
    Likes Received:
    0
    Trophy Points:
    15
    Looks like that guy has access to legit and verified info.
    Overclockers UK Forums - View Single Post - Did NVIDIA Originally Intend to Call GTX 680 as GTX 670 Ti?

    Maybe the GTX 680M will be based on the GTX 670, which should appear around May?
     
  49. E.Blar

    E.Blar Notebook Deity

    Reputations:
    193
    Messages:
    743
    Likes Received:
    0
    Trophy Points:
    30
    Ooo, that does look legit. Too bad there's nothing about mobile Kepler... For some absurd reason, I was hoping they could manage to use Kepler's efficiency to make a mobile dual-GPU on a single card (GTX 685M or 690M).
     
  50. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,041
    Trophy Points:
    331
    That doesn't make much sense. A single-GPU card is easier to cool than a dual-GPU one while offering the same performance.
     