The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    some info about upcoming refresh

    Discussion in 'Sager and Clevo' started by DGDXGDG, Oct 21, 2011.

  1. DGDXGDG

    DGDXGDG Notebook Deity

    Reputations:
    737
    Messages:
    787
    Likes Received:
    6
    Trophy Points:
    31
    From a guy who attended that meeting; I'll keep this updated (translated from Chinese):

    Just attended Clevo's notebook foundry meeting in Kunshan.
    At the end of 2011, Clevo will launch an upgraded version of the X7200: a 17-inch machine with dual GTX 580M.
    In early 2012 there will be a further 3D version of the upgraded X7200, though apparently not glasses-free 3D.
    The most fun part: to meet the demands of Battlefield 3, NVIDIA is releasing a monster notebook GPU codenamed N13E-GTX, the GTX 660M.
    It comes in 2 GB and 4 GB memory versions; the 4 GB one scores around P21000 in 3DMark Vantage, the 2 GB one around P20000.
    The lower-tier GTX 650M performs at around P17000.

    Those scores are above the current GTX 580M's P16200.
    The GT 640M is said to score around P13000, placing it as a mid-range card.
    AMD also has new cards, but their performance looks rotten:
    the HD 7900M is announced at P15500, only somewhat higher than the current HD 6990M's P14200.
    So enthusiast brothers with money to spend should go for the GTX 660M.

    I also asked the Taiwanese engineers: they said the current P150HM / P170HM / P180HM can be upgraded to the latest cards,
    since the interface stays the same. Something to look forward to; the GTX 560M is of course still good too.

    ___________________________________________________________________________________

    [attached images: benchmark screenshots]
    Note: NVIDIA PhysX is on, so overall score x 0.85 ≈ GPU score... but it still beats the 7900M :D

    ___________________________________________________________________________________
    Right now, ASUS's 295.18 OEM driver already contains the GTX 670M's device ID!

    http://download.laptopvideo2go.com/nvidia/295series/29518_win7x64.exe

    NVIDIA_DEV.1210.01 = "NVIDIA GeForce GTX 570M"
    NVIDIA_DEV.1210.02 = "NVIDIA GeForce GTX 570M "
    NVIDIA_DEV.1213.01 = "NVIDIA GeForce GTX 670M"
    NVIDIA_DEV.1213.02 = "NVIDIA GeForce GTX 670M "

    1210 is the GTX 570M's device ID and 1213 is the GTX 670M's. The strange thing is that it shows up right alongside the GTX 570M... and this next part is even stranger:

    %NVIDIA_DEV.1210.01% = Section066, PCI\VEN_10DE&DEV_1210&SUBSYS_14871043
    %NVIDIA_DEV.1210.02% = Section118, PCI\VEN_10DE&DEV_1210&SUBSYS_21041043
    %NVIDIA_DEV.1213.01% = Section066, PCI\VEN_10DE&DEV_1213&SUBSYS_21191043
    %NVIDIA_DEV.1213.02% = Section118, PCI\VEN_10DE&DEV_1213&SUBSYS_21201043

    Keep in mind that ASUS's 295.18 OEM driver covers only the GPUs used in ASUS notebooks, and ASUS's G74 series has never used the 570M!!

    So why does the 570M appear?? Look at the spec chart above again; does it start to make sense?

    Perhaps the GTX 670M is just a rebadged GTX 570M!

    Yes, the current hypothesis is: the whole GTX 600M line is a 28 nm rebadge of the previous GTX 500M generation! So what would the GTX 660M's specs be?

    If the GTX 570M can be rebadged one more time via higher clocks, how do you rebadge the GTX 560M, which is already a rebadge of the GTX 460M?

    So far we also know the GTX 675M's device ID: 1212! Is it a 256-bit 570M, or a rebadged GTX 580M? Nobody knows yet!

    [DEV_1210] NVIDIA GeForce GTX 570M
    [DEV_1211] NVIDIA GeForce GTX 580M
    [DEV_1212] NVIDIA GeForce GTX 675M
    [DEV_1213] NVIDIA GeForce GTX 670M

    The GTX 680M's device ID is still unknown...
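The device-name and hardware-ID strings quoted above come straight out of the driver's .inf file. A minimal sketch of how one might extract that mapping yourself (the regexes here are my own, written against the exact line formats shown in the post, not any official INF parser):

```python
import re

# Sample lines in the format quoted above from NVIDIA driver .inf files.
inf_text = """
NVIDIA_DEV.1210.01 = "NVIDIA GeForce GTX 570M"
NVIDIA_DEV.1213.01 = "NVIDIA GeForce GTX 670M"
%NVIDIA_DEV.1210.01% = Section066, PCI\\VEN_10DE&DEV_1210&SUBSYS_14871043
%NVIDIA_DEV.1213.01% = Section066, PCI\\VEN_10DE&DEV_1213&SUBSYS_21191043
"""

# Map device ID (e.g. "1213") -> marketing name (e.g. "NVIDIA GeForce GTX 670M").
name_re = re.compile(r'^NVIDIA_DEV\.([0-9A-F]{4})\.\d+\s*=\s*"([^"]+?)\s*"', re.M)
names = {dev: name for dev, name in name_re.findall(inf_text)}

# Map device ID -> list of SUBSYS (board/vendor-specific) IDs.
hw_re = re.compile(r'DEV_([0-9A-F]{4})&SUBSYS_([0-9A-F]{8})')
boards = {}
for dev, subsys in hw_re.findall(inf_text):
    boards.setdefault(dev, []).append(subsys)

print(names)   # {'1210': 'NVIDIA GeForce GTX 570M', '1213': 'NVIDIA GeForce GTX 670M'}
print(boards)  # {'1210': ['14871043'], '1213': ['21191043']}
```

Running this over a full OEM INF is exactly how leaks like the 670M's DEV ID turn up: a new DEV number paired with an old name, or vice versa, hints at a rebadge.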
     
  2. DGDXGDG

    DGDXGDG Notebook Deity

    Reputations:
    737
    Messages:
    787
    Likes Received:
    6
    Trophy Points:
    31
    So, by the way, that Vantage P score is with PhysX on; the GPU score may be around 18000... nearly the same as a GTX 580M at its highest OC :eek:
    It seems NVIDIA no longer uses x80 for the flagship... but 4 GB of memory!!!!!! I've decided to skip the AMD 4 GB one I saw in the leaked info :D
    Hope the X7200 refresh 3D version still has full OC ability and doesn't become the next P170HM3, which locked out OC tweaking :mad:
     
  3. bonnie.clyde

    bonnie.clyde Notebook Consultant

    Reputations:
    258
    Messages:
    278
    Likes Received:
    3
    Trophy Points:
    31
    And what about the P270WM?
     
  4. ettornio

    ettornio Notebook Deity

    Reputations:
    331
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
    I believe the X7200 'refresh', as the source calls it, may very well be the P270WM; it just isn't labeled that specifically.
     
  5. bonnie.clyde

    bonnie.clyde Notebook Consultant

    Reputations:
    258
    Messages:
    278
    Likes Received:
    3
    Trophy Points:
    31
    I thought so too, but will it come with the new X79 chipset?
     
  6. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,773
    Trophy Points:
    581
    So MXM 3.0 has been confirmed. I'm glad I was wrong.
     
  7. Aeilerto

    Aeilerto Notebook Guru

    Reputations:
    0
    Messages:
    73
    Likes Received:
    0
    Trophy Points:
    15
    Does this mean that people with 6900-series AMD should be able to upgrade to the 7000-series ones?
     
  8. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,773
    Trophy Points:
    581
    This quote.....
    Seems to point to yes.

    But interface aside, it will also come down to whether a BIOS upgrade is needed, and whether Clevo throws us that bone.
     
  9. ettornio

    ettornio Notebook Deity

    Reputations:
    331
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
    It certainly uses the X79 platform because it was seen on the spec sheet at CompuTex. The only question that remains is which features will be in the final product, and we won't know that until launch day of the P270WM.
     
  10. Saltius

    Saltius Notebook Evangelist

    Reputations:
    626
    Messages:
    321
    Likes Received:
    3
    Trophy Points:
    31
    I'd guess an unnecessary but expensive motherboard swap will be required. We'll be told that only the new-revision board can support the new graphics cards (just because Clevo plays tricks in the BIOS and vBIOS to make the old revision incompatible).

    That is exactly what they did to D901C/D900F users who sought to upgrade to the 8800M GTX / GTX 480M.
     
  11. Mr_Mysterious

    Mr_Mysterious Like...duuuuuude

    Reputations:
    1,552
    Messages:
    2,383
    Likes Received:
    15
    Trophy Points:
    56
    Lol, if the new cards are good enough, I might consider doing that. But if the cost of doing so is close to half of simply buying a new laptop (with the same specs), I'm going to have a quandary on my hands...

    Mr. Mysterious
     
  12. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,773
    Trophy Points:
    581
    I expect the usual prices. Meaning ~$800 for Nvidia and ~$500 for AMD.
     
  13. Mr_Mysterious

    Mr_Mysterious Like...duuuuuude

    Reputations:
    1,552
    Messages:
    2,383
    Likes Received:
    15
    Trophy Points:
    56
    Now that I think about it...$500 for a 3x increase in GPU power is....pretty darn good :D

    Mr. Mysterious
     
  14. Larry@LPC-Digital

    Larry@LPC-Digital Company Representative

    Reputations:
    3,952
    Messages:
    3,580
    Likes Received:
    283
    Trophy Points:
    151
    I do not expect any BIG jumps in GPU performance this next generation...I kinda hope I am wrong though. :)
    -
     
  15. DGDXGDG

    DGDXGDG Notebook Deity

    Reputations:
    737
    Messages:
    787
    Likes Received:
    6
    Trophy Points:
    31
    That's right: the GTX 660M and 7900M will no doubt be only a 28 nm shrink of the same GPU architecture... but a stock 660M = 560 Ti, and overclocked it can catch up to 580M SLI / 6990M CF.
     
  16. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,773
    Trophy Points:
    581
    Isn't one of AMD's rumored 28 nm chips supposed to hold around 1400 shaders at just 120 W? If we see that go mobile, the jump will definitely qualify as a little bit more than BIG.
     
  17. jml00a

    jml00a Notebook Guru

    Reputations:
    23
    Messages:
    66
    Likes Received:
    0
    Trophy Points:
    15
    Possibly slight performance increase but with much better power consumption? Seems to be the trend with the CPU market these days. GPU following suit?
     
  18. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    No, both will be new architectures. The 660M will probably be Kepler, which is based on the original Fermi, not the newer revisions; 28 nm will let NVIDIA do it without the power/heat issues they had before.

    AMD will be using the VLIW4 architecture, which so far has only been seen on the desktop HD 6970. Or, if we are lucky, the Graphics Core Next architecture, which would make it very similar to Fermi. Either way, it will be a new architecture compared to what's in mobile right now.
     
  19. LLStarks

    LLStarks Notebook Evangelist

    Reputations:
    39
    Messages:
    390
    Likes Received:
    2
    Trophy Points:
    31
    is 660M optimus?
     
  20. acroedd

    acroedd Notebook Evangelist

    Reputations:
    17
    Messages:
    443
    Likes Received:
    0
    Trophy Points:
    30
    Very interesting! I hope the cards that come out are as good as these scores. I'll probably be upgrading again next holidays! :) Any news about a P170M refresh?
     
  21. DGDXGDG

    DGDXGDG Notebook Deity

    Reputations:
    737
    Messages:
    787
    Likes Received:
    6
    Trophy Points:
    31
    No, both will continue using the current architecture for sure, and 6xxM = a 28 nm die shrink... sigh...
     
  22. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Either that, or a giant leap in performance? I mean, at some point they should be able to discover or uncover something new about the ratio of power output to power consumption. Can't wait, tbh! Who knows what will be released :D
     
  23. Saltius

    Saltius Notebook Evangelist

    Reputations:
    626
    Messages:
    321
    Likes Received:
    3
    Trophy Points:
    31
    Some 6xxM cards even remain 40 nm (e.g. the 630M is a rebranded 540M).
     
  24. Aeyix

    Aeyix Notebook Evangelist

    Reputations:
    26
    Messages:
    470
    Likes Received:
    9
    Trophy Points:
    31
    I'm confused. From what I read above, it looks like the GTX 650M > GTX 580M. But other comments say the GTX 600 series will not be a performance increase, just a die shrink to 28 nm. I know I read something about Kepler being a roughly 3x boost in performance, or in performance per watt; I'm not sure, it was an image from a presentation posted on Engadget. So is the 600 series just a die shrink, a performance increase, or both? Next-gen GPUs are one of the things I'm holding out for (along with Optimus, a backlit keyboard, and Ivy Bridge), and I assumed NVIDIA's would be Kepler as the 600 series. Can someone set my facts straight, please... I'm so confused.
     
  25. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,044
    Messages:
    5,346
    Likes Received:
    1,034
    Trophy Points:
    331
  26. DGDXGDG

    DGDXGDG Notebook Deity

    Reputations:
    737
    Messages:
    787
    Likes Received:
    6
    Trophy Points:
    31
    Yes, it's a die shrink, and it improves performance per watt, which is what matters on a notebook platform...
    Architectural changes do more on the desktop. Take the 9800 GX2 vs. the GTX 280: they perform almost the same, but the GTX 280 is a "single GPU", so you can build 3-way SLI that beats the 9800 GX2, which can only run two cards in SLI (quad SLI).
     
  27. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,044
    Messages:
    5,346
    Likes Received:
    1,034
    Trophy Points:
    331
    Exactly! (10 char)
     
  28. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Like Blacky and DG said, a die shrink is a huge part of the power game. Mobile cards can't draw unlimited power from a PSU the way desktop cards can, and I believe they're limited by the MXM 3.0b spec too. It's an optimization game: how well can we make this run using X base architecture (Fermi), Y maximum power draw (100 or 120 W or so, I forget the actual limit), while keeping maximum load heat under Z temps (call it 80) in a hot-ish environment like Florida/the Caribbean/Mexico/etc.? They hit their limit with the G92b core of the 285M, then tried GF100 Fermi. Then they tried GF104, which was a smaller core, and it accidentally beat the flagship GF100 model's performance. The GF114 etc. models are just better optimized again for power consumption and heat, which lets them run even higher clocks.

    If they could bring out a die shrink of that base architecture, that's less heat and power per CUDA core. So more cores and higher clocks are a real possibility, and the high-end GTX 6xx cards might even reach the level of a desktop GTX 580. I don't think that's too far-fetched.
     
  29. Ahmed_p800

    Ahmed_p800 Notebook Evangelist

    Reputations:
    205
    Messages:
    350
    Likes Received:
    0
    Trophy Points:
    30
    Wow, imagine the performance of a desktop 580/590 in a 15.6" notebook!!!!

    But the prices... (X_x)
     
  30. fi3ryicy

    fi3ryicy Notebook Enthusiast

    Reputations:
    0
    Messages:
    28
    Likes Received:
    0
    Trophy Points:
    5
  31. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
  32. acroedd

    acroedd Notebook Evangelist

    Reputations:
    17
    Messages:
    443
    Likes Received:
    0
    Trophy Points:
    30
    I don't get it! Do you guys not understand anything at all? When you go from 40 nm to 28 nm, apart from less heat generated and a million other reasons they do it, you can fit more transistors in the given space at less power and all that. So you will get a performance increase, no matter what!!! They aren't doing the shrink for nothing! Even a rebrand usually comes with a ~15% increase (460M to 560M!!!!), and there are a million other examples!
     
  33. fi3ryicy

    fi3ryicy Notebook Enthusiast

    Reputations:
    0
    Messages:
    28
    Likes Received:
    0
    Trophy Points:
    5
    So am I correct to say that moving from 40 nm to 28 nm will theoretically result in both lower power consumption and better overall performance, due to an increase in transistor count over the same area?
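The density argument being debated here can be made concrete with first-order scaling arithmetic (an idealized back-of-the-envelope sketch; real process nodes never scale this cleanly, so treat it as an upper bound):

```python
# Idealized planar scaling: transistor density goes as the inverse square of
# the feature size, so a 40 nm -> 28 nm shrink roughly doubles density.
old_node, new_node = 40.0, 28.0

density_gain = (old_node / new_node) ** 2
print(f"Density gain: {density_gain:.2f}x")  # ~2.04x

# Equivalently, the same transistor count fits in about half the area,
# which is where the "lower power at equal performance" headroom comes from.
area_ratio = (new_node / old_node) ** 2
print(f"Same chip in about {area_ratio:.0%} of the area")  # ~49%
```

The vendor can spend that headroom either way: more shaders and higher clocks in the same power envelope, or the same performance at lower power and heat.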
     
  34. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,773
    Trophy Points:
    581
    The most important question is still on whether Nvidia will come off of their insane pricing structure. I can see them charging a hell of a lot for the GTX 680M.
     
  35. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    The 680M will probably cost as much as the 580M's launch price. If they go much higher, they'll simply lose sales. And as they find easier ways to do the same 28nm process, it becomes more cost effective, and eventually they throw in price drops. =D.

    But their pricing, as you must know, is due to the CUDA and PhysX technologies, rather than raw graphics performance. CUDA saves me HOURS when I get to use it (I can't use it often because I render in 1920 x 1200, and few editing programs output at that size, but with a new 1920 x 1080p native monitor machine, I would use it 24/7) and PhysX is a nice addition to certain games that use it, like Batman, Metro 2033 and Alice: Madness Returns. That's the real reason for the high premium on nVidia cards. You could probably include their pristine driver updates/support as of late as well, but that was a recent development and not part of the original price premium.

    Honestly, if someone told me they would never do video editing and only play FPS/MMO/RPG games, all of which are very unlikely to use PhysX, I would tell em I don't blame them for taking an AMD card. And it's the truth, if you only want graphics power, why not get a cheaper card with rival performance? I personally do a lot of videos and I enjoy SP and games where PhysX is present, so nVidia is the correct choice for me.
     
  36. acroedd

    acroedd Notebook Evangelist

    Reputations:
    17
    Messages:
    443
    Likes Received:
    0
    Trophy Points:
    30
    They won't be increasing the price; it should be the same as now. Ivy Bridge with NVIDIA/AMD 28 nm, Momentus XT 2, a Clevo refresh with a killer new design! I can't wait for the next-gen stuff and I haven't even received my new laptop yet! ;) lol
     
  37. HeardEmSay

    HeardEmSay Notebook Evangelist

    Reputations:
    109
    Messages:
    385
    Likes Received:
    35
    Trophy Points:
    41
    I'm not keeping my hopes up in terms of that big of a performance boost...as long as the power consumption is lower and the new cards run significantly cooler..
     
  38. J.P.@XoticPC

    J.P.@XoticPC Company Representative

    Reputations:
    237
    Messages:
    595
    Likes Received:
    0
    Trophy Points:
    30
    Correct. If you think about it, transfer speed basically comes down to how fast bits of information can travel from one place to the next. Shrinking the die therefore means the information has less distance to travel, giving it a performance boost.

    I took a class on astrophysics and this came up as part of a discussion about light speed and whatnot; really interesting stuff.
     
  39. fi3ryicy

    fi3ryicy Notebook Enthusiast

    Reputations:
    0
    Messages:
    28
    Likes Received:
    0
    Trophy Points:
    5
    yeah.. thanks for clearing things up for me! :)
     
  40. matu73

    matu73 Notebook Guru

    Reputations:
    0
    Messages:
    58
    Likes Received:
    0
    Trophy Points:
    15
  41. jmiller0809

    jmiller0809 Notebook Consultant

    Reputations:
    0
    Messages:
    112
    Likes Received:
    0
    Trophy Points:
    30
    So... I'm confused... Will my new AMD 6990 be basically obsolete in a few months when this refresh occurs?

    Will there really be a 3x performance boost at basically equal cost to today's prices?
     
  42. Anthony@MALIBAL

    Anthony@MALIBAL Company Representative

    Reputations:
    616
    Messages:
    2,771
    Likes Received:
    3
    Trophy Points:
    56
    No, you probably won't see a 3x increase in just a few months. While it may be technically outdated in 3 months, you probably won't be that far behind. I have a 485m in my P150HM which is ~8 months old. It's been superseded by the 580m in that time, but it only lags behind about 10%. For a 3x difference, you'd have to have a card that is several years old, or is being compared to a whole different class (mid-range card compared to high end).
     
  43. jmiller0809

    jmiller0809 Notebook Consultant

    Reputations:
    0
    Messages:
    112
    Likes Received:
    0
    Trophy Points:
    30
    Thanks... wheww.. I can live w/ that.
     
  44. matu73

    matu73 Notebook Guru

    Reputations:
    0
    Messages:
    58
    Likes Received:
    0
    Trophy Points:
    15
    I think if you really want to see a big increase in performance, and you already have a high-end card and processor from 2011, you'll have to wait until 2013.
     
  45. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,773
    Trophy Points:
    581
    I expect the 7970M and 680M to be hmm.... a 50% leap over current high-end offerings.

    6990M is 1120sp, 715/900? I'll be surprised if the 7970M doesn't pack 1400sp, with clocks around 850/1000.
     
  46. acroedd

    acroedd Notebook Evangelist

    Reputations:
    17
    Messages:
    443
    Likes Received:
    0
    Trophy Points:
    30
    I'd put my money down on a 30% increase minimum; 28 nm sounds exciting! Having worked with graphene nanoparticles for my published Li-ion battery paper, I've seen how going smaller and smaller packs on so many improvements! :)
     
  47. Aeyix

    Aeyix Notebook Evangelist

    Reputations:
    26
    Messages:
    470
    Likes Received:
    9
    Trophy Points:
    31
    All I have to say is with Ivy Bridge's slight boost and smaller process, Optimus, and Backlit keyboard, if it is true that the GTX 650M > GTX 580M in performance, then I'm set and definitely wanting that. May do 660M just for the slightly better future proofing. Can't wait for actual news to hit though.
     
  48. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    If this is the case, what do you expect from the 7990M?
     
  49. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,773
    Trophy Points:
    581
    It's hard to say, with the desktop chips being kept so under wraps. Once they release details on the high-end desktop chips, I'll be able to point out exactly which chips can be what on the mobile side.

    There's no guarantee we'll see a 7990M, depending on how the performance tiers of the desktop chips are spaced out. There may not be a 6850 (6970M) -> 6870 (6990M) pairing available.
     
  50. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Assuming the performance tiers stay the same as now, maybe slightly more balanced and probably a lot stronger of course, then if the 7990M were to be released it should be a monster, presuming AMD continues to develop as they do.
     