The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    HURRAY: Nvidia 600 series not just Fermi!! (Kepler)

    Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, Mar 2, 2012.

  1. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    So what it looks like is that the GT 640M comes only with DDR3, the GT 650M comes with both DDR3 and GDDR5, and the GTX 660M comes with GDDR5 clocked higher than the memory in the GT 650M.

    The GT 640M, GT 650M and GTX 660M come with the same number of shaders, but most probably turbo boost up to different clocks :)

    Looking at the GT 640M 3DMark11 score of P1898 and the GT 650M 3DMark11 score of 2185, there must be some real differences there. I wonder whether the GT 650M score was with GDDR5 or DDR3 though, since I cannot find any detailed info about that.

    Btw, some thanks for the screenshots and benchmarks would be nice :)
     
  2. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    Can you kindly provide the source? I want to take a look at what the Chinese users are saying in their own forum.

    It is actually extremely impressive that the GT 640M can be squeezed into such a slim form factor in the first place. This wasn't possible before, with the exception of the Samsung Series 5 Ultra with the HD 7550M, but that was considerably weaker.

    Thank you very much!

    @Cloudfire: Stay happy now! :)
     
  3. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Source is coming. Just uploading a few more pictures and editing.

    Once again, since it was so long back in the thread.
    Here is the Samsung Q470 with a Kepler GT 650M tested (whether it has DDR3 or GDDR5 is unknown):

    [​IMG]

    Here is a picture of the Samsung Q470 in the owner's possession as proof:
    [​IMG]

    Source: http://benyouhui.it168.com/thread-1958878-1-1.html

    Took me a while to upload these and find them, so a simple thank you would be nice. Now, enough pictures from me. Discuss :)

    I have also updated the first post in this thread with all the pictures so far so if you want a summary, go check it out :)
     
  4. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    From Chinese source: Q470 使用3天 简单说说感受_三星笔记本论坛 ("Q470 after 3 days of use, some brief impressions" – Samsung notebook forum), translated:
    "I originally wasn't considering a Samsung notebook: I thought they were expensive and poor value, saw a lot of negative reviews online, and it seemed few people used them. Recently I wanted to replace my laptop and had my eye on the IBM S420. A couple of days ago, while browsing laptops online, the Q470 caught my eye immediately. Beautiful. I went straight to the official site to check the specs: not bad, the i3 is enough for me. Then I went to the Samsung forums to look for reviews, but there didn't seem to be any for the Q470, so it looked like I'd be the guinea pig. The Q460 reviews were decent though, and the Q series counts as mid-to-high-end in Samsung's lineup. I've used Samsung products before and they seemed fine. On the weekend I went to the computer mall to see the real machine: the build quality is good and worthy of the price. I bought it on the spot. I won't mention the price.

    Typing on the keyboard is good, with good feedback.

    On cooling: for a notebook, cooling is very important, and it carries a lot of weight when judging how good a laptop is. The Q470's cooling is good. In daily use (browsing the web, running Xunlei, no heavy software, or just using the i3's integrated graphics), the CPU sits at around 45-50C and the motherboard is about the same. The left side of the machine houses the CPU and motherboard, so the left side gets noticeably warm, but it's acceptable: 7 out of 10. Running heavy software like 3DMark for performance testing, the CPU rises to 60-65C and feels distinctly hot to the touch, but other machines are about the same; that's how notebooks are. (Measured with Ludashi. ;)

    To summarize, the machine's strengths: a matte screen (Samsung's own panel) with a screen-off hotkey; a backlit keyboard, comfortable for typing at night; a built-in stand on the back (by the way, opening the stand noticeably improves cooling, lowering temperatures by about 5C); JBL speakers that are genuinely good, performing well for movies and music, much better than ordinary notebook speakers; and some of Samsung's bundled software, like smart download-and-shutdown, is also nice, and I'm still getting used to it. Weaknesses: the machine feels obviously plasticky, though the build is refined and the mold is good; and the A panel (lid) flexes a bit when pressed down."

    Summary of the relevant parts:
    The Q470 is expensive and not many people in China own it.
    The Q470 has great cooling; the processor idles at 45-50C.
    Under performance testing with 3DMark, the processor temperature increases to 60-65C and it feels hot, but (the writer thinks) that is common for many notebooks.
    There is a stand, as mentioned, and opening it allows up to a 5C reduction in temperature.

    A simple thank you is always nice! Rep Cloudfire too! :)
     
  5. ivan_cro

    ivan_cro Notebook Consultant

    Reputations:
    23
    Messages:
    121
    Likes Received:
    0
    Trophy Points:
    30
    that's...... a bit disappointing. Perhaps it's the ultra-mobile CPU's fault, but I'll have to take some time and read about this new architecture. 11000 in 3DMark06 is about as much as my 4-year-old desktop 9600 GT with its 64 SPs can pull, paired with a Core 2 Duo CPU.
     
  6. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    2185 in 3DMark11 is NOT disappointing though. Try looking through the GPUs on this list and compare scores, and you`ll see that for an entry-level Kepler GPU, that is bloody impressive.

    I'll just link you to the GT 550M and you can work your way up from there :rolleyes:
    NVIDIA GeForce GT 550M - Notebookcheck.net Tech
     
  7. gamba66

    gamba66 Notebook Evangelist

    Reputations:
    63
    Messages:
    491
    Likes Received:
    3
    Trophy Points:
    31
    Source: http://www.pcper.com/reviews/Mobile/Nvidia-GeForce-GT-640M-Review-Kepler-Arrives-Mobile

    That's pretty hot, even more so considering it's combined with a 17W ULV i5 processor.. Let's hope full-powered IVB mobile CPUs paired with Kepler will be given sufficient cooling.. or else say hi to throttling
     
  8. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    Wait a minute... didn't I see this post somewhere?

    The Q470 uses an i3-2350M with the GT 650M, and if you look a page back I translated the opening post. The author says 60-65C for the processor after 3DMark.
     
  9. gamba66

    gamba66 Notebook Evangelist

    Reputations:
    63
    Messages:
    491
    Likes Received:
    3
    Trophy Points:
    31
    Well, like he said, it's 60 to 65 degrees for the ULV 17W processor.. I only mean the Kepler GPU though.. If only the Kepler got too hot, that wouldn't be such a big problem, but in notebooks like the M14x R2, where the Kepler GPU shares a heatsink with the full-powered Ivy Bridge quad processor, this could be a problem for throttling..

    Also, checking the temperatures right after a GPU benchmark isn't very meaningful.. You would have to keep the GPU under load for a while to get real temperature readings.. I mean you wouldn't just play a game for a few minutes and then stop.. BF3, for example, stresses the CPU and GPU extremely; now imagine playing for over an hour while your notebook turns into an oven.

    Sorry for posting this so often, I just want to get some opinions on the heat :)
     
  10. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    how the hell is this good news? a GTX 660M with a 128-bit bus and 1GB VRAM WON'T be able to game at 1080p, clearly... anyway, my GTX 460M was already scoring 7.1 in the Windows gaming index; it would be nice to see 3DMark11, which I guess will be a little above the 560M..
     
  11. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    We'll see how memory efficient Kepler is after some more reviews come out. You guys are all assuming that it's a 28nm Fermi, which it isn't.
     
  12. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Who exactly is assuming it's a Fermi? It is clearly, and beyond any debate, a Kepler chip.
     
  13. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    The temperatures are not encouraging, to say the least, and this is something that should be addressed. I also agree that laptop manufacturers should stop being cheap and just install proper cooling materials (without raising prices, because that would be simple artificial gouging given the costs are minimal to them).
     
  14. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Thought: Isn`t the GDDR5 memory of the GTX 660M clocked extremely high compared to previous Fermi GPUs? It is even clocked higher than the top desktop models of the last series :confused:

    Here is the GTX 660M:


    Here are the previous top models from mobile 500 series:


    Here are the previous top models from desktop 500 series:
     
  15. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    yupp, that's a good point that it is fake (a 2.5GHz memory clock on a mobile chip :eek: get out of here!)
     
  16. plancy

    plancy Notebook Evangelist

    Reputations:
    56
    Messages:
    550
    Likes Received:
    0
    Trophy Points:
    30
    More like 1250MHz; see how in the pink charts the clocks are multiplied by 4 to get the data rates.

    It might not be fake; they may just be reading the clocks wrong.
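    The clock confusion here can be illustrated with a quick sketch. Note the 1250MHz base figure is this thread's guess for the GTX 660M, not a confirmed spec:

```python
# GDDR5 clock naming is confusing: the same memory can be quoted by its
# command clock, its write clock (2x), or its effective data rate (4x, in MT/s).
# A chart or tool may report any of these three numbers.

def gddr5_rates(base_mhz):
    """Return (command clock MHz, write clock MHz, data rate MT/s)."""
    return base_mhz, base_mhz * 2, base_mhz * 4

base, write, data = gddr5_rates(1250)  # hypothetical GTX 660M base clock
print(base, write, data)  # 1250 2500 5000
```

    The "2.5GHz" reading quoted above could simply be the 2x write clock of a 1250MHz part rather than an implausibly fast base clock.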
     
  17. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Could very highly clocked memory be a way to get past the small memory bus? I don`t know exactly how memory speed and bus width work together, to be honest, but if the memory can push the bits through the bus faster, then it won`t be a bottleneck, hence better performance?
     
  18. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    anyway, thanks for the pics cloudfire, I guess we will see the real deal soon enough :)
     
  19. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    But if it is 1250MHz it is still much faster than even desktop GPUs I listed right?

    No problem, nice to finally see the GTX 660M. Now we need some benchmarks and games tested :)
     
  20. plancy

    plancy Notebook Evangelist

    Reputations:
    56
    Messages:
    550
    Likes Received:
    0
    Trophy Points:
    30

    Yeah faster than stock desktop gpus, I'm curious about the shader clocks.
     
  21. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Any tech savvy people here? Is this total BS or is it any truth to this?
     
  22. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    Yes, that's what they did with the 9800 GTX line. It had a smaller bus than the 8800 GTX, but the clock rate made it a faster card by a small margin.
     
  23. SkittlesXD

    SkittlesXD Notebook Consultant

    Reputations:
    37
    Messages:
    220
    Likes Received:
    0
    Trophy Points:
    30
    This isn't good; it looks like the 660M = 640M + 1GB GDDR5. The 680M may not be as powerful as we are all hoping...
     
  24. R3d

    R3d Notebook Virtuoso

    Reputations:
    1,515
    Messages:
    2,382
    Likes Received:
    60
    Trophy Points:
    66
    Bandwidth is pretty much bus width multiplied by frequency. So GDDR3 at 1000MHz with a 256-bit interface will have the same bandwidth (that is, information per unit time, e.g. GB/s) as a 500MHz chip with a 512-bit interface.

    So yeah.
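    That rule of thumb can be sketched numerically. The 128-bit/1250MHz GTX 660M figures below are the thread's speculation, not confirmed specs:

```python
# Peak memory bandwidth ~ (bus width in bytes) x (clock) x (transfers per clock).
# DDR/GDDR3 is double-pumped (2 transfers per clock); GDDR5 is quad-pumped (4).

def bandwidth_gbs(bus_bits, clock_mhz, pumps):
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * clock_mhz * 1e6 * pumps / 1e9

# The example above: 256-bit @ 1000 MHz matches 512-bit @ 500 MHz (both DDR)
assert bandwidth_gbs(256, 1000, 2) == bandwidth_gbs(512, 500, 2)  # 64 GB/s each

# The rumored GTX 660M: 128-bit GDDR5 at a 1250 MHz base clock
print(bandwidth_gbs(128, 1250, 4))  # 80.0 GB/s
```

    By the same formula, a GTX 580M at 256-bit with the 750MHz GDDR5 clock mentioned in this thread would come out to 96 GB/s, which is why the narrow bus still matters despite the fast memory.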
     
  25. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Great. So the future of GTX 660M isn`t looking too shabby after all then? The memory is clocked 750MHz faster than the GTX 580M :)
     
  26. Prema

    Prema Your Freedom, Your Choice

    Reputations:
    9,368
    Messages:
    6,297
    Likes Received:
    16,485
    Trophy Points:
    681
    To make it even more confusing... Terrans Force lists the card as GT 660M, not GTX... and it was listed for mid-range models and as the smallest entry card for high-end models...
     
  27. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Saltius wrote this earlier in the thread. I don`t think it can be an official GPU/name though, since it is nowhere to be found on this list, plus we have never had a GT x6x before (at least not in the 400/500 series, I think)
    http://i42.tinypic.com/9r21w7.jpg

     
  28. 5482741

    5482741 5482741

    Reputations:
    712
    Messages:
    1,530
    Likes Received:
    17
    Trophy Points:
    56
    That's not too absurd:
    [​IMG]

    With some improvements, nVidia should be able to get a mobile GPU to have a 2500MHz memory clock.
     
  29. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    That is a nonsensical connection you've made there. The 680M won't even be the same core as the 660M.
     
  30. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Niiice. So the top memory on the GTX 660M is 500MHz (1000MHz data rate) faster than the 580M. Sweet :D
    How come the pics I posted earlier said 750MHz (3GHz data rate) and this says 2000MHz (4000MHz data rate)?
    How do you think this will translate in terms of performance?
     
  31. ivan_cro

    ivan_cro Notebook Consultant

    Reputations:
    23
    Messages:
    121
    Likes Received:
    0
    Trophy Points:
    30
    Yes, I agree that it's "not bad", but performance-wise the 525M/540M/550M were honestly junk, as they couldn't push recent games at 1920x1080 or even 1600x900 with decent details and at least some AA enabled, though they were relatively cheap. All other, stronger cards were very, very expensive, and for that reason alone I consider portable gaming a privilege of the rich.

    For the price of a mid-range gaming notebook with a GT 550M one could get a kick-ass desktop that is easily upgradable in the future.

    With this new generation of mobile GPUs there is hope, at least, that portable gaming of current games will become available to the masses.
     
  32. 5482741

    5482741 5482741

    Reputations:
    712
    Messages:
    1,530
    Likes Received:
    17
    Trophy Points:
    56
    Well, mine is overclocked, but it shows that 2500MHz is not an unrealistic memory clock for a next-gen mobile GPU.

    I'm not sure how it will affect performance because it's on a 128-bit interface.
     
  33. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    So the stock memory frequency of the GTX 580M is 750MHz?
    How much better did the 580M perform with this memory overclock? Can you give us an estimate? FPS in games, a percentage estimate, etc. :)

    Another Samsung Q470 with a GT 650M (this time with DDR3) spotted in the "wild", along with a new screenshot of something I don`t recognize. Seems like a benchmark or index of some sort. What is it? Pic number 2.
    It scored 2185 in 3DMark11 BTW. So now we know the GDDR5 GT 650M will perform better than that.

    [​IMG]
    [​IMG]
    [​IMG]

    Source
     
  34. 5482741

    5482741 5482741

    Reputations:
    712
    Messages:
    1,530
    Likes Received:
    17
    Trophy Points:
    56
    Just ran the Metro 2033 benchmark a few times.

    At stock 620/1500/1240:
    [​IMG]

    Overclocked to 850/2000/1700:
    [​IMG]

    Around a 32% improvement.
     
  35. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I don't see any reason to release a GDDR5 GT 650M when that's exactly what the 660M is. It's pretty weak of Nvidia to call the slightly higher-clocked 640M the 650M, too.
     
  36. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    @548...: Oh nice. Thanks for doing that. +rep

    But quick question: that wasn`t just a memory overclock, was it? Shaders and everything too?

    Yeah, agreed. It should be saved for a GTX card. But man, am I eager to see how good the 660M is going to be with that insanely fast memory clock. Bet that 128-bit bus won`t slow the performance down by much when it has such fast memory :D
     
  37. plancy

    plancy Notebook Evangelist

    Reputations:
    56
    Messages:
    550
    Likes Received:
    0
    Trophy Points:
    30
    I'm curious to see if the 680M will have clocks like that :O..... double the performance would be intense! Even SLI level would be nice (min 150% of the performance)
     
  38. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    The GTX 680M, if really based on Kepler, will use GK106 or GK104 (larger chips with more cores and thus more performance) and not GK107.

    The reason: if the GT 640M through GTX 660M use GK107, and GK107 even on the desktop has a maximum of 384 cores, then to get performance at next-gen flagship level it needs GDDR5 with at least 640 shaders.

    And I doubt the GTX 670M/GTX 675M will be mere rebrands now. They would get crushed by the GTX 660M in certain games/benches.
     
  39. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    dude, I don't know how long your GPU will live at these clocks, but taking a stock 750MHz 580M memory clock to 2GHz cannot be healthy...

    gtx 580m overclocking - NVIDIA Forums
     
  40. 5482741

    5482741 5482741

    Reputations:
    712
    Messages:
    1,530
    Likes Received:
    17
    Trophy Points:
    56
    You're right, that was everything.

    At 620/2000/1240:
    [​IMG]

    An increase of around 7%.
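    For context, the clock deltas behind these two results can be sketched. The figures are taken from the posts above; the scaling interpretation is just a rough inference:

```python
# GTX 580M overclock figures from this thread, as core/memory/shader MHz:
stock    = (620, 1500, 1240)
full_oc  = (850, 2000, 1700)   # gave ~32% in the Metro 2033 benchmark
mem_only = (620, 2000, 1240)   # gave ~7%

core_gain = full_oc[0] / stock[0] - 1   # core clock increase
mem_gain  = full_oc[1] / stock[1] - 1   # memory clock increase
print(f"core +{core_gain:.0%}, memory +{mem_gain:.0%}")
# core +37%, memory +33%
```

    The ~32% full-overclock gain tracks the ~37% core bump fairly closely, while the memory-only run yielding only ~7% suggests this benchmark is mostly core/shader-bound at these settings.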
     
  41. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    That's more like it. Thanks :)

    So with higher-clocked memory alone we see around a 10% performance increase. That was just one game though; we could see bigger differences with other games. It's probably impossible to extrapolate that to Kepler, since the shaders and everything are totally different. But good to see anyway.

    Nice to see more people thinking that. I just saw some guys on a Chinese forum discussing the GTX 660M, and they said the 660M will almost reach 580M performance. I don`t know if they actually tested the GPU, but interesting nevertheless :)
     
  42. maxheap

    maxheap caparison horus :)

    Reputations:
    1,244
    Messages:
    3,294
    Likes Received:
    191
    Trophy Points:
    131
    oh, by 2GHz you mean a 1GHz effective clock, ok, got it...

    for some reason I interpreted your first screenshot as GPU-Z :D
     
  43. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    Anyway, if there is anything on the Chinese forum you don't quite understand, post the Chinese text here and I will translate it into English for you guys.
     
  44. Arondel

    Arondel Notebook Evangelist

    Reputations:
    291
    Messages:
    487
    Likes Received:
    173
    Trophy Points:
    56
    It seems they are pretty fond of it.

    GeForce GT 520M - Specifications - GeForce and GeForce GT 520MX - Specifications - GeForce (although in this case the # is the same and I haven't seen any notebooks with the MX).
    GeForce GT 525M - Specifications - GeForce, GeForce GT 540M - Specifications - GeForce and GeForce GT 550M - Specifications - GeForce.

    I guess business is business...
     
  45. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    They have limited designs for the 28nm low end, which is the one and only GK107. If Nvidia did not use GK107 with DDR3, performance would not match the intended level. That is why the GT 640M is given DDR3 only. Even then, I think it performs slightly better than what Nvidia intended the GT 640M to be. (Hint: Nvidia blog post results vs AnandTech results)
     
  46. 5482741

    5482741 5482741

    Reputations:
    712
    Messages:
    1,530
    Likes Received:
    17
    Trophy Points:
    56
    I wish. That'd be a 166% overclock. :eek:

    I don't know. Originally, before all of the delays, the 660M was supposed to be released around a month after the 670M and the 675M.

    [​IMG]

    This may have been a sign that it could have a different architecture.

    Have you seen this review?
    http://www.pcpop.com/doc/0/770/770391_all.shtml#p1

    It seems to show that the 675M is exactly the same as the 580M.
     
  47. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    Have you noticed the GTX 675M/670M has the codename N13E-GS1? Why GS1, when normally one codename down after GTX is GT?

    The Cinebench result is much weaker than what I have from the V3's GT 640M, BTW. No conclusive findings... yet.
     
  48. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    BTW, has anyone concluded whether the 670M/675M is 40nm or 28nm, or is it still undecided?

    Question #2: What software is this? It is the GT 650M GPU, btw. Never mind, I found out: it is called Ludashi. Probably something useless :)
    [​IMG]
     
  49. yknyong1

    yknyong1 Radiance with Radeon

    Reputations:
    1,191
    Messages:
    2,095
    Likes Received:
    8
    Trophy Points:
    56
    Not conclusive... But if the Terrans Force link is accurate... then it's definitely Fermi.

    That one is their own benchmark, made by Chinese programmers.

    It says the machine is better than 79% of computers in their country for playing games and HD videos. How true that is, I don't know.
     
  50. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    Ah, I see. Thanks :)

    I know the 670M/675M are Fermi, but is it Fermi at 40nm or Fermi at 28nm?
     