The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    New Sager with 69xx?

    Discussion in 'Sager and Clevo' started by The Forerunner, Dec 24, 2010.

  1. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
    Specifications

    It says next-generation ATI. Meaning, is this the 6900 series?
     
  2. MasterChief07

    MasterChief07 Notebook Consultant

    Reputations:
    67
    Messages:
    128
    Likes Received:
    0
    Trophy Points:
    30
    Seems highly plausible. I have recently seen various articles regarding Mobile Radeon 6000 series yields/specs/prices, so it would make sense.
     
  3. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    The Mobility 6900 will be, in all likelihood, announced at CES. There is also a rumour that the 580M/485M is around the corner, possibly waiting for CES as well.

    EUROCOM Cheetah 3.0
    17.3-inch Full HD; 1920-by-1080 pixels; Glossy or Matte; optional 3D

    Still 16:9 ... I call that bull**! Too bad the Dell Precisions are so expensive.
     
  4. LaptopNut

    LaptopNut Notebook Virtuoso

    Reputations:
    1,610
    Messages:
    3,745
    Likes Received:
    92
    Trophy Points:
    116
    I look forward to a new GPU that is reasonably priced instead of the overpriced GTX 470M. I am looking to get a new GPU, so I will keep a watch on this news.

    Edit:

    Just noticed the link is Eurocom, which is about as reliable a source as a brick wall.
     
  5. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
  6. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    We've already discussed those benchmarks in another thread on this forum and they are not very reliable. The problem is the CPU score, which seems to be tweaked. A mobile 2720QM scoring as high as a desktop i7-975 raises a lot of questions.

    Even so, if that is a single-card score then the Mobility 6970 should be about 35% faster than the 5870. Our initial estimates were around 20-25%.
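
    A quick way to see how much that CPU number can move the total is to invert 3DMark06's published weighting (as far as I recall it: overall = 5 / (1.7/GS + 0.3/CPU), where GS is the graphics score). The sketch below just plugs in the leaked ~20k total; treat the exact formula as a from-memory assumption and the numbers as placeholders.

        # Back out the graphics score (GS) from a 3DMark06 total, assuming the
        # commonly cited weighting: overall = 5 / (1.7/GS + 0.3/CPU).
        def graphics_score(total, cpu):
            return 1.7 / (5.0 / total - 0.3 / cpu)

        def overall(gs, cpu):
            return 5.0 / (1.7 / gs + 0.3 / cpu)

        gs = graphics_score(20000, 5600)   # implied GS of the leaked run: ~8650
        print(round(gs))
        print(round(overall(gs, 4000)))    # same GS with a ~4000 CPU score: ~18400

    In other words, swap the 5600 CPU figure for a 940XM-class ~4000 and the same graphics result lands in the low 18k range, which is why the 20k headline is hard to read on its own.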
     
  7. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

    That's definitely not a single 6970M, it's Crossfire. The only skewed result in those benchmarks is the one showing the Intel HD graphics, which is likely just a bug that Optimus-based systems experience (the M11x has that issue too). Otherwise the scores are consistent with what is expected of Sandy Bridge and 6970M Crossfire.


    Edit: The 2720QM turbo boosts to 3.3 GHz, which puts it on par with current 920XM CPUs. The 3DMark06 results, when taking into account greater CPU efficiency along with faster GPUs, are pretty close to the mark. With a 920XM-equipped Crossfire system, the CPU score is not far behind at those clocks. 3DMark06 doesn't utilize all 4 cores efficiently like Vantage does, and the CPU frequently turbo boosts well above 2.2 GHz. Take a look at what my overclocked 940XM was achieving: http://forum.notebookreview.com/att...official-m17x-benchmark-thread-part-3-23k.jpg

    Granted, the TDP is ramped up on the 940XM, but considering that SB will have Turbo Boost that can exceed TDP as long as cooling permits, 20k is easy to get.
     
  8. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    Not really, SB is just 10% faster clock for clock than the previous generation, so all the extra performance you can get is from the smaller manufacturing process, which allows for higher clocks. This means that, just as you said, the 2720QM will perform similarly to the 940XM. However, the 940XM has a score of around 4000 points in 3DMark06, far less than the 5600 CPU score in that benchmark. The 2720QM will score 5600 only if all 4 cores are running at 3.3 GHz (maximum Turbo Boost). I am not entirely sure if this is possible with Turbo Boost 2.0, but if it is, it is only for very brief periods of time.

    Now, taking into consideration that 3DMark06 is very sensitive to CPU performance, the 20k score is not very reliable for measuring GPU performance, which is what I've been trying to point out here.
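
    For what it's worth, a crude scaling of the CPU score points the same way: take the ~4000-point 940XM figure, scale by the clock ratio and the ~10% clock-for-clock gain, and 5600 only falls out if all four cores really do hold the 3.3 GHz ceiling. The 940XM all-core clock below is an assumption, not a measured value.

        # Scale a 940XM 3DMark06 CPU score up to a hypothetical 2720QM.
        score_940xm = 4000    # approximate 3DMark06 CPU score quoted above
        clock_940xm = 2.5     # GHz, assumed all-core clock under load (not measured)
        ipc_gain    = 1.10    # Sandy Bridge clock-for-clock improvement quoted above

        for clock_2720qm in (2.2, 3.0, 3.3):   # base, mid turbo, max turbo (GHz)
            est = score_940xm * (clock_2720qm / clock_940xm) * ipc_gain
            print(clock_2720qm, round(est))    # ~3900, ~5300, ~5800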
     
  9. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

    It is CPU dependent, but a faster GPU also pushes up the final CPU score. This is easily proven by using the same CPU in an M17x and disabling one of the GPUs: the final CPU score also drops. Thus a 20k score with a 45W TDP (with TB 2.0) 2720QM + 6970M Crossfire is entirely plausible.


    Edit: Look at this score: http://img41.imageshack.us/img41/4146/3dmark0622335.jpg When he ran that benchmark, ThrottleStop had not been released yet, so all he did was use the BIOS BCLK overclock + eVGA (though this was pointless due to TDP limitations). Now pay attention to his CPU score: 4546 points with just a 5% BCLK increase (which, again, with TDP limitations is largely useless in 3DMark06). I'm confident those posted scores are entirely legit.
     
  10. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    I didn't say it's not plausible. Given that with some OC a 5870M Crossfire setup can be pushed way above 20k, a 6970M setup should easily reach and go beyond 20k.

    What I am saying is that it is not reliable for comparing performance. There are too many unknown elements to draw a solid conclusion.

    Are you sure this is a Crossfire setup? Any link?
     
  11. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

    Well, given my experience with Crossfire setups, 23k-24k is the absolute max a 5870M Crossfire system can reach. Thus, unless AMD drastically increased single-GPU performance, 20k for a single GPU is not realistic.
     
  12. AndrewKW

    AndrewKW Notebook Consultant

    Reputations:
    126
    Messages:
    231
    Likes Received:
    0
    Trophy Points:
    30
    According to Eurocom, these things are new:

    - four RAM slots, which means 8GB of cheap memory and a 32GB maximum!
    - 2 x USB 3.0 ports, which is one more (five USB ports total) compared to the current gen
    - AMD 69xx will be awesome, considering the leaked Alienware benchmarks.

    This laptop used to have a 120W AC adapter, which is not enough for a 940XM
    and GTX 470M + OCing + 5 USB ports + other ports and peripherals, etc., not to mention the 480M.

    I expect the new W870CU to come with a 150W+ AC adapter :)
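
    A back-of-envelope adapter budget shows why the old brick looks tight. The two TDPs are the official figures; the rest-of-system number is a guess, and real components rarely all peak at the same instant, so read it as a worst case.

        # Worst-case adapter budget; "rest" (screen, board, drives, USB loads) is a guess.
        cpu_tdp = 55    # i7-940XM TDP in watts
        gpu_tdp = 75    # GTX 470M TDP in watts
        rest    = 35    # assumed screen/chipset/drives/USB draw in watts
        adapter = 120   # current adapter rating in watts

        peak = cpu_tdp + gpu_tdp + rest
        print(peak, "W worst case vs a", adapter, "W adapter")   # 165 W vs 120 W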
     
  13. widezu69

    widezu69 Goodbye Alienware

    Reputations:
    3,079
    Messages:
    4,207
    Likes Received:
    168
    Trophy Points:
    131
    Well actually, the single 69xx GPU in the M17x got just over 20k with Sandy Bridge, so it is possible.
     
  14. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    Possible only with a CPU that scores 5600 points. This would mean that without a skewed CPU score, the real score would be around 17000-18000 points. That's a huge leap from the Mobility 5870, which is not supported by the architecture change brought by the desktop 6850 vs. the desktop 5770. The only possible way this is a single GPU is if the TDP for the GPU is above 75W (90W maybe) or it has been seriously overclocked, something we don't know at this time.

    On the other hand, if that score is for a Crossfire system, I would say it is definitely too low given that CPU score, unless we are talking about a 6850M Crossfire solution, in which case it is fully possible.

    As you can see, there are so many variations that can lead to that score that it is simply impossible to tell anything about the performance of the card/cards tested there.
     
  15. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Lol, it's Eurocom. How did their leaked prices for the 460M and 470M turn out?

    That's all there is to say, for right now.
     
  16. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

    Single 6900... yeah, maybe if we were talking about a desktop.
     
  17. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    edit: reposting in a minute.
     
  18. theriko

    theriko Ronin

    Reputations:
    1,303
    Messages:
    2,923
    Likes Received:
    4
    Trophy Points:
    56
    That's a very long minute...
     
  19. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I decided not to even get involved with a debate over 3DMark06 scores.

    All that needs to be said is that the 6850 is consistently a minimum of 25% faster than the 5770 @ 1920x1200, and that advantage will carry over to the 5870M vs. 6970M.
     
  20. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    This is all I have to say:

    ASUS Radeon HD 6850 Direct CU 1 GB Review - Page 29/32 | techPowerUp

    PowerColor HD 6850 PCS+ 1 GB Review - Page 29/32 | techPowerUp

    AMD Radeon HD 6800 Series Is the New Midrange Graphics Champ - PCWorld

    EDIT: I just did some calculations, and Kevin actually has a point: it might be a single GPU, but it would have to have a slightly higher TDP than the current 5770-derived mobile parts, say 70-75W.

    I only disagree with him on one aspect: that 25% is not the minimum but the maximum possible performance gain.
     
  21. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    The techPowerUp review is the one I used when making that statement. Each gaming benchmark shows a consistent 25% advantage at 1920x1200, with some going as high as 40-50%. I expect that to carry over, assuming the 6900M is downclocked by a similar ratio.
     
  22. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Now figure on Blackcomb being a full Barts XT at or around 80% of the desktop HD6850's clocks.

    When's the last time ATI/AMD used a cut-down GPU for mobile parts?
     
  23. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    For the 6900M to be truly between Barts XT and Barts Pro, the transistor count has to increase considerably along with the die size. I saw the projected roadmap where it's supposed to be 256-bit as well. If you look at the full board power consumption of these 5870Ms, they pull 65-75W already. I measured the 5870M's power consumption when I had my Kill A Watt, by alternating between active Crossfire and a single card with FurMark running, and it varied from 65W-75W per card. If Blackcomb is going to have 700 million more transistors, that's roughly a 40% increase, which should be followed by a similar power consumption increase. You're looking at 90W-105W per card. The M17x-R3 is supposed to actually be thinner than the M17x-R2, so unless Dell created some revolutionary cooling system (along with other ODMs like Asus) this is not going to fly. Yes, I'm already aware TDP and power consumption aren't directly related, but they do correlate very strongly.

    Edit: Here's what Techpowerup said about Barts Pro and XT-

    So how is a mobile variant based on the same transistor count and memory interface going to stay within the 75W envelope?
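
    That scaling argument, spelled out: the 65-75W per card is the measured figure above, the 40% is the claimed transistor increase, and treating board power as scaling linearly with transistor count is obviously a simplification.

        # If board power roughly tracks transistor count, a ~40% bigger chip
        # lands in the 90-105 W per card range.
        measured_low, measured_high = 65.0, 75.0   # W per 5870M, measured above
        transistor_increase = 1.40                 # ~40% more transistors, as claimed

        print(measured_low * transistor_increase,   # 91.0 W
              measured_high * transistor_increase)  # 105.0 W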
     
  24. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    The Barts die size is 255mm^2... just bigger than the GTX 460M/GF106 die @ 240mm^2 but a lot smaller than the GTX 470M/GF104 @ 366mm^2.

    The HD6850 is rated at a Max Board Power of 127W while the HD5770/Juniper is rated at 108W. Based on overclocking potential, the HD6870 has proven to be a Barts GPU pushed to its limits, while the HD6850 enjoys plenty of overhead to reach the same clock speeds.

    The major contributor to power draw is clock speed, not shaders, so a desktop HD6870 at HD6850 speeds wouldn't draw much more power than the 6850's 127W, and should be below the 151W MBP that the HD6870 draws at stock speeds.

    AMD should be just as capable of converting that power draw into a mobile TDP as Nvidia was when it converted the 150W GTX 460SE, with its larger die, into the GTX 470M.
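
    The "clock speed, not shaders" point can be sketched with the usual dynamic-power rule of thumb, P roughly proportional to f * V^2 (leakage and memory power ignored). The clocks below are the stock desktop values; the voltages are illustrative guesses, not measured numbers.

        # Estimate a Barts chip's power at HD 6850 clocks, starting from HD 6870 board power.
        p_6870 = 151.0                   # W, HD 6870 max board power (from the post above)
        f_6870, f_6850 = 900.0, 775.0    # MHz, stock core clocks
        v_6870, v_6850 = 1.15, 1.10      # V, assumed core voltages (guesses)

        scale = (f_6850 / f_6870) * (v_6850 / v_6870) ** 2
        print(round(p_6870 * scale))     # ~119 W, in the neighbourhood of the 6850's 127 W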
     
  25. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    Yes, but you have to take into consideration that the 6850 has a higher TDP and higher power consumption (about 10% more). That's why I am saying it's not going to be as big a jump as the desktop 6850 vs. 5770. It's all about performance per watt when it comes to mobile graphics cards.
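
    That performance-per-watt point in numbers, using the board-power figures quoted a few posts up: if the desktop 6850 is ~25% faster while drawing more power, then at a strictly equal power budget most of that gain evaporates, which is why 25% reads as a ceiling rather than a floor. This is only the logic, not a prediction.

        # Desktop gain adjusted for the extra power the 6850 draws.
        desktop_gain = 1.25            # HD 6850 vs HD 5770 at 1920x1200, from the thread
        power_ratio  = 108.0 / 127.0   # HD 5770 board power / HD 6850 board power

        print(round(desktop_gain * power_ratio, 2))   # ~1.06: barely faster at equal watts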
     
  26. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I feel you, and I don't think we're actually in disagreement.

    Even a 20% increase, with headroom to OC, will be enough for me.

    What % of the 6850's performance do you expect the 6970M to cover at stock, 75 percent?
     
  27. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    I've actually struggled to do some computations based on various factors.

    In the end, a hypothetical 6970M @ 70W TDP based on the HD 6850 core gives me the following predictions:

    A Vantage GPU score of around 8200-8300 points, and a Vantage P score of around 9000-9300 points with a processor like the i7-2720QM. At stock, the card will probably break 10k P points in Vantage, but only with the top-of-the-line Sandy Bridge CPU.

    The predicted card also seems to perform just a bit better than the 480M.

    At minimum it's 58%, but I would say 61%.

    If you want, and if I have time, I will try later to do some predictions for a hypothetical 580M @ 75W TDP :).
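
    As a crude cross-check on that 58-61% figure: simply scaling the desktop HD 6850 by the power budget lands in the same ballpark. This assumes performance scales roughly with board power plus a small bonus for mobile binning, both of which are assumptions rather than anything confirmed above.

        # What fraction of a desktop HD 6850 fits in a 70 W budget?
        mobile_tdp   = 70.0    # W, the hypothetical 6970M budget above
        desktop_mbp  = 127.0   # W, HD 6850 max board power
        binning_gain = 1.08    # assumed perf/W advantage from mobile binning/voltages

        print(round(mobile_tdp / desktop_mbp, 2))                  # 0.55
        print(round(mobile_tdp / desktop_mbp * binning_gain, 2))   # ~0.6, near the 58-61% guess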
     
  28. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Blacky, I appreciate the substance you bring to your posts. But I hope to God that you are wrong. Anything short of a 15-20% advantage over the 470M will disappoint me, though we are pushing towards the ceiling where physics comes into play.

    And check this out, from the X7200 thread:

    This is in reference to the new dual 300W PSU.

    One of AMD or Nvidia is coming back with a >75W GPU. Maybe the 580M is going to be a ~350-core GF104. Or is AMD unleashing more of the 6850's power?

    Let the speculation begin.
     
  29. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    Hmm, a 95W AMD card based on the 6850 would get a GPU score of around 11000 points in Vantage. That basically puts it on par with the desktop 5770, or possibly even better.

    But I really don't like to make predictions of this sort. As you said, I would prefer to be wrong.
    Still, I will try to make some for the 580M, just out of curiosity.
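
    That 11k figure lines up with simply scaling the earlier 70W estimate linearly with power, which is only a first-order guess but shows where the number plausibly comes from.

        # Scale the earlier 70 W Vantage GPU estimate up to 95 W (first-order guess).
        gpu_score_70w = 8250.0                       # midpoint of the 8200-8300 estimate above
        print(round(gpu_score_70w * 95.0 / 70.0))    # ~11200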
     
  30. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Future card S....GTX 580M and a mobile Quadro card based off the same GPU.


    At 95W you still can't consider it using a full Barts GPU? ;)
     
  31. Blacky

    Blacky Notebook Prophet

    Reputations:
    2,049
    Messages:
    5,356
    Likes Received:
    1,040
    Trophy Points:
    331
    I think it may be possible to use the full Barts GPU even at 70W TDP, but using clocks of, I don't know, 600 MHz on the core? I have no idea really.

    Anyway, I've tried to do some calculations based on the new GF110 architecture, but everything I get is simply bad. In order to match the performance of the 6970M, a hypothetical 580M needs about 5W more power/TDP.
     
  32. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Mind if I go out on a limb and say that the 580M is simply a 336-shader GF104? They wouldn't dare attempt another GF110 or GF100.

    The GTX 460 is 150W @ 675/1350/900. I don't see any way that ends up as a viable mobile chip, even at 100W.
     
  33. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I know that we know better than to blindly trust Notebookcheck, but it's still worthwhile to note that they had this to say about the Radeon HD 6970M:

    If true?
     
  34. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    If true, AMD wins the top mobile GPU spot and we go back to the good days when the top GPU required the cooling only a 17" chassis or better could provide. :wink:
     
  35. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    I'd happily welcome the return of those days. We're at the point where 15" machines can handle high-end GPUs, so why not go ultra high-end in the heavy 17" chassis? Right now, that extra weight (e.g. W860CU vs. W870CU) isn't being taken advantage of at all.

    Of course, it will turn out to be just another Notebookcheck inaccuracy, and we'll see the same 70-75W AMD GPU in the 15" and 17" machines.
     
  36. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    Then we'll have to wait for Wimbledon. ;)
     
  37. Daniel Hahn

    Daniel Hahn Notebook Evangelist

    Reputations:
    146
    Messages:
    664
    Likes Received:
    0
    Trophy Points:
    30
    It all comes down to whether Nvidia made money with the GTX 480M or not. They basically made a loss on their whole GF100 series, but how about the GTX 480M? It was way overpriced, but quite a few people bought it anyway. 100W TDP is not really a 15" vs 17" question. You can put exceptional cooling into a 15" chassis (W860CU). But for a 100W TDP card you need extreme cooling and a very big power supply. Only very few manufacturers will implement such a cooling solution; basically only Clevo made such a laptop. So the second question is: did Clevo make money with the X7200? I guess they did and still do, but that's because the desktop CPU is another selling point. The W880CU was a complete failure, or that's how it seems. So I don't think we will see a 100W TDP card again soon. However, if Alienware designed the M17x R3 to support 100W TDP cards, the situation might change, but I doubt it.

    Or maybe I'm wrong; after all, the problem might not be the 100W TDP but the power requirement. Maybe that was the problem with the GTX 480M. We all agree that a card with good performance per watt, like GF104/GF114, would kick with a 100W TDP.
     
  38. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    The only issue the GTX 480M had was Nvidia's ridiculous pricing.

    If Nvidia had released the 100W 480M @ $560, a 2x markup from the reference card, the TDP wouldn't have mattered as much, and the W880CU would've been a hit.

    It didn't even need "extreme" cooling. The W880CU was just a W870CU with a larger back panel, and the chip had a special heatsink.

    AMD can afford to price a 100W 6970M @ $450 to $500. No one would care, because the performance per watt per dollar would be amazing.
     
  39. Daniel Hahn

    Daniel Hahn Notebook Evangelist

    Reputations:
    146
    Messages:
    664
    Likes Received:
    0
    Trophy Points:
    30
    Still, you need another back panel, a special heatsink, probably some other improvements to the power circuitry, and a new PSU. Those are all costs that manufacturers have to pay just to include a 100W GPU. I guess this is the reason why no manufacturer besides Clevo supported the GTX 480M.

    I also don't like the way the W880CU handles the GTX 480M. Improving GPU cooling by decreasing CPU cooling isn't the way to go and I think Clevo understood that as well.

    And in the end you can make more money with 75W cards, as more manufacturers can use them, and that is where AMD can make money, leaving the extreme high-end sector to Nvidia (although the GTX 480M was anything but high-end).
     
  40. Phinagle

    Phinagle Notebook Prophet

    Reputations:
    2,521
    Messages:
    4,392
    Likes Received:
    1
    Trophy Points:
    106
    ^This.

    Ignoring that it was a borked GPU, the GTX 480M wasn't really an attractive offering because it was expensive and it didn't offer enough performance over the Mobility HD 5870 for the extra power it drew.

    Clevo may have been the only company to pick up the GTX 480M, but Dell, HP, and Lenovo workstation notebooks each cool a 100W TDP GPU. Yes, you'll always see fewer notebooks that will cool a 100W GPU, but you'll see more willing to take on the task if the perf/watt and perf/price are worth it.

    For those thinner notebooks that don't want to engineer cooling for a GPU with a 100W TDP.... well, that's why AMD and Nvidia make lower-tier cards. ;)