The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    DirectX 10 for notebooks?

    Discussion in 'Gaming (Software and Graphics Cards)' started by fizzyflaskan, Nov 9, 2006.

  1. fizzyflaskan

    fizzyflaskan Notebook Guru

    Reputations:
    0
    Messages:
    61
    Likes Received:
    0
    Trophy Points:
    15
    How long do you think it will take for notebooks to get a GPU that supports DirectX 10? Do I need to wait a year or two, or just a few months?
     
  2. vespoli

    vespoli 402 NBR Reviewer

    Reputations:
    1,134
    Messages:
    3,401
    Likes Received:
    0
    Trophy Points:
    105
    No one knows for sure--the nVidia 8000-series cards will supposedly be DX10. You can look for DX10 desktop cards to come out relatively soon, and as with any new technology, it will take a little while to filter down to the notebook market (getting power consumption down, sorting out form factors, etc.). I would say that once Vista comes out, we will start to see a rush of DX10 cards soon afterward.
     
  3. fizzyflaskan

    fizzyflaskan Notebook Guru

    Reputations:
    0
    Messages:
    61
    Likes Received:
    0
    Trophy Points:
    15
    Okay, but the first cards will only be available in desktop-replacement laptops, because they won't have to be scaled down as much as for, say, a 14" machine--is that correct?
     
  4. CeeNote

    CeeNote Notebook Virtuoso

    Reputations:
    780
    Messages:
    2,072
    Likes Received:
    0
    Trophy Points:
    55
    My guess would be that they will start putting mid-range DX10 cards--ones that don't consume as much power and aren't as big--into notebooks by the time Santa Rosa comes out.
     
  5. Pitabred

    Pitabred Linux geek con rat flail!

    Reputations:
    3,300
    Messages:
    7,115
    Likes Received:
    3
    Trophy Points:
    206
    The Intel GMA 3000 should also be DX10 capable. The 8800 series cards ARE out, and they ARE DX10. But it will take a long time for those cards to hit the notebook market, when the card alone can draw more power than an entire notebook system.

    The higher-end cards like the 8800 will probably be for desktop replacements, but that's normal. The best card you can get in a 14" machine now is an X1600/Go7600.
     
  6. Notebook Solutions

    Notebook Solutions Company Representative NBR Reviewer

    Reputations:
    461
    Messages:
    1,849
    Likes Received:
    0
    Trophy Points:
    55
    You can be sure that there will be DX10 video cards in notebooks when the Santa Rosa platform is out (Q1 2007). I think you can definitely count on the 8600 Go and 8400 Go, as well as the 8800 Go. I do not know about ATI's DX10 cards for notebooks; they have not even released their desktop cards yet.

    Charlie :)
     
  7. fizzyflaskan

    fizzyflaskan Notebook Guru

    Reputations:
    0
    Messages:
    61
    Likes Received:
    0
    Trophy Points:
    15
    Okay, so you're saying I could probably get a notebook with DX10 around March-May or so?
     
  8. lumberbunny

    lumberbunny Notebook Evangelist

    Reputations:
    57
    Messages:
    314
    Likes Received:
    0
    Trophy Points:
    30
    Exactly what is the power draw on the 8800 series cards?
     
  9. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    LOL, Intel GMA 3000. I hate their media accelerators; I can't believe they are going to make a DX10 one. You would think the negativity towards any Intel GMA would be enough to get them to quit making them. I understand the GMA 3000 is aimed a lot towards business use, but still, Intel GMAs suck.
     
  10. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    ~10% above the X1900 series.

    Something like 130W under load should be a decent estimate.

    You'd think anything that exists in something like 50% of all notebooks would be enough to get them to keep making them. ;)

    The negativity is only from people who expect it to be a gaming GPU. That's a tiny minority.
    Most people either have the 950 because it does what they need, or they have a "proper" GPU in the first place.

    In any case, the GMA950 has been a huge success for Intel. Don't assume that just because it can't run the latest games in 1600x1200, it's universally hated.

    Personally, I like the idea of the GMA 3000. I'm looking for a thin-and-light notebook with low power consumption... and as big a feature set as possible. So for me, a GMA 3000 would work very well (at least if the driver ends up good enough).
     
  11. sionyboy

    sionyboy Notebook Evangelist

    Reputations:
    100
    Messages:
    535
    Likes Received:
    0
    Trophy Points:
    30
    177 watts on the GTX under load. I don't have any numbers for the GTS; deduct 50 watts and you'll probably be in the right ballpark.
     
  12. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    I understand both sides of the issue, but my experience with the Intel Extreme Graphics 2 GPU in my Toshiba tablet PC isn't a very good one. Yes, I realize it isn't a gaming GPU and that it's in a lot of notebook motherboards, but I would think ATI and Nvidia would have countered Intel already by creating practical mobile GPUs for business and other non-gaming uses. I wish Matrox would make a comeback, or that VIA would start making notebook GPUs on a larger scale.
     
  13. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Oh yes, they've been doing that for a long time. Both ATI and Nvidia IGPs can run circles around the GMA950 in every possible way.
    That's just not enough to compete with Intel's IGPs.
    The problem is that until very recently, there's been only one serious manufacturer of motherboards for Intel notebooks: Intel.

    Sure, ATI and NVidia have some perfectly good motherboards, but if you use those in your notebook, it can't get the Centrino label, and then it'll sell like crap.

    So instead, manufacturers stick with Intel CPUs and Intel motherboards with onboard Intel GPUs. And if they want graphics performance, they stick an AMD or NVidia discrete GPU on top of it, raising costs and power consumption.

    So Matrox or Via coming back wouldn't make a difference as long as Intel rules the motherboard market so completely.
     
  14. Greg

    Greg Notebook Nobel Laureate

    Reputations:
    7,857
    Messages:
    16,212
    Likes Received:
    58
    Trophy Points:
    466
    If that does indeed prove to be an accurate figure on power consumption, I'd expect to see these cards available in notebook form relatively soon. An 8800GTS could fit in a 17" DTR that's 2" thick and comes with a 150W (or more) AC adapter.
     
  15. Pitabred

    Pitabred Linux geek con rat flail!

    Reputations:
    3,300
    Messages:
    7,115
    Likes Received:
    3
    Trophy Points:
    206
    What about the CPU power (around 34W max)? The LCD takes around 15W at full brightness, the hard drive another 15W or so, wireless another 15W or so, not to mention all the other devices in the system like RAM and the north and south bridges. The GPU would have to drop to roughly 70W max power dissipation to fit in the 150W budget you're proposing--and current 7XXX-series chips dissipate around 30W.
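    For what it's worth, here is that arithmetic as a quick sketch, using only the approximate wattages mentioned above; RAM and the north/south bridges are left out and would eat further into the remaining headroom:

```python
# Back-of-the-envelope notebook power budget, using the rough figures
# quoted in the post above. All numbers are approximate estimates.
components_watts = {
    "CPU (max)": 34,
    "LCD (full brightness)": 15,
    "Hard drive": 15,
    "Wireless": 15,
}

adapter_watts = 150
non_gpu_total = sum(components_watts.values())   # ~79 W
gpu_headroom = adapter_watts - non_gpu_total     # ~71 W left for the GPU

print(f"Non-GPU draw: ~{non_gpu_total} W")
print(f"GPU budget:   ~{gpu_headroom} W on a {adapter_watts} W adapter")
```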
     
  16. chrisyano

    chrisyano Hall Monitor NBR Reviewer

    Reputations:
    956
    Messages:
    5,504
    Likes Received:
    0
    Trophy Points:
    205
    It's possible at best. Aside from the GMA 3000, I don't think there are any guarantees for mobile DX10.

    There is still a lot to be worked out, and there haven't been any announcements about mobile DX10 GPUs yet.
     
  17. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    Before looking for mobile DX10 GPUs, I would look for mainstream and low-end desktop DX10 GPUs first. In this case, I haven't heard much about the G82 and G83 at all. I would assume they are targeted for early Q1 to coincide with Vista, so mobile parts would have to come after that. ATI doesn't look to have mainstream desktop R600 derivatives ready until late Q1/early Q2, so mobile parts from them will take a while.
     
  18. hmmmmm

    hmmmmm Notebook Deity

    Reputations:
    633
    Messages:
    1,203
    Likes Received:
    0
    Trophy Points:
    55
    *cry*

    I was hoping for a DX10 notebook a year from now.
     
  19. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    The 177W figure given above is only an approximation, and it depends on which test you believe--I've seen figures as low as 120-130W as well. It's really hard to measure accurately, and it also depends on the rest of the system (a faster CPU might be able to stress the GPU more, making it draw more power as well).

    Yep, but keep in mind that the desktop versions of the 7XXX series consume much more than 30W too--they're up above 100W. Dropping 70W is something that's done almost every generation, and it can be done without that much effort. I think the big worry was that it might use 300W like early rumours suggested. If that were the case, it would be difficult to cram into a notebook. As it is now, it shouldn't be a big problem, at least for the big DTR systems.
     
  20. sionyboy

    sionyboy Notebook Evangelist

    Reputations:
    100
    Messages:
    535
    Likes Received:
    0
    Trophy Points:
    30
    The 177 watt figure comes from Jen-Hsun Huang;

    Of course, that's not to say it runs at 177 watts all the time--that would just be silly. It's probably the peak maximum draw.
     
  21. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    I wonder what the biggest contributor to power is? I think the stream processors are fairly efficient, and they wouldn't be operating all the time anyway. The memory controller is probably one of the biggest power consumers. If they cut back to a "smaller" 256-bit memory controller, they should be able to shave off quite a bit of power consumption and transistor count, which kind of go hand in hand.

    Still, when GPU makers achieve massive power savings on mobile parts, I think it's because they go through the major power-consuming sections and lay out the transistors by hand--or at least double-check the automated tools' output. In this case, though, a significant amount of the G80 is already done by hand, which is why the increase in power consumption has been relatively small over, say, the R580. That does mean, though, that they really don't have much more room to tweak for power. Mobile parts will probably have to use the 80nm process, although that'll only give a small improvement since the half-node is a cost node rather than a power node. I believe TSMC is working on an 80nm power-tweaked process, but that may not be suitable for fast GPUs.
     
  22. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Interesting, hadn't seen that before. Still, it's not clear whether we're talking about peak maximum draw or something like TDP. In any case, most real-world tests show it to consume only slightly more than a high-end X1900-series card.

    Keep in mind that the stream processors are running unusually fast (for a GPU), at over 1GHz. That means increased power consumption. Furthermore, there's an awful lot of them, and each of them is more complex than a last-generation shader unit (because they have to be able to perform vertex, pixel, and geometry shader ops).

    So I expect them to take a good chunk of the power consumption, but you're probably right that the memory controller can get expensive too. And let's not underestimate the actual memory either.

    And keep in mind that this generation they've had to add a lot of scheduling logic just for keeping track of "threads" and assigning work units to different stream processors. On CPUs at least, that kind of work is one of the most complex parts, and is responsible for a large part of the power consumption. (And that's with maybe 9 execution units, if we count everything; the 8800 has 128 to keep track of.)

    From what I know, a large part of the power savings on mobile parts typically comes just from lowering clock speeds and disabling a few shader units. Power consumption doesn't scale linearly with clock speed (see the P4 for a good example: raising the clock speed by 5% on one of the faster models could give you something like a 20-30% increase in power consumption), and newly released GPUs are always very aggressively clocked to begin with. So just by lowering clock speeds by, say, 10%, they might be able to make huge savings in power consumption.
    Of course, over time they also polish the actual transistor layout, tweaking things and adding a few power-saving techniques, but a large part of the saving comes just from reducing performance a bit.

    And then there's plain old undervolting as well. Cherry-picking the best parts and lowering the voltage on them to reduce power consumption is an old trick too. :)
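    As a rough illustration of why small clock and voltage reductions pay off disproportionately, here is a quick sketch using the standard CMOS dynamic-power approximation (P roughly proportional to V^2 * f); the specific voltage changes are made-up examples, not measured values:

```python
# Dynamic power scales roughly with clock frequency times voltage squared.
# The voltage deltas below are illustrative assumptions, not measurements.

def relative_power(clock_scale: float, voltage_scale: float) -> float:
    """Power relative to baseline, assuming P is proportional to V^2 * f."""
    return clock_scale * voltage_scale ** 2

# A 5% overclock that (hypothetically) needs ~10% more voltage to stay stable:
print(f"+5% clock, +10% voltage -> {relative_power(1.05, 1.10):.2f}x power")

# A 10% underclock combined with a ~10% undervolt (a binned mobile part):
print(f"-10% clock, -10% voltage -> {relative_power(0.90, 0.90):.2f}x power")
```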
     
  23. sionyboy

    sionyboy Notebook Evangelist

    Reputations:
    100
    Messages:
    535
    Likes Received:
    0
    Trophy Points:
    30
    The only thing I can think of is that DX10 games will really be pushing the card somewhat; I haven't seen any DX10 benchmarks yet. 3DMark07, where are you?
     
  24. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    Actually, I don't think the stream processors are more complex. In order to increase the clock speed, nVidia did the opposite: the stream processors now use scalar ALUs instead of vector ones, and the pipeline depth has been increased. Both contribute to allowing higher clock speeds. However, the high clock speed and unit count are needed to make up for the fact that each individual stream processor is weaker than a full shader.

    http://www.dailytech.com/article.aspx?newsid=4891
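    As a very rough sketch of that trade-off (all numbers below are hypothetical, chosen purely for illustration, not actual specs), weaker-but-faster-and-more-numerous scalar units can still come out ahead on raw throughput:

```python
# Illustrative peak-throughput comparison: a few wide vec4 ALUs at a lower
# clock versus many scalar ALUs at a higher clock. Numbers are hypothetical.

def peak_scalar_mads_per_sec(num_units: int, mads_per_clock: int, clock_hz: float) -> float:
    """Peak multiply-add throughput, counted in scalar MADs per second."""
    return num_units * mads_per_clock * clock_hz

# Old-style design: vec4 ALUs do 4 scalar MADs per clock, but clock lower.
vec4_style = peak_scalar_mads_per_sec(48, 4, 650e6)

# Scalar stream processors: 1 MAD per clock each, clocked much higher.
scalar_style = peak_scalar_mads_per_sec(128, 1, 1.35e9)

print(f"vec4-style peak:   {vec4_style / 1e9:.0f} GMAD/s")
print(f"scalar-style peak: {scalar_style / 1e9:.0f} GMAD/s")
```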

    In any case, the latest news from nVidia is that they are currently working on 9 other DX10 cards, divided into 3 product segments with 3 models in each. They are scheduled to launch in February. Either the 3 product segments are going to be GeForce, Go, and Quadro, or they're going to be all desktop but split into low-end, mainstream, and high-end. I think it may well be that they are all desktop parts on the 80nm process: the high-end being a GX2 part, a GTX, and a GTS; the mid-range being a GT, a GS, and something else; and the low-end being a GT, a GS, and a TC part. Still, I'm sure mobile parts aren't far behind.
     
  25. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    By complex I meant they have to support a larger set of instructions. You're right, though: making them scalar is a significant simplification (and one I'd forgotten to take into account).
    The increased pipeline depth, however, doesn't decrease complexity (and typically, deeper pipelines mean higher power consumption).