The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    DX10 for notebooks

    Discussion in 'Gaming (Software and Graphics Cards)' started by metalarmored, Nov 14, 2006.

  1. metalarmored

    metalarmored Newbie

    Reputations:
    1
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
  2. Notebook Solutions

    Notebook Solutions Company Representative NBR Reviewer

    Reputations:
    461
    Messages:
    1,849
    Likes Received:
    0
    Trophy Points:
    55
    Great post, thanks. But I think I am going to skip Santa Rosa for these reasons:

    - I know the dual cores are not optimized yet, but the quad cores are coming. Intel has planned to release its quad cores in Q2-Q3 2007 or something. If these are native quad cores (not 2x dual cores), then I will buy that notebook.
    - DX10 is not out yet, Vista is demanding too much from gamers, and there are not many games featuring DX10. So I am going to wait.
    - An 8800GTX in a notebook? Hmm, what about mobility and power consumption? I think I will skip the 8-series and wait for the 9-series.

    Charlie :)
     
  3. cabral

    cabral Notebook Consultant

    Reputations:
    0
    Messages:
    179
    Likes Received:
    0
    Trophy Points:
    30
    Yea, the 8800GTX is going to be a power guzzler; I wonder what kind of battery life you're going to get out of it. 1 hour tops? I think I will also skip the 8-series and go all out on the 9-series; knowing Nvidia, the 9-series should come out in the near future.
     
  4. Zellio

    Zellio The Dark Knight

    Reputations:
    446
    Messages:
    1,464
    Likes Received:
    0
    Trophy Points:
    55
    You're gonna get a special version of the G80, I've been saying this all along :p
     
  5. hmmmmm

    hmmmmm Notebook Deity

    Reputations:
    633
    Messages:
    1,203
    Likes Received:
    0
    Trophy Points:
    55
    Um, I don't think it's confirmed yet.

    From the article:

    "It is likely that the mobile version of the GeForce 8800 – if there is any planned – will feature typical 256-bit memory bus and lower amount of execution units, e.g., 64 stream processors, 16 texture units, 16 raster operation units and so on."

    Also, it's very speculative:

    "Mr. Huang did not specify whether his company was sampling a special version of GeForce 8800 graphics processor for mobile computers, or was testing another GeForce 8-series architecture-based graphics chip specifically designed for laptops."

    This just shows that it's not really going to be a G80, but probably a crippled version or one done on a smaller process.

    *Edited by moderator*
     
  6. chrisyano

    chrisyano Hall Monitor NBR Reviewer

    Reputations:
    956
    Messages:
    5,504
    Likes Received:
    0
    Trophy Points:
    205
    I'd say that falls in line with what most here were thinking. Of course they're going to be working on mobile versions of their DX10 GPUs. They want to sell them just as badly as many of us want to buy them, if not more. We just don't know when they will be ready for release or how powerful they will be. It seems Nvidia is not ready to tip its hand, or is not far enough along to give any hints yet.
     
  7. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    Yup, I believe they'll produce something based on the G80 concept, not the 8800 itself. But there might be some tricks to it: if they could make the board smaller, then they just need to use voltage stepping to lower the 8800's voltage when it's not plugged into the power cord, and release full power when it is plugged in. But most probably the laptop would explode into a huge fireball within hours... hehehe
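
    Something like that stepping idea, in a minimal sketch (the clocks and voltages are made-up illustration values, and is_on_ac_power is a hypothetical stand-in for whatever the driver would actually query):

        # Minimal sketch of the voltage/clock stepping idea: full power on AC,
        # stepped down on battery. All numbers are illustrative, not real specs.

        def is_on_ac_power() -> bool:
            # Hypothetical stand-in for a real platform query (e.g. ACPI state).
            return True

        def pick_gpu_state() -> dict:
            if is_on_ac_power():
                return {"core_mhz": 575, "voltage_v": 1.20}  # full clocks, plugged in
            return {"core_mhz": 200, "voltage_v": 0.90}      # stepped down on battery

        print(pick_gpu_state())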
     
  8. Zellio

    Zellio The Dark Knight

    Reputations:
    446
    Messages:
    1,464
    Likes Received:
    0
    Trophy Points:
    55
    At most, laptops will get a 7950+ with DX10 functions... I've said this for a while...
     
  9. CeeNote

    CeeNote Notebook Virtuoso

    Reputations:
    780
    Messages:
    2,072
    Likes Received:
    0
    Trophy Points:
    55
  10. chrisyano

    chrisyano Hall Monitor NBR Reviewer

    Reputations:
    956
    Messages:
    5,504
    Likes Received:
    0
    Trophy Points:
    205
    Merging this thread with the other one like it (3d Guru's article is based on this xbitlabs one)...
     
  11. hmmmmm

    hmmmmm Notebook Deity

    Reputations:
    633
    Messages:
    1,203
    Likes Received:
    0
    Trophy Points:
    55
    I think that the G80, though crippled, will at least have much more power than the 7950.

    I think this because if they can get the G80 to operate at the same power draw and heat generation as the 7950, with the same number of shaders (if not more), then the unified shaders will perform better by using their processing power more effectively (i.e., when the 3D app needs more pixel shading power than vertex shading, more of the unified shaders can be reassigned to pixel shading instead of vertex shading), as the sketch below shows.

    Hence my thinking that the G80, even severely crippled, will still perform better than a 7950GTX, and it has DX10.
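
    To make that concrete, here is a minimal sketch of the allocation argument (the unit counts and demand numbers are made up for illustration, not real G80/7950 specs):

        # Fixed pipelines can only work on their own workload type; a unified
        # pool is assigned wherever the demand is, so fewer units sit idle.

        def fixed_throughput(pixel_units, vertex_units, pixel_work, vertex_work):
            # Separate pixel and vertex pipelines, as on the GF7 generation.
            return min(pixel_units, pixel_work) + min(vertex_units, vertex_work)

        def unified_throughput(total_units, pixel_work, vertex_work):
            # One pool of unified shaders covers both kinds of work.
            return min(total_units, pixel_work + vertex_work)

        # Pixel-heavy frame: 20 units' worth of pixel work, 4 of vertex work.
        print(fixed_throughput(16, 8, 20, 4))  # 20 -> 4 vertex units sit idle
        print(unified_throughput(24, 20, 4))   # 24 -> the whole pool stays busy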
     
  12. Renegade666

    Renegade666 Notebook Enthusiast

    Reputations:
    0
    Messages:
    11
    Likes Received:
    0
    Trophy Points:
    5
    I've noticed a lot of people pondering whether to:

    a) Buy a laptop now with a DirectX 9 graphics chip in it and hope it won't become obsolete too quickly (Windows Vista will ship soon with DirectX 10)
    b) Wait for DirectX 10 laptops to become available

    No one seems to have much solid information about when DirectX 10 laptops might become available, which makes the choice even harder.

    So here is a little news snippet to get the ball rolling:

    http://www.xbitlabs.com/news/video/display/20061113141458.html

    Is there any other information like this out there that can help people like me make the 'wait' or 'buy now' decision?
     
  13. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    Hmm... I thought we had this thread some time ago... And we had some agreement that the notebook DX10 GPU ain't gonna be an 8800.
     
  14. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,200
    Likes Received:
    17,913
    Trophy Points:
    931
    Well, it could come out with the midrange solutions, but the power draw is not actually that much higher than last time, so I reckon they could put an 8800GTS in easily.
     
  15. chrisyano

    chrisyano Hall Monitor NBR Reviewer

    Reputations:
    956
    Messages:
    5,504
    Likes Received:
    0
    Trophy Points:
    205
    Thanks for posting the link, but we've actually had a couple of threads about this article already.

    Merging...
     
  16. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    *Shrug* The 8800 gives a lot better performance/watt than the 7xxx series. I don't see how it's going to be such a big problem in notebooks. Basically, they could disable half the chip, and they'd have something that performed like a 7900 but used less power.

    As Meaker said, the 8800 doesn't consume that much more power than 7900 cards (and only a few percent more than X1900 series cards).

    Of course, they're not going to release an unmodified G80 for notebooks. When have they ever done that? The notebook version of the 7900 isn't a G70 either, and the NV40 wasn't available in notebooks either; that was the NV41 or 43 or one of those.
    So the G81/82/83 or one of those is almost certainly going to be a notebook-optimized version of the G80.

    And it could end up doing pretty well power consumption- and performance-wise.
     
  17. suraj

    suraj Notebook Evangelist

    Reputations:
    0
    Messages:
    311
    Likes Received:
    0
    Trophy Points:
    30
    What are quad cores? Please tell me.
     
  18. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Quad = 4.
    So it's a chip with 4 cores, just like current dual-core CPUs have two cores.
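
    For instance, here is a quick way to ask the OS how many cores it sees (a minimal sketch; the number printed depends on your machine):

        # Prints the number of CPU cores the operating system reports:
        # 2 on a dual-core chip, 4 on a quad-core.
        import os

        print(os.cpu_count())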
     
  19. sojourner21

    sojourner21 Notebook Consultant

    Reputations:
    0
    Messages:
    108
    Likes Received:
    0
    Trophy Points:
    30
    How much do you guys think a laptop with a DX10-capable GPU will cost in the first couple of months? How long will it take to drop below $2000?
     
  20. Zellio

    Zellio The Dark Knight

    Reputations:
    446
    Messages:
    1,464
    Likes Received:
    0
    Trophy Points:
    55
  21. chrisyano

    chrisyano Hall Monitor NBR Reviewer

    Reputations:
    956
    Messages:
    5,504
    Likes Received:
    0
    Trophy Points:
    205
    They are probably only going to be available in the high-end systems at first. So I'd guess prices will be in the $2500-4000 range.
     
  22. ltcommander_data

    ltcommander_data Notebook Deity

    Reputations:
    408
    Messages:
    1,398
    Likes Received:
    0
    Trophy Points:
    55
    Some of the things the XBit Labs article said are weird. They kind of imply that Santa Rosa is Centrino Pro, while Centrino Pro is really a version of Santa Rosa with AMT enabled. They also say that Santa Rosa is coming in early Q1, which is optimistic since the original timeline called for early Q2. An early Q1 launch for Santa Rosa is unlikely given the scheduling of new Napa-based LV and ULV chips in early Q1. New LV and ULV chips will also be launching with Santa Rosa, so Intel will need to space out the introductions sufficiently to at least look like they are reducing overlap. A late Q1 launch may be possible, though.
     
  23. unknown555525

    unknown555525 rawr

    Reputations:
    451
    Messages:
    1,630
    Likes Received:
    0
    Trophy Points:
    55
    What they're saying is that they have begun work on the 8800GTX and GTS for notebooks, but haven't said yet whether they will be "complete" versions of their desktop counterparts. And I suppose guru3d.com doesn't think that is possible yet.

    Just wait till 2008, when the 8800GTS will be like an integrated notebook GPU; or once ATI kicks in, the price will drop.
     
  24. chrisyano

    chrisyano Hall Monitor NBR Reviewer

    Reputations:
    956
    Messages:
    5,504
    Likes Received:
    0
    Trophy Points:
    205
    I've heard reports that July of next year is when we "may" see some DX10 mobile cards. I wrote more about it here.
     
  25. Xonar

    Xonar Notebook Deity

    Reputations:
    1,457
    Messages:
    1,518
    Likes Received:
    13
    Trophy Points:
    56
    Wonderful, now we're gonna need 20-cell batteries for these monsters... just for 1 hr and 30 min of game time unhooked. :/ And weight? What about weight? Wouldn't be surprised with 9-13 lbs for this monster and other goodies.
     
  26. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    You mean for GPUs with similar power requirements to ATI's X1k cards?

    Let's not make it worse than it is. They only use a few percent more power than X1900 cards, and are a hell of a lot faster. So with that kind of performance/watt, it could actually make for some nice mobile GPUs.
     
  27. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    2008? Then I might as well wait for AMD Fusion with Crossfire... (kidding)
     
  28. sojourner21

    sojourner21 Notebook Consultant

    Reputations:
    0
    Messages:
    108
    Likes Received:
    0
    Trophy Points:
    30
    I wonder at what point it will simply not be possible to cram more graphics power into a video card. I mean, sooner or later something's gotta give, right? Either the heat will be too much, or the size, or the computer will no longer be able to utilize all that power effectively.
     
  29. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Why? The transistors are getting smaller and smaller, meaning it takes less and less space to do the same stuff (which eliminates size as a problem).
    Heat can be tricky, but can be handled with care (the GF7 produced less heat than the GF6; the GF8 produces only a bit more than ATI's X1k cards, but is twice as powerful).

    Performance/watt is going up, which means we can get more and more performance for the same heat output.
    And the computer has no problem utilizing the power. GPUs may be powerful, but they have a tough job too. If you run a game at 1600x1200 at 60Hz, you end up having to render about 115 million pixels to the screen per second. In fact, you have to render a hell of a lot more than that, because some fragments are overdrawn several times before the final color to output to the screen pixel is known. That doesn't leave a lot of processing time for each fragment. Even with the most powerful GPU, it's frighteningly easy to make something that's slooooow and choppy.
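
    To put numbers on that (a back-of-the-envelope sketch; the 3x overdraw factor is just an illustrative assumption):

        # Fill-rate arithmetic for a 1600x1200 display refreshed at 60 Hz.
        width, height, refresh_hz = 1600, 1200, 60

        pixels_per_second = width * height * refresh_hz
        print(f"{pixels_per_second:,} pixels/s")        # 115,200,000

        # With overdraw, each screen pixel's fragment may be shaded several
        # times before the final color is known. Assume 3x for illustration.
        overdraw = 3
        fragments_per_second = pixels_per_second * overdraw
        print(f"{fragments_per_second:,} fragments/s")  # 345,600,000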
     
  30. chrisyano

    chrisyano Hall Monitor NBR Reviewer

    Reputations:
    956
    Messages:
    5,504
    Likes Received:
    0
    Trophy Points:
    205
    To add to what Jalf just said, keep in mind that technology is also becoming more efficient. As long as the consumer wants more power and is willing to pay for it, there will be someone willing to develop it.