The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    A8Js or wait for Vista/DX10?

    Discussion in 'Asus' started by jgs9455, Oct 3, 2006.

  1. jgs9455

    jgs9455 Notebook Consultant

    Reputations:
    7
    Messages:
    166
    Likes Received:
    0
    Trophy Points:
    30
    I've been looking at getting a laptop (my 1st one) and I have pretty much narrowed it down to the Asus A8Js. It was between Asus, Acer, & Compal. I like Asus's good reputation, the Core 2 Duo, and the nVidia Go7700. I also like that this laptop has LAN & DVI ports in the back, as I feel connections on the side are cumbersome. I also prefer DVI over a VGA port. (Will the DVI port have HDCP?)

    However, the only thing holding me back is that Windows Vista is right around the corner (and DirectX 10 with it). How long before laptops get DirectX 10 chips? I like the idea of buying a laptop shipped with Vista as the OS because all the drivers should work and I won't have to pay extra for an upgrade down the road. Is anyone else waiting for these same reasons?
     
  2. stamar

    stamar Notebook Prophet

    Reputations:
    454
    Messages:
    6,802
    Likes Received:
    102
    Trophy Points:
    231
    Ok, well the first DX10 card will be the 8800.

    That will not fit in a 14 inch notebook.

    99% chance it will be released in the Dell XPS 1710 first, then a Sager 5670. Those are the first laptops that will be DX10 - DTRs, maybe this year by Christmas. If you're looking for a DTR and you've got 3k then maybe you should wait. I think even in that case Dell and Sager will let you buy the card separately.

    So that should answer your question. Waiting for the 8600 will be 6 months, if you're asking for my speculation.

    So you're waiting for next year's model at least; sometime around May it'll probably be called the A9Js or something.
     
  3. Greg

    Greg Notebook Nobel Laureate

    Reputations:
    7,857
    Messages:
    16,212
    Likes Received:
    58
    Trophy Points:
    466
    DX9 isn't dead... even with DX10 on the horizon. It will be a long time before games are DX10-only, and by then you'll be wanting a new notebook anyway. I'd say go for it now and have some fun with it!
     
  4. jgs9455

    jgs9455 Notebook Consultant

    Reputations:
    7
    Messages:
    166
    Likes Received:
    0
    Trophy Points:
    30
    Thanks, good advice from both of you. I know Vista also supports DX9 and will upgrade to DX10 later on, but will DX10 games like Crysis also run on DX9?
     
  5. stamar

    stamar Notebook Prophet

    Reputations:
    454
    Messages:
    6,802
    Likes Received:
    102
    Trophy Points:
    231
    Yes, but slower and with fewer effects. It would be around 3 years from today before a game came out that wouldn't work on your GPU.

    Date totally estimated.
     
  6. Dalantech

    Dalantech Notebook Consultant

    Reputations:
    149
    Messages:
    249
    Likes Received:
    0
    Trophy Points:
    30

    From what I've heard Vista won't ship with DX10 - it will be part of a service pack later on down the road. I don't think that DX10 is going to add much in the way of "eye candy" - the biggest change is that all geometry processing is supposed to be done by the GPU under DX10, but that doesn't mean that you won't be able to play a DX10 game on DX9 hardware...
     
  7. gusto5

    gusto5 Notebook Deity

    Reputations:
    54
    Messages:
    760
    Likes Received:
    0
    Trophy Points:
    30
    As far as I can tell, Vista already sort of emulates DX10 on a DX9 card. In RC1, it already says I have DX10 for some odd reason.

    http://www.gusto5.net/dx10.jpg
     
  8. kenyee

    kenyee Notebook Guru

    Reputations:
    0
    Messages:
    72
    Likes Received:
    0
    Trophy Points:
    15
    Laptops seem to be getting a lot more improvement than desktops...if you want something at a reasonable price that runs Crysis, I'd guess about 1-2 yrs after Crysis is released.

    I have a hard time believing anything like the G80 or ATI's equivalent would fit into a laptop for a while because they eat 250 watts... :)

    Get a laptop you'll be happy with for 2-3 years, then upgrade...
     
  9. Dalantech

    Dalantech Notebook Consultant

    Reputations:
    149
    Messages:
    249
    Likes Received:
    0
    Trophy Points:
    30
    Then again, the story keeps changing - maybe MS will ship DX10 with Vista. As for that diagnostic tool, it's just showing you that you have DX10 loaded, not that your hardware supports it.
     
  10. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    Well, as for the 250 watts part, we've seen some rumors that ATI is prepping an external card, somewhat like the external hard disks we use nowadays. It sounds really cool though - think of having one of the slimmest laptops but with the freaking 8800 as an external GPU, so you could play whatever games on it. It could work quite well; think about it, no one really runs games off their battery all the time, right?
     
  11. Dalantech

    Dalantech Notebook Consultant

    Reputations:
    149
    Messages:
    249
    Likes Received:
    0
    Trophy Points:
    30
    I just can't see the practicality of an external video card - it just doesn't make sense...

    There are two possibilities that I do see:

    1) Etch the GPU onto the same die as the CPU. Current GPUs run hot because the transistors in them require a relatively high voltage to turn on (forward bias), and the voltage requirements are a result of the 90nm etching process. But drop the size of the parts in the GPU down to 30nm or so and you can drop the voltage, reduce the heat, and run the GPU at the same clock speed as the CPU. You really wouldn't need a top-of-the-line GPU either; something like a 7400Go would be fast enough, because it currently runs at about 400MHz - imagine what it could do at 2GHz...

    2) Etch the GPU using the same process as the CPU so you can bump up the clock speed and reduce the heat. Now take that GPU and put it into an ExpressCard - maybe with a small heat sink and fan to dissipate the heat. Possible, since the ExpressCard slot has a PCI Express connection (not sure how fast it is, though). This one definitely has cool factor written all over it since you could easily upgrade the video card...

    Edit: PCI Express = 2.5 Gbit/sec/direction - I think that would be fast enough to use for a video card...
     
  12. TedJ

    TedJ Asus fan in a can!

    Reputations:
    407
    Messages:
    1,078
    Likes Received:
    0
    Trophy Points:
    55
    Only problem with that idea is that ExpressCards are PCI Express x1 only... maximum bandwidth tops out at 2.5Gb/s, which would choke even a midrange card these days.

    Shared CPU/GPU on die, possible plans from AMD notwithstanding, is probably a non-starter as well... at least with current die sizes. Compare transistor counts on current GPUs versus current CPUs and you'll see what I mean. You could always increase the die size to fit both on, but that'll decrease yield and increase cost... besides, who wants yet another socket format?
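    To put rough numbers on that bandwidth gap, here is a back-of-the-envelope sketch. It assumes the standard PCIe 1.x figures (2.5 GT/s per lane with 8b/10b encoding) and compares the single lane an ExpressCard exposes against a desktop x16 slot of the era; the function and variable names are only for illustration.

        # Back-of-the-envelope link bandwidth comparison (illustrative).
        # PCIe 1.x runs at 2.5 GT/s per lane with 8b/10b encoding, so usable
        # bandwidth is roughly 250 MB/s per lane per direction.

        GT_PER_LANE = 2.5e9           # transfers per second on a PCIe 1.x lane
        ENCODING_EFFICIENCY = 8 / 10  # 8b/10b line-coding overhead

        def pcie1_bandwidth_mb(lanes: int) -> float:
            """Usable one-direction bandwidth in MB/s for a PCIe 1.x link."""
            bits_per_sec = GT_PER_LANE * ENCODING_EFFICIENCY * lanes
            return bits_per_sec / 8 / 1e6

        expresscard_x1 = pcie1_bandwidth_mb(1)   # ExpressCard exposes one lane
        desktop_x16 = pcie1_bandwidth_mb(16)     # typical desktop graphics slot

        print(f"ExpressCard (x1): ~{expresscard_x1:.0f} MB/s per direction")
        print(f"Desktop slot (x16): ~{desktop_x16:.0f} MB/s per direction")
        # => roughly 250 MB/s versus 4000 MB/s, a 16x gap - the "choke" point.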
     
  13. Dalantech

    Dalantech Notebook Consultant

    Reputations:
    149
    Messages:
    249
    Likes Received:
    0
    Trophy Points:
    30
    I ran off to do a little reading after I made that post and I'd have to agree. The card would have to be able to aggressively pre-fetch data to use a standard PCI Express bus...

    Manufacturing has gotten pretty good over the last few years, so yield shouldn't be a problem. As for the new socket format: you'd only see it on laptops and small specialty PCs like the kind that are used for TiVo or Windows Media Center machines.

    Power consumption and heat dissipation are two of the key issues when trying to decide what GPU to put in a laptop (price being the third), so no matter what happens the etching process for GPUs needs to get smaller...
     
  14. jgs9455

    jgs9455 Notebook Consultant

    Reputations:
    7
    Messages:
    166
    Likes Received:
    0
    Trophy Points:
    30
    Have there been any Vista driver issues with the A8J models?
     
  15. gusto5

    gusto5 Notebook Deity

    Reputations:
    54
    Messages:
    760
    Likes Received:
    0
    Trophy Points:
    30
    I'm only showing that DX10 is loaded, nothing else. We don't even have the same video card, so why would I show him compatibility with a different card?
     
  16. TedJ

    TedJ Asus fan in a can!

    Reputations:
    407
    Messages:
    1,078
    Likes Received:
    0
    Trophy Points:
    55
    Sorry, perhaps I wasn't clear. When I mentioned yields, I meant the number of cores (good or bad) you could physically fit onto a wafer. Since most chip fabs can only handle a wafer of a certain size, lower yields (due to size) mean higher costs.
     
  17. Dalantech

    Dalantech Notebook Consultant

    Reputations:
    149
    Messages:
    249
    Likes Received:
    0
    Trophy Points:
    30
    I completely understood what you meant - but you have to understand that manufacturing defects have come down, so yield wouldn't be a big issue. Just look at Canon's 5D full frame digital camera. It has a CMOS sensor that is 36mm x 24mm, and a few years ago that little piece of silicon would set you back about 7K (about 1K for the camera body and the support electronics, 7K for the sensor alone). But the 5D has a retail price of 3K - and Canon listed improvements in the manufacturing process (higher yields) as the reason why they were able to bring the price down...
     
  18. TedJ

    TedJ Asus fan in a can!

    Reputations:
    407
    Messages:
    1,078
    Likes Received:
    0
    Trophy Points:
    55
    I still think you don't quite understand me. When I say "yield" I'm not referring to the number of cores that pass quality control and are fit for sale... I'm referring to the number of cores that will physically fit onto a wafer.

    You fit fewer large cores onto a wafer of a given size. Producing that wafer has a fixed (and very expensive) price. Larger cores are also more prone to faults due to impurities in the wafer. So larger cores will ALWAYS cost more than smaller ones.

    This is why both nVidia and ATI release GPU cores on smaller die sizes that perform almost exactly the same as the chips they're replacing. Higher yields per wafer mean reduced cost, which allows them to cut prices and remain competitive with each other.
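    To make the two senses of "yield" concrete, here is a rough cost-per-die sketch. The wafer cost and defect density are assumed placeholder values rather than figures from either post; the point is only that a larger die gives you fewer candidates per wafer and a smaller fraction of good ones, so cost per good die climbs quickly.

        import math

        # Rough sketch of why bigger dies cost more per chip.
        WAFER_DIAMETER_MM = 300.0
        WAFER_COST_USD = 5000.0         # assumed wafer price
        DEFECT_DENSITY_PER_MM2 = 0.001  # assumed defects per mm^2

        def gross_dies(die_area_mm2: float) -> int:
            """Approximate dies that physically fit on a wafer (with an edge-loss term)."""
            wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
            return int(wafer_area / die_area_mm2
                       - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

        def good_dies(die_area_mm2: float) -> float:
            """Usable dies per wafer under a simple Poisson yield model."""
            yield_fraction = math.exp(-die_area_mm2 * DEFECT_DENSITY_PER_MM2)
            return gross_dies(die_area_mm2) * yield_fraction

        for area in (100.0, 400.0):  # a small GPU die versus a large combined die
            good = good_dies(area)
            print(f"{area:>5.0f} mm^2 die: ~{good:.0f} good dies, "
                  f"~${WAFER_COST_USD / good:.0f} per good die")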
     
  19. Dalantech

    Dalantech Notebook Consultant

    Reputations:
    149
    Messages:
    249
    Likes Received:
    0
    Trophy Points:
    30
    You kinda contradicted yourself with that last post, Ted - read it carefully because you gave two definitions of the word "yield": one being the number of cores that can be etched onto a wafer, and the other being the number of usable cores after accounting for impurities in the silicon. I understand both definitions, but the latter is normally used when discussing yield, and both come into play when discussing cost per core...

    Now pretend for just a moment that I'm a network engineer (I actually am, BTW) and go back and read my last post before this one. I tried to explain to you that yield, as defined by the number of usable cores, has gone up (and manufacturing costs have actually come down). That's why Canon, one of the few camera companies that actually make their own sensors, can sell a camera with a 36mm x 24mm piece of silicon in it (in effect a "core") for $3000 USD - a price that is less than half of what that same camera would have cost less than 2 years ago. If they were still having yield problems (a low number of usable cores due to impurities) then the price of their cameras wouldn't drop. FWIW, Canon actually claims that improved manufacturing techniques are the reason they've been able to cut costs, and they specifically cite increased yields (note that the size of the sensor hasn't changed, so their understanding of the word "yield" is the same as mine).

    I'm really not going to keep pushing my point past this post because it's not worth getting into an argument over. I will say that you seem to be stuck in the past on this issue - have you done any recent reading on LSI manufacturing? Technology marches on and improvements are made all the time. A few years ago I would have agreed with your initial post - but not today... ;)
     
  20. Dalantech

    Dalantech Notebook Consultant

    Reputations:
    149
    Messages:
    249
    Likes Received:
    0
    Trophy Points:
    30
    Sorry - one quick comment on your last statement because it's only partially correct. One benefit of reducing the die size is lowering the voltage that's necessary to forward bias the transistors in an integrated circuit. Lowering the voltage means that battery powered devices will run longer and the part will generate less heat at the same clock speed as the old chip. FWIW: the clock speed of an integrated circuit is not a frequency limit - it's a heat rating...
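    A quick sketch of that voltage point, using the classic CMOS dynamic-power rule of thumb P = C * V^2 * f. The capacitance and the before/after voltages below are assumed values for illustration, not measured figures for any real part.

        # Dynamic switching power scales roughly as P = C * V^2 * f, so dropping
        # the supply voltage cuts power (and heat) sharply at the same clock.

        def dynamic_power(capacitance_f: float, voltage_v: float, freq_hz: float) -> float:
            """Classic CMOS dynamic power estimate, in watts."""
            return capacitance_f * voltage_v ** 2 * freq_hz

        C = 1e-9      # assumed effective switched capacitance (farads)
        FREQ = 400e6  # same clock both times, ~400MHz as in the 7400Go example

        old = dynamic_power(C, 1.2, FREQ)  # assumed 90nm-era core voltage
        new = dynamic_power(C, 0.9, FREQ)  # assumed voltage after a process shrink

        print(f"P at 1.2 V: {old:.2f} W, P at 0.9 V: {new:.2f} W "
              f"({100 * (1 - new / old):.0f}% less at the same clock)")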