I've been looking at getting a laptop (my first one) and I have pretty much narrowed it down to the Asus A8Js. It was between Asus, Acer, & Compal. I like Asus's good reputation, the Core 2 Duo, and the nVidia Go7700. I also like that this laptop has its LAN & DVI ports in the back, as I find side connections cumbersome. I also prefer a DVI port over a VGA port. (Will the DVI port have HDCP?)
However, the only thing holding me back is that Windows Vista is right around the corner (and DirectX 10 with it). How long before laptops get DirectX 10 chips? I like the idea of buying a laptop shipped with Vista as the OS because all the drivers should work and I won't have to pay extra for an upgrade down the road. Is anyone else waiting for these same reasons?
-
Ok, well the first DX10 card will be the 8800.
That will not fit in a 14-inch notebook.
99% it will be released in the Dell XPS 1710 first, then a Sager 5670. Those will be the first DX10 laptops, DTRs, maybe by Christmas this year. If you're looking for a DTR and you've got 3k then maybe you should wait. I think even in that case Dell and Sager will let you buy the card separately.
So that should answer your question. If you're asking for my speculation, waiting for the 8600 will take about 6 months.
So you're waiting for next year's model at least; sometime around May it'll probably be called the A9Js or something. -
DX9 isn't dead...even with DX10 on the horizon. It will be a long time before games are DX10-only, and by then you'll be wanting a new notebook anyway. I'd say go for it now and have some fun with it!
-
Thanks, good advice from both of you. I know Vista also supports DX9 and can be upgraded to DX10 later on, but will DX10 games like Crysis also run on DX9?
-
Date totally estimated. -
From what I've heard Vista won't ship with DX10 -it will be part of a service pack later on down the road. I don't think that DX10 is going to add much in the way of "eye candy" -the biggest change is that all geometry processing is supposed to be done by the GPU under DX10, but that doesn't mean that you won't be able to play a DX10 game on DX9 hardware... -
As far as I can tell, Vista already sort of emulates DX10 on a DX9 card. In RC1, it already says I have DX10 for some odd reason.
http://www.gusto5.net/dx10.jpg -
Laptops seem to be improving a lot faster than desktops...if you want something at a reasonable price that runs Crysis, I'd guess about 1-2 years after Crysis is released.
I have a hard time believing anything like the G80 or ATI's equivalent would fit into a laptop for a while because they eat 250 watts...
Get a laptop you'll be happy with for 2-3 years, then upgrade... -
There are two possibilities that I do see:
1) Etch the GPU onto the same die as the CPU. Current GPUs run hot because the transistors in them require a relatively high voltage to turn on (forward bias), and the voltage requirements are a result of the 90nm etching process. Drop the size of the parts in the GPU down to 30nm or so and you can drop the voltage, reduce the heat, and run the GPU at the same clock speed as the CPU (see the rough power-scaling sketch after this post). You really wouldn't need a top-of-the-line GPU either; something like a 7400Go would be fast enough, because it currently runs at about 400MHz -imagine what it could do at 2GHz...
2) Etch the GPU using the same process as the CPU so you can bump up the clock speed and reduce the heat. Now take that GPU and put it into an Express Card -maybe with a small heat sink and fan to dissipate the heat. Possible since the Express Card slot has a PCI Express connection (not sure how fast it is though). This one definitely has cool factor written all over it since you could easily upgrade the video card...
Edit: PCI Express = 2.5 Gbit/sec/direction -I think that would be fast enough to use for a video card... -
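A quick way to sanity-check the die-shrink idea in 1) is the standard dynamic-power relation P ~ C x V^2 x f. The capacitance scaling, voltages, and clocks below are made-up illustrative numbers, not real 7400Go specs -just a rough sketch of the trade-off:
[code]
# Back-of-the-envelope dynamic power scaling: P ~ C * V^2 * f.
# All numbers here are illustrative assumptions, not actual GPU specs.

def relative_power(cap_scale, v_old, v_new, f_old_mhz, f_new_mhz):
    """New power as a fraction of old power under P ~ C * V^2 * f."""
    return cap_scale * (v_new / v_old) ** 2 * (f_new_mhz / f_old_mhz)

# Shrink the process (say capacitance halves) and drop the core voltage,
# keeping the clock at 400 MHz: power falls to roughly 28% of the original.
print(relative_power(0.5, 1.2, 0.9, 400, 400))   # ~0.28

# Push the same shrunken part to 2 GHz and the clock increase eats the
# savings back: roughly 1.4x the original power, so a "7400Go at 2 GHz"
# wouldn't be free heat-wise.
print(relative_power(0.5, 1.2, 0.9, 400, 2000))  # ~1.41
[/code]
So the shrink definitely buys headroom, but how much of it you can spend on clock speed depends on how far the voltage and capacitance actually come down. -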
Only problem with that idea is that Express Cards are PCI Express 1x only... maximum bandwidth tops out at 2.5Gb/s, which would choke even a midrange card these days.
Shared CPU/GPU on die, possible plans from AMD notwithstanding, is probably a non-starter as well... at least with current die sizes. Compare transistor counts on current GPUs versus current CPUs and you'll see what I mean. You could always increase the die size to fit both on, but that'll decrease yield and increase cost... besides, who wants yet another socket format? -
Power consumption and heat dissipation are two of the key issues when trying to decide what GPU to put in a laptop (price being the third), so no matter what happens the etching process for GPUs needs to get smaller... -
Have there been any Vista driver issues with the A8J models?
-
You fit fewer large cores onto a wafer of a given size. Producing that wafer has a fixed (and very expensive) price. Larger cores are also more prone to faults due to impurities in the wafer. So larger cores will ALWAYS cost more than smaller ones.
This is why both nVidia and ATI release GPU cores on smaller die sizes that perform almost exactly the same as the chips they're replacing. Higher yields per wafer mean reduced cost, which allows them to decrease price and remain competitive with each other. -
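To put rough numbers on that die-size argument, here's a sketch using the simple Poisson yield model (yield ~ e^(-area x defect density)). The wafer cost, defect density, and die areas are made-up illustrative figures, not actual nVidia or ATI numbers:
[code]
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Gross dies per wafer: area ratio minus a simple edge-loss correction."""
    r = wafer_diameter_mm / 2
    return (math.pi * r * r / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_mm2):
    """Fraction of dies that come out defect-free under a Poisson defect model."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

def cost_per_good_die(wafer_cost, wafer_diameter_mm, die_area_mm2, defects_per_mm2):
    good = (dies_per_wafer(wafer_diameter_mm, die_area_mm2)
            * poisson_yield(die_area_mm2, defects_per_mm2))
    return wafer_cost / good

# Illustrative numbers: $5000 per 300 mm wafer, 0.002 defects per mm^2.
for area in (200, 400):  # mm^2: a "small" die vs. one twice as big
    print(area, round(cost_per_good_die(5000, 300, area, 0.002), 2))
# ~$24 vs ~$78: the bigger die pays twice over -fewer candidates fit on the
# wafer AND a smaller fraction of them come out defect-free.
[/code]
Both effects show up in the cost: how many dies fit on the wafer, and what fraction of them actually work. -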
You kinda contradicted yourself with that last post, Ted -read it carefully, because you gave two definitions of the word "yield": one being the number of cores that can be etched onto a wafer, the other the number of usable cores after accounting for impurities in the silicon. I understand both definitions, but the latter is normally what's meant when discussing yield, and both come into play when discussing cost per core...
Now pretend for just a moment that I'm a network engineer (I actually am, BTW) and go back and read my last post before this one. I tried to explain to you that yield, as defined by the number of usable cores, has gone up (and manufacturing costs have actually come down). That's why Canon, one of the few camera companies that actually make their own sensors, can sell a camera with a 36mm x 24mm piece of silicon in it (in effect a "core") for $3000 USD -a price that is less than half of what that same camera would have cost less than 2 years ago. If they were still having yield problems (a low number of usable cores due to impurities) then the price of their cameras wouldn't drop. FWIW, Canon actually claims that improved manufacturing techniques are the reason they've been able to cut costs, and they specifically cite increased yields (note that the size of the sensor hasn't changed, so their understanding of the word "yield" is the same as mine).
I'm really not gonna continue pushing my point past this post because it's not worth getting into an argument over. I will say that you seem to be stuck in the past on this issue -have you done any recent reading on LSI manufacturing? Technology marches on and improvements are made all the time. A few years ago I would have agreed with your initial post -but not today... -