http://www.guru3d.com/newsitem.php?id=4545
Title and link says it all. Yay for laptop gamers!
-
Notebook Solutions Company Representative NBR Reviewer
Great post, thanks. But I think I am going to skip Santa Rosa for these reasons:
- I know the dual cores are not optimized yet, but the quad cores are coming. Intel has planned to release its quad cores around Q2-Q3 2007. If these are native quad cores (not 2x dual cores), then I will buy that notebook.
- DX 10 is not out yet, Vista demands too much from gamers, and there are not many games featuring DX 10. So I am going to wait.
- An 8800GTX in a notebook? Hmm, what about mobility and power consumption? I think I will skip the 8-series and wait for the 9-series.
Charlie -
Yeah, the 8800GTX is going to be a power guzzler. I wonder what kind of battery life you're going to get out of it. 1 hour tops? I think I will also skip the 8-series and go all out on the 9-series; knowing Nvidia, the 9-series should come out in the near future.
-
You're gonna get a special version of the G80. I've been saying this all along.
-
Um, I don't think it's confirmed yet.
From the article:
"It is likely that the mobile version of the GeForce 8800 – if there is any planned – will feature typical 256-bit memory bus and lower amount of execution units, e.g., 64 stream processors, 16 texture units, 16 raster operation units and so on. "
Also, it's very speculative:
"Mr. Huang did not specify whether his company was sampling a special version of GeForce 8800 graphics processor for mobile computers, or was testing another GeForce 8-series architecture-based graphics chip specifically designed for laptops."
This just shows that it's not really going to be a G80, but probably a crippled version or one done on a smaller process.
*Edited by moderator* -
I'd say that falls in line with what most here were thinking. Of course they're going to be working on mobile versions of their DX 10 GPUs. They want to be able to sell them just as badly as many of us would want to buy them, if not more. We just don't know when they will be ready for release and how powerful they will be. It seems Nvidia is not ready to tip their hand just yet, or is not far enough along to give any hints yet.
-
Yup, I believe they'll produce something based on the G80 concept, not the 8800 itself. But there might be some tricks to it: if they could make the board smaller, they would just need voltage stepping to lower the 8800's voltage when it's not plugged into the power cord, and restore full power when it is. But most probably the laptop would explode into a huge fireball within hours... hehehe
-
At most, laptops will get a 7950+ with DX10 functions... I've said this for a while...
-
Pretty interesting article, check it out:
http://www.xbitlabs.com/news/video/display/20061113141458.html -
Merging this thread with the other one like it (3d Guru's article is based on this xbitlabs one)...
-
I think that the G80, though crippled, will at least have much more power than the 7950.
I think this because if they can get the G80 to operate at the same power draw and heat output as the 7950, with the same number of shaders (if not more), then the unified shaders will perform better simply by using their processing power more effectively (i.e. when the 3D app needs more pixel shading power than vertex shading, more of the unified shaders can be assigned to pixel shading instead of vertex shading).
Hence my thinking that the G80, even when severely crippled, will still perform better than a 7950GTX, and it has DX10. -
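The load-balancing argument in the post above can be turned into a toy calculation. This is purely illustrative Python (the unit counts and workload numbers are made up, and real GPU scheduling is far more complex), but it shows why unified shaders waste less capacity on a lopsided workload:

```python
# Toy comparison of fixed-function vs unified shaders (illustrative only).

def fixed_throughput(pixel_units, vertex_units, pixel_work, vertex_work):
    # Each pool can only do its own job type, so the busier pool becomes
    # the bottleneck while the other pool sits partly idle.
    frames_by_pixel = pixel_units / pixel_work
    frames_by_vertex = vertex_units / vertex_work
    return min(frames_by_pixel, frames_by_vertex)

def unified_throughput(total_units, pixel_work, vertex_work):
    # Unified shaders can be assigned to whichever job needs them,
    # so all units stay busy regardless of the pixel/vertex mix.
    return total_units / (pixel_work + vertex_work)

# A pixel-heavy scene: 9 units of pixel work per frame, 1 of vertex work.
fixed = fixed_throughput(pixel_units=12, vertex_units=12, pixel_work=9, vertex_work=1)
unified = unified_throughput(total_units=24, pixel_work=9, vertex_work=1)
print(fixed)    # the 12 pixel units bottleneck the frame
print(unified)  # 2.4 -- same 24 units total, better utilization
```

Same hardware budget, but the unified design nearly doubles effective throughput on this skewed workload, which is the crux of the "crippled G80 still beats a 7950" argument.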
I've noticed a lot of people pondering whether to:
a) Buy a laptop now with a DirectX 9 graphics chip in it and hope it won't become obsolete too quickly (Windows Vista will ship soon with DirectX 10)
b) Wait for DirectX 10 laptops to become available
No-one seems to have much solid information about when DirectX 10 laptops might become available, making the choice even harder.
So here is a little news snippet to get the ball rolling:
http://www.xbitlabs.com/news/video/display/20061113141458.html
Is there any other information like this out there that can help people like me make the 'wait' or 'buy now' decision? -
Hmm... I thought we had this thread some time ago. And we had some agreement that the notebook DX10 GPU ain't gonna be an 8800.
-
Meaker@Sager Company Representative
Well, it could come out with the midrange solutions, but the power draw is not actually that much higher than last time, so I reckon they could fit an 8800GTS in easily.
-
Thanks for posting the link, but we've actually had a couple of threads about this article already.
Merging... -
*Shrug* the 8800 gives a lot better performance/watt than the 7xxx series. I don't see how it's going to be such a big problem in notebooks. Basically, they could disable half the chip, and they'd have something that performed like a 7900, but used less power.
As Meaker said, the 8800 doesn't consume that much more power than 7900 cards (and only a few percent more than X1900 series cards).
Of course, they're not going to release an unmodified G80 for notebooks. When have they ever done that? The notebook version of the 7900 isn't a G70 either, and the NV40 wasn't available on notebooks either, that was NV41 or 43 or one of those.
So G81/82/83 or one of those is almost certainly going to be a notebook-optimized version of the G80.
And it could end up doing pretty well power consumption- and performance-wise. -
What are quad cores? Please tell me.
-
Quad = 4.
So it's a chip with 4 cores, just like current dual-core CPUs have two cores. -
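As an aside, software can see those cores directly. A quick check in Python (works on any OS):

```python
import os

# Number of logical CPU cores the operating system reports.
# A native quad-core chip shows up as 4 (or more if the cores
# expose multiple hardware threads each).
cores = os.cpu_count()
print(f"This machine reports {cores} logical cores")
```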
How much do you guys think a laptop with a DX10-capable GPU will cost in the first couple of months? How long will it take to drop below $2000?
-
Not much more than DX9 cards, huh?
http://www.gamepc.com/labs/view_content.asp?id=8800&page=4&cookie_test=1
They draw more than 7950s in SLI! Even the 8800GTS.
While a desktop 7900 GS is 120 watts, the laptop card is 30 watts and is very neutered. You're gonna get neutered cards. -
ltcommander_data Notebook Deity
Some of the things the XBit Labs article says are weird. They kind of imply that Santa Rosa is Centrino Pro, while Centrino Pro is really a version of Santa Rosa with AMT enabled. They also say that Santa Rosa is coming in early Q1, which is optimistic since the original timeline called for early Q2. An early Q1 launch for Santa Rosa is unlikely given the scheduling of new Napa-based LV and ULV chips in early Q1. New LV and ULV chips will also be launching with Santa Rosa, so Intel will need to space out the introductions sufficiently to at least look like they are reducing overlap. A late Q1 launch may be possible, though.
-
What they're saying is that they have begun work on the 8800GTX and GTS for notebooks, but haven't said yet whether they will be "complete" versions of their desktop counterparts. And I suppose guru3d.com doesn't think that is possible yet.
Just wait until 2008, when the 8800GTS will be like an integrated notebook GPU, or once ATI kicks in, the price will drop. -
I've heard reports that July of next year is when we "may" see some DX 10 mobile cards. I wrote more about it here.
-
Wonderful, now we're gonna need 20-cell batteries for these monsters... just for 1 hr and 30 min of game time unhooked. :/ And weight? What about weight? I wouldn't be surprised by 9-13 lbs for this monster and other goodies.
-
You mean for GPUs with similar power requirements as ATI's X1k cards?
Let's not make it worse than it is. They only use a few percent more power than X1900 cards, and are a hell of a lot faster. So with that kind of performance/watt, it could actually make for some nice mobile GPUs. -
2008? Then I might as well wait for AMD Fusion with Crossfire.... (kidding)
-
I wonder at what point it will simply not be possible to cram more graphics power into a video card. I mean, sooner or later something's gotta give, right? Either the heat will be too much, or the size, or the computer will no longer be able to utilize all that power effectively.
-
Why? The transistors are getting smaller and smaller, meaning it takes less and less space to do the same work (which eliminates size as a problem).
Heat can be tricky, but can be handled with care (the GF7 produced less heat than the GF6. The GF8 produces only a bit more than ATI's X1k cards, but is twice as powerful).
Performance/watt is going up, which means we can get more and more performance for the same heat output.
And the computer has no problem utilizing the power. GPUs may be powerful, but they have a tough job too. If you run a game at 1600x1200 at 60 Hz, you end up rendering about 115 million pixels to the screen per second. In fact you have to render a hell of a lot more than that, because some fragments are overdrawn several times before the final color to output to the screen pixel is known. That doesn't leave a lot of processing time for each fragment. Even with the most powerful GPU, it's frighteningly easy to make something that's slooooow and choppy. -
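The fill-rate arithmetic in the post above is easy to check with a back-of-the-envelope script. The overdraw factor and the GPU clock below are assumed round numbers for illustration, not measurements:

```python
# Back-of-the-envelope pixel throughput at 1600x1200, 60 frames per second.
width, height, fps = 1600, 1200, 60

pixels_per_second = width * height * fps
print(pixels_per_second)  # 115200000, i.e. roughly 115 million pixels/s

# Overdraw: each screen pixel may be shaded several times before the final
# color is known. The factor of 3 here is an assumed average; it varies
# wildly per scene.
overdraw = 3
fragments_per_second = pixels_per_second * overdraw
print(fragments_per_second)  # 345600000 fragments/s

# Time budget per fragment on a hypothetical 500 MHz GPU clock,
# ignoring parallel pipelines:
clock_hz = 500_000_000
cycles_per_fragment = clock_hz / fragments_per_second
print(round(cycles_per_fragment, 2))  # about 1.45 cycles per fragment
```

A budget of a cycle or two per fragment (before accounting for parallel pipelines) is exactly why even a powerful GPU can be brought to its knees by a heavy shader.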
To add to what Jalf just said, keep in mind that technology is also becoming more efficient. As long as the consumer wants more power and is willing to pay for it, there will be someone willing to develop it.
DX10 for notebooks
Discussion in 'Gaming (Software and Graphics Cards)' started by metalarmored, Nov 14, 2006.