After looking at the specifications of the upcoming 8800 GTX for the desktop market a few days ago, I was wondering how nVidia is going to pull this off in the notebook market.
I know that G80 has yet to be released, even onto the desktop market, but I would like to discuss a few things with you.
Firstly, I'm wondering how exactly nVidia is going to be able to do all this. Packing all this power into a card is going to be very difficult at best, not to mention the heat constraints that these cards have. The current 7 series has been seriously reduced in terms of heat, but still has very fast clock speeds. Has nVidia done this with modifications to the core, to the memory, or maybe to the card itself? Maybe they can use the same approach to refine the 8 series for the notebook market.
And also, what about power requirements? Early figures say that the desktop card will consume up to 200 watts. I'm sure nVidia will reduce that significantly for the notebook version, but reducing it to 60-70 watts is very hard. Anyone have any ideas as to how they may cut the power requirement?
-
Well, for a start, Nvidia is still using a 90nm process for the G80. A die shrink to 80nm would help alleviate some of the temperature and heat issues, albeit not by much. At its current spec, the G80 has been rumoured to be too complex for an 80nm build, so it would seem it's out of necessity that it's being built this way.
And the 8800GTX specs (if true) suggest a 384-bit memory interface, which will take up quite a bit of space on the core and add a few £££ to the price. Dropping it down to 256-bit would make for a smaller, cheaper GPU.
And let's not forget the 128 ALUs it's rumoured to have, an absolutely massive amount. I don't see how they will get this onto a notebook GPU, so my guess is, they won't! It'll be cut down somewhat.
I can't see Nvidia or ATI having mobility DX10 GPUs out until about Easter next year (ATI possibly later than that). When we see the first mid range GPUs based on DX10, I think that'll give us an idea of what we can expect in a laptop part. Then we'll probably see some real high end laptop GPUs later in the year.
(note, this is my opinion only!) -
Would help even more if they turn 65nm.
-
I think we'll see Nvidia and ATI go 65nm by Q3 next year, ready for refresh parts and Xmas rush.
-
Suppose nVidia, or ATi for that matter, do manage to cram a DX10 part into a notebook form factor, do you think the performance gain will be a lot better than the current 7950 GTX?
Also, about the 384-bit bus: I don't see why Nvidia/ATi would go for such a thing. As sionyboy has mentioned, a 256-bit bus would be much better suited. Will a 384-bit bus result in better performance, or is it maybe there to accommodate GDDR4 memory? -
I think the performance gain would have to be more to encourage people to upgrade. There is no point releasing a high-end part that is slower than your previous one. The fact that it has DX10 support won't wash with people, as there are simply not enough games coming out that would warrant spending the money.
The 384-bit bus... it was rumoured that there would be a regular 256-bit bus linked to 512MB, then a separate 128-bit bus linked to 256MB for improved AA/AF settings. It does not look like this is the case now; it seems that it is a half step up to a 512-bit memory interface. 512-bit would be expensive to implement on a desktop chip, so Nvidia have taken a half step towards it with 384-bit, hence the 768MB memory arrangement (12 × 32-bit memory controllers to 12 × 64MB memory chips). It's nothing to do with GDDR4; ATI have been using that for a while now. -
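The arithmetic behind that "half step" is simple enough to sketch. This is just the rumoured controller/chip layout from this thread, not confirmed numbers:

```python
# Rumoured G80 memory arrangement: 12 independent 32-bit memory
# controllers, each feeding a 64MB memory chip. The total bus width
# and frame buffer size fall straight out of the controller count.

controllers = 12
controller_width_bits = 32
chip_size_mb = 64

bus_width = controllers * controller_width_bits   # total memory interface width
total_memory_mb = controllers * chip_size_mb      # total frame buffer

print(bus_width)        # 384, the half step between 256-bit and 512-bit
print(total_memory_mb)  # 768, hence the odd 768MB memory size
```

Which also shows why a cut-down notebook part would likely drop whole controllers: 8 controllers gives the familiar 256-bit/512MB arrangement.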
Meaker@Sager Company Representative
Lo sionyboy, another of the FM crew
Anyway, I can only see a whacking great cooler, some serious magic (à la 6800U), or a clocked-down/cut-down version for the notebooks.
-
Hey Meaker!
Oh yeah, this might interest some. I'll go post it in Desktop Hardware though, as that's where it belongs
ASUS EN8800GTX/HTDP/768M;
Graphics GPU Feature
- NVIDIA GeForce 8800GTX
- Built for Microsoft Windows Vista
- NVIDIA SLI Technology ready
- NVIDIA unified architecture with GigaThread technology
- Full support for Microsoft DirectX 10.0 and Shader Model 4.0 enables stunning and complex special effects
- OpenGL 2.0 support
- NVIDIA Quantum Effects Technology
- True 128-bit floating point high dynamic-range (HDR) lighting
- Two dual-link DVI outputs support two 2560x1600 resolution displays
Hardware Specification
Model EN8800GTX/HTDP/768M
Graphics Engine GeForce 8800GTX
Video Memory 768MB DDR3
Engine Clock 575MHz
Memory Clock 1.8GHz (900MHz DDR3)
Memory Interface 384-bit
Max. Resolution Up to 2560 x 1600
Bus Standard PCI Express X16
VGA Output YES, via DVI to VGA Adapter
HDTV Output YES, via HDTV Out cable
TV Output YES, via S-Video to Composite
DVI Output DVI-I
Dual DVI Output YES
HDCP Compliant YES
Adaptor/Cable Bundled DVI to VGA adapter
Power Cable*2
HDTV-out cable
Software Bundled 3D Game: Ghost Recon, GTI Racing
3Dmark06
ASUS Utilities & Driver
EN8800GTX/HTDP/768M: USD $540 (FOB).
EN8800GTS/HTDP/640M: USD $410 (FOB).
http://bbs.mychat.to/read.php?tid=578438 -
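For what it's worth, the spec sheet's 384-bit interface and 1.8GHz effective memory clock imply the card's peak memory bandwidth. A quick back-of-the-envelope check:

```python
# Peak theoretical memory bandwidth from the leaked spec sheet:
# 384-bit interface, 900MHz DDR (1.8GHz effective transfer rate).
bus_width_bits = 384
effective_clock_mhz = 1800

bytes_per_transfer = bus_width_bits / 8                     # 48 bytes moved per transfer
bandwidth_gb_s = bytes_per_transfer * effective_clock_mhz / 1000

print(bandwidth_gb_s)  # 86.4 GB/s
```

That 86.4 GB/s is a big jump over a 256-bit card at the same clock (57.6 GB/s), which is presumably the point of the wider bus.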
I'm waiting for the mid-year refresh so I can go with an X2 version of the new cards, quad-core if available, and try to stuff it in an SFF case with H2O cooling. May the Force be with me (because most deities are laughing uncontrollably).
-
It's probably going to take a year until a good DX10 card hits the notebook market. Notebooks are often a year behind.
-
maybe a midrange DX10 card would be on par with the 7950
-
this has been discussed in the Sager section
Pretty much, the G80 is currently requiring about 300 watts, which most notebooks don't come close to. My Clevo D900K (13-pound notebook) peaks at 220-250 watts.
But one thing is for sure: if the G80 is going to go mobile, it's going to be Clevo that designs the first notebook to throw it in.
If you don't know yet, Clevo is the boutique designer and manufacturer of high-end notebooks for Alienware, Sager, VooDoo PC, etc. They were the first to do a lot of things in notebooks: first to use SLI, first dual-core notebook, first to support internal dual hard drives (with RAID) and dual optical drives at once, etc.
Only time will tell. -
They will probably make a slightly slower version of the G80 available for notebooks (obviously tweaked and undervolted like crazy, but it could be done for DTRs).
For the rest of us, we'll probably have to wait a bit. The G80 chip itself is probably a lost cause, but presumably they're going to make a G81, G82, whatever, as smaller, cheaper and more efficient versions with fewer shaders and a smaller memory interface (256-bit or less). And these could easily work well in a notebook.
Probably going to be a couple of months though... -
Notebook Solutions Company Representative NBR Reviewer
Gophn, the reason you use 200 watts is because you have a desktop processor, 2 optical drives, etc.
Centrinos score as well as AMD's X2 but have a much lower TDP. The Core 2 Duo in Santa Rosa is going to use 27 watts (not sure, but it was around 30). But don't forget, ATI has a new technology used in the X1700 which gives a lower TDP and still good performance. I think ATI is going to win this battle; nVidia just draws too much power, and consumers don't like that.
Charlie -
But compare a 7700 against the X1700, and I'll take the 7700 any time.
ATI will WIN the crown after they settle the merger with AMD, which is maybe a year away or sooner. Nvidia won't be going anywhere then unless they come to a joint venture with Intel. -
Meaker@Sager Company Representative
The 7700 sits between the 7600GT and 7600, so it's nothing special; the X1700 is like the X1600 but with Avivo and GDDR4 support. So I don't see the positions in my chart changing.
Why would Nvidia disappear? You think Intel is going to be able to match Nvidia in their market? It would take a HUGE investment by Intel to match, graphics-wise, what Nvidia is pumping out. -
No, I never said Nvidia is going to disappear, but there'll be a day when neither of them can catch up with AMD + ATI. Intel will never match Nvidia in the GPU department, but AMD/ATI will, and will even surpass it by such a great amount that Intel and Nvidia are forced to work together... probably in five years' time.
-
Meaker@Sager Company Representative
Hmz, no. But I'm not going into it now.
-
Well, there are rumours NVidia is going to start making CPUs as well. They recently acquired a team of highly skilled ex-Intel CPU engineers.
Anyway, ATI doesn't have a "new" technology for reducing power consumption. They have the same tricks that have been known for ages, and that have been used more or less thoroughly on GPUs for years.
According to most rumours, the R600 is going to consume *at least* as much power as the G80, so don't get your hopes up for that. (Also, the GF7 series used less power than both the GF6 and Radeon X1K series, so while NVidia might "use too much power", I highly doubt ATI is going to do any better.) -
ATI used to be great on power consumption, but since they made the jump from R300-based cores, their power consumption, heat output and die size have skyrocketed. R600 is supposed to be no better; I was told today that the R600 cooler will be bigger than the actual PCB.
And I don't think Nvidia or Intel need to merge to compete with AMD/ATI. Nvidia is doing just fine on its own; its profits and income are above ATI's. Intel controls the majority of the CPU and graphics markets (albeit with integrated graphics) and has just done a deal with Imagination Technologies (PowerVR) to develop mobile chips. -
I think that for the time being Intel and nVidia won't need to merge, because ATi and AMD haven't really developed much that is going to challenge either company's current products. I don't really think that the product that is supposed to compete against Intel's Centrino (I've forgotten the name) is really going to pose a problem either.
However, after a while, maybe a year or more, I have no idea, AMD and ATi will certainly develop something that will be fast. That may be due to factors such as CPU-to-GPU communication. Yes, the chipset will also need to be rethought, but ATi can take care of that. -
I doubt Intel and NVidia would ever merge. First, they're both too big; they'd get into trouble if they tried it. And second, they're both too proud to let another company gain control of them (and equally reluctant to merge with companies they can't control).
I think the recent rumours that NVidia might start making CPUs are more likely. If NVidia is going to merge with a CPU manufacturer, it's not going to be Intel. Via, or one of the other small manufacturers, might be more likely.
But yeah, in the short term, NVidia doesn't need to merge with anyone. They've been handed the market for 3rd-party GPUs (especially the Intel part of it, now that ATI has been bought by AMD). -
How long do you think until we see the new DX10 cards, like the new 8800, in our laptops? Worth waiting until these are out on the market, or may that be too far in the future?
-
I'd say that would be quite a while; the desktop version of the 8800 isn't out until a bit after Vista, and then they've got to recreate it for notebooks, so I'd say quite a way in the future!
-
The desktop 8800GTX is out in November. I would imagine they would launch the DX10 laptop parts around the same time as they release the midrange cards, say March time. ATi possibly a bit later, as their DX10 part is not out until Q1 next year.
-
So, would you say buy a 7950 now, wait ~1 year, and then upgrade to a DX10 system?
-
That sorta depends on a few small factors such as "how long can I wait", "how much am I willing to spend", and just "what am I looking for in a notebook".
Personally, I wouldn't buy either gpu for a notebook, but that's just me. I like my notebook portable, which means more than 5 minutes of battery life, and weighing less than I do.
Your mileage may vary, in which case you're going to have to tell us a bit more information. -
If you can afford it, sure! There will not be any DX10-exclusive games for a long time yet; it's not like you won't be able to play new releases within the next year with that card. You'll have to play at DX9 level for any titles that have DX10 effects (Crysis, for example).
Some titles are Vista exclusive, such as Alan Wake and Halo 2, but providing you have Vista then you can play with a DX9 card. -
I'm using a Dell Inspiron 9300 with a GeForce 6800 (256MB), 2GB RAM and a 1.86GHz Pentium M atm. I'm trying to figure out if the current Dell XPS is the huge upgrade I want, or if I'm better off waiting 2-3 months and then hopefully buying a nice DX10 card =/
-
*Shrug* you can always get something better if you wait 2-3 months.
If you're not in a hurry, then wait. If you don't mind spending twice the money, buy a fast GPU now, and then a DX10 one in a few months.
And if you need a good system now, buy one now.
If you want to know exactly how a 6800 and a 7950 compare, look up a review/benchmark; there's plenty of information about that. It depends on how much of a performance boost you're looking for. Some people think 20% extra is "huge", and easily worth spending another $2000 on. Others want 100% or more before it's worth upgrading.
In short, don't ask *us* whether *you* want to upgrade. You're the one who knows how much of an improvement you'd need to justify the cost, how much you're willing to spend, when you're going to upgrade next, and everything else that makes it possible to make a decision.
All we can do is tell you factual things like what each card is capable of, or how many games are going to take advantage of a DX10 card. But none of us can tell you whether you are "better off waiting". -
I say wait. The 6800 is a pretty strong card still, and you should get about a year or so of good performance out of it (more than that if you don't mind turning settings down to medium or lower). I personally see it taking a little while for DX10 cards to come to notebooks, at least ones that aren't severely underclocked. Give it about 6-10 months and then consider upgrading, IMHO.
-
However, I'm wondering why some people have the insolence to speak for all of us. If you don't think you're able to say anything helpful, then just don't, especially not in that way.
-
Charles P. Jefferies Lead Moderator Super Moderator
-
That said, not everyone is as touchy as you. I didn't mean any offense (and I don't know if any was taken by the OP. You obviously took great offense, but since the remark wasn't directed at you, I couldn't care less).
But hey, I'm glad that you feel able to tell people whether they're better off waiting, without knowing what they want to wait for, what their needs are or anything else. But I'm still calling BS on that. Because you can't.
And I see you've taken it upon yourself to decide that posts asking for more detail before giving advice "are not helpful, and should not be posted".
I'm sorry to disagree with you once again, but I think those posts serve a rather important purpose. They serve to get more information out of the OP, so that we can offer more relevant suggestions. In other words, I think they *should* be posted.
So, to the OP, I'm sorry if I offended you. If you reread my post, you'll hopefully see that yes, it might be direct and to the point, but all I was actually saying was simply that it's impossible to give you any useful advice unless you tell us what your criteria are. If you're one of the people who always want the latest and greatest, and don't mind spending $2000 every other month, my suggestion about whether or not to upgrade is going to be a lot different from if you want a notebook that's going to support and play games made in 5 years time.
Yes, each of us can offer our own advice, but if it's based on our preferences, its usefulness is limited. If we know *your* preferences, it becomes a lot more useful.
I suppose I could have worded my post better, but I hope you didn't take offense. None was meant, in any case.
G80 in Notebooks
Discussion in 'Gaming (Software and Graphics Cards)' started by Zero, Oct 23, 2006.