Can anyone confirm that nVidia will release a DX10 GPU on the 7xxx line for notebooks? (And if so, what's the timetable looking like?)
-
-
nVidia's 7-series does not support DX10; I think you mean the 8-series. I think those video cards will be released when Santa Rosa comes out, so in a few months.
Charlie -
-
I was hearing rumbles that the 8-series will not be usable in notebooks until H2 due to heat issues... but I assume, based on your (Charlie's) statement, that this is not so and they will be out with Santa Rosa?
-
I will give you 90-10 odds that the mobile 8-series will be out in February.
It will predate Santa Rosa. What gets released might disappoint consumers, but the cards are already ready to go; it's just a manufacturing thing.
You see, there's a short period of blindness they can take advantage of between the release of the OS and the release of the games.
They can put out anything, call it DX10 compatible, and you can't tell them otherwise. -
Well, they are making 65nm chips instead of 90nm for the desktop ones.
However, it could take a long time until the mobile ones come out,
like sometime between late 2007 and 2009,
but I can't guarantee that, though. -
-
I'm willing to bet that there will *not* be any 7-series derivative with DX10 support. Ever. The changes required would be huge. And they already happen to have a DX10 architecture, so why would they spend countless millions on that project?
I also highly doubt there will be any major problems fitting the 8-series architecture into a notebook. Of course, it'll have to be scaled down, but that's nothing new. On the other hand, it offers better performance/watt than any other GPU on the market, so they can actually scale it down and get better performance at the same wattage compared to existing chips.
Whether they can do this within the February timeframe is questionable though... Guess we'll see.
They haven't announced anything about mobile versions of the chip though. -
I give it a 90% chance that there will be an 8400 Go chip released in February.
This is just because I've seen the specs of the Dell e1706. It's a non-Santa Rosa 17-inch with the 8400 Go.
It's supposed to be out in February, the first Dell to ship with Vista.
It might not be; I'm definitely not in touch with Dell's exact dates. But I would bet 9 dollars to make 1.
There is already the Asus A8Jr with the ATI X2300 going to ship in Europe on Feb 1 with Vista Premium.
The nVidia one should come out at the same time. It would be weird if it didn't, but computers are weird all the time, so...
There's nothing else they are working on with the 8400 Go, I can at least tell you that; it's just a matter of making enough for the Dell e1706
And selling the introduction to the highest bidder. -
When I'm talking about the 8400 and the ATI 2300, they are DX10 compatible, but there's no DX10 software. The first DX10 games are not going to run very well on the ATI 2300 and the nVidia 8400.
That's why they want to introduce them first.
That's what I was referring to earlier as a blindness period. The chips will come out, but no one will have any idea what they do; there's no software.
The software itself still has to wait for the gamer chips, because a game like Crysis can't come out to a market of just desktop owners with an 8800 GTX.
My best estimate is that the ATI X2300 won't even meet its minimum specs.
So it's not false advertising; it's just a GPU that will come out and not actually do what some people want it to do. -
So what you're saying is that there may be a supposed DX10 card out soon, but it won't really work to the extent that most consumers will expect... in other words, the DX10 software won't really work well with these first supposed DX10 cards?
-
Yeah, I think I said that.
The only DX10 software I know of is Crysis and Unreal Tournament 07, so I'm dubious about what purpose an ATI X2300 serves. -
Buying an X2300 or 8400 (if these are the names they're going to use) will give you the cheapest possible DX10 card. With the lowest possible performance. Just like getting a 7200 gives you the lowest performance of any GeForce 7-series card. -
The release names are ATI X2300 and G84.
The main difference between a low-end DX10 GPU and a low-end DX9 GPU is that the only DX10 software coming out anytime soon is high-end software: not SimCity, but Unreal Tournament and Crysis.
So running these games on a low-end card is not that valuable.
Life is all about contradictions, but if you're rushing to get the GPU that's going to run special lighting effects in the next SimCity... when there's no SimCity coming out for at least 6 months... you're in sort of a contradiction. -
Anyway, keep in mind that these cards will still be faster than low-end DX9 cards. And you don't know how low the upcoming DX10 games will scale. A number of the features supported by DX10 can be used to improve performance rather than provide more eye candy. In other words, simply supporting DX10 might boost performance a fair bit.
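As one concrete example of that (just a hypothetical C++ sketch against the DirectX SDK; the struct, function names, and numbers here are mine, not from any real game): under DX9 an engine pushes shader constants a few registers at a time, while DX10 groups them into constant buffers that are updated and bound in a single call, which cuts CPU overhead without changing the picture at all.

#include <d3d10.h>
#pragma comment(lib, "d3d10.lib")

// Hypothetical per-frame constants; the layout is made up (total size must be a multiple of 16 bytes).
struct PerFrameConstants
{
    float viewProj[16];
    float lightDir[4];
};

// Create the constant buffer once; assumes a valid ID3D10Device* from device creation.
ID3D10Buffer* CreatePerFrameCB(ID3D10Device* device)
{
    D3D10_BUFFER_DESC desc = {};
    desc.ByteWidth = sizeof(PerFrameConstants);
    desc.Usage = D3D10_USAGE_DEFAULT;
    desc.BindFlags = D3D10_BIND_CONSTANT_BUFFER;

    ID3D10Buffer* cb = NULL;
    device->CreateBuffer(&desc, NULL, &cb);
    return cb;
}

// Each frame: one update and one bind, instead of many per-register constant calls.
void UpdateAndBind(ID3D10Device* device, ID3D10Buffer* cb, const PerFrameConstants& data)
{
    device->UpdateSubresource(cb, 0, NULL, &data, 0, 0);
    device->VSSetConstantBuffers(0, 1, &cb);
}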
But of course, for anyone who wants to be able to play with decent graphics, these probably aren't the most obvious cards to buy... -
Alright so...theoretically, what's faster: nVidia 7800 or 8200? Or perhaps I should ask with a real example: 6800 or 7200?
-
The nVidia 7800 would be faster, going by history. And by that logic, an 8600 would be on par with the 7800. Pure guesses... But you would never know, since DX10 is more about more shaders and such rather than just transfer rate... not a pro in this
-
SavantEdge: As lunateck said, historically the N800 has always been faster than the (N+1)200. And I'd expect the same to be the case this generation.
But this time around, there's the additional difference of a major performance leap. The 8200 (or whatever) will be able to do things the 7800 can't. And some of these things allow them to compensate for *some* of the performance difference. A 7800 will still be faster, yes, but the hypothetical 8200 would perform better in DX10-supporting games than in DX9 ones.
Then again, all of this is pure guesswork since none of these cards have been announced yet, and we don't know the specs or anything, and there are no DX10 games either, no Vista drivers, and so on. -
I'm curious to see how this all plays out. And as Jalf has pointed out--they have yet to be released officially.
-
I understand that the 8400 & 8200 would technically be considered the low end of DX10, but let's consider something.
For the 8800 (and I assume the whole G80 line), nVidia came out with a brand new architecture, as well as giving the 8800 600+ million transistors, which is more than any video card or CPU in existence (that I know of).
So, even when you consider scaling down for the 8400 and 8200, isn't it possible that they'd be equal to, or more likely greater than, the old 7800/7900 GPUs? -
The real G80 core, with all the benefits and the raw power, will come out much later, as nVidia has a much bigger job on their hands to reduce the power consumption to a reasonable level while also keeping performance high and efficient. This is why you will see slower models coming out first, from both ATI and nVidia. -
So no hope of anything at all close to a 7900 within a few months?... That sucks.
-
The same is true for DX10. Games will take quite a while to come out, at least until the market is ready for them. So, if you need something with Go 7900 GS power now, there is no reason not to buy it now. -
lol, now I'm feeling disappointed... I'm on nVidia's site right now looking at the tech specs of the cards, and the 7400 looks like it sucks...
I guess what I need to ask is how well the 8200/8400 cards are going to be able to handle last year's games. Will I be able to do most of them at relatively high settings? I'm sure older games should be okay, but I'm getting a WUXGA screen, so will 1920x1200 resolution work? Otherwise, games at lower resolution had better not look bad because of upscaling.
That leads me to my next question: what about high-definition video? I'm sure it'll have all of the PureVideo features in it, but will playback be perfectly smooth? If Dell puts Blu-ray on the Inspirons in a few months, I'm getting BR with it. -
Hey folks,
So, from what I gather, no one has any exact date for when the mid-range (X2600, 8600) GPUs are coming out. But I am starting university in the fall, and am wondering if I should hold off until the very last possible moment to get a laptop. (I'm thinking something similarly spec'd to the G1, plus the mid-range DX10-capable card.) Is this a feasible option? Will the new lines of DX10 cards be out by June/July of this year? -
Bottom line: the best you can hope for right now is speculation when it comes to the performance of future GPUs. I'd say use Jalf's assessment of 7600 = 6800 and 6800 * 2 = 7800 as a guide.
I would also wait until their prices come down some. If I wanted a Blu-ray drive, I'd get a PS3 for $600 (cheaper than the Blu-ray drive upgrade on a notebook).
-
Well, I hope they are released on time. Although really I could put it off until later in the summer, I'd really prefer not to. First of all, I have 3.5 GB left on my desktop as it is (which is going to get filled quickly with 500 MB 1080i music videos). Second of all, I have good reason to get a notebook for this thing I'm doing in the summer, where I'll have access to some decent-quality broadband compared to my crap DSL at home.
But anyway, I just found news about the upcoming 8600 and 8300 for desktops. Can we draw any conclusions from this?
http://forumz.tomshardware.com/hardware/Geforce-8600-Ultra-GT-8300-leaked-news-ftopict218320.html -
I don't think we can draw any conclusions based on the release of desktop GPUs. We can merely speculate...
I would wait until summer anyway, with all of the new technology about to be released. It would also give you the best chance at a DX10 card. -
-
-
Let me start by saying that as some of you have pointed out - this is just speculation. There haven't been any leaked specs or benchmarks that I can find yet.
I think those of you saying that the new generation of chips will be about one step up from the previous generation are right. I don't think this has much of anything to do with the DX10 implementation; this is mainly the result of newer & better architecture coming out.
I'd say something like this is a good guess for pure frame rate power:
8600 = 7800
8400 = 7600 = 6800
8200 = 7400 = 6600
x2600 = x1900
x2300 = x1450 = x600
For those thinking that they "need" to get a DX10 card to be able to play all the new games coming out, I think you are mistaken as well. With DX10 only being released on Vista (not XP), it won't have widespread adoption for at least another year. Game developers will slowly begin to support DX10 so they can use Shader Model 4 and some of the other new API features. But they will continue to support DX9 for years to come or risk losing a percentage of their customers.
If you read interviews with game developers, they'll tell you that DX9 will be mainstream into 2008, with support into 2009. DX10 is not as big a jump in the API as some of the earlier versions were.
I think the idea that you'll get "better" response from an entry-level DX10 card than you would from a mid-level DX9 card is also wrong. On the first round or two of DX10 games coming out (Crysis, Alan Wake, etc.), I don't think you'll get anything other than the lowest settings possible on a G84/X2300, and the frame rates will still be lousy. When you drop to low settings you won't get complex shading or any of the DX10 bells and whistles. If you had a 7700/x1700 you wouldn't get the DX10 bells and whistles either, but at least you could play at moderate settings with decent frame rates.
For those expecting to see February releases, I think you are a bit too optimistic. There may be 1 or 2 releases that make it out by the end of the month, but I'd put my money on March to get more than a single SKU, and April before they are commonplace upgrades.
As I started out saying, this isn't gospel, just one guy's opinion, and I don't know any more about this than the rest of you. We'll just have to wait and see. -
Hopefully Intel will be in this fast, so I can get a low-power, gun-toting, 8800-equivalent GPU thanks to their 45nm k-gate... ermm... forget about gun-toting... just give me a low-power 8800 that I can put into a laptop.
-
http://forum.notebookreview.com/showthread.php?t=101280 -
I saw that, just didn't reply. What I want is not a toned-down version of the 8800... I want full power... or at least 90%.
-
Heheh, looks like there is a lot of guessing going on in here.
The new nVidia graphics cards will be called the NB8 series >>> a recent Asus Vista driver gave them away:
NVIDIA_G84.DEV_0407.1 = "NVIDIA NB8P-GS"
NVIDIA_G86.DEV_0425.1 = "NVIDIA NB8P-SE"
NVIDIA_G86.DEV_0428.1 = "NVIDIA NB8M-SE"
I also thought it would be old news on this forum that the X2300 is NOT a DX10 graphics card...
The first place you would see that capability mentioned would be a PR statement, right >>> http://www.asus.com/news_show.aspx?id=5531
Maybe not:
http://www.hardware.fr/news/8554/radeon-x2000-avant-heure.html
-
By the way, that is not a source I would quote as very reliable. They may have something, but it is equally possible that they do not. I mean, where in the world did they get that info when no one else has it? Especially seeing how AMD/ATI would be shooting themselves in the foot if they started releasing non-DX10 cards this close to nVidia's upcoming releases.
Right now there is a 50/50 chance that the X2300 will be DX10. Until we hear a definite press announcement from AMD/ATI (OK, I am just going to call it ATI from now on) or see a notebook with a DX10 GPU in it, we don't know what the X2300 will be. That's just my two cents.
-
-
I think they would be screaming all about DX10 if they could. It's possible that ATI is making them hold their tongue.
But I am definitely of the opinion that we need to wait and see. All anyone is doing now is making blind guesses. But hey - that's what we're here to do, right? -
-
The 7800 was about twice as fast as the fastest 6xxx's, and the 6800 was about twice as fast as the 5xxx's.
And you certainly can compare them, as long as you pick games they can both run... Of course, in DX10 games the 8800 will be infinitely faster, since the game can't even run on other cards. But in DX9 games, there's no reason why you shouldn't be able to compare performance. -
OK, yes, I agree with you there: in DX9 games they could be compared. But a low-end DX10 card in DX10 could look and perform just as well as a high-end DX9 card in DX9.
-
The GeForce 8600 GT, 8600 Ultra, and 8300 for desktops were leaked a few weeks ago. They said the timetable for those cards was around February-March. If I'm not mistaken, it takes a while after desktop graphics cards are released to release laptop graphics cards, so my guess is we'll see some type of mobile 8000-series card in April-May.
http://www.mobilewhack.com/nvidias-new-8600-geforce-series/ -
-
DX10 cards require different hardware to run DX10 effects... the DX9 cards don't have that hardware, and hence those effects will not work. Now, DX9 cards will probably run DX10 games, but only because the game vendors release the game with both DX9 and DX10 DLLs and code paths. The game decides which to use based on which type of card is available.
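Roughly, that selection could look like this (purely a hypothetical C++ sketch using the DirectX SDK's D3D10CreateDevice; the DLL names in the printout are made up): the game tries to create a hardware D3D10 device and falls back to its DX9 path if that fails.

#include <d3d10.h>
#include <cstdio>
#pragma comment(lib, "d3d10.lib")

// Returns true if the installed GPU/driver can create a hardware D3D10 device,
// i.e. it exposes the DX10 feature set (Shader Model 4, geometry shaders, etc.).
bool HasDX10Support()
{
    ID3D10Device* device = NULL;
    HRESULT hr = D3D10CreateDevice(NULL,                        // default adapter
                                   D3D10_DRIVER_TYPE_HARDWARE,  // real hardware, not the reference rasterizer
                                   NULL, 0, D3D10_SDK_VERSION,
                                   &device);
    if (SUCCEEDED(hr) && device)
    {
        device->Release();
        return true;
    }
    return false;
}

int main()
{
    if (HasDX10Support())
        std::printf("loading the (hypothetical) render_dx10.dll path\n");
    else
        std::printf("falling back to the (hypothetical) render_dx9.dll path\n");
    return 0;
}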
-
Any DX10 card will outperform the highest-end DX9 card in graphics. Now, is one low-end DX10 card going to outperform 2x the highest-end DX9 card in frame rates on a DX10 game running in DX9L? Probably not, but the DX10 graphics will still be visually better.
-
Er, running in DX9 mode, the graphics will be exactly the same. And under DX10, you obviously can't compare, because only one of the cards will be able to run it in the first place.
Similarly, a DX10 game will never run in DX9L.
(And I highly doubt the upcoming 8400 will outperform a 7900.) -
What I meant was that a low-end DX10 card in DX10 could still look better (graphics-wise) than a high-end DX9 card in DX9. I never said that a DX9 card would play DX10.
-
-
Just to be picky, DX10 does not require unified shaders, and that's not why it's not compatible. The reason it's not compatible with DX9 cards is simply that they don't support geometry shaders and a few other tricks. But there's no rule that these have to be supported with a unified shader model. Unified shaders just happen to be a natural and efficient way to implement the DX10 functionality, but they're not required.
But yeah, a low-end DX10 card is still a low-end card, and as such, you'll have to run everything at low detail levels so a high-end DX9 card will look better. So as jak3676 said, it won't look better just because it's a DX10 card.
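To make the geometry-shader point concrete, here's a hypothetical C++ sketch (assuming the DirectX SDK headers and an already-created ID3D10Device*; the shader and function names are mine) of the kind of call a DX10 renderer makes that has no DX9 equivalent: compiling and creating a geometry shader against the gs_4_0 profile.

#include <d3d10.h>
#include <d3d10shader.h>
#include <cstring>
#pragma comment(lib, "d3d10.lib")

// A trivial pass-through geometry shader written in Shader Model 4.0 HLSL.
static const char* kGS =
    "struct V { float4 pos : SV_POSITION; };\n"
    "[maxvertexcount(3)]\n"
    "void main(triangle V input[3], inout TriangleStream<V> stream)\n"
    "{ for (int i = 0; i < 3; ++i) stream.Append(input[i]); stream.RestartStrip(); }\n";

ID3D10GeometryShader* CreatePassThroughGS(ID3D10Device* device)
{
    ID3D10Blob* bytecode = NULL;
    ID3D10Blob* errors = NULL;
    // gs_4_0 is a DX10-only profile; DX9-class hardware has no geometry shader stage at all.
    if (FAILED(D3D10CompileShader(kGS, std::strlen(kGS), "gs.hlsl",
                                  NULL, NULL, "main", "gs_4_0", 0,
                                  &bytecode, &errors)))
    {
        if (errors) errors->Release();
        return NULL;
    }
    ID3D10GeometryShader* gs = NULL;
    device->CreateGeometryShader(bytecode->GetBufferPointer(),
                                 bytecode->GetBufferSize(), &gs);
    bytecode->Release();
    if (errors) errors->Release();
    return gs;
}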