Via DailyTech:
NVIDIA has the fastest single GPU for the desktop in the GTX 285, a 55nm die-shrunk version of its predecessor the GTX 280. However, ATI has been able to gain a larger market share due to aggressive pricing and ramping of smaller geometries. This has led to price pressure on NVIDIA, especially in the performance mainstream segment.
NVIDIA's original GT200 chip -- which is used in the GTX 280 and GTX 260 -- is too big, too costly, and too power-hungry to be used effectively in a mobile solution. NVIDIA has already switched to TSMC's 55nm process from the baseline 65nm node to deal with these issues for the GTX 285, but it is still not suitable for the majority of laptop users. Battery life is too short, the cooling fan is too loud, and the cost is too high.
One solution was to begin manufacturing on the 40nm bulk process like ATI has done. According to our sources, NVIDIA's attempts to produce a die-shrunk 40nm GT200 chip were "disastrous at best". Design problems became evident, since the GT200 was originally designed for the 65nm node. Two shrinks in a row without a major redesign was just too much for NVIDIA, and our most recent information from Taiwan is that the first 40nm chips from NVIDIA will be in the GeForce 300 series.
Without a power efficient GT200 based GPU solution for the mobile or mainstream value markets, NVIDIA is rebranding the 55nm G92b chip yet again to meet these critical segments. The original 65nm G92 chip was used in the GeForce 8800 GT, but you can only do so much with an older design. The chip was respun as the G92b with a 55nm die shrink, and is currently used in the 9800 GTX+. All G92 chips are only DirectX 10 capable, and will not support the full feature set of DirectX 10.1 or DirectX 11 that will come with Windows 7.
The problem is that many consumers will pick up a GTX 280M or GTX 260M thinking that it is the same or similar to the GTX 280, when it is actually just a 9800 GTX+.
There is currently no GeForce 100 series for desktop or mobile markets.
http://www.dailytech.com/article.aspx?newsid=14480
-
Red_Dragon Notebook Nobel Laureate
Haha, NVIDIA is playing catch-up. This is exciting, because now the competitiveness of both companies will come out and ultimately one party will win.........
The consumer -
*goes and cries in corner*
-
ATI is going to take the HD4860 and run with it.
Just imagine: they are comfortable with 40nm technology, and they are also willing to put GDDR5 in a mobile card. What's next? A GDDR5 card with 800 stream processors (i.e. the same as the 4870) but built on 40nm? Of course that's not going to happen any time soon, but it just shows that they will be ready to counter anything NVIDIA does with some amazing cards going into 2010 and beyond. -
-
From what I'm reading around places, the reason we're seeing 40nm so soon is that ATI's yields are much better than even they expected. They're able to launch at full scale, in full quantities, earlier than they thought, and this of course also means that they're doing it at lower cost because there are fewer chips that need "binning".
-
"the first 40nm chips from NVIDIA will be in the GeForce 300 series."
Is that the Mobility 300, or desktop? -
I think desktop first and then laptops (I hope)...
ATI FTW... -
Red_Dragon Notebook Nobel Laureate
I think someday we are gonna start seeing the release of notebook GPUs and desktop GPUs at the SAME time.
-
The 4860 is coming out for notebooks first.
-
Ohhhhhhh, people used to hate on ATI and be like "yea I got a Nvidia xxxx"; now I'm like "pshhh I got an ATI, FTW" lol.
The 40nm will make everything so much faster and better and cheaper. What more can you ask for? -
OOOOh man, bad news for Nvidia.
I think it's time to switch to ATI.
Did you people see the AMD Cinema 2.0 trailer??
HERE
That was amazing in-game footage -
-
That's the name of the trailer. The video was rendered in real time by the GPU; what they mean is that in theory every game could look just as good as what is in that video, and playing games would be pretty much like watching a movie, hence the Cinema 2.0 title.
-
ViciousXUSMC Master Viking NBR Reviewer
The Asus W90 is going to melt faces with ATI cards, 6GB RAM, a quad-core CPU, and a price on par with your average gaming notebook and cheaper than your average Mac.
Their top version is dual 4870s right now; I wonder if they will put 4860s in there at some point. I guess you may lose a bit of rendering power, but it would run cooler, have longer battery life, and lower the cost. -
That trailer reminds me of Elephants Dream...
-
King of Interns Simply a laptop enthusiast
This competition is great! The moment one wins, it's bad news for the consumer.
-
ratchetnclank Notebook Deity
I'm losing faith in Nvidia now.
Rebranding the same old crap all the time. -
I read about how ATi started the miniaturization of their manufacturing process much earlier than Nvidia, last year or thereabouts, when they were behind Nvidia in the mobile market, and it seems the payoff is being seen now that Nvidia's performance lead has been cut down to size.
I'm going to wait 2 more years before my next upgrade, but I reckon the next system I get may be equipped with an ATi performance card, the rate things are progressing. -
It's the exact same thing as with nVidia's name for the GPU architecture they used to call CineFX, which they now call Graphics Plus.
What's sad here is that nVidia as a whole company is worth more than 3x as much as AMD, which is AMD/ATI combined since the buyout. And yet they've been recycling the same chips and branding them as new since 2006. The 8800 GTX came out in 2006, and every GPU after that has basically been a die shrink of THAT card, with very minor modifications made; there has yet to be an actual change in the design for the most part. Much like what AMD has been doing. -
Red_Dragon Notebook Nobel Laureate
lol you are right who needs weights anyways?
-
Red_Dragon Notebook Nobel Laureate
Well, I can imagine the OC potential on the 4860 will possibly be greater.
-
The 4800s so far have only allowed +50 on the core and memory. Not so impressive.
-
In relation to Nvidia's failings, they just made Engadget once again, in regard to issues with the MBP 9600M (only) showing signs of death. I wonder if a class action suit would be applicable, legal, or logical at this point. I am (or was, not sure) an Nvidia fanboy, yes I will admit it. However, the last year or so has been pretty darn tough; I may have to return to neutrality and bounce over. Gar, Nvid, get your cr*p together like you did in the past.
-
dondadah88 Notebook Nobel Laureate
Well, the only good news about the rebranding is that when people who bought the old 8800M GTX from 2006 see this, they'll say, "Oh, looks like my card is going to last a bit longer." Come on, if I had known that I would have gotten it in SLI and would have been good for even longer. Lol. (I love my whitebook and ATI.)
Game developers won't make a game that can't be played by almost everyone (excluding Crysis, because EA was a part of it).
But I'm glad ATI is doing what they're doing. They just need to fix their clock speeds and separate their shader clock from their core clock. (Maybe I'm not getting why they didn't already.) -
Red_Dragon Notebook Nobel Laureate
Yes, and it will be hard to forgive them (especially for the ones who faced the 8 series problem).
But if this makes them try harder to make better GPUs for all notebook buyers (low, mid, and high end), I'm all for it -
-
I've got a Macbook, and boy does it run hot. -
I love it. Nvidia is just dominating the desktop market (at least performance-wise, not price/performance), but ATI is quickly picking up steam in the notebook market. I was getting worried there was gonna be another Intel/AMD situation.
-
I guess the only salvation will be one of their high end cards, in an effort to not get a dud. But then the issue is the laptop's cooling. It would be nice if Nvidia set a thermal cooling standard and cooperated with designers to ensure it would be reached. It may be fairly extreme, but if there was some kind of standard, it may be helpful. Like set levels 1, 2, and 3. Level 1 cooling would be required for all low- to mid-range cards and provide adequate cooling. Level 2 would be great cooling for low/mid plus adequate cooling for high end. Then level 3 would be awesome cooling for high end. I would certainly purchase a certified level 3 for an extra $500 if it covered the key points. Just a couple of my cents...
-
The GPU wars continue, and that's what we should all be thankful for. There was a time when ATI dominated, then Nvidia picked it up and dominated, and now ATI is coming back strong.
I really don't care for either company; whether it says ATI or Nvidia in front, who cares? What matters is that they push each other to deliver better products, and we, the users, stand to benefit from their competitiveness. -
-
The way I see it, I hope there is push and pull between the two companies forever. I was with ATi with the Rage 128, then I went nVidia with the GeForce 2, then I went ATi with the 9800 Pro, then the ATi 1900 XTX, then I got my 7800 GT, then over to the 8800 GTX, and now I have my 9800M GTS in my laptop. I can't wait to go back to ATi and then back to nVidia and so on and so forth! Love the competition! Same with AMD / Intel, back and forth we go!
So far, though, my favorite cards were the 9800 Pro and the 8800 GTX.
Being a fanboy is for morons imo; why not just get what's the best on the market?!!?!? -
-
4870 X2 = GTX 295 (Entirely game dependent for who wins)
GTX 260 Core 216 = HD 4870 1GB (Game dependent for who wins)
GTX 260 Core 192 = HD 4870 512MB (Game dependent again)
HD 4850 > 9800 GTX+ & 9800 GTX
HD 4830 = 9800 GTX (Game dependent) and > 9800 GT
HD 4670 > 9600 GSO
The only unchallenged NVidia card is the GTX 285; ATI is either in a dead heat or winning at every other price point. The HD 4890, dropping next month, will change that as well (and note that NVidia has no new GPUs coming for months and months). -
-
It's incredibly annoying when people who don't follow graphics very well at all post nebulous statements about things they are grossly misinformed about. -
My GTX 260 doesn't take much of a toll when enabling AA though; sure, some, but not as much as the G92 core. It was the same with my 8800 GTX: it could cope with AA enabled. I wish Nvidia could have made a real GTX 260M out of the new core. But nope. I'll pass until the next-gen Nvidia GPUs hit the mobile market.
-
I admit, I was a fan of Nvidia for a while, but then there was the heat death issue and the insistence on using one core over and over and over again in its GPUs (8800M GTX = 9800M GTX = GTX 280M, all using a G92).
That's the problem with resting on your laurels - competitors will succeed in dethroning you, and your laurels will hurt after a while.
NVidia's attempts to produce a 40nm version of GT200 were "disastrous at best"
Discussion in 'Gaming (Software and Graphics Cards)' started by Jlbrightbill, Mar 4, 2009.