Nvidia plans to launch a new generation of discrete mobile cards and it is pretty confident that it can take the performance lead.
Source: Fudzilla
If the specs suggested in the links are true (which I don't believe; I'd like to wait until CeBit to see what they say), then 1) a lot of people, including myself, will be very disappointed, and 2) the GTX 280M is probably not going to outperform a Mobility 4870 with GDDR5. I honestly think Fudzilla is really referring to the GTX 180M and the GTX 170M/160M.
Edit: Or maybe it can. I don't know. It'd be cool if this 280M used GDDR5 as well (as I don't see a 448-bit or 512-bit mobile GPU being possible). It'd be even cooler if it didn't cost $750 per card!
-
Yeah, the first sentence is 100% about the 180M, the 55nm beefed-up version of the 9800M GTX, while the second paragraph is an accurate mix of both.
So the GTX 180M appears to be a legit 55nm 8800GT with the full 128 SPs? So then it's actually way more than a rebranded 9800M GTX. If Nvidia prices it right, things could get interesting. -
-
About the pricing: hasn't ATi forced Nvidia to re-evaluate its mobile pricing scheme? In the desktop market, Nvidia followed ATi's lead once the 4000 series came out, so it seems the company must follow suit in the mobile market if it wants to keep its stranglehold. -
Yeah, if they keep the prices up it won't be worth the performance, with 4870s being plenty powerful...
Edit: hmmm... just read the little article. It doesn't get me overly excited. I have a feeling Nvidia is going to shoot itself in the foot with this one. Although people are still buying the desktop models, so who knows... -
The biggest mistake Nvidia made was failing to give a reasonable reply to the failures of the 7-series and 8-series mobile cards, then following up with rebadged 9-series cards offering virtually negligible performance benefit over the previous generation while charging atrocious prices. I am, however, willing to give them another chance; I will wait for CeBit and then decide between Nvidia and ATI.
-
Everyone is just so used to loving nvidia... i'm giving ATI the benefit of the doubt with this generation of cards.
I just got my gateway over the summer though, so i won't be in the market again for a couple of years (i hope), when the next batch of consoles comes out. Then we'll see some performance gains... -
I just hope this new series is not defective.
-
The 4870 series aren't as fast as I'd hoped them to be...
So the GTX 280M, if launched, might actually do something... -
-
It's all about price in these economic times. I don't care if Nvidia brings out a card that is 30% faster than *whatever* if the thing adds $800 to the cost of the laptop.
If they can bring out high-performing cards that don't cost the earth, excellent. I guess we will see in time. -
is my 8800m gtx redundant now
i want the one gpu to rule them all
time for a new laptop soon, then there are my financial concerns ...hmmm -
Please. The 8800M GTX/9800M GT won't be obsolete until the only playable resolutions we have are 1280x800 and lower. We're a looong way from being that desperate.
-
The article does not convince me at all. Sounds like a 180 and not a 280. The 280 uses a completely different architecture than the G92x does.
-
-
Ok, this should be interesting.
the Clevo M980NU (a direct competitor to the ASUS W90) will debut at CeBIT
since it will have dual MXM-IV slots, hopefully Clevo will debut it with dual GTX 200 cards. -
Code:
NVIDIA_G92.DEV_060A.1 = "NVIDIA GeForce GTX 280M"
NVIDIA_G92.DEV_0618.1 = "NVIDIA GeForce GTX 260M"
NVIDIA_G94.DEV_0631.1 = "NVIDIA GeForce GTS 160M"
NVIDIA_G94.DEV_0632.1 = "NVIDIA GeForce GTS 150M"
NVIDIA_G96.DEV_0651.1 = "NVIDIA GeForce G 110M"
NVIDIA_G96.DEV_0652.1 = "NVIDIA GeForce GT 130M"
NVIDIA_G98.DEV_06EC.1 = "NVIDIA GeForce G 105M"
NVIDIA_G98.DEV_06EF.1 = "NVIDIA GeForce G 103M"
The GeForce GTX 170M and 180M will also be based on the G9x generation, and entries for these are present in the v179.43 Windows 7 drivers. -
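Just to make the point in those INF entries explicit: the device-ID prefixes tell you which GPU core each marketing name maps to. Here's a quick Python sketch that parses the list above and groups the names by core (the parsing code is my own illustration, not part of any NVIDIA tool):

```python
import re

# Device-ID entries as pasted from the v179.43 driver INF above
inf_lines = """
NVIDIA_G92.DEV_060A.1 = "NVIDIA GeForce GTX 280M"
NVIDIA_G92.DEV_0618.1 = "NVIDIA GeForce GTX 260M"
NVIDIA_G94.DEV_0631.1 = "NVIDIA GeForce GTS 160M"
NVIDIA_G94.DEV_0632.1 = "NVIDIA GeForce GTS 150M"
NVIDIA_G96.DEV_0651.1 = "NVIDIA GeForce G 110M"
NVIDIA_G96.DEV_0652.1 = "NVIDIA GeForce GT 130M"
NVIDIA_G98.DEV_06EC.1 = "NVIDIA GeForce G 105M"
NVIDIA_G98.DEV_06EF.1 = "NVIDIA GeForce G 103M"
"""

pattern = re.compile(r'NVIDIA_(G\d+)\.DEV_([0-9A-F]{4})\.\d+\s*=\s*"([^"]+)"')

# Group marketing names by the underlying GPU core named in the entry
by_core = {}
for core, dev_id, name in pattern.findall(inf_lines):
    by_core.setdefault(core, []).append((dev_id, name))

for core, cards in sorted(by_core.items()):
    print(core, "->", ", ".join(name for _, name in cards))
```

Running this shows the "GTX 280M" and "GTX 260M" filed under G92, i.e. the existing 9800-class core, not GT200.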
Niiiccee!!
I hope Clevo makes the cards available for existing users too, so they can easily upgrade to the new GPUs! -
-
I might have missed the news; where did you guys see that the ATI cards aren't performing?
-
Well, in this thread, an Asus W90 owner managed 13k in 3DMark06 with a T9600 + 4870 X2, stock driver and no OC. Considering a Dell M1730 with a dual core (800 FSB) and 8800M GTX SLI gets a similar score stock, it doesn't look great; however, the drivers and lack of tweaking are probably to blame.
http://forum.notebookreview.com/showthread.php?t=354885
An improvement to 14.1k in the same bench was made by PCtuning.cz using the same setup; I assume this is with a later driver, not the latest 9.2, but also with a good amount of tweaking.
While in this thread, another 4850 system, the MSI 725, equipped with a P9500 and a single 4850, managed 10.5k with a mild overclock on both CPU and GPU (CPU at about 3 GHz, GPU core at 550 MHz, up from 500). This score is similar to, or less than, a 9800M GTX in some systems.
http://forum.notebookreview.com/showthread.php?t=349540&page=91
Now we are waiting for the Advent 6555's 3DMark06 score. What makes this system stand out is its Intel Q9000 @ 2 GHz, shipping with a single ATI Mobility HD 4850 at a GPU core of 500 MHz. I expect this to fetch 11.5-12k in 3DMark06 with the latest driver due to the CPU weighting of 3DMark06; however, the difference won't be that big, as the Q9000 shares 6 MB of L2 cache over 4 cores, as opposed to 2 in the P9500. And benches aside, 2 GHz might underperform in some single-threaded apps.
Over at CNET, an Alienware M17 was just benched; it returned 11k+ using an Intel Q9000 CPU combined with a last-gen 3870 (done at XGA, however, lower than all the others). Considering a dual-core system with a single 3870 gets about 8-9.2k at XGA on average in 3DMark06, the up-to-3k difference shows the significance of CPU weighting in this specific benchmark.
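To see why the CPU moves the total so much, 3DMark06's overall score follows Futuremark's published weighting formula, sketched below. The sub-scores in this snippet are made-up round numbers purely for illustration, not measured results:

```python
# 3DMark06's overall score, per Futuremark's published formula:
#   GS    = 0.5 * (SM2.0 score + HDR/SM3.0 score)
#   Total = 2.5 / (1.7 / GS + 0.3 / CPU score)
# The CPU term carries real weight in the total.

def total_score(sm2, sm3, cpu):
    gs = 0.5 * (sm2 + sm3)
    return 2.5 / (1.7 / gs + 0.3 / cpu)

# Hypothetical sub-scores (invented for illustration): identical GPU
# results, but a quad-core doubling the CPU sub-score vs a dual-core.
dual = total_score(sm2=4500, sm3=4500, cpu=2200)
quad = total_score(sm2=4500, sm3=4500, cpu=4400)
print(round(dual), round(quad), round(quad - dual))
```

With these invented numbers the quad-core total comes out several hundred points higher on the exact same GPU result, which is the effect being described above.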
So in short, the current 4 series has caught up with the best of the current-gen NV cards, while outdoing them on price. However, it fails to run circles around the likes of the 9800M in a convincing manner, especially in dual-card configurations.
This might be because there are only a few systems out utilising 48x0-series GPUs, and those pair them with either a high-end dual core or the lowest-end quad, which doesn't help the 48x0 excel in a benching environment (which seems more hyped than real game performance). However, we can expect the Alienware M17 and Asus W90 to be updated with the ATI 4870 X2 and higher-end quads (Q9100/QX9300) soon, which ought to truly reflect the supremacy of the 48x0 against current-gen NV cards.
Another tiny concern might be that the 4850 is clocked relatively low, and the Overdrive app does not allow an OC of more than 50 MHz on the core; but I guess this helps make the card more laptop-friendly.
However, I think it's fair to assume that the 48x0 ought to be less CPU-dependent than the 38x0 series, while its more powerful raw specs probably mean it's more than merely a pretty face in benches, and certainly not a chocolate teapot in real games. -
Hmm, now I see what new GPUs my XPS M1730 supports with the new BIOS. The only change in the A10 BIOS is support for new GPUs. Woot, GTX 280M SLI!
-
Granted, our system has enough juice for 2 big cards, especially since we don't have the extra HDD and 95W quad core of the 9262, and have 10W more on the power adapter, but there is still the matter of space and heat management...
The 8800M GTX SLI hits 80C on the XPS, and if they didn't shrink the GTX 280M enough, it might not even matter.
One can only hope though. With GTX 280M SLI, I'd also jump on an X9000 and be set for quite a while.
Not that I'm not still... -
A few have hit on the peripheral sides of the issue, but this needs to be said in bold and underlined.
Both the GTX 280M and GTX 260M are underclocked versions of the G92b GTS 250, which is a renamed 9800 GTX+, which is in turn an overclocked 55nm version of the 8800 GT/8800 GTS.
That's right, NVidia's big new GPU is simply an overclocked die shrink of what's already in Clevo/Alienware's high-end notebooks (those might already be 55nm; if so, this is an even lower move by NVidia). These aren't GT200 architecture; they're 2-year-old cards getting yet another renaming.
Want to know the performance? Just ask johnksss; since he's pushed the clock limits on the 9800M GTX, his scores are as high as any stock-clocked GTX 280M will be. There is no new GPU here, folks, move along. -
What's up with Nvidia? If Jlbrightbill is right, Nvidia will lose a huge share of the mobile graphics segment for sure.
-
gary_hendricks Notebook Evangelist
nVidia has nothing up their sleeve right now... so they are just confusing
people with their (stupid) (re)naming schemes. -
-
Quicklite, check again; I just hit 15118 with that same setup, no GPU OC.
-
Nvidia has 40nm mobile chips with GDDR5, based on GT200 architecture, on the roadmap for Q3-Q4 of this year. This is a fact that came out of CES.
55nm overclocked chips for Q1-Q2 were also on this map. This isn't news. -
Desktop chips. These mobility chips are reportedly G92b, if Fud can be trusted (which it often can't).
-
It doesn't make sense. There would be no point in GTX 180M/170M (overclocked G92 chips) if GTX 280M/260M are what this article says they are (overclocked G92 chips).
-
Charles P. Jefferies Lead Moderator Super Moderator
Yet another BS renaming scheme... Nvidia, how about taking some pointers from ATI and actually improving your architecture so the performance is significantly better?
-
There is a huge stink about it right now.
http://www.theinquirer.net/inquirer/news/123/1051123/nvidia-cuts-reviewers-gts250 -
That article is also by Charlie Demerjian, who is a notorious NVidia hater. Go look at his article history.
-
If you don't like that article, how about [H]ardForums being cut from nVidia's list of reviewers? Are they haters too?
-
speculation time:
A single GTX 280M, but with 240 to 256 SPs, would explain why there is only one card running under an i7 core, and since it's an 18.4" machine, the heatsinks would have to be massive. If this concoction even keeps up with dual 4870s at 1600 SPs, then it may be ATI with the problem later down the line. A single 4850 is on par with a 9800M GTX; they both give and take when dealing with one card. The 4870s were released not merely to compete against the 9800M GTX but to smash it... and they're not doing that just yet. There seems to be a slight flaw: the driver. 3DMark06 has been ATI's for a while now; they did capture the Vantage crown, but lost it when the i7s came out. Nvidia was watching, so they may make the all-star play and run a full GTX 280/260, since technically they have the space for it now, but in mobile form...
But running 128 SPs on a 256-bit bus... yeah, I don't see that even being real competition for ATI. And the FX and GeForce do not operate the same; we've already been down this road, and the 9800M GTX smashed the Quadro FX 3700M. So if that was their plan, it would be very stupid: a 3D-modeling chip in a gaming machine.
flip side:
That would also explain why no SLI, since those chips aren't SLI-capable.
As for the naming game... I'm going to hold off on that one this time. A lot of people are watching this one, so they might just surprise a few of you... or not. -
Nvidia is in serious trouble. March is coming; be very afraid, Nvidia, be very afraid...
Oh yeah, to those who still say the ATI HD 4870 is underperforming: I think you should go read the W90V Owner's Lounge thread. From PAGE 1 to PAGE 60! -
nvidia is always competitive with ati. the 4870 is currently ati's mobile flagship. nvidia wants to compete by releasing new gpus that outperform or perform on par with the ati cards.
-
until they get the right drivers... it will continue to UNDERPERFORM.
-
15k in 3DMark06 with a bad driver is actually quite a good feat (and bad at the same time). -
never had a chip fry on me, so i can't relate there...
they can't make that call till it comes out and someone puts it to the test. and if it fails, then it will be cheaper than the ati... lol (speculation)
but they've already started undercutting ati with their desktop cards; the 285/295 are cheaper and perform better. so who knows... -
-
Bottom line is, unless NVidia is prepared to let power consumption go off the charts, there is physically no way they can match the 4870's performance on a 256-bit bus. The GTX 260 is 448-bit and the GTX 280 is 512-bit. Outside of GDDR5 and 40nm, it won't happen.
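To put rough numbers on that bus-width argument: memory bandwidth is just bus width (in bytes) times effective memory clock. The clocks in this sketch are ballpark figures of the era chosen for illustration, not official specs for any of these cards:

```python
# Bandwidth (GB/s) = bus_width_bits / 8 * effective_clock_MHz / 1000
# Effective clock = memory clock * 2 for GDDR3 (DDR), * 4 for GDDR5.
# Clocks below are illustrative ballpark figures, not quoted specs.

def bandwidth_gbps(bus_bits, mem_clock_mhz, multiplier):
    effective_mhz = mem_clock_mhz * multiplier
    return bus_bits / 8 * effective_mhz / 1000

configs = {
    "256-bit GDDR3 @ 800 MHz (9800M GTX-class)": (256, 800, 2),
    "256-bit GDDR5 @ 900 MHz (Mobility 4870-class)": (256, 900, 4),
    "512-bit GDDR3 @ 1100 MHz (desktop GTX 280-class)": (512, 1100, 2),
}

for name, (bits, clock, mult) in configs.items():
    print(f"{name}: {bandwidth_gbps(bits, clock, mult):.1f} GB/s")
```

The takeaway matches the point above: on a 256-bit bus, GDDR5's quad data rate roughly doubles GDDR3's bandwidth, and a wide 512-bit GDDR3 bus is what it takes to get comparable numbers otherwise.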
-
It's not speculation... there are certain power and physical width requirements that NVidia can't meet without sacrificing power consumption or size.
-
didn't see that link posted.
from what i have been reading, the 4870 X2 uses more power and runs hotter... i don't know for sure, because i don't own one, but from the overclockers, this seems to be the way of things.
i wouldn't mind trying them both out to see, but i don't see that being a reality right now...
http://bios.techpowerup.com/79027/P...orce_GTX_295_Run_Leads_Radeon_HD_4870_X2.html -
-
now that sounds more like it.
-
Geforce GTX280M and 260M to launch at CeBit
Discussion in 'Gaming (Software and Graphics Cards)' started by ichime, Feb 24, 2009.