The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved to NotebookTalk.net after the shutdown.

    Geforce GTX280M and 260M to launch at CeBit

    Discussion in 'Gaming (Software and Graphics Cards)' started by ichime, Feb 24, 2009.

  1. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    Nvidia plans to launch a new generation of discrete mobile cards and it is pretty confident that it can take the performance lead.

    Source: Fudzilla

    If the specs suggested in the links are true (which I don't believe; I'd like to wait until CeBIT to see what they say), then 1) a lot of people, including myself, will be very disappointed, and 2) the GTX 280M is probably not going to outperform a Mobility 4870 with GDDR5. I honestly think Fudzilla is really referring to the GTX 180M and the GTX 170M/160M.

    Edit: Or maybe it can. I don't know. It'd be cool if this 280M uses GDDR5 as well (as I don't see a 448 or 512 bit mobile GPU being possible). It'd be even cooler if it didn't cost $750 per card :D
     
  2. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Yeah, the first sentence is 100% about 180M, the 55nm beefed up version of the 9800M GTX, while the second paragraph is an accurate mix of both.

    So the GTX 180M appears to be a legit 55nm 8800GT with the full 128 SPs? So then it's actually way more than a rebranded 9800M GTX. If Nvidia prices it right, things could get interesting.
     
  3. notyou

    notyou Notebook Deity

    Reputations:
    652
    Messages:
    1,562
    Likes Received:
    0
    Trophy Points:
    55
    It's Nvidia, they won't charge anything reasonable.
     
  4. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    Well, specwise, it would be an overclocked FX3700M (they'll probably overclock it to make use of the 55nm shrinkage). Whatever it is, I hope it doesn't cost what the high end nVidia mobiles are costing now.
     
  5. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Yeah I had originally typed 3700M instead of 8800GT, but I changed it.

    About the pricing: hasn't ATi forced Nvidia to reevaluate its mobile pricing schemes? In the desktop market, Nvidia followed ATi's lead once the 4000 series came out, so it seems that the company must follow suit in the mobile market if it wants to keep its stranglehold.
     
  6. Mormegil83

    Mormegil83 I Love Lamp.

    Reputations:
    109
    Messages:
    1,237
    Likes Received:
    0
    Trophy Points:
    55
    Yeah if they keep the prices up it won't be worth the performance with 4870's being plenty powerful....

    Edit: hmmm... just read the little article. It doesn't get me overly excited. I have a feeling nvidia is going to shoot themselves in the foot with this one. Although people are still buying the desktop models so who knows...
     
  7. Pranalien

    Pranalien Notebook Veteran

    Reputations:
    20
    Messages:
    706
    Likes Received:
    2
    Trophy Points:
    31
    The biggest mistake Nvidia made was never giving a reasonable response to the failures of the 7 series and 8 series mobile cards, then following up with rebadged 9 series cards with virtually negligible performance benefit over the previous generation, while at the same time charging atrocious prices. I am, however, willing to give them another chance; I'll wait for CeBIT and then decide between Nvidia and ATI.
     
  8. Mormegil83

    Mormegil83 I Love Lamp.

    Reputations:
    109
    Messages:
    1,237
    Likes Received:
    0
    Trophy Points:
    55
    Everyone is just so used to loving nvidia... I'm giving ATI the benefit of the doubt with this generation of cards.

    I just got my Gateway over the summer though, so I won't be in the market again for a couple years (I hope), when the next batch of consoles comes out. Then we'll see some performance gains...
     
  9. Han Bao Quan

    Han Bao Quan The Assassin

    Reputations:
    4,071
    Messages:
    4,208
    Likes Received:
    0
    Trophy Points:
    105
    I just hope this new series is not defective.
     
  10. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    The 4870 series isn't as fast as I'd hoped it would be...
    So the GTX 280M, if launched, might actually do something...
     
  11. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    The GTX series (actually, all Nvidia series) are known for not scaling well in terms of power usage. Either they are very good at it or the opposite; there's no consistency.
     
  12. v_c

    v_c Notebook Evangelist

    Reputations:
    124
    Messages:
    635
    Likes Received:
    0
    Trophy Points:
    30
    It's all about price in these economic times. I don't care if Nvidia brings out a card that is 30% faster than *whatever* if the thing adds $800 to the cost of the laptop.

    If they can bring out high-performing cards that don't cost the earth, excellent. I guess we will see in time.
     
  13. kal360

    kal360 Notebook Consultant

    Reputations:
    3
    Messages:
    270
    Likes Received:
    0
    Trophy Points:
    30
    Is my 8800M GTX redundant now? :( I want the one GPU to rule them all.

    Time for a new laptop soon, but then there are my financial concerns... hmmm
     
  14. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Please. The 8800M GTX/9800M GT won't be obsolete until the only playable resolutions we have are 1280x800 and lower. We're a looong way from being that desperate.
     
  15. emike09

    emike09 Overclocking Champion

    Reputations:
    652
    Messages:
    1,840
    Likes Received:
    0
    Trophy Points:
    55
    The article does not convince me at all. Sounds like a 180 and not a 280. The 280 uses a completely different architecture than the G92x does.
     
  16. Red_Dragon

    Red_Dragon Notebook Nobel Laureate

    Reputations:
    2,017
    Messages:
    7,251
    Likes Received:
    0
    Trophy Points:
    205
    No, not yet. It's just a matter of time before Nvidia feels the sting; the 4000 series is only coming out now. Wait till summer and we're going to start seeing Nvidia GPUs cheaper than they've ever been :D
     
  17. Gophn

    Gophn NBR Resident Assistant

    Reputations:
    4,843
    Messages:
    15,707
    Likes Received:
    3
    Trophy Points:
    456
    Ok, this should be interesting.

    the Clevo M980NU (a direct competitor to the ASUS W90) will debut at CeBIT


    Since it will have dual MXM IV slots, hopefully Clevo will debut it with dual GTX 200 cards. :)
     
  18. justinkw1

    justinkw1 Notebook Virtuoso

    Reputations:
    3,773
    Messages:
    2,847
    Likes Received:
    66
    Trophy Points:
    66
    The GeForce GTX 260M and 280M, in this case, are not a completely new generation of chips; these are based on the G9x series as well. The main reason I know this is what's listed in the drivers (see version 179.44). I've included a few driver entries below.

    Code:
    NVIDIA_G92.DEV_060A.1  = "NVIDIA GeForce GTX 280M"
    NVIDIA_G92.DEV_0618.1  = "NVIDIA GeForce GTX 260M"
    NVIDIA_G94.DEV_0631.1  = "NVIDIA GeForce GTS 160M"
    NVIDIA_G94.DEV_0632.1  = "NVIDIA GeForce GTS 150M"
    NVIDIA_G96.DEV_0651.1  = "NVIDIA GeForce G 110M"
    NVIDIA_G96.DEV_0652.1  = "NVIDIA GeForce GT 130M"
    NVIDIA_G98.DEV_06EC.1  = "NVIDIA GeForce G 105M"
    NVIDIA_G98.DEV_06EF.1  = "NVIDIA GeForce G 103M"
    
    (Full details on this driver can be found in this thread.)

    The GeForce GTX 170M and 180M will also be based on the G9x generation, and entries for these are present in the v179.43 Windows 7 drivers.
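    A quick way to see the pattern in INF entries like the ones above: the token before `.DEV_` names the actual silicon (G92, G94, ...), regardless of the marketing name on the right-hand side. A minimal sketch that groups the quoted entries by chip (the parsing code is illustrative, not part of any Nvidia tool):

    ```python
    import re

    # A few of the driver INF entries quoted above.
    entries = """
    NVIDIA_G92.DEV_060A.1 = "NVIDIA GeForce GTX 280M"
    NVIDIA_G92.DEV_0618.1 = "NVIDIA GeForce GTX 260M"
    NVIDIA_G94.DEV_0631.1 = "NVIDIA GeForce GTS 160M"
    NVIDIA_G96.DEV_0652.1 = "NVIDIA GeForce GT 130M"
    """.strip().splitlines()

    chips = {}
    for line in entries:
        # NVIDIA_<chip>.DEV_<device id>.<n> = "<marketing name>"
        m = re.match(r'NVIDIA_(\w+)\.DEV_([0-9A-F]+)\.\d+\s*=\s*"(.+)"', line.strip())
        if m:
            chip, dev_id, name = m.groups()
            chips.setdefault(chip, []).append((dev_id, name))

    for chip, cards in sorted(chips.items()):
        print(chip, "->", ", ".join(name for _, name in cards))
    ```

    Run against the full 179.44 list, this makes the rebrand obvious at a glance: both "GTX 2x0M" parts sort under G92.
    
    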
     
  19. sohail99

    sohail99 Notebook Consultant

    Reputations:
    64
    Messages:
    294
    Likes Received:
    0
    Trophy Points:
    30
    Niiiccee!! :D

    I hope Clevo makes the cards for existing users too, so that they can easily upgrade to the new GPUs! :D
     
  20. Althernai

    Althernai Notebook Virtuoso

    Reputations:
    919
    Messages:
    2,233
    Likes Received:
    98
    Trophy Points:
    66
    But will it only cost $2200? The interesting thing with the ATI cards is not performance. I very much doubt the difference will be large enough for anyone to care unless they're dead set on having "the best" at any cost. The reason they're exciting is that the price of a high end, dual mobile card is now around $500. Nvidia's pricing was on the order of double that very recently.
     
  21. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    I might have missed the news; where did you guys see the ATI cards not performing?
     
  22. Quicklite

    Quicklite Notebook Deity

    Reputations:
    158
    Messages:
    1,576
    Likes Received:
    16
    Trophy Points:
    56
    Well, in this thread, an Asus W90 owner managed 13k in 3DMark06 with a T9600 + 4870 X2, stock driver and no OC; considering a Dell M1730 with a dual core (800 FSB) and 8800M GTX SLI gets a similar score stock, it doesn't look great. However, the drivers and lack of tweaking are probably to blame.

    http://forum.notebookreview.com/showthread.php?t=354885


    An improvement to 14.1k in the same bench was made by PCtuning.cz using the same setup; I assume this was with a later driver (not the latest 9.2), but also with a good amount of tweaking.


    While in this thread, another 4850 system, the MSI 725, equipped with a P9500 and a single 4850, managed 10.5k with a mild overclock on both CPU and GPU (CPU: about 3 GHz; GPU core at 550 MHz, up from 500). This score is similar to or less than a 9800M GTX in some systems.

    http://forum.notebookreview.com/showthread.php?t=349540&page=91



    Now we are waiting for the Advent 6555's 3DMark06 score. What makes this system stand out is its Intel Q9000 @ 2 GHz, shipping with a single ATI Mobility HD 4850 at a GPU core of 500 MHz. I expect this to fetch 11.5-12k in 3DMark06 with the latest driver, due to the CPU weighting of 3DMark06; however, the difference won't be that big, as the Q9000 actually shares 6 MB of L2 cache over 4 cores, as opposed to 2 in the P9500. And benches aside, 2 GHz might underperform in some single-threaded apps.

    Over at CNET, an Alienware M17 was just benched; it returned 11k+ using an Intel Q9000 CPU combined with a last-gen 3870 (done at XGA, however, which is lower than all the others). Considering a dual-core system with a single 3870 gets about 8-9.2k at XGA on average in 3DMark06, the up-to-3k difference shows the significance of CPU weighting in this specific benchmark.

    So in short, the current 4 series has caught up with the best of the current-gen NV cards while outdoing them on price. However, it has failed to run circles around the likes of the 9800M in any convincing manner, especially in dual-card configurations.

    This might be because there are only a few systems out using 48x0 series GPUs, and those have either high-end dual cores or the lowest-end quad, which doesn't help the 48x0 excel in a benching environment (which seems more hyped than real game performance). However, we can expect the Alienware M17 and Asus W90 to be updated with the ATI 4870 X2 and higher-end quads (Q9100/QX9300) soon, which ought to truly reflect the supremacy of the 4850 against current-gen NV cards.

    Another tiny concern might be that the 4850 is clocked relatively low, and the Overdrive app does not allow an OC of more than 50 MHz on the core; but I guess this helps make the card more laptop-friendly.

    However, I think it's fair to assume that the 48x0 ought to be less CPU-dependent than the 38x0 series, while its more powerful raw specs probably mean it's more than merely a pretty face in benches, and certainly not a chocolate teapot in real games.
     
  23. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Hmm, now I see what new GPUs my XPS M1730 supports with the new BIOS. The only change in the A10 BIOS is just "New GPUs". Woot, 280M GTX SLI :)
     
  24. eleron911

    eleron911 HighSpeedFreak

    Reputations:
    3,886
    Messages:
    11,104
    Likes Received:
    7
    Trophy Points:
    456
    I don't think they're actual GTX 280Ms; more like a 55nm 9800M GTX+...
    Don't get your hopes up so soon.
    Granted, our system has enough juice for 2 big cards, especially since we don't have the extra HDD and 95W quad core of the 9262, and have 10W more on the power adapter, but there is still the matter of space and heat management...
    The 8800M GTX SLI hits 80C on the XPS, and if they didn't shrink the GTX 280M enough, it might not even matter.
    One can only hope, though. With SLI 280M GTX, I'd also jump on an X9000 and be set for quite a while.
    Not that I'm not still... :D
     
  25. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    A few have hit on the peripheral sides of the issue, but this needs to be said in bold and underlined.

    Both the GT280M and GT260M are underclocked versions of the G92b GT250, which is a renamed 9800 GTX+, which is in turn an overclocked, 55nm version of the 8800 GT/8800 GTS.

    That's right: NVidia's big new GPU is simply an overclocked die shrink of what's already in Clevo/Alienware's high-end notebooks (those might already be 55nm; if so, this is an even lower move by NVidia). These aren't GT200 architecture; they're 2-year-old cards getting yet another renaming.

    Want to know the performance? Just ask johnksss, since he's pushed the clock limits on the 9800M GTX, his scores are as high as any stock clocked GT280M part will be. There is no new GPU here folks, move along.
     
  26. Pranalien

    Pranalien Notebook Veteran

    Reputations:
    20
    Messages:
    706
    Likes Received:
    2
    Trophy Points:
    31
    What's up with Nvidia? If Jlbrightbill is right, Nvidia will lose a huge market share in the mobile graphics segment for sure.
     
  27. gary_hendricks

    gary_hendricks Notebook Evangelist

    Reputations:
    29
    Messages:
    561
    Likes Received:
    0
    Trophy Points:
    30
    AFAIK, this is true:

    nVidia has nothing up their sleeve right now... so they are just confusing
    people with their (stupid) (re)naming schemes.
     
  28. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    But that is IF enough people realize this. But let's see what CeBit shows.
     
  29. Templesa

    Templesa Notebook Deity

    Reputations:
    172
    Messages:
    808
    Likes Received:
    247
    Trophy Points:
    56
    Quicklite, check again, I just hit 15118 with that same set up no GPU OC.
     
  30. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Nvidia has 40nm mobile chips with GDDR5, based on GT200 architecture, on the roadmap for Q3-Q4 of this year. This is a fact that came out of CES.

    55nm overclocked chips for Q1-Q2 were also on this map. This isn't news.
     
  31. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    Desktop chips. These mobility chips are reportedly, if Fud can be trusted (Which it often can't), G92b.
     
  32. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    It doesn't make sense. There would be no point in GTX 180M/170M (overclocked G92 chips) if GTX 280M/260M are what this article says they are (overclocked G92 chips).
     
  33. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,082
    Trophy Points:
    931
    Yet another BS renaming scheme . . . Nvidia, how about taking some pointers from ATI and actually improving your architecture so the performance is significantly better? :rolleyes:
     
  34. Fordx4

    Fordx4 Notebook Guru

    Reputations:
    35
    Messages:
    61
    Likes Received:
    0
    Trophy Points:
    15
  35. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    That article is also by Charlie Demerjian who is a notorious NVidia hater. Go look at his article history.
     
  36. Fordx4

    Fordx4 Notebook Guru

    Reputations:
    35
    Messages:
    61
    Likes Received:
    0
    Trophy Points:
    15
    If you don't like that article, how about [H]ardForums being cut from nVidia's list of reviewers? Are they haters too?
     
  37. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    Speculation time:
    A single GTX 280M, but with 240 to 256 SPs, would explain why there is only one card running under an i7 core, and since it's an 18.4", the heatsinks would have to be massive. If this concoction even keeps up with dual 4870s @ 1600 SPs, then it may be ATI with the problem later down the line. A single 4850 is on par with a 9800M GTX; they both have give and take when dealing with one card. The 4870s were released not just to compete against the 9800M GTX, but to smash it... and they're not doing that just yet. There seems to be a slight flaw: "the driver". 3DMark06 has been ATI's for a while now. They did capture the Vantage crown, but lost it when the i7s came out. Nvidia was watching, so they may make the all-star play and run a full GTX 280/260, since technically they have the space for it now, but in mobile form...

    But running 128 SPs on a 256-bit bus... yeah, I don't see that even being real competition for ATI. And the FX and GeForce do not operate the same; we've already been down this road, and the 9800M GTX smashed the Quadro FX 3700M. So if that was their plan, that would be very stupid: a 3D modeling chip in a gaming machine.
    Flip side:
    It would also explain why no SLI, since those chips aren't SLI-capable.

    As for the naming game, I'm going to hold off on that one this time. A lot of people are watching this one, so they might just surprise a few of you... or not. :D
     
  38. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    Nvidia is in so much trouble. March is coming; be very afraid, Nvidia, be very afraid...

    Oh yeah, to those who still say the ATI HD 4870 is underperforming: I think you should go read the W90V Owner's Lounge thread. From PAGE 1 to PAGE 60!
     
  39. ChinNoobonic

    ChinNoobonic Notebook Evangelist

    Reputations:
    273
    Messages:
    638
    Likes Received:
    1
    Trophy Points:
    30
    Nvidia is always competitive with ATI. The 4870 is currently ATI's mobile flagship; Nvidia wants to compete by releasing new GPUs that outperform or perform on par with the ATI cards.
     
  40. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    News flash... IT IS UNDERPERFORMING!

    Until they get the right drivers, it will continue to UNDERPERFORM.
    :)
     
  41. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    At least it ain't underperforming because of hardware issues, or even worse, frying itself to bits (hint).

    15k in 3DMark05 with a bad driver is actually quite a good feat (and bad at the same time).
     
  42. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    3DMark06 is best run with factory drivers or the stock original drivers. It's just funny like that, since the factory drivers don't have all the limits enforced by the newer drivers. (Long story there.)

    Never had a chip fry on me, so I can't relate there... :)

    They can't make that call till it comes out and someone puts it to the test. And if it fails, then it will be cheaper than the ATI... lol (speculation)

    But they've already started undercutting ATI with their desktop cards; the 285/295 are cheaper and perform better. So who knows... :)
     
  43. lunateck

    lunateck Bananaed

    Reputations:
    527
    Messages:
    2,654
    Likes Received:
    0
    Trophy Points:
    55
    Yeah, you're right. I do hope the GTX fails on the economic front, so that Nvidia learns their lesson.
     
  44. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    Bottom line is, unless NVidia is prepared to have power consumption off the charts, there is physically no way they can match the 4870's performance on a 256-bit bus. The GTX 260 is 448-bit and the GTX 280 is 512-bit. Outside of GDDR5 and 40nm, it won't happen.
     
  45. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    looks like speculation not fact....
     
  46. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    It's not speculation... there are certain power and physical width requirements that NVidia can't meet without sacrificing power consumption or size.
     
  47. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
  48. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    It's not so much that; it's more about trying to fit a 448-bit or 512-bit memory bus into an 80x100mm package. Due to the number of transistors required for this, it would take a lot of PCB real estate that is not available on MXM modules, or on modules small enough to fit in notebooks. ATi's approach of using GDDR5 memory is better in that it doesn't require as much space to achieve the same bandwidth. This is why the desktop 4870 X2 fits on a single PCB, while the GTX 295 is basically on two PCBs, although the GTX 295 consumes less power than the 4870 X2.
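    A back-of-the-envelope sketch of the bandwidth argument (the clocks below are illustrative round numbers, not the actual specs of any of these parts): peak memory bandwidth is just bus width in bytes times the effective data rate, so a narrower bus with faster GDDR5 can land in the same ballpark as a much wider GDDR3 bus.

    ```python
    def bandwidth_gbps(bus_bits: int, effective_mts: float) -> float:
        """Peak memory bandwidth in GB/s: (bus width in bytes) * effective transfers/s."""
        return (bus_bits / 8) * effective_mts * 1e6 / 1e9

    # Illustrative data rates: GDDR3 around 2000 MT/s, GDDR5 around 3600 MT/s.
    wide_gddr3 = bandwidth_gbps(512, 2000)    # desktop-style 512-bit GDDR3 bus
    narrow_gddr5 = bandwidth_gbps(256, 3600)  # notebook-friendly 256-bit GDDR5 bus

    print(f"512-bit GDDR3 @ 2000 MT/s: {wide_gddr3:.1f} GB/s")
    print(f"256-bit GDDR5 @ 3600 MT/s: {narrow_gddr5:.1f} GB/s")
    ```

    With these example numbers the 256-bit GDDR5 setup reaches roughly 115 GB/s versus 128 GB/s for the 512-bit GDDR3 one, at half the pin count and far less board area, which is exactly why GDDR5 is the practical route for an MXM-sized card.
    
    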
     
  49. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    Now that sounds more like it.
     
  50. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    ...he said exactly what I said, but in different words; you just didn't bother to understand.
     