The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Geforce GTX280M and 260M to launch at CeBit

    Discussion in 'Gaming (Software and Graphics Cards)' started by ichime, Feb 24, 2009.

  1. RaYYaN

    RaYYaN Back on NBR :D

    Reputations:
    375
    Messages:
    1,632
    Likes Received:
    1
    Trophy Points:
    56
    This thread is really interesting
    Quite enlightening in fact

    *subscribes and sits back ;)*
     
  2. The_Moo™

    The_Moo™ Here we go again.....

    Reputations:
    3,973
    Messages:
    13,930
    Likes Received:
    0
    Trophy Points:
    455
    ill make it better :D ATI FTW
     
  3. RaYYaN

    RaYYaN Back on NBR :D

    Reputations:
    375
    Messages:
    1,632
    Likes Received:
    1
    Trophy Points:
    56
    thanks for that little gem man ;)
     
  4. Micaiah

    Micaiah Notebook Deity

    Reputations:
    1,333
    Messages:
    1,915
    Likes Received:
    41
    Trophy Points:
    66
That's exactly why we didn't see the G280 show up with the Eurocom until now. Shifting over to a 256-bit bus and compensating for the narrower memory interface with GDDR5 means there is no loss in memory bandwidth, and the die size can be shrunk considerably to make it more mobile-platform friendly. It works well for the 4870, and I'm glad nVidia is taking this route. This will also lower their die production cost, since more dies can be produced per wafer.
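The bandwidth trade-off described above can be sketched with a quick back-of-the-envelope calculation. The bus widths and data rates below are illustrative desktop-part figures from the period, not confirmed mobile specs:

```python
# Memory bandwidth = (bus width in bytes) x (effective data rate).
# Illustrative figures: desktop GTX 280 (512-bit GDDR3) vs HD 4870 (256-bit GDDR5).
def bandwidth_gbps(bus_bits, data_rate_mtps):
    """GB/s from bus width in bits and effective data rate in MT/s."""
    return (bus_bits / 8) * data_rate_mtps / 1000

gtx280 = bandwidth_gbps(512, 2214)   # 512-bit GDDR3 at ~2214 MT/s effective
hd4870 = bandwidth_gbps(256, 3600)   # 256-bit GDDR5 at ~3600 MT/s effective
print(round(gtx280, 1), round(hd4870, 1))  # 141.7 115.2
```

Because GDDR5 roughly doubles the per-pin data rate of GDDR3, a 256-bit GDDR5 bus lands in the same bandwidth neighborhood as a much wider GDDR3 bus, with far fewer memory pins, traces, and die-edge area, which is what makes the smaller die and simpler board practical.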
     
  5. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
Actually, yes you did. I misread the part where you said "power consumption OR size". But yeah, it's more about size than about power consumption, to put it a better way.
     
  6. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    Well I was referring to johnksss, your rephrasing is fine. :)

    TRF-Inferno: If it's taking NVidia over a year to bring GT200 series cards to the mobile market, what makes you think the mobile counterparts to the 40nm die shrink / GDDR5 transition will make it to the mobile market within 3 months? I've yet to see confirmation that these new parts are 40nm or GDDR5. It would be nice if they actually did release a new GPU instead of rebranding an old one, but it sure isn't promising right now.
     
  7. Tippey764

    Tippey764 Notebook Deity

    Reputations:
    377
    Messages:
    1,423
    Likes Received:
    0
    Trophy Points:
    55
    Nvidia has lost me as a customer.
     
  8. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    Again, let's see what they say come CeBIT. Most people really thought these cards were going to be 40nm and GDDR5, but it seems Fudzilla may be right when you look at the Device ID from the drivers. They could use GDDR5, but I don't think the g92b supports GDDR5 yet.

    Speaking of 40nm and GDDR5, Guru3D has a preview of ATi's new RV740 chip (40nm 128mb, GDDR5), which is supposed to be the 4750. If these results are accurate and the target is correct, then WOW! Q2 09 is going to be EPIC! ATi also already has an M97 mobile chip based on the RV740. I'd like to see what systems use that.
     
  9. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    I think you mean 128-bit, not 128MB. But yes, I read that earlier and it's going to be hot.
     
  10. Micaiah

    Micaiah Notebook Deity

    Reputations:
    1,333
    Messages:
    1,915
    Likes Received:
    41
    Trophy Points:
    66
I didn't mention anywhere that nVidia would bring the G280 to the market in 3 months; I specifically stated that with the 40nm die shrink and GDDR5, it makes the GT212(?) much more applicable for the mobile market. I read somewhere the G280 used in that one Eurocom machine is in fact 40nm, so at least we know nVidia does have a working 40nm sample if that statement holds true.
     
  11. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    see that? speculation

    we all know about 256/448/512 bit memory.

    your speculation of the power is just that, speculation.

    and since no one has seen the gtx280m/gtx260m it's still speculation. they have a whole new design and plenty of space to work with.
    18.4 inch laptop
    with 2 drives and not 3
    1 vga card and not 2

who's to say this pcb board won't have two gpus on it? no one knows anything, just a bunch of talk. so if you're speculating...that's all cool, but trying to pass it off as a fact when it ain't? not cool. :)

but hey, prove it wrong and i'll retract it right now.
     
  12. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    I don't think you know the meaning of the word speculation.

    It's not speculating to say that within a given system, when constants are known, one of a few very specific situations has to happen. Power consumption off the charts, or radical new design. There is no speculation there.

Speculating would be me estimating performance, or making inferences based on incomplete information. Physically, you simply cannot make a GTX 280 fit in a notebook on its current bus width and architectural design, so either it's coming with GT212, or the power consumption is off the charts, or they neuter it so much that it can barely even be called a GTX 280.

    That is all fact. Please take off the NVidia blinders.
     
  13. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
like i said the first time, speculator.
your education means nothing if you don't have the facts.
and you can take off the "ati is the world's answer to vga" blinders.

they both have their good and bad points, plain and simple. i understand your biased opinions are just that. fine....but like i said, you're speculating.

    take it how you want to.
     
  14. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
Bias? I had an MSI GTX 260 in my shopping cart at Newegg and was checking out until I saw a 1GB aftermarket-cooler HD 4870 on sale. The desktop GTX 260, 280, and 285 are great cards and are very competitive with ATI at their price points. The GTX 285 currently has no direct competition. I'm not a fanboy.

You clearly have not followed much of anything since July in terms of ATI's new GPUs. From the 4670 all the way through the 4850, ATI has no direct competition. They're also already on 55nm with their high-end parts, something NVidia just managed (albeit on a 448/512-bit interface). ATI already has 40nm out in the field with RV740; NVidia still does not. ATI has had GDDR5 out for 8 months; NVidia's is still months away. ATI released mobile chips with identical architecture to the desktop, just lower clocks. NVidia releases chips that are a year and a half old, renames them, releases them again, renames them again, raises clocks, and tries to pass it off as new. The new NVidia slogan around the internet is "The way it was meant to be renamed."

It doesn't take a speculator or a fanboy to see they have no answer to ATI and haven't for the past 8 months. There are the facts; let's see yours.
     
  15. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
last time i checked, nvidia held most of the titles once again...
you're doing a lot of excess talking for nothing.
all i said was about the power consumption thing and that was pretty much it. just like you made the same call way back when, about the 4870s being of lesser power and heat, and when the cards came out, the 285/295 cards use less power and generate less heat. we already established the renaming thing quite some time ago; not sure why you thought that was of interest here. nvidia already got busted on renames as well, or was that something the "whole world missed" and you thought you would mention it again? not sure what world you're living in, but ati was shut down for a few years. i used to run all ati cards before nvidia took over. now ati comes with some decent cards. it's about time is what most are saying, not sure about you.

    you downing nvidia has nothing to do with me. they have an email system..email them your complaints. :)
     
  16. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    ATI has "some decent cards"?

    I'm not oozing fanboy, that sentence sure is.
     
  17. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    huh?

i see you're dealing with nvidia issues.
no wonder your rep isn't really going up any.

    you have a nice day bill. :)
     
  18. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    One person has brought facts to this thread. The other has not.

    I think that speaks for itself. Mainly you're just trolling.
     
  19. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
is that the best you can do...first call me a fanboy, then come back with calling me a troll? wow! have a nice day, child.
     
  20. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    Post #67 is straight up trolling, pure and simple.

    I'm not going to clutter up this thread any more with off topic posts that are likely making more work for the site's moderators. Civil discussions should not need to be polluted by trolling.
     
  21. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
Geforce GTX280M and 260M to launch at CeBit <---- not ati
post #67 says it all to most reading. now go make yourself useful and help someone instead of trying to argue yourself to another non-existent point.
     
  22. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
You sure sound like an ATI fanboy to me. Also, to you it sounds like ATI is beating Nvidia on all fronts in performance, which is not true. Doesn't matter if they were at 55nm before Nvidia; they still don't beat them as much as you would like to think they do. Take off those ATI fanboy goggles.

From what I read, johnksss isn't a fanboy. Reading through all these pages just gives me a feeling that you think you are always correct on your "facts".

I agree the Nvidia renaming scheme is getting old now.
     
  23. Dox@LV2Go

    Dox@LV2Go Notebook Consultant

    Reputations:
    247
    Messages:
    188
    Likes Received:
    0
    Trophy Points:
    30
    lol,

ultimate test to see if you are a fanboy:
would you get the GTX280m knowing it's still a revision of the G92 core?
(assuming that's correct for the GTX280m's specs)

at least with ATi it's a new architecture (yeah.. yeah.. for all the nvidians out there, we've been through it before)
     
  24. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
let's see....if it's faster than all this high-tech stuff that is out now, then i would grab it. then that would have a bunch of people looking like fools, like some have already done. me personally....i'll wait till it actually comes out and is tested before making judgment calls. and if it's one card performing = or better than a crossfire setup, then i guess people would have something to say about that as well.... only time will tell, and it looks like we will get a real glimpse next month...
     
  25. dngfng

    dngfng Notebook Enthusiast

    Reputations:
    0
    Messages:
    34
    Likes Received:
    0
    Trophy Points:
    15
let's just sit back and wait to see what CeBIT brings.

OR, much more to the point, sit back and await the first independent reviews.
     
  26. Quiz

    Quiz Notebook Enthusiast

    Reputations:
    0
    Messages:
    29
    Likes Received:
    0
    Trophy Points:
    5
Heh, everybody is talking about the huge power consumption of the desktop GTX 280 and how it can't be "mobilized", well may I remind you that a desktop 4870 consumes only 25% less than a 280 at load, but then again the Ati consumes 56% more at idle.
So if Ati managed to make a mobile version of this card (hell, there is an x2 version), don't tell me Nvidia's engineers are a bunch of incapable idiots.

Let's just wait and see, not that we can change anything by ranting on some forum ;)
And let's not turn graphics cards into a cult: if Ati comes out on top, buy Ati; if Nvidia, buy Nvidia.
     
  27. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
Hm? I'm giving credit where it's due, and not just "some decent cards".
     
  28. souroull

    souroull Notebook Evangelist

    Reputations:
    182
    Messages:
    347
    Likes Received:
    1
    Trophy Points:
    31
nvidia lost me as a customer when my 8600m is still pushing out 100 degrees Celsius running youtube vids, after being RMAed TWICE. not to mention all the rebranding fiasco that just won't stop, undermines everyone's intelligence, and just scams the idiots.

the hell with that, i don't care what comes out of nvidia this time, ati is my only option
     
  29. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
  30. Mormegil83

    Mormegil83 I Love Lamp.

    Reputations:
    109
    Messages:
    1,237
    Likes Received:
    0
    Trophy Points:
    55
If it's faster and cost-effective, i'll buy it (when i'm in the market); don't care who is selling... I'm a bargain shopper, so "bang for buck" is what i look for. I have a hard time spending double if it's not double the performance, or more likely if it doesn't "suit my needs" double... there is no point in supporting one company over the other unless you are somehow affiliated with them by work or some other means...
     
  31. Micaiah

    Micaiah Notebook Deity

    Reputations:
    1,333
    Messages:
    1,915
    Likes Received:
    41
    Trophy Points:
    66
    More info on this GTX280M...brought to you by Fud a.k.a. The Inquirer: The same folks that started the rumor that the Radeon HD 4800 series will have 480 stream processors.

    Uh yeah, pass. I'll believe it when more credible sources are available. The 50 percent performance increase in certain games alone sounds like a bunch of hocus-pocus. I traded my 8800GT for a G92 8800GTS and it barely yielded an extra 5-8 FPS in most games.
     
  32. Quicklite

    Quicklite Notebook Deity

    Reputations:
    158
    Messages:
    1,576
    Likes Received:
    16
    Trophy Points:
    56
Not really expecting anything from the NV camp; ATi would probably show the 4890 desktop card, which would seem wise considering the threat of the GTX 295. Also, the Red camp has hardly ever had mobility and desktop platforms so similar in spec (mobility and desktop 48x0), so the desktop line would probably have to evolve as the mobility offering catches up.
     
  33. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    Look at this roadmap from CES:

[IMG: nVidia roadmap slide from CES]

    Does anyone want to take a bet that the H2 2009 chips aren't GT200 architecture?

    The only question is how much these chips will cost.
     
  34. unknown555525

    unknown555525 rawr

    Reputations:
    451
    Messages:
    1,630
    Likes Received:
    0
    Trophy Points:
    55
I think that renaming the old G92 chips to the GTX2xxm series is just what nVidia would do; I can't see this being too far off. As far as we know, this COULD be a G92 with GDDR5, which could explain the performance increase talked about.

Also, nVidia is worth 3.5x as much as AMD/ATI as a whole; they have no excuse for this crap.
     
  35. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    If those specs are correct, then johnksss's current overclocked 9800M GTX's (660/965 from his sig) are faster than SLI GTX 280M's will be. That's both impressive that his cards overclock that high and disappointing that NVidia isn't releasing a better product. Granted it will have 128 SP's, or "CUDA cores" now, but that doesn't change the fact it's not GT200.

Not quite. I'm again referencing johnksss's 3DMark06 benchmark with 9800M GTX's in SLI that are higher clocked than this new GTX280M. His highest recorded HDR/SM 3.0 subscore is 8054 at 660/965; 4870 SLI is around 8250 stock.

    Again, if the article is correct, they're still shipping with GDDR3. That's no surprise since not even the desktop cards are getting GDDR5 for another quarter or so. 9800M GTX owners should be pretty happy with their cards right now.
     
  36. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
3D Mark, pfft; try Vantage instead, which isn't as CPU-limited as 3D Mark 06 is. You get a far better overview of how the computer performs as a whole using Vantage.

The higher your CPU is clocked, the higher your 3D Mark score, while overclocking the GPUs doesn't yield that many points compared to the CPU.

    Or if you want a real benchmark go with Crysis instead.
     
  37. Dox@LV2Go

    Dox@LV2Go Notebook Consultant

    Reputations:
    247
    Messages:
    188
    Likes Received:
    0
    Trophy Points:
    30
crysis is nvidia's game, so to speak.
3d mark is usually ati's.

    what to use as a fair benchmark....
     
  38. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
Well, Crysis is an actual in-game benchmark, while 3D Mark isn't. Now I mean 3D Mark 06 of course, which relies more on actual CPU speed and number of cores.

Now I have seen Crysis run really well on ATI boards too. Just look at youtube.

On the other hand, I hope there comes a real GTX 260. I bought one yesterday and it overclocks really well. Now, it was factory overclocked to begin with, and I raised it even further.
     
  39. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
im gonna have to say...that's a far-fetched reach there, and not with some basic setup either.... they say 50 percent, huh, on old technology....if that comes to pass...looks like renaming will continue....

    that's just it..my clocks are low. this card clocks at 700/1000/1720. i just found that i could get more from less....ask this guy
    single gtx 3dmark06 score of 13,566

the best i can come up with is 8130 for this. the rest is based on system-wide overclocking. also..dual cores raise these scores a lot, but can only go so high. we found this out with the ocz whitebook using an x9100 and a q9300. the q9300 had overall better scores, but the x9100 had better 2.0/3.0 scores. dual core vs quad core.
another example: click on my best score and look at how many cores are running... you will notice it shows 3

actually i am. if i had a system that could be fully overclocked like the w90, im pretty sure the 9800m gtx would indeed be up there. near 17k without overclocking the fsb/cpu/pcie is pretty impressive by itself.

    so far it got a 9750 or so for gpu score in vantage. they are still testing....
    a 9800m gtx is at 10930 for gpu

    apparently not.
FH with a 4850 got 32 fps in warhead on gamer settings...which was truly interesting to say the least, while everyone else got 15 to 20 fps on a single card. it even beat out a gtx280 and a hd 4870...hahahah, a mobile card. don't ask, because i have no clue...lol

    and the fair test would be to run this program.
    http://downloads.guru3d.com/FurMark-v1.4.0-download-1965.html

    it's all gpu and takes about 1 minute to run.

although, the minute you add aa, the gtx cards drop kind of fast. without it...they are on top; start adding it and they start losing frames pretty quick.
    still testing on that part right now....
     
  40. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    And I got 9251 GPU score in Vantage PhysX Disabled with my 8800m GTX SLI.

Now, 3D Mark 06 is hilarious to say the least. Tested 3D Mark 06 on my desktop now: 1280x1024, 13430-something; 1440x900, 13298. CPU limitation comes in right away at 1280x1024. If I raised my CPU to 3.4 I would score much higher, and that with only a 400MHz increment on the CPU alone.
     
  41. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
i got a 10943 no physx.

yeah, i keep trying to explain that to people, but some don't understand it...if i were to raise this by 400 mhz it would gain about 1k on cpu and 200 to 600 points on 2.0/3.0 scoring, and with my 2.0 being 6600 that would put it in the 7k range, and the 3.0 score in the 8500-8800 range....
     
  42. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
Yes that is correct, johnksss, and I agree it can't be that hard to understand.
     
  43. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
I fully realize the limitations of 3DMark06. Unfortunately, with Vantage being a one-use-or-pay model and Crysis not being free, SM 3.0 subscores are currently one of our only consistent benchmarks.

Crysis at 1680x1050, all High, no AA/AF is currently the best mobile benchmark, although it favors NVidia. Each side has their 'home run' games: Crysis is one of them for NVidia, S.T.A.L.K.E.R. has always been one for ATI, etc. But it still is a good measuring stick.
     
  44. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
Yes, I agree STALKER is also a very good measuring tool. If you mean Clear Sky, that is.
     
  45. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
  46. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
I have Shadow of Chernobyl, not Clear Sky, but without an internal benchmark it loses some of its usefulness as a benchmarking tool. Review sites have custom timedemos and lots of hardware, which is a luxury we don't have.

What I like about Crysis is that it doesn't matter if I run my CPU at 3.6 GHz with all 4 cores or 2.4 GHz with 2 cores, I get within 1 FPS on benchmarks. The CPU, as long as it doesn't suck, is irrelevant in it.
     
  47. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Oh Shadow of Chernobyl ran great on my XPS M1730 in SLI mode maxed out 1920x1200.

    Clear Sky on the other hand is a different story in DX10 at 1920x1200 :)

john, furmark is a very good stress tester for the GPU too :)
     
  48. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
Clear Sky looks pretty good, but I agree it's nowhere near Crysis. This only proves Crysis is pretty darn optimized, considering how the game looks.
     
  49. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
johnksss, have you run Vantage on the 'High' preset? My monitor resolution is too low to run Extreme, but I'd be interested to see how my lone 1GB 4870 desktop scales on High versus the 9800M GTX's in SLI.
     
  50. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
Ha ha, yes, GTA 4 is another story. What other games can you think of that are really demanding on the hardware? We have so far Crysis, GTA 4 (which definitely takes the throne there), and Clear Sky.

World in Conflict is pretty good too, but relies pretty much on the CPU.
     