The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Geforce GTX280M and 260M to launch at CeBit

    Discussion in 'Gaming (Software and Graphics Cards)' started by ichime, Feb 24, 2009.

  1. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
Nah, my card isn't going to destroy yours (I think). I benched Vantage against johnksss, but that's synthetic, so it doesn't generalize to games.

Set all settings to "Very High" DX10 in game at 1680x1050, then run "Benchmark_GPU.bat" in your Crysis folder. Then run it again with 4xAA. Then I'll do the same myself.

Since the Asus W90 people don't have these things to benchmark yet, the best we can do for now is bench against mine and see how the results generalize.
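The with/without-4xAA comparison above boils down to a simple percentage. A rough sketch of the arithmetic, with purely hypothetical FPS numbers (not anyone's actual results):

```python
def aa_performance_hit(avg_fps_no_aa, avg_fps_4xaa):
    """Percent of framerate lost when enabling 4xAA."""
    return (avg_fps_no_aa - avg_fps_4xaa) / avg_fps_no_aa * 100

# Hypothetical example: 30 FPS average without AA, 24 FPS with 4xAA
print(round(aa_performance_hit(30.0, 24.0), 1))  # -> 20.0
```

Comparing that percentage across cards is what shows which GPU handles AA better, regardless of their raw framerates.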
     
  2. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
i can answer all those, but umm..that list is looking pretty long...lol

cpu means nothing when running vantage. it's all on your gpu for about 98 to 99 percent of the run until you hit the cpu section. i ran it on 1 core and 4 cores and the gpu score stayed near identical.

drivers are the biggest reason for differences in scoring.
and it really doesn't matter which ones..there are really about 4 genuinely different drivers and the rest are reworked inf files that change the minute you change a setting.
best for benching vantage is 185.20
best for gaming seems to be 179.28 or the 182.06
best for 3dmark06 is 176.25 or 176.09


magnus72, we've been benching them to try to understand the why part of it.

without AA in the picture...the 9800m gtx will excel. start adding AA and it's downhill from there.
     
  3. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
Yeah, I feel AA isn't even needed at the extreme res I play at, 1920x1200. I barely see the jaggies at all at that res. But yes, these 8800M GTXs suck at AA, I must conclude that. My GTX 260 on my desktop excels at AA though. It was the same with my desktop 8800GTX, it excelled at AA.

So Jlbrightbill is surely right about these G92 cores, they do suck at AA to some degree at least. Not that I say my 8800M GTX sucks at all, in fact they are really fast considering their specs. Hell, I run GTA IV better than people with a GTX 280 out there, I mean on the GTA Forums, though these people don't have a clue about optimizing their system. I don't say my 8800M GTX SLI is faster than a GTX 280, but some people never learn how to optimize their whole system right, nor to optimize the game :)

Just the right drivers along with an optimized OS do a whole lot for a system. I have people asking how I managed to run Crysis at 1920x1200 with my tweaked config like I did. I say chipset drivers are really important here, and installing them the right way.

Jlbrightbill, I can say I have for the Crysis benchmark at 1920x1200 DX9 High

    34.56 Average

    1920x1200 DX10 exactly the same average as DX9 in Vista 64.

Strange thing though, I had a better average before I formatted my hard drive :(

    On Win XP I have 37.29 Average FPS in DX9 High.

I must say your Sagers are definitely ahead of the XPSes. Best would be to get a Sager next time, definitely. See, John had the 9800M GTX a few months ahead of the newly released 9800M GTX for the XPS M1730.

And the 8800M GTX SLI is bottlenecked already by an X9000 overclocked to 3.4 GHz. So to me the 9800M GTX in the XPS M1730 is totally unnecessary since they are still bottlenecked by the CPU. Dell would have been better off adding Quad Core support like Sager has.

I write badly when I am drunk :)
     
  4. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    I don't have a 1920x1200 monitor...

    Highest I can run is 1680x1050.
     
  5. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Oh sorry I meant those were at 1680x1050.
     
  6. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    1680x1050
    [​IMG]
    over clocked 1680x1050 very high dx10
    [​IMG]
    no over clock 1680x1050 very high dx10
    [​IMG]
     
  7. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    no overclock 1680x1050 very high dx10:

    [​IMG]

I get about 7% slower framerates in Windows 7 under DX10 than I should, based on people's Vista benchmarks, so consider that a low-end figure. I should be up around 27 FPS. Stock vs stock, johnksss's cards are ahead by 1.5%.
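The "I should be up around 27 FPS" projection follows from undoing an assumed 7% OS penalty. A minimal sketch of that adjustment, with a hypothetical measured figure (the exact Windows 7 result isn't stated in the post):

```python
def vista_equivalent(win7_fps, slowdown_pct=7.0):
    """Project what a Windows 7 DX10 result would look like without
    the ~7% OS penalty (an assumption taken from the thread)."""
    return win7_fps / (1 - slowdown_pct / 100)

# Hypothetical measured average of 25.1 FPS under Windows 7:
print(round(vista_equivalent(25.1), 1))  # -> 27.0
```

Note the division by (1 - 7%) rather than multiplying by 1.07: "7% slower" means the measured value is 93% of the expected one.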
     
  8. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
Not surprising since Nvidia cards are known to have better performance in Crysis. You should try benching some other games as well.
     
  9. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    that we are, it's just getting enough people together that have the same games is all....
     
  10. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
    I'm grabbing the DMC4 demo since that appears to be one of the other ones people here are using for benchmarks.
     
  11. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
    I can also do some benchmarks on my 512mb 4870, which will be closer to the mobile 4870. Tell me what CPU and GPU clocks to use and I can bench Crysis and 3dmark06 in Windows 7.
     
  12. Jlbrightbill

    Jlbrightbill Notebook Deity

    Reputations:
    488
    Messages:
    1,917
    Likes Received:
    0
    Trophy Points:
    55
You know, you and I have almost identical computers, aznofazns. :)
     
  13. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
    WHOA how did I not see that before? Only real differences are graphics memory and RAM setup... what PSU and hard drive do you have?
     
  14. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
  15. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
Reference clocks for the Mobility 4870 GDDR5 would be 680mhz core, 700mhz memory (memory for the GDDR5 version is 1gb, but only some games show improvement with more memory, including DMC4).
     
  16. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
john, I got 27-28 fps average at 1680x1050 DX10 Very High on Vista 64. Will try 1920x1200 DX10 Very High and see what I get there.
     
  17. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
  18. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
wow, the 9800m/8800m gtx's are hanging in there. ill re-run it at 19x12. i think i was getting like 25 max last time.

you have a very valid point going on there. and it matters a lot how the cards are clocked. just running them at ati's clocks (550/850) compared to my stock clocks (500/799), i gained like 800 gpu points in vantage.
my DT 3dmark score:
http://service.futuremark.com/compare?3dm06=10181674

but when i run furmark, the desktop hd 4870 is beating the gtx280 by 12 frames at 90, while im at 78. and that's not running AA
     
  19. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
  20. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
looks like nvidia is pulling an ati and changing how the cards will seat...oh brother!

i hope that is not a fact.
     
  21. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    Dang dang Nvidia. Well a 128 SP GPU is nice indeed. But not much joy to me there.
     
  22. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
not if that is an overclocked fx3700 card it isn't....
and they say 50 percent increase....im very very skeptical about that.
     
  23. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
Actually, nVidia was the one pushing MXM 3.0; it just so happened that ATi released their reference cards first. But if we learned something from ATi @ CES, it's that those are just design references. Notebook or board manufacturers are at liberty to change the design of the cards, like Asus and I think MSI did with the 48xx cards. I hope Clevo indeed makes their upcoming systems MXM 2.x compatible, because there is nothing substantial this new MXM 3.0 format offers over MXM 2.x other than more pins for the manufacturer to add more parameters when writing/setting up the VBIOS.
     
  24. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
Yes, I agree john, this is basically a desktop 8800 GTS 512 with 128 SPs.

Can't really see how that could be a 50% increase compared to your 9800M GTX. Unless Nvidia has some magically tweaked core for these.
     
  25. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    Now that the specs are pretty much official, I'd have to say that I'm a bit disappointed in nVidia. I thought they would surprise us and at least add GDDR5 or something, but I guess the fudzilla rumors were accurate.

On paper, the GTX 280M looks like an underclocked desktop 9800 GTX. The GDDR3 4870 in the Asus W90vp looks like an underclocked desktop 4850, and the GDDR5 4870 announced at CES falls BETWEEN a regular desktop 4850 and a desktop 4870 just based on hard specs, and we know how the desktop geforce 9 series stacks up to the desktop 48xx series.

Also, the 9800M GTX is close to an 8800 GT/9800 GT, with about a 7% performance difference between them give or take (because of the extra ram on the 9800M GTX), and from the performance comparisons I've seen on the desktop side of things, the most I've seen a 9800 GTX outperform an 8800 GT/9800 GT by is 25%.

    But then again, nVidia are good with drivers...
     
  26. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
    It looks like ATI is going to hold the laptop performance crown for a while... which I have no problem with. I'm tired of Nvidia pulling this renaming/overclocking crap with its mobile cards. Hopefully updated drivers will bring out the potential in the 3700M (GTX 280M), but we all know it still won't be able to touch the GDDR5 HD 4870.
     
  27. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    I wonder if they are going to release a GTX 180M/170M or if they just replaced those with the GTX 280M/260M, respectively.
     
  28. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
Probably not. But if they did, then wow, the laptop gpu market would be so crowded with complicated naming schemes and minor performance differences.
     
  29. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    i already got my emails lined up and ready to start rolling. this card better beat my card by a lot or im going to be posting in their faces on a regular basis...lol

    nothing personal nvidia, but we are about tired of you and this nonsense....
    now im still holding out till the real numbers start to show up, but if they aren't what we are expecting....
     
  30. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    The ultimate insult would be if they still went with rebranding a regular 9800M GTX as the GTX 180M in addition to this GTX 280M...
     
  31. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
    That would be hilarious, embarrassing, and unsurprising...
     
  32. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
hummm, hope they ain't that stupid.

im still not sold on ati either...they could have just opened up the 3870s, made them 16x, and opened up the rest of the stream processors they had shut down. (speculation of course)
     
  33. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
    I'm pretty impressed with ATI's recent history. Just look at the huge comeback they made with the desktop 3800 and 4800 series! Now after a few years they're finally coming back to compete in the high-end laptop sector too. Even if they had locked some of the 3800's processors, the business world is not ideal. ATI is a company after all and it does have to gain competitive advantages through strategy.
     
  34. Micaiah

    Micaiah Notebook Deity

    Reputations:
    1,333
    Messages:
    1,915
    Likes Received:
    41
    Trophy Points:
    66
    This might be of interest to you.

    The RV770 Story: Documenting ATI's Road to Success
     
  35. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
Awesome read... I've known about harvesting for some time but didn't know that was what it was called. Also, I had no idea that the GTX260 was just a harvested GTX280 (a GTX280 with fewer than 48 broken cores has 48 of its 240 total cores locked to become a 192-core GTX260). I also wasn't aware that increasing graphics card prices 2-3 years ago were a result of over-harvesting. Additionally, I wouldn't have called R600 a total failure. The HD 2900 sucked, but the 3850, 3870, and 3870X2 were all really competitive despite large size and power consumption.

    EDIT: Nvm, the 3800 series was RV670, which was slightly revised and wasn't the "total failure" the article was referring to.
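The harvesting arithmetic described above can be sketched in a few lines. This is only an illustration of the core-count bookkeeping the post describes (the exact defect cutoff nVidia used is an assumption; logically, locking 48 cores can hide up to 48 defects):

```python
TOTAL_CORES = 240   # full GT200 die (GTX 280)
LOCKED_CORES = 48   # cores disabled to make a GTX 260

def harvested_part(defective_cores):
    """Return the shader-core count a die can ship with, or None if unsalvageable."""
    if defective_cores == 0:
        return TOTAL_CORES                 # fully working: ships as a GTX 280
    if defective_cores <= LOCKED_CORES:
        return TOTAL_CORES - LOCKED_CORES  # defects hidden among the 48 locked cores: GTX 260
    return None                            # too many defects to salvage

print(harvested_part(0))   # -> 240
print(harvested_part(12))  # -> 192
```

The point of the scheme is yield: dies that would otherwise be scrapped still sell as the lower-tier part.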
     
  36. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    nice read....
    now back on topic
     
  37. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
    Well this still is somewhat on topic. The article shows how the GTX 260 and 280 need a significant revision before Nvidia can really beat ATI in the high-end desktop sector, and since the new mobile Nvidia chips are based off older desktop Nvidia chips and the new mobile ATI chips are based off the highly successful current generation desktop ATI chips, Nvidia still has a ways to go.
     
  38. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
yeah yeah yeah..current..and so far the 4870m is an up-clocked 4850m yeah yeah yeah...*LOL*

no, you're off topic, pal. :D
     
  39. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
    What...? The Mobility HD 4870 has GDDR5 vs the GDDR3 of the Mobility HD 4850. And how am I off topic? I'm still talking about the GTX280M and 260M, although indirectly.
     
  40. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
think you better go back and check your facts. no one has gddr5 memory on any mobile 4870 cards as of right now. the only card even suspected of having it is the dual gpu board, which no one has in a machine as of yet. im sure it's around somewhere, but not advertised at the moment. all the w90s are using gddr3 as of today

    off topic
    ummm, old tech(nvidia) vs new tech(ati)
    http://www.youtube.com/watch?v=ma_Df9lm8DQ
     
  41. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
The 4870x2 is not quite relevant if I'm talking about the mobile 4870 that will be coming out in the near future.

EDIT: Also, comparing the GTX295 and HD4870X2 isn't really valid considering price points, although I will concede that the 55nm shrink is a big step for Nvidia.
     
  42. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
Re: Geforce GTX280M and 260M to launch at CeBit <--- see this? it's mobile, not a desktop card.
i was off topic talking about desktop cards earlier in the thread, so im pretty sure i know what off topic is.. :)
your desktop card is an out of context reference just like my gtx280 is.

The 4870x2 is not quite relevant if I'm talking about the mobile 4870 that will be coming out in the near future. <--- not sure what you were getting at with this though, since the topic is Geforce GTX280M and 260M to launch at CeBit, not ati 4870x2 at cebit. nor is it gtx280 at cebit. that would be gtx280M.
    but that's all the technical stuff....

    back on topic...
man, i sure hope that's not a qfx3700m pass-off, but we shall see if you're correct ichime...less than 12 hours from now....

    flip side

    they said it's sli, so if it is in fact a performer then i might have to look into getting a pair of them. hahaha....
     
  43. The_Moo™

    The_Moo™ Here we go again.....

    Reputations:
    3,973
    Messages:
    13,930
    Likes Received:
    0
    Trophy Points:
    455
bah, if nvidia actually releases something worthwhile imma be sad :( i was lookin forward to being on top for a little bit :D. Anxious to see what these cards are all about
     
  44. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
it's still an if, mr moo. they sure ain't on top if AA is introduced. AA seems to be a mobile gtx killer. lol
     
  45. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
    The thing is, though, if you read what I said earlier, I was referring to the GTX260/280 and HD4870/4850 to try and prove something about their mobile counterparts, which basically is this:

    1) It's a given that GTX260M and 280M are going to be much less powerful than the desktop variants since they're basically based on older 8800 cards.
    2) The desktop 4800 series was a huge success and an innovation for ATI. It stole much of Nvidia's high-end market share, and made Nvidia rethink their business strategy (which I'm thinking was a main reason for the 55nm die shrink).
    3) Therefore, since the mobile 4870 is an unadulterated version of the desktop variant (minus clocks), ATI has another winner in its hands.

    This is merely my logic based off of my readings. Obviously, anyone is free to correct me if I'm wrong somewhere.
     
  46. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    The thing is, though, if you read what I said earlier, I was referring to the GTX260/280 and HD4870/4850 to try and prove something about their mobile counterparts, which basically is this:
    fair enough...

    1) It's a given that GTX260M and 280M are going to be much less powerful than the desktop variants since they're basically based on older 8800 cards.
    this here is still up in the air. since they may be trying to pass a gts chip off as a gtx chip. waiting on the final on that one.

2) The desktop 4800 series was a huge success and an innovation for ATI. It stole much of Nvidia's high-end market share, and made Nvidia rethink their business strategy (which I'm thinking was a main reason for the 55nm die shrink). the 4870x2 stole share also. more than likely a good call as well (the 55nm die)

    3) Therefore, since the mobile 4870 is an unadulterated version of the desktop variant (minus clocks), ATI has another winner in its hands.
the jury is still out on this one. not a whole lot of happy people right at the moment. not taking away from their card, but it's like getting a brand new benz and being told that you will need to use 95 octane to run it. (we only have 91 octane) so the card comes out with the potential to be a solid top dog, but is kept down by drivers. 5 months down the line...they get drivers..hooray!! but guess what, since they took so long....the card is now outdated and then rebranded into something else. yes, ati does the same as nvidia, but they are up front with theirs while nvidia is trying to pass off gts as gtx and so forth and so on.


    side note:
this new chip has to beat me, period. there is just no way around it. and if it doesn't....well....then it's going to be a pretty big problem. i've pushed this card to about 75 percent of what it's capable of doing. it's limited by the machine it's in. so im thinking nvidia was thinking the same thing. although using old tech to do this seems ridiculous to say the least, but if it kicks high tech in the butt....then what? (speculation of course) always two sides to every story....
     
  47. Dox@LV2Go

    Dox@LV2Go Notebook Consultant

    Reputations:
    247
    Messages:
    188
    Likes Received:
    0
    Trophy Points:
    30
isn't that the case with nvidia? (ok, it's 3 months for official drivers, but hey...)

That's why ati users use mobility modder (if they update it to support that card, but I assume there will be modders out there who would support that card in their drivers) and nvidia users go to laptopvideo2go?

Bottom line is both companies have their faults. But Nvidia stepped over the line here. Imagine if the HD4800 mobility series were based on the HD3800 mobility series.
     
  48. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931

yea, there is already talk about something like that...i read it in a few places, but oh well. like you say...they both have their faults nonetheless.

ati users using mobility modder are having a bit of an issue right about now. not all of them mind you, but more than the majority. and there are a few modders for the ati card...heck, if i was running crossfire i would be modding my own as well.
     
  49. NJoy

    NJoy Няшka

    Reputations:
    379
    Messages:
    857
    Likes Received:
    1
    Trophy Points:
    31
Nvidia just disappoints me... buying my new laptop i was expecting way better performance from the 9600M GT, which is apparently just a rebrand of a rebrand etc., not to say it's a heat generator from day one (heck, it manages to go as high as 94C under load while hovering around 58 when idle). I used to be an nVidia fanboy, but that time is long gone...
Hopefully, I'll be able to get an MXM-II 4670 sometime around summer
     
  50. aznofazns

    aznofazns Performance Junkie

    Reputations:
    159
    Messages:
    945
    Likes Received:
    0
    Trophy Points:
    30
    Well, considering how the mobile 4870 is so similar to the desktop variant, wouldn't it be pretty easy to make decent drivers for the mobile one?
     