The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, to ensure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    ASUS G73JH-A1 with ATi Mobility Radeon HD 5870 1GB GDDR5

    Discussion in 'ASUS Gaming Notebook Forum' started by iaTa, Dec 31, 2009.

  1. raclimja

    raclimja Notebook Consultant

    Reputations:
    130
    Messages:
    220
    Likes Received:
    28
    Trophy Points:
    41



    AFAIK no Fermi-based mobile GPU has been announced, and the current 300M is just a rebranded 200M with higher clocks and GDDR5.
    http://www.fudzilla.com/content/view/17241/1/

     
    Last edited by a moderator: Jan 29, 2015
  2. ChinNoobonic

    ChinNoobonic Notebook Evangelist

    Reputations:
    273
    Messages:
    638
    Likes Received:
    1
    Trophy Points:
    30



    We know Nvidia has plans for a GTX 380M, but they also plan to release the GTX 285M this quarter. Releasing that and the Fermi-based mobile cards at the same time would be stupid. Nvidia has no presence in the DX11 desktop market to compete with ATI. It would only make sense to release the desktop versions of Fermi to compete with ATI's desktop cards, and the GTX 285M to compete with the MR 5870 until Fermi mobile parts hit the market.

     
    Last edited by a moderator: Jan 29, 2015
  3. raclimja

    raclimja Notebook Consultant

    Reputations:
    130
    Messages:
    220
    Likes Received:
    28
    Trophy Points:
    41
    the bad news is that by the time Fermi arrives, ATI's next-generation GPU "Hecatoncheires", with a brand new architecture and supposedly on 28nm, will be out
    http://www.fudzilla.com/content/view/15891/34/
    http://www.fudzilla.com/content/view/16299/34/

    edit:
    added more info
    http://www.fudzilla.com/content/view/15918/1/
     
  4. evensen007

    evensen007 Notebook Deity

    Reputations:
    483
    Messages:
    1,324
    Likes Received:
    0
    Trophy Points:
    55
    Too early to tell, Mack. Those numbers with the system stats in the background just seem weird. That is a good number though. It is positioned to kick the 280M in the nuts.
     
  5. mackloon

    mackloon Notebook Consultant

    Reputations:
    126
    Messages:
    222
    Likes Received:
    0
    Trophy Points:
    30

    that is what I like to hear. I do not care about ATI or Nvidia being better... who gives a sheet. I just want to know that the preorder I have on this beech was a good decision. I am not a hardcore gamer, but if I want to play a game over the next 2 years, I want to know that I have something that can play the game at medium settings or better... of course I know some games, 2 years from now, may have to be played on lower settings or not at all...
     
  6. shaun1800

    shaun1800 Notebook Guru

    Reputations:
    0
    Messages:
    50
    Likes Received:
    0
    Trophy Points:
    15
    I think this has been brought up, but we never really came to any conclusion. Because the G73's disc drive is on a unique slant, isn't it going to be a problem replacing it with anything else? I'm looking at ordering it with a standard DVD-RW from Xotic and saving some cash.
     
  7. evensen007

    evensen007 Notebook Deity

    Reputations:
    483
    Messages:
    1,324
    Likes Received:
    0
    Trophy Points:
    55
    Yeah, the G51JH I have has the same weird slant on the DVD-RW. I replaced it with a standard LG hT10n Blu-ray drive and just swapped the face plates. Internally, they are exactly the same.
     
  8. l3g4cy99

    l3g4cy99 Notebook Evangelist

    Reputations:
    78
    Messages:
    581
    Likes Received:
    0
    Trophy Points:
    30
    The drive is a standard slim optical drive that ASUS uses; the door is just slanted to match the 5-degree angle of the chassis. At least this is what has been speculated to this point.
     
  9. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    This isn't where discussion should be heading, but I have to set some things straight, since there is a lot of Nvidia bashing going on (understandable).

    Nvidia and Ati have had a long history of switching places. I know it's easy to get caught up in the moment, and G92 is an especially unnerving subject.

    Originally, G80 was a beast of a core, 90nm and 384-bit, and consequently expensive to produce. Ati was playing the same game, but was late to the table, and the goods simply were not as good (the massive, 512-bit, 65nm R600). Nvidia was on top of their game, and quickly headed into 65nm production as well, with the 256-bit, faster, and optimized G92 core. It was cheaper, and has become much cheaper, and Nvidia continued to optimize; it now stands in its 55nm G92b form (originally released as the 9800 GTX+, appropriately rebranded as the GTS 250).

    GT200 was Nvidia's (repeated) mistake. This time however, things really got out of hand. GT200 took time to be put in 55nm fabrication, originally had no design for entry level and mainstream, and it was out of the question to produce in 40nm. The chip is simply too freaking big, and Nvidia was lacking something...

    Ati had already realized the mistake, invested in 55nm early, and taking RV670 a step further, produced RV770 with GDDR5 memory. Instead of creating a massive and costly 512-bit chip like they had done in the past, and like Nvidia had just put upon themselves, they took a chance pairing GDDR5 with a smaller, faster, and more powerful 55nm 256-bit core. Ati has only been working at the same pace, sampling 40nm with the 4770, and releasing a proper successor with RV870.

    Get the gist here?

    (2006) Ati makes a mistake with R600
    (2007-2009) Ati takes it slow and steady, realizes proper GPU succession. RV670 (256-bit, 55nm, architecture improvements) -> RV770 (GDDR5, doubled shaders, speed) -> RV870 (40nm, new DX11 architecture, doubled shaders, more speed)

    (2006) Nvidia gets lucky, G80 is a monster
    (2007-2009) Nvidia gets a grip on G80 and releases the G92 for everyone. GT200 is unarguably fast, but too large and expensive. No entry level products, but ol' G92 should suffice? Slow to 55nm fab with G92b, then GT200b, behind on 40nm, no GDDR5 support to be found.

    G92 is a fine core, but its heritage is from 2006. It has been perfected, but you can only perfect a certain technology so much before it becomes obsolete. HOWEVER, it has been and still is a perfect fit for laptops. The original 8800M GTX was based on the G92 8800 GT, and was soon followed by the 9800M GTX and Quadro 3700M. By the time the GTX 280M was coming around, Ati was also putting their 4870 in laptops, and this should have pointed towards disaster for Nvidia in the notebook segment, but a GDDR5 Mobility 4870 never surfaced, and it essentially became an underclocked GDDR3 4850. The 4850 and G92b have always been at the same level on the desktop front, so Nvidia stayed OK.

    What I've been saying all along is that we need a 256-bit GPU with GDDR5 memory for notebooks, at least in order to take a step forward with memory bandwidth. The RV870 is too much for a laptop chassis at this point; a rehashed Mobility 4890 with GDDR5 would have done the trick, though. But instead, Ati took the sensible route, realized it's too early for that bandwidth doubling on the notebook front, and created the RV850-based Mobility 5870. Which is just fine and dandy really; from both an engineer's and a marketer's perspective it absolutely makes sense. It demands less power, creates less noise and heat, costs less, and still offers a speed increase and next-gen platform support.

    Nvidia has been slow to the game, but is making advances. The 40nm, GT215-based GTS 360M (aka GT240) is a proper 40nm GDDR5 GPU, a first for Nvidia. The crying shame, really, is that Fermi is still being kept under wraps. I say continue to let Ati reap the benefits, and give Nvidia the time to make a proper succession. It's about time!

    What I really want, is a G92c... the same old G92 core in 40nm fab with DX10.1 support and GDDR5. That would leave all mobile gamers with nothing but a big, fat, grin. Really.
     
  10. raclimja

    raclimja Notebook Consultant

    Reputations:
    130
    Messages:
    220
    Likes Received:
    28
    Trophy Points:
    41
    yes, ATI or Nvidia being better DOES MATTER. Competition brings prices down and performance up, therefore consumer = win

    from a price/performance and value perspective, yes

    maybe you can get a future proof system
    Core I32 128 cores 512 threads 200Ghz
    GTX 680M GDDR9 12GIGS QUAD SLI
    5Terabyte DDR9
    :p
     
  11. Lanaya

    Lanaya Templar Assassin

    Reputations:
    656
    Messages:
    2,577
    Likes Received:
    4
    Trophy Points:
    56
    after a lot of looking, the GTX 280M in the Clevo W860CU gets a 5244 GPU score. I made sure I saw at least 3 other comparisons:

    http://forum.notebookreview.com/showpost.php?p=5403625&postcount=100

    http://www.xoticpc.com/reviews/8690/3dmarkvantage-standard settings.jpg

    http://www.sunny16.org/images/W860CU/vantage_results.jpg [ from: http://forum.notebookreview.com/showthread.php?t=442139 ]


    so even if you knock 10% off the 5870's GPU score for running at that res, you still get something that puts a lot of distance between it and the 280M.
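    To spell out the adjustment being described here, a minimal sketch in Python (the 5870 figure below is a placeholder, not a real result from this thread):

[CODE]
# Sketch of the "knock 10% off" comparison above. Assumptions:
#  - 5244 is the Vantage GPU score quoted for the GTX 280M in the Clevo W860CU.
#  - reported_5870_gpu is a hypothetical placeholder, NOT a real benchmark result.
gtx_280m_gpu = 5244
reported_5870_gpu = 7000      # placeholder value
res_penalty = 0.10            # discount for the higher resolution of the 5870 run

adjusted_5870 = reported_5870_gpu * (1 - res_penalty)
lead = (adjusted_5870 / gtx_280m_gpu - 1) * 100
print(f"adjusted 5870 GPU score: {adjusted_5870:.0f}")
print(f"lead over the GTX 280M:  {lead:.1f}%")
[/CODE]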
     
  12. evensen007

    evensen007 Notebook Deity

    Reputations:
    483
    Messages:
    1,324
    Likes Received:
    0
    Trophy Points:
    55
    Good overview. As an Nvidia and Ati owner you have it just about right.

    The important part I think you left out is the mass recall of Nvidia's mobile chips 2 years ago, where they denied anything was wrong. I work for a company that had over 30 laptops with Nvidia GO7000-series chips that burned to the ground. There was a class action lawsuit, and Nvidia lost a lot of juice with me and with other people in the I.T. industry. I will never forgive them for denying the problem to the very end, but I am willing to forget. They did get burned for a HUGE amount of money in the class action, so that just goes to show you that karma is a BIATCH!

    http://www.geek.com/articles/news/n...aptops-clas-action-status-requested-20090513/

    Nvidia's cure for the OEMs was to issue a new BIOS that pegged the internal fan at 100% ALL THE TIME so that their shoddy chips would last just a little bit longer. Maybe just past the warranty period. Not how a good company does business, in my book.
     
  13. evensen007

    evensen007 Notebook Deity

    Reputations:
    483
    Messages:
    1,324
    Likes Received:
    0
    Trophy Points:
    55
    I'm with you, dude, but Joker insists that he could match that number with an overclocked 260M. I'm calling shens, but I clearly found last year's Alienware with a highly overclocked 280M on this forum that got trounced by this 5870. I understand their argument that it was not MXM 3.0b etc. etc., but it was HEAVILY overclocked, to the point of being borderline unsafe, so I would think that makes up for the difference of it not being MXM 3.0b.
     
  14. Lanaya

    Lanaya Templar Assassin

    Reputations:
    656
    Messages:
    2,577
    Likes Received:
    4
    Trophy Points:
    56
    that same 280M got 12,500 in 3DMark06, but had a CPU score about 900 higher than a 720m's (they used a stock 920m), so I'd imagine with a 720m it would have gotten 11,500 or so stock!
     
  15. raclimja

    raclimja Notebook Consultant

    Reputations:
    130
    Messages:
    220
    Likes Received:
    28
    Trophy Points:
    41
    I think you misinterpreted what I am trying to say.
    My point is: why wait for Fermi when the 5870 is here?

    Also, when Fermi becomes available, ATI's next-generation card will be out, so it will be a never-ending wait.

    Also, we can't deny the fact that NVIDIA IS LAGGING BEHIND ATI.
    What really matters is NOW, not the past.
     
  16. Soviet Sunrise

    Soviet Sunrise Notebook Prophet

    Reputations:
    2,140
    Messages:
    6,547
    Likes Received:
    0
    Trophy Points:
    205
    Because I die hard for the green team.
     
  17. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    Nvidia really isn't lagging behind in notebook performance. And they have proven to me they are aware of their mistakes, providing 40nm mainstream chips with GDDR5 memory and EOL'ing GT200. They are completely out of the desktop front, besides the GT240 sampling of what's to come (it's a $99 card). I'm giving them time, because that's all they need right now. I would suspect the GTX 380M is still under wraps because it is, in fact, a true GF100 DX11 core, and will more than likely be a mainstream-derived GPU. Looking at the GT240 as an entry-level GPU for Nvidia, I would suspect it's possible for Nvidia to have a subsequent 256-bit, 384-bit, perhaps even 448-bit GF100 lineup. This means there's still a possibility of the GTX 380M being 256-bit.

    The moment you get a grip on logic, hold tight.
     
  18. sl1982

    sl1982 Notebook Guru

    Reputations:
    20
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    I must admit I have no preference either way. I have owned both in desktops and ATI in a notebook. The way I look at it, I am going to purchase whatever GPU is the fastest at the time I am ready to buy a laptop. If there is anything I have learned, it's that if you wait around for the next technology to come out, you never end up buying anything.
     
  19. raclimja

    raclimja Notebook Consultant

    Reputations:
    130
    Messages:
    220
    Likes Received:
    28
    Trophy Points:
    41
    the problem with Fermi is that it's big (expensive to make) and hot, and to make matters worse, they are in the same boat (TSMC)

    from a price/performance perspective, I am confident that ATI has the bang for the buck (at least this time)

    there are reports that Fermi's stream processors have been cut to 448 (originally 512) http://www.semiaccurate.com/2009/12/21/nvidia-castrates-fermi-448sps/
     
  20. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    My last notebook rig was a GTX 280M-equipped NP5797. Currently, I'm looking at building a MicroATX P55, i5 750, CF 5750 mini rig, but not until March/April. I've been in the GPU mix for 8 years or so. Traditionally, you always want to take a look at what both Ati and Nvidia have to offer. For example, you might be buying this year's 9800 XT. Then Nvidia drops this year's 6800 Ultra. You never know!
     
  21. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

    Shens? I posted a picture. Are you claiming I faked it? And it's not such a heavy OC that it's unstable, lol.
     
  22. evensen007

    evensen007 Notebook Deity

    Reputations:
    483
    Messages:
    1,324
    Likes Received:
    0
    Trophy Points:
    55
    Let's just hope they don't drop another FX5000 series! haha
     
  23. evensen007

    evensen007 Notebook Deity

    Reputations:
    483
    Messages:
    1,324
    Likes Received:
    0
    Trophy Points:
    55
    The 280 that I linked was border-line unsafe, not your 260.

    My G51JH with an i7 and 260m doesn't come CLOSE to that number you got. I'm calling shens because you aren't comparing apples to apples as I've said before.

    We can fine tune and optimize for 3dmark06 until the cows come home, but that doesn't give everyone a fair assessment of actual performance. Give EIment 3 months to mod the G73 and tweak the 5870, and I'm sure he could make a ridiculous 3dmark06 number appear like magic too.
     
  24. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    I actually had to correct myself, X800 came after 6800 :p

    9800 was a fine card, but Nvidia hit the mark with 6800.

    Staying on topic... the G73jh is the best gaming notebook deal on the market right now! $1600 for what you get is a steal.
     
  25. Soviet Sunrise

    Soviet Sunrise Notebook Prophet

    Reputations:
    2,140
    Messages:
    6,547
    Likes Received:
    0
    Trophy Points:
    205
    It was unstable mainly because your clocks were radically unbalanced. http://forum.notebookreview.com/showpost.php?p=5228316&postcount=12

    I remember those days, haha. It was fireworks for both Nvidia and ATi back then.
     
  26. GenTechPC

    GenTechPC Company Representative

    Reputations:
    7,361
    Messages:
    4,586
    Likes Received:
    839
    Trophy Points:
    181
    Yes when the shipment arrives. :)
     
  27. evensen007

    evensen007 Notebook Deity

    Reputations:
    483
    Messages:
    1,324
    Likes Received:
    0
    Trophy Points:
    55
    I'm suddenly feeling silly for arguing about something so asinine as a friggin' laptop computer. I am fondly remembering my Riva Tnt2 desktop days when I would actually be playing games, rather than waxing philosophical about a piece of computer equipment...
     
  28. Hydeo

    Hydeo Notebook Evangelist

    Reputations:
    72
    Messages:
    375
    Likes Received:
    0
    Trophy Points:
    30
    Probably because we were all younger then and didn't care about the equipment too much :)
     
  29. anothergeek

    anothergeek Equivocally Nerdy

    Reputations:
    668
    Messages:
    1,874
    Likes Received:
    0
    Trophy Points:
    55
    Today's desktop equipment is ridiculously powerful. I honestly do not see the need to spend more than $300 on a 5850 or CF 5750/70, $200 on an i5 750, etc. Notebooks are always behind the curve; you're always wanting more :D

    I had to sell my Sager. It was at a loss, but I was wanting... more. I'm glad to see Asus offering such an awesome config at an unheard-of price. 1080p 17" screen, Blu-ray, 1TB storage, there's more than just gaming equipment! But yeah... a 5750 alone just isn't enough for me :p If the M17x had received 5870s today... shock/awe
     
  30. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
    Who says I fine tuned anything? That was a stock run with just a mild overclock! And the 260M in your G51 is crippled and underclocked.
     
  31. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231
  32. evensen007

    evensen007 Notebook Deity

    Reputations:
    483
    Messages:
    1,324
    Likes Received:
    0
    Trophy Points:
    55
    Other factors are involved. You have an 820QM, 1333 RAM, and a RAID setup. These things factor into the total score, correct? That, combined with the fact that you overclocked everything, makes your comparison less than ideal for those of us who are actually buying this laptop.

    It gives a false representation if it's not an exact setup comparison. We still don't even know for sure what our 3DMark score actually IS on the 5870, given the screenshot released to us.

    According to the screengrab, it was run at a higher res with a lower-clocked CPU.

    Most importantly, the 3DMark06 score will not even show us what the card is capable of in actual games, since the new architecture of the chip should also make it more formidable than the previous GTX mobile parts.
     
  33. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

    The run was done on my M15x, though I fail to see what difference a hard drive would make in a 3DMark06 run. The i820 plays a role in the CPU score, so you can add about another 400-500 points to the 5870 score to equate the two. And the only thing that was overclocked was the card; nothing else in the M15x can be overclocked. You missed the point of my post, which was to show that a 260M GTX with what I consider a mild overclock can nearly match that score. The 5870 is clearly the better card and it should be compared to the crusty G92.
     
  34. evensen007

    evensen007 Notebook Deity

    Reputations:
    483
    Messages:
    1,324
    Likes Received:
    0
    Trophy Points:
    55
    I suddenly feel like playing an actual game, but it's already midnight and I have to be up for work in 5 hours.
     
  35. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

    Get frontlines fuel of war. A lot of people hate on it but I find it extremely fun ;) BTW I have a G73 on pre-order for my girlfriend so obviously I think it's a solid system.
     
  36. sl1982

    sl1982 Notebook Guru

    Reputations:
    20
    Messages:
    63
    Likes Received:
    0
    Trophy Points:
    15
    What a lucky girl. I am trying to convince my wife to let me order one. I think I might just accidentally drop my current laptop on the floor.
     
  37. raclimja

    raclimja Notebook Consultant

    Reputations:
    130
    Messages:
    220
    Likes Received:
    28
    Trophy Points:
    41
    I am wondering why so many people consider the 3DMark score a reliable benchmark,

    http://en.expreview.com/2008/06/24/is-nvidia-cheats-on-3dmark-and-ut3/478.html

    Intel's GMA even manages to get the same score as ATI :eek:
    http://en.expreview.com/2009/10/15/intel-graphics-drivers-employ-optimizations-for-3dmark-vantage/5501.html

    http://www.theinquirer.net/inquirer/news/1048824/nvidia-cheats-3dmark-177

     
  38. evensen007

    evensen007 Notebook Deity

    Reputations:
    483
    Messages:
    1,324
    Likes Received:
    0
    Trophy Points:
    55
    I don't know. I've been stuck in a rut where I only play Counter Strike Office map a few times a week, and of course I beat Dragon Age Origins to death. I've been thinking about seeing what all the hype is about COD:MW2, Left4Dead, Day of Defeat etc., since I've never ventured beyond Counter Strike...
     
  39. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

    Definitely give all those games a try, they're fun. I've been in the same rut actually. I took about 2 years off from PC gaming altogether and recently came back.
     
  40. wontonjon

    wontonjon Notebook Guru

    Reputations:
    0
    Messages:
    52
    Likes Received:
    0
    Trophy Points:
    15
    @evensen007,

    In your sig, it denotes the 80gb SSD but w/o a 2nd drive - are you ordering as such w/o a 2nd drive? (as i didn't know that was an option)...
     
  41. evensen007

    evensen007 Notebook Deity

    Reputations:
    483
    Messages:
    1,324
    Likes Received:
    0
    Trophy Points:
    55
    Which one first? I also just came back to gaming. I trained all last year for the Florida Ironman, and pretty much just worked, trained, slept, repeat.

    I used to play Battlefield2 way back in the day and liked that. I don't know. It just seems like Counter Strike is easy to hop into, but there is no sense of community, continuity, or progress...
     
  42. evensen007

    evensen007 Notebook Deity

    Reputations:
    483
    Messages:
    1,324
    Likes Received:
    0
    Trophy Points:
    55
    Jon,

    No, I just didn't bother listing it. Honestly, it is going to go on ebay because from now on I will use SSD for O/S and external storage for... well, for storage!
     
  43. Soviet Sunrise

    Soviet Sunrise Notebook Prophet

    Reputations:
    2,140
    Messages:
    6,547
    Likes Received:
    0
    Trophy Points:
    205
    That's why people turn off PhysX before they run the CPU benchmark in 3DMark06. They're not cheating if the user has complete driver control over the PhysX switch; those who run the benchmark with it on are the ones cheating. Secondly, why are you referencing articles published in mid-2008? 177.39 is ancient, and the articles are, again, irrelevant to the benchmark scores that we are discussing. And this is not the first time you have posted random info. Recall a few pages back.
     
  44. raclimja

    raclimja Notebook Consultant

    Reputations:
    130
    Messages:
    220
    Likes Received:
    28
    Trophy Points:
    41
    then how can you explain this? Intel has no PhysX
    http://en.expreview.com/2009/10/15/intel-graphics-drivers-employ-optimizations-for-3dmark-vantage/5501.html

    Intel, ATI and Nvidia all use DRIVER OPTIMIZATIONS in 3DMark, so it is NOT A RELIABLE benchmark

    "According to Techreport, the system's overall score climbs by 37% when the graphics driver knows it's running Vantage, and the GPU score jumps by 46%, while the CPU score falls by nearly 10%."

    also, cheating in 3DMark is nothing new; reports have been made since 2003
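    For what it's worth, here is a tiny sketch that just applies those quoted Techreport percentages to a hypothetical baseline run (placeholder numbers, not scores from this thread):

[CODE]
# Apply the percentage changes quoted from Techreport to a made-up baseline run.
baseline = {"overall": 10000, "gpu": 10000, "cpu": 10000}   # placeholder scores
change = {"overall": 0.37, "gpu": 0.46, "cpu": -0.10}       # figures from the quote

for part, score in baseline.items():
    detected = score * (1 + change[part])   # score once the driver detects Vantage
    print(f"{part:>7}: {score} -> {detected:.0f} ({change[part]:+.0%})")
[/CODE]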
     
  45. Lanaya

    Lanaya Templar Assassin

    Reputations:
    656
    Messages:
    2,577
    Likes Received:
    4
    Trophy Points:
    56
    will someone please shut this guy up? he's really beginning to sound like a HORRIBLE broken record.
     
  46. Soviet Sunrise

    Soviet Sunrise Notebook Prophet

    Reputations:
    2,140
    Messages:
    6,547
    Likes Received:
    0
    Trophy Points:
    205
    I'm addressing the first link that you posted. Obviously Intel doesn't have PhysX so why would I be talking about the ability to toggle the PhysX switch in the driver? I thought that you would have seen my implication that I was referring to Nvidia drivers.
     
  47. 5150Joker

    5150Joker Tech|Inferno

    Reputations:
    4,974
    Messages:
    7,036
    Likes Received:
    113
    Trophy Points:
    231

    Hmm I'd probably go with MW2 first if you like shooters. After that L4D2 would be a good choice. If you liked Battlefield 2, then Frontlines Fuel of War on the 21st S-S server is tons of fun. Snipers are one shot killers :)
     
  48. raclimja

    raclimja Notebook Consultant

    Reputations:
    130
    Messages:
    220
    Likes Received:
    28
    Trophy Points:
    41
    COMMON SENSE: Intel cheated, ATI cheated, and so did Nvidia
     
  49. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466
    The next person who tries to use 3DMark06 as a reference to measure the capabilities of a modern gaming laptop's performance should be shot.

    3DMark06 is OLD AND UNRELIABLE

    You can easily have a laptop with a slower, lesser GPU win a 3DMark06 bench just because it has a faster CPU, because the bench is so CPU-bound. Use the GPU scores of Vantage to do your comparisons or you're just wasting your time (see the sketch below).

    I know at least one person here has seen me state it before and even post proof of it, but still nobody speaks up, and all this 3DMark06 stuff keeps mucking up the thread.

    May as well start benching Aquamark3 again while we're at it.
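    A toy sketch of that point, with made-up weights and sub-scores; this is NOT FutureMark's actual 3DMark06 formula, just an illustration of how a CPU-heavy composite score can flip the ranking:

[CODE]
# Toy model: a composite score that weights the CPU heavily can rank the laptop
# with the slower GPU ahead of the one with the faster GPU. Weights and sub-scores
# are invented for illustration; this is not the real 3DMark06 formula.

def composite(gpu_score, cpu_score, cpu_weight=0.4):
    """Blend GPU and CPU sub-scores with an illustrative fixed CPU weight."""
    return (1 - cpu_weight) * gpu_score + cpu_weight * cpu_score

laptop_a = {"gpu": 9000, "cpu": 3000}   # faster GPU, slower CPU (hypothetical)
laptop_b = {"gpu": 8000, "cpu": 5500}   # slower GPU, faster CPU (hypothetical)

for name, s in (("A", laptop_a), ("B", laptop_b)):
    print(f"Laptop {name}: GPU sub-score={s['gpu']}, "
          f"composite={composite(s['gpu'], s['cpu']):.0f}")

# Laptop A wins on the GPU sub-score (9000 vs 8000), but B wins the composite
# (7000 vs 6600) -- which is why the advice above is to compare Vantage GPU
# sub-scores rather than total scores.
[/CODE]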
     
  50. Soviet Sunrise

    Soviet Sunrise Notebook Prophet

    Reputations:
    2,140
    Messages:
    6,547
    Likes Received:
    0
    Trophy Points:
    205
    So what? 10char
     