The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Anyone run age of conan and warhammer online on a 9262 with 9800 gtx in sli?

    Discussion in 'Sager and Clevo' started by firstn20, Feb 2, 2009.

  1. firstn20

    firstn20 Notebook Evangelist

    Reputations:
    26
    Messages:
    520
    Likes Received:
    0
    Trophy Points:
    30
    hey guys

    Has anyone run Age of Conan and Warhammer Online on a 9262 with 9800M GTXs in SLI? I think I've finally decided to purchase a 9262 with 9800M GTXs in SLI. Back in July I had a 1730 with the X9000 (not overclocked) running 8800M GTXs in SLI (1024 MB of video memory), which I returned. Age of Conan ran fine, but I had to tinker with the drivers since AoC was very picky about which driver to use, so it was a trial-and-error thing. But when I finally got it to work (the lighting, grass, AA, etc.), it looked good, nothing TOO impressive but better than my Dell 1530. I guess I expected it to be a lot smoother and to run faster at higher settings. But I didn't really take the time to fully customize the game for the 1730 because I was working a lot and returned the system within two weeks.

    Now with the 9262 with 9800M GTXs in SLI (2 GB of video memory) and the now-faster quad-core chips, has anyone run Age of Conan or Warhammer Online? I'm curious to know how the system handles Age of Conan at MAX settings and the highest resolution. I also played Warhammer Online on my 1530 (8600 GT) and it looks alright, the same class of graphics as WoW. But my friend ran it on his desktop with his 9800 GX2 (single) and it looked awesome. I never knew that game could look so good.

    But anyway, any feedback would be very appreciated. And one last question: does the new mobile extreme quad-core QX9300 CPU, found in systems such as the 5797 and the Alienwares, outperform the desktop quad-core Q9550/Q9650 CPUs found in the 9262?
     
  2. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    I would imagine the 9262 with two 9800M GTXs in SLI would run the game effortlessly with the correct drivers, and most likely even with stock drivers. Also, two cards in SLI mirror the same memory, so two 8800M GTXs give 512 MB and two 9800M GTXs are still only 1 GB of memory.
    As for the new mobile quads, they do not outperform the Q9550 or Q9650. The QX9300 runs at 2.53 GHz, while the Q9550 and Q9650 run at 2.83 GHz and 3.0 GHz respectively.

    The QX9300, Q9550, and Q9650 all share a 12 MB L2 cache. The catch is that you can't really overclock the Q9650 or Q9550 in the 9262, while you most likely can overclock the QX9300 in the 5797 or the Alienware (although I'm not sure about the Alienware). Even IF you can overclock the QX9300, you would at best get around par performance with the Q9550, and on top of that there is a tremendous price difference between these desktop processors and the mobile processors. You will pay a huge premium for the QX9300; not worth it IMO, plus you don't get SLI in the 5797.
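    The two comparisons above (mirrored SLI memory and the stock-clock gap) can be sketched as plain arithmetic. This is just an illustration of the numbers quoted in the post, not a benchmark:

    ```python
    # SLI mirrors the same frame data onto both cards, so usable VRAM
    # equals one card's memory rather than the sum of both.
    def usable_sli_vram_mb(per_card_mb: int) -> int:
        return per_card_mb  # mirrored, not pooled

    assert usable_sli_vram_mb(512) == 512      # dual 8800M GTX
    assert usable_sli_vram_mb(1024) == 1024    # dual 9800M GTX

    # Stock clocks: the mobile QX9300 trails both desktop quads.
    clocks_ghz = {"QX9300": 2.53, "Q9550": 2.83, "Q9650": 3.00}
    assert clocks_ghz["QX9300"] < clocks_ghz["Q9550"] < clocks_ghz["Q9650"]
    ```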
     
  3. Shyster1

    Shyster1 Notebook Nobel Laureate

    Reputations:
    6,926
    Messages:
    8,178
    Likes Received:
    0
    Trophy Points:
    205
    Very nice analysis, Excal27. :cool:
     
  4. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    Thank you very much Shyster.
     
  5. naticus

    naticus Notebook Deity

    Reputations:
    630
    Messages:
    1,767
    Likes Received:
    0
    Trophy Points:
    55
    You must have had something seriously wrong with the setup or the drivers/game; 8800M GTXs in SLI should play AoC without a problem and give you a constant 100+ fps at max settings and high resolution.
     
  6. kzap

    kzap Notebook Enthusiast

    Reputations:
    1
    Messages:
    42
    Likes Received:
    0
    Trophy Points:
    15
    I've run Warhammer Online and it plays perfectly with the latest NVIDIA drivers. It's not exactly the kind of game to slow down, nor to impress with graphics.
     
  7. plasma.

    plasma. herpyderpy

    Reputations:
    1,279
    Messages:
    2,870
    Likes Received:
    0
    Trophy Points:
    55
    Anyone getting a 9800M GTX SLI / Quad Core rig to play Warhammer Online and AoC is wasting their money.
     
  8. IrishMettle

    IrishMettle Notebook Enthusiast

    Reputations:
    0
    Messages:
    48
    Likes Received:
    0
    Trophy Points:
    15
    For Warhammer I would agree, but for AoC I would disagree. AoC's graphics are very demanding at times. If you want to run it maxed, with all shadows, all particles, and all graphics sliders maxed out, you need the best you can buy.

    I run it on my 5793 with the 8800M GTX on high, with shadows set to characters only, and I typically average 30-50 fps outside depending on the scene.

    During sieges you have to turn shadows off and set particles to yourself only, or it is not playable. That is really just a performance issue on Funcom's end that they need to work on, though...
     
  9. firstn20

    firstn20 Notebook Evangelist

    Reputations:
    26
    Messages:
    520
    Likes Received:
    0
    Trophy Points:
    30
    OK, a question on the CPUs in the 9262 vs. the mobile QX9300. People were saying systems such as the OCZ Whitebook and Alienware M17 were getting HIGH benchmarks because of their QX9300 processors, and that that was the reason the scores were skyrocketing. But as stated in this thread, the Q9550/Q9650 CPUs supposedly outperform the QX9300, which contradicts the first claim. In my other post asking about M17 benchmarks with 3870s in CrossFire vs. the 9262's 9800M GTXs in SLI, people were saying the only reason the M17 was getting 15k-16k benchmarks is the CPU. But if the 9262's CPU is far more superior, why is the 9262 only hitting 15k-ish+?

    Thanks for your insights so far. Anyone else who has run AoC, please let me know how it runs.
     
  10. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    lol You need to take 3DMark scores with a grain of salt; IMO 3DMark is a gimmicky joke. I would guarantee you the 9262 with a Q9650 and dual 9800M GTXs will outperform the M17 with a QX9300 and dual 3870s in real-world performance in pretty much any game or app that supports SLI and CrossFire, and in games that don't.

    As for why an M17 can score higher than a 9262, it could be a number of reasons. Perhaps the architecture of the 3870s "looks better" to 3DMark, or they were specifically designed to look better in 3DMark for sales purposes. Or perhaps it gives higher scores to the processor because some of these guys managed to overclock their QX9300s to 4.0 GHz temporarily, just for a quick 3DMark score that would make them feel better about their purchase. Who knows; we could speculate on this for a long time, but I can tell you I would take the 9262 over an M17 any day, without hesitation.
     
  11. firstn20

    firstn20 Notebook Evangelist

    Reputations:
    26
    Messages:
    520
    Likes Received:
    0
    Trophy Points:
    30
    Yeah, I would never get an Alienware anyway. I was just curious how the M17, with inferior tech, outscores an uber 9262.
     
  12. ichime

    ichime Notebook Elder

    Reputations:
    2,420
    Messages:
    2,676
    Likes Received:
    3
    Trophy Points:
    56
    The myth that desktop CPUs are so much more powerful/faster than their mobile counterparts at the same clock speed has been debunked. A stock-clocked QX9300 will be less powerful than a Q9550 or Q9650. A QX9300 with an 11.5x multiplier (3.06 GHz) will be faster than both.

    That said, you won't see a huge difference, or any difference, in performance between a stock Q9650 and an OC'd QX9300 @ 3 GHz with the same GPU (comparing, for example, a D901C with a single 9800M GTX + Q9650 and an M570ETU with a 9800M GTX + QX9300, though the multiplier maxes out at 11 on the ETU). So it comes down to the GPU, and in this case, 9800M GTXs in SLI will perform better than two 3870s in most cases.
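    For reference, the clock figures in this post follow from the Core 2 formula core clock = FSB base clock x multiplier. The 266 MHz (1066 MT/s) bus is the standard QX9300 spec, assumed here rather than stated in the thread:

    ```python
    # Core 2 Extreme QX9300: core clock = FSB base clock x multiplier.
    FSB_MHZ = 266  # 1066 MT/s quad-pumped bus -> 266 MHz base clock

    def core_clock_ghz(multiplier: float) -> float:
        return round(FSB_MHZ * multiplier / 1000, 2)

    print(core_clock_ghz(9.5))   # stock multiplier  -> 2.53 GHz
    print(core_clock_ghz(11.5))  # unlocked to 11.5x -> 3.06 GHz
    ```

    This is why a small multiplier bump closes the gap: 11.5 x 266 MHz already lands at roughly the Q9650's 3.0 GHz stock clock.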
     
  13. Megacharge

    Megacharge Custom User Title

    Reputations:
    2,230
    Messages:
    2,418
    Likes Received:
    14
    Trophy Points:
    56
    No one here is saying desktop processors are so much more powerful than mobile processors. If a QX9300 can be safely overclocked to 3.0 GHz, then I would expect it to perform the same as a stock Q9650. If it can be overclocked to 3.6 safely, I don't see how the QX9300 alone at 3.6, as opposed to the Q9650 at 3.0, would give a thousand-or-more-point difference in 3DMark unless there is another underlying factor at work; maybe the video cards?

    That being said, even if it's all about the processor and the QX9300 can be safely overclocked to 3.6, the performance difference between it and a stock Q9650 is not worth the price-to-performance ratio unless you have money to burn.
     
  14. XanKage

    XanKage Notebook Geek

    Reputations:
    1
    Messages:
    80
    Likes Received:
    0
    Trophy Points:
    15
    I'd strongly recommend getting SLI cards; I got two 9800M GTXs and they're amazing! However, you should be aware that some games (such as World of Warcraft) do NOT support SLI and might cause trouble. For WoW, I solved it by making some changes with the nHancer software (but I'm still getting lower fps in WoW than in Fallout 3 :D). Just so you know.

    Good luck with your purchase!
     
  15. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,464
    Likes Received:
    12,852
    Trophy Points:
    931
    The QX9300 is a great-performing chip, no questions asked, and it will score slightly higher than a QX9650/Q9650, but for one reason only: CPU overclocking. If a 9262 were allowed to overclock, I'm pretty sure it would run 18.5k easily. It would not make much difference in certain games, but it would make a substantial difference in encoding, transcoding, and compressing.

    And one only needs to benchmark one program: 3DMark Vantage. I ran it with only one core and it scored exactly the same as with 4 cores. So that fake myth about synthetic benchmarks not meaning anything is getting old.
    Every last notebook title thread asks for benchmarks, from Apple to Zepto, from gaming to any version of Futuremark software.


    http://en.wikipedia.org/wiki/Benchmark_(computing)
    In computing, a benchmark is the act of running a computer program, a set of programs, or other operations, in order to assess the relative performance of an object, normally by running a number of standard tests and trials against it. The term 'benchmark' is also mostly utilized for the purposes of elaborately-designed benchmarking programs themselves. Benchmarking is usually associated with assessing performance characteristics of computer hardware, for example, the floating point operation performance of a CPU, but there are circumstances when the technique is also applicable to software. Software benchmarks are, for example, run against compilers or database management systems. Another type of test program, namely test suites or validation suites, are intended to assess the correctness of software.

    Benchmarks provide a method of comparing the performance of various subsystems across different chip/system architectures.

    That was the short version. :)