The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    14" Widescreen and WoW

    Discussion in 'Gaming (Software and Graphics Cards)' started by battlecat, Apr 14, 2006.

  1. battlecat

    battlecat Notebook Consultant

    Reputations:
    0
    Messages:
    265
    Likes Received:
    0
    Trophy Points:
    30
    Hey, I'm just about to get a 14" widescreen ASUS Z62j with a GeForce Go 7300, and I was wondering if the widescreen will stretch and deform World of Warcraft at all? (Since it's a new game, I'm guessing no.)

    Also, right now I'm playing WoW on my desktop with a GeForce FX 5200 and was wondering how it will compare with the Go 7300? :confused:

    Desktop specs:
    AMD 64-bit 3000+
    512MB RAM
    FX 5200 128MB

    Laptop:
    1.66GHz Core Duo
    1GB RAM
    Go 7300 128MB

    Thanks in advance!

    PS: How do I check what RPM my current hard drive is? (Will RPM affect playing games?)
     
  2. sionyboy

    sionyboy Notebook Evangelist

    Reputations:
    100
    Messages:
    535
    Likes Received:
    0
    Trophy Points:
    30
    Since it's a new game, I'd imagine it has an option in the graphics menu for a widescreen resolution.

    The 7300 will be a lot better than the 5200.
     
  3. pbcustom98

    pbcustom98 Goldmember

    Reputations:
    405
    Messages:
    1,654
    Likes Received:
    0
    Trophy Points:
    55
    WoW has most resolutions available... I have played it at 1680x1050, 1440x900, and 1280x1024.
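
    If you'd rather set it outside the game, vanilla-era clients keep their display settings in WTF\Config.wtf; a minimal sketch in Python (the install path, the gxResolution variable, and the 1280x800 figure for a typical 14" widescreen panel are assumptions here, so adjust for your setup) could look like this:

    Code:
    from pathlib import Path

    # Assumed default install path and vanilla-era config variable; adjust for your setup.
    config = Path(r"C:\Program Files\World of Warcraft\WTF\Config.wtf")
    target = 'SET gxResolution "1280x800"\n'  # assumed native resolution of a 14" widescreen panel

    lines = config.read_text().splitlines(keepends=True)
    if any(line.startswith("SET gxResolution") for line in lines):
        # Overwrite the existing resolution entry.
        lines = [target if line.startswith("SET gxResolution") else line for line in lines]
    else:
        # No entry yet; append one.
        lines.append(target)
    config.write_text("".join(lines))
    print('gxResolution set to 1280x800')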

    I think your notebook card will be better, since the "FX" series of desktop cards was pure garbage... (or so I've heard).

    RPM will only affect load times... higher RPM, less loading.
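
    On the RPM question from the first post: Windows of that era doesn't report spindle speed directly, but you can pull the drive's model number and look up its spec sheet. A rough Python sketch using the stock wmic tool (Windows-only, and the plain-text output parsing is an assumption about wmic's format):

    Code:
    import subprocess

    # Ask the stock Windows wmic tool for the model number of each physical drive.
    output = subprocess.run(
        ["wmic", "diskdrive", "get", "model"],
        capture_output=True, text=True, check=True,
    ).stdout

    # Skip the "Model" header row and blank lines, then print each model.
    models = [line.strip() for line in output.splitlines()
              if line.strip() and line.strip().lower() != "model"]
    for model in models:
        print(f"Installed drive: {model} -> look up this model number for its RPM rating")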

    pb,out.
     
  4. sionyboy

    sionyboy Notebook Evangelist

    Reputations:
    100
    Messages:
    535
    Likes Received:
    0
    Trophy Points:
    30
    It's not that they were pure garbage; as a budget DX8.1 card it's alright. But the FX series was marketed as DX9 hardware, and its DX9 performance was lousy, to the point of non-existent. As a budget card it's alright, better than the GF4 MXs it replaced, but it's starting to show its age now. Even a GeForce 4 Ti would still be a better option than it, though that card was a legend.
     
  5. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,091
    Trophy Points:
    931
    The FX series was good for DirectX 8.1, as that is what they really supported (even though they were supposed to be DX9). They're definitely better than the GeForce MX, I agree - you can't play a lot of newer games with the MX because it has no pixel shader support (it's DirectX 7 hardware).

    To answer the original poster's question, your resolution will be supported, and as posted above, you'll see a nice boost going from the GeForce FX to the Go 7300.

    Chaz
     
  6. battlecat

    battlecat Notebook Consultant

    Reputations:
    0
    Messages:
    265
    Likes Received:
    0
    Trophy Points:
    30
    Cool thanks guys, thats great news!

    Really?? Wow, I have a GeForce 4 Ti 4200 in my other computer! I thought the FX would be better. :confused: :eek: :eek:
     
  7. sionyboy

    sionyboy Notebook Evangelist

    Reputations:
    100
    Messages:
    535
    Likes Received:
    0
    Trophy Points:
    30
  8. battlecat

    battlecat Notebook Consultant

    Reputations:
    0
    Messages:
    265
    Likes Received:
    0
    Trophy Points:
    30
    WOW!

    The GF Ti 4200 kills the FX5200!! And all this time I've had both....... **** :eek:
     
  9. sionyboy

    sionyboy Notebook Evangelist

    Reputations:
    100
    Messages:
    535
    Likes Received:
    0
    Trophy Points:
    30
    Yep. Another mistake Nvidia made with those cards was the naming: people who had a Ti 4200 assumed the FX 5200 was its replacement, and therefore a better card, which unfortunately was not the case.

    Get that Ti 4200 back in your system posthaste!
     
  10. iTwins

    iTwins Notebook Consultant

    Reputations:
    126
    Messages:
    135
    Likes Received:
    0
    Trophy Points:
    30
    Who said it's a mistake?! It's a business strategy... :(
     
  11. sionyboy

    sionyboy Notebook Evangelist

    Reputations:
    100
    Messages:
    535
    Likes Received:
    0
    Trophy Points:
    30
    Unfortunately it was a strategy that alienated many consumers; there are still people who won't buy Nvidia again after the FX debacle.

    A bit silly really - when the market consists of only two companies and you refuse to buy from one of them, it doesn't leave you with much of a choice. :p