The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static, read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Am I Being Lied to.

    Discussion in 'Gaming (Software and Graphics Cards)' started by Todger, Apr 13, 2006.

  1. Todger

    Todger Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    5
    I have just received my new Dell Inspiron 9400 laptop. My problem is that when I ordered it, I chose the ATI Mobility Radeon X1400 graphics card with 256MB of HyperMemory (sounds impressive enough). On getting the machine out of the box and booting it up, I went into the BIOS to check the system details and found the graphics card has only 128MB of memory. The nice man I spoke to at Dell said "this is normal, the other 128MB of memory comes from the system", but I just get the feeling I am being lied to.
    I am interested to hear if anyone else has had this happen.
    Cheers, & Happy Easter.
    Todger.
     
  2. TedJ

    TedJ Asus fan in a can!

    Reputations:
    407
    Messages:
    1,078
    Likes Received:
    0
    Trophy Points:
    55
    No, that's the "HyperMemory" from all the brochures... it's very much like nVidia's TurboCache setup.
     
  3. dda

    dda Notebook Guru

    Reputations:
    4
    Messages:
    74
    Likes Received:
    0
    Trophy Points:
    15
    No, you're not. Although it is quite misleading, the X1400 has 128MB of "dedicated" memory plus 128MB that it draws from system memory.
     
  4. dello

    dello Notebook Geek

    Reputations:
    41
    Messages:
    81
    Likes Received:
    0
    Trophy Points:
    15
    Well, technically Dell's out-of-America support did NOT lie to you; he was quite right. The X1400 uses an ATI technology called 'HyperMemory' (nVidia's equivalent is TurboCache). What it does is this: the graphics card (X1400) has its 'own' 128MB of onboard dedicated memory that only the graphics card can use. BUT the graphics card can also be set up to use up to 128MB of memory from the system to increase texture space and improve performance. So yes, the graphics card can "use" up to 256MB of RAM, but the RAM dedicated to it is only 128MB.

    Hope this clears it up :)

    Mine is the same: 128MB ATI Radeon X200M + 128MB HyperMemory = 256MB
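
    To illustrate the arithmetic, here is a rough sketch in Python (the 128MB figures are simply the ones quoted in this thread, not values read from real hardware):

        # HyperMemory / TurboCache-style accounting: dedicated VRAM plus
        # system RAM the driver is allowed to borrow on demand.
        DEDICATED_VRAM_MB = 128   # what the BIOS reports (onboard memory only)
        SHARED_SYSTEM_MB = 128    # maximum system RAM the card may borrow

        advertised_total_mb = DEDICATED_VRAM_MB + SHARED_SYSTEM_MB
        print(f"BIOS shows:      {DEDICATED_VRAM_MB} MB (dedicated only)")
        print(f"Spec sheet says: {advertised_total_mb} MB (dedicated + shared)")

    Both numbers are "right"; they just count different things.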
     
  5. Notebook Solutions

    Notebook Solutions Company Representative NBR Reviewer

    Reputations:
    461
    Messages:
    1,849
    Likes Received:
    0
    Trophy Points:
    55
    I respect your product, but I must say: a 9400 with an X1400! :confused:
     
  6. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,200
    Likes Received:
    17,916
    Trophy Points:
    931
    Basically it's like getting the X300 with it: a light gaming option if you don't intend to game much.
     
  7. pbcustom98

    pbcustom98 Goldmember

    Reputations:
    405
    Messages:
    1,654
    Likes Received:
    0
    Trophy Points:
    55
    Yes, this is normal... it is RAM that is taken from system memory.

    pb,out.
     
  8. falhurk

    falhurk Notebook Guru

    Reputations:
    0
    Messages:
    60
    Likes Received:
    0
    Trophy Points:
    15
    Actually, it's like getting this generation's version of the X600.
     
  9. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,200
    Likes Received:
    17,916
    Trophy Points:
    931
    Considering it's an X600 with SM3, it's a light gaming chip, as that's what the X600 has become.
     
  10. Dustin Sklavos

    Dustin Sklavos Notebook Deity NBR Reviewer

    Reputations:
    1,892
    Messages:
    1,595
    Likes Received:
    3
    Trophy Points:
    56
    Actually, it's still a good sight faster than the X600.

    And for what it's worth, the X600 isn't godawful either. It still plays all modern games, most of them moderately well. Tomb Raider: Legend just came out, and if you make a couple of small sacrifices, it can run beautifully at 1280x800. Heck, even the X300 is still decent enough for most casual gaming.

    The biggest problem game I've seen recently has been F.E.A.R., but F.E.A.R. barely runs well even on high-end systems: it's just a badly coded game.
     
  11. Notebook Solutions

    Notebook Solutions Company Representative NBR Reviewer

    Reputations:
    461
    Messages:
    1,849
    Likes Received:
    0
    Trophy Points:
    55
    Yes, F.E.A.R. is indeed a badly coded game... I must say I didn't even want to play it at first, it is soooo scary!

    Charlie-Peru :)
     
  12. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,200
    Likes Received:
    17,916
    Trophy Points:
    931
    BF2 is hardly a shining example. Remember, I have the chip and I know what it's capable of, and it's starting to struggle, especially with the likes of Morrowind. Plus, the X1400 is not a **** sight faster, it's slightly faster:

    3DMark05:

    X1400 = 1750
    X600 = 1700

    Hardly a massive difference. Don't use 3DMark06 for the comparison, since the X1400 does have SM3 and so runs more tests.
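
    For scale, that gap works out to only a few percent (a quick check in Python, using the scores quoted above):

        # Relative difference between the two 3DMark05 scores quoted above.
        x1400_score = 1750
        x600_score = 1700

        gap_percent = (x1400_score - x600_score) / x600_score * 100
        print(f"X1400 leads the X600 by about {gap_percent:.1f}% in 3DMark05")
        # prints roughly 2.9% - hardly a massive difference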
     
  13. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,091
    Trophy Points:
    931
    I've seen the X1400 go over 2k on some notebooks, yet it scores around 1700 as posted on others... must have something to do with the manufacturer's settings.

    The X600 is definitely getting outdated though; it struggles with newer games. The X1400 should put up significantly better frames in F.E.A.R., etc.
     
  14. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,200
    Likes Received:
    17,916
    Trophy Points:
    931
    The X1400 is not going to do well in F.E.A.R. if the desktop X1300 Pro is anything to go by. Certainly the details will have to be toned down a bit, but yes, it will do somewhat better than the X600. However, tbh I think even the X1400 is going to have to bow out before long ^^.
     
  15. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,091
    Trophy Points:
    931
    Certainly true... You're probably not much better off with the X1400 than the X600, although the X1400 has yet to prove its worth; we'll see.

    When DirectX 10 hits next year (Vista supposedly arrives in Q1 2007, perhaps DX10 around then), all the DirectX 9 cards are going to have to bow out. ;)

    Chaz
     
  16. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,200
    Likes Received:
    17,916
    Trophy Points:
    931
    Not really. For one thing, all the interface's special features are DX9 (Vista will have a new version of DX9), and games are going to stay compatible with DX9 cards for a while after. And by the time you get 2-3 years down the line, your card, even if it had the right features bolted on, would need to be faster anyway :p
     
  17. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,091
    Trophy Points:
    931
    I was actually referring to wanting to play the next generation of games, but true.

    You'll still need a DirectX 9 card anyway, as I'm not sure how compatible the DX10 cards will be with DX9 games. From what I've read, not very, because of the different architecture and geometry shaders vs. the current vertex + pixel shaders, etc.
     
  18. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,441
    Messages:
    58,200
    Likes Received:
    17,916
    Trophy Points:
    931
    I am sure the cards should be able to handle it, especially given how programmable the core is.