The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    1.5GB > 3GB upgrade 25-30% better

    Discussion in 'Alienware 14 and M14x' started by Paddon, Apr 20, 2011.

  1. TostitoBandito

    TostitoBandito Notebook Evangelist

    Reputations:
    109
    Messages:
    439
    Likes Received:
    1
    Trophy Points:
    31
    It's a lot easier to eat up a ton of system RAM than it is video card memory. Ask anyone who works with Adobe products, encodes/decodes video, or runs a database.
     
  2. chewietobbacca

    chewietobbacca Notebook Evangelist

    Reputations:
    515
    Messages:
    459
    Likes Received:
    1
    Trophy Points:
    31
    The only internal difference is how efficiently they handle RAM. AMD does handle RAM more efficiently than Nvidia, but the point is moot - neither requires anywhere near double the RAM to be as efficient. Internal workings can differ, but never to the scale where a whole 1.5GB of difference is required.

    3GB of RAM can actually *hurt* performance because latencies are increased, not to mention the increased power draw and lower battery life (the desktop 5870 2GB drew 20W more power from its GDDR5 than the 1GB version :eek: )

    As for 8GB-16GB of system RAM: outside of those who do heavy video/photo editing, it's called marketing - some people just want the biggest number on their machine, regardless of whether they can use it. Truth be told, the vast majority of people will never see the need for even 8GB of RAM, as 4GB is enough.

    And there's no 2GB version of this card... RAM capacity is a multiple of the bus width, hence you get the 1.5GB/3GB configs on a 192-bit bus, etc.
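    (A minimal sketch, added for illustration and not part of the original post: total VRAM works out to channel count x chips per channel x chip capacity, which is why a 192-bit bus lands on 1.5GB/3GB while 256-bit cards land on 2GB. The 256MB-per-chip density is an assumed, era-typical GDDR figure, not a confirmed 555M spec.)

```python
# Sketch: VRAM capacities implied by bus width.
# Assumption: each GDDR chip sits on its own 32-bit channel; 256 MB (2 Gbit)
# per chip is an illustrative density, not a confirmed GT 555M spec.

def vram_gb(bus_width_bits, chips_per_channel, chip_mb=256):
    channels = bus_width_bits // 32
    return channels * chips_per_channel * chip_mb / 1024

print(vram_gb(192, 1))  # 1.5 -> the 1.5GB config
print(vram_gb(192, 2))  # 3.0 -> the 3GB config (two chips per channel)
print(vram_gb(256, 1))  # 2.0 -> why 2GB shows up on 256-bit cards instead
```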
     
  3. Serephucus

    Serephucus Notebook Deity

    Reputations:
    205
    Messages:
    1,002
    Likes Received:
    6
    Trophy Points:
    56
    Nope, if they're both DDR3, then they're the same. It might happen, like with the desktop 260, that the processor count gets bumped up without anyone saying anything, but that's not the case here.

    I know what you're getting at, but not really. While, yes, if you try really hard, you can use a fair amount of GPU RAM, chewie is right: 95% of the sales of cards like this are to people who just want the bigger numbers, without a second thought.

    To make money. They know people will always think bigger is better - and unfortunately there are a lot of those people - so they make these cards. Ever wonder why there's never been a 4GB GTX 480? It's because NVIDIA's and AMD's profit margins are actually a lot lower on their higher-end cards. They can't risk making something like that which won't sell - saying nothing of manufacturing limitations here - but they can with the low/mid-range cards.
     
  4. gameplayer22

    gameplayer22 Notebook Enthusiast

    Reputations:
    0
    Messages:
    12
    Likes Received:
    0
    Trophy Points:
    5
    Desktop performance vs game performance, apples to oranges.

    You're missing froogle's point in the OP about performance IN GAMES -- less stuttering. Yes, desktop performance won't gain much from more VRAM after a certain point, but with games, especially streaming-world games, the extra VRAM could mean the difference between smooth gameplay and stuttering.

    The more VRAM you have, the more high-resolution textures you can store, which means you don't have to hit system RAM, the HD, or disc media (wherever the detail textures are stored) to load those 1024x1024 32-bit textures, not to mention the smaller mip versions of those for far-away objects.
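    (A back-of-the-envelope sketch added here, not from the original post: a 1024x1024 32-bit texture with its full mipmap chain is roughly 5.6MB, so the numbers below show how many such textures fit in each VRAM size. The 75% texture budget is an illustrative assumption, not a measured figure.)

```python
# Sketch: memory footprint of a 1024x1024 32-bit texture plus mipmaps,
# and how many fit in 1.5GB vs 3GB of VRAM. Numbers are illustrative.

def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    total, w, h = 0, width, height
    while True:
        total += w * h * bytes_per_pixel
        if not mipmaps or (w == 1 and h == 1):
            break
        w, h = max(w // 2, 1), max(h // 2, 1)
    return total

tex = texture_bytes(1024, 1024)            # ~5.6 MB with the full mip chain
for vram_gb in (1.5, 3.0):
    budget = vram_gb * 1024**3 * 0.75      # assume ~75% of VRAM free for textures
    print(f"{vram_gb}GB VRAM: room for roughly {int(budget // tex)} such textures")
```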

    Other things such as Z-buffers, motion blur/post effects, and AA require more VRAM. Even physics is now being offloaded to the GPU -- that doesn't come free; it (as well as general GPU computing, OpenCL, Folding@Home) comes at the cost of VRAM.

    So is it worth it? For most 3D games, yes, definitely. For daily desktop users, not so much -- but if you are the latter, why buy an Alienware?
     
  5. EviLCorsaiR

    EviLCorsaiR Asura

    Reputations:
    970
    Messages:
    2,674
    Likes Received:
    144
    Trophy Points:
    81
    Actually, I bought 8GB because I was upgrading my RAM anyway (from 2GB), and by that point 8GB was almost identical in price to 4GB (well, a little bit more expensive). I doubt I'll ever actually go above 4GB, but it's nice to have that little extra bit of space.

    I guess it's the same sort of thing on the 555M; however, I certainly wouldn't pay $100 extra for it. I'd pay $50 extra at most for the 3GB 555M over the 1.5GB 555M.

    I doubt that even in a streaming-world game, with all of the settings cranked up, 3GB of VRAM would make any difference over 1.5GB (on the 555M at least). I play extremely large MMOs with my cards having 'just' 512MB of VRAM and they play silky smooth, completely maxed out.

    Of course, you can store extra high-res textures, and things like AA do indeed require more VRAM, but you'd still likely never come close to 1.5GB. Most games, even maxed out at 1080p with 16x AA, won't peak above 1GB. We're talking about a laptop with a 900p screen and a GPU that won't be capable of playing most games maxed out at 1080p (not with 16x AA, at least).
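    (A rough sketch to put numbers on the claim above, added for illustration: raw render-target cost at a given resolution and MSAA level, assuming 32-bit color and 32-bit depth/stencil per sample. Real engines allocate further targets such as G-buffers and shadow maps, so treat this as a lower bound, not a measured figure for any particular game.)

```python
# Sketch: raw render-target memory at a given resolution and MSAA level.
# Assumes 4-byte color and 4-byte depth/stencil per sample plus one
# resolved back buffer; real engines allocate more than this.

def render_target_mb(width, height, msaa_samples, color_bpp=4, depth_bpp=4):
    samples = max(msaa_samples, 1)
    msaa_buffers = width * height * samples * (color_bpp + depth_bpp)
    resolve = width * height * color_bpp
    return (msaa_buffers + resolve) / 1024**2

for (w, h), aa in [((1600, 900), 4), ((1920, 1080), 16), ((5760, 1080), 8)]:
    print(f"{w}x{h} @ {aa}x MSAA: ~{render_target_mb(w, h, aa):.0f} MB")
```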

    If you were talking about the very best GPUs on the market, on an EyeFinity setup (or a very high resolution display) with lots of AA then you'd be right. In this case, the rest of the 555M would bottleneck it far before it would make use of anything above 1.5GB of VRAM.
     
  6. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,782
    Trophy Points:
    581
    This level of gross misinformation is becoming an extreme irritation. If you think all of that is true, stop posting until you've educated yourself on the topic.

    He's referring to desktop GPUs, and you still haven't addressed why the highest-end GPUs don't come with 3GB.

    Even the GTX 485M couldn't use its full 2GB.
     
  7. froogle

    froogle Notebook Evangelist

    Reputations:
    152
    Messages:
    480
    Likes Received:
    4
    Trophy Points:
    31
    Ultimately it all comes down to two things: the game, and the API used to drive the card (as I mentioned earlier).

    I got called out earlier with the question "what makes you think 3GB will be used on a card with less power and less resolution?". The answer is a single acronym: DirectX 11.
     
  8. gameplayer22

    gameplayer22 Notebook Enthusiast

    Reputations:
    0
    Messages:
    12
    Likes Received:
    0
    Trophy Points:
    5
    I guess I have to defend myself... I've worked in the games industry for over a decade as a programmer, so I think I'm a little educated on the subject of system resources as they relate to games.

    I was reiterating froogle's OP about performance in games; did I not say that desktop performance was a non-issue at 1.5GB?

    We've always taken as many resources as possible to provide the best user experience on the best hardware, but eventually we have to scale back to cater to the average user. Most high-end cards aren't fully utilized because most average users aren't running multi-monitor 30-inch displays.

    Steam Hardware & Software Survey

    Less than 10% run multi-monitor setups, so if you want to sell units, you cater to the 90% of single-monitor users, who probably own 512MB-1GB cards.

    There are other uses for the extra VRAM, again, for example, GPU physics and general GPU computation. It's very likely a smaller percentage of the target audience, but those are legit uses of having more VRAM.
     
  9. ryujin

    ryujin 2B or not 2B

    Reputations:
    824
    Messages:
    2,032
    Likes Received:
    0
    Trophy Points:
    55
    ok, i have to admit...i sometimes play devil's advocate for giggles.

    and i sometimes ask questions and pose alternate views to see how people respond.

    there are more pros for the 1.5 vs the 3.0, but given the decent arguments for how the 3.0 could be a benefit, i don't see why people think they are pulling the wool over people's eyes.

    don't get me wrong, i know enough about laptops to make decisions on my own, but i like to see responses before i chunk the funds....lol
     
  10. revdiesel

    revdiesel Notebook Evangelist

    Reputations:
    205
    Messages:
    382
    Likes Received:
    0
    Trophy Points:
    30
    I talked to an Nvidia rep; here is that conversation regarding this card.

    I erased most of their name and my last name.

    Key points here: Alienware could have made this with 144 cores @ 590MHz and GDDR5. The 3GB card could make a difference in games to come with DX11, where 2GB can be used.

    [screenshot of the chat, part 1]

    Part 2

    [screenshot of the chat, part 2]
     
  11. stevenxowens792

    stevenxowens792 Notebook Virtuoso

    Reputations:
    952
    Messages:
    2,040
    Likes Received:
    0
    Trophy Points:
    0
    Thanks... good chat session!
     
  12. tuckrr

    tuckrr Notebook Consultant

    Reputations:
    16
    Messages:
    177
    Likes Received:
    0
    Trophy Points:
    30
    still not going to convince people who (at this point, IMO, are convincing THEMSELVES they don't want the M14x, and) don't think that the 3GB will help at all.

    let me think about this:
    the GPU is actually soldered onto the M14x
    (i.e. you cannot change it down the road)
    what you pick you are stuck with, so don't skimp out!
    FUTURE games work well with the 3GB card
    if you mess up now, you cannot fix your comp in the future.

    don't get me wrong, I'm sure both are excellent for gaming etc... but if you want the boost, it could be worth the $100 to some.
     
  13. revdiesel

    revdiesel Notebook Evangelist

    Reputations:
    205
    Messages:
    382
    Likes Received:
    0
    Trophy Points:
    30
    Yeah, we don't know what some games in 2012 will hold as far as what they can utilize. I can see the card difference not mattering at all for most people who keep the laptop for a year or less. But I can see it helping down the road if you keep it until next year and beyond.

    Some games this fall, like Battlefield 3, may be able to use some of the extra memory. Think back to what happened with i7 quad cores in gaming even just last year. A lot of people were sticking with their high-end dual-core CPUs because the quad wasn't really used by games at the time, so it didn't make much difference. Then some newer, more demanding games like BC2 came out, the quad made a huge difference, and now people wouldn't dream of using a dual-core over a Sandy Bridge quad for gaming (unless you're a Half-Life 1 FPS nut).

    I got one of the first EVGA GTX desktop cards with 2GB of memory about a year ago, and people said anything over 1GB was a total waste. Less than a year later, newer games came out in which my card performed better than the 1GB cards. Think of it this way: would you get a 256MB or 512MB card for serious gaming now? No way. That's what 1GB is already becoming. Just wait.
     
  14. stevenxowens792

    stevenxowens792 Notebook Virtuoso

    Reputations:
    952
    Messages:
    2,040
    Likes Received:
    0
    Trophy Points:
    0
    @All - Let me say this, and please don't take it the wrong way. I work in IT for a Fortune 50 company. I've been here for over 13 years now. I am not a know-it-all expert, but I do know what I know.

    The 555M will NOT be as powerful as the 5850 or 460. However, if you can game at 1366x768 (720p), the 555M will get 'almost' as many FPS as the 5850/460 will at 900p or 1080p. It is NOT a desktop replacement but a mobile gaming/media notebook. So, if you set your expectations accordingly, we should be able to put it through its paces and see what she's got to give.

    And I will bleed her dry... Trust me.

    StevenX
     
  15. sk3tch

    sk3tch Notebook Deity

    Reputations:
    107
    Messages:
    986
    Likes Received:
    285
    Trophy Points:
    76
    That chat session only proves that one dude at Dell has an opinion. He faltered at the end when he said the 555M is better than the GTX 460M. Why did he say that? He went purely by the numbers. He thought 555 > 460, so it must be better. Then you corrected him and he said "reliability" (heh). The reality (IMO) is that it was chosen for a) Optimus and b) the thermal requirements of the 460M being too great for the M14x's size.

    Bottom line: he fell for the numbers game in the chat, like he's hoping we all will fall for the numbers game when buying our 555M GPU option.

    EDIT: where I say Dell, replace "nVidia" :)
     
  16. revdiesel

    revdiesel Notebook Evangelist

    Reputations:
    205
    Messages:
    382
    Likes Received:
    0
    Trophy Points:
    30
    If you read the text before the chat screen, and the chat log itself, you would realize that this was not a Dell rep. It was an Nvidia rep.
     
  17. sk3tch

    sk3tch Notebook Deity

    Reputations:
    107
    Messages:
    986
    Likes Received:
    285
    Trophy Points:
    76
    My mistake. I edited my post to reflect that. My statements stand.

    You guys are gaming at 720p or 900p. You DO NOT NEED 3GB. In the future? Yeah, 2 years...3...it won't matter...your M14x will be a doorstop. Buy for today when it's a $100+ upgrade.
     
  18. Speedy Gonzalez

    Speedy Gonzalez Xtreme Notebook Speeder!

    Reputations:
    5,447
    Messages:
    3,143
    Likes Received:
    27
    Trophy Points:
    116
    All we need to know is whether the 3GB version has more stream processor cores or more bandwidth, because those are the only factors that will affect performance. If those are the same and the 3GB version only has higher clocks, it's worthless, because we can overclock the 1.5GB card to get the same performance.

    Can we expect someone with the M14x to run GPU-Z soon to see what is going on?! :)
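    (A quick sketch of the bandwidth figure being asked about, added for illustration: bandwidth = bus width x effective memory data rate. The clocks below are assumed, era-typical DDR3 and GDDR5 rates on a 192-bit bus, not confirmed GT 555M specs; GPU-Z would report the real numbers.)

```python
# Sketch: memory bandwidth from bus width and effective data rate.
# The example clocks are illustrative, not confirmed 555M specs.

def bandwidth_gb_s(bus_width_bits, effective_mhz):
    return bus_width_bits / 8 * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_s(192, 1800))  # ~43 GB/s, e.g. DDR3 at 1800 MHz effective
print(bandwidth_gb_s(192, 3200))  # ~77 GB/s, e.g. GDDR5 at 3200 MHz effective
```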
     
  19. chewietobbacca

    chewietobbacca Notebook Evangelist

    Reputations:
    515
    Messages:
    459
    Likes Received:
    1
    Trophy Points:
    31
    At the end of the day, you can get the GPU you think you want. If you think Dell/Nvidia reps aren't biased, fine by me, but as someone who's examined GPU architectures for many years now, I wouldn't touch the 3GB option unless it had actual clock differences.

    But don't come to us later and say you didn't hear us warn you about 3GB being useless
     
  20. revdiesel

    revdiesel Notebook Evangelist

    Reputations:
    205
    Messages:
    382
    Likes Received:
    0
    Trophy Points:
    30
    Personally, $100 is nothing to me, so if it performs a little better on my 1080p screen at some point in its life cycle, great. If not, oh well. No one has both versions in the M14x yet, so it's just a matter of opinion at this point.

    Also, to the person saying the rep was dumb or something along those lines due to them saying the 555M was better than the GTX 460M: they were stating that the 555M was better for the M14x due to a heat issue, not performance, as they stated.
     
  21. ryujin

    ryujin 2B or not 2B

    Reputations:
    824
    Messages:
    2,032
    Likes Received:
    0
    Trophy Points:
    55
    given that the gpu is part of the mobo, and given heat issues in general in small form factors, i can see why the 555 was chosen...
     
  22. TostitoBandito

    TostitoBandito Notebook Evangelist

    Reputations:
    109
    Messages:
    439
    Likes Received:
    1
    Trophy Points:
    31
    Yeah, I have no problem with the choice. It is a preferred alternative to having heat issues or having a laptop the size of the m15x.
     
  23. chewietobbacca

    chewietobbacca Notebook Evangelist

    Reputations:
    515
    Messages:
    459
    Likes Received:
    1
    Trophy Points:
    31
    Only it's not a matter of opinion; this is something that's been examined over and over again. Check out the 5870 1GB vs. 2GB reviews... the 2GB didn't begin to make a difference until 2560x1600 and Eyefinity setups... and the extra RAM cost 10-20W more in power usage :eek:
     
  24. gameplayer22

    gameplayer22 Notebook Enthusiast

    Reputations:
    0
    Messages:
    12
    Likes Received:
    0
    Trophy Points:
    5
    Just so it's clear, did it cost 10-20W more at idle, or under load? There are 1.7x as many pixels when you go from 1920x1200 to 2560x1600, which isn't going to be free. As you push more pixels, whether it's via resolution or more monitors, the GPU works harder, costing more watts.

    Power Consumption : AMD Radeon HD 5870 Eyefinity 6 Edition: One Card, Six Screens

    The idle difference was only 4W (115 to 119) between 1GB and 2GB, so in what context are you referring to 10-20W?

    And the benchmark for 1 vs 2GB, in Crossfire (all of this is getting a bit off topic):

    http://www.tomshardware.com/reviews/radeon-5870-eyefinity6,2595-11.html

    "The key here is that you’ll have to exceed the limits of a 1GB card at settings that’d still otherwise be playable on a 2GB board."

    Also, to clarify my position on the OP: 1.5GB will be fine for current games and probably the ones to follow in the next couple of years, but I wouldn't say that having 3GB is absolutely useless.
     
  25. vr4racer

    vr4racer Notebook Consultant

    Reputations:
    1
    Messages:
    121
    Likes Received:
    1
    Trophy Points:
    31
    How about the overclocking potential of both cards? Will one overclock better than the other? Usually the one with less memory (1.5GB) will have lower timings than, say, the 3GB version, and clock higher.
     
  26. stevenxowens792

    stevenxowens792 Notebook Virtuoso

    Reputations:
    952
    Messages:
    2,040
    Likes Received:
    0
    Trophy Points:
    0
    OH OH Oh Oh.. shoot.. I just thought of something. Is the motherboard that includes the 3GB 555M going to be different (such as model and manufacturer) from the motherboard with the 1.5GB? Could it impact overclocking ability...?

    Ugh... I wish they would just send me the tech sheets. I already placed my order so...

    StevenX
     
  27. stevenxowens792

    stevenxowens792 Notebook Virtuoso

    Reputations:
    952
    Messages:
    2,040
    Likes Received:
    0
    Trophy Points:
    0
    @Vracer - the one thing I did confirm today is that the clock rates are supposed to be the same!
     
  28. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,200
    Likes Received:
    17,911
    Trophy Points:
    931
    Usually higher density modules will not clock as well. As for the motherboards, I doubt it. The chips are the same, so the power consumption is virtually identical - just a couple of watts more for the memory.

    They won't want to spend masses of money on validation for such a minor difference.

    That's the same reason the i7 and the 2GB 540M are a package upgrade, so there are only 2 M11x motherboards. If they were separate, there would be 4 different motherboards.
     
  29. vr4racer

    vr4racer Notebook Consultant

    Reputations:
    1
    Messages:
    121
    Likes Received:
    1
    Trophy Points:
    31
    I'm not talking about stock clocks; I'm asking whether one overclocks better than the other.
     
  30. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,436
    Messages:
    58,200
    Likes Received:
    17,911
    Trophy Points:
    931
    In the past, higher density RAM has actually usually come clocked at lower frequencies.
     