The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Acer 5920G Full Upgrade (GPU, CPU, HD, RAM)

    Discussion in 'Acer' started by philhxc, Oct 4, 2009.

  1. philhxc

    philhxc Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    Hello, I'm a 3D artist and I have a pretty good workstation for all my projects. Lately, however, I've been meeting clients at their offices and need to bring what's now MY notebook (it used to belong to my wife, hehehehe), so I definitely need to upgrade it to do on-location updates to my 3D scenes and shaders...

    1. I know that this notebook is x64, so I need to upgrade my 32-bit Vista. Which is the fastest CPU that this machine will support? (Current: Core 2 Duo T5550 @ 1.83GHz)
    2. Can it support the MXM-II Nvidia Quadro FX 770M 512MB DDR3 graphics card, or which is the fastest GeForce? (Current: Nvidia 8600M GS 256MB)
    3. What's the max and fastest RAM that it supports on x64? (Current: 3GB DDR2 667MHz)
    4. Fastest hard drive supported! (Current: WD 5400rpm 320GB)

    thank you all
     
  2. locoelbario

    locoelbario Notebook Enthusiast

    Reputations:
    11
    Messages:
    26
    Likes Received:
    0
    Trophy Points:
    5
    That all depends on which motherboard is in your laptop.
     
  3. philhxc

    philhxc Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    Cool, how do I find that out?
     
  4. moral hazard

    moral hazard Notebook Nobel Laureate

    Reputations:
    2,779
    Messages:
    7,957
    Likes Received:
    87
    Trophy Points:
    216
    What chipset do you have?
     
  5. philhxc

    philhxc Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    No clue, how do I find that out?
     
  6. philhxc

    philhxc Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    okokok..


    Chipset : Intel PM965
    SouthBridge: Intel 82801HBM (ICH8-ME)
     
  7. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    Hi PhilHXC, and welcome to the forums! FYI, all 5920Gs come with the Intel PM965 chipset.

    1. Just because your laptop has 64-bit capability doesn't mean you have to run 64-bit Windows; 32-bit will work just fine. That said, the license key on the bottom of your laptop will still work with the 64-bit edition of the same version of Vista your machine came with.
    Since your system is using the PM965 chipset, the fastest CPU your system can handle would be the Core 2 Extreme X9000 at 2.8 GHz. BUT because it's an Extreme CPU, your BIOS may not have support for it. Your safest bet would be the Core 2 Duo T9500 at 2.6 GHz.

    2. No, the 5920G will not support the Quadro FX 770M; it's based on the GeForce 9600M GT, which has been shown time and time again not to work on the 5920G. The fastest Nvidia GPU you'll be able to upgrade your 5920G to is the GeForce 9500M GS DDR2 (it's a refined 8600M GT), or possibly the GeForce 9600M GS GDDR3.
    The going trend for Acer MXM upgrades is the ATi Mobility Radeon HD 4650 DDR2, which is the most recent (and quite possibly final) release for the aging MXM-II slot.

    3. The PM965 chipset is limited to DDR2-667 memory, and upgrading to 64-bit won't let you get faster memory. You would be able to get more, however; 64-bit systems can use all 4GB of RAM if you have it installed.

    4. The fastest hard drive, well, isn't one. If it's blistering speed you want, it's time to look into a solid state drive. Think of it as a turbocharged flash drive inside your machine. The Intel X25-M and OCZ Vertex are the more popular and faster of the models out there. There's a sizable SSD thread on the Hardware forum if you care to read.
    If SSD isn't in the cards, look for a 500GB 7200rpm drive.
     
  8. philhxc

    philhxc Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    Well, thank you very much for your reply, it was most useful. So here are my new questions:

    1. I use and abuse Autodesk's Maya 2009; I think the installation disc includes both 32-bit and 64-bit versions. Would it work better on a 64-bit system? (I'm planning to downgrade to XP x64.) I also do lots of tweaking on my OS to make it work faster (removing and disabling themes and unneeded services).
    2. Is any mobile Quadro compatible with my machine? The FX 570M, possibly? Or should I just stick with the GeForce 9500M GS DDR2?
    3. How much more memory can I get?
    4. I'm all clear about the drive.

    in my shopping bag now:

    • Intel® Core™2 Duo Mobile Processor T9500 2.6G/6M/800FSB
     
  9. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    1. 64-bit applications will run faster on a 64-bit operating system than their 32-bit counterparts, so there's an advantage there. (32-bit applications will still work in 64-bit Windows, though.)
    If I were you, though, I'd stay well away from Windows XP 64-bit. It's woefully undersupported and starting to show its age. Go Windows Vista or 7 64-bit if you must.

    2. I don't believe so, but if any would work, the FX 570M is your best bet, as it's based on the GeForce 8600M GT (which your machine is compatible with).

    3. The PM965 can handle a maximum of 4GB of RAM, but can only access ~3GB in a 32-bit operating system.
     
  10. philhxc

    philhxc Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
  11. thelaptopguy

    thelaptopguy Notebook Geek

    Reputations:
    -3
    Messages:
    93
    Likes Received:
    0
    Trophy Points:
    15
    I wonder what the chances are of any of the Mobility 5xxx series making it to MXM-II.
     
  12. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    OK Phil, allow me to stop you from doing anything silly - DON'T BUY THEM.
    Save the $800 and put it towards a laptop that will actually meet your requirements! Like a Lenovo ThinkPad W series or HP EliteBook. Just put up with the 5920G until you can afford a better laptop.
    (I speak from experience - I could've spent quite a bit on upgrading my old TravelMate 3252WXMi, but I elected to save and wait, and rewarded myself with the 6920G)

    @Thelaptopguy: Probably nil. Acer's laptops are already transitioning to MXM-A.
     
  13. classic77

    classic77 Notebook Evangelist

    Reputations:
    159
    Messages:
    584
    Likes Received:
    0
    Trophy Points:
    30
    Mine is the best 5920g on the internet, guaranteed!

    (see sig)

    The 4650 is without question the best card available for this notebook. Don't mess around with Nvidia crap! I had an 8600M GT and my upgrade doubled my frames in some cases (see sig for 4650 upgrade thread).

    Suigi is right though. If you're gonna drop $600+ on this notebook, I'd just save up. $300 is the most I'd spend on an upgrade.

    PS: IF YOU REALLY WANT QUADRO (which I could understand, since you're talking about content creation), I BELIEVE it's possible to flash your 8600M GT to its Quadro cousin. Failing that, you could at least undervolt and overclock it.

    GOOD LUCK!
     
  14. philhxc

    philhxc Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    0
    Trophy Points:
    5
    Yeah, I was thinking about that. I think I'm gonna go ahead and discard my aging workstation and get a mobile workstation; Dell has some nice ones. I have my sights on the Precision Mobile Workstation M6400 Advanced.
     
  15. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    A new lappy would be better, mate. The only reason I'd spend a quite ridiculous amount of money upgrading my 5920G (and I'm still doing it) is that I really love it and don't want to change it :D It seems you don't share my feelings, so as I and the others said, better get a new one :)
     
  16. amjman

    amjman Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    I just upgraded my 7720, and wanted to quickly chime in:

    the T9500 cost me $214 new, shipped (for free) from Ewiz, and for me that was an awesome upgrade and worth it, as I researched laptops with a 17" screen and a T9500 and could find NONE under $800 worth looking at

    the 4GB you can get for very little, like $40-$50 (I had a Kingston 4GB kit in mine before I turned it on the first time), while when I got my 4GB 2+ years ago it was $200

    the Bluetooth upgrade I did was $20, and it is now built in

    the MXM card is $79, from here http://cgi.ebay.com/ATI-Mobility-Ra...emQQptZPCC_Video_TV_Cards?hash=item3ef914ea7b and here is a how-to: http://www.theacerguy.com/2008/12/morris-aspire-5920-gpu-swapupgrade/

    And I was considering this for my machine, for $138! http://www.newegg.com/Product/Product.aspx?Item=N82E16820609393 and then use the drive you have as the secondary one!

    So, for less than $500 and a little time you can have the fastest darned Windows 7 laptop ever... and Win7 does recognize SSDs natively and handles the TRIM function automatically! I cannot wait to get mine!

    Just my 2c

    A
     
  17. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    I was leaning towards an OCZ Agility 60GB, and using an ODHD adapter to stuff my Scorpio Blue 320GB in place of my Blu-ray drive.
     
  18. amjman

    amjman Newbie

    Reputations:
    0
    Messages:
    3
    Likes Received:
    0
    Trophy Points:
    5
    I looked at a review at Tom's Hardware and saw that the OCZ, although available, was a bit slower than the model I suggested... although I may have to wait three weeks to get it!

    I am so glad that my 7720 has a slot ready for this upgrade... and I will post speed results once I get it! I cannot wait for Win7 to boot so fast I can't count it!

    A
     
  19. u6b36ef

    u6b36ef Notebook Consultant

    Reputations:
    48
    Messages:
    164
    Likes Received:
    1
    Trophy Points:
    31
    @ TehSuigi,

    Interesting point about the vague possibility of the Quadro cards in the 5920. With ref. to point two in your post made on 10-04-2009, 04:03 PM: the 9650M GT is working in the 5920, and the 9650 is a higher-clocked 9600M GT, granted made on a newer, smaller process. Since this is true, I have often wondered about the Nvidia 100, 200, and Quadro series cards in the 5920.
     
  20. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    @u6b36ef: That post was over a year ago - since then, it's become clear that the 9650M GT is indeed functional on the 5920G, which means it must be a very specific blacklisting that prevents the 9600M GT from working.

    As for the 100 and 200 series, they aren't very common on the MXM-II form factor, if they appeared at all. And since Acer moved from MXM-I/II/III to MXM-A and now back to embedding the GPU into the system board, your only hope for those would be other OEMs who still use MXM cards.
     
  21. u6b36ef

    u6b36ef Notebook Consultant

    Reputations:
    48
    Messages:
    164
    Likes Received:
    1
    Trophy Points:
    31
    @ TehSuigi

    I keep wondering about the 9650M GT, mainly because its price is generally less than the 4650's. I can't find the following info, though, which I need to be sure of:
    1. Does the 9650 suffer from the PowerMizer issues like the 9500? (If it does, then I'd probably give it a miss.)
    2. Does it run using most Nvidia drivers?
    3. The temperature issues.
    I believe this 5920 has a good cooler, i.e. a recent revision. I'd rather avoid the need to undervolt if the 9650 runs to 90°C plus.

    Congrats on your 4650 success.

    U6b36ef.

    PS: I saw somewhere you commented that you hadn't seen a CPU upgrade guide for the 5920. Just in case you haven't seen it, Morris made one over on The Acer Guy: The Acer Guy Blog Archive - Morris' Aspire 5920 CPU Swap/upgrade.
     
  22. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    I wouldn't know - head on over to the Acer MXM thread and talk with people who actually have upgraded their 5920G to a 9650M GT.

    I wasn't the one who upgraded to a Radeon HD 4650 (though I honestly wouldn't mind) - were you thinking of classic77?
     
  23. u6b36ef

    u6b36ef Notebook Consultant

    Reputations:
    48
    Messages:
    164
    Likes Received:
    1
    Trophy Points:
    31
    Yeesh.
    I was thinking of this:
    http://img193.imageshack.us/img193/8586/dsc01015q.jpg

    I must have incorrectly remembered it as you, and made the connection 'cos you are like a moderator here. Someone has the 4650 on eBay (Type II) at the moment. Too many bucks, I think.

    To be honest, I wonder about the Radeon cards. The 3650 draws 30W, whereas the original 8600M GS/GT draws 20-23W. The idea of potentially increasing the current load on the MXM-II circuits and the laptop charger worries me. (Unless the MXM circuits are somehow current-limited???)

    Undervolting the GPU card helps from this perspective, since a reduction in volts means a reduction in amps and watts. (P=VI and V=IR)

    AMD quote the 4650 between 15-25 watts, according to Notebookcheck, yeah. Unclear if that includes 5 watts for memory. Yet they list the 4650 at a total 35W. [Lost me!]
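    That undervolting point can be sketched numerically. A minimal example, treating the GPU as a fixed resistance so that V=IR forces the current down with the voltage (the voltage and resistance values below are made up for illustration; real chips aren't simple resistors):

```python
def power_watts(volts, resistance_ohms):
    """P = V * I, with I = V / R under a fixed-resistance model."""
    current_amps = volts / resistance_ohms  # V = I*R  ->  I = V/R
    return volts * current_amps             # so P = V^2 / R

R = 0.05  # hypothetical effective load resistance, ohms
p_stock = power_watts(1.15, R)  # illustrative stock core voltage
p_under = power_watts(1.05, R)  # illustrative undervolt
# Lower volts -> lower amps -> lower watts, quadratically.
print(round(p_stock, 2), round(p_under, 2))  # prints: 26.45 22.05
```

    The quadratic drop is why even a small undervolt is worth having on a thermally constrained card.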

    U6b36ef
     
  24. downloads

    downloads No, Dee Dee, no! Super Moderator

    Reputations:
    7,729
    Messages:
    8,722
    Likes Received:
    2,231
    Trophy Points:
    331
    That part isn't strictly true. We are talking about reducing the power output of a given card. The current stays the same and the voltage goes down, hence the power output is reduced.
     
  25. u6b36ef

    u6b36ef Notebook Consultant

    Reputations:
    48
    Messages:
    164
    Likes Received:
    1
    Trophy Points:
    31
    A reduction in volts most definitely means a reduction in amps (current through the device), and therefore watts (power usage by the device).

    Simplify: a torch. Reduce the voltage; the resistance of the 'object' stays the same, so the amps reduce.
    V=IR
     
  26. downloads

    downloads No, Dee Dee, no! Super Moderator

    Reputations:
    7,729
    Messages:
    8,722
    Likes Received:
    2,231
    Trophy Points:
    331

    That's where you're wrong. Resistance depends on both voltage and current; it doesn't exist by itself. There's a quote from Wikipedia that explains it: see the hydraulic analogy on Wikipedia. No water flowing, no resistance, right?
    R=V/I is Ohm's Law; you can imagine that you increase the current without changing the voltage (just as an example).
    If you do increase the current with the same voltage, resistance goes down. So the resistance does change.
    You can also search for why people say resistance changes when the properties of electricity change (voltage or current).
    If you take that into consideration, you are going to see why undervolting lowers temperatures: because in essence lower voltage with the same current equals lower resistance. And that means lower power output.
    Hope I was clear on that one.
     
  27. u6b36ef

    u6b36ef Notebook Consultant

    Reputations:
    48
    Messages:
    164
    Likes Received:
    1
    Trophy Points:
    31
    Resistance in passive components is completely independent of anything except temperature. IT IS FIXED. A 25 kΩ resistor is 25 kΩ at any voltage, on any planet.

    To quote you here, downloads:
    "If you do increase the current with the same voltage- resistance goes down."

    YOU CAN'T!!!
     
  28. downloads

    downloads No, Dee Dee, no! Super Moderator

    Reputations:
    7,729
    Messages:
    8,722
    Likes Received:
    2,231
    Trophy Points:
    331
    OK, I started answering you, but I've realized that at this point I would have to be impolite or use many exclamation marks like yourself.
    Unfortunately I can't (I'm a mod on this sub-forum), so I wish you a pleasant weekend. Maybe someone else will be able to convince you (or not; I don't really care).
     
  29. u6b36ef

    u6b36ef Notebook Consultant

    Reputations:
    48
    Messages:
    164
    Likes Received:
    1
    Trophy Points:
    31
    You have been 'discussing', incorrectly, what I learnt in my first ever A-level physics lesson.

    U6b36ef
    BSc Electronics (Dual Hons 1996-1999 Keele University),
    MSc,
    1st year of PhD Electronics (C-MOS chip design) at UMIST - University of Manchester Institute of Science and Technology.
     
  30. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    Guys, please, what on earth does this have to do with making the upgrade? Keep it focused and on-topic, and avoid the snipes at each other. I've edited your posts to simply shove the off-topic bickering into spoiler tags.

    The Aspire 5920G has already been upgraded to the Radeon HD 3650, with few ill effects; you've already commented on the article at The Acer Guy. The cooling system seems capable of absorbing the increased wattage.

    U6, here's a link to the premiere resource on MXM - MXM-Upgrade.com. Hopefully it'll answer your questions.
     
  31. u6b36ef

    u6b36ef Notebook Consultant

    Reputations:
    48
    Messages:
    164
    Likes Received:
    1
    Trophy Points:
    31
    @ TehSuigi

    You have misunderstood my point also. I'm not discussing the 5920 cooling system's ability with the ATI cards.
    [Yes, I did mention this on The Acer Guy, but no one knows any more.]
    If I am one in a million with my perspective, that is fine. Until the available information changes, I will keep my views.

    My issue is the upgradability of the 5920, and my wonderings about the suitability of the ATIs. Yes, countless have done it, and the 3650 may be operating inside tolerances.

    Look here: http://en.wikipedia.org/wiki/Mobile_PCI_Express_Module

    In the first table, the 'Max Power' rating of the MXM-II slot is 25W.

    This information is also on the HotHardware website. http://hothardware.com/Reviews/NVIDIAs-MXM-Graphics-Module/

    The ATI HD 3650 draws 30W; it is a 30W card. (Assuming the information on the Notebookcheck website is correct.)
    http://www.notebookcheck.net/ATI-Mobility-Radeon-HD-3650.8839.0.html
    See: Current Consumption 30 Watt

    If the 4650 does draw 35W, maybe you should consider this before buying for a 5920.

    You see also, swapping a 22W graphics card for a 30W one increases the electrical load on the system and charger by about 9% (8W extra out of the 90W budget).
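    As a sanity check on that 9% figure, here's the arithmetic spelled out (wattages as quoted in this thread; a rough sketch, not measured data):

```python
system_budget_w = 90   # 19V x 4.74A stock adapter
old_card_w = 22        # 8600M GS/GT, quoted at 20-23W above
new_card_w = 30        # HD 3650, per the Notebookcheck figure

extra_w = new_card_w - old_card_w     # 8 W more at the card
increase = extra_w / system_budget_w  # fraction of the 90 W budget
print(f"+{extra_w} W is {increase:.0%} of the 90 W budget")
# prints: +8 W is 9% of the 90 W budget
```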

    If you still don't understand anything here, please ask (?).

    U6b....

    This is why I thought you had the HD4650 fitted, though I see now it's not yours.
    http://forum.notebookreview.com/acer/381892-5920g-upgraded-radeon-hd-4650-a.html

    Don't accuse me of sniping. Downloads' comments were 'misleading' and needed correcting. (Forgivably.)
     
  32. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    I'm going to go with Notebookcheck being wrong; they may be very good when it comes to most specifications, but I always take that site with a grain of salt.

    And I'm not "accusing" you of sniping; your tone was inappropriate.
     
  33. u6b36ef

    u6b36ef Notebook Consultant

    Reputations:
    48
    Messages:
    164
    Likes Received:
    1
    Trophy Points:
    31
    No.

    Plus, undervolting and electrical safety are part of upgrading GPU cards, and not 'off topic'.
     
  34. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    @u6b36ef - You are right to care about the power consumption. I had the same fear as you before my upgrade to the HD4670. The GPU itself runs, but if I want to charge my battery AND play a game, that's impossible; the battery simply doesn't charge. I'm considering upgrading my PSU to a 120W one, so I hope it will do the trick and also won't burn the floor while I'm gaming, as my current one does :D
     
  35. u6b36ef

    u6b36ef Notebook Consultant

    Reputations:
    48
    Messages:
    164
    Likes Received:
    1
    Trophy Points:
    31
    @ triturbo

    This is a significant result you have.

    I've run a test of charging the battery with the 8600 card installed and the GPU running at full load (monitored by GPU-Z: GPU Load tab showing 99%). It charges quite normally, quickly, seemingly at the usual rate. You probably knew this, but I did it anyway, just in case; I needed to know for my own info.

    For the present, I can't recommend moving to a 120W PSU. Good idea, but possibly fatal sooner or later; I don't know.

    Since the 5920 operates at 90W max, it should be able to run everything at the same time from that power supply: screen, CPU, memory, GPU, DVD, USB ports, everything. Your info shows that the 4670 is drawing more power.
    A quick example here might make this easier for some to understand.

    In a home, switching on more lights and appliances increases the current in the ring-main system to meet the demand of the growing number of electrical gadgets. Plug in something that requires a lot of power, and what happens? The fuse trips. That's because the current has exceeded safe limits through the ring main. The fuse or circuit breaker is there to stop more and more current flowing before the wiring gets too hot and fire occurs.

    A standard 5920 can only draw 90W max, whatever the current rating of the PSU above 4.74A. (P=VI, power equals volts times amps: 19V x 4.74A = 90W.) Obviously a 3.42A PSU is low, but in some cases it will run the laptop, because not everything is turned on; a laptop only draws what it needs. Put in a 19V, 6.3A = 120W supply and the laptop will still only draw 90W through the PSU. It only draws what it needs. (Of course, turn up the voltage of the supply and things get dangerous.)
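    The adapter figures in that paragraph can be checked with the same P = V x I arithmetic (a quick sketch using the amperages quoted above):

```python
def adapter_watts(volts, amps):
    """Maximum power an adapter can deliver: P = V * I."""
    return volts * amps

print(round(adapter_watts(19, 3.42), 2))  # 64.98 -> the "low" 65 W-class brick
print(round(adapter_watts(19, 4.74), 2))  # 90.06 -> the stock 90 W brick
print(round(adapter_watts(19, 6.3), 1))   # 119.7 -> the 120 W upgrade option
```

    Note these are ceilings on what the brick can supply; as the post says, the laptop only draws what it needs.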

    It now seems apparent you have increased the power load of your laptop by adding an HD4670. Your PSU can only output 90W. If you upgrade your PSU to a 120W one, with the higher load demand from the laptop, what will happen? More current will be encouraged to flow.

    In the case of the home circuit, the fuse trips to prevent fire or damage to parts. Where is the fuse on the laptop power circuit? (There are just components.) If you run more power through, you could destroy the circuitry at the point where current enters the laptop, or it might fail over time. Using another example: imagine a torch with 2x1.5V batteries, and put another battery in. It will burn brighter, but will not last as long. Increasing the current through your power circuits may heat-damage them; heat is a leading cause of electronic component failure.

    Hope this all makes perfect sense.

    My guess is you're probably OK running the 90W PSU and charging your battery when you are not gaming.

    My worry with the HD4650/4670 is this: http://en.wikipedia.org/wiki/Mobile_PCI_Express_Module
    They state the max power rating of the MXM-II slot is 25W. If the HD4650 can draw more than 25W across the MXM-II circuit, then it might be getting the circuits warm. Your evidence appears to show the HD4670 drawing current from the battery-charging circuit.

    Overall, though, this doesn't make sense: why would a GPU manufacturer make cards rated higher than the socket?
     
  36. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    I still don't quite follow how an MXM-II card is designed to draw more than 25 W when that's the supposed limit for the MXM-II form factor, but then again I don't know a lot of things. :p
     
  37. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    Since it's not a strict standard, it will draw more than that. The slot capacity is not really that important, much like the power plugs on desktop graphics cards (the 6- and 8-pin cables: the 8-pin is just the 6-pin with 2 extra ground pins; the 6-pin is rated 75W and the 8-pin 150W, but the 6-pin could probably easily handle 200W).

    What's important is the power circuitry on the graphics card and motherboard; if that has been designed to handle a higher-wattage card, then the slot capacity is not very important.

    Remember, retail graphics cards are sold off the shelf, and to put the logo on one it needs to meet certain specifications. MXM modules are almost all sold in machines where this is not needed.
     
  38. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    Ah, I get it. Thanks, Meaker!
     
  39. u6b36ef

    u6b36ef Notebook Consultant

    Reputations:
    48
    Messages:
    164
    Likes Received:
    1
    Trophy Points:
    31
    @ TehSuigi,

    It's astonishing, isn't it.

    The options are:
    1. The HotHardware data (MXM II <= 25W) is old and strangely no longer relevant, since that page was produced on May 17, 2004.
    1.1 The Wiki site got its info from the HotHardware site.
    2. AMD/ATI don't care. That seems absurd, but strangely the Nvidia cards I have seen are all within 25W in the MXM-II form factor.

    I can't blame folk for struggling with this. The flow of people using the HD4650 sort of takes you along with it. Even knowing what I do, I was still reluctantly pursuing the HD4650 route myself until recently. Triturbo's news has stopped me dead in my tracks. Only evidence would change my mind now.

    @ Meaker,

    I don't understand your first sentence.

    The slot capacity is important. It is part of the circuit. I explained what happens if you pull too much current across a circuit. The tracks/rails/pins, whatever you choose to call them, sit between your motherboard and your GPU. The socket will have a current rating, whether it's 25W or more.

    Your analogy with desktop PSUs is badly flawed. Assuming your info is correct that the two extra pins in an eight-pin power socket are ground, don't take it to mean that nothing else has changed, or that that explains away the electrical design. Adding ground can give you the potential to increase current flow in a device and therefore raise its power rating; the positive cables to the device will be designed for the higher current rating.

    - Don't put a 200W card on a 75W rail. Doing this is one of the reported main causes of destroying a PSU. If you need further convincing, imagine putting a 200W light bulb in a 75W lamp. (Don't do it, though!)

    1. It is exactly the design of the GPU card which makes it 35W, so of course it will run at 35W; why not? (It's not the mobo mysteriously pushing power in.)
    2. Yes, if the 5920 circuits can safely carry the higher current then it's OK, but only if the MXM-II slot can too.
    The 'MXM-II slot' and the 'mobo circuits' each count for 50% of the point under 'debate'.

    Given that Triturbo's evidence is the first/only reported potential side-effect of the upgrade, this is drawn into question. (Rep up to Triturbo.)

    As already explained, the reason laptops are running ATIs is that the higher current demand is somehow sustainable; for how long remains to be seen. This is what I want to know, but only an Acer design engineer/technician/circuit analyst with the schematic can say for sure.
    I already said I think Triturbo will probably be OK. Triturbo's laptop PSU being so hot indicates to me the charger is running at full potential while gaming.

    Acer do make HD4650 laptops:
    Review Acer Aspire 5940G Notebook - Notebookcheck.net Reviews
    This almost validates the HD4650 with MXM-II, as in other stock HD4650 laptops. A good guess at this point, though, says the power supply for the graphics doesn't affect the battery-charging circuit of the 5940.
     
  40. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    @u6b: Actually, the 5940G uses the MXM-A form factor, which is totally different in specifications from MXM-I/II/III/IV.

    The MXM-II Radeon HD 4650 actually hails from two late-model configurations of the Aspire 6930G; since the card was used by Acer themselves in an MXM-II machine, I naturally assumed that it would be safe to use in other Acer machines.

    The DDR2 Radeon HD 4670 showed up later through MXM-Upgrade.com, and I can't speak to where they're getting it from. triturbo's using an MSI-sourced GDDR3 Radeon HD 4670, which might explain the wonky power issue he's run into.
     
  41. u6b36ef

    u6b36ef Notebook Consultant

    Reputations:
    48
    Messages:
    164
    Likes Received:
    1
    Trophy Points:
    31
    @ TehSuigi,

    Well spotted, my mistake with ref. to the 5940. Thanks for the right info!
    Thanks too for the info on the 6930G.

    I was actually thinking of quoting an Acer with an ATI card to try to justify ATI on MXM-II in another Acer. I knew only of the 5935G with the HD4570, but sadly there are no power-consumption figures for the 4570.

    I misread Triturbo's card as an HD4650, sorry; I'll go back to the above posts and make amendments. (A 4670 with GDDR3 - whoa.)

    Regarding Triturbo's card, power consumption comes down to the voltage, which will be set in the BIOS. You would have to compare the MSI BIOS and the Acer BIOS to draw any conclusions. Is the MSI volted higher? (I'm guessing it's the extra load causing the results.)

    Generally, I'm cautious about undervolting, but for Triturbo, if the HD4670 is not undervolted, doing so would really help to a degree.

    Aside: one overview statistic for the HD4670 is that it draws 3W more than the HD4650 at load. Assuming that to be true, I assumed it was to support stability at higher clocks.

    Adios
     
  42. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    That's my thinking exactly, and that's why I pointed it out. I take your point about the higher flow through the circuit, but I've seen other members (though with non-Acer lapps) upgrade their PSUs. If I recall correctly, Dell swapped the initially shipped PSUs on an Alienware model (I don't remember which one) for a higher-spec one, as it had proved insufficient, without changing anything in the lappy itself.

    Yep, it's because of the higher voltage. My default voltage is 1.2V, where the HD4650 comes with 1.1V. I have undervolted my GPU and it runs cooler, but the power consumption is still pretty high (meaning a hot charger, hotter than with the HD3650).
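    For a rough sense of what a 1.2V -> 1.1V undervolt buys, assuming dynamic power scales with the square of core voltage at a fixed clock (the usual first-order CMOS approximation; the 35W baseline is the figure floated earlier in the thread, not a measurement):

```python
def scaled_power(base_watts, v_old, v_new):
    """First-order CMOS model: dynamic power ~ C * V^2 * f,
    so at a fixed clock it scales with (v_new / v_old) ** 2."""
    return base_watts * (v_new / v_old) ** 2

p = scaled_power(35.0, 1.2, 1.1)
print(round(p, 1))  # prints: 29.4  (roughly a 16% saving at the core)
```

    That squares with the observation above: undervolting helps the temperatures, but the card still draws noticeably more than the 20-23W it replaced.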
     
  43. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    Well, considering the 4670 has 3 times the number of shaders and will be stressing its memory more, then yeah, it's going to run hotter.

    I use a 95W Cooler Master power supply. It's super thin and more efficient than standard PSUs.

    http://www.coolermaster.co.uk/product.php?product_id=6618

    They have a new even smaller model out now:

    http://www.coolermaster.co.uk/product.php?product_id=6645

    Also, my optical drive is external and needs a USB port for power, and this brick provides it. It also means that on holiday my wife does not need to bring her iPod's wall plug, as the brick provides fast charge over USB.

    As for the tracks on the connectors struggling with 35W vs 25W: they are very similar in size to those used on the desktop, which cope with up to 75W.

    There is also the fact that the GTX 480M uses the SAME connector as the MXM-A standard and draws 100W! The power pins on MXM II are not that much thinner.

    Yes, my PSU analogy was flawed for a couple of reasons:

    1. PSU cables are MUCH longer and so have a much higher impact on the circuit; the connector in this case is only a few mm in length, so its effects are reduced.

    2. A PSU cable rated for 75W may well be coupled with circuitry in the PSU itself that is only capable of supplying 75W.

    What I was trying to say is: if I took a 200W 12V PSU and used the 75W 12V PCI Express cable to connect it to a 200W consuming device, the cable would get warmer than usual and there would be a greater voltage drop across it, but it would still work fine. We pump far more watts down the 8-pin CPU connector on most motherboards (up to 200W here) with similar-gauge wire.

    So, in conclusion, you should be more concerned with the surrounding circuitry on the motherboard and the MXM card itself.
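
    To put some numbers on the connector argument above: at a fixed supply voltage the current is I = P/V, and the loss inside the connector itself is I²R. A quick sketch — the 5 milliohm contact resistance is an assumed round number for illustration, not a figure from the MXM spec:

```python
# Sketch of the connector argument: a few-mm connector adds very
# little loss even as wattage rises. The 0.005 ohm contact
# resistance is an assumed value, not an MXM spec figure.

def connector_loss(power_w, volts, resistance_ohm):
    """Return (current in A, voltage drop in V, heat in W)."""
    current = power_w / volts
    drop = current * resistance_ohm
    heat = current ** 2 * resistance_ohm
    return current, drop, heat

for watts in (25, 35, 75):
    i, v_drop, heat = connector_loss(watts, 12.0, 0.005)
    print(f"{watts}W @ 12V: {i:.2f}A, drop {v_drop * 1000:.0f}mV, "
          f"{heat:.2f}W lost in the connector")
```

    Note the loss grows with the square of the current, yet even at 75W it stays well under a watt for this assumed resistance, which is why the surrounding regulation circuitry, not the connector itself, is the likelier limit.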
     
  44. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    Except the GTX 480M is an MXM-B card if anything, and thus rated for a higher TDP and power draw.
     
  45. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    Ah, but the slot, or interface (which is the rating we are discussing here, and the only thing the manufacturers don't have control over), is the same between MXM-A and MXM-B.

    If I removed the slot used for a GTX 480M and used it to replace the one in my machine, it would work just fine.
     
  46. u6b36ef

    u6b36ef Notebook Consultant

    Reputations:
    48
    Messages:
    164
    Likes Received:
    1
    Trophy Points:
    31
    @ TehSuigi

    Although I figured Triturbo's situation is caused by card power usage, I forgot to mention that this is really interesting speculation.

    @ Triturbo

    Your voltage figures explain Notebookcheck's 'AMD published current consumption' figures of 3W higher for the 4670 over the 4650. All good.

    Maybe an email to Acer would settle your issue:
    "Please can I use a higher wattage PSU to accommodate my new GPU? Can the power circuits sustain the extra load comfortably?", etc.

    U6b36ef
     
  47. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    You're likely to get a:

    "We don't support any cards not listed in the FRU and would not comment on the ability of the laptop in question to accomodate a higher power draw"
     
  48. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,239
    Trophy Points:
    231
    ^ Sadly it would be exactly like what he said :( They would probably add that I'd void my warranty by doing so. Nothing stops me from asking though :)
     
  49. u6b36ef

    u6b36ef Notebook Consultant

    Reputations:
    48
    Messages:
    164
    Likes Received:
    1
    Trophy Points:
    31
    @ Triturbo,

    I wrote in one of my previous messages, "rep power up one to Triturbo". I did press the rep button for you, but your 'rep' stayed on 7. I tried to do it again, but got a message that wouldn't allow it, something like: "You have already increased the rep of Triturbo. You need to spread some rep around elsewhere before you can do it again for Triturbo."

    I thought more about your situation regarding your HD4670. I have a suggestion which might cost you performance, but may help to some degree. If you want to 'try' to return your laptop to charging the battery while gaming, you could try this; I don't know if it will be sufficient.

    I have read that the HD4650 and HD4670 are identical cards from a reference standpoint (meaning they are physically the same). The difference between them is the higher clocks and voltage of the 4670. If this is correct, then this idea might be of use to you, though you have most likely already thought of it.

    This idea is partly based on the information you already provided about 4650/4670 voltages, plus I assume you have overclocked your card (HD4670).

    Having undervolted your card, and assuming you have overclocked your 4670, why not reduce both these values again? (This would be where you would lose performance.)

    As you said, the 4670's standard voltage is 1.2V, but you are running at 1.1V, which is the standard for the 4650, OK.
    Why not try stepping down your clock speeds, e.g. to stock clocks for the 4670?
    Then reduce the voltage to 1.0V.

    That would make your card perform like a typical 'OC enthusiast' graphics card for a HD4650: undervolted and overclocked. Yet you would still have a HD4670 at standard timings.

    I believe people will already be doing this with their HD4650s, OC'ing and UV'ing. You might find you don't need to clock all the way down to 4670 timings. You still have the GDDR3, more than many do; lucky you, eh!

    Your suggestion of increasing the PSU may have some merit; I emphasise I only use the word 'may'. As you said, you seem to recall Dell doing it. Based on the fact that components have tolerance ranges, a few more watts of PSU 'might' be OK. Tolerance ranges mean components will work at higher and lower voltages (and currents) than the standard they expect. If you were to increase your PSU to 95W, like Meaker stated having done, it might be OK. Going from 90W to 95W is an increase of about 5.6%. This will probably fall within the tolerance range of the components. While it may not destroy the components immediately, over time they may wear more, and they will operate warmer.
    However, it is a proceed-at-your-own-risk case.
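
    The headroom arithmetic above can be sketched like this. The 90W/95W adapter figures and the 25W/35W card figures come from earlier in the thread; the 40W "rest of system" estimate is purely an assumption for illustration:

```python
# Back-of-envelope PSU headroom check. The 40W "rest of system"
# figure is an assumed estimate; the adapter and card wattages
# are the ones discussed in the thread.

def headroom(adapter_w, card_w, rest_of_system_w=40):
    """Watts left over after the card and the rest of the system."""
    return adapter_w - (card_w + rest_of_system_w)

stock, upgrade = 90, 95
increase_pct = (upgrade - stock) / stock * 100
print(f"90W -> 95W is a {increase_pct:.1f}% increase")  # ~5.6%

for card in (25, 35):
    print(f"{card}W card on a {upgrade}W brick: "
          f"{headroom(upgrade, card)}W spare")
```

    Under these assumed figures the bigger brick still leaves positive headroom with the 35W card, but as noted above, whether the laptop's own power circuitry tolerates the extra draw is the real question.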
     
  50. u6b36ef

    u6b36ef Notebook Consultant

    Reputations:
    48
    Messages:
    164
    Likes Received:
    1
    Trophy Points:
    31
    @ Triturbo,
    @ 5920 users with the T9300, or T9500 installed,

    Hi I need info that I hope someone can share.

    I very recently purchased a T9300. It looks absolutely right: the CPU markings are T9300, with all the specs listed, and SLAYY is there in the upper markings.

    However when installed, CPU-Z, Core Temp, SisoftSandra, and Intel Processor ID Utility, all report it is an Engineering Sample CPU.

    Does anyone know for sure why? I have wondered if the BIOS has no 'name' tags for CPUs over 2.4GHz, because Acer Aspire 5920s are standard up to 2.4GHz. I'm guessing that while the BIOS can report and utilise the spec of the CPU, no entry exists for its name.

    I know not nearly enough about the internal workings of BIOS to be able to draw any conclusion.

    Please, I am wondering if other 5920 users are getting the same result?
     
 Next page →