The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Any experience with AS5742G?

    Discussion in 'Acer' started by jerg, Nov 15, 2010.

  1. jerg

    jerg Have fun. Stay alive.

    Reputations:
    141
    Messages:
    1,239
    Likes Received:
    0
    Trophy Points:
    55
    Hi notebookreview,

    I have been lurking here and also in ASUS / Lenovo subforums for a while now trying to determine a decently powerful laptop with a more-than-reasonable price. Recently I found this Acer laptop deal:

    Acer AS5742G-7220
    AS5742G-7220 I5-460M 2.53G 4GB 500GB DVDRW 15.6-WXGAG W7HP BROWN - DirectCanada

    Which I thought was fabulously low-priced for the specs it contains, and so I snatched it yesterday online.

    I tried hard to find reviews of this model, but only one short and not very helpful review came up; it doesn't seem to be a well-known model. Does anyone here by chance have this model (or something very similar) and could comment on its pros/cons/flaws/build quality?

    Also, I can't seem to find any info on whether its screen is LED-backlit; it should be, right?

    And lastly, are laptops like this (i5 CPU and GT3/400 series GPU) all equipped with Nvidia Optimus technology? The product description doesn't mention anything about it.



    Cheers.
     
  2. aylafan

    aylafan TimelineX Elite

    Reputations:
    438
    Messages:
    1,247
    Likes Received:
    1
    Trophy Points:
    56
  3. jerg

    jerg Have fun. Stay alive.

    Reputations:
    141
    Messages:
    1,239
    Likes Received:
    0
    Trophy Points:
    55
  4. jerg

    jerg Have fun. Stay alive.

    Reputations:
    141
    Messages:
    1,239
    Likes Received:
    0
    Trophy Points:
    55
    Just got the laptop, and it only has a GT 420M equipped (exactly the same as the US version). What the fudge?
     
  5. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    See my post over here - you can overclock the GT 420M to mimic a GT 425M since they use the same core design.
     
  6. jerg

    jerg Have fun. Stay alive.

    Reputations:
    141
    Messages:
    1,239
    Likes Received:
    0
    Trophy Points:
    55
    Haha, yep. I actually reported this on the seller's website forums, and quickly got the same reply: the two are basically the same, except one's core clock is cranked a little higher.

    Is the GT 420M also different from the GT 435M (the much more powerful one) only in being underclocked, or are there other differences between the two?
     
  7. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    Believe it or not, the GT 435M is also based on the same core as the GT 420M and GT 425M - it's just clocked at 650 MHz core / 1300 MHz shaders. So if your GT 420M is a really good overclocker (it does vary from chip to chip due to how these things get manufactured), it can hit GT 435M levels!
    You have to reach the GT 445M before you actually gain more shaders and GDDR5 memory.

    For example, my 9500M GS is actually clocked slightly higher than the 9650M GS based on the same GPU core; it's just slower because the 9650M GS got paired with GDDR3 video memory, while my 9500M GS is stuck with DDR2.
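A quick sketch of the clock relationship described above: these chips share one core design, and the shader domain runs at twice the graphics core clock (650 core / 1300 shaders on the GT 435M), so raising the core clock raises the shaders in lockstep. The 500 MHz stock figure is the one reported later in this thread; treat both numbers as the posters quoted them.

```python
def shader_clock(core_mhz):
    """Shader domain runs at a fixed 2:1 ratio to the core domain on these chips."""
    return core_mhz * 2

# Clock figures as quoted in this thread:
for label, core in [("GT 420M stock", 500), ("GT 435M stock", 650)]:
    print(f"{label}: {core} MHz core -> {shader_clock(core)} MHz shaders")
```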
     
  8. jerg

    jerg Have fun. Stay alive.

    Reputations:
    141
    Messages:
    1,239
    Likes Received:
    0
    Trophy Points:
    55
    How might I OC it easily and safely? I've read around, and perhaps a combination of HWMonitor (to check GPU temperature peaks), Nvidia Inspector (OC software), and 3DMark06 (to check the effective improvement) would suffice?
     
  9. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    First I've heard of nVidiaInspector.
    Here's what I use:
    - HWMonitor or this Windows Gadget to keep an eye on temps
    - Nvidia System Tools to overclock; it integrates nicely into the Nvidia Control Panel, and I've set it up so my GPU is only overclocked when gaming
    - 3DMark06 is OK to check improvement, but it's growing weaker and weaker as a comparative tool between systems. Use a game-based benchmark, or your own experience.
    - OCCT or FurMark to test for stability; I cannot stress this one enough. These will stress your GPU to its fullest to see what your thermal ceiling is at the overclock, and to see if it's even stable at your higher speeds. Remember, the reason your GPU was clocked lower is that Nvidia didn't find it stable enough (for them) at GT 425M or GT 435M speeds.
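If you'd rather log temperatures from a script while one of the stress tools above runs, a minimal polling sketch is below. It assumes an NVIDIA driver that ships the `nvidia-smi` command-line tool (an assumption about your install, not one of the tools named in this thread):

```python
import subprocess
import time

def gpu_temp_c():
    """Read the current GPU temperature (Celsius) via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"], text=True)
    return int(out.strip().splitlines()[0])

def peak_of(samples):
    """Peak value over a series of temperature readings."""
    return max(samples)

def sample_temps(duration_s=300, interval_s=5):
    """Yield one reading every interval_s seconds for duration_s seconds."""
    end = time.time() + duration_s
    while time.time() < end:
        yield gpu_temp_c()
        time.sleep(interval_s)

# e.g. peak_of(sample_temps(duration_s=1800)) during a 30-minute FurMark run
```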
     
  10. jerg

    jerg Have fun. Stay alive.

    Reputations:
    141
    Messages:
    1,239
    Likes Received:
    0
    Trophy Points:
    55

    +1


    Thanks for the detailed explanation. I tried FurMark earlier but couldn't really figure out how to test stability effectively. Do I do something like MSAA x4, post FX, Xtreme Burning mode on, run the "Stability Test", and leave it for 5 minutes to see what max GPU temperature it reaches?

    What would be a safe temperature limit if I gradually OC the core clock?

    So far I've OC'd it from the stock 500 MHz to 650 MHz (GT 435M level), and the max temp in a 5-minute FurMark stability test at those settings reached 80 degrees before starting to drop.



    As for ingame benchmark, I might try the Crysis one.
     
  11. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    I personally use OCCT's test, which is similar to FurMark; set it for an hour at default settings and look for artifacts. 5 minutes isn't really representative of what you're going to be doing.

    OK, so your core's at 650 MHz - did you increase the shader clock to 1300 as well? 80 Celsius is pretty damn good for full-bore temperature - just see how hot it gets at 15, 30, and 60 minutes.
    Check out the attached graph for my own results - not sure what happened between 30 and 40 minutes, but my GPU maxed out at 82 because of it.
     

    Attached Files:

  12. jerg

    jerg Have fun. Stay alive.

    Reputations:
    141
    Messages:
    1,239
    Likes Received:
    0
    Trophy Points:
    55
    Yep, in fact the tweakable part is the shader clock, and the core clock gets adjusted proportionally as a consequence (in Nvidia Inspector, I mean).

    It did in fact give a very drastic boost to fps in Crysis at high settings. Before OCing it felt like 15 fps riddled with dips; after, it's very smooth (at least a constant 25+ fps), with no dips or stuttering at all.

    I'm gonna try the OCCT test, maybe 30 min or 1 h. Hoping for the best.

    If Tmax stays at 80 or so, does that mean there's potential for a bit more OCing? Or is that really not worth it in terms of long-term detrimental effects on the GPU?





    edit: So far 10+ minutes into the OCCT test (with shader complexity 7), and the GPU temperature refuses to go over 69 C. It's been stuck at 69 C for the last third of the run.



    edit 2: I stopped the test at 25 minutes; it had hovered between 69 and 70 C from 13 minutes in, 10 minutes straight. I don't think this stress test is loading the GPU effectively at all (compared to FurMark). Even playing some Assassin's Creed on max settings with MSAA x4 yielded a Tmax of 73 C.
     
  13. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    Then don't use OCCT! Different strokes for different folks, I guess.
    I'd still run FurMark for 30-60 mins to check for stability and heat.

    If Tmax stays at 80, there's certainly thermal headroom for higher clocks, but again you'd be limited by system stability.
    Overclocking does knock some time off your system's lifespan, but we're talking going from 15 years to 10-12. Are you really going to be keeping your 5742G for that long? ;)
     
  14. jerg

    jerg Have fun. Stay alive.

    Reputations:
    141
    Messages:
    1,239
    Likes Received:
    0
    Trophy Points:
    55
    Oh, I didn't mean any offense toward OCCT; it's a great program (much less laggy than FurMark). I was just expecting results from it, and I forcibly kept myself away from this new computer for the whole run, and it ended up not being all that useful.

    What's strange is that after running FurMark at 1366x768, MSAA x8, post FX, Xtreme Burning (an even heavier load than yesterday's short tests) for 40 minutes just now, the Tmax was 77 C, and for most of the test the temperature sat around 75-76 C. I wonder why, especially because I actually overclocked the core another 25 MHz (from the stock 500 MHz to 650 MHz yesterday, to 675 MHz today); it should have run hotter than 80 C...

    So weird.
     
  15. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    You have to take into account room temperature when doing these things, as well as anything the CPU's doing in the background; remember, the CPU and GPU share a common copper heatpipe that transfers heat to the grille that the fan blows on in order to cool your system.
     
  16. jerg

    jerg Have fun. Stay alive.

    Reputations:
    141
    Messages:
    1,239
    Likes Received:
    0
    Trophy Points:
    55
    I just found out the hard way that OCing is not just about temperature... At a 700 MHz core clock playing Crysis, I checked the temperature now and then and it stayed below 74 C; however, 5 minutes in, the entire computer crashed and restarted.

    So I thought the game was just buggy; fine, I tried Assassin's Creed, and this time the Tmax stayed below 72 C because the game is less demanding than Crysis. However, 10 minutes in, bam: a whitescreen freeze followed by a reboot.

    So I downclocked it to 625 MHz and cautiously tried Crysis again (I was worried the crashes might have permanently damaged the chip); it ran fine, at around the same fps as the brief 700 MHz session, too. There are a few sound stutters here and there, but that's an inherent Crysis glitch, not the GPU being overloaded.

    I wonder if I can still bump it back up to 650 MHz and keep the OC at that level (I did test it extensively at that clock yesterday with games, and the GPU ran much hotter then than today). Is it likely for a GPU to not OC as well after it's crashed a couple of times from excessive OCing?
     
  17. TehSuigi

    TehSuigi Notebook Virtuoso

    Reputations:
    931
    Messages:
    3,882
    Likes Received:
    2
    Trophy Points:
    105
    Can't say I've heard of that being the case. Worst case scenario, you give it a shot and it doesn't work. Big deal, leave it at 625!
     
  18. jerg

    jerg Have fun. Stay alive.

    Reputations:
    141
    Messages:
    1,239
    Likes Received:
    0
    Trophy Points:
    55
    Haha, I manned up, cranked it to 650 again, and played Crysis with (phew) no problem.

    Although the performance gain from that extra 25 MHz was hard to perceive.



    Still, I can't help but wonder: usually when you overclock a GPU and it crashes, isn't the video card driver the only thing that restarts? I find it strange that the entire computer goes down and restarts each time. Maybe there's another factor?
     
  19. downloads

    downloads No, Dee Dee, no! Super Moderator

    Reputations:
    7,729
    Messages:
    8,722
    Likes Received:
    2,231
    Trophy Points:
    331
    Sometimes the computer freezes, sometimes just the screen freezes while the rest seems to keep working in the background, and sometimes the whole machine just reboots. It seems to depend mostly on the card, but also on the game to some degree.
     
  20. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    Actually there are several reasons it could be:

    1. The chip was not stable enough to hit the clock speeds.
    2. They have too many high performance chips and set some of them lower anyway.
    3. The chip is too leaky to meet the thermal limits at the clock speeds of the higher model.
     
  21. jerg

    jerg Have fun. Stay alive.

    Reputations:
    141
    Messages:
    1,239
    Likes Received:
    0
    Trophy Points:
    55
    I can confirm (3) is not the case: I OC'd the chip to GT 435M level earlier and ran extensive stress tests / in-game tests, and the temperature was always quite low (Tmax below 75 C most of the time). However, I'm pretty sure it's binned, because as I described in a previous post, clocking it 200 MHz over its stock level (or 50 MHz over the GT 435M stock level, since they share the same core) proves extremely unstable in-game, despite the temperature being very manageable. So it's (1).

    Unless people start reporting that GT 425M and/or GT 435M chips can't be OC'd past 700 MHz without crashing in games, in which case I'd be happy because no binning occurred :D.





    Edit: A wholly separate topic from the GPU (but still about the 5742G): I can't figure out if my system has Optimus implemented.
    Is i5 + GT 420M Optimus-enabled by default? The cool Dell XPS 14 kids have it. I googled around, and in some tech news from a few months back, when this line of new Acer laptops was announced, Acer claimed the 5742 line would have Optimus, but I can't find any info about it anywhere else.
     
  22. jerg

    jerg Have fun. Stay alive.

    Reputations:
    141
    Messages:
    1,239
    Likes Received:
    0
    Trophy Points:
    55
    Still haven't found an efficient way to test the limits when OCing this video card; tweaking it up and testing it in Crysis is very slow and imprecise (many other factors can be involved). Is there any software that will push it as hard as, say, Crysis, while letting you know immediately when the GPU can't handle it (artifacts, etc.)?

    Also, testing my new signature.
     
  23. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,431
    Messages:
    58,194
    Likes Received:
    17,901
    Trophy Points:
    931
    3DMark06 and Vantage are quite good, actually: Firefly Forest / Canyon Flight in 06 and the space scene in Vantage.
     
  24. jerg

    jerg Have fun. Stay alive.

    Reputations:
    141
    Messages:
    1,239
    Likes Received:
    0
    Trophy Points:
    55
    At least in my case, 3DMark06 proved unable to push my GPU to its limit. I overclocked +200 MHz over the stock core clock yesterday; 3DMark06 ran fine (with very good performance), but at that clock the GPU couldn't handle Crysis and crashed repeatedly. Now I'm forced to return to a +150 MHz overclock for stability.
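The trial-and-error described through this thread (bump the clock one step, run a game, back off on a crash) amounts to a search for the highest stable clock. A minimal sketch, where `stress_test` is a hypothetical stand-in for whatever benchmark you trust; running it for real would mean re-clocking and launching the benchmark at each step:

```python
def find_max_stable(stock_mhz, step_mhz, limit_mhz, stress_test):
    """Raise the clock one step at a time until the stress test fails,
    then settle on the last step that passed."""
    clock = stock_mhz
    while clock + step_mhz <= limit_mhz and stress_test(clock + step_mhz):
        clock += step_mhz
    return clock

# With this thread's numbers (stable in games at 650 MHz, crashes at 700),
# modeled here by a toy predicate in place of a real benchmark run:
stable = find_max_stable(500, 25, 700, lambda mhz: mhz <= 650)
print(stable)  # -> 650
```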