Hi notebookreview,
I have been lurking here and in the ASUS/Lenovo subforums for a while now, trying to find a decently powerful laptop at a more-than-reasonable price. Recently I found this Acer laptop deal:
Acer AS5742G-7220
AS5742G-7220 I5-460M 2.53G 4GB 500GB DVDRW 15.6-WXGAG W7HP BROWN - DirectCanada
I thought it was fabulously low-priced for the specs it contains, so I snatched it up online yesterday.
I tried really hard to find reviews of this model, but only one short and not very helpful review came up; it doesn't seem to be a well-known one. Does anyone here by chance have this model (or something very similar) and could comment on its pros/cons/flaws/build quality?
Also, I can't seem to find info on whether its screen is LED-backlit; it should be, right?
And lastly, are laptops like this (i5 CPU and GT 300/400-series GPU) all equipped with NVIDIA Optimus technology? The product description doesn't mention anything about it.
Cheers.
-
Here is a similar model to the AS5742G on Newegg.com; read the reviews on it.
Newegg.com - Acer Aspire AS5742G-7200 NoteBook Intel Core i5 460M(2.53GHz) 15.6" 4GB Memory 500GB HDD 5400rpm DVD Super Multi NVIDIA GeForce GT 420M -
Just got the laptop, and it only has a GT 420M (exactly the same as the US version). What the fudge?
-
See my post over here - you can overclock the GT 420M to mimic a GT 425M since they use the same core design.
-
Is the GT 420M also different from the GT 435M (the much more powerful one) only in being clocked lower? Or are there other differences between the two? -
Believe it or not, the GT 435M is also based on the same core as the GT 420M and GT 425M - it's just clocked at 650 core/1300 shaders. So if your GT 420M is a really good overclocker (it does vary from chip to chip due to how these things get manufactured), it can hit GT 435M levels!
You have to reach the GT 445M before you actually gain more shaders and GDDR5 memory.
For example, my 9500M GS is actually clocked slightly higher than the 9650M GS based on the same GPU core; it's just slower because the 9650M GS got paired with GDDR3 video memory, while my 9500M GS is stuck with DDR2.
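If you want the napkin math, here's a quick Python sketch; the core clocks are from the spec sheets as I remember them (double-check me), and the 2:1 shader:core ratio is standard on these Fermi mobile parts:

```python
# Clock tiers for the Fermi mobile parts sharing the GT 420M core.
# Figures are from spec sheets as I remember them -- verify before
# relying on them. Shaders run at exactly twice the core clock.
STOCK_CORE = 500  # GT 420M stock core clock, MHz

tiers = {"GT 420M": 500, "GT 425M": 560, "GT 435M": 650}

for name, core in tiers.items():
    shader = core * 2
    gain = (core - STOCK_CORE) / STOCK_CORE * 100
    print(f"{name}: {core} MHz core / {shader} MHz shader "
          f"(+{gain:.0f}% over a stock GT 420M)")
```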
-
First I've heard of NVIDIA Inspector.
Here's what I use:
- HWMonitor or this Windows Gadget to keep an eye on temps (or roll your own; see the logging sketch below)
- Nvidia System Tools to overclock; it integrates nicely into the Nvidia Control Panel, and I've set it up so my GPU is only overclocked when gaming
- 3DMark06 is OK for checking improvement, but it's growing weaker and weaker as a comparative tool between systems. Use a game-based benchmark, or your own experience.
- OCCT or FurMark to test for stability; I cannot stress this one enough. These will stress your GPU to its fullest so you can see what your thermal ceiling is at the overclock, and whether it's even stable at the higher speeds. Remember, the reason your GPU was clocked lower is that Nvidia didn't find it stable enough (for them) at GT 425M or GT 435M speeds.
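For the temp-watching part, a dirt-simple logger does the job too. Rough sketch, assuming nvidia-smi is on your PATH and that your driver's copy supports the --query-gpu flag (not all versions do; run nvidia-smi --help to check):

```python
# Polls nvidia-smi once a second and prints a timestamped reading
# plus the running peak, so you can watch the curve during a
# FurMark/OCCT run. Stop it with Ctrl+C.
import subprocess
import time

def gpu_temp_c():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"])
    return int(out.decode().strip())

start = time.time()
peak = 0
while True:
    t = gpu_temp_c()
    peak = max(peak, t)
    print(f"{time.time() - start:6.0f}s  {t} C  (peak {peak} C)")
    time.sleep(1)
```
-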
+1
Thanks for the detailed explanation. I tried FurMark earlier but couldn't really figure out how to test stability effectively. Do I do something like... MSAA x4, Post-FX, Xtreme Burning mode on, run the "Stability Test", and leave it for 5 minutes to see what max GPU temperature it hits?
What would be a safe temperature limit for that if I gradually OC the core clock?
So far I've OC'd it from the stock 500 MHz to 650 MHz (GT 435M level), and the max temp with the FurMark stability test for 5 minutes at those settings reached 80 degrees and then started dropping.
As for an in-game benchmark, I might try the Crysis one. -
I personally use OCCT's test, which is similar to FurMark; set it for an hour at default settings and look for artifacts. 5 minutes isn't really representative of what you're going to be doing.
OK, so your core's at 650 MHz - did you increase the shader clock to 1300 as well? 80 Celsius is pretty damn good for full-bore temperature - just see how hot it gets at 15, 30, and 60 minutes.
Check out the attached graph for my own results - not sure what happened between 30 and 40 minutes, but my GPU maxed out at 82 because of it. [Attached: GPU temperature graph]
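If you want to make that kind of graph yourself, something like this works. Just a sketch: it assumes you've logged readings into a gpu_temps.csv of seconds,celsius pairs (my own made-up name and format) and have matplotlib installed:

```python
# Plot a GPU temp log (CSV of "seconds,celsius" lines, one per
# sample) to get a curve like the attached graph.
import csv
import matplotlib.pyplot as plt

minutes, temps = [], []
with open("gpu_temps.csv") as f:
    for secs, celsius in csv.reader(f):
        minutes.append(float(secs) / 60)  # x-axis in minutes
        temps.append(float(celsius))

plt.plot(minutes, temps)
plt.xlabel("Minutes into stress test")
plt.ylabel("GPU temperature (C)")
plt.show()
```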
-
It did in fact give a drastic boost to fps in Crysis @ high settings. Before OCing it felt like 15 fps mingled with dips; after, it's very smooth (a constant 25+ fps at least), with no dips or stuttering at all.
I'm gonna try the OCCT test, maybe 30 min or 1h. Hoping for the best.
If Tmax stays at 80 or so, does that mean there's potential for a bit more OCing? Or is that really not worth it in terms of long-term detrimental effects on the GPU?
Edit: so far, 10+ minutes into the OCCT test (with shader complexity 7), the GPU temperature refuses to go over 69 C; it's been stuck at 69 C for the last third of the run.
Edit 2: I stopped the test at 25 minutes... it had been hovering between 69 and 70 C from 13 minutes in, for 10 minutes straight. I don't think this stress test works effectively at all (compared to FurMark). Even playing some Assassin's Creed on max settings with MSAA x4 yielded a Tmax of 73 C. -
Then don't use OCCT! Different strokes for different folks, I guess.
I'd still run FurMark for 30-60 mins to check for stability and heat.
If Tmax stays at 80, there's certainly thermal headroom for higher clocks, but again you'd be limited by system stability.
Overclocking does knock some time off your system's lifespan, but we're talking going from 15 years to 10-12. Are you really going to be keeping your 5742G for that long? -
What's strange is that after running FurMark at 1366x768, MSAA x8, Post-FX, Xtreme Burning (even more load than yesterday's short tests) for 40 minutes just now, the Tmax was 77 C, and most of the time the temperature sat around 75~76 C well into the test. I wonder why, especially since I overclocked another 25 MHz onto the core clock (from stock 500 MHz to 650 MHz yesterday, to 675 MHz today); it should have run hotter than 80 C...
So weird. -
You have to take into account room temperature when doing these things, as well as anything the CPU's doing in the background; remember, the CPU and GPU share a common copper heatpipe that transfers heat to the grille that the fan blows on in order to cool your system.
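One way to factor that out is to compare runs by their delta over room temperature instead of raw Tmax. Toy example in Python; the Tmax figures are the ones you reported, and the ambient readings are invented purely for illustration:

```python
# Compare stress runs by delta over ambient rather than raw Tmax.
# Tmax values are the reported ones; ambient temps are made up --
# substitute real thermometer readings from your room.
runs = [
    {"day": "yesterday", "ambient_c": 25, "tmax_c": 80},
    {"day": "today",     "ambient_c": 21, "tmax_c": 77},
]
for r in runs:
    delta = r["tmax_c"] - r["ambient_c"]
    print(f"{r['day']}: Tmax {r['tmax_c']} C, ambient {r['ambient_c']} C"
          f" -> {delta} C over ambient")
```

With numbers like these, the two runs end up within a degree of each other once ambient is subtracted out, which would explain a "cooler" result at a higher clock.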
-
So I thought the game was just buggy; fine, I tried Assassin's Creed, and this time the Tmax stayed below 72 C since the game is less demanding than Crysis. However, 10 minutes in, bam: a whitescreen freeze followed by a reboot.
So I downclocked it to 625 MHz and cautiously tried Crysis again (I was worried the crashes might have permanently screwed the chip); it ran fine... around the same fps as the brief 700 MHz run, too. There are a few sound stutters here and there, but that's an inherent Crysis glitch and not related to the GPU being overloaded.
I wonder if (since I tested it extensively with games yesterday, and the GPU ran much hotter than today, too) I can still bump it back up to 650 MHz and keep the OC at that level? Is it likely for a GPU to lose some OC headroom after it's crashed a couple of times from excessive OCing? -
Can't say I've heard of that being the case. Worst case scenario, you give it a shot and it doesn't work. Big deal, leave it at 625!
-
Although, the perceivable performance gain from that extra 25 MHz was hard to notice anyway.
Still, I can't help but wonder: usually when you overclock the GPU and it crashes, isn't the video driver the only thing that restarts? I find it strange that the entire computer dies and restarts each time. Maybe there's another factor? -
Sometimes the computer freezes, sometimes just the screen freezes while the rest keeps working in the background, and sometimes the whole machine just reboots. It seems to depend mostly on the card, but also on the game to some degree.
-
Meaker@Sager Company Representative
There are a few reasons a chip ends up sold at the lower clocks:
1. The chip was not stable enough to hit the higher clock speeds.
2. They have too many high-performance chips and set some of them lower anyway.
3. The chip is too leaky to meet the thermal limits at the clock speeds of the higher model. -
Unless people start reporting that GT 425M and/or GT 435M chips also can't be OC'd past 700 MHz without crashing in games; in that case I'd be happy, because it would mean no binning occurred.
Edit: A topic wholly unrelated to the GPU (but still 5742G-related) is that I can't figure out whether my system has Optimus implemented.
Is i5 + GT 420M Optimus-enabled by default? The cool Dell XPS 14 kids have it. I googled around, and in some tech news from a few months back (when this line of new Acer laptops was announced) Acer claimed the 5742 line would have Optimus, but nowhere else can I find any info about it.
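One thing I'm going to try: an Optimus machine shows up with both the Intel IGP and the NVIDIA GPU as display adapters, so just listing the adapters should tell me. Rough Python sketch, assuming PowerShell's Get-WmiObject is available (it is on Windows 7):

```python
# List display adapters via WMI. On an Optimus laptop both the Intel
# IGP and the NVIDIA GPU should appear; on a discrete-only design,
# just the NVIDIA one.
import subprocess

out = subprocess.check_output(
    ["powershell", "-Command",
     "Get-WmiObject Win32_VideoController | "
     "Select-Object -ExpandProperty Name"])
adapters = [ln.strip() for ln in out.decode(errors="ignore").splitlines()
            if ln.strip()]

print("Display adapters:", adapters)
if any("Intel" in a for a in adapters) and any("NVIDIA" in a for a in adapters):
    print("Both Intel and NVIDIA adapters present -> Optimus is likely")
else:
    print("Only one adapter visible -> probably no Optimus switching")
```
-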
Still haven't found an efficient way to test the limits of OCing this video card; bumping it up and testing in Crysis is very slow and imprecise (many other factors can be involved). Is there any software that will push it as hard as, say, Crysis, while letting you know immediately when the GPU can't handle it (artifacts etc.)?
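In the meantime I might just bisect it by hand: keep a known-good clock and a known-bad clock, and test the midpoint each round. Little helper sketch to do the bookkeeping (the 500/750 MHz bounds are only my guesses for known-good/known-bad; you set the clock in your OC tool and run the stress test yourself):

```python
# Interactive bisection for the highest stable core clock. The
# script only does the bookkeeping; at each step YOU set the clock
# in your OC tool, run your stress test, and answer y/n.
low, high = 500, 750   # known-stable stock clock, assumed-unstable guess

while high - low > 5:  # stop once bracketed to within 5 MHz
    mid = (low + high) // 2
    ans = input(f"Set {mid} MHz, stress it, stable? [y/n] ")
    if ans.lower().startswith("y"):
        low = mid      # passed: push higher
    else:
        high = mid     # crashed or artifacted: back off

print(f"Highest stable core clock is around {low} MHz")
```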
Also, testing new signature. -
Meaker@Sager Company Representative
3DMark06 and Vantage are quite good, actually: Firefly Forest and Canyon Flight in 06, and the space scene in Vantage.