So I've had an older DV6t-SE (dv6t-3000) for a bit more than a year, and gaming performance has always been bad. Games like The Witcher 2 were actually unplayable: on top of single-digit framerates, there were strange graphical artifacts (like a large, billowing, mud-textured polygon stretched across half the screen so you can't see what's going on).
I recently tried both Dead Island and the new Deus Ex on this machine, with similar results: Dead Island had the screen-covering polygon artifact and Deus Ex was just slow.
On a hunch, I decided to run those same games after switching to the integrated Intel graphics chip. And the result: the artifacts were gone from Dead Island, and Deus Ex was also faster, if still unplayable!
This is quite strange, right? I know the graphics switcher is doing something because the performance changes, but I can't imagine why the ATI chip would be so noticeably worse than the integrated Intel.
I will investigate further and report back, but I would appreciate any thoughts you might have in the meantime.
EDIT: 2630 3DMarks (3DMark05) on the ATI ... hmm. About the same score on the Intel, which 3DMark reports as the ATI anyway:
Graphics Card: ATI Mobility Radeon HD 5650/5750
Vendor: Hewlett Packard
# of cards: 1
SLI / CrossFire: Off
Memory: 0 MB
Core clock: 0 MHz
Memory clock: 0 MHz
Driver name: ATI Mobility Radeon HD 5650
Driver version: 8.771.1.0
Driver status: Not FM Approved
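The zeroed memory and clock readings above make me suspect the detection only ever sees one device. For anyone who wants to check what Windows itself enumerates, here's a minimal diagnostic sketch I'd try (assuming a Visual C++ toolchain; this isn't anything from 3DMark) that lists every adapter DXGI can see, along with its reported VRAM:

#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = NULL;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;
    IDXGIAdapter* adapter = NULL;
    // Walk every adapter the OS exposes; on a working switchable setup
    // both the Intel and the ATI chips should show up here.
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        wprintf(L"Adapter %u: %ls (%lu MB dedicated VRAM)\n", i, desc.Description,
                (unsigned long)(desc.DedicatedVideoMemory / (1024 * 1024)));
        adapter->Release();
    }
    factory->Release();
    return 0;
}

If that only ever prints the ATI entry no matter which chip is active, it would explain both the mislabeled driver name and the 0 MB / 0 MHz readings.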
Edit2: I might also note that the Windows Experience Index score for Graphics (the Aero desktop score) has always been 3.2, while Gaming graphics is 5.5.
Edit3: The simple expedient of disabling the Intel chip in Device Manager works as a workaround, but it isn't exactly ideal.
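If anyone else resorts to that, the Device Manager toggle can also be scripted, which makes flipping back and forth less tedious. A rough sketch using the standard SetupAPI property-change call (matching the adapter by the substring "Intel" is my assumption about its description string; run it elevated, and DICS_ENABLE reverses it):

#include <windows.h>
#include <initguid.h>
#include <devguid.h>
#include <setupapi.h>
#include <cstdio>
#include <cwchar>
#pragma comment(lib, "setupapi.lib")

int main() {
    // All display adapters currently present in the system
    HDEVINFO devs = SetupDiGetClassDevs(&GUID_DEVCLASS_DISPLAY, NULL, NULL, DIGCF_PRESENT);
    if (devs == INVALID_HANDLE_VALUE) return 1;

    SP_DEVINFO_DATA info;
    info.cbSize = sizeof(SP_DEVINFO_DATA);
    for (DWORD i = 0; SetupDiEnumDeviceInfo(devs, i, &info); ++i) {
        WCHAR desc[256] = {0};
        SetupDiGetDeviceRegistryPropertyW(devs, &info, SPDRP_DEVICEDESC,
                                          NULL, (PBYTE)desc, sizeof(desc), NULL);
        if (!wcsstr(desc, L"Intel")) continue;  // my guess at the integrated chip's name

        SP_PROPCHANGE_PARAMS pc;
        ZeroMemory(&pc, sizeof(pc));
        pc.ClassInstallHeader.cbSize = sizeof(SP_CLASSINSTALL_HEADER);
        pc.ClassInstallHeader.InstallFunction = DIF_PROPERTYCHANGE;
        pc.StateChange = DICS_DISABLE;   // DICS_ENABLE to turn it back on
        pc.Scope = DICS_FLAG_GLOBAL;
        SetupDiSetClassInstallParams(devs, &info, &pc.ClassInstallHeader, sizeof(pc));
        SetupDiCallClassInstaller(DIF_PROPERTYCHANGE, devs, &info);
        wprintf(L"Disabled: %ls\n", desc);
    }
    SetupDiDestroyDeviceInfoList(devs);
    return 0;
}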