The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    New Intel GMA HD vs ATI Radeon 3470

    Discussion in 'Lenovo' started by hatboy, Feb 3, 2010.

  1. hatboy

    hatboy Notebook Enthusiast

    Reputations:
    0
    Messages:
    43
    Likes Received:
    0
    Trophy Points:
    15
    I was thinking about getting the T400 with the Radeon 3470, since it's selling cheap now, and I sometimes play games (mostly soccer games like PES or FIFA) when I have free time. The T410 has some nice features that I like, but with a discrete graphics card it's kind of out of my budget. So I'm curious how the new Intel graphics hold up against the Radeon 3470? Thanks!
     
  2. jmw03j

    jmw03j Notebook Enthusiast

    Reputations:
    1
    Messages:
    42
    Likes Received:
    0
    Trophy Points:
    15
    I believe the two would be roughly identical, with the 3470 winning narrowly in most tests. This isn't saying much, though, as neither is that quick. Just as a reference, the 3DMark06 score on the Radeon 3470 is about 1800, and the Intel IGP is barely below at 1750 or so.

    The new GeForce NVS 3100M in the T410 gets about 3500, as a reference.
     
  3. ghost_recon88

    ghost_recon88 Notebook Geek

    Reputations:
    0
    Messages:
    96
    Likes Received:
    2
    Trophy Points:
    16
    Which new Intel graphics chip are you referring to, the X4500HD?
     
  4. MidnightSun

    MidnightSun Emodicon

    Reputations:
    6,668
    Messages:
    8,224
    Likes Received:
    231
    Trophy Points:
    231
    I believe the OP is talking about the integrated GPU in the new Core i_ CPUs from Intel. I've seen benchmarks that show the 3470 ahead by a small margin. Both will play your games just fine, though.
     
  5. lead_org

    lead_org Purveyor of Truth

    Reputations:
    1,571
    Messages:
    8,107
    Likes Received:
    126
    Trophy Points:
    231
    the intel GPU gets better every generation....
     
  6. hatboy

    hatboy Notebook Enthusiast

    Reputations:
    0
    Messages:
    43
    Likes Received:
    0
    Trophy Points:
    15
    Yep, I was talking about the new Intel graphics in the new Core i CPUs. Thanks for the feedback.
     
  7. khtse

    khtse Notebook Consultant

    Reputations:
    133
    Messages:
    203
    Likes Received:
    0
    Trophy Points:
    30
    Yet they still suck...

    I don't play games on my ThinkPad, so they're barely OK for me. But from time to time I can't help but sigh at the CPU usage when playing video, as well as the poor video playback quality (usually more pixelated), compared to my desktop computer with a discrete graphics card.

    One thing I like about Apple is their use of Nvidia graphics chips across the whole MacBook line, including the thin-and-light MacBook Air. (Don't get me wrong, I still prefer ThinkPads to MacBooks.)
     
  8. jaredy

    jaredy Notebook Virtuoso

    Reputations:
    793
    Messages:
    2,876
    Likes Received:
    1
    Trophy Points:
    56
    I don't think you understand how discrete graphics help. There can be hardware acceleration for video, but that can be separate from 3D graphics performance.

    The last-generation Intel GPU has hardware acceleration for certain types of video and performs pretty darn well there. CoreAVC (an efficient software decoder) is also pretty good on CPU utilization.

    You don't need discrete graphics if you don't play games or use 3D-intensive workstation applications.

    The 'integrated' Nvidia graphics on the MacBooks aren't exactly high-end performers either.

    The new generation is just even better!

    Also remember that not all video is easily hardware accelerated. WMV is terrible... Flash will eventually be hardware accelerated, and that will become more common. Either that or we'll end up with different methods of rendering Flash-type media.
     
  9. lead_org

    lead_org Purveyor of Truth

    Reputations:
    1,571
    Messages:
    8,107
    Likes Received:
    126
    Trophy Points:
    231
    Starting with the Intel X4500 GPU, Intel has implemented full native decoding of Blu-ray movies, so it wouldn't use many CPU resources when you play most videos. So the question would be: which Intel GPU are you referring to?