The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled together by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Kepler 28nm cards vs 40nm fermi cards

    Discussion in 'Gaming (Software and Graphics Cards)' started by nissangtr786, Jul 21, 2012.

  1. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    Does anyone else find this the biggest leap from one generation to the next? Fermi was designed inefficiently and Kepler was designed efficiently, plus it's 28nm versus Fermi's 40nm.

    I just looked at some results like the 650M vs the 560M, and there is about a 75W difference in power consumption on comparable basic i7 quad-core systems, yet the 650M wins just about every benchmark. The 680M consumes less power than the 560M. I think the notebook market needed this, because before now laptops were basically carrying power-hungry desktop GPUs. AMD did very well with their 28nm architecture too; it is basically neck and neck with Kepler, and I believe AMD wins on other tasks like compute performance.

    I want to get the 650M GDDR5 version, since it takes around 75W while gaming with an Ivy Bridge CPU. Overall it's an incredible leap in performance per watt in a single generation.

    I suppose going from an inefficient 40nm card to an efficient 28nm card is an amazing difference. If Nvidia's 22nm Maxwell turns out as expected, then gaming on ultrabooks will take the market by storm.

    Overall, 3D gaming on Nvidia Maxwell's basic GPUs seems possible in the future. I was amazed by the performance and power consumption of 28nm cards like the 650M. In two years' time we will look at the 670M and 675M the way the old fat PS3 compares to the new PS3 slim that is coming out.

    Overall it's incredible how big the difference is between 28nm Kepler and 40nm Fermi. All I'm saying is that people who buy Fermi cards like the 670M, 675M, 630M etc. are getting ripped off big time. At the end of the day, would anyone buy the old fat PS3 for more money than the new CECH-4000 PS3 slim just because the old one can play PS2 games? That's the kind of difference in performance per watt a 660M has over a 670M.
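
    To make the performance-per-watt point concrete, here is a rough back-of-the-envelope sketch. The wattage and score numbers below are placeholder assumptions chosen only to illustrate the math, not official NVIDIA specs:

    # Rough perf-per-watt comparison (Python); the TDP and benchmark
    # figures here are illustrative assumptions, not official specs.
    cards = {
        "GTX 670M (40nm Fermi)":  {"tdp_w": 75, "score": 100},  # assumed baseline
        "GTX 660M (28nm Kepler)": {"tdp_w": 50, "score": 95},   # assumed near-parity
    }

    for name, c in cards.items():
        print(f"{name}: {c['score'] / c['tdp_w']:.2f} points per watt")

    Even with roughly similar raw scores, the lower power draw gives the Kepler part a much better points-per-watt figure, which is the whole argument for skipping the Fermi-based parts.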
     
  2. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    I think a lot of this efficiency came from the decision to ditch GPGPU and concentrate on building the GPUs for games instead. :)
     
  3. nissangtr786

    nissangtr786 Notebook Deity

    Reputations:
    85
    Messages:
    865
    Likes Received:
    0
    Trophy Points:
    0
    Well, I knew Kepler ditched something, but we have Quadro cards for that stuff anyway, I believe.
     
  4. MegaBUD

    MegaBUD Notebook Evangelist

    Reputations:
    45
    Messages:
    670
    Likes Received:
    3
    Trophy Points:
    31
    Well... the 660M is cheaper... and you can push this baby over 1200MHz on the core... I would rather save $100 today and spend it on my next laptop in 2-3 years... The games that I play aren't very hard on the hardware (SC2, Skyrim, Minecraft, Diablo 3, Left 4 Dead and more)...

    You might wanna get a 650M GDDR5 instead of a 660M... it's basically the same card but cheaper...
     
  5. RainMan_

    RainMan_ Notebook Evangelist

    Reputations:
    180
    Messages:
    396
    Likes Received:
    11
    Trophy Points:
    31
    Fermi is old news now.

    I think NVIDIA did a great job, especially with the GT 650M. It's a great midrange card; it performs like the old high-end cards and even surpasses them.
    The GTX 680M is a shock as well, in a good way. I also noticed that Kepler cards tend to run cooler than Fermi.

    I wonder how NVIDIA Maxwell will perform.