The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Haswell gt3e to crush Nvidia and Amd low end market gpus ?

    Discussion in 'Gaming (Software and Graphics Cards)' started by fantabulicius, Apr 11, 2013.

  1. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Yep.

    I'm just not going to comment on any of these things any more. I'll just stick to reporting the facts and let people figure it all out for themselves. Stating opinions and experiences only seems to infuriate people for some reason.
     
  2. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Can't imagine why they would infuriate people if it's an experience.
     
  3. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    It does, simply does. Right now I'm sending my Korean lesbian ninjas, they'll sing you K-pop till you are dead
     
  4. NBRUser0159099

    NBRUser0159099 Notebook Deity

    Reputations:
    184
    Messages:
    1,585
    Likes Received:
    0
    Trophy Points:
    55
  5. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    I saw that one, those numbers ain't correct
     
  6. Fat Dragon

    Fat Dragon Just this guy, you know?

    Reputations:
    1,736
    Messages:
    2,110
    Likes Received:
    305
    Trophy Points:
    101
    Guy's writing a tech column for a finance magazine. Not really the most reliable source.
     
  7. Karamazovmm

    Karamazovmm Overthinking? Always!

    Reputations:
    2,365
    Messages:
    9,422
    Likes Received:
    200
    Trophy Points:
    231
    I think the SEC is going after him, he edited those tables alright
     
  8. NBRUser0159099

    NBRUser0159099 Notebook Deity

    Reputations:
    184
    Messages:
    1,585
    Likes Received:
    0
    Trophy Points:
    55
    Never mind, he updated the article. I was skeptical because he wrote with such confidence and the screenshot of Sleeping Dogs seemed legit.
     
  9. Fat Dragon

    Fat Dragon Just this guy, you know?

    Reputations:
    1,736
    Messages:
    2,110
    Likes Received:
    305
    Trophy Points:
    101
    LOL Forbes tech writer.
     
  10. wickette

    wickette Notebook Deity

    Reputations:
    241
    Messages:
    1,006
    Likes Received:
    495
    Trophy Points:
    101
    The Intel HD 4600 is not a tie-breaker, it's just an upgraded HD 4000. When I hit 35 FPS in League of Legends with an HD 4000, I can't think it'll replace AMD/Nvidia discrete solutions.

    However, the Iris 5200 will be able to play "small (technically speaking)" games flawlessly, like LoL, Age of Empires, Sins of a Solar Empire, (Civ V??), etc. More importantly, it will transform Windows tablets, throwing out that crappy "RT" label while keeping a sleek fanless design and unrivaled power :), can't wait to play all the previously listed games on a tablet :).

    The 5200 IS a game changer for Intel and I'm willing to sell my iPad for a REAL tablet this time. It's a shame it won't be out soon for the first generation of Haswell processors.
     
  11. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Why do people think it will be fanless? That sucker will crank out some serious heat. If you think any CPU with the GT3e is going to be in a thin and light any time soon you can forget about it.
     
  12. BangBangPlay

    BangBangPlay Notebook Consultant

    Reputations:
    199
    Messages:
    241
    Likes Received:
    0
    Trophy Points:
    30
    Just read the article, and there is a caveat stating that all the original results were inaccurate because his GTX 780 was indeed "kicking in" to help boost Intel graphics. Sometimes Optimus can be finicky and not behave exactly as desired. But anyway, I guess people have short memories. They seem to forget that many of Intel's promises have been exaggerated and disappointing in the past. Couple that with more demanding software (games) and higher resolutions, and Intel graphics will not gain much ground on dedicated GPUs. It may be able to play older games, but I doubt that we will be able to rely on it much for any new software.

    He even states this in the conclusion, in response to comments that 1080p is too high of a reference. I am sure Haswell will have its benefits, especially on mobile devices. But it isn't a leap in the evolution of hardware, more of a gradual and predictable step. I am sure his article got a few people to invest in Intel stock though...
     
  13. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    That was GT2 hardware, which is your typical iterative improvement from one generation to the next. Very predictable numbers and not very impressive. And 1080p, while commendable as a recommendation (and I agree with it), isn't realistic.
     
  14. Cakefish

    Cakefish ¯\_(ツ)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
  15. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    At this point I wouldn't worry. That is really just a dedicated GPU slapped on a CPU PCB. Power draw, heat, and cost are still considerable factors.
     
  16. Kallogan

    Kallogan Notebook Deity

    Reputations:
    596
    Messages:
    1,096
    Likes Received:
    19
    Trophy Points:
    56
    I don't have high hopes for GT3e anymore. According to reviews, the 4950HQ tested drew 90W at full load, on par with desktop CPU consumption. Granted, it was tested on a desktop board with an 80+ PSU, probably not as efficient as an external power brick, etc... Add a screen to that, and the overall power consumption even in a laptop should be pretty high. It's a good experimental chip, but overall it may not be that efficient compared to dGPU-based laptops. Not to mention the rocket price.

    That said, I'd love to see dual cores with that sucker on board to bring the package < 65W. But that won't happen soon, I guess.
     
  17. Rooter123

    Rooter123 Notebook Enthusiast

    Reputations:
    20
    Messages:
    21
    Likes Received:
    0
    Trophy Points:
    5
    I don't think that a dev system is valid for power consumption tests. Just like ES hardware, they often don't have power states enabled. To be honest, power tests based on a dev system are really pointless.
     
  18. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    Power states are for power-saving periods. The MAXIMUM power consumption has very little (if anything) to do with them.
     
  19. Kallogan

    Kallogan Notebook Deity

    Reputations:
    596
    Messages:
    1,096
    Likes Received:
    19
    Trophy Points:
    56
    Yeah, I know I'm extrapolating from biased data sets. Guess we'll have to wait for a proper GT3e notebook to be released to see if it is really an efficient chip that's worth the money, or just a premium experimental chip that gives a glimpse of what APUs may deliver tomorrow. But Intel's claims about GT3e being meant for big notebooks may be true.
     
  20. Atom Ant

    Atom Ant Hello, here I go again

    Reputations:
    1,340
    Messages:
    1,497
    Likes Received:
    272
    Trophy Points:
    101
    I'd like to see now how efficient the GT3e is against an i5 + Radeon 8750M!
     
  21. Rooter123

    Rooter123 Notebook Enthusiast

    Reputations:
    20
    Messages:
    21
    Likes Received:
    0
    Trophy Points:
    5

    Disable all power states and do a full load test; power consumption should be considerably higher. Nothing new, really. Power consumption tests based on pre-release engineering systems are not really meaningful. It's not only down to power states, but also the board's efficiency, power supply efficiency, and other things. There is no reason why a GT3e system should consume 10W more at idle.
     
  22. Rooter123

    Rooter123 Notebook Enthusiast

    Reputations:
    20
    Messages:
    21
    Likes Received:
    0
    Trophy Points:
    5
    delete..............
     