The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Intel Core i5 is here

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by elijahRW, Sep 8, 2009.

  1. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    Please do not try to pick a fight or anything like that.
    Check the Forum Rules if you want more on this matter.
    Thanks!
     
  2. elijahRW

    elijahRW Notebook Deity

    Reputations:
    940
    Messages:
    1,797
    Likes Received:
    0
    Trophy Points:
    0
  3. MrX8503

    MrX8503 Notebook Evangelist

    Reputations:
    126
    Messages:
    650
    Likes Received:
    0
    Trophy Points:
    30
  4. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    I'm pretty sure all desktop CPUs are power hungry; that is their purpose, to be as fast as possible. But they are not hot and don't consume a lot of power all the time. Mine uses 25 watts when idling, which isn't much at all.

    It is an improvement TDP-wise, but the fact that it has no HT is kind of a bummer.
     
  5. iGrim

    iGrim Notebook Evangelist

    Reputations:
    47
    Messages:
    380
    Likes Received:
    0
    Trophy Points:
    0
    No, I'm going to say buy Westmere when it's released, due to normal power usage...
     
  6. Jayayess1190

    Jayayess1190 Waiting on Intel Cannonlake

    Reputations:
    4,009
    Messages:
    6,712
    Likes Received:
    54
    Trophy Points:
    216
    He is holding a wafer full of hundreds/thousands of 32nm processors.
     
  7. mattwoller

    mattwoller Notebook Enthusiast

    Reputations:
    22
    Messages:
    21
    Likes Received:
    0
    Trophy Points:
    5
    It isn't the number of cores that's the problem. Back when 1 core was all you had, and the 3.06 introduced this second "virtual" core, you'd think background processes would run on the virtual core while a game would run on the main core; after all, XP supported SMP. That wasn't the problem, though - a Hyper-Threaded core just isn't as good as an actual "real" core. So when you go from 4 to 8, it isn't that the application or game can't use the extra four cores; they just aren't as good as a true octo-core CPU.

    In theory there is no reason why having 8 cores would slow you down, in any program, versus having 4. The program you're using simply wouldn't use the extra cores, but it wouldn't flat-out run 10% slower just because it left them idle. Yet people have been reporting, for five years (since the 3.06GHz Pentium 4 "B" Northwood with HT came out), that disabling HT has actually increased performance.

    Man, I remember back in the day you could lose 40% of your framerate by enabling Hyper-Threading. Made my 3.06 WAY less cool. :(
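
    As a rough illustration of the real-vs-virtual core split (not specific to any CPU in this thread), here is a small sketch that reports logical vs. physical core counts; it assumes the third-party psutil package is installed.

        # Sketch: compare logical (HT) core count to physical core count.
        # Assumes the third-party 'psutil' package; os.cpu_count() alone
        # only reports logical processors.
        import os
        import psutil

        logical = os.cpu_count()                    # real + virtual (HT) cores
        physical = psutil.cpu_count(logical=False)  # real cores only

        print(f"Logical cores:  {logical}")
        print(f"Physical cores: {physical}")
        if logical and physical and logical > physical:
            print("Hyper-Threading appears to be enabled.")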
     
  8. mattwoller

    mattwoller Notebook Enthusiast

    Reputations:
    22
    Messages:
    21
    Likes Received:
    0
    Trophy Points:
    5
    Core i7 is clearly faster than Core i5, there's no mistaking that. This is why i7s cost 40%+ more than the i5s.

    The area of debate is whether or not the average user needs 8 cores (4 real, 4 virtual) over 4 in most tasks, including "high end gaming" and if those extra 4 virtual cores are worth the price premium.

    For 99.9% of all gamers, I fail to see how a Core i5 - running stock or overclocked - wouldn't be good enough. Benchmarks, both synthetic and real-world, show the Core i5 to be nearly identical in performance to the Core i7 (which makes sense, considering they are nearly identical to begin with). And quad core is the "way of the future", if you will. I play games often, and I can't find a reason to say "hey, I need eight cores" - especially given Hyper-Threading can slow some applications down when enabled.

    For the rest of us (or you) who need 8 cores, the multimedia folks I suppose, you can buy a Core i7. I'd be curious what exactly you are doing (particularly given this is primarily a notebook forum) that would make you "need" 8 CPU cores, but to each his own. Neither I (nor anybody else, I would hope) am telling you what you should or could do with your computer.

    But for the vast majority, there is no big reason not to be fine with a Core i5, unless you plan on keeping your system for five years and making no modifications to it - in which case you more than likely wouldn't need the Core i5 or be the type of consumer with whom this conversation would take place. :)

    To recap:

    Core i7 = faster than Core i5.

    A supercomputer is faster than my 2.26GHz Core 2 Duo-powered laptop.

    Most people will be fine with a Core i5, and for a variety of reasons it's simply the "better buy" or "better bang for the buck" if you will, for the same reasons people buy a $70,000 Z06 Vette over a $275,000 Ferrari.
     
  9. ajreynol

    ajreynol Notebook Virtuoso

    Reputations:
    941
    Messages:
    2,555
    Likes Received:
    0
    Trophy Points:
    55
    just so that I'm clear...these are desktop processors, right?
     
  10. ChinNoobonic

    ChinNoobonic Notebook Evangelist

    Reputations:
    273
    Messages:
    638
    Likes Received:
    1
    Trophy Points:
    30
    Yes, for now. Mobile versions are expected next year.
     
  11. MrX8503

    MrX8503 Notebook Evangelist

    Reputations:
    126
    Messages:
    650
    Likes Received:
    0
    Trophy Points:
    30
    What is normal usage?

    As far as I'm concerned, NOTHING touches the Core i5 in terms of performance/watt.

    There's a whole AnandTech article about it; it's pretty informative.
     
  12. iGrim

    iGrim Notebook Evangelist

    Reputations:
    47
    Messages:
    380
    Likes Received:
    0
    Trophy Points:
    0
    Normal CPU power usage is ~65 watts. Anything above that and you have yourself an electrical heater which requires a big heatsink, etc.
     
  13. MrX8503

    MrX8503 Notebook Evangelist

    Reputations:
    126
    Messages:
    650
    Likes Received:
    0
    Trophy Points:
    30
    ..............
     
  14. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    You can't compare P4's Hyperthreading with Nehalem's. Take a look at this one for example.

    http://www.solidmuse.com/2008/12/core-i7-to-hyperthread-or-not.html

    These particular quotes are interesting:
    It's not as good as an actual core, but it's not meant to be. The power consumption increase is nowhere near the performance increase, for one example, and die size doesn't increase with any significance either. Again, back in the P4 days most PC apps were single-threaded, and nowadays most PC apps are 4-threaded. Games are 4-threaded, which means HT will be a benefit on dual-core Arrandale.
     
  15. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    I think that is YOUR definition of "normal usage". As I said, my i7, which I actually own, sits as low as 25 watts and never goes over 70C when I'm not OCing.

    I have no idea why I waste my time.
     
  16. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    Normal usage depends on the user. I, for example, use my laptop ALL the time; that is working, NBR'ing, chatting, 3D modeling, Skype, or simple web browsing or word processing. If none of the above, I play some game, if I have the time.

    For me, an i7 wouldn't be a waste. But I will go with whatever suits my needs, even if this means a C2D only. Again, it depends.

    Regular users, who just browse the web and write stuff in Word for school, don't need an i7 or anything similar; just a CULV and they are good to go, and it is quite capable and fast enough. For example, a mere Word doc and some tabs in Firefox or IE8, plus the occasional iTunes, will NOT take advantage of a faster i7.

    On the other hand, someone who works with large databases, number crunching, virtual machines, large amounts of data, or tons of apps open will benefit from this.
     
  17. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    True, I do not notice a huge difference when doing normal things. Jumping from Core 2 Duo to i7 makes a huge difference though; W7 is totally optimized for multiple cores, so multi-tasking is much better than on Core 2.

    But my Core 2 Quad downstairs is about the same as this, honestly; not much slower at all.

    When I built it I wanted to be ahead for the future, because 4 cores (in addition to the 4 virtual ones) will be very useful when developers start programming for them more regularly.
     
  18. Mr. Wonderful

    Mr. Wonderful Notebook Evangelist

    Reputations:
    10
    Messages:
    449
    Likes Received:
    6
    Trophy Points:
    31
    Eh. Waiting for Sandy Bridge for my next desktop upgrade. Evolution of Core 2 architecture? Yes please.
     
  19. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    Hahahaha, if you keep waiting, you'll never get a laptop... lol
    As usual, buy when needed.
    Although Sandy sounds quite interesting, it is TOO far away in the future for me.
     
  20. xleonid

    xleonid Notebook Consultant

    Reputations:
    87
    Messages:
    169
    Likes Received:
    0
    Trophy Points:
    30
    @Vinyard
    Indeed wrong decision...
     
  21. davepermen

    davepermen Notebook Nobel Laureate

    Reputations:
    2,972
    Messages:
    7,788
    Likes Received:
    0
    Trophy Points:
    205
    there is a difference between what a cpu is actually using and what a cpu IS DESIGNED FOR.

    if it's written down to require 95W, and you get a 50W psu, it might work, but it might suddenly just turn off when it, at some moment, requires more power.

    and as desktops deliver power with ease, the specs can be much less tight than in a laptop. so yes, my quadcore doesn't eat much W while idling and surfing, but it's allowed to eat up to 105W (in my case, i think), when at full usage.

    and there's a difference from my laptop to my pc. the laptop is not allowed to burn that much power. when not on maximum usage, both obviously use much less power.

    and vinyard, cancel the order if you can..
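
    A minimal sketch of the point above about design limits: the supply has to cover what the CPU is allowed to draw (its TDP), not what it happens to draw at idle. The 95 W CPU and 50 W supply below are just the hypothetical numbers from this post.

        # Sketch: size a power budget against design limits (TDP), not idle draw.
        # Numbers are the hypothetical ones from the post above.
        def psu_is_sufficient(psu_watts, component_tdps, headroom=1.2):
            """True if the supply covers worst-case draw plus some margin."""
            worst_case = sum(component_tdps)
            return psu_watts >= worst_case * headroom

        cpu_tdp = 95    # what the CPU is *allowed* to draw
        psu = 50        # undersized supply from the example
        idle_draw = 25  # what it might actually draw while surfing

        print(psu_is_sufficient(psu, [cpu_tdp]))  # False: may cut out under load
        print(psu >= idle_draw)                   # True at idle, but misleading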
     
  22. iGrim

    iGrim Notebook Evangelist

    Reputations:
    47
    Messages:
    380
    Likes Received:
    0
    Trophy Points:
    0
    Incorrect by CPU standards. I don't care if you have 100 cores; we're still talking about ONE CPU in a single socket. Over CPU history, 95 watts is very high power usage for a CPU. Also, it is clear you don't take full advantage of the CPU. Browsing the web and checking emails does not exactly utilize four cores... :cool:
     
  23. davepermen

    davepermen Notebook Nobel Laureate

    Reputations:
    2,972
    Messages:
    7,788
    Likes Received:
    0
    Trophy Points:
    205
    btw, your 25W cpu has a TDP of 130W... so now you might notice that the i5 might, just might, consume 12-15W at the same task as yours ...

    this comparison is not exact, but it should give you an idea.. TDP is the limit of what it's allowed to require, nothing else.
     
  24. Kevin

    Kevin Egregious

    Reputations:
    3,289
    Messages:
    10,780
    Likes Received:
    1,781
    Trophy Points:
    581
    Please, just stop.
     
  25. elijahRW

    elijahRW Notebook Deity

    Reputations:
    940
    Messages:
    1,797
    Likes Received:
    0
    Trophy Points:
    0
    You took the words right out of my mouth :realmad:
     
  26. afhstingray

    afhstingray Notebook Prophet

    Reputations:
    351
    Messages:
    4,662
    Likes Received:
    2
    Trophy Points:
    105
    ditto. i smell a chronic troll.
     
  27. MrX8503

    MrX8503 Notebook Evangelist

    Reputations:
    126
    Messages:
    650
    Likes Received:
    0
    Trophy Points:
    30
    lol you should see his other thread.

    Anyway, it's all about performance/watt; that is how it's always been when rating CPUs.

    That's how Intel rates it, that's how AMD does it, and that's how every single tech site does it.

    It doesn't matter if you just browse the web or look at your computer screen while scratching your butt; if Intel produced a powerful chip within its prospective TDP that outperforms all others, then they were successful.
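
    To make the performance-per-watt metric concrete, here is a tiny sketch with made-up scores and wattages (illustrative only, not benchmark results): the chip that does more work within the same power budget wins on this metric.

        # Sketch: performance per watt with made-up numbers (not real benchmarks).
        chips = {
            # name: (benchmark_score, package_power_watts) -- illustrative only
            "Chip A": (100, 95),
            "Chip B": (130, 95),   # faster at the same TDP -> better perf/watt
            "Chip C": (150, 130),
        }

        for name, (score, watts) in chips.items():
            print(f"{name}: {score / watts:.2f} points per watt")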
     
  28. coldmack

    coldmack Notebook Virtuoso

    Reputations:
    92
    Messages:
    2,539
    Likes Received:
    0
    Trophy Points:
    55
    Out of curiosity, is there such a thing as a quad-core i3, and if so, how is the power usage on them?
     
  29. Hep!

    Hep! sees beauty in everything

    Reputations:
    1,806
    Messages:
    5,921
    Likes Received:
    1
    Trophy Points:
    206
    Nope. Have not heard anything about i3s yet. If ever, that is.
     
  30. coldmack

    coldmack Notebook Virtuoso

    Reputations:
    92
    Messages:
    2,539
    Likes Received:
    0
    Trophy Points:
    55
    Thank you. Also, out of curiosity: I know AMD has a triple-core CPU; will they offer the i5 or i7 in a triple core? When do you think we will know about i3s and other i5 and i7 CPUs?
     
  31. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    I'm not sure why you felt the need to explain that to me..

    It actually uses as much as 150 watts; the point I'm trying to prove is that it isn't hot, and it doesn't consume a lot of power all the time.

    Yeah, you're wrong. And looking into the FUTURE, which is where we're GOING, things will need to consume more power to be faster and have more cores. I'm not sure what my normal usage has to do with your brutal argument, if it could even be called that, whatever.
     
  32. sgogeta4

    sgogeta4 Notebook Nobel Laureate

    Reputations:
    2,389
    Messages:
    10,552
    Likes Received:
    7
    Trophy Points:
    456
    Actually, power consumption will go down in the future due to better (smaller) manufacturing processes. Adding cores and speed does increase power consumption, but overall the aim is more about improving efficiency and performance per watt.
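
    As a rough first-order illustration of why shrinks help: dynamic power scales roughly as P ≈ C·V²·f, so a process that allows lower capacitance and voltage cuts power even at the same clock. The numbers below are illustrative and not tied to any real process.

        # Sketch: first-order dynamic power model, P ~ C * V^2 * f.
        # All values are illustrative; real process scaling is more complicated.
        def dynamic_power(cap_farads, volts, freq_hz):
            return cap_farads * volts ** 2 * freq_hz

        old = dynamic_power(cap_farads=1e-9, volts=1.2, freq_hz=3.0e9)
        # Same clock, but a shrink allows lower capacitance and voltage:
        new = dynamic_power(cap_farads=0.8e-9, volts=1.0, freq_hz=3.0e9)

        print(f"relative power: {new / old:.2f}")  # ~0.56 of the original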
     
  33. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    I agree with sgogeta: the more we advance, the more power-efficient the hardware becomes. For example, the newer ATI HD 5000 series is said to perform 40% better while consuming 20% less (or was it 20% better and 40% less?? Can't really recall ATM). But the thing is that things get optimized to consume less.

    Right now those consume a lot since they are not as optimized as the newer ones are supposed to be.

    On a side note: don't start a fight over this, please. Regular usage depends on the user and not on the opinion of someone else. If you have something to say, PM the other person. Thanks.
     
  34. SoundOf1HandClapping

    SoundOf1HandClapping Was once a Forge

    Reputations:
    2,360
    Messages:
    5,594
    Likes Received:
    16
    Trophy Points:
    206
    Just wait until we violate the laws of thermodynamics and energy and start generating electricity while we game.

    "I just powered some guy's house while playing Crysis 12!
     
  35. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    According to Fudzilla, the X2 version of the ATI HD 5800 will consume 376W, which is substantially more than the HD 4800-based X2s. In graphics, performance increases are far greater than the power reductions process technology allows. It won't use less power at all.
     
  36. davepermen

    davepermen Notebook Nobel Laureate

    Reputations:
    2,972
    Messages:
    7,788
    Likes Received:
    0
    Trophy Points:
    205
    let's hope it's still more efficient. if so, it would mean that it would be much, much faster :)
     
  37. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    HAHAHAHAHAHA that gave me a good laugh, Forge. +rep
    But wow, 376W is a lot!!! I wonder how much improvement there will be?? As in Crysis fully maxed out on everything across 4 WUXGA displays and getting 200 FPS?!?! (if only...)
     
  38. MrX8503

    MrX8503 Notebook Evangelist

    Reputations:
    126
    Messages:
    650
    Likes Received:
    0
    Trophy Points:
    30
    Tick Tock

    Tick = Die Shrink (Optimized, Efficient)
    Tock = New architecture (Powerful, Brand New CPU)

    Nehalem = Tock
    Westmere = Tick
    Sandy Bridge = Tock
    Ivy Bridge = Tick
    Haswell = Tock

    There you go, I saved you guys from arguing over power consumption until at least 2012. End of story.
     
  39. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    Yes, I agree with the performance-per-watt part; if you do think about it, I technically have 8 cores at 150 watts or whatever, so that part of it is true. The die shrinks allow more powerful technology though; total power consumption is not going down at all, it is going up, especially in the graphics card department.
     
  40. davepermen

    davepermen Notebook Nobel Laureate

    Reputations:
    2,972
    Messages:
    7,788
    Likes Received:
    0
    Trophy Points:
    205
    yeah, but gpu developers don't use the same high-quality technologies to design and manufacture their chips. they are in general much hotter than if intel developed them with its equipment. not that it would be the best way; intel's way takes way longer, and the gpu world evolves faster than the cpu world => the intel way would be too slow to be top-end, but they would deliver the same chip nvidia or ati does with a much lower power draw for that same piece of work.
     
  41. MrX8503

    MrX8503 Notebook Evangelist

    Reputations:
    126
    Messages:
    650
    Likes Received:
    0
    Trophy Points:
    30
    Believe it or not, GPUs are actually more complex and have more transistors than CPUs.

    I think in the end we'll gravitate more towards efficiency, though. For example, we'd probably get Core i7 performance in a phone someday.
     
  42. Serg

    Serg Nowhere - Everywhere

    Reputations:
    1,980
    Messages:
    5,331
    Likes Received:
    1
    Trophy Points:
    206
    Indeed, GPUs are far more complex than CPUs.
    Basically, a GPU is far more complex in terms of architecture versus a CPU.

    The CPU sounds more complex, but the difference is that the GPU has to do almost the same plus render the results and show them on the display.

    So the fact that GPUs advance that fast is impressive (although major technology changes are not THAT huge versus the ones on CPUs). For example, the NVIDIA 8 series to 9 series to GT200 series have seen few changes besides the manufacturing process and upping the shader counts and clock speeds. The ATI 2 series to 3 series to 4 series have seen slightly more changes than NVIDIA, but still, it is not like Intel, who is launching something (supposedly) completely new.
     
  43. IntelUser

    IntelUser Notebook Deity

    Reputations:
    364
    Messages:
    1,642
    Likes Received:
    75
    Trophy Points:
    66
    The single biggest reason GPUs advance "faster" is the infinitely parallel nature of the code. If general-purpose code were like that, we wouldn't have high-frequency, large-cache architectures, but rather something like a GPU, optimized for parallel code.

    I tend to disagree with the general perception that GPUs are more complex than CPUs. If anything, it's the other way around. More accurately, it depends which part of the circuitry you are looking at; they really aren't comparable. CPUs might have large and simple caches, but very complex compute circuits. GPUs use many dozens of simpler cores (complexity somewhere between cache and CPU logic) basically copied over multiple times with routing logic.

    If memory speeds had kept up with the pace of CPU development, there wouldn't need to be such a huge focus on caches. Unfortunately, capacity is also a big thing with system memory, which resulted in a sacrifice of bandwidth. Graphics has no such issues.
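
    A small sketch of that "infinitely parallel" point: an element-wise operation has no dependency between elements and could be split across any number of cores, while a running-sum loop is serially dependent, so extra cores don't help. Pure-Python illustration only.

        # Sketch: data-parallel work vs. serially dependent work.
        pixels = list(range(16))

        # Embarrassingly parallel: every element is independent,
        # so the work could be split across as many cores as you have.
        brightened = [min(p + 10, 255) for p in pixels]

        # Serially dependent: each step needs the previous result,
        # so extra cores don't help at all.
        running = []
        total = 0
        for p in pixels:
            total += p
            running.append(total)

        print(brightened)
        print(running)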
     