Please do not try to pick a fight or anything like that.
Check the Forum Rules if you want more on this matter.
Thanks!
-
What the heck is it? -
The fact of the matter is that the i5 is an improvement over the Core i7. You complain about the i5 being power hungry, yet nothing else is available in its power range. -
I'm pretty sure all desktop CPUs are power hungry; that is their purpose, to be as fast as possible. But they are not hot and don't consume a lot of power all the time. Mine uses 25 watts when idling, which isn't much at all.
It is an improvement TDP-wise, but the fact that it has no HT is kind of a bummer. -
-
Jayayess1190 Waiting on Intel Cannonlake
-
There is no reason why having 8 cores would slow you down, in any program, versus having 4. The program you're using simply wouldn't use the extra cores, but it wouldn't flat-out run 10% slower just because it wouldn't use those extra cores. This is why people have been reporting, for five years (since the 3.06GHz Pentium 4 "B" Northwood with HT came out), that disabling HT has actually increased performance. (A quick way to check the logical-vs-physical core split on your own machine is sketched just below.)
Man, I remember back in the day you could lose 40% framerates by enabling HyperThreading. Made my 3.06 WAY less cool. -
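As promised above, here is a minimal Python sketch for checking how many of your cores are physical versus logical, i.e. whether HT is contributing virtual cores. It assumes the third-party psutil package is installed; this is just an illustration, not anyone's official tool:

```python
# Minimal sketch: compare logical vs. physical core counts.
# Assumes the third-party psutil package is installed (pip install psutil).
import os
import psutil

logical = os.cpu_count()                    # logical cores (includes HT)
physical = psutil.cpu_count(logical=False)  # physical cores only

print(f"Logical cores:  {logical}")
print(f"Physical cores: {physical}")
if logical and physical and logical > physical:
    print("Hyper-Threading (SMT) appears to be enabled.")
```

If the two counts differ, the "extra" cores are HT's virtual ones.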
The area of debate is whether or not the average user needs 8 cores (4 real, 4 virtual) over 4 in most tasks, including "high-end gaming," and whether those extra 4 virtual cores are worth the price premium.
For 99.9% of all gamers, I fail to see how a Core i5 - running stock or overclocked - wouldn't be good enough. Benchmarks, both synthetic and real-world, show the Core i5 to be nearly identical to the Core i7 in performance (which makes sense, considering they are nearly identical to begin with). And quad core is the "way of the future," if you will. I play games often, and I can't find a reason to say "hey, I need eight cores," especially given that Hyper-Threading can slow some applications down when enabled.
For the rest of us (or you) who need 8 cores - the multimedia folks, I suppose - you can buy a Core i7. I'd be curious what exactly you are doing (particularly given this is primarily a notebook forum) that would make you "need" 8 CPU cores, but to each his own. Neither I nor (I would hope) anybody else is telling you what you should or could do with your computer.
But for the vast majority, there is no big reason not to be fine with a Core i5, unless you plan on keeping your system for five years and making no modifications to it - in which case you more than likely wouldn't need even the Core i5, or be the type of consumer with whom this conversation would take place.
To recap:
Core i7 = faster than Core i5.
A supercomputer is faster than my 2.26GHz Core2Duo-powered laptop.
Most people will be fine with a Core i5, and for a variety of reasons it's simply the "better buy" or "better bang for the buck" if you will, for the same reasons people buy a $70,000 Z06 Vette over a $275,000 Ferrari. -
just so that I'm clear...these are desktop processors, right?
-
As far as I'm concerned, NOTHING touches the Core i5 in terms of performance/watt.
There's a whole AnandTech article about it; it's pretty informative. -
-
http://www.solidmuse.com/2008/12/core-i7-to-hyperthread-or-not.html
These particular quotes are interesting:
-
I have no idea why I waste my time. -
Normal usage depends on the user. I, for example, use my laptop ALL the time, whether it's working, NBR'ing, chatting, 3D modeling, Skype, or simple web browsing or word processing. If none of the above, I play some games, if I have the time.
For me, an i7 wouldn't be a waste. But I will go with whatever suits my needs, even if that means a C2D only. Again, it depends.
Regular users, who just browse the web and write stuff in Word for school, don't need an i7 or something similar; a CULV and they are good to go - it is quite capable and fast enough. For example, a mere Word doc and some tabs in Firefox or IE8, plus the occasional iTunes, will NOT take advantage of a faster i7.
On the other hand, someone who works with large databases, number crunching, virtual machines, large amounts of data, or tons of apps open will benefit from it. -
But my Core 2 Quad downstairs is honestly about the same as this; not much slower at all.
When I built it, I wanted to be ahead for the future, because 4 cores (in addition to the 4 virtual ones) will be very useful when developers start programming for them more regularly. -
Eh. Waiting for Sandy Bridge for my next desktop upgrade. Evolution of Core 2 architecture? Yes please.
-
hahahaha if you keep waiting, you'll never get a laptop...lol
As usual, buy when needed.
Although Sandy sounds quite interesting, it's TOO far away in the future for me. -
@Vinyard
Indeed, a wrong decision... -
davepermen Notebook Nobel Laureate
if it's written down to require 95W and you get a 50W psu, it might work, but it might suddenly just turn off when it, at some moment, requires more power.
and as desktops deliver power with ease, the specs can be much less tight than in a laptop. so yes, my quadcore doesn't eat much W while idling and surfing, but it's allowed to eat up to 105W (in my case, i think), when at full usage.
and there's a difference between my laptop and my pc. the laptop is not allowed to burn that much power. when not at maximum usage, both obviously use much less power.
and vinyard, cancel the order if you can.. -
-
davepermen Notebook Nobel Laureate
btw, your 25W cpu has a TDP of 130W... so now you might notice that the i5 might, just might, consume 12-15W at the same task as yours ...
this comparison is not exact, but it should give you an idea.. TDP is the limit of what the chip is allowed to draw, nothing else. -
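To put rough numbers on that point, a tiny back-of-the-envelope sketch in Python (all wattage figures here are hypothetical, chosen only to illustrate that TDP is a ceiling, not typical consumption):

```python
# Hypothetical figures: TDP is a ceiling, not what the CPU draws all day.
TDP_WATTS = 130    # spec: the most the CPU is allowed to draw
IDLE_WATTS = 25    # example measured draw while idling
LOAD_WATTS = 110   # example measured draw under sustained full load

for label, watts in [("idle", IDLE_WATTS), ("full load", LOAD_WATTS)]:
    print(f"{label}: {watts}W = {100 * watts / TDP_WATTS:.0f}% of the {TDP_WATTS}W TDP")
```

Same chip, same TDP, wildly different actual draw depending on the task.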
-
-
-
Anyway, it's all about performance per watt; that is how it's always been when rating CPUs.
That's how Intel rates it, that's how AMD does it, and that's how every single tech site does it.
It doesn't matter if you just browse the web or look at your computer screen while scratching your butt; if Intel produced a powerful chip within its prescribed TDP that outperforms all others, then they were successful. -
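For what that metric looks like in practice, a quick sketch (the benchmark scores and wattages below are made up purely to show the arithmetic):

```python
# Hypothetical numbers: performance per watt = benchmark score / measured watts.
chips = {
    "chip A": {"score": 10000, "watts": 95},
    "chip B": {"score": 9000, "watts": 65},
}

for name, d in chips.items():
    print(f"{name}: {d['score'] / d['watts']:.0f} points/watt")
# Note: a chip with a lower raw score can still win on efficiency.
```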
Out of curiosity, is there such a thing as a quad-core i3, and if so, how is the power usage on them?
-
Thank you. Also, out of curiosity, I know AMD has a triple-core CPU; will they offer the i5 or i7 in a triple core? When do you think we will know about i3s and other i5 and i7 CPUs?
-
It actually uses as much as 150 watts; the point I'm trying to make is that it isn't hot, and it doesn't consume a lot of power all the time.
-
I agree with sgogeta: the more we advance, the more power-efficient the hardware becomes. For example, the newer ATI HD 5000 series is said to perform 40% better while consuming 20% less (or was it 20% better and 40% less?? Can't really recall ATM). But the point is that things get optimized to consume less.
Right now, current cards consume a lot since they are not as optimized as the newer ones are supposed to be.
On a side note: don't start a fight over this, please. Regular usage depends on the user and not on the opinion of someone else. If you have something to say, PM the other person. Thanks. -
SoundOf1HandClapping Was once a Forge
Just wait until we violate the laws of thermodynamics and energy and start generating electricity while we game.
"I just powered some guy's house while playing Crysis 12! -
According to Fudzilla, the X2 version of the ATI HD 5800 will consume 376W, which is substantially more than the HD 4800-based X2s. In graphics, performance increases far outpace the power reductions that process technology allows. It won't use less power at all.
-
davepermen Notebook Nobel Laureate
-
HAHAHAHAHAHA that gave me a good laugh Forge. +rep
But wow, 376W is a lot!!! I wonder how much improvement there will be?? As in Crysis fully maxed out in everything on 4 WUXGA displays and getting 200 FPS?!?! (if only...) -
Tick Tock
Tick = Die Shrink (Optimized, Efficient)
Tock = New architecture (Powerful, Brand New CPU)
Nehalem = Tock
Westmere = Tick
Sandy Bridge = Tock
Ivy Bridge = Tick
Haswell = Tock
There you go, I saved you guys arguing over power consumption until at least 2012. End of story. -
-
davepermen Notebook Nobel Laureate
yeah, but gpu developers don't use the same high-quality technologies to design and manufacture their chips. they are in general much hotter than if intel developed them with its equipment. not that it would be the best way: intel's way takes far longer, and the gpu world evolves faster than the cpu world => the intel way would be too slow to stay top-end, but it would deliver the same chip nvidia or ati does with a much lower power draw for the same piece of work.
-
Believe it or not, GPUs are actually more complex and have more transistors than CPUs.
I think in the end we'll gravitate more towards efficiency, though. For example, we'd probably get Core i7 performance in a phone someday. -
Indeed, GPUs are far more complex than CPUs.
The CPU sounds more complex, but the difference is that the GPU has to do almost the same work, plus render the results and show them on the display.
So the fact that GPUs advance that fast is impressive (although major technology changes are not THAT huge versus the ones on CPUs). For example, the NVIDIA 8 series to 9 series to GT200 series saw few changes besides the manufacturing process and upping the shader counts and clock speeds. ATI's 2, 3, and 4 series saw slightly more changes than NVIDIA's, but still, it is not like Intel, who is launching something (supposedly) completely new. -
The single biggest reason GPUs advance "faster" is the infinitely parallel nature of the code. If general-purpose code were like that, we wouldn't have high-frequency, large-cache architectures, but rather something like a GPU, optimized for parallel code.
I tend to disagree with the general perception that GPUs are more complex than CPUs. If anything, it's the other way around. More accurately, it depends which part of the circuitry you are looking at; they really aren't comparable. CPUs might have large and simple caches, but very complex compute circuits. GPUs use many dozens of simpler cores (complexity somewhere between cache and CPU logic) basically copied over multiple times, with routing logic.
If memory speeds had kept up with the pace of CPU development, there wouldn't need to be such a huge focus on caches. Unfortunately, capacity is also a big thing with system memory, which resulted in a sacrifice of bandwidth. Graphics has no such issue.
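As a loose illustration of that "infinitely parallel" shape of work, a minimal Python sketch (the per-pixel function here is made up; on a real GPU each element would map onto one of those many simple cores):

```python
# Minimal sketch: embarrassingly parallel, GPU-style work.
# Every "pixel" is independent, so the work splits across cores trivially.
from multiprocessing import Pool

def shade(pixel_index: int) -> int:
    # Stand-in for a per-pixel computation (hypothetical).
    return (pixel_index * 2654435761) % 256

if __name__ == "__main__":
    pixels = range(1920 * 1080)
    with Pool() as pool:  # one worker per CPU core by default
        frame = pool.map(shade, pixels, chunksize=8192)
    print(f"shaded {len(frame)} pixels")
```

No element ever depends on another, which is why adding more cores keeps helping; most general-purpose code has dependencies that break this pattern.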