So I've been resisting the temptation to build a new system because of Skylake's imminent launch...
Now that I've seen the leaked gaming benchmarks, even the older 5820K beats it:
http://www.eteknix.com/intel-skylake-i7-6700k-performance-figures-leaked/
Sure, the models coming in 2016 with more cores will be better, but the launch lineup is quad-core only, and those parts perform similarly or a bit worse in multi-threaded apps.
@Mr. Fox
-
Spartan@HIDevolution Company Representative
-
This new generation of processors offers maybe another 5% gain in performance.
I built a desktop around a 5820K, so I'm set for a while. I'm not going to wait more than half a year for a 5% gain in performance. -
superparamagnetic Notebook Consultant
The 5930K is 6-core, while the 6700K is quad-core. Price-wise that's about $580 vs. ~$350. Not exactly an apples-to-apples comparison.
If all you're looking at are gaming benchmarks, then there's not much that Skylake has to offer. There's really not much that Broadwell, Haswell, or Ivy Bridge has to offer either, since a Sandy Bridge quad-core is more than sufficient for the vast majority of modern games. Unless you're using multiple top-end GPUs, your CPU isn't going to be the bottleneck.
If you're looking at highly CPU-bound workloads like rendering, code compiling, video encoding, etc., then each generation typically sees around a 10% gain.
The other big advantage isn't so much the CPU performance but the platform as a whole. Skylake will support DDR4, which allows for more memory (at least 16GB per DIMM) with more bandwidth. Skylake also has more PCIe 3.0 lanes, which will allow for faster SSDs. There's support for next-gen connectors like Thunderbolt 3 and USB-C. The iGPU output is also expected to go entirely digital (no more VGA).
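As a rough sketch of why those extra lanes matter for SSDs: using PCIe 3.0's nominal 8 GT/s per lane and 128b/130b encoding (standard spec figures, not numbers from this thread), the theoretical one-way bandwidth of a typical x4 NVMe link works out like this:

```python
# Back-of-the-envelope PCIe 3.0 bandwidth estimate (nominal spec figures).
GT_PER_SEC = 8e9        # 8 GT/s per lane for PCIe 3.0
ENCODING = 128 / 130    # 128b/130b line-coding efficiency

def pcie3_gbytes_per_sec(lanes):
    """Theoretical one-direction bandwidth in GB/s for a PCIe 3.0 link."""
    return lanes * GT_PER_SEC * ENCODING / 8 / 1e9

print(round(pcie3_gbytes_per_sec(4), 2))   # x4 link, as used by most NVMe SSDs -> 3.94
```

That roughly 3.9 GB/s ceiling is about seven times what SATA 3 (~550 MB/s usable) can deliver, which is why more lanes translate directly into faster SSDs.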
Most of what makes Skylake so great isn't really relevant to a gaming desktop. It'll be pretty huge for laptops, though personally I think Cannonlake is the one to wait for. -
Spartan@HIDevolution Company Representative
-
I will stay with the desktop (5820K, 970 SLI) and the business laptop (4700MQ, 755M). The 755M is good enough for a 1600x900 screen.
-
You're comparing a 6-core Haswell-E against a 4-core Skylake. Of course the Haswell is going to win. Also, the 6700K will be in laptops; the 5930K most likely will not be.
Skylake benefits:
IPC improvement and new instruction sets
Improved iGPU with DX12 support
Support for both DDR3 and DDR4
+4 PCIe 3.0 lanes (x20 vs. x16)
Thunderbolt 3.0 and USB 3.1
SATA Express and NVMe SSDs -
I don't know why everyone is saying 5930K when the charts only show the 5820K. Have a closer look, people.
-
-
It will need to deliver physics performance better than this or it won't be worth buying...
Comparison Result: 3DMark 11 Extreme - Titan vs 980M SLI vs Titan -
Ah, maybe so @octiceps - that might be. But, we will still need to wait and see if they are better overclockers than Sandy, Ivy and Haswell.
-
tilleroftheearth Wisdom listens quietly...
Way to compare, guys. Gaming? You know I'm out of that discussion.
But comparing a much more expensive CPU with a ~50W higher TDP and 2 extra cores is not exactly a fair comparison either.
A 5% increase in IPC was worth a whole CPU's capabilities not so long ago, and those 5% increments add up over time... Besides, the benefits are not just in the chip itself; they are in the platform as a whole, and that (AFAIK) we have not seen yet. But without doubt I know that the latest platform will be the best one to buy (once again).
For pure/raw performance reasons, it will be worth waiting for a six- or eight-core version of the new architecture - but that is true of almost all platforms.
For myself, OC'ing is not where the value of a new platform is. Stability and the ability to deliver full performance at its stated specs are far more important. This, along with Win10 x64 Pro (assuming it continues in the steps of Win8.1 x64 Pro), is what will make this the killer setup from September 2015 onward. -
All I know is that from Sandy Bridge i5 chips and up, there has been no reason to upgrade if all you are doing is word processing, coding, web browsing, even gaming (granted you are using an already high-end chip with a good graphics card). The only use for these new processors may be for benchmark kings and people who actually REQUIRE multiple cores for their work, and even then a 2920XM is more than capable of handling anything thrown at it today, and the 3920XM even more so. The only time I see these chips really needing to be updated is when 4K content becomes mainstream and begins seriously taxing existing chips, but by that time a Sandy Bridge laptop would probably be 8 years old.
I say this because I am currently typing on a ThinkPad T60p. In 2006-07, Full HD content wasn't really all that available over the internet, and the internet's requirements weren't all that high. By 2010, it was already getting long in the tooth. The same cannot be said for my X220 Tablet. My X220 Tablet with the i5-2520M and a fast SSD is 100% as capable today as it was in 2011 for anything short of gaming.
-
-=$tR|k3r=- Notebook Virtuoso
You are correct, if the older tech is working for you...... but many like to stay current, and to that end, there are numerous reasons to upgrade to a Skylake platform. My old GT783 (2860QM) still serves me well, but my GT72 or GT80 does everything soooo much better. If I were in the market for a new notebook, I'd wait for Skylake's debut, and Sandy Bridge wouldn't even be a consideration..... unless budget forced me into such a decision.
-
Top 5 reasons to upgrade:
- Better overclocking
- Better overclocking
- Better overclocking
- Better overclocking
- Just because
- More efficient
- Uses less power
- Intel knows best for me
- OMG, it's shiny and new
- I just love spending money
- My friends on Facebook say so
- It's the easy way to get Windows 10
- OMG, an extra 15 minutes on battery
- Better clock-for-clock stock performance
- Intel graphics are finally as good as ATI Rage 128
- Better overclocking
-
-
-
tilleroftheearth Wisdom listens quietly...
If a new platform completes more work in the same 8 to 22 hours I have for it on any given day than my current systems do, it is better. This has been true for the last several platforms over the last decade - Broadwell I'll skip (for now).
OC'ing would make the work go faster, for sure. But it also killed OS and program installs, setting me back weeks and months versus just using the system 'as-is', fiddling with it (and paying others to fiddle with it) and constantly testing for 'better' instead of just using it as it was designed.
I can understand OC'ing being the be-all and end-all of having a system for some. My needs differ.
Even if an ancient 4 year old $1K chip can keep up to a $300 current chip - so what?
I could have upgraded three times over that period, saved $100 and still had better performance, no incidents (oh no! I think I fried the chip! Or the OS! Or the ...) and even more savings from the reduced AC and power requirements of NOT running OC'ed, not to mention the bliss for my ears and the reduced stress on the other components too.
Still, I don't do that (I'd buy the $400-$700 chip instead - most BANG for the buck)... but just saying... -
-=$tR|k3r=- Notebook Virtuoso
"If 'IF's' and 'BUT's' were candy and nuts, we'd all have a wonderful Christmas!"
-
I think efficiency is good. But when they start sacrificing performance for efficiency, that defeats the whole purpose.
That isn't even efficiency. What would we call that? Conservancy? And DDR4 isn't faster than DDR3: it has increased latency along with increased frequency, which adds up to negligible gains in performance. If you decrease latency and increase frequency, then you have faster memory.
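To put rough numbers on the latency-vs-frequency point (the CL9/CL15 timings below are assumed typical retail modules of the era, not figures from this thread): absolute CAS latency in nanoseconds is cycles divided by the I/O clock, where the I/O clock is half the MT/s data rate.

```python
# First-word CAS latency in nanoseconds; a rough sketch with assumed timings.
def cas_latency_ns(cl_cycles, data_rate_mts):
    """CL cycles / I/O clock in MHz; the I/O clock is half the MT/s data rate."""
    return cl_cycles * 2000 / data_rate_mts

print(round(cas_latency_ns(9, 1600), 2))    # DDR3-1600 CL9  -> 11.25
print(round(cas_latency_ns(15, 2133), 2))   # DDR4-2133 CL15 -> 14.06
```

Despite the higher clock, the early DDR4 module answers its first word later; only peak bandwidth improves, which is why the launch-era gains looked negligible.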
-
Unlike cell phones and tablets, having more TDP at your beck and call is a good thing for high performance computers... kind of like American Express... don't leave home without it. The most important thing is avoiding BGA cancer so you can find another CPU if you draw a short straw in the silicon lottery, and/or being sure what you buy is fully unlocked (including TDP) so you can try to compensate for engineering and QC incompetence. OEMs do not care if you lose in that lottery. As long as it runs OK at stock specs, their heiney is off the hook for delivering something excellent.
@J.Dre - yup... "efficiency" is another lame excuse... the "yeah but" caveat of the new generation of high tech. That should be a fringe benefit of producing better chips for notebook and desktop CPUs, not a primary goal. I don't like lame excuses. Just give me awesome and if we happen to get lucky and score some efficiency in the process then that's a special side bonus. Kind of like those tasty clods of nacho cheese seasoning you sometimes find in the bottom of the bag of Doritos. -
tilleroftheearth Wisdom listens quietly...
DDR4 is (today) only slightly faster, along with less heat and better efficiency - looks like a win to me. Tomorrow's DDR4 SODIMMs will only be better (and faster) in all ways.
Oh! You eat Doritos? ...
See:
http://globalnews.ca/news/1215894/i...s-confirm-rat-droppings-found-in-doritos-bag/
Hope they really were 'cheese seasonings' you ate. -
-=$tR|k3r=- Notebook Virtuoso
LOL! OMG, time for me to exit this discussion, especially in light of the Mod's opinion, LOL!
-
I don't think they're sacrificing performance for the sake of efficiency -- at stock speeds, Skylake will be faster than Haswell and Broadwell, albeit not by much. What they might sacrifice is the ability to overclock which, while quite popular on hardware-oriented internet forums, is only used by a small fraction of their customers.
Socketed CPUs for laptops are definitely on their way out. The large OEMs don't like them for obvious reasons, and at some point the market for them becomes so small that it is cheaper for Intel to make only BGA ones. -
The nicest thing about new tech is that nobody who already has today's best needs to lose any sleep over what tomorrow brings. If the next best thing is better (by my definition) I will buy it, and if it happens to be more efficient on top of more powerful and better at overclocking, we will all be in hog heaven together. As much as the OEMs would like us to be all goo-goo-gah-gah about it, the safest approach is to be flippant, maintain a low sense of urgency, and not rush into anything that looks good on paper or in a web site review. The days of early adoption being fun and safe are over; it has never been more treacherous.
-
I was a huge advocate for Skylake as well, until I saw what it really is... Just less power consumption, offering similar performance. Woopty-do. We want less power consumption and more performance, Intel! Get your heads out of your butts. Unfortunately, Broadwell-C has yet to come out (of the closet, jk).
So, we won't see Skylake replacements for the 5820k until next summer/fall. The last thing I have hope for is Pascal.* Been disappointed by everyone and everything else since 2014 (except maybe Clevo).
*My fear with Pascal, is that NVIDIA will "dial it down" to re-brand it for at least two years.
The whole market is like: "Didn't make as much profit this year? Don't worry... Re-brand!"
-
-
Yep. We've seen plenty of those over the past decade. -
-=$tR|k3r=- Notebook Virtuoso
LOL!
-
-
I think Skylake is a cool name. Probably the best thing about it.
To answer the thread title and all. -
Starlight5 Yes, I'm a cat. What else is there to say, really?
You guys are talking overclocking... Most notebooks can't even maintain their advertised turbo (and nowadays often even stock!) clocks. If Intel and the OEMs make progress in this area, it'd already be fan-fracking-tastic. I doubt it, though. =\
-
-
It doesn't help matters when some of the OEMs intentionally sabotage their own products with BIOS settings that ruin the performance of otherwise amazing hardware. The Alienware 17 and 18 (2013/pre-BGA) were great examples of this... CPUs did not hold stock turbo clock speeds, with the voltage jacked up way too high and processor power and current limits set way too low. The 4930MX in the machine I have is a beast. It can boot at 50x4 and be benched at 4.7-4.8GHz stable after proper settings are implemented. First impressions of its performance were very unfavorable, but I didn't care much because I had absolutely no intention of running it stock. The prime objective was to immediately begin tweaking everything to see how much performance I could wring out of it by pushing it to its functional limits. Out of the box, it suffered from thermal throttling and thermal shutdown, and it frequently power-throttled to 800MHz under load using the factory BIOS settings.
The more disturbing problem we contend with today is the near absence of anything that might be regarded as a truly good foundation. I have never seen so many changes for the worse as we have seen in the past 24 months. There is still a lot of fancy-looking stuff being sold out there, but the majority of the products marketed as "gaming notebooks" are overpriced, botched-up messes. Most are crippled and under-performing by design, assembled by monkeys that lack technical skills and have no incentive to do their jobs well. I think I can probably count the number of amazing high-performance notebooks available for purchase on one hand now, with one or two digits to spare. That's really unfortunate. -
-
Because who needs battery life, right? -
-
Anyway, Skylake is an upgrade for anyone on Sandy Bridge. Even on Ivy Bridge. It's quite a large upgrade, in fact, as the IPC gains are cumulative. I'd estimate 35-45% better than SB and 28-35% better than IB. It may not make a whole lot of difference in games directly unless you're at a CPU bottleneck already, but other programs should see a noticeable improvement. It depends on the user. The REAL question is whether mobile CPUs will get gimped. I'm all for desktop CPUs in laptops and all, but I always felt mobile chips should have kept up with desktop ones (though they are overpriced as all hell for absolutely no discernible reason). -
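The "cumulative" arithmetic can be sketched like this (the per-generation percentages below are illustrative assumptions in the ballpark of published reviews, not measurements from this thread):

```python
from functools import reduce

# Hypothetical generational IPC gains: SB->IB, IB->Haswell, Haswell->Broadwell,
# Broadwell->Skylake. Illustrative assumptions only.
gains = [0.06, 0.10, 0.05, 0.10]

def compound_gain(gains):
    """Total speedup from multiplying successive generational gains."""
    return reduce(lambda acc, g: acc * (1 + g), gains, 1.0)

print(round((compound_gain(gains) - 1) * 100, 1))  # % over Sandy Bridge -> 34.7
```

Because the steps multiply rather than add, four modest single-digit gains compound to roughly a third more performance over Sandy Bridge under these assumptions.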
Think I was referring to a lower TDP when I made that post, not sure. I was kind of upset, with emotions stirred by how things have been going in the market lately. It's very upsetting.
-
Though, theoretically, if Skylake uses more power but has Haswell's TDP, then technically it has less TDP headroom available XD -
-
-
You guys are nuts. I'm through with Intel and new computers. The day I can buy a notebook and throw out that useless 60GHz radiation emitter (WiGig) is the day I'm interested in notebooks again. Too bad that's not anytime soon, because Intel decided to build that friggin radiation-health-hazard emitter right into the friggin CPU die of the Skylake and later generations!
-
I see... Radiation. Emitting. Hazard. Sounds very serious. We still have time to prepare ourselves.
Alrighty then. Let's all grab a roll of aluminum foil and get after it.
-
Hear, hear.
Get over it - I'll let you know when the WiGig starts giving me cancer and/or causing me to grow extra limbs. -
Want to cook yourself with something worse than microwave radiation? Help yourselves.
I've thought of a really good use for those Skylake laptops if you're ever trapped on a deserted island. All you gotta do is bring some tin foil to wrap around the laptop and you've got a makeshift microwave oven handy to cook those wild fowl! -
"
At normal operating distances, Wi-Fi's intensity is generally so low that it's not worth worrying about: it's just part of the "smog" that is generated by radio and TV signals, AC mains wiring, the motors in home appliances, and the universe in general. (As my colleague Charles Arthur once pointed out here, the wavelength of Wi-Fi signals is the same as the cosmic background radiation: 12cm. If you're worried, don't go outside.
"Last edited: Jun 17, 2015
What's so great about Skylake?
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Spartan@HIDevolution, Jun 13, 2015.