I would gladly give up the famed Netflix 4K decoding if I could avoid the iGPU completely. Processors more powerful than BGA parts shouldn't carry this garbage, which just takes up silicon that could instead be used for cores. As you say, you assume Nvidia will provide these features as well. And Intel is coming with the Kaby Lake-X HEDT processors; many people with quad-core i7s will jump on that train, and those chips will definitely not be locked out of new features like this.
-
-
Well, I guess the 1,300 people who actually use Edge and will bother with a Kaby Lake CPU now have something they can do.
-
Considering I gut Edge, the Windows Store, and the apps out of Windows 10, this is just marketing speak for something irrelevant. I may upgrade to that chip to have one that operates at over 5 GHz, but not because of this. Before I do, I'll run comprehensive benchmarks comparing 7, 8.1, and 10 so that all the benefits that actually matter are known. My 4K TV handles the 4K Netflix, not my computer, which is used for computing.
Meanwhile, my gutted Win 10 gets quite close to an un-optimized stock Win 7 SP1. I'll need to do something about Win 7 in the coming weeks...
Sent from my SM-G900P using Tapatalk -
That's the best point: we aren't limited to Win10 / Kaby Lake at all, but the MS / Intel / Netflix propaganda harps on how Kaby Lake / Edge are *required*.
-
That applies to 16:9 laptops as well, and I've said it quite a few times before.
Of course, on a desktop you can go as wide as you could possibly want (given that you can get it through the door, have a desk for it, and someone actually manufactures such a display), and I'm actually eyeing LG's 38UC99-W, but on laptops I think 16:10 provides the perfect blend of portability and usefulness.
-
That looks like a very nice display. Point taken. But if you have ever considered TVs as displays, a good time to buy is a week or two before the Super Bowl (they are clearing out last year's inventory). Also, rtings.com does good ratings. For displays you kind of have to search, and that one looks good, but I'd want to check a couple of things first (for the size, the initial look at the specs is good, but I'm just waking up)...
Two points: 1) it only covers sRGB, not an expanded gamut. Not the worst thing, considering so little content is authored beyond sRGB that you'd have it set to that anyway for almost everything. 2) the resolution is 3840x1600, which is definitely wide, but a little off if you want to watch something at 2160 lines, regardless of 3840 or 4096 width. A lesser concern is that it doesn't say which HDCP version is supported, 1.x or 2.2. Just some thoughts, hence why I brought up TVs for that size (although you want to guarantee at least 4:4:4 if going the TV route, and barely any incorporate DisplayPort)...
Sent from my SM-G900P using Tapatalk -
Well aware of both - the gamut and the movie/TV-show watching. It would be used for multitasking and gaming. Color-critical work and movies would go to either my DreamColor or the FW900.
-
@triturbo @Prema @Mr. Fox @Papusan @Johnksss@iBUYPOWER @D2 Ultima @hmscott @iunlock @DeeX @zergslayer69
I've got the i7-7500U in the HP X360 (see sig); what do you want me to test?
Also worth mentioning: I can somewhat break free of HP's 12 W (-3 W) restriction on CPU power consumption, temporarily, to maybe 18/15 W. -
What a JOKE
And people buy this?
People are crazy nowadays!! I really don't know what to call this soldered trashware. But if you absolutely want/must test this thing, run the same tests as
http://www.notebookcheck.net/Intel-Core-i7-7500U-Notebook-Processor.172205.0.html
-
So far it's okay, it's just that the overall ultrabook line is overpriced af.
-
Something we can compare with previous CPUs. It would be interesting to see performance @12W and @15/18W as well, i.e. how much is lost/gained with just a couple of watts.
-
I don't have another ULV to compare to
-
You could look at HWBOT to see if you can compete with the CPU scores there, or ask in the TS guide.
-
Perhaps I'm missing something here...
4K DRM being built into a CPU is a plus for consumers, not a hindrance or disadvantage, much the same as Verity on Android devices is a plus, not a disadvantage. This isn't new per se, as similar DRM is what prevents 1:1 copying of a DVD or Blu-ray without one of the many available decryption programs (AnyDVD [now Red Fox], DVD43, etc.). 4K DRM is also why one requires HDMI 2.0 and HDCP 2.2 to watch 4K.
This is no different from HDMI 1.4 and the HDCP standard on set-top boxes, which prevents one from connecting a consumer DVR (Hauppauge, for example) to a DirecTV or Comcast box to record digital TV content. Are there issues with DRM in certain instances? Absolutely (digital music purchases, for example). However, 4K video content is not in the same boat. DRM, for all intents and purposes, is a form of copyright protection to prevent unauthorized usage by individuals not licensed for such usage. When we "purchase" media content, be it a video game, movie, music, etc., we're not purchasing the item itself but a license to use the content we purchased, regardless of whether it's in digital or physical form.
- For example, if you were one of the hundreds of people who worked on, and financed, a film to bring it to market, would you be okay with consumers being able to receive the movie, show, documentary, etc. for free? If one answers yes, then the individual needs to ask themselves exactly how long they believe the TV & Film industry would last if that was okay and allowed.
- Everyone loves movies and TV shows; while the genres will differ from person to person, we all want the industry to still be there for us to enjoy.
-
The benefit only applies when using the onboard GPU. Do you run your display off the iGPU here, or off your dGPU?
Sent from my SM-G900P using Tapatalk -
Why would anyone be using a dedicated GPU to watch 4K video... there's no discernible gain from doing so over the integrated graphics on the CPU. For gaming and video editing, absolutely... for simply watching a 4K Blu-ray or 4K stream, nada.
- Remember, the 4K DRM the CPU processes is only for copyrighted 4K content
-
On desktops (and this is a desktop chip), do you regularly go around to the back of your machine to move your cable from the dedicated GPU to the motherboard output and enable the disabled iGPU just so you can watch from the iGPU instead? If you do, you are the rare case!
Sent from my SM-G900P using Tapatalk -
Per my signature, I don't have a desktop... While I understand the point you're trying to make, it's not altogether clear what you and others are taking issue with.
- If one is planning on watching 4K content through a PC but is not using the CPU's integrated graphics, why would one expect the graphics DRM portion of the CPU to process 4K DRM when the integrated graphics aren't being used? Wouldn't one's issue be with the manufacturer of one's GPU?
- One could simply add a second HDMI cable that always stays plugged into the integrated graphics HDMI port and is only switched out on the monitor/TV if either doesn't have a second HDMI port.
- One could utilize a DisplayPort cable from the motherboard to the display, or an HDMI to DisplayPort cable.
- One could use an HDMI 2x1 or 4x1 duplicator (such as MonoPrice sells), which requires no manual intervention since there would only ever be one video source going in and coming out.
-
win32asmguy Moderator
Posts have been removed which had personal attacks or expletives in them or quoted them. Please keep the discussion on topic or we will have to close this thread.
-
More reviews:
- Anandtech - "In most of our benchmarks, the results are clear: a stock Core i7-7700K beat our overclocked Core i7-4790K in practically every CPU-based test (Our GPU tests showed little change). When overclocked, the i7-7700K just pushed out a bigger lead for only a few more watts. Technically one could argue that because this part and the i7-6700K are equal in IPC, a similar overclock with the i7-6700K achieves the same performance. But the crucial matter here is how lucky a user is with the silicon lottery – based on our testing, the Core i7-7700K CPUs tend to overclock rather nicely (although +300 MHz isn’t that much in the grand scheme of things)."
- Arstechnica - "As it stands, what we have with Kaby Lake desktop is effectively Sandy Bridge polished to within an inch of its life, a once-groundbreaking CPU architecture hacked, and tweaked, and mangled into ever-smaller manufacturing processes and power envelopes. Where the next major leap in desktop computing power comes from is still up for debate—but if Kaby Lake is any indication, it won't be coming from Intel."
- Techpowerup - "Given these performance figures, it's hard to recommend an upgrade to the Core i7-7700K from the i7-6700K. There are no IPC gains to be had, overclocking isn't that much better, and today's graphics cards don't significantly benefit from the faster processor. If you want to spend upgrade money for gaming, then the better investment is a faster graphics card. However, if you are buying new, then there is no reason to buy the i7-6700K, unless you can find it used at a significant discount."
- Tomshardware - "To its credit, Kaby Lake brings the mythical 5 GHz threshold into play for overclockers. But beyond the clock rate boost (roughly 200-300 MHz over Skylake overclocks), there is little reason for enthusiasts armed with Skylake CPUs to upgrade. If you already have a modern processor, spend those dollars on a new GPU or SSD. Kaby Lake is really only an option for power users building new PCs."
Tom's Hardware's review in particular is quite in-depth and well done. I think it's interesting that the retail CPU samples received by the U.S. and Germany branches have a 15 W difference in power consumption, with the reviewer suggesting that binning will be very important for Kaby Lake. Their review also covers the lower-end, "non-K" i5-7600 and i7-7700, which I appreciate. -
tilleroftheearth Wisdom listens quietly...
IPC gains? Yawn.
What are the overall platform gains? Significant? You bet.
Especially as time moves forward...
Of course anyone having a current platform won't gain a lot from the latest offerings... (duh...).
But there are people who will actually make back their $$$$ in a few weeks from a 1% or 2% gain - in other words, it's up to the buyer to purchase a system/platform/components that maximize their benefits while minimizing their costs (over the lifecycle of the system/platform/etc. in question...).
That doesn't make the current offerings any less valuable (Arstechnica is out to lunch with their 'conclusion'...) - it just puts the decision of whether to buy something or not in the hands where it belongs: the buyer's, considering the workflows/workloads involved (and that has never changed, of course). -
Well, this is a strange perspective to look at it from, imo. I think it frames things a little wrong.
You're kind of justifying the lackluster improvement here. Sure, niche things like 4K 360 video saw large improvements, and lower power consumption is always good, but you're framing this from the perspective of businesses and power users who run these chips under huge workloads where a 1% difference is a big deal. I would describe that market more as a Xeon or HEDT market myself, and I also think it makes this sound more okay than it is.
When you look back at previous trends, these gains are pitiful: http://www.extremetech.com/wp-content/uploads/2012/02/CPU-Scaling.jpg
I'm not saying I expect Moore's Law to go on forever, but it's a sharp decline from when 5 years took us from 1,000 MHz to 10,000 MHz (10x faster, '93-'97-ish), whereas now we go from 4 GHz in 2011 to 5-ish tops in 2017 (1.25x faster).
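Just to quantify how sharp that decline is, here's a quick back-of-the-envelope in Python, taking the figures quoted above at face value (the cagr helper and the exact year spans are just illustrative assumptions, not verified historical data):

```python
# Rough compound-annual-growth-rate comparison using the clock-speed figures
# quoted above (illustrative only, not verified historical data).

def cagr(start_mhz, end_mhz, years):
    """Compound annual growth rate between two clock speeds over a span of years."""
    return (end_mhz / start_mhz) ** (1 / years) - 1

early = cagr(1_000, 10_000, 5)    # claimed ~5-year 10x era
recent = cagr(4_000, 5_000, 6)    # 4 GHz (2011) -> ~5 GHz (2017)

print(f"Early era:  ~{early:.0%} per year")   # roughly 58% per year
print(f"Recent era: ~{recent:.0%} per year")  # roughly 4% per year
```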
It also means real-world users will see almost no difference, especially for gaming, which is a pretty important part of this forum.
The CPU market used to have huge performance gains from chip to chip, and the GPU market still does. I guess I just disagree with you that this is a significant step forward. Mobile Pascal was a significant step forward, not this. -
Sure, there are other factors that define a CPU, which Kaby Lake and Z270 have improved upon. However, servers and workstations are more likely to use server-grade chips, not the i3/i5/i7 line-up. That is because these CPUs are supposed to target mainstream consumers, and from a mainstream consumer's point of view, these chips are quite disappointing.
We are not denying that Kaby Lake has made any advances whatsoever; rather, the point is that the vast majority of the mainstream market has little to gain from this "optimization" process. This is in contrast to past trends, where at least some appreciable performance gains could be expected. I worry that Intel is shifting its focus away from mainstream consumers in favor of other markets... or perhaps this has already happened.
-
Use Google translate http://www.tek.no/artikler/test-intel-core-i7-7700k/366407/5
As in the review: 4.9 GHz, which proved to be stable* at 1.312 volts.
Edit. http://www.guru3d.com/articles_pages/core_i7_7700k_processor_review_desktop_kaby_lake,22.html
-
I would call 4.96GHz 5GHz... That is cool if it's achievable on air.
-
Don't look at the numbers from Task Manager. Windoze can't show the clock speed correctly. The Redmond Morons can't do things properly.
-
I was looking at CPU-Z as well but meh whatever. Still fast.
-
More info about 7700K !!
"So while you game you can run at 5GHz, and when you want to run IntelBurnTest you can set a -200Mhz offset and the CPU frequency will drop to 4.8GHz for IntelBurnTest. The next thing Intel added is BCLK award adaptive voltage/frequency curve, and that should help simplify overclocking voltage levels"
Read more: http://www.tweaktown.com/reviews/7995/intel-kaby-lake-7700k-cpu-review/index11.html -
-
Guaranteed a Windoze Task Manager bug. There is a lot about this topic on the internet. Even with a small increase in the BCLK clock, I am reasonably sure that Windoze Task Manager cannot show the correct clock speed. The Redmond Morons kill-destroy everything they touch!!
https://linustechtips.com/main/topic/708281-task-manager-is-showing-wrong-clock-speed/
http://superuser.com/questions/510188/windows-8-task-manager-not-reporting-actual-cpu-frequency
https://www.neowin.net/forum/topic/1111079-task-manager-reporting-wrong-cpu-frequency/
https://answers.microsoft.com/en-us...r/acfb0567-1bf7-4b4a-ab96-ab9de3efaec2?page=1
https://www.google.no/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=Task+Manager+windows+10+don't+shows+correct+clockspeed&tbs=qdr:y&start=30
http://forum.notebookreview.com/thr...0-owners-lounge.707507/page-284#post-10077112
http://forum.notebookreview.com/thr...0-owners-lounge.707507/page-283#post-10071997
Perhaps @D2 Ultima @Mr. Fox can take part in the topic and highlight these issues more? -
Windows Task Manager is clearly drunk. Constantly drunk.
-
YEEES!! And so are The Redmond Morons. See the edited image in the previous post.
I put in...
4.9 GHz shows as 4.84 GHz (-0.06 GHz vs. the real clock speed) in Windows Task Manager. If I clock down to 4.8 GHz, Task Manager shows 4.76 GHz (-0.04 GHz vs. the real clock speed). Task Manager is a big Joke!! And people don't see it.
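For reference, here is the size of that gap worked out from the readings quoted in this post (a quick sketch of the arithmetic only; it says nothing about why Task Manager reads low):

```python
# Quantify the Task Manager discrepancy using the readings quoted above.
readings = [
    # (real clock in GHz, Task Manager reading in GHz)
    (4.90, 4.84),
    (4.80, 4.76),
]

for real, shown in readings:
    delta_mhz = (real - shown) * 1000
    pct = (real - shown) / real * 100
    print(f"real {real:.2f} GHz, Task Manager {shown:.2f} GHz "
          f"-> off by {delta_mhz:.0f} MHz ({pct:.1f}%)")
# -> off by 60 MHz (1.2%) and 40 MHz (0.8%)
```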
-
tilleroftheearth Wisdom listens quietly...
Like anything in life, I take 'information' presented as an indicator - not an absolute. Even from the Redmond Morons (I wish I could be half as dumb as they are sometimes...).
Really, does anybody know what time it is?
We're all just making our best guesses here... Enjoy!
-
tilleroftheearth Wisdom listens quietly...
No, not a strange perspective at all. Pragmatic.
What a manufacturer offers is a given. If a person cannot make it work for them, better than what they had/have 'now', the decision is obvious: don't buy.
My point is that Intel is chasing the improvements, right now, that will position the company (and I daresay computing in general...) in the best light when all the other secret and not-so-secret improvements come together to bring an overall increased benefit for the 'average' user. It's a cat and mouse game (because they can't do it all at once...) - I believe that Intel is playing the best hand they have at any given time - after all, they're not an alien race designing these chips... they're just like you and me too.
I just recently bought a $50 tablet on a whim that can be on standby for a week, runs Windows 10, charges in just over two hours (when 'off'), sips power when it is actually in use and I can hold all day without getting tired in the least... If people can't see the progress this is hinting at in the very near future... I can't say much more...
These are not lackluster improvements happening. These are improvements we were promised 20 years ago and I welcome them with open arms.
Even if the very small 'extreme' gaming community (here and elsewhere) doesn't see it that way.
To state this another way;
If I were a gamer, I would be doing things much like I am now. I would compare a full/complete new platform to the current platform I was using. FPS 'scores' (min, max and avg) would not be the defining indicator of whether a new contender was worthy or not. Neither would price (solely).
It would be the balance/sum total of the value I could get from selling/donating my old setup deducted from the hard cost of the new platform and all the (many) benefits it offers.
Staying still is very much like dying. We'll all reach that goal sooner or later and I'm not afraid of passing on. But making a conscious decision to hold myself back in the past is not for me. If I can (i.e. my wallet allows me...), I will get/borrow/use/buy the next best thing. Not because I'm (just) addicted to upgrading, but because the cost of staying with the old/familiar/'good enough' is too great. It eats into my time, and that is the only thing none of us can create more of. No matter how much we want to.
And even if a new platform only saves me a few dollars (over multiple workstations) of electricity a year, it may still be worth upgrading (all else being equal), because not only is time money, but money is time (saved) too, if spent wisely. Still can't create it. But where we can trade for it wisely, we should.
To put an even finer point on things; don't let your imagination be guided by what manufacturers/editors/bloggers market/push/sell... Have an open mind towards whatever you're testing and twist/bend/shape it as best you can to fit your version of the world. When you can do that without breaking the thing - then you're at the bleeding edge: yours.
-
Core i7-7700K - Kaby Lake & Corsair RAM Overclocking
http://www.hardocp.com/article/2016...y_lake_corsair_ram_overclocking/#.WG3OoxvhCUk
"I set the Core i7-7700K to run at 4.5GHz and I used the XMP settings on the RAM which sets the clock to 3600MHz and the timings at 18-19-19-39-2T. The system booted right up and to the desktop without issue. This is not always what we have seen with Skylake processors in the past. In fact, Skylake gets downright tricky to tweak properly to get highly overclocked RAM settings with above 4.2GHz or so in mine and Dan's experience"
Intel Kaby Lake Core i7-7700K Overclocking Preview
http://www.hardocp.com/article/2016...y_lake_corsair_ram_overclocking/#.WG3OoxvhCUk
"Bad" Kaby Lake CPUs are looking to do 4.8GHz at less than 1.3v vCore. "Good" Kaby Lake CPUs are looking to do the magical 5GHz at 1.35v or less. 5GHz looks to be where these hit the wall however. The "golden" Kaby Lake will deliver 5.1GHz, but at a high 1.37v vCore" -
7600K and 7700K both dropped today. My local Microcenter has 25+ in stock.
-
I ordered one on Amazon last night. I should get it tomorrow.
Anyone feel free to hit me up if they're after a delidded 6700k. -
Kaby Lake overclocking stats are up at SiliconLottery.
- 100% of 7700Ks can reach 4.8 GHz
- 91% can reach 4.9 GHz
- 62% can reach 5.0 GHz
- 24% can reach 5.1 GHz
- 4% can reach 5.2 GHz
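Since those are cumulative pass rates, the share of chips that top out at each clock falls out with simple subtraction (a small sketch over the numbers above):

```python
# Convert SiliconLottery's cumulative pass rates (quoted above) into the share
# of 7700K chips that top out at each frequency bin.
cumulative = {4.8: 100, 4.9: 91, 5.0: 62, 5.1: 24, 5.2: 4}  # % reaching each clock

bins = sorted(cumulative)                     # [4.8, 4.9, 5.0, 5.1, 5.2]
for lo, hi in zip(bins, bins[1:] + [None]):
    share = cumulative[lo] - (cumulative[hi] if hi else 0)
    label = f"top out at {lo} GHz" if hi else f"do {lo} GHz or better"
    print(f"{share:>3}% {label}")
# 9% top out at 4.8, 29% at 4.9, 38% at 5.0, 20% at 5.1, and 4% do 5.2 or better
```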
-
Or
About 0.048 V lower Vcore at 49x than previous SL 6700K chips. And don't forget that this is on a desktop MB...
49x @1.392V - VCORE (Or less)
50x @1.408V - VCORE (Or less)
51x @1.424V - VCORE (Or less)
52x @1.440V - VCORE (Or less)
Passed the ROG RealBench stress test for one hour!! -
Really waiting/hoping to see a Kaby Lake CPU-powered NB DTR run Win7...
-
From the comment section. Probably on a desktop.
http://www.tomshardware.com/reviews...00k-i5-7600,4870.html#comment-rsp_en_19098433 -
Considering my RAM started running at 4000 with the most recent BIOS update, AND Kaby seems to work (for the parts that matter) on Windows 7 & 8.1, I may have to get a good binned chip...
Sent from my SM-G900P using Tapatalk -
-
Nvidia quietly opens 4K Netflix streaming on GeForce GTX 10-series graphics cards - PcWorld.com
" No Kaby Lake processor, no problem." But with limitations. As expected.Last edited: May 2, 2017hmscott likes this. -
Even worse, it requires Windows 10. Pfftt!! Pass...
With the YouTube / Netflix throttling, I've frequently needed to drop down to 480p lately...
4K HA!!
I've been telling people for years to focus on optical BDR for 4K, as that doesn't require high-speed internet and doesn't eat into data cap limits.
There just isn't enough infrastructure to keep everyone streaming 4K yet, maybe not for many people at all, what with the strange speed throttling I've been seeing recently (again).
And people are talking about streaming 8K, HA! Nutz, we can't even do 1080p60 reliably. -
That's why I put in... "But with limitations"
SCREW THEM ALL bruh!!
-
Yeah, those aren't just limitations, it's a joke; it's not happening for a long time yet.
We've all been waiting for 4K streaming for years now, and it's not looking any better when I can't even default to 1080p60 reliably and frequently need to drop down to 720p/480p.
I'd rather YouTube channels offered 1080p30 as well as 1080p60, as I'm not having good luck on 150 Mbps / 12 Mbps service, which should be a slam dunk for 1080p60.
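Just to put numbers on why that should be a slam dunk (a rough sketch; the 6-12 Mbps stream bitrate range is only a ballpark assumption for 1080p60, and the line speed is the one quoted above):

```python
# Rough headroom check for 1080p60 streaming on the connection quoted above.
# The 6-12 Mbps playback bitrate range is an assumed ballpark, not an official figure.

downstream_mbps = 150                 # advertised downstream speed from the post
assumed_bitrates = (6, 12)            # assumed 1080p60 stream bitrates in Mbps

for bitrate in assumed_bitrates:
    headroom = downstream_mbps / bitrate
    print(f"{bitrate} Mbps stream on a {downstream_mbps} Mbps line: "
          f"~{headroom:.0f}x headroom")
# ~25x and ~12x headroom, so the bottleneck is somewhere upstream, not the last mile.
```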
I hope Sony et al. make a bigger push for 4K BDR, otherwise a lot of those productions that release in 4K HDR aren't going to have any outlet. -
Blind test: how easily can you tell the difference between 4K and Full HD? I know it was done on a bigger TV, but still... With my old eyes... I do not care
+ my sucky download speed
-
Maybe not so relevant, but I'll put it here: Gamers Report Core i7-7700(K) Temperature Spikes
-
Support.2@XOTIC PC Company Representative
Haven't had a lot of users report it, so no direct contact with the issue yet. The article didn't really say how widespread the problem was; are we talking a high number of users?
KABY LAKE "i7-7700K" FINDINGS. Not for FCBGA aka BGA
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Papusan, Dec 9, 2016.