The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    KABY LAKE "i7-7700K" FINDINGS. Not for FCBGA aka BGA

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Papusan, Dec 9, 2016.

  1. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,624
    Trophy Points:
    931
I would gladly give up the famed Netflix 4K decoding if I could avoid the iGPU completely. Processors more powerful than BGA shouldn't have this garbage that just takes up space on the silicon which could rather be used for the cores. As you say, you assume Nvidia will provide these features as well. And Intel is coming with the Kaby Lake-X HEDT processors. Many people with i7 quad-cores will jump on that train. Those will definitely not be squeezed out of new features like this.
     
  2. jclausius

    jclausius Notebook Virtuoso

    Reputations:
    6,160
    Messages:
    3,265
    Likes Received:
    2,573
    Trophy Points:
    231
    Well, I guess for the 1,300 people that actually use Edge and will bother with a Kaby Lake CPU, they will now have something they can do. ;)
     
    Last edited: Dec 20, 2016
    alexhawker, jaybee83, Papusan and 2 others like this.
  3. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
Considering I gut Edge, the Windows Store and apps out of Windows 10, this is just marketing speak for something irrelevant. I may upgrade to that chip to have one that operates at over 5 GHz, but not because of this. Before I do, I'll run comprehensive benches comparing 7, 8.1, and 10 so that all the benefits are known for improvements that matter. My 4K TV does the 4K Netflix, not my computer, which is used for computing.

Meanwhile, my gutted Win 10 gets quite close to a stock, un-optimized Win 7 SP1. I'll need to do something about Win 7 in the coming weeks...



    Sent from my SM-G900P using Tapatalk
     
    jaybee83, Papusan and hmscott like this.
  4. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
That's the best point: we aren't limited to Win10 / Kaby Lake at all, but the MS / Intel / Netflix propaganda harps on how Kaby Lake / Edge are *required*.
     
    Last edited: Dec 20, 2016
  5. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,238
    Trophy Points:
    231
That applies to 16:9 laptops as well, and I've said it quite a few times before :) Of course on a desktop you can go as wide as you can possibly want (given that you can bring it through the door, or have such a desk, or if someone actually manufactures such a display), and I'm actually eyeing LG's 38UC99-W, but on laptops I think that 16:10 provides the perfect blend of portability and usefulness.
     
    ajc9988 and hmscott like this.
  6. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
That looks like a very nice display. Point taken. But, if you ever have considered TVs as displays, a good time to buy is a week or two before the Super Bowl! (They are clearing out last year's inventory.) Also, rtings.com does good ratings. For displays, you kind of have to search, and that display looks good, but I'd want to check a couple of things first (for the size, the initial look at the specs is good, but I'm just waking up)...

Two points: 1) it only covers sRGB, not an expanded gamut. Not the worst thing, considering so little is authored beyond sRGB that you would have it set to that anyway for almost everything. 2) the resolution is 3840x1600, which is definitely wide, but a little off if you want to watch something at 2160 lines, regardless of 3840 or 4096. A lesser concern is that it doesn't say which HDCP is supported, whether 1.x or 2.2. Just some thoughts, hence why I brought up TVs for that size (although you want to guarantee at least 4:4:4 if going the TV route, and barely any incorporate DisplayPort)...
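To make the 2160-line point concrete, here is a minimal sketch of the scaling arithmetic, assuming ordinary fit-to-height scaling with black bars; the panel and source dimensions are the only inputs, nothing is measured from the actual monitor:

```python
# Hypothetical fit of 16:9 UHD (3840x2160) content onto the 3840x1600 panel:
# the picture is scaled to the panel height and the leftover width goes unused.
panel_w, panel_h = 3840, 1600
content_w, content_h = 3840, 2160   # 16:9 UHD source

scale = min(panel_w / content_w, panel_h / content_h)
shown_w, shown_h = round(content_w * scale), round(content_h * scale)

print(f"scale factor : {scale:.3f}")
print(f"picture size : {shown_w}x{shown_h}")
print(f"unused width : {panel_w - shown_w} px (~{(panel_w - shown_w) / panel_w:.0%} of the panel)")
# -> roughly 2844x1600 of picture with ~996 px of black bars, i.e. about a
#    quarter of the panel width sits idle when watching plain 16:9 video.
```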

    Sent from my SM-G900P using Tapatalk
     
    Last edited: Dec 21, 2016
  7. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,238
    Trophy Points:
    231
Well aware of both - the gamut and the movie/TV show watching. It would be used for multitasking and gaming. Color-critical work and movies would be for either my DreamColor or the FW900 :)
     
    ajc9988 likes this.
  8. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    TomJGX and jaybee83 like this.
  9. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,624
    Trophy Points:
    931
What a JOKE :D And people buy this? :eek: People are crazy nowadays!! I really don't know what to call this soldered trashware. But if you absolutely want/must test this thing, run the same test from http://www.notebookcheck.net/Intel-Core-i7-7500U-Notebook-Processor.172205.0.html
     
  10. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    TomJGX likes this.
  11. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,238
    Trophy Points:
    231
Something we can compare with the previous CPUs. It would be interesting to see performance @12W and @15/18W as well, i.e. how much is lost/gained with only a couple of watts.
     
  12. Mobius 1

    Mobius 1 Notebook Nobel Laureate

    Reputations:
    3,447
    Messages:
    9,069
    Likes Received:
    6,376
    Trophy Points:
    681
    I don't have another ULV to compare to :(
     
  13. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,624
    Trophy Points:
    931
    You could look into Hwbot to see if you can compete with cpu scores from there. Or ask in TS guide.
     
    jaybee83 and ajc9988 like this.
  14. jmonroe0914

    jmonroe0914 Notebook Guru

    Reputations:
    2
    Messages:
    54
    Likes Received:
    6
    Trophy Points:
    16
    Perhaps I'm missing something here...

4K DRM being built into a CPU is a plus for consumers, not a hindrance or disadvantage, much the same as Verity on Android devices is a plus, not a disadvantage. This isn't new per se, as similar DRM is what prevents 1:1 copying of a DVD or Blu-ray without one of the many available decryption programs (AnyDVD [now Red Fox], DVD43, etc.). 4K DRM is also why one requires HDMI 2.0 and HDCP 2.2 to watch 4K.

    This is no different than HDMI 1.4 and the HDCP standard on set-top boxes that prevents one from connecting a consumer DVR (Hauppauge, for example) to their DirecTV or Comcast box to record digital TV content. Are there issues with DRM in certain instances? Absolutely (digital music purchases, for example); however, 4K video content is not in the same boat. DRM, for all intents and purposes, is a form of copyright protection to prevent unauthorized usage by individuals not licensed for such usage. When we "purchase" media content, be it a video game, movie, music, etc., we're not purchasing the item itself but a license to utilize the content we purchased, regardless of whether it's in digital or physical form.
    • For example, if you were one of the hundreds of people who worked on, and financed, a film to bring it to market, would you be okay with consumers being able to receive the movie, show, documentary, etc. for free? If one answers yes, then that individual needs to ask themselves exactly how long they believe the TV & film industry would last if that were okay and allowed.
      • Everyone loves movies and TV shows, and while the genres will differ from person to person, we all like movies & TV shows and want the industry to still be there for us to enjoy.
     
    Last edited: Dec 26, 2016
  15. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
The use only comes in when you're running off the onboard GPU. Do you drive your display from the iGPU here, or from your dGPU?

    Sent from my SM-G900P using Tapatalk
     
  16. jmonroe0914

    jmonroe0914 Notebook Guru

    Reputations:
    2
    Messages:
    54
    Likes Received:
    6
    Trophy Points:
    16
    Why would anyone be using a GPU to watch 4K video... there's no discernible gain from doing so over the integrated graphics on the CPU. For gaming and video editing, absolutely... for simply watching a 4K blu ray or 4K streaming, nada.
    • Remember, the 4K DRM the CPU processes is only for copyrighted 4K content
     
  17. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
On desktops (and this is a desktop chip), do you regularly go around to the back of your machine, switch your cables from the dedicated GPU over to the motherboard plug, and turn on the disabled iGPU just so you can watch it from the iGPU instead? If you do, you are the rare case!

    Sent from my SM-G900P using Tapatalk
     
    alexhawker likes this.
  18. jmonroe0914

    jmonroe0914 Notebook Guru

    Reputations:
    2
    Messages:
    54
    Likes Received:
    6
    Trophy Points:
    16
Per my signature, I don't have a desktop... While I understand the point you're trying to make, it's not altogether clear what you and others are taking issue with.
    • If one is planning on watching 4K content through a PC but is not using the integrated graphics on the CPU, why would anyone then expect the graphics DRM portion of the CPU to process 4K DRM when the integrated graphics isn't being utilized? Wouldn't one's issue be with the manufacturer(s) of the individual's GPU?
    I've had all three versions of the Alienware 18, each with either Nvidia or AMD in an SLI configuration, and when I wasn't using the PC for gaming, I always switched back to integrated graphics, as there's no advantage, and several disadvantages, to utilizing the GPU when not gaming or video editing. Granted, the same HDMI ports were used for both the integrated graphics and the GPU, and while I do understand the point you're trying to make regarding the possible inconvenience of manually switching ports, there are a few different ways it could be managed so it's not an inconvenience:
    • One could simply add a second HDMI cable that always stays plugged into the integrated graphics HDMI port and is only switched out on the monitor/TV if it doesn't have a second HDMI port.
    • One could utilize a DisplayPort cable from the motherboard to the display, or an HDMI-to-DisplayPort cable.
    • One could use an HDMI 2x1 or 4x1 duplicator (such as MonoPrice sells), which requires no manual intervention since there would only ever be one video source going in and coming out.
    As for rebooting in order to switch graphics, that shouldn't be a big deal, since anyone running a 4K PC setup is likely to have either a SATA SSD or a PCIe SSD (via either PCIe or M.2), which would result in a 10 - 15s boot time, or 20 - 30s from rebooting to logging back in.
     
    Last edited: Dec 31, 2016
  19. win32asmguy

    win32asmguy Moderator Moderator

    Reputations:
    1,012
    Messages:
    2,844
    Likes Received:
    1,699
    Trophy Points:
    181
    Posts have been removed which had personal attacks or expletives in them or quoted them. Please keep the discussion on topic or we will have to close this thread.
     
  20. keftih

    keftih Notebook Evangelist

    Reputations:
    87
    Messages:
    539
    Likes Received:
    501
    Trophy Points:
    106
    More reviews:

    1. Anandtech - "In most of our benchmarks, the results are clear: a stock Core i7-7700K beat our overclocked Core i7-4790K in practically every CPU-based test (Our GPU tests showed little change). When overclocked, the i7-7700K just pushed out a bigger lead for only a few more watts. Technically one could argue that because this part and the i7-6700K are equal in IPC, a similar overclock with the i7-6700K achieves the same performance. But the crucial matter here is how lucky a user is with the silicon lottery – based on our testing, the Core i7-7700K CPUs tend to overclock rather nicely (although +300 MHz isn’t that much in the grand scheme of things)."
    2. Arstechnica - "As it stands, what we have with Kaby Lake desktop is effectively Sandy Bridge polished to within an inch of its life, a once-groundbreaking CPU architecture hacked, and tweaked, and mangled into ever-smaller manufacturing processes and power envelopes. Where the next major leap in desktop computing power comes from is still up for debate—but if Kaby Lake is any indication, it won't be coming from Intel."
    3. Techpowerup - "Given these performance figures, it's hard to recommend an upgrade to the Core i7-7700K from the i7-6700K. There are no IPC gains to be had, overclocking isn't that much better, and today's graphics cards don't significantly benefit from the faster processor. If you want to spend upgrade money for gaming, then the better investment is a faster graphics card. However, if you are buying new, then there is no reason to buy the i7-6700K, unless you can find it used at a significant discount."
    4. Tomshardware - "To its credit, Kaby Lake brings the mythical 5 GHz threshold into play for overclockers. But beyond the clock rate boost (roughly 200-300 MHz over Skylake overclocks), there is little reason for enthusiasts armed with Skylake CPUs to upgrade. If you already have a modern processor, spend those dollars on a new GPU or SSD. Kaby Lake is really only an option for power users building new PCs."
    Results are exactly what we expected from the recent leaks. There are virtually no IPC gains (within margin of error), and little reason to upgrade unless jumping up from a lower tier CPU (e.g. i5 -> i7).

    Tom's Hardware review in particular is quite in-depth, and well done. I think it's interesting that the retail CPU samples received by the U.S. and Germany branches have a 15 W difference in power consumption, with the reviewer suggesting that binning will be very important for Kaby Lake. Their review also covers the lower-end, "non-K" i5-7600 and i7-7700, which I appreciate.
     
  21. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    IPC gains? Yawn.

    What are the overall platform gains? Significant? You bet.

    Especially as time moves forward...

    Of course anyone having a current platform won't gain a lot from the latest offerings... (duh...).

But there are people that will actually make back their $$$$ in a few weeks for a 1% or 2% gain - in other words, it's up to the buyer to purchase a system/platform/components that maximize their benefits while minimizing their costs (over the lifecycle of the system/platform/etc. in question...).

That doesn't make the current offerings any less valuable (Arstechnica is out to lunch with their 'conclusion'...) - it just puts the decision of whether to buy something or not in the hands where it belongs: the buyer and the workflows/workloads involved (and that has never changed, of course).
     
    jaybee83, Papusan and bloodhawk like this.
  22. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    Well this is a strange perspective to look at it imo. I think it frames it a little wrong.

You're kind of justifying the lackluster improvement here. Sure, niche things like 4K 360 video saw large improvements, and lowering power consumption is always good, but you're framing this from the perspective of some businesses and power users that use these chips for huge workloads where a 1% difference is a big deal. I would describe that market more as a Xeon or HEDT market myself, but I also think it makes it sound more okay than it is.

    When you look into the past at previous trends these gains are pitiful: http://www.extremetech.com/wp-content/uploads/2012/02/CPU-Scaling.jpg

I'm not saying I expect Moore's Law to go on forever, but it's a sharp decline from when 5 years took us from 1,000 MHz to 10,000 MHz (10x faster) (93-97ish), while now we go from 4 GHz in 2011 to 5-ish tops in 2017 (1.25x faster).
    It also means real world users will see almost no difference, especially for gaming which is a pretty important part of this forum.

    The cpu market used to have huge performance gains chip to chip, and the gpu market still does. I guess I just disagree with you saying this is a significant step forward. Mobile Pascal was a significant step forward, not this.
     
    Last edited: Jan 3, 2017
    keftih likes this.
  23. keftih

    keftih Notebook Evangelist

    Reputations:
    87
    Messages:
    539
    Likes Received:
    501
    Trophy Points:
    106
Sure, there are other factors which define a CPU, and Kaby Lake and Z270 have improved upon some of them. However, servers and workstations are more likely to use server-grade chips, not the i3/i5/i7 line-up. That is because these CPUs are supposed to target mainstream consumers, and from a mainstream consumer's point of view, these chips are quite disappointing.

    We are not denying that Kaby Lake has made any advances whatsoever; rather, the point is that the vast majority of the mainstream market has little to benefit from this "optimization" process. This is in contrast to past trends, where at least some appreciable performance gains could be expected. I worry that Intel is shifting its focus away from mainstream consumers in favor of other markets... or perhaps this has already happened :oops:
     
    Galm likes this.
  24. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,624
    Trophy Points:
    931
    Last edited: Jan 4, 2017
    jaybee83 and Ashtrix like this.
  25. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
  26. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,624
    Trophy Points:
    931
Don't look at the numbers from Task Manager. Windoze can't show the clock speed correctly. The Redmond Morons can't do a thing properly :D
     
    jaybee83 and hmscott like this.
  27. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    I was looking at CPU-Z as well but meh whatever. Still fast.
     
    Papusan and hmscott like this.
  28. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,624
    Trophy Points:
    931
    More info about 7700K !!
    "So while you game you can run at 5GHz, and when you want to run IntelBurnTest you can set a -200Mhz offset and the CPU frequency will drop to 4.8GHz for IntelBurnTest. The next thing Intel added is BCLK award adaptive voltage/frequency curve, and that should help simplify overclocking voltage levels"


    Read more: http://www.tweaktown.com/reviews/7995/intel-kaby-lake-7700k-cpu-review/index11.html
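For anyone unfamiliar with how that "-200MHz offset" works, here is a minimal sketch of the arithmetic, assuming a stock 100 MHz BCLK and a 50x multiplier (illustrative values only, not taken from the review):

```python
# Hypothetical AVX-offset arithmetic: the offset is applied in multiplier
# bins, so a -2 bin offset at 100 MHz BCLK is the quoted "-200MHz".
bclk_mhz = 100          # base clock (assumed stock)
core_multiplier = 50    # 50 x 100 MHz = 5.0 GHz for normal loads
avx_offset_bins = 2     # the "-200MHz offset" from the quote

normal_mhz = bclk_mhz * core_multiplier                    # gaming / non-AVX load
avx_mhz = bclk_mhz * (core_multiplier - avx_offset_bins)   # AVX load, e.g. IntelBurnTest

print(f"non-AVX load: {normal_mhz / 1000:.1f} GHz")   # 5.0 GHz
print(f"AVX load:     {avx_mhz / 1000:.1f} GHz")      # 4.8 GHz
```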
     
    hmscott likes this.
  29. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    @Papusan knows this one, bump your BCLK a tick or two in the right direction to hit an even frequency. :)

    Unless we can't do that anymore - auto-mis-adjusting would suck...
     
    Last edited: Jan 4, 2017
    Papusan likes this.
  30. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,624
    Trophy Points:
    931
Guaranteed a Windoze Task Manager bug. There is a lot about this topic on the internet. Even with a small increase in BCLK I am reasonably sure that Windoze Task Manager cannot show the correct clock speed. The Redmond Morons kill/destroy everything they touch!!

    https://linustechtips.com/main/topic/708281-task-manager-is-showing-wrong-clock-speed/
    http://superuser.com/questions/510188/windows-8-task-manager-not-reporting-actual-cpu-frequency
    https://www.neowin.net/forum/topic/1111079-task-manager-reporting-wrong-cpu-frequency/
    https://answers.microsoft.com/en-us...r/acfb0567-1bf7-4b4a-ab96-ab9de3efaec2?page=1
    https://www.google.no/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=Task+Manager+windows+10+don't+shows+correct+clockspeed&tbs=qdr:y&start=30
    http://forum.notebookreview.com/thr...0-owners-lounge.707507/page-284#post-10077112
    http://forum.notebookreview.com/thr...0-owners-lounge.707507/page-283#post-10071997
[IMG]

    [IMG]
Perhaps @D2 Ultima and @Mr. Fox can take part in the topic and highlight it more?
     
    Last edited: Jan 4, 2017
    hmscott likes this.
  31. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
Windows Task Manager is clearly drunk. Constantly drunk.
     
    Ashtrix and Papusan like this.
  32. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,625
    Trophy Points:
    931
YEEES!! And the Redmond Morons as well. See the edited image in the previous post.
    I put in...
    4.9 GHz shows as 4.84 GHz (-0.06 GHz vs. the real clock speed) in Windows Task Manager. If I clock down to 4.8 GHz, Task Manager will show 4.76 GHz (-0.04 GHz vs. the real clock speed). Task Manager is a big joke!! And people don't see it :confused:
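One explanation that comes up in the threads linked above (hedged; it is not confirmed here against Microsoft documentation) is that Task Manager derives its "Speed" figure by scaling the CPU's registered base frequency with the "% Processor Performance" counter rather than reading the effective clock directly. A rough sketch with made-up numbers that land near the readings reported in this post:

```python
# Hedged sketch of one explanation for the mismatch: Task Manager is said to
# compute Speed = (registry base MHz) x (% Processor Performance) / 100,
# instead of reading multiplier x BCLK directly. All numbers are illustrative.

base_mhz = 4200                 # base frequency Windows records for a 7700K
processor_performance = 115.3   # sample of the "% Processor Performance" counter

task_manager_style = base_mhz * processor_performance / 100
print(f"Task-Manager-style reading: {task_manager_style / 1000:.2f} GHz")    # ~4.84 GHz

# Tools like CPU-Z / HWiNFO / ThrottleStop read the multiplier and BCLK instead:
bclk_mhz, multiplier = 100.0, 49
print(f"Multiplier x BCLK:          {bclk_mhz * multiplier / 1000:.2f} GHz")  # 4.90 GHz
```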
     
    Last edited: Jan 4, 2017
    ajc9988, hmscott, Ashtrix and 2 others like this.
  33. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
Like anything in life, I take 'information' presented as an indicator - not an absolute. Even from Redmond Morons (I wish I could be half as dumb as them sometimes...). :)

    Really, does anybody know what time it is? ;)

    We're all just making our best guesses here... Enjoy!

     
    TomJGX, Papusan and hmscott like this.
  34. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    No, not a strange perspective at all. Pragmatic.

    What a manufacturer offers is a given. If a person cannot make it work for them, better than what they had/have 'now', the decision is obvious: don't buy.

My point is that Intel is chasing the improvements, right now, that will position the company (and I daresay computing in general...) in the best light when all the other secret and not-so-secret improvements come together to bring an overall increased benefit for the 'average' user. It's a cat and mouse game (because they can't do it all at once...) - I believe that Intel is playing the best hand they have at any given time - after all, they're not an alien race designing these chips... they're just like you and me too.

    I just recently bought a $50 tablet on a whim that can be on standby for a week, runs Windows 10, charges in just over two hours (when 'off'), sips power when it is actually in use and I can hold all day without getting tired in the least... If people can't see the progress this is hinting at in the very near future... I can't say much more...

    These are not lackluster improvements happening. These are improvements we were promised 20 years ago and I welcome them with open arms.

    Even if the very small 'extreme' gaming community (here and elsewhere) doesn't see it that way.


    To state this another way;

If I was a gamer, I would be doing things much like I am now. I would compare a full/complete new platform to the current platform I was using. FPS 'scores' (min, max and avg) would not be the defining indicator of whether a new contender was worthy or not. Neither would price (solely).

    It would be the balance/sum total of the value I could get from selling/donating my old setup deducted from the hard cost of the new platform and all the (many) benefits it offers.

Staying still is very much like dying. We'll all reach that goal sooner or later and I'm not afraid of passing on. But making a conscious decision to hold myself back in the past is not for me. If I can (i.e. my wallet allows me...), I will get/borrow/use/buy the next best thing. Not because I'm (just) addicted to upgrading, but because the cost of staying with the old/familiar/'good enough' is too great. It eats into my time, and that is the only thing none of us can create more of. No matter how much we want to.

And even if a new platform only saves me a few dollars (over multiple workstations) of electricity a year - it may still be worth upgrading (all else being equal) - because not only is time money, but money is time (saved) too, if spent wisely. We still can't create it. But where we can trade for it wisely, we should.

    To put an even finer point on things; don't let your imagination be guided by what manufacturers/editors/bloggers market/push/sell... Have an open mind towards whatever you're testing and twist/bend/shape it as best you can to fit your version of the world. When you can do that without breaking the thing - then you're at the bleeding edge: yours. :)

     
    bloodhawk likes this.
  35. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,625
    Trophy Points:
    931
    Core i7-7700K - Kaby Lake & Corsair RAM Overclocking
    http://www.hardocp.com/article/2016...y_lake_corsair_ram_overclocking/#.WG3OoxvhCUk

    "I set the Core i7-7700K to run at 4.5GHz and I used the XMP settings on the RAM which sets the clock to 3600MHz and the timings at 18-19-19-39-2T. The system booted right up and to the desktop without issue. This is not always what we have seen with Skylake processors in the past. In fact, Skylake gets downright tricky to tweak properly to get highly overclocked RAM settings with above 4.2GHz or so in mine and Dan's experience"


    Intel Kaby Lake Core i7-7700K Overclocking Preview
    http://www.hardocp.com/article/2016...y_lake_corsair_ram_overclocking/#.WG3OoxvhCUk

    "Bad" Kaby Lake CPUs are looking to do 4.8GHz at less than 1.3v vCore. "Good" Kaby Lake CPUs are looking to do the magical 5GHz at 1.35v or less. 5GHz looks to be where these hit the wall however. The "golden" Kaby Lake will deliver 5.1GHz, but at a high 1.37v vCore"
     
    TomJGX, jaybee83, Ashtrix and 2 others like this.
  36. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331
    7600K and 7700K both dropped today. My local Microcenter has 25+ in stock.
     
    TomJGX likes this.
  37. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    I ordered one on Amazon last night. I should get it tomorrow.
    Anyone feel free to hit me up if they're after a delidded 6700k.
     
  38. keftih

    keftih Notebook Evangelist

    Reputations:
    87
    Messages:
    539
    Likes Received:
    501
    Trophy Points:
    106
Kaby Lake overclocking stats are up at SiliconLottery (a quick breakdown of what the cumulative figures imply is sketched after the list).
    • 100% of 7700Ks can reach 4.8 GHz
    • 91% can reach 4.9 GHz
    • 62% can reach 5.0 GHz
    • 24% can reach 5.1 GHz
    • 4% can reach 5.2 GHz
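Assuming those percentages are cumulative (each figure includes the chips that also reach the higher bins, which is how such binning stats are usually reported), the share of chips that top out at each frequency is just the difference between adjacent figures:

```python
# Assumption: the SiliconLottery figures are cumulative, so the share of chips
# topping out at a given bin is the difference between neighbouring figures.
cumulative = {4.8: 100, 4.9: 91, 5.0: 62, 5.1: 24, 5.2: 4}

freqs = sorted(cumulative)
for i, f in enumerate(freqs):
    higher = cumulative[freqs[i + 1]] if i + 1 < len(freqs) else 0
    print(f"tops out at {f:.1f} GHz: {cumulative[f] - higher}%")
# -> 4.8: 9%, 4.9: 29%, 5.0: 38%, 5.1: 20%, 5.2: 4%  (the median chip lands at 5.0 GHz)
```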
     
    TomJGX, triturbo and hmscott like this.
  39. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,625
    Trophy Points:
    931
Or :rolleyes: About 0.048V lower Vcore at 49x than previous SL 6700K chips. And don't forget that this is on a desktop MB...
    49x @1.392V - VCORE (Or less)
    50x @1.408V - VCORE (Or less)
    51x @1.424V - VCORE (Or less)
    52x @1.440V - VCORE (Or less)
    Passed the ROG RealBench stress test for one hour!!
     
    Last edited: Jan 7, 2017
  40. Ashtrix

    Ashtrix ψυχή υπεροχή

    Reputations:
    2,376
    Messages:
    2,080
    Likes Received:
    3,275
    Trophy Points:
    281
    Really waiting/hoping to see a Kabylake CPU powered NB DTR run Win7...
     
    hmscott and Papusan like this.
  41. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,625
    Trophy Points:
    931
  42. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,750
    Messages:
    6,121
    Likes Received:
    8,849
    Trophy Points:
    681
Considering my RAM started running at 4000 with the most recent BIOS update AND Kaby seems to work (for the parts that matter) on Windows 7 & 8.1, I may have to get a good binned chip...

    Sent from my SM-G900P using Tapatalk
     
  43. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,625
    Trophy Points:
    931
  44. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,625
    Trophy Points:
    931
    Nvidia quietly opens 4K Netflix streaming on GeForce GTX 10-series graphics cards - PcWorld.com

    " No Kaby Lake processor, no problem." But with limitations. As expected.
     
    Last edited: May 2, 2017
    hmscott likes this.
  45. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Even worse, it requires Windows 10, Pfftt!! Pass...

    With the Youtube / Netflix throttling, I've needed to drop down to 480p frequently recently...

    4K HA!!

    I've been telling people for years to focus on Optical BDR for 4k, as that doesn't require high speed internet, and doesn't eat into data cap limits.

There just isn't enough infrastructure to keep everyone streaming 4k yet, and maybe not for many at all, what with the strange speed throttling I've been seeing recently (again).

    And people are talking about streaming 8k, HA! Nutz, we can't even do 1080p60 reliably.
     
  46. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,625
    Trophy Points:
    931
    That's why I put in... "But with limitations" :vbthumbsup: SCREW THEM ALL bruh!!
     
    Ashtrix and hmscott like this.
  47. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Yeah, those aren't just limitations, it's a joke, it's not happening for a long time still.

    We've all been waiting for 4k streaming for years now, and it's not looking any better when I can't even default to 1080p60 reliably and frequently need to drop down to 720p/480.

I'd rather YouTube channels offered both 1080p30 and 1080p60, as I'm not having good luck on 150Mbps/12Mbps service, which should be a slam dunk for 1080p60.

    I hope Sony et al make a bigger push for 4k BDR otherwise a lot of those productions that release 4k HDR aren't going to have any outlet.
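As a rough sanity check on those numbers, here is a sketch using commonly cited Netflix bandwidth recommendations (roughly 5 Mbps for 1080p and 25 Mbps for 4K; treat these, and the 150 Mbps nominal line speed from the post above, as ballpark assumptions):

```python
# Back-of-the-envelope check: on paper a 150 Mbps line carries many 1080p
# streams and several 4K streams; per-stream needs below are assumptions.
line_down_mbps = 150
per_stream_mbps = {"1080p": 5, "4K": 25}

for quality, mbps in per_stream_mbps.items():
    print(f"{quality}: ~{mbps} Mbps/stream -> {line_down_mbps // mbps} streams on paper")
# If 1080p60 still stutters on such a line, the bottleneck isn't the last mile;
# it sits somewhere upstream between the CDN and the subscriber.
```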
     
    Papusan likes this.
  48. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,625
    Trophy Points:
    931
Blind test: How easily do you see the difference between 4K and Full HD? I know it was done on a bigger TV, but still... With my old eyes... I do not care :vbbiggrin: + my sucky download speed :eek:
     
    hmscott, Ashtrix and jaug1337 like this.
  49. Papusan

    Papusan Jokebook's Sucks! Dont waste your $$$ on Filthy

    Reputations:
    42,708
    Messages:
    29,842
    Likes Received:
    59,625
    Trophy Points:
    931
    hmscott and Ashtrix like this.
  50. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
Haven't had a lot of users report it, so no direct contact with the issue yet. The article didn't really say how widespread the problem was; are we talking a high number of users?
     
    hmscott, Ashtrix and Papusan like this.