The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.
← Previous page | Next page →

    Razer Core discussion & benchmarks

    Discussion in 'Razer' started by MSGaldenzi, Mar 16, 2016.

  1. ETisME

    ETisME Notebook Guru

    Reputations:
    4
    Messages:
    70
    Likes Received:
    29
    Trophy Points:
    26
    Blizzard games are quite CPU intensive though.
    SC2 in particular is very heavy on the CPU, and Diablo 3 is pretty CPU intensive as well.

    And honestly, you might not need a top-class CPU, but you do need a reliably good CPU at minimum.
     
  2. mindinversion

    mindinversion Notebook Evangelist

    Reputations:
    64
    Messages:
    515
    Likes Received:
    83
    Trophy Points:
    41

    Well, you're talking about moving from a dual core with no turbo boost to a quad core overclockable part. I didn't bother to check L2 and L3 cache, but there may be a difference there as well. There are potentially HUGE architectural differences between these two processors.

    Yea. . . maxed out at 1080 on a 2014 Blade Pro, the average CPU usage is something like 14%. During intense fights it can spike up to around 24%, and when dumping cache it VERY OCCASIONALLY spikes for like half a second to 34%.

    As for SC2, it runs just fine at 1080-ish res on medium/high settings on the Surface Pro 4 and SB [SP4 I've read, SB w/ Nvidia I've personally tested]. The SB also has the 6500U, the same processor in the Blade Stealth. . . so you're not winning any arguments on that one.

    And for the record, consoles =/= PC. The processors, architecture, and data handling are 100% different, and most of your advertised "cores" etc. are marketing gimmicks. E.g., a 780 Ti may have over a thousand "CUDA cores". . . does that mean it's a thousand-plus-"CPU" graphics card? How many of those console cores are virtual? How are the data paths. . . . Ahhh, why am I bothering...

    In all fairness, I don't play unoptimized console ports like Witcher 3, so I guess in those situations it's very possible they're soaking up CPU.. but that's someone else's problem ; )
     
    Last edited: Mar 27, 2016
  3. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    Current consoles and PCs are very alike. In fact, it's exactly the same architecture, except for the memory type on the PS4, which uses GDDR5 for system memory as well. SC2 and Diablo 3 are nothing in terms of CPU load. Like I said, The Division uses up to 70% on a quad-core 6700HQ laptop CPU, and 65% on an overclocked 2500K desktop CPU.
     
  4. mindinversion

    mindinversion Notebook Evangelist

    Reputations:
    64
    Messages:
    515
    Likes Received:
    83
    Trophy Points:
    41
    In that they both have a CPU and GPU and run programs, yes. How programmers interface, program, manage resource allocation and settings. . . no.
    It's like comparing a high-end gaming desktop to a Chromebook [both in hardware and performance]. . . if the Chromebook came with a power adapter the size of a small dog [seriously, look at the literal brick that is the 100W Xbox One power adapter... compare it to the 150W power adapter for the Blade. . . sheesh]

    If they were so alike, the state of PC game porting wouldn't be anywhere near the horrible mess that it is...
     
  5. TareX

    TareX Notebook Geek

    Reputations:
    0
    Messages:
    81
    Likes Received:
    11
    Trophy Points:
    16
    Is it true that Thunderbolt 3.0 is capped at PCIe x4, like the Alienware Amplifier? Here's a comment on Reddit:

    Link: https://www.reddit.com/r/oculus/com...se_a_vr_ready_laptop_it_passes_oculus/d1g3bw5
     
  6. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    You have seriously no clue what you are talking about. What does the power adapter have to do with this discussion? The PS4, on the other hand, doesn't have one and has an internal power supply. Your point?

    Still, there are no architectural differences; hell, the Xbox One even uses DirectX 12. The only difference is that developers can optimize for one setup and thus, for a large part, circumvent the lower specs. A large difference from the PS3 and X360, which used variants of the PowerPC architecture and were actually really different.

    I suspect you also have no development history. I actually do :)

    This is correct. There are simply no more lanes left on mobile Core i5/i7 CPUs and consumer desktop Core i5/i7s. But with current-generation cards it doesn't really affect performance. Only a few frames.
     
    Last edited: Mar 28, 2016
  7. cfortney

    cfortney Notebook Enthusiast

    Reputations:
    0
    Messages:
    10
    Likes Received:
    0
    Trophy Points:
    5
    Have there been any Thunderbolt 3 enclosures announced besides the Razer Core or the Acer 960M dock? I picked up an Acer V15 Nitro Black Edition; it's got Thunderbolt 3 over USB 3.1 Type-C and I really want to get a graphics dock for it! However, the Razer Core just seems grossly overpriced at $500. And I'm confused about the Asus ROG XG2 that they showcased at CES, and there haven't been any press releases since. One article on Tom's Hardware claims that you can use just one USB 3.1 Type-C port with it for graphics (the other for powering the laptop), and elsewhere I've read that it requires both.
     
  8. mindinversion

    mindinversion Notebook Evangelist

    Reputations:
    64
    Messages:
    515
    Likes Received:
    83
    Trophy Points:
    41
    Um. . . the AMD Jaguar processor having 8 cores while an Intel Core i5/i7 has 4 constitutes an "architectural difference". Both processors may be more or less functionally identical, but their architectural design and optimizations are VERY different and [as a developer I'm sure you're vastly more aware of these nuances than I am] are worlds apart in programming structure. If they weren't, software disasters like Arkham Knight [probably] wouldn't happen (and the complaints about horrible console port optimizations in the PC gaming world would be either drastically reduced or eliminated).
     
  9. MSGaldenzi

    MSGaldenzi Notebook Deity

    Reputations:
    109
    Messages:
    1,251
    Likes Received:
    85
    Trophy Points:
    66
    http://www.amd.com/en-us/innovations/software-technologies/technologies-gaming/xconnect

    Looks like AMD is stepping up to hot-swapping of GPUs. I've been using AMD on my Alienware Amplifier for a while now and absolutely love their software over Nvidia's. Much more polished than years ago, when it looked like it was straight out of DOS. I've been seriously debating selling my AW17R3 + Amp for a Stealth + Core, seeing as I don't game as much as I used to and really don't need the quad core as much as I had thought I did.
     
  10. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    If you have 4 cores and need 4 threads more, the OS you are using will automatically queue the other calls behind those first 4 threads, unless there are functions in place forcing work onto specific cores next to each other. When people talk about architecture, they talk about the nature of the processor, not the number of cores etc., but how they run. They only need to change the scheduling if a different number of cores is available.
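
    A rough way to see that queueing in action (a hypothetical sketch, not from this thread): launch twice as many CPU-bound workers as you have cores, and the second batch takes roughly twice the wall time, because the OS queues the excess work behind the running threads.

        # Hypothetical demo of OS-level queueing of CPU-bound work.
        # Assumes cpu_count() reflects the usable cores on the machine.
        import multiprocessing as mp
        import time

        def burn(n: int) -> int:
            # Simple CPU-bound busy loop.
            total = 0
            for i in range(n):
                total += i * i
            return total

        if __name__ == "__main__":
            cores = mp.cpu_count()
            with mp.Pool(cores) as pool:
                t0 = time.perf_counter()
                pool.map(burn, [10_000_000] * cores)      # one task per core
                t1 = time.perf_counter()
                pool.map(burn, [10_000_000] * cores * 2)  # twice as many tasks
                t2 = time.perf_counter()
            print(f"{cores} tasks on {cores} cores:  {t1 - t0:.2f}s")
            print(f"{2 * cores} tasks on {cores} cores: {t2 - t1:.2f}s (~2x)")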

    Arkham Knight was a disaster because they reused the texture streaming straight from the consoles, which have a far lighter load and better caching of the hard drive used.

    If you used an SSD, the game already ran much more fluidly and had fewer errors, because the engine didn't need to wait for some calls to be carried out and thus didn't time out.
     
  11. TareX

    TareX Notebook Geek

    Reputations:
    0
    Messages:
    81
    Likes Received:
    11
    Trophy Points:
    16
    What does that have to do with PCIe x4 vs x16? All desktop GPUs run on a PCIe x16 connection. Running current ones at x4 would produce about a 15-20% performance hit, and I imagine the gap would be a lot bigger when running a Pascal GPU off a PCIe x4 connection.
     
  12. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    x16 and x4 are the number of PCI Express lanes.

    Depending on the CPU used, you have a certain number of lanes. The Blade, Alienware AGA laptops, etc. don't have more than 4 lanes left to connect a graphics card to. Running a desktop GPU at x4 doesn't cause a 15% performance loss, by the way. The most I measured was 10%, and that is when the external GPU on the AGA is passing the signal back to the internal laptop screen.
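
    For reference, back-of-envelope numbers behind those lane counts (my own arithmetic, not from the thread): PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding, so x4 tops out near 4 GB/s.

        # Theoretical PCIe 3.0 bandwidth for x4 vs x16 (rough figures).
        GTRANSFERS_PER_SEC = 8e9      # PCIe 3.0: 8 GT/s per lane
        ENCODING = 128 / 130          # 128b/130b line encoding
        bytes_per_lane = GTRANSFERS_PER_SEC * ENCODING / 8

        for lanes in (4, 16):
            print(f"PCIe 3.0 x{lanes}: ~{lanes * bytes_per_lane / 1e9:.1f} GB/s")
        # x4 ~3.9 GB/s vs x16 ~15.8 GB/s; games rarely saturate even x4,
        # which is consistent with the ~10% worst case measured above.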
     
    Last edited: Mar 29, 2016
  13. TareX

    TareX Notebook Geek

    Reputations:
    0
    Messages:
    81
    Likes Received:
    11
    Trophy Points:
    16
    From Razer's forums:
    Source: https://insider.razerzone.com/index.php?threads/the-razer-blade-stealth-razer-core.10510/page-79

    So we're talking 10% from converting PCIe to TB + a reduction of 10% from using x4 PCIe instead of x16, and another 10% from CPU throttling, since the 6500U is considerably slower than your Alienware's... so a 30% drop. Honestly, not seeing Core benchmarks so close to launch and 3 months after announcing the product is quite shady. I cancelled my Razer and Core preorder because my guess is these numbers will get even worse with the next generation of Pascal GPUs. The whole idea of the GPU Amplifier and Core is to have a laptop that can last longer than a year, not one you'll regret next month when Pascal GPUs are out and severely underperforming as external GPUs.
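
    (As an aside, independent losses like these compound multiplicatively rather than add; a quick sketch using the three ~10% figures above:)

        # Compounding three independent ~10% losses (figures from the post above).
        losses = [0.10, 0.10, 0.10]  # TB3 conversion, x4 vs x16, CPU throttling
        remaining = 1.0
        for loss in losses:
            remaining *= 1.0 - loss
        print(f"Combined drop: {1.0 - remaining:.1%}")  # ~27.1%, not a flat 30%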
     
    Darkhan likes this.
  14. rinneh

    rinneh Notebook Prophet

    Reputations:
    854
    Messages:
    4,897
    Likes Received:
    2,191
    Trophy Points:
    231
    Hmm, benchmarks on the Alienware AGA show otherwise. Unless Dell implemented it better and TB3 is actually less ideal than Dell's proprietary standard.
     
  15. Game7a1

    Game7a1 ?

    Reputations:
    529
    Messages:
    3,159
    Likes Received:
    1,040
    Trophy Points:
    231
    [image: benchmark chart]
    I feel as if the CPU (overhead) bottleneck is present here. When I did my Graphics Amp analysis with my R1 (i7-5500U) in BioShock Infinite, on the highest settings sans anti-aliasing and something else (I forgot), I averaged 86.86 FPS with the GTX 960 and 80.96 FPS with the R9 285. From my desktop, I got 15 to 17 FPS more, i.e. a 15 to 17% drop from desktop to GA. Drivers may have improved since then, but that 76 FPS with the R9 Nano doesn't paint a very good picture unless it's at a higher resolution (I did my benches at 1080p).
    If it's not, this is why I've never recommended AMD GPUs for the 13. I know DX12 will subdue the problem, but I wonder by how much.
     
    Last edited: Mar 31, 2016
  16. kent1146

    kent1146 Notebook Prophet

    Reputations:
    2,354
    Messages:
    4,449
    Likes Received:
    476
    Trophy Points:
    151
    There is additional overhead to go through the Thunderbolt 3 controller. Dell's Alienware Graphics Amplifier goes straight to PCIe. Both solutions use PCIe 3.0 x4 as maximum bandwidth.


    I think you might be blowing things a bit out of proportion here.

    1) Those drops you mention (10% + 10% + 10% = 30%) apply in extreme cases where the computer is GPU-constrained by PCIe bandwidth AND CPU-constrained at the same time. You're not going to find that except in artificial synthetic benchmark scenarios.

    2) The real-world difference of slightly lower GPU performance isn't very noticeable in-game. A difference of ~20% lower GPU performance means slightly lowering some image quality settings. And when you're actively playing a game and focusing on the gameplay, you're not going to notice the difference between something like High / Ultra detail quality on grass. You need to specifically look for it in order to notice.

    How is it shady? A company announces a product and releases it 3 months later. Benchmarks come out when independent review sites get their hands on the final product and are able to run tests on it.

    What's the alternative? That Razer comes out and says "we got a 3dMark11 score of 11,542 when using laptop ABC and video card XYZ?" You'd have to take that information with a grain of salt, considering that it's the manufacturer reporting their own score. Somehow, I doubt that you'd be satisfied with a single benchmark score reported by the manufacturer, without independent review sites running their own tests.
     
  17. IKAS V

    IKAS V Notebook Prophet

    Reputations:
    1,073
    Messages:
    6,171
    Likes Received:
    535
    Trophy Points:
    281
    Definitely not getting a Stealth, but I'm interested in the Core if it works with other laptops not named Razer ;)
    Ironically, a Dell XPS 15 would be a nice match for the Core.
     
    soulvengeance likes this.
  18. BMM

    BMM Notebook Enthusiast

    Reputations:
    0
    Messages:
    19
    Likes Received:
    3
    Trophy Points:
    6
    Pre-ordered an Intel Skull Canyon NUC (i7-6770HQ + GT4e, 128MB eDRAM) as it's 100% compatible with the Razer Core. Planning on benchmarking both it and my Stealth + Core in games, with an MSI 980 Ti, once my Core arrives. Will post results.
     
    IKAS V and hmscott like this.
  19. Punchdrunk

    Punchdrunk Notebook Consultant

    Reputations:
    10
    Messages:
    119
    Likes Received:
    34
    Trophy Points:
    41
    I don't suppose you have any estimate for when your Core will be delivered?
     
  20. KillerFry

    KillerFry Notebook Consultant

    Reputations:
    22
    Messages:
    162
    Likes Received:
    25
    Trophy Points:
    41
    Mine has been delayed until May 23rd... ... crap...
     
  21. IKAS V

    IKAS V Notebook Prophet

    Reputations:
    1,073
    Messages:
    6,171
    Likes Received:
    535
    Trophy Points:
    281
    Really great alternative, but are you sure it's 100% compatible?
    Would love to see the results. When are you getting all your new hardware?
    Definitely looking forward to it.
     
  22. Eason

    Eason Notebook Virtuoso

    Reputations:
    271
    Messages:
    2,216
    Likes Received:
    892
    Trophy Points:
    131
    Sure, it's 100% compatible. It was demoed with it.
     
  23. IKAS V

    IKAS V Notebook Prophet

    Reputations:
    1,073
    Messages:
    6,171
    Likes Received:
    535
    Trophy Points:
    281
    Nice!
    Is there a link or video of the demo?
    Looking to get a laptop that supports eGPUs (XPS 15, RB, or Acer Nitro), but this might work for me too.
    How are you liking your XPS 15 so far, Eason? Better than your 15 RB?
     
  24. Eason

    Eason Notebook Virtuoso

    Reputations:
    271
    Messages:
    2,216
    Likes Received:
    892
    Trophy Points:
    131

    https://www.google.co.th/search?q=s...e.0.0j69i57.4191j0j7&sourceid=chrome&ie=UTF-8

    Really liking it except for the occasional hiccup. It's not that buggy now, but every once in a while something comes up that shouldn't happen. For example, I found that after 30 minutes of gaming my CPU clocks down and I need to use ThrottleStop to force max clocks. Random crap like that.
     
  25. IKAS V

    IKAS V Notebook Prophet

    Reputations:
    1,073
    Messages:
    6,171
    Likes Received:
    535
    Trophy Points:
    281
    Thermal throttling?
    I thought the XPS had good cooling.
    Thanks for the link.
    :)
     
  26. Eason

    Eason Notebook Virtuoso

    Reputations:
    271
    Messages:
    2,216
    Likes Received:
    892
    Trophy Points:
    131
    No, it isn't thermal throttling, or BD PROCHOT. It seems to be firmware-initiated, but luckily TS stops it. The thermals on the XPS 15 are great, definitely a few steps up from the Blade.
     
  27. GoodToGo

    GoodToGo Notebook Consultant

    Reputations:
    58
    Messages:
    294
    Likes Received:
    0
    Trophy Points:
    30
    Right now, there are no reviews whatsoever for the Razer Core. I would love to see people use it with non-Razer laptops like the XPS 15, Lenovo P70, or even an ultrabook. If it is compatible all over, it could be a potential game changer. What scares me are the TB3 drivers, which could be screwed up for different manufacturers. There is no way to know till you actually try it. For example, XPS 15 drivers for TB are also flaky from what I see.

    Also, the Core has been delayed to April 29th. Some last-minute issue is being worked out with Intel.
     
  28. Punchdrunk

    Punchdrunk Notebook Consultant

    Reputations:
    10
    Messages:
    119
    Likes Received:
    34
    Trophy Points:
    41
    Any update on this? Still delayed?
     
  29. Eason

    Eason Notebook Virtuoso

    Reputations:
    271
    Messages:
    2,216
    Likes Received:
    892
    Trophy Points:
    131
    5 things to know before buying:

     
    hmscott likes this.
  30. KillerFry

    KillerFry Notebook Consultant

    Reputations:
    22
    Messages:
    162
    Likes Received:
    25
    Trophy Points:
    41
    Got my Core yesterday, but installed it today. Basically, I took the 980 Ti from my desktop and put it in there.

    There is some driver and a GPU switcher tool you have to get from Razer, but it's all good. Oh yeah, I am using a Blade for the moment; I will ask my buddy with the Stealth to let me have it for the weekend.

    Before taking the GPU from my desktop, I ran 3DMark's Fire Strike, to get a sense of the performance hit. Also, I recently spent my time playing Heroes of the Storm and Overwatch, so I have the general fps I got in those games fresh in mind.

    Right now I'm re-running Fire Strike for the third time on the Core. The first run was just the Core, the laptop, and an external display. The second was the Core with all my desktop USB devices connected to it, because I figure resources are shared, so that might impact results. And this third time it's the Core with nothing plugged in, just the external display.

    I will post the results tomorrow as well as my desktop's specs. I'm about to go to sleep.

    Just some quick things: when playing Overwatch on the external display, I tried to put some YouTube videos on the laptop's display, but they got heavily corrupted there. The audio was playing, but maybe the bandwidth was not enough to handle rendering the game and decoding the video. Also, Doom is giving me issues on loading screens; it gets "stuck", but then finishes loading if I Ctrl+Alt+Del... weird.

    The fps loss in Overwatch (with Ultra preset) went from 85-90 fps to 70-80 fps.

    Anyway, I'll keep experimenting, ask your questions and I'll try to answer them if I can.

    Sent from my Nexus 6P using Tapatalk
     
    pau1ow, Eason and hmscott like this.
  31. Eason

    Eason Notebook Virtuoso

    Reputations:
    271
    Messages:
    2,216
    Likes Received:
    892
    Trophy Points:
    131
    Thanks. Looking forward to your results. If it's a 20% drop-off with a quad-core CPU with modern cards then I'll probably cancel my pre-order.
     
  32. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,046
    Trophy Points:
    431
    A 20% difference over TB3, switching from a desktop CPU to a low-wattage notebook CPU, is pretty good if you ask me.

    Sent from a 128th Legion Stormtrooper 6P
     
  33. ETisME

    ETisME Notebook Guru

    Reputations:
    4
    Messages:
    70
    Likes Received:
    29
    Trophy Points:
    26
    That big of a drop in fps is pretty worrying when Overwatch is quite a well-optimized game.
     
  34. Red Line

    Red Line Notebook Deity

    Reputations:
    1,109
    Messages:
    1,289
    Likes Received:
    141
    Trophy Points:
    81
    Try connecting the 980 Ti straight to an external monitor/TV and test it again. This could be the same issue the Alienware AGA is facing: you lose like 5% when you connect the external GPU to a monitor and around 20% when you play on your laptop's screen. It can depend on the title as well.
     
  35. Eason

    Eason Notebook Virtuoso

    Reputations:
    271
    Messages:
    2,216
    Likes Received:
    892
    Trophy Points:
    131
    I wouldn't call a 6700HQ low wattage, because it's basically the highest-wattage CPU that the Core would ever be used with. Remember, it's designed for laptops.

    A 20% performance loss on a dual-core low-voltage CPU would be understandable, though. I'm concerned that any bottleneck would get worse in the future with more powerful cards.

    Edit: according to scores posted on Reddit, there's nearly no performance loss!

    An RBS and 980 Ti getting 4100+ on Fire Strike Ultra with an external monitor and about 3900 on internal. Nice
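
    A quick check of those quoted figures (assuming the Reddit numbers are accurate as posted):

        # Internal- vs external-display penalty from the quoted Fire Strike
        # Ultra scores (~4100 external, ~3900 internal).
        external, internal = 4100, 3900
        print(f"Internal-display penalty: {(external - internal) / external:.1%}")
        # ~4.9%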
     
    Last edited: May 27, 2016
    hmscott likes this.
  36. KillerFry

    KillerFry Notebook Consultant

    Reputations:
    22
    Messages:
    162
    Likes Received:
    25
    Trophy Points:
    41
    So! Here is the link to my results comparison: Clicky clicky!

    1. First result is my desktop.
    2. Second one is the Core with both the internal and external displays.
    3. Third one is the Core with just the external display.
    4. Fourth one is the Core with both displays and all my USB devices plugged in to the Core as well as physical network.
    5. Added a fifth column that has the results for the integrated 970m in the Blade.

    From those results I can guess that the card temperature has more to do with the variance in results than the bandwidth.

    Oh, yeah, also, one of the big differences is the Physics score. Here is my desktop, which is a somewhat beefy system, nothing is overclocked (it's currently summer here; I only overclock my CPU during winter):

    Intel Core i7-5930K
    Asus X99-Deluxe
    16GB Corsair Vengeance LPX DDR4 2666MHz (4x4GB sticks)
    256GB Samsung 950 Pro (boot + Blizzard games)
    1.2TB Intel 750 SSD (PCIe card version, holds most of my Steam library and work files)
    750GB Seagate Momentus XT
    Nvidia 980 Ti (reference cooler, which is the same one I use in the Core)

    Monitor is an Asus ROG Swift PG278Q. All benchmarks were run on this display.

    I think it is understandable to have a difference when it comes to CPU-related things. Now, when it comes to GPU things, there is around a 20% difference in the worst-case scenario.

    Also, notice that for some reason the run with all USB devices plugged in is actually higher than the one without them. Which is funny. Here are the devices that I use on a regular basis:
    • Finalmouse 2016
    • Corsair K65 RGB
    • Mayflower Electronics Objective2 (headphone amp)
    • Tascam US-1800 (audio recording and game comms, heh)
    • Vantec 7 Port USB 3.0 Hub
    Like I mentioned, Overwatch went from 85-90-ish to 70-80-ish fps; still felt pretty fluid though. I mentioned Doom; I had been playing Doom before the Overwatch launch but had never recorded any fps. Oops! On my desktop it feels... pretty fluid! And on the Core, other than the loading screen snafu, it feels... pretty fluid! Last night I didn't feel like playing Heroes of the Storm, so I don't know how that goes.

    Something worth mentioning: noise! The GPU is usually inside the case, so when fans ramp up one can only hear what escapes from the case. In this case, not only is the case smaller, but there are grills just in front of the GPU. It is much more audible; maybe a little bit annoying, if I'm honest. Most of the time I am wearing headphones, but when I take 'em off, it is pretty noticeable.

    Another small (pun intended) detail: the USB-C cable is short. I believe this was a conscious decision in order to maintain the 40 Gbps of the core. In my mind I would have the Core on the right of my desk - where my desktop currently is - and the Blade to my left - where I have open space in my desk. That ain't gonna happen, no sir. Maybe both Blade and Core will have to be to the left of me. So, keep that in mind.

    Anyway, I will be putting the 980 Ti back in my desktop - the i7-5930K has no iGPU, and I need to get some things off the desktop. I will re-run Fire Strike there, and will also see how many fps I was getting in Doom; let me know if there is any other easily available benchmark or maybe a game and - if I have it - I will give it a go on the desktop and then bench it on the Blade+Core.

    Also, my buddy won't be using his Stealth over the weekend as he is going out of the city, so today I will pick it up and can also run some of the benchies on the Stealth+Core ;)

    Edit: Some words and formatting.
     
    Last edited: May 27, 2016
    Dingohk, AriStar and pau1ow like this.
  37. gametime10

    gametime10 Notebook Geek

    Reputations:
    2
    Messages:
    97
    Likes Received:
    15
    Trophy Points:
    31
  38. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,046
    Trophy Points:
    431
    I'm talking about 20% vs DESKTOP. Over TB3 to the laptop display? 20% sounds realistic to me. 5% loss to a display hooked up directly to the card in the core sounds fantastic. I don't see what's to complain about.

    Edit: just saw the graph. That still doesn't sound horrible.

    So what does the 970M in the Blade get on this same test? Nevermind, saw the added test. That is a huge jump in perf over the 970M. It's only going to get better from here as tech improves. Seems worth it if you don't want 2 discrete computers and just want to hook up to the Core instead.

    Sent from a 128th Legion Stormtrooper 6P
     
  39. KillerFry

    KillerFry Notebook Consultant

    Reputations:
    22
    Messages:
    162
    Likes Received:
    25
    Trophy Points:
    41
    Which is exactly what I've always wished for, and finally the technology seems to be getting there.

    It does seem like a big hit looking at it from the desktop down, but from the laptop up it looks like a pretty decent improvement. Albeit a little costly for now; but such is the way of new things.

    Sent from my Nexus 6P using Tapatalk
     
  40. KillerFry

    KillerFry Notebook Consultant

    Reputations:
    22
    Messages:
    162
    Likes Received:
    25
    Trophy Points:
    41
    Just some quick facts I found out while re-organizing my desk space:

    1. Remember I mentioned it has a short cable? Well, I thought to myself: "What if I use the somewhat longer Type-C cable from my Nexus 6P?" It doesn't work; the Core does not even turn on. It seems it needs its own Type-C cable or it just won't play.
    2. If the laptop goes to sleep, pressing a button on a mouse or keyboard attached to the Core will not wake the Blade. You have to press a key on the laptop's keyboard.
    3. If the Type-C cable is disconnected while you are using both displays - the laptop's internal and an external display - and have the external as the main Windows display, then after the disconnect everything goes to the laptop's internal display with the DPI scaling options of the external display. That is, I use my ROG Swift with a 100% scaling option (which is to say... none), and the Blade's display has 150% scaling; when the Core is disconnected, the Blade's display defaults to the 100% of the previous main display (the ROG Swift in my case). A user log-out or a reboot fixes this.
    These were just a few quick-fire notes while I continue to arrange my desk.
     
    pau1ow likes this.
  41. AkiraSieghart

    AkiraSieghart Notebook Consultant

    Reputations:
    0
    Messages:
    129
    Likes Received:
    32
    Trophy Points:
    41
    Shouldn't the 1080 have more of a difference when compared to the 980 Ti? A user on Reddit is saying that he was getting ~4100 on his desktop PC with a 980 Ti. In the other post, a user is saying that they were getting ~4600 with the 1080 in their desktop. That just seems low to me, knowing the performance difference between the cards. Or maybe I'm just tripping.

    The MSI GS32 is a new ultrabook announced by MSI that has 16GB of RAM (up to 32GB), the same 6500U, an integrated 950M, more storage options, and connects to MSI's dock via full PCIe 3.0 x16. The thing is apparently available to purchase through XoticPC, and I can't wait for someone to get their hands on it and run it through the same tests. I want to see how much of those lower scores is attributable to TB3 conversion loss. MSI also has their GS30 with the 4870HQ quad core, which is available with the same dock, so we'll end up getting quad-core benchmarks too.

    All in all, the 6500u seems to be doing impressively. I'm surprised. I'm still waiting for real-world gaming benchmarks before making a decision, however.
     
    hmscott likes this.
  42. Deathalo

    Deathalo Notebook Consultant

    Reputations:
    1
    Messages:
    130
    Likes Received:
    2
    Trophy Points:
    31
    Dingohk and hmscott like this.
  43. KillerFry

    KillerFry Notebook Consultant

    Reputations:
    22
    Messages:
    162
    Likes Received:
    25
    Trophy Points:
    41
    Hello there!

    So, here are some updated benchmark results using 3DMark's Fire Strike Ultra: Clicky clicky! Last time I ran regular ol' Fire Strike, but this time I went for Ultra. Something of note is that I ran the benchmark three times for each case, but I'm only adding the third run to the comparison. I did this so that the first two runs would heat the GPU up and simulate a more "real" gaming session scenario. It is still a small test sample though; but hey, I don't have all the time in the world to bench, I also have to game for fun!

    Here's the column order:
    1. My desktop PC - 3,878.
    2. Blade 14 using the laptop's internal display - 3,460.
    3. Blade 14 using my external display - 3,692.
    4. Stealth using the external display - 3,487.
    Now, some real games and general fps. Right now I'm rotating my gaming time between Heroes of the Storm, Overwatch and Doom; so those are the only ones I tried for now. All of them were played on the external display (a ROG Swift) and with the same highest quality settings that I do on my desktop (which is to say, everything maxed out! ;)):
    • Heroes of the Storm
      • Desktop - 80-100 fps, staying mostly in the 90's.
      • Blade 14 - 60-90 fps, staying mostly in the upper-70's.
      • Stealth - 50-70fps, staying mostly in the mid-60's.
    • Overwatch
      • Desktop - 80-90 fps, staying mostly in the upper-80's.
      • Blade 14 - 70-90 fps, staying mostly in the lower-80's
      • Stealth - 60-80 fps, staying mostly in the mid-70's.
    • Doom
      • Desktop - 130-170 fps, staying mostly in the 140's
      • Blade 14 - 90-120 fps, staying mostly in the 100's
      • Stealth - 30-50 fps, staying mostly in the upper-30's

    Again, one of the biggest differences was the Physics score, since we're going from a 5930K to the 6700HQ and the 6500U. Now, this is of particular interest because when going to actual games - such as Overwatch and Heroes of the Storm - I would believe that physics are what impact fps the most, particularly on the Stealth. Maybe if I were to lower the physics settings the fps could improve. I didn't try it, but I am confident that it is so.

    Now, something that got me thinking a lot: if you compare these results to the ones in my previous post using regular Fire Strike (Clicky clicky, in case you wanna see them), you will notice that the difference between the desktop and the Core varies wildly. My guess is that in Ultra more time is taken by the GPU rendering the actual scene; therefore, bandwidth limitation is not an issue. Notice how the actual graphics test fps in Ultra only lose around 7-8%, even on the Stealth! But when you move to lower graphical settings, such as regular Fire Strike, in which the GPU doesn't take as much time rendering, it is then that bandwidth might become the bottleneck; we're talking about 20% and 50% differences for the Blade and Stealth, respectively. This, too, reinforces my belief that physics has something to do with fps loss in actual games.

    I find this particularly interesting because it means that the higher the settings you run your game at, the smaller the performance gap might be - barring a CPU bottleneck, though. Also, it will be cool to find out the implications of Vulkan.
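
    In percentage terms, the Ultra scores above work out as follows (a small sketch using the four posted numbers):

        # Percentage drop vs. the desktop for each posted Fire Strike Ultra score.
        desktop = 3878
        core_runs = {
            "Blade 14, internal display": 3460,
            "Blade 14, external display": 3692,
            "Stealth, external display":  3487,
        }
        for setup, score in core_runs.items():
            drop = (desktop - score) / desktop
            print(f"{setup}: {score} ({drop:.1%} below desktop)")
        # Blade external ~4.8%, Stealth ~10.1%, Blade internal ~10.8%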

    Other things: it seems like the experience is a little more refined with the Stealth; it is more plug-and-play. I think the reason for this might be the 970M in the Blade 14. Actually, Nvidia GeForce Experience is very, very confused on the Blade; sometimes it thinks I have a 970M, sometimes a 980 Ti.

    Also, I noticed that my USB headphone amp would behave weirdly. When there was nothing playing, it... "disconnected". Not from Windows; there was no USB unplug sound. A soft-disconnect/sleep, if you will. As soon as I started playing something, it would take a few seconds before actual sound came out. As soon as the music would stop, it would "disconnect". I wonder if, in order to save bandwidth, the Core soft-disconnects devices that are not being used.

    Well, there you have it, folks! A somewhat more in-depth-ish look at the Core. All in all, I am pretty satisfied and excited for this sort of technology. Around 13 years ago, when I went to college out of town, I had to take both my desktop and my laptop to school; obviously I used the laptop for school-related activities (a solid eMachines M6807 with ATI Mobility 9600, capable of running Doom 3!) and my desktop for gaming. I dreamed that something like this would exist some day.

    And now it does!
     
    Last edited: May 29, 2016
  44. hfm

    hfm Notebook Prophet

    Reputations:
    2,264
    Messages:
    5,296
    Likes Received:
    3,046
    Trophy Points:
    431
    It is truly exciting. The tech implemented in this manner is also extremely early. Even though I'm not in the market for it right now (I just use mine on a lap desk in my recliner), I may be some day. I'd jump on it without hesitation if I were; it looks solid and performant.

    Sent from a 128th Legion Stormtrooper 6P
     
    Last edited: May 30, 2016
    hmscott likes this.
  45. ETisME

    ETisME Notebook Guru

    Reputations:
    4
    Messages:
    70
    Likes Received:
    29
    Trophy Points:
    26
    Thanks for the gaming benchmarks. Goes to show the CPU is, not unexpectedly, the big bottleneck, and exactly why I chose to wait until benchmarks were out.

    I'm gonna wait for MSI and their gaming dock, since apparently they have full bandwidth.
     
  46. Eason

    Eason Notebook Virtuoso

    Reputations:
    271
    Messages:
    2,216
    Likes Received:
    892
    Trophy Points:
    131
    Whoa, what's with the Doom performance on the stealth?
     
  47. AkiraSieghart

    AkiraSieghart Notebook Consultant

    Reputations:
    0
    Messages:
    129
    Likes Received:
    32
    Trophy Points:
    41
    ASUS' XG2 is also due to be revealed (it was announced at CES 2016). It uses two USB-C connectors and doesn't have the performance hit from the PCIe-to-TB3 conversion that Razer has. It does have the same 40 Gb/s data transfer, but ASUS' laptop will have a 6700HQ instead of the 6500U. I'm personally going to go with either MSI's dock or ASUS'.
     
  48. Eason

    Eason Notebook Virtuoso

    Reputations:
    271
    Messages:
    2,216
    Likes Received:
    892
    Trophy Points:
    131
    I saw that. Because my pre-order hasn't shown signs of shipping soon, I decided to cancel it. I don't even have a laptop at this point, just a mini-ITX system, so I'm going to try to wait for the XG2.

    Edit: I'm a liar. I emailed HIDevolution to cancel my pre-order, then they told me "are you sure? We're getting 5 units on Wednesday and we can fulfill your order then." I folded like a Japanese laundromat. :(
     
    Last edited: May 30, 2016
    pau1ow and hmscott like this.
← Previous page | Next page →