What kind of MS botch exactly?
Maybe I can apply the same solution to my GL702ZC and see if I get any benefit if you describe the process.
-
-
Ok... so, what's the solution?
What did you do to get the performance back up? -
I personally dislike PassMark, but as mentioned it was the easiest way to see the issue once it crops up. Remember, while installing and updating, to keep constant backup images (I like Macrium), so that once the issue crops up it can be fixed rather than reinstalling from the get-go. FYI, this can happen to Intel or AMD systems, so keep an eye out.
-
Nothing as it is; the problem must first be admitted by them, and then it can be addressed. @ajc9988 seems to have turned it on and off with HPET. I noticed the performance hit eventually returning, but could regain the performance with that setting alone. I'll try again at a later point, though, as it has been a few months.
Last edited: Jun 21, 2018 -
Powercolor RX Vega 56 Nano Review
OC3D TV
Published on Jun 20, 2018
https://www.overclock3d.net/reviews/gpu_displays/powercolor_rx_vega_56_8gb_nano_edition_review/1
-
-
https://wccftech.com/amd-ryzen-thre...-thread-cpu-specs-performance-overclock-leak/
cooler; https://www.corsair.com/us/en/Categ...O-RGB-360mm-Liquid-CPU-Cooler/p/CW-9060031-WW
Not to double post, but this to me is amazing. If it's 3.4 GHz base and 4.2 GHz single-core turbo at only 250 W, that is amazing. The cooler, too, was what they originally intended from an AIO, and that is one of the round-base-plate options. Agreed, it's one of the higher-end ones, but still.
To me the overclock scores are a bit low but this again could be from several issues. Exciting times ahead! -
-
Below is ROG's new Intel CPU setup with a special 16-phase power delivery. In the video it only got 6109 in CB R15, so the TR seems to beat it.
-
Ha! Called it! I knew the 28-core and TR2 were going to be really close to each other in terms of raw computing power. Intel's IPC makes up for two cores, so at identical clocks the 32 TR2 cores should be equivalent to 30 Intel cores. Add to that the higher Intel frequency and you get 28 = 32 cores (not splitting hairs at this point and arguing about 1-5% differences in CB15 scores here).
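The arithmetic behind that claim can be sketched out. Note the ~7% IPC figure (32 cores making up for 2) comes from this post, while the ~8% clock advantage is purely an illustrative assumption:

```python
# Back-of-envelope core-equivalence sketch for the 28 vs. 32 core claim.
# ipc_advantage and clock_ratio are assumptions, not benchmark results.

def equivalent_intel_cores(amd_cores, ipc_advantage, clock_ratio=1.0):
    """Number of Intel cores needed to match amd_cores, given Intel's
    per-core advantage (IPC advantage times clock ratio, intel/amd)."""
    return amd_cores / (ipc_advantage * clock_ratio)

# At identical clocks, with IPC making up "two cores" (32 -> 30):
print(round(equivalent_intel_cores(32, ipc_advantage=32 / 30)))  # 30

# Add an assumed ~8% Intel clock advantage: 32 TR2 cores ~ 28 Intel cores.
print(round(equivalent_intel_cores(32, ipc_advantage=32 / 30, clock_ratio=1.08)))  # 28
```

That is all the "28 = 32" shorthand amounts to: per-core advantages multiplied together, then divided out of the core count.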
-
I think you will find the Intel offering will draw quite a bit more power to get to a CB R15 score of 6100; same with heat dissipation, etc. And while AMD will not yield much more over stock with overclocking, Intel will, but only enough to closely match the performance level of the new 2990X.
It goes to show how bad it is getting over at Intel. Once 7nm is here they will be in lousy shape, even if they get to 4.0 GHz base turbo and 4.8 GHz on XFR2, and assuming they stay at only 32 cores.
As it is, we do not know where 7nm Epyc is going to end up. It is possible it could give the current 28-core Xeon a run for its money, and that is Intel's already low-yield $10,000 server CPU. -
-
I hadn't seen the video as I'm at work just now.
Intel usually boosts higher on single and all core boost to give it the needed advantage... The IPC itself is about 5% (any other advantage is likely due to software optimizations for Intel).
At any rate, the Intel CPU will likely cost a lot more (if the going price right now is close to $10,000, given the hardware which is re-used).
The 32-core TR is likely to cost in the area of $1,500... or possibly $1,350, given that 12nmLP dropped costs.
Then there are also power consumption questions between the two.
More comprehensive benchmarks and real-world utilization would be useful for drawing more accurate comparisons between the two products. These are preliminary leaks. -
$1,500, you mean they are not a free upgrade.
TBH, I would love to see them reward their early adopters by offering it for $1,500, but also initially offering a $500 rebate for a short time to existing TR owners. Hey, I can have pipe dreams too.
Last edited: Jun 22, 2018 -
I'm not saying the same will necessarily happen again, but it gives us an approximate ballpark for the 12nmLP products.
So, based on previous pricing, 32c TR could end up being in the $1500 ballpark (possibly less)... maybe up to $2000.
Unless Intel does something radically different, they will likely price their 28c much higher in comparison. And if they do release it overclocked to 5 GHz with a custom cooler, the price will go up significantly (as will the power consumption, from not just the CPU but also the custom cooler).
Last edited: Jun 22, 2018 -
-
I apologize guys, I wasn't talking about the water-chilled beast that scored 7334, but the more realistic one with a regular water cooler that scored around 6000.
Last edited by a moderator: Jun 23, 2018 -
Actually, water-chilled may have been fine; it is the vapor-chilled one that is a deal killer.
-
There was an intro video for TR2, guys, but it looks to be gone;
https://www.pcgamer.com/amd-makes-a...asing-launch-of-its-32-core-threadripper-cpu/
the link was;
https://www.youtube.com/AF_fRowqsc -
At any rate, the stock 32-core TR apparently scores about 6% higher than the supposed 28c CPU from Intel (also at stock) - and the Intel 28-core CPU, according to some reports, might never see the light of day in the first place - and if it does, it will probably end up being FAR more expensive.
Plus, that Intel 28-core is close to the 32c TR mainly because it can clock higher on all cores, as I said before - but we'd also need a more realistic benchmark, as Cinebench isn't always giving us the real difference.
We need software that can take advantage of both architectures equally (without preferential treatment), equalize the clock speeds on both, and then see which one is faster. Anything else is skewed one way or another.
The water-chilled version of the Intel 28c... I'm not sure if that was ever tested. I can't find any references to it, only to the AMD 32c TR being overclocked to 4.12 GHz across all cores on water cooling.
The Intel 28c at stock scores 5900 in CB.
Last edited: Jun 23, 2018 -
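One crude way to do the clock-equalized comparison suggested here is to normalize each score by its all-core clock. In this sketch the TR2 figures are the leaked ones from this thread, while the ~4.0 GHz all-core clock for the Intel demo is purely my assumption (it was never announced):

```python
# Clock-normalized Cinebench R15 comparison (very rough; see assumptions above).

def score_per_ghz(score, all_core_ghz):
    """Cinebench R15 points per GHz of all-core clock."""
    return score / all_core_ghz

tr2 = score_per_ghz(6399, 4.12)    # leaked TR2 run, claimed 4.12 GHz all-core
intel = score_per_ghz(6109, 4.0)   # ROG demo score; 4.0 GHz is an assumed clock

print(f"TR2 32c:   {tr2:.0f} pts/GHz")
print(f"Intel 28c: {intel:.0f} pts/GHz")
print(f"TR2 per-GHz advantage: {100 * (tr2 / intel - 1):.1f}%")
```

Of course this still bakes in Cinebench's own biases, which is exactly the caveat about needing neutral software; it only removes the clock-speed variable, not the workload one.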
The ROG unit was water cooled, though neither the core count nor the frequency was announced. Not knowing the memory used on the one leaked TR2, I would take all those scores with a grain of salt as well. Not too much longer before we really know.
-
Another reason Intel is in trouble: AMD will have 4 GHz Zen 2 modules by the crapton, so they will be able to make as many 32-core TR2s as they need, whereas Skylake-EX monoliths with all cores functional will not be so readily available...
-
-
This is the CPU-Z screenshot:
https://www.techpowerup.com/245301/...of-amd-ryzen-threadripper-32-core-cpu-surface
"On the CPU-Z screenshot, the 2990X is running at 3.4 GHz base with up to 4.0 GHz XFR, and carries a 250 W TDP - a believable and very impressive achievement, testament to the 12 nm process and the low leakage it apparently produces." The rest goes on to say it was overclocked to that on all cores, but I have my doubts on that. Now, AMD usually uses ram within the spec for the chip, so it is likely between 2666 and 3200 (I forgot where they are stopping this round, but yeah), and not with the greatest timings for this test. They may even have done 2133 to default the chips on the ram they had, that way to give that this may be the worst you would expect. The truth is, they didn't give us the info to draw the conclusions that are being drawn.
They say it was all-core at 4.12 GHz, but that doesn't make ANY sense. Think about it: a stock TR 1950X with stock memory scores between 2900 and 3100, with all cores at 3.7 GHz and loose 2133 timings. That is 26.18 points per thread per GHz. If you take that, then perfectly linear scaling of the 3100 score would give you 6904 for the CB15 score for 32 cores at 4.12 GHz. The 6399 score is 7.3% slower than perfect linearity. At 2900 we get 24.49 points per thread per GHz, which would scale to 6458 points. So I may be wrong and this is just a very untuned rig the reporter examined. That would mean it is losing roughly 1%-7.3% of its score due to the extra dies having no IMC and always having to jump dies, which is not outside the realm of possibility. So I could be wrong: the RAM is slow as **** and it isn't tuned, and we are seeing that, whereas a person who can tune it could potentially add another 1,000 points to the score (going from people getting in the 3400-3600 range on CB15 scores, and depending on base clock, with double the cores, so just a doubling of the spread from 500 to the 1,000-point number).
But now that I've run the math, I may be incorrect and the reporter did see an overclocked chip give that score.
Edit: Now, if the RAM was run much faster than that, or with tight timings, I start to question how much more the extra dies hit performance. Either way, that is a discussion for when more info is out.
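The scaling math above can be reproduced in a few lines; all inputs are the figures quoted in this post, none of them my own measurements:

```python
# Reproducing the per-thread-per-GHz linear-scaling estimate from the post.
# Reference: stock TR 1950X, 32 threads at 3.7 GHz all-core, scoring 2900-3100.

def points_per_thread_per_ghz(score, threads, ghz):
    """Normalize a Cinebench R15 score per thread per GHz."""
    return score / (threads * ghz)

high = points_per_thread_per_ghz(3100, 32, 3.7)  # ~26.18
low = points_per_thread_per_ghz(2900, 32, 3.7)   # ~24.49

# Perfectly linear scaling to 64 threads at the claimed 4.12 GHz all-core:
proj_high = high * 64 * 4.12  # ~6904
proj_low = low * 64 * 4.12    # ~6458

leaked = 6399
print(f"high-end projection: {proj_high:.0f}")
print(f"low-end projection:  {proj_low:.0f}")
print(f"leaked score is {100 * (1 - leaked / proj_high):.1f}% below the high projection")
```

Under these assumptions the leaked 6399 lands about 1% below the low-end projection and about 7% below the high-end one, which is the 1%-7% penalty attributed to the extra dies (no IMC, always having to jump dies).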
Last edited: Jun 23, 2018 -
It is great knowing the chip is a true monster. Based on one sample with limited info, it is hard to assess overclocking performance. Even slow RAM slowing down the IF could be bottlenecking it as well. We do know something is wrong, but until it is out in the wild in volume it is hard to tell what, or even to speculate. Again, all I know is that what appears to be stock would make me extremely happy, but I am also always up for more!
-
Although, that could be down to AMD using lower-speed RAM for the purpose of portraying baseline performance, as not everyone might be able to afford high-speed RAM... then again, TR2 is for servers... you'd think that businesses would at least be using 3000 MHz and above with the lowest possible latencies.
Maybe AMD should run two benchmarks: one with lower-speed RAM and another with high-speed RAM.
If the latter can bring performance up another 5-10% (this also depends on which RAM AMD was using at the time), that could easily place it very close to the 5 GHz Intel LN overclock stunt score... and it would still cost far less than the Intel chip without the LN. -
-
-
Thankfully, TR2 is not for servers. The Epyc 7551P goes for about $2,300 as it is, so this would cost even more. This makes even more sense too, as the performance hit from those two dies without direct memory access keeps these from competing against the Epyc chip(s). I would like to see $1,500 as well for these, but I think it will be higher. I could easily see upwards of $2,000 if the performance holds up.
Dropping the price to $1,300-$1,500 would be like death to Intel. They would be forced to sell that 28-core for well under $2,000, down from the original $10,000. Then again, they could sell it for $6,000 just so they can say it is on the market, even if they never sell one.
Last edited: Jun 23, 2018 -
https://www.tweaktown.com/news/6236...per-2990x-32c-64t-rumored-1750-usd/index.html
https://segmentnext.com/2018/06/27/amd-ryzen-threadripper-2990x/
https://hothardware.com/news/ryzen-threadripper-2990x-32-core-cpu-preorder
https://techreport.com/news/33856/rumor-ryzen-threadripper-2990x-makes-its-first-showing-at-e-tail
https://www.overclock3d.net/news/cp..._ryzen_threadripper_2990x_appears_on_3dmark/1
Last edited: Jun 27, 2018 -
The German Cyberport listing notes 3.4 GHz in the title with 4.0 GHz boost. However, in the description below it also mentions 3.8 GHz. Yes, of course, now you might say you can't trust that listing because they just copy-pasted it from the old Ryzen parts. BUT: TR1 doesn't have a distinctive 3.8 GHz base or boost, so why is it included? Add to that the 3DMark listing that popped up, also stating 3.8 GHz.
So my assumption: 3.4 GHz base, 3.8 GHz all-core boost, 4.0 GHz regular boost. -
Look at the Ryzen 1700.
It has a base clock of 3 GHz, all-core boost at 3.2 GHz, and single-core boost to 3.8 GHz.
It could be that TR2 has a base clock of 3 GHz, 3.4 GHz all-core boost, and 3.8-4 GHz single-core boost.
Yes, 12nmLP did drop voltages and allow higher frequencies, but we're still talking about 32 cores here.
If my 1700 were quadrupled, its TDP would go to 260 W with frequencies unaltered.
So, depending on how well AMD fine-tunes TR2... 3.2-3.4 GHz base, 3.8 GHz all-core boost, and 4-4.1 GHz single-core boost might be doable. -
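The "quadrupled 1700" figure assumes power scales linearly with core count at fixed frequency and voltage; that ignores uncore/IF overhead and binning, so treat it as a rough sketch only:

```python
# Naive linear TDP scaling from the Ryzen 1700 (65 W, 8 cores).

BASE_TDP_W = 65
BASE_CORES = 8

def scaled_tdp(target_cores, base_tdp=BASE_TDP_W, base_cores=BASE_CORES):
    """Linear core-count scaling of TDP at unaltered clocks and voltage."""
    return base_tdp * target_cores / base_cores

print(scaled_tdp(32))  # 260.0, matching the estimate in the post
```

That naive 260 W sits just above the 250 W TDP attributed to the 2990X, which is why the 12nmLP efficiency gains make the leaked spec look plausible.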
It could also be that CPU0000 is an engineering sample.
-
That's likely supposed to be Intel's upcoming mainstream 8-core i9-9900K...
-
https://www.theinquirer.net/inquire...ll-be-much-cheaper-than-intels-i9-7980xe-chip
https://segmentnext.com/2018/06/28/32-core-ryzen-threadripper-2990x/
https://hothardware.com/news/ryzen-threadripper-2990x-32-core-cpu-preorder
http://hexus.net/tech/news/cpu/119642-amd-ryzen-threadripper-2990x-listed-1509-germany/ -
While things are still up in the air... may I just say that I already mentioned the price would likely be lower than $2,000 for the 32c TR2, and possibly more in line with $1,500 (maybe even lower), considering AMD's previous pricing of the TR line vs Ryzen and the fact that 12nmLP dropped costs?
-
-
They provide an index, you could post all of the index with the points of interest highlighted:
Published on Jun 29, 2018
Support us on Patreon https://www.patreon.com/hardwareunboxed
News Topics:
00:18 - Micron Begins GDDR6 Mass Production
02:07 - Threadripper Price Drops
03:24 - G-Sync HDR Module Costs $500+
05:12 - Corsair Acquires Elgato Gaming
06:03 - AVerMedia Launches 4K HDR Capture Boxes
07:04 - SD 7.0 and SD Express Spec Announced
07:56 - Razer Huntsman Keyboard with Optical Switches
08:53 - Ryzen Threadripper 2990X Appears at Retailer?
10:12 - Nvidia Next-Gen GPU Engineering Board Pictured
The HDR G-Sync news: it's so late to market due to a custom FPGA that in small quantities costs $2,000 each; even in production quantities they still cost $500, which makes AMD's FreeSync 2 HDR a whole lot cheaper to manufacture and sell. Go AMD!!
There is also a little-known NBR rule that image- or video-only posts aren't allowed; we are required to post text, like a title or other comment, with them. The Random Image/Video threads are exempt from that rule, apparently.
There are a number of channels following in Level1Techs' footsteps doing this kind of news show now; I've been posting the Level1Techs videos for a while in OffTopic:
http://forum.notebookreview.com/thr...26-2018-kraznich-inside.820421/#post-10755406
Last edited: Jun 29, 2018 -
As far as the TOS, it depends on the original post. Whereas an image without any textual content may be image-only, one with text content may not be. That is, so long as the content does not violate the TOS or try to get around it - off topic, language, etc. At least this is my interpretation of it.
-
The rules;
http://forum.notebookreview.com/threads/forum-rules.109941/
I may even have been the one who deleted the post; I would have to see the image. An image with text included in it, to me, already includes text, but since there are no filters it is somewhat subjective. A comic, for example, is off topic 9 times out of 10, and therefore without supportive text it is not a viable post. Just having a saying, etc., is the same thing. -
- Image replies are not allowed (posting just an image in response to a question)
Unless the video is about a single subject, or the subject is the first one in the video, I post a "Start @" time plus some context, and when posting the video I include the timecode-indexed list of subjects, highlighting the ones I find of interest. -
Right, if it were a video, say, just showing Linus's pretty face holding up a CPU, that would be a no-no. Again, I would have to see it, and it is subjective from mod to mod. Again, the content now having text, even if not needed, almost guarantees the post will be kept.
I agree a legend is helpful, but I thought everything in the end was somewhat AMD related, even GDDR6. -
AMD is looking to replace motherboards with chiplets it would seem:
https://pcgamesn.com/amd-kill-the-motherboard-with-chiplets
Aggressively priced AMD-based laptop with really good specs:
https://www.forbes.com/sites/marcoc...awei-matebook-d-14-hits-the-u-s/#1ecd9e6427e9
AMD also appears to be sending Ryzen/Radeon care packages to devs for optimizations (should help with software performance - but you'd think that AMD has been doing this already?) :
https://www.extremetech.com/computing/272447-amd-is-sending-ryzen-radeon-care-packages-to-developers -
The integrated silicon weave is cool, and a long time coming. It should bring about new standards for size/density and cooling, but I hope they make it user-DIY friendly; it could easily go completely locked down, and that would make sense from a design-benefits standpoint.
AMD has a huge window opening up to approach laptop vendors to fill the looming gap left by Intel's failure to deliver 10nm laptop CPUs. Vendors that were counting on Intel are left twiddling their thumbs and are going to have to rehash last-gen Intel parts to have new products.
Intel silently launches 10NM/Cannon Lake cpus
http://forum.notebookreview.com/thr...-cannon-lake-cpus.817097/page-2#post-10755492 -
Hardware Unboxed
Published on Jun 29, 2018
Now Linus has laid down the gauntlet to Nvidia to come on and explain why G-Sync HDR costs so much and sucks. G-Sync isn't competitive with AMD FreeSync, to the point that TV vendors like Samsung are including FreeSync, Microsoft's Xbox is including FreeSync (Sony too), etc., all easy to switch between with AMD FreeSync. What's Nvidia going to do now?
WAN Show June 29 2018
Published on Jun 29, 2018
00:11:07 - NVIDIA G-Sync HDR module adds $500 to monitor pricing
Here's the PCPER article that brought this all to light:
ASUS ROG Swift PG27UQ 27" 4K 144Hz G-SYNC Monitor: True HDR Arrives on the Desktop
https://www.pcper.com/reviews/Graph...z-G-SYNC-Monitor-True-HDR-Arrives-Desktop/Tea
"...After disconnecting the cables running to the PCB in the middle, and removing the bracket, we gained access to the electronics responsible for controlling the LCD panel itself.
A New G-SYNC Module
Now that we have a better view of the PCB, we can see exactly what the aforementioned blower fan and heatsink assembly are responsible for— the all-new G-SYNC module.
Over the years, there has been a lot of speculation about if/when NVIDIA would move from an FPGA solution to a cheaper and smaller ASIC solution for controlling G-SYNC monitors. While extensible due to their programmability, FPGAs are generally significantly more expensive than ASICs and take up more physical space.
Removing the heatsink and thermal paste, we get our first peek at the G-SYNC module itself.
As it turns out, G-SYNC HDR, like its predecessor is powered by an FPGA from Altera. In this case, NVIDIA is using an Intel Altera Arria 10 GX 480 FPGA. Thanks to the extensive documentation from Intel, including a model number decoder, we are able to get some more information about this particular FPGA.
A mid-range option in the Arria 10 lineup, the GX480 provides 480,000 reprogrammable logic elements, as well as twenty-four 17.4 Gbps transceivers for I/O. Important for this given application, the GX480 also supports 222 pairs of LVDS I/O.
DRAM from Micron can also be spotted on this G-SYNC module. From the datasheet, we can confirm that this is, in fact, a total of 3 GB of DDR4-2400 memory. This memory is likely being used in the same lookaside buffer role as the 768MB of memory on the original G-SYNC module, but it is much higher capacity and much faster.
While there's not a whole lot we can glean from the specs of the FPGA itself, it starts to paint a more clear picture of the current G-SYNC HDR situation. While our original speculation as to the $2,000 price point of the first G-SYNC HDR monitors was mostly based on potential LCD panel cost, it's now more clear that the new G-SYNC module makes up a substantial cost.
It's an unstocked item, without a large bulk-quantity price break, but you can actually find this exact same FPGA on both Digikey and Mouser, available to buy. It's clear that NVIDIA isn't paying the $2,600 per FPGA that both sites are asking, but it shows that these are not cheap components in the least. I wouldn't be surprised to see that this FPGA alone makes up $500 of the final price point of these new displays, let alone the costly DDR4 memory." -
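Taking the quoted article's figures at face value, a rough module-cost sketch might look like this. The FPGA volume price is PCPer's own guess, and the DDR4 per-GB price is my assumption, not from the article:

```python
# Rough G-SYNC HDR module BOM sketch from the figures quoted above.

fpga_single_unit_usd = 2600   # Digikey/Mouser single-unit asking price (article)
fpga_volume_est_usd = 500     # PCPer's estimate of the FPGA's cost contribution
dram_gb = 3                   # DDR4-2400 on the module (article)
dram_usd_per_gb_est = 10      # assumption: rough 2018 DDR4 price per GB

module_est_usd = fpga_volume_est_usd + dram_gb * dram_usd_per_gb_est
print(f"estimated module cost: ~${module_est_usd}")  # ~$530 under these guesses
```

Even under these loose assumptions, the module alone plausibly accounts for the reported $500+ premium on G-Sync HDR monitors before the panel, PCB, and cooling are counted.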
Posting this video both here and in the AMD Ryzen thread, for different reasons. It is based on research papers whose authors include an AMD researcher, and it references mesh-network interposers, which may explain why Intel went with a mesh on their current lineup. (Jim at AdoredTV did misstate when Intel adopted the mesh network: it theoretically existed with project Larrabee and was found on the Xeon Phi chips, at least by gen 2, before it was used widely in their server and HEDT lineups.) Meanwhile, this has obvious potential uses in AMD graphics cards and CPUs, and may be incorporated into future products, so it properly belongs there as well (as some monitor one thread and not both). So here is the video and the papers on which the video is based:
http://www.eecg.toronto.edu/~enright/micro14-interposer.pdf
http://www.eecg.toronto.edu/~enright/Kannan_MICRO48.pdf
AMD's Ryzen CPUs (Ryzen/TR/Epyc) & Vega/Polaris/Navi GPUs
Discussion in 'Hardware Components and Aftermarket Upgrades' started by Rage Set, Dec 14, 2016.