Does anyone know when the successor to the current Nvidia cards is expected to come out? Has anything been mentioned about them?
-
FrozenSolid Notebook Evangelist
-
So far Nvidia has used up their "refresh" with slower Max-Q GPUs, and since AMD didn't come out with anything faster than a 1080 with the current Vega line, Nvidia isn't really motivated to release anything faster.
No need to drop prices either, so going forward it's likely going to stay this way for another year, until both AMD and Nvidia have their next "process" ready - AMD at 7nm and Nvidia at 10nm. -
FrozenSolid Notebook Evangelist
That is a bit sad, but thanks for the info. I will just shelve the idea of refreshing my computer if there is no benefit to it, at least for another 12 months.
-
You never know though; sometimes things happen quicker than expected, so keep watching the rumor sites and headlines.
-
With a 1080 in the P775DM2, am I correct in assuming you are looking to improve on the P375SM-A's aging 980M?
I cannot imagine the 1080 not easily shredding almost anything you might want to throw at it now. -
-
Something other than Max-Q?
-
Yes, but now we are going to be stuck with those +/- variants of the existing chips a while longer.
-
I just saw that Nvidia is now putting higher-performance GPUs "into" laptops by providing an external box:
Nvidia shows off Titan Xp and Quadro external GPU solutions
http://hexus.net/tech/news/graphics/108514-nvidia-shows-titan-xp-quadro-external-gpu-solutions/
NVIDIA External GPUs
http://nvidianews.nvidia.com/news/n...ve-power-to-millions-of-artists-and-designers
It looks like Nvidia has given up on higher-performance internal GPUs for this generation. -
Meaker@Sager Company Representative
This is why I wanted AMD to turn the screws on Nvidia, but it just is not happening.
-
AMD doesn't need to outperform the 1080 Ti or above; that's a very small percentage of the installed base of Nvidia GPUs.
AMD only needs to cover 95% of the market with products that cost less.
That's enough to start, enough to provide an alternative to anyone wanting to stop giving money to Nvidia. But not enough to wake up Nvidia.
Shhh, don't wake Nvidia, it's enough for now.
-
Nvidia's road map showed the first Volta architecture chips arriving at the end of 2017 or beginning of 2018. Whether they stick to that schedule will depend on the maturation of their die process and yield results I imagine.
-
The GV100 was announced first - commercial availability by the end of 2017.
The gaming desktop GPU announcements usually follow by 3 months; the laptop GPUs are announced about 6 months after the data center GPUs. Shipping usually follows about a month after announcement.
So we could see Volta GPUs start shipping in laptops by the middle of 2018, or at least be announced by then.
Then wait 3 months after release for production issues, driver issues, and specific card/laptop issues to be fixed, and for reviews with representative benchmarks to be published - not the fluff put out at first.
So by this time next year it could be time to start considering a Volta laptop, and 3 months after that it's getting close to the time to pick one. Merry Xmas 2018
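Purely as a back-of-the-envelope sketch of that reasoning - the offsets are my own guesses from past launch cadences, not anything Nvidia has confirmed - the dates work out roughly like this:

```python
# Rough Volta-in-laptops timeline from the guessed cadence above:
# data center part commercially available end of 2017, desktop cards ~3 months
# later, laptop GPUs announced ~6 months after the data center part, shipping
# ~1 month after announcement, then ~3 months for drivers/reviews to mature.
# These offsets are assumptions, not a confirmed roadmap.

def add_months(year, month, offset):
    """Return (year, month) shifted forward by `offset` months."""
    total = year * 12 + (month - 1) + offset
    return total // 12, total % 12 + 1

datacenter  = (2017, 12)                     # GV100 commercially available
desktop     = add_months(*datacenter, 3)     # gaming desktop cards
laptop_ann  = add_months(*datacenter, 6)     # laptop GPUs announced
laptop_ship = add_months(*laptop_ann, 1)     # laptops start shipping
matured     = add_months(*laptop_ship, 3)    # issues fixed, real reviews out

for label, (y, m) in [("desktop cards", desktop), ("laptop announce", laptop_ann),
                      ("laptop shipping", laptop_ship), ("worth considering", matured)]:
    print(f"{label}: ~{y}-{m:02d}")
```

Which lands laptop availability around mid-2018 and the "safe to buy" point in the back half of 2018, matching the gut feel above.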
-
Does this also mean Clevo will have a hard time implementing said chip even into their own MXM design?
-
Meaker@Sager Company Representative
For the desktop side, sure; for the notebook side they also need to keep the power in check.
For now, on the desktop, if they can make the 1070 and 1080 sweat it will be a step in the right direction. -
That right there is the biggest hurdle in mobile graphics: balancing power and performance. If they get a decent mix of both, they've got it made.
-
Hello, Prema.
When will your blog reopen?
I want to get your P151em Prema Mod firmware file.
How do I download it? -
-
I don't want to be the guy that has to do the GPU board design.
-
Is your new website address (PremaMod.com) right?
-
You need to figure out a licensing system for individual users, @Prema.
-
So when is your new website opening?
Can I get a P151em Prema Mod firmware file with a little donation? -
Glad Clevo didn't settle for BGA trash. Better to keep what we already have than to entertain the notion of stooping to acceptance of that worthless filth.
-
Meaker@Sager Company Representative
For an MXM-based Vega? It's technically simpler, just a lot of power phases around the package. -
I wonder how big the heatsinks will be for Vega-based GPUs? Then there's the power brick the system will need, especially if the system utilizes CrossFire.
-
Meaker@Sager Company Representative
It can be whatever you set the TDP to; that just determines the level of performance you will get.
Then it depends on the price/performance you can offer compared to a similarly equipped 1070/1080 notebook. -
That's the kicker right there, along with how much you can do with overclocking. Even if it matches the 1080 at stock but doesn't overclock well, it will not be an attractive option for some of us. That, and poor durability, has been an Achilles' heel for them in the past. I am hoping that neither will be the case going forward. It would be awesome to finally have an equal or better option than what we get from the Green Goblin. This would be quite a paradigm shift.
-
Meaker@Sager Company Representative
Me too, I HOPE we get another good option like the 7970M was at launch. -
I've seen some amazing things in my time, but if AMD manages to drop the power requirements of Vega in the next year without sacrificing performance I'd count that as something truly incredible. I'd love to see AMD be truly competitive again in all aspects of the market I really would.
-
I think we all wanted this to happen. It is really sad that it won't. Another year of monopoly and mediocrity for the Green Goblin as Team Red drifts further and further away from being a relevant force in the world of mobile GPUs. After 5 years of being asleep at the wheel it may be very difficult for them to ever recover. Such a long and dry stint of total irrelevance is very damning to a reputation.
-
Rx Vega 56 & Rx Vega 64 Overclocking Benchmark Leaks (Rumor)
It's not mentioned what the tests/games were, only these results as a percentage over/under a 1080:
Monday is the embargo release day for performance results; hopefully it's just like we see here, or better.
-
I sure hope it turns out good, and I hope they're not overclock-emasculated like Ryzen. And, if so... it's coming to MXM... when?
Well, when 99 out of 100 laptops are emasculated, crippled, over-heating, poorly binned, disposable, soldered pieces of dog poop, burning calories to make an awesome GPU for a piece of garbage like that is kind of pointless. We should not have surprised looks on our faces. -
Meaker@Sager Company Representative
Have you seen Threadripper reaching into the 4.1-4.2GHz range on all 16 cores? That's not too bad; it does need around 700W to do that though, lol. -
I haven't been happy or impressed with 4.2GHz since before I upgraded from the 2720QM to the 2920XM. And, 4.2GHz wasn't good enough with my 4960X in the P570WM... 4.7GHz was good though. What is also pathetic about TR barely being able to manage 4.1-4.2GHz is the fact that it is such an insignificant increase over stock clocks. I am extremely unimpressed by that, just as I have been very unimpressed by how crappily AMD GPUs have historically handled overclocking. Parts that do not overclock well are not enthusiast components. They are mainstream consumer sheeple parts no matter how powerful they might be running stock. I have zero interest in running CPUs or GPUs at stock clocks and won't spend my money on junk that doesn't overclock well.
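Just to put a number on how small that overclocking headroom is - a rough calculation, assuming the 1950X's advertised clocks of 3.4GHz base and 4.0GHz boost (figures from memory, treat them as assumptions):

```python
# Overclocking headroom for a 16-core Threadripper at a 4.2GHz all-core OC.
# Assumed stock clocks: 3.4GHz base, 4.0GHz boost (from memory, not verified).
base_ghz, boost_ghz = 3.4, 4.0
oc_ghz = 4.2  # the all-core overclock mentioned above

print(f"Gain over base clock:  {(oc_ghz / base_ghz - 1) * 100:.1f}%")   # ~23.5%
print(f"Gain over boost clock: {(oc_ghz / boost_ghz - 1) * 100:.1f}%")  # ~5.0%
```

Measured against the boost clock, the overclock only buys about 5%, which is the "insignificant increase" being complained about (to be fair, 4.2GHz on all cores is a heavier load than a light-load boost).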
-
Both AMD and Intel suck with their latest CPU chips. But AMD sucks much more due to the lack of OC'ing. See also the stagnant gaming performance with the latest Intel chips. No improvement vs. the previous gen. Everything sucks now!!
-
I agree. We live in very sad times. BGA will always be filth, and the better stuff that is not BGA filth doesn't perform any better than the older stuff it replaced. NVIDIA is dumber than ever before (no MXM support, blocking vBIOS mods, limited SLI support, etc.). And, we have the worst version of Windows ever conceived by the Redmond Mafia. I still give credit for this mostly to the growing stupidity of consumers. If people refused to purchase or use garbage, the companies would stop making it, because making money is more important to them than anything else. But, the sheeple keep saying yes to the mess. It will likely get worse.
-
And, now it's the same for the i9, isn't it?
New architecture, new instructions, new optimizations.
BTW, this is the "New GPUs" thread, not CPUs; I was just making a point as an example. -
All GPUs are soldered BGA... right?
MXM is a "riser" card technology, not LGA. Just because it's replaceable doesn't mean it's better; it means it's replaceable.
See my reply in the previous post.
Both suck!! Terrible time to be buying new tech.
-
Intel sucks much more because it's INTEL!!
Don't forget that
And, it's also pulling far more power than AMD is, mostly because AMD isn't OC'ing, so the power draw increase stops - detune the voltage for less heat and stable operation, and you get much lower power draw than Intel. -
What's the TDP for Intel's and AMD's 16-core chips (default)?
And bench scores at default clocks? I don't defend Intel. Both suck. Same for graphics (Nvidia vs. AMD, as this is a GPU thread).
-
Yes, absolutely. Thank you for pointing that out. This is why discrete graphics (or CPUs) soldered to the motherboard is a "let's build disposable trash" approach to things. That excuse is never a good one, but BGA lovers often attempt to use it as a means to justify the existence of inferior products.
So, we could probably safely say "BGA is OK. It's how it is misused and abused that makes it suck."
That said, besides the fact that they are permanently attached to the motherboard, the CPUs used for BGA applications in notebooks are inferior to their socketed counterparts. -
The default TDP for Intel is a joke; it's more than a lie, it's an outright deception that caused X299 board makers to under-design and under-build their VRMs and VRM cooling.
Threadripper doesn't suck at all; in fact, it blows Intel away.
-
-
That's what my point has been for a long time: it's not BGA, it's the throttled CPU design limiting it to 45W vs 95W on the desktop. It's a bulk detune on Intel's part to feed cost-effective systems to a major portion of the market.
The 45W CPU isn't cheaper itself; it's supposed to reduce the cost of everything else - the cooling, the power delivery, and the support hardware are all reduced to save $$$ and provide a system that is more readily buildable at a lower price and in a smaller footprint.
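A very rough illustration of what that 45W-vs-95W detune costs in sustained clocks, under the crude assumption that package power scales roughly with the cube of frequency (dynamic power ~ C·V²·f with voltage tracking frequency) - this is a ballpark sketch, not a measured result, and the 4.0GHz desktop figure is just a placeholder:

```python
# Crude estimate of the sustained-clock cost of a 45W mobile power budget
# versus a 95W desktop budget, assuming power scales ~ frequency cubed.
# Real silicon doesn't follow this exactly; it's only a ballpark illustration.
mobile_tdp_w, desktop_tdp_w = 45.0, 95.0
desktop_sustained_ghz = 4.0  # hypothetical desktop all-core sustained clock

clock_ratio = (mobile_tdp_w / desktop_tdp_w) ** (1 / 3)
mobile_sustained_ghz = desktop_sustained_ghz * clock_ratio

print(f"Clock ratio at 45W vs 95W: {clock_ratio:.2f}")                      # ~0.78
print(f"Estimated mobile sustained clock: {mobile_sustained_ghz:.2f} GHz")  # ~3.1
```

So even by this generous model, the 45W part gives up roughly a fifth of the sustained clock before binning or cooling even enters the picture.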
Performance reduction as a means to reduce cooling requirements and system noise levels - makes ya kinda wonder what took Nvidia so long to copy that idea from Intel.
-
I don't think there are too many people stupid enough to not understand it. Well... maybe a few. I think it's more often along the lines of don't know, haven't been informed, haven't experienced why it sucks, or in some cases don't care.
I agree with you to a great extent. The part that we may differ on is that even if the parts were identical in every other respect, including performance, it would still suck to have it soldered and I would refuse to accept it on that basis alone. The primary basis for BGA design in notebooks is to push sales of new products by making existing products more difficult and costly to repair, and block upgrades where upgrades would otherwise be possible but for the soldered filth. It is a self-serving, profit-motivated, anti-customer, Nazi control freak approach to engineering.
I'm still waiting to see extreme overclocked benchmarks that prove that. Stock performance doesn't count (to me). I'm not interested in seeing or running stock benchmarks.
-
Do you have the power draw numbers for an absolutely maxed OC from both Intel and the 16-core TR? Maxed OC from both. Maybe @Meaker@Sager knows?
-
Meaker@Sager Company Representative
No insider info for me, but looking at 10-core Intel vs 16-core AMD, I think the Intel 18-core is going to eat power like there's no tomorrow, especially if you push the frequency.
700W for the AMD 16-core pushing it hard and 400W for the Intel 10-core.
However, the high-core-count silicon from Intel will have a hungrier internal interconnect.
We have never had a high-core-count chip hit retail markets, and never unlocked to boot (it has always been low core count, and extreme core count is still not on the cards).
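For what it's worth, the per-core numbers implied by the wattages quoted above (the inputs are rough, anecdotal figures, so the outputs are only ballpark):

```python
# Per-core power at heavy overclocks, using the rough figures quoted above:
# ~700W for a 16-core Threadripper, ~400W for a 10-core Intel part.
amd_watts, amd_cores = 700, 16
intel_watts, intel_cores = 400, 10

print(f"AMD:   {amd_watts / amd_cores:.1f} W/core")      # ~43.8 W/core
print(f"Intel: {intel_watts / intel_cores:.1f} W/core")  # 40.0 W/core
```

Per core they're in the same ballpark when pushed hard, so an 18-core Intel part at similar per-core power would land north of 700W - which is exactly the "eat power like there's no tomorrow" concern.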
New GPUs?
Discussion in 'Sager and Clevo' started by FrozenSolid, Aug 1, 2017.