The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AMD Admitting Defeat? AMD Clarifies Why It Uses Intel Core i7 In Its Project Quantum Gaming PC

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by Mr. Fox, Jul 1, 2015.

  1. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,225
    Messages:
    39,334
    Likes Received:
    70,635
    Trophy Points:
    931
  2. Starlight5

    Starlight5 Yes, I'm a cat. What else is there to say, really?

    Reputations:
    826
    Messages:
    3,230
    Likes Received:
    1,643
    Trophy Points:
    231
    The decision to use Intel CPUs in such a project makes perfect sense to me; offering CPUs from a competitor as well as their own should definitely improve sales. I believe AMD's reputation will not suffer from this move; on the contrary, the impact, if any, will be positive.
     
    Mr. Fox likes this.
  3. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    I'd say the content of this article likely factors into this: http://arstechnica.co.uk/gadgets/20...cheaper-solution-will-refocus-on-performance/.

    For better or worse, AMD's CPU division is between a rock and a hard place; they can't really compete with Intel in performance per watt or at the desktop high end.

    Their GPU division isn't doing too badly, and it makes sense to pump out hardware that will sell their GPUs. If it gives them the cash to cover the R&D costs of their Fury GPUs with HBM, all the better, since we're likely to see the tech trickle down to lower-end GPUs sooner. Now, if they can get their drivers up to par with NVIDIA's in terms of stability, we'd have some pretty healthy competition as far as GPUs are concerned.
     
    Starlight5 and Mr. Fox like this.
  4. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,225
    Messages:
    39,334
    Likes Received:
    70,635
    Trophy Points:
    931
    Agree with both of you. Their CPUs still have a place in scenarios where tight budget constraints are unavoidable, but they really cannot touch Intel CPU performance. Leveraging Intel CPU performance to increase overall sales is a smart move on their part. Had they launched Quantum SFF using their own CPUs as a matter of principle, it probably would not have done so well.

    And, honestly, with all of the shenanigans NVIDIA is pulling in the GPU arena, with drivers that induce throttling, artificially limit features and performance, block overclocking, and now cause overall system instability (e.g. Chrome crashing, black screens, etc.), I would love nothing more than to see AMD deliver a sharp punch to the throat in both the CPU and GPU markets. The article Brother @tijo linked suggests they are going to at least make an effort, and that is exciting. I hope they are successful.

    I also hope that AMD stepping up their game in CPUs will either deter Intel's evil scheme to transition everything to BGA, or at least provide those of us that view BGA as absolutely unacceptable with a viable replacement for crappy Intel BGA processors.

    It would be so wonderful to see the red-versus-green fanboy emo wars come back in full force, for both CPU and GPU. I really miss those heated discussions over who had the better GPUs, and adding CPU wars to the mix would be icing on the cake.
     
    Starlight5 likes this.
  5. Tinderbox (UK)

    Tinderbox (UK) BAKED BEAN KING

    Reputations:
    4,740
    Messages:
    8,513
    Likes Received:
    3,823
    Trophy Points:
    431
    I read the other day that Microsoft is considering buying AMD.

    John.
     
    Mr. Fox likes this.
  6. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,225
    Messages:
    39,334
    Likes Received:
    70,635
    Trophy Points:
    931
    Oh man... the joy didn't last very long. You just had to kill it with bad news, LOL. :vbbiggrin:

    Based on their recent track record with Windows garbage, I hope that doesn't work out.
     
  7. Tinderbox (UK)

    Tinderbox (UK) BAKED BEAN KING

    Reputations:
    4,740
    Messages:
    8,513
    Likes Received:
    3,823
    Trophy Points:
    431
    Starlight5 and Mr. Fox like this.
  8. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,225
    Messages:
    39,334
    Likes Received:
    70,635
    Trophy Points:
    931
    Not sure what to make of it. It's hard to interpret that in a positive way, since they just recently made it clear that software development, cloud computing, and competing with Google were their primary focus. https://rcpmag.com/articles/2015/06/01/consumption-deployment-cloud-compete.aspx

    It has potential to go well, but it could also end badly. If they do not make it a top priority to aggressively develop a strong AMD that delivers enthusiast products that rival Intel and NVIDIA, it will simply give them a self-serving, game-console- and SoC-focused cash cow and surrender even more of a monopoly to Intel and NVIDIA than they already have. That would be terrible for consumers. It's already terrible, but this has the potential to make things far worse.
     
    ajkula66 and Starlight5 like this.
  9. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Whoa. Things could get pretty sticky pretty quickly. AMD using Intel products in their SFF gaming PCs. Microsoft buying AMD, which is in the Xbox and PS4 consoles. I guess if you can't beat 'em, buy them.
     
    Starlight5 and Mr. Fox like this.
  10. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Honestly, as much as MS is trying to screw us with Windows 10's exclusivity and such, I sincerely believe that them buying AMD might actually help us in the long run. AMD needs a sponsor. A HUGE one.

    Their driver team needs WORK. And lots of it. They need to release more drivers with game optimizations, and WHQL ones too.
    Their GPU team needs WORK. The Fury and Fury X might be wonderful cards, but their potential is still locked behind a driver wall, and the mobile market and the desktop cards below the Fury line could do with some new blood as well. Maybe less power-hungry blood.
    Their CPU team supposedly has Zen coming out, but until I see performance numbers it means nothing. I still say their CPU team needs a lotta work, since Intel has already finished research on Cannonlake and is internally working on Cannonlake's successor (or maybe that successor's successor).
    They need a really good new chipset. REALLY GOOD. PCIe 2.0 needs to go, badly.
     
    Kent T and Starlight5 like this.
  11. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    My concern is that MS will be solely interested in consoles, and since the PS4 uses AMD, they will reap the rewards from their competitor until the next gen is released. They may kill the AMD PC GPU market completely. I don't have faith that they will pump resources into PC graphics. I know that seems counter-intuitive because Microsoft makes Windows, but for some reason, and I still don't understand it, PC gaming is wholly ignored other than being a secondhand cash grab. They would probably buy AMD for the Microsoft HoloLens project, to have a low-power, high-efficiency APU.

    I am still bewildered how, to this day, PCs outnumber humans on earth by a large margin and a large majority run Windows. So why is the PC second tier when it comes to games? Makes zero sense to me.
     
  12. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Because, as you'd notice if you look at user reports of XBL on the X360 versus the X1, Microsoft not only has changed directions (with various CEOs and such) multiple times, but their primary designers are not gamers. They don't know (or care) what gamers really want unless it cuts into their profits to not know. This is the whole reason MS has axed various "selling points" of their X1 from the home media department. It's why I originally felt that a X1 was the best console to buy as a PC gamer; because most of PS4's best games are multiplat, but Xbox 1 had the exclusives that were more appealing to me (like Killer Instinct, Halo, Gears, etc) but also because the X1 was a good media device.

    That being said, X1's interface is terrible compared to X360. People couldn't even invite others and had no idea how the party system worked for weeks and the way the headset and mic works is atrocious, even with the headset adapters. They really didn't learn anything from X360's good points, but rather just went "I HAVE NO IDEA WHAT I'M DOING" and ran with scissors, lines of code and cups of coffee until they managed to all trip at the same time and land on a point where the X1's interface was "complete". Xbox's popularity is more like an expectation of quality rather than a company actually striving to make good products today.

    If MS is willing to funnel money in AMD's direction similar to how Amazon is funneling money in Twitch's direction, then things could be good. I'm hoping that's how it would be, and that AMD would not sign anything that removes their freedom to do their do.
     
  13. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,225
    Messages:
    39,334
    Likes Received:
    70,635
    Trophy Points:
    931
    I share Brother @HTWingNut's concerns. Micro$lop has not done anything to give me confidence that they will do anything except self-serving things if they were to acquire AMD. Sure, the potential for something great to happen with financial backing is there, and it could turn out fantastic. But zebras don't normally change their stripes, and their historical behavior is what has me concerned. If they acquire it, get whatever they want out of it, and say the heck with everything else, it would be status quo and a bad outcome. The fact that they do not seem to care about gamers unless that means selling another XBOX should make us all very apprehensive.

    If they get what they want and bankroll AMD to do bigger and better things than they could with an ineffective business plan, lackluster talent, no ingenuity or drive to excel, and no money, and give them the autonomy to do so, it might turn out better than we can imagine.

    The potential (and hope) that AMD could displace NVIDIA in the extreme performance enthusiast mobile arena has never been more important, due to NVIDIA's recent demonstrations of engineering incompetence and misguided focus on mainstream, low-end, budget-friendly, game-capable jokebooks.
     
  14. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well, that's where my hopes come in. I hope that AMD would be smart enough not to sign any kind of acquisition that limits their freedom. It's one thing if MS wants to acquire AMD to get profit cuts and cheaper console components, but if the contract states that they get to dictate the direction AMD is going, then AMD would be foolish to sign it, despite not being as well-funded as nVidia or Intel. It's like when people thought Twitch was going to be bought by Google and it turned out to be Amazon they went with, or how Mighty No. 9 was denying publishers left and right until Deep Silver offered funding without restricting their freedom, at which point they took on a publisher.

    If they're simply an investor and the contract is as such, good. If they're a controller and want to shape AMD to their own devices, then AMD should know that their products would suffer greatly if they take the deal, and should decline. If AMD takes such a deal, I honestly can't place the blame wholly on Microsoft for whatever ruin may result. It's like blaming a drug dealer for someone buying drugs: the dude buying drugs obviously had to agree to the transaction, no? There's far more often demand without supply than there is supply without demand. Supply without demand usually evaporates quickly, as we saw with 3D notebooks and 120Hz and 60Hz high-gamut panels, as well as the currently lauded influx of low-quality IPS panels that's being devoured ravenously by the notebook market.
     
  15. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    I on the other hand don't want the AMD albatross around MS' neck. ;)

    Don't see that going well for either company.
     
  16. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    Damn it. I like 120Hz high-gamut panels.

    As for the AMD device, as long as they can sell more Fijis. Their CPU department accepted defeat a long time ago.
     
  17. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,238
    Trophy Points:
    231
    Let's not forget how Nokia's acquisition turned out. I really hope this rumor is just a flop, although I can't see how AMD can make money if no one buys their stuff. That's probably why this project rocks Intel: to draw some attention and keep the boat floating until they release a comparable CPU, or at least something in the neighborhood.

    As for displays, it seems we can't have it all, and we won't until someone actually decides not to just make money, but rather make the customer happy. DreamColor is awesome, but slow. Fine for casual gaming, but no way in hell for competitive. Those colors though :D Yeah, I'm bragging, so what :D
     
    Starlight5 likes this.
  18. Tinderbox (UK)

    Tinderbox (UK) BAKED BEAN KING

    Reputations:
    4,740
    Messages:
    8,513
    Likes Received:
    3,823
    Trophy Points:
    431
  19. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Again, it all depends on the contract. It doesn't really matter WHO is pouring the $$ into AMD if they don't restrict AMD. As long as money is going in and a cut of profits is all that's coming out to the new investor/owner, then AMD would for all intents and purposes operate exactly as they do today except better funded (which should mean better stuff all around).

    If they're going to be owned and have practices dictated by someone else, then I don't think there's any company on the planet I'd like "owning" or "investing in" AMD. The facts are simple though. They don't have money to make great. They can make good, but their good usually comes with a tinge of bad, and their CPUs are pretty much nonexistent as far as anyone with sense is concerned... they need to improve on dis with moar $$. And someone needs to give them that $$. That's a big problem.
     
  20. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    D2 Ultima,

    I won't disagree that in the end everything comes down to money.

    But simple cash won't fix the predicament AMD is in; they need to create a plan and simply execute it properly (for once).

    Not a static plan that shoots for something around evening tonight (it is noon now...).

    A dynamic plan that takes any and all positive aspects of their products and pushes them as high as possible, while not failing to address the glaring negative aspects... and doing all this at the same time, while being able to change course as the winds of opportunity dictate.

    Giving AMD access to real funds right now is a way to make money disappear, if they're allowed to continue 'doing what they do'.

    The point is that doing what they're doing (for the last decade) is what has brought them here. Time to wake up. Look in the mirror and honestly see the mess they've created and do a complete makeover.

    I, for one, will not call them out for changing tactics mid-course. Rather, I would applaud such a courageous about-face - especially if the whole company, from the top down, dug in its heels and stayed the course until they see the results of such action.

    Anything less is far too little, too late. And merely throwing money at a fire is only going to make it burn higher. Yeah, it will be pretty for a while. But the cold dead embers of the dreams once strived for will haunt the industry for a very long time to come.

    And at the same time, any company foolish enough to give free money and free rein to AMD will be in a downward spiral itself before that time comes.
     
    Starlight5 likes this.
  21. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    ArsTechnica had a piece on how things started to go wrong at AMD: http://arstechnica.com/series/the-rise-and-fall-of-amd/.

    I agree with Tiller that they need a solid plan to execute on top of throwing money at the problem. Besides, anyone "giving them money" will want some say in what AMD does. That doesn't mean that MS couldn't buy them on the condition that they come up with a solid plan and then throw cash at them.
     
  22. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,225
    Messages:
    39,334
    Likes Received:
    70,635
    Trophy Points:
    931
    If they make "great" (meaning equal to or better than Intel's greatest, at an equal or better price), the money will come later, in bucketloads. It would be awesome to see them draw blood from NVIDIA and Intel, and I know I'm not the only person that would be tickled to see that. One thing is for certain: if they continue on the path of making only crap, the money will never come.

    The million-dollar question is, do they even know how or have the talent to do that? Doesn't seem like it.
     
    Kent T likes this.
  23. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Well, we'll eventually see how things work out.

    But I said it two years ago and I'll say it again. AMD needs to do some real good stuff. Intel isn't even concerned about AMD right now. nVidia probably feared Fiji a bit, but now that it's out with a fizzle rather than with a bang, everyone's back to buying 980Ti cards. I want to see them feel pressured.
     
  24. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,225
    Messages:
    39,334
    Likes Received:
    70,635
    Trophy Points:
    931
    Agree. I go one step further. I'd like to see NVIDIA and Intel sodomized by AMD. Nothing would be sweeter than an all-out bloodbath over the performance crown for CPU and GPU. Those hate-fests we used to enjoy are what kept us waist-deep in awesomeness a few years ago. They need to put down the jug of "efficiency" Kool-Aid and break out the big guns.

    Both of those articles about AMD are a good read. I remember very clearly switching from Intel, to AMD Athlon, then slithering back to Intel CPUs. It was a short, but grand, run in the sun for AMD and I enjoyed riding the wave. I also remember being an ATI graphics fanboy. AMD screwed that up as well.
     
    Starlight5 likes this.
  25. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    I know right? I want that too. I'm just being "nice" about what I'm saying XD. I also don't think AMD could slaughter Intel or nVidia at this point.

    I actually halfway disagree. I think efficiency is a great thing. We need more power per watt and per unit of heat output XD. But when power is sacrificed for "efficiency" is when we get issues. If we could shove an 88W 4790K's performance into a 60W CPU, imagine what that 60W CPU architecture could do with 88W? =D =D That's the only kind of efficiency I want.
     
    Kent T and Mr. Fox like this.
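    The efficiency argument above boils down to simple perf-per-watt arithmetic. As a rough sketch (assuming, optimistically, that performance scales linearly with power budget within one architecture; real silicon scales sublinearly near its limits, and the 60W chip here is hypothetical):

```python
# Hypothetical perf-per-watt scaling sketch (linear model; real
# silicon scales sublinearly, so treat these numbers as upper bounds).

def scaled_performance(base_perf, base_watts, target_watts):
    """Estimate performance at a new power budget under linear scaling."""
    return base_perf / base_watts * target_watts

# Normalize the 88W 4790K's performance to 1.0. A hypothetical chip
# matching that at 60W has ~1.47x the efficiency; give the same
# architecture the full 88W, and the linear model predicts ~1.47x
# the performance.
print(scaled_performance(1.0, 60.0, 88.0))  # ~1.4667
```

    That ~47% headroom is the "only kind of efficiency" the post is asking for: same power envelope, more work done.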
  26. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,225
    Messages:
    39,334
    Likes Received:
    70,635
    Trophy Points:
    931
    I want a 320W 12C/24T mobile/desktop CPU that runs at 8.0GHz, and a 1.2kW laptop AC adapter. "Efficiency" is a code word for compromise now, because it has been wrongly used as an excuse for producing pathetic trash. They should stop wasting time searching for ways to reduce power consumption and focus on exhausting everything they have, then searching for more. When they can exhaust what they have and need more, that's what I consider efficient. I want to see the lights flickering and hear circuit breakers humming when I overclock, LOL.

    Humility is something great if we are talking about personality. It sucks when it involves technology.
     
    alexhawker, D2 Ultima and Starlight5 like this.
  27. Starlight5

    Starlight5 Yes, I'm a cat. What else is there to say, really?

    Reputations:
    826
    Messages:
    3,230
    Likes Received:
    1,643
    Trophy Points:
    231
    I too remember the times of AMD CPU greatness, and being a fanboy. Back in the day, my K7 Athlon XP 1700+ overclocked from 1466MHz to over 2.3GHz on air and trounced the then-top Pentium 4 3.2GHz at a fraction of the price; some chips even achieved 2.7-2.8GHz stable. I used AMD CPUs from the early K7s till the early K8s, and overclocked them all.
     
    Mr. Fox likes this.
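    For context, the headroom described above is enormous by modern standards. A quick sketch of the arithmetic (clock speeds taken from the post; clock gain is only a rough proxy for actual performance gain):

```python
# Overclock headroom as a percentage gain over the stock clock.
def oc_gain_pct(stock_mhz, oc_mhz):
    """Percentage clock increase from stock to overclocked speed."""
    return (oc_mhz / stock_mhz - 1.0) * 100.0

# Athlon XP 1700+: 1466 MHz stock, 2300 MHz on air.
print(round(oc_gain_pct(1466, 2300), 1))  # 56.9 (% over stock)
# The lucky 2800 MHz chips: roughly 91% over stock.
print(round(oc_gain_pct(1466, 2800), 1))  # 91.0
```

    A ~57% air overclock is the kind of free headroom that made that era's CPU wars so memorable.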
  28. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    Haha I don't think cooling all of that would actually be feasible in any kind of form factor, but I applaud your desires XD. (I want it too, I just don't think it's gonna happen by the laws of physics of silicon XD)
     
    Mr. Fox likes this.
  29. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,225
    Messages:
    39,334
    Likes Received:
    70,635
    Trophy Points:
    931
    Just being sarcastic to make a point, of course. :vbwink:

    They (Intel, NVIDIA, AMD) need to knock it off with the lame strategy of trying to cut back and just max out the resources we already have.

    The entire concept of doing more with less is a flawed and mythical pipe dream. Less is... less... always.
     
  30. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    Buckminster Fuller certainly didn't think so, considering he wrote extensively about ephemeralization.
    Bigger isn't necessarily better.
    We have demonstrated that on numerous occasions throughout history, improving various technologies while downsizing them and producing more in turn.
     
    Starlight5 likes this.
  31. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,225
    Messages:
    39,334
    Likes Received:
    70,635
    Trophy Points:
    931
    Guess that depends on what you are trying to accomplish and how you define success. I prefer doing more, more and more, with more. I see doing the same with less as failure if doing more with more is possible. :vbbiggrin:
     
    ajkula66 and Starlight5 like this.
  32. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    But that's not really sustainable or efficient.
    If that mentality had been kept instead of downsizing computers, we'd be upscaling them to the point where they would be so big as to be impractical to use (at least on a personal level)... if you follow the notion that 'bigger is better' (which it's not).

    Indefinite growth on a finite planet is equivalent to suicide and not possible.
    You cannot just do more and more and more with no regard to efficiency, the environment, etc., because you end up with serious issues (as is the case today).

    Besides, the market isn't even employing the best of what is technically possible... we are getting what is most cost-efficient for manufacturers to mass-produce, with the end result being profit - it is limiting, and results in planned obsolescence, huge waste of resources, and other issues.
     
    Last edited: Jul 4, 2015
  33. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    I thought this was a laptop forum?
     
  34. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,335
    Messages:
    11,803
    Likes Received:
    9,751
    Trophy Points:
    931
    It's beyond clear that AMD isn't doing anything for laptops, but if their desktops do well then maybeh
     
  35. Deks

    Deks Notebook Prophet

    Reputations:
    1,272
    Messages:
    5,201
    Likes Received:
    2,073
    Trophy Points:
    331
    It is... but the forum isn't so strict as to disallow a little derailment and discussion of a few other topics (otherwise, the existence of this very thread should be put into question, seeing how it's not laptop-related).

    Regarding the topic at hand... I don't think AMD is 'admitting defeat'.
    It is possible that they realize their CPUs can limit GPU performance, so I think that for now they might be focusing on Fiji and giving it every bit of performance advantage in the most energy-efficient envelope... at least until they release Zen.
    Besides, AMD's full-blown desktop CPUs are years old now... if Project Quantum is about efficiency and performance at the same time, an AMD CPU might not be a good fit for that design.
     
  36. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    Well, I don't find it inappropriate to discuss desktop components in a laptop forum. What confuses me is that high-performance laptops are thermally limited; it's all about efficiency. Why would people into DTR laptops not care about energy efficiency?
     
  37. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,225
    Messages:
    39,334
    Likes Received:
    70,635
    Trophy Points:
    931
    Yeah, I don't play that tree-hugger religion crap either. Environmentalism is hoax junk science and it has no place here.

    The thinner and lighter they try to make high-performance machines, the bigger turds they will become, with thermal limitations that are avoidable if they don't go all schoolboy sissy on us by being OCD about thin and light. The quest for "efficiency" is really more about how lame they can make them by cramming stuff into a form factor that is far too small to accommodate anything truly wonderful. Remove the diminutive form factor: problem solved... mostly. It takes a little more than half-heinied engineering, too. If they do a sloppy job on the cooling system, nothing is going to run right.
     
    ajkula66 likes this.
  38. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    OCD-thin laptops are making things worse, but even large, proper DTRs have a limit. Even with the largest laptop chassis (AW18, P570, GT80), cooling a 300W+ high-end desktop GPU would be a demanding job. Currently we are seeing dual-MXM designs with two ~100W GPU dies spreading the heat, and that's what makes those large laptops usable. Can we have more efficiency and put more processing power into just one die?
     
  39. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,225
    Messages:
    39,334
    Likes Received:
    70,635
    Trophy Points:
    931
    The cooling systems in the dual-GPU 18" Alienware systems work fantastic and each component has its own heat sink. They could do even better if they wanted to, but it would increase the cost of the machine. I have no thermal issues on these monsters. Overclocking changes everything and requires something better than average, but thermal management at stock clock speeds shouldn't be a problem for any OEM unless they are trying to cut corners or use a form factor that is too small to support anything great. Tying two GPU heat sinks together to spread the thermals around is a compromise that would not be necessary if they built the machine right in the first place. Alienware has already proven that for several generations of laptops, and so has Clevo. That is an innovative way of making a silk purse from a sow's ear... aka "doing more with less" from an engineering perspective.
     
  40. Mr.Koala

    Mr.Koala Notebook Virtuoso

    Reputations:
    568
    Messages:
    2,307
    Likes Received:
    566
    Trophy Points:
    131
    I do believe cooling solutions only slightly tweaked from what we already have can at least handle GTX 980 or Fury Nano levels of heat output. A GTX 880M at max might put out more than that, and AW/Clevo sold systems which handled it just fine.


    Not long ago a Chinese OEM talked about making a true DTR series inside normal laptop form factors, and planned to start crowdfunding. Heat pipes would be replaced by AIO-style water channels, and the targeted hardware is a GTX 970 + socket 1150 CPU for the 13", or a 980 + 1150 for the 15". There is also a 15" workstation version with a mid-range socket 2011 Xeon E5, with possibly more heat output. Sounds like it's one step closer to the 300W high-end GPU goal.

    Commercially I see little chance of success in the plan, but at least someone is trying.
     
    Last edited: Jul 4, 2015
    Starlight5 likes this.
  41. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    I have to admit that I am a bit jealous of Mr. Fox's (and others on this forum) O/C'ing skills. However, doing extreme O/C'ing on a single system (or two or three...) is not the same as keeping a lab full of systems (desktops and notebooks...) O/C'd and dependable too.

    Given the above and as I've stated before, I still highly value stability, reliability and longevity in my notebooks over sometimes minor performance gains (on the platforms I prefer).

    See:
    http://www.tomshardware.com/reviews/intel-core-i7-5775c-i5-5675c-broadwell,4169-6.html


    The above Broadwell link dated June 2, 2015 shows why AMD is admitting defeat to me (yeah; desktop article - but again I'm using it as objective 'proof' vs. my own intuition to make my points).

    The quote above is regarding graphics performance.

    The key points I keep seeing over and over are:

    1) AMD does have great (and original) ideas.
    2) AMD takes too long to get those ideas to a 'must have' level of performance.
    3) AMD never gets those ideas past the interesting stage vs. whatever their competition is offering on a complete/overall package comparison.
    4) Intel sees the value of some of those ideas and brings them to fruition slowly (as their critics would say...) and methodically, but ultimately delivers at the level it needs to - not just 'good enough' at the time of introduction, but also best overall vs. what is available in the known universe.

    This is why Intel not only stays on top but keeps increasing its lead, and AMD keeps sliding even further into oblivion.

    When your competition takes your idea and gives consumers a vastly better alternative - it's time to stop sucking your thumb and learn how to walk already.

    Yes, I know... Intel does not compete with AMD on price (they always 'overcharge' vs. AMD's prices). But that is because AMD can't compete on performance, and spending double or more on mere hardware to get even 5% better productivity is worth it to me as a business (yeah; I've done the math).

    The real issue for AMD is that the performance delta has never been just 5% in the last decade (overall)...


    To address the 'moar and bigger is better' argument - sure, I'd have to agree if O/C'ing (and maintaining, troubleshooting and babying) multiple dozens of systems was part of my mentality. But it's not.

    Form follows function and the best mobile form factor I've found is in the 14" to 16" range with a bright and sharp screen with an incredible keyboard/trackpoint with the fastest CPU/Platform available at the time. 17" and larger options are good too of course (mostly in screen real estate) - but they don't increase productivity as much as they take away by being heavier, needing larger power supply bricks and bigger bags, etc. to have them with me at all times.

    The basics are the basics because if they're off - the overall experience of the system will not be optimal (and for some, that can only be seen by using a better balanced example for a while...).


    Let's review the basics:
    Compute side:
    1) O/S - Windows is meant for productivity - today it is Win8.1x64Pro, very soon; Win10.
    2) Platform - (latest/newest) is most important.
    3) CPU - i7 QC, non 'U' or higher - basically, buy as much CPU as you can (at time of purchase).
    4) RAM - maxed out and coupled with the most powerful CPU you can buy - this gives the system its overall productivity value.
    5) Storage subsystem - today; 2.5" ~500GB SSD or larger... very soon; M.2 PCIe x4 of 1TB or larger

    Following the above, in order of importance (for the longest sustainable system to keep me productive over the course of ownership), we get what 99% of the world doesn't need just to update their fools book status.

    Interface side:
    1) 14" to 16" chassis (and heavily weighed towards the 14" side of the range) - smaller and it falls into 'toy' status - larger and it falls into unusable status when/where mobility is concerned.
    2) Screen - Matte 1920x1080 minimum (and at this time; close to the maximum too).
    3) Keyboard - Think ThinkPAD-like and you're doing well. :)
    4) Mouse Pointer - Think TrackPoint and forget the rest. :)
    5) Fingerprint reader - needed, without a doubt for mobile systems (typing passwords in public is just stoopid).


    Add a few other necessities such as USB ports, AC WiFi cards, etc. and today's and tomorrow's systems don't really change over time, even if they are getting better each generation.

    Physically bigger is not the key to happiness - a continuously bigger performance envelope is. That includes battery performance too for the mobile systems we (should be) discussing. Physical size is not the defining factor... in fact, it could even be a detriment too (either too small or too large).
     
    Mr. Fox likes this.
  42. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,238
    Trophy Points:
    231
    Money? How about that? Both Intel and nVidia always had better balance sheets, even in AMD's strongest years. I don't know what you do for a living, even less in your free time, but I'll tell you something about building things. All of my projects are money-no-object. The thing is, that money is not readily available on day one; rather, it accumulates over the years. You can go as wild as you want, but nothing is free, and unless you happen to be a lottery winner or have a solid investor, you can only do so much. That's why most of AMD's projects are mostly testing the waters. You seem to be an 'implementation is the most important thing' kind of guy. It IS important, but please do tell: if there's no initial spark called an idea, where the hell will you start, and what on Earth will you be implementing? Implementation depends on money; the idea on creativity FULL STOP!
     
    Starlight5 and ajkula66 like this.
  43. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,225
    Messages:
    39,334
    Likes Received:
    70,635
    Trophy Points:
    931
    Good post. Those are foundational "price of admission" requirements, and without that as a starting point, an overclocking hobby/fetish is going to be a futile endeavor.

    Bigger than necessary isn't great, but as large as necessary to achieve awesomeness kind of goes without saying. Too small leads to engineering compromises that can't be fixed. When a product is engineered by competent people, it can turn out "just right," but we don't see that much any more. That problem isn't limited to computer technology. We are surrounded by incompetence.

    It's the pursuit of small computers at the price of reduced performance that makes my blood boil. Being able to brag about having something thinner and lighter when it runs too hot, throttles and cannot compete in terms of performance doesn't make any sense to me. Factor in the lack of space for nice added-value features and it's predestined to be either a failure or something that can, at best, be viewed as a mediocre product. Building it in a manner that makes it disposable (soldered components) rather than repairable or upgradable also makes no sense to me as a consumer, tech enthusiast and someone that just hates wasting money on expensive disposable trash. For an OEM, it makes good sense as a greedy, self-centered, short-sighted and anti-customer business move. What Alienware has done with their current generation laptops provides us with a good example of how bad that looks.
     
    TBoneSan and tilleroftheearth like this.
  44. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    Money? They had that. Let's please not use it as an excuse now, triturbo.

    Implementation is not about cubic dollars. It never has been.

    Implementation is about being realistic about what you (actually) have and what you do with what you have.

    Money no object... over years... is not possible. Buying what is available within your scope today and utilizing it fully until you can afford the next step up (or two) is. As you have probably seen in many of my posts; I strongly argue against upgrading older platforms most of the time. Buying new is almost always the best bang for the buck over time.

    And by upgrading I mean anything other than the O/S, RAM and/or the storage subsystem (and only if/when the other two are already maxed out).

    I understand what you mean; we do what we can. But the optimal level of balance for any given system cannot be ignored or worked around. After a few years, upgrading any 3 or 4 year old platform is a sideways step vs. what is available at that time. Even if the actual user will see performance increases themselves, these are still not usually best bang-for-the-buck upgrades, ime.

    To put it another way: if your budget and workflow needs allow you to get an i7 QC (non-'U') CPU-based platform today (even if it is a generation back) - let's say in the $1K range for the sake of argument - then getting the best-performing CPU will only get you another 10% to 40% improvement on the same platform, period. That will stay the same for as long as that platform is viable - the performance improvement of the best CPU won't increase over time, even if the cost of that CPU goes down substantially.

    Compare that to the double or more (100%+) improvements that a new platform will make in almost half a decade's worth of progress from the likes of Intel. Yeah; that will cost you about the same to get into a new platform as the old one did. But the bang for the buck will be more than justified - especially if you can sell the old system to offset the cost too.
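    The bang-for-the-buck argument above can be sketched as simple arithmetic. This is a rough, hypothetical back-of-envelope comparison; all figures (gain percentages, upgrade cost, new-platform price, resale value) are made-up placeholders for illustration, not benchmarks or prices from the post.

```python
# Back-of-envelope comparison of the two upgrade paths described above.
# All numbers are illustrative placeholders, not real benchmarks or prices.

def gain_per_dollar(perf_gain_pct: float, cost: float) -> float:
    """Performance improvement (in %) bought per dollar spent."""
    return perf_gain_pct / cost

# Path A: swap in the best CPU the old platform supports (~10-40% gain).
cpu_swap = gain_per_dollar(perf_gain_pct=25, cost=400)

# Path B: buy a whole new platform (~100%+ gain), with the cost partly
# offset by selling the old system.
new_platform = gain_per_dollar(perf_gain_pct=100, cost=1000 - 350)

print(f"CPU swap:     {cpu_swap:.4f} %/$")
print(f"New platform: {new_platform:.4f} %/$")
```

    With these made-up numbers the new platform wins on gain per dollar, which is the shape of the argument being made; plugging in your own costs and resale estimates is the point of the exercise.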

    With two systems working side by side while you're setting up/evaluating the new platform, you're in the best position to decide right then (before the return period is over...) whether a new platform is really an upgrade over your previous one. (Of course, we know those % improvements will not apply to every aspect of the computing experience, but many intangible improvements may surprise you too when comparing current vs. ancient systems.)

    Time moves very quickly and computers are still in their infancy, relatively speaking. If you consciously decide to effectively stop time by keeping a platform as long as possible through yearly upgrades - it is your loss - performance will not match what is available, no matter what benchmark 'scores' might indicate. Heat, noise, battery life and other normally ignored aspects also come into play to make a platform better.

    I shoot RAW still images for a living and need the horsepower and storage capabilities to process, store and backup those images and their various versions.

    A slightly lower performing (on paper spec's) system that is balanced has always been the system I reach for vs. a system that excels at any one given aspect of 'performance'. Why? Because the balanced system is the one that gives me the most productivity boost. This has proven itself over and over throughout the years.

    (Today) At day of purchase, you should ideally do this to have a 'balanced' system:
    Use the latest O/S available.
    Buy the most current platform and CPU combo you can afford.
    Max out the RAM (or at least within the next month or two).
    Buy the best and biggest-capacity 2.5" SSD and over-provision (OP) it.
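    The over-provisioning step in the checklist above is just a capacity reservation. A minimal sketch of the arithmetic, assuming a 20% reserve (a common rule of thumb, not a figure from the post):

```python
# Over-provisioning (OP): deliberately leave part of an SSD unpartitioned so
# the controller has extra spare area for wear levelling and garbage
# collection. The 20% default below is an assumed rule of thumb.

def usable_after_op(capacity_gb: float, op_fraction: float = 0.20) -> float:
    """Capacity left for the user after reserving op_fraction as spare area."""
    return capacity_gb * (1 - op_fraction)

print(usable_after_op(500))   # roughly 400 GB usable on a 500 GB drive
print(usable_after_op(1000, 0.25))
```

    In practice the reservation is done by simply leaving that fraction of a freshly secure-erased drive unpartitioned.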

    Finally? Find the best software to run the workflows you require and then use the system for as long as switching to a newer/better system is either out of your budget (even after considering the selling price of the old system) or not available yet...

    Doing anything other than the above is a waste of time/resources, money and performance over time...

    The most enjoyable/productive system is the one you use and can depend on constantly; not the one you spend time upgrading/tweaking for mostly sideways improvements that can never rise above the limitations of a platform conceived in the dark, distant past (usually years before it was ever released as a product to be purchased by you and me)...



     
  45. triturbo

    triturbo Long live 16:10 and MXM-B

    Reputations:
    1,577
    Messages:
    3,845
    Likes Received:
    1,238
    Trophy Points:
    231
    Oh really now. I think I've read somewhere about the way Intel improves their CPUs - they have four teams working: two work on something, then the other two build on top of that, or something like that. I'll have to find it. That's about as far into pure brute-force, money-no-object territory as anything could ever go. Innovation, right.

    Upgrading not being economically feasible is only true for most people, not all. Here enters personal preference, which quite a handful of the people around here have. Some would stop at BGA, some at not having 4 RAM slots, some at the lack of a space shuttle - to each his/her own. I, for one, won't get anything 16:9 in the foreseeable future. When you want a high-gamut IPS panel with decent real estate (at least 15.4") that is also upgradeable, you can't get much further than the 8740w. Actually, it's the only one. So upgrading is the only way to kinda keep up while satisfying personal preferences. Of course it lacks some of the newer instruction sets, and it won't be as power-efficient while clocked hard, but the same goes for any modern CPU at full tilt, so that's not much of a concern here. So upgradeability is for everyone to decide for themselves, and that's what I always say when asked about it - if you care about aspect ratio, upgrade; if you don't, there are TONS of other options.
     
    Starlight5 and ajkula66 like this.
  46. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    5,398
    Messages:
    12,692
    Likes Received:
    2,717
    Trophy Points:
    631
    Again, I understand and can appreciate your points, but whenever I dug my heels in the road like a stubborn mule it was not to my benefit in the long run.

    Like I mentioned, sitting still is to simply become obsolete.

    If performance/productivity is a true need for anyone (as it is for me) to make a living, then upgrading platforms is the only real upgrade to do.

    I don't know what you do for a living, but I can safely assume that a 16:10 screen ratio is of high importance.

    You want to know how I got around that when I thought it was so important (and still do, actually)? I bought external monitors with the spec's I wanted, rather than limit the platform itself to a spec that I preferred.

    There's always another way to skin a cat.

    I found out the hard way that if I didn't find a solution with the latest/most powerful platform at my disposal - my competitors soon would (and did...). And the performance delta would be either me working for less per hour to offer the same work, or, my competitors offering a much better price than me for the same work/project. Either way; not a win in my books if I want to keep food in my belly and a roof over my head.

    As for Intel having four teams working on each project...

    The number of people (and therefore the amount of money) thrown at a project is not what makes it successful or not. Seeing the end goal and reaching it no matter what obstacles surface is what makes them a winning team (collectively).

    The money does play a part of course. It can shorten the time-frame the goals are reached by and it can also impact the loftiness of the goals too. But money in itself is no guarantee; ever.



     
    Last edited: Jul 5, 2015
  47. Cakefish

    Cakefish ¯\_(?)_/¯

    Reputations:
    1,643
    Messages:
    3,205
    Likes Received:
    1,469
    Trophy Points:
    231
    I wouldn't call the Clevo P75xZM/P77xZM series thin and light. They have full desktop non-soldered CPUs and non-soldered GPUs. These high performance options still exist!

    The likes of the MSI GT80 Titan, MSI GT72, Asus G751 are certainly not thin and light either.

    Sent from my Nexus 5 using Tapatalk