It used to be an exciting time, anticipating new tech and waiting for the "next best thing". Now it only brings disappointment and discouragement.
CPUs focus solely on lowering TDP, artificially throttling, and cramming as much I/O and graphics onto the die as they can, resulting in a power-hungry, hot chip that isn't allowed to run appropriately. Performance has stagnated. Working on power reduction is great and all, but we also need the opposite end of the spectrum. And of course the fact that Intel is a near-monopoly in the x86/x64 space doesn't help matters. Let's not forget the whole soldered vs. socketed debate either.
GPUs are in the same boat. Near-monopoly Nvidia pushes out crappy drivers, voltage throttling, and artificial limitations that can be circumvented by a third-party BIOS, so we know it's possible to push the tech much further than what they release. Instead of focusing on smart TDP design to get the most out of the card, they cram in as much vRAM as they can, which in most cases isn't even necessary, considering the desktop counterparts have less vRAM.
LCDs are at least getting a little love, except they keep pushing out 4K panels instead of some intermediate resolution with a focus on improved response times, refresh rates, and generally improved quality. Where are all the 2560x1440 (or better yet, 2560x1600) LCDs? And pushing 4K for gaming is not a good match yet. You can run at 1080p and it looks good, but it's still noticeably scaled.
And then we get to the OS. We went from a solid, productive, user-friendly Windows 7, designed entirely around a mouse and keyboard interface (because, well, it's a PC after all), to MS starting the abomination that was Windows 8. They improved slightly with Windows 8.1, but it's really nothing but a glorified Windows 7, still missing some components and requiring third-party apps to make it usable. Touch screens on a PC can die, plain and simple. If I want a touch screen, I'll get a tablet or a smartphone. I don't want my LCD smudged and turned into a glossy, reflective mess. If you like this stuff, fine, happy for you, but go away, it doesn't work for me. Then we have the stillborn that is Windows 10. Windows Big Brother. You don't need to have any control over your $2000 computer. Microsoft knows better than anyone what updates and drivers you want and need to install. Microsoft GTH.
Enthusiasts be damned. Microsoft and PC makers would do better if they catered to us enthusiasts like they have in the past. We're free advertising for them. We can show people what their technology is capable of. We can be cheerleaders for Team Green or Team Red or Team Blue, but it works out for everyone. Now we just get a dumb device that does what it wants, not what we want.
I'm angry. I'm sad. I'm old and crabby. I'm mainly just depressed that a hobby I've had for most of my 42 years is quickly going down the toilet and there's nothing I can do about it. There's nothing really coming out to be excited about. Faster FireWire... ooh. DDR4... oooh. A 3.2% CPU performance increase with lowered TDP... oooh.
-
Much larger SSDs hit the market later this year.
-
Nice rant; sorry, I had to G-rate it a bit. As far as SSDs go, I'm thinking about the new 2TB models, but the Samsungs, I think, still have that Linux blacklist issue for live TRIM. It is true, though, that the enthusiast, once considered the heart of the market, has been getting royally screwed as of late. This is one of the biggest reasons the PC market is down.
Newer, faster, and better used to drive the market, with enthusiasts being the first adopters. Then the Joneses had to have their piece of the pie, and then the Smiths had to one-up them as well, and so on and so on. Now there is nothing left driving that market engine. Just all of this lackluster hardware and software. -
It's true what WingNut is ranting about... the only thing I'm excited about at the moment is the new and faster storage coming out. Aside from that, I'm of course curious about what the rumoured 990M and Pascal will bring to the table. But looking at recent developments, it gets harder and harder for enthusiasts to find their niche... I mean, how much choice do you have left if you want to opt for an upgradeable (i.e. non-soldered), true high-performance laptop these days? There's only Clevo, MSI, and now Alienware with their refreshed AW18 left, that's it! And even put together, you'll only get like a handful of models to choose from...
Sent from my Nexus 5 using Tapatalk -
CPU power is there if you need it; it's called clusters. A "PC" CPU only needs to satisfy personal needs, which I think it does plenty well...
GPUs aren't all that bad: you got the 980M after three generations of essentially the same card (680M, 780M, 880M). That's about a 2.5-year gap if the wiki is right on the launch dates.
LCDs, IMO, are slow. Where is da OLED~?
I think a proper title is
"WTF has gone wrong with PC GAMING/Performance?"
But hey, you don't even have many good games going on.
The PC world isn't all that bad; you have all the convertibles, Windows tablets, etc. -
But yes, I agree. Things have been very "meh" lately. SSDs coming of age in 2010 - 2012 (depending on your space requirements) was big, and 2012 was about the last time GPUs jumped a bunch until Maxwell. Other than that, not a whole lot of reason to upgrade. Sandy Bridge to Skylake... yawn. A nice intermediate screen res that a non-$500 GPU can handle? Not necessarily easy to come by (hello, 2560x1600 27" LCD?). Easily upgradeable laptops? That's why I bought a used 2010 workstation instead of a 2014 model (well, that and the screen).
On the positive side, next year should see 14nm (or is it 16nm?) CPUs from AMD, and 14nm GPUs from AMD and nVIDIA. So that at least should be worth following. Personally I'm hoping AMD does impressively, since as you've mentioned both Intel and nVIDIA are approaching monopoly status, and that certainly isn't good for competition over the long term. Thus far AMD hasn't been close enough to Intel in CPUs for me to buy one of theirs, but in GPUs, where my needs are more modest, that's part of the reason I've been favoring them the past several years (the other being soldergate and nVIDIA's response to it).
Still, even if 14nm does bring things back to how they used to be next year, it certainly hasn't been like the heady days where Moore's Law translated nearly directly to performance as well as the number of transistors.
Would've +repped the post as well as liking it, but apparently you've been the only one posting things worthy of repping lately and I need to spread the love around a bit before +repping again. -
http://www.pcworld.com/article/2032913/the-end-of-moores-law-is-on-the-horizon-says-amd.html
Computer technology needs to evolve beyond silicon. Intel and other companies are milking silicon for $$$ while we transition, so we get low performance gains per generation. The video included in this article is very informative. I suggest viewing it.
Greed is the basis of all evil, my friend. -
IMO, that guy ^ is full of calabasa (<- from Spanish), given his report on what happened a couple of years ago when somebody (cough... who might that be) fired something (that was explained away as an airplane...) from the sea (a sub maybe... lol) off the coast of California.
Anyways,
technology, hmmmm: CPU, I don't need more; GPU, hell yes; and SSDs... of course I want those.
Besides that, I hope Win 7 has been patched enough by now that I can actually upgrade to it. -
Technology is advancing so quickly because these monopolies have so much money to throw at developing it. We are really just bystanders.
Sent from my Nexus 6 using Tapatalk -
Let's see, there are inherent physics limitations on performance that these companies are getting ever closer to in each die shrink; each die shrink is now a lot more difficult, complicated, expensive, and time-consuming than the last; the real advent and flourishing of "good enough" computing; and yes, some corporate laziness and greed. All of these contribute to the slowing of development cycles, the loss of some customization and interchangeability at the high end, and regression toward the mean of computer price and performance.
-
I haven't been excited by a new technology for quite some time now. Where are all of these battery technologies? We see news of a new "revolutionary" battery with each passing day, and we have yet to see one on the market! What about new display technology? A couple of years ago we got IGZO, which replaced a-Si in the TFT backplane, but the underlying technology is still LCD (not to mention that the first IGZO panels were combined with TN panels) with all of its drawbacks. Not really excited about OLED either. MEMS displays, or better yet CNT-FED (half pixels drawings!!!! now we can talk about resolution independence once again), where are they?!
Does it really matter that you'll have to wait a minute more while transferring a large file to your USB drive? I guess for some. Batteries and displays would benefit everyone! Batteries would benefit not only notebooks, but everything that relies on them one way or another - handheld devices, back-up power supplies, hybrid/electric cars, etc. Displays - you are looking at the damn thing for the better half of the day! You MUST demand a better quality one! With those two alone I can swallow the shallow development on the other fronts (especially CPUs).
As for GPUs - I've been waiting forever for multi-GPU setups to mature enough to even be worth considering. All along I saw it as a nice thing, but half of the time it would be ~$500 (or more) in hardware that would just sit there doing nothing (talking about the second GPU and beyond). No optimization whatsoever, and we are moving in the opposite direction - even fewer titles care about multi-GPU support. Sorry, but for me at least, spending even $500 on something that can't be used at 100% is a waste of money (unless it's a back-up, that is, which the secondary GPUs are not). And don't get me started on resolutions and aspect ratios; there's a flame war waiting to happen.
-
Absolutely, battery technology is and has been at a standstill. While CPU power efficiency has improved slightly, they keep adding more power-draining junk like 4K touch screen LCDs. They skimp on batteries in laptops that could accommodate higher-capacity ones too, artificially limiting the available battery life for some strange reason. Heck, charge more for an option with the higher-Wh batteries. That's the other thing: now it seems laptops are tiered. You can't configure them how you want. If you want a 512GB SSD and 8GB of RAM, you have to get a 4K LCD and a top-end i7 as well and pay an extra $1000.
-
And it is not just the gamer here who is suffering from "good enough" computing. The CAD/CAM user, the digital artist, the video editor and photographer, not to mention the audio creator, are also feeling the effects of this. While we are not as wedded to gaming GPUs, we need fast processors, fast storage, ample RAM, and robust CPUs and workstation GPUs with superb cooling. We need options suited to our needs, and most of this emphasis on thin, light, and battery efficiency has affected us too. And the soldered-in BGA debacle does us no favors. And Apple has turned their back on us.
-
One could argue that chemical batteries face much more direct physical limitations than processors, and that increasing energy density is inherently hard. But the general trend toward smaller and smaller batteries really doesn't help here. -
What happened was that sometime in the past decade, we went past the point where the kind of performance increases that are still possible are useful to enough consumers to justify focusing on them at the expense of a laptop's size and weight. If Intel or AMD could make a CPU with double the single-threaded performance, as they used to do every couple of years in days of yore (i.e. before 2008 or so), they'd do it in a heartbeat. However, it's just not possible to run silicon-based chips at clock speeds significantly above 5 GHz without using more power and thus generating more heat than even the biggest laptop can tolerate, and architectural improvements can only do so much. What can be improved is multi-threaded performance, which is why, if you look at GPUs or server CPUs, they're still improving at a pretty decent rate. Everything else is also getting better (e.g. the advent of SSDs) except where improvements are limited either by single-threaded CPU performance or more directly by the properties of silicon (e.g. RAM).
GPUs are in a strange place right now. Part of it is due to the fact that discrete cards have been stuck at 28nm for so long, another part is because the consoles have fixed hardware and drive most graphics-intensive games, and yet another part is the point of diminishing returns for photo-realism. If you look at the graphical improvements in games over, say, 5-year intervals, the difference gets smaller and smaller with time.
-
I want to see the end of HDDs finally here. End the 2.5-inch format, finally. PCIe drives are a big elimination of size and power. -
28nm has been around so long because it's the optimal balance of transistors per dollar.
They're just milking it. -
Batteries definitely have stagnated. While they weren't really the exciting part 15 years ago either, a big leap in batteries could be quite interesting. Hopefully the investment by Tesla and other car manufacturers will lead to increased battery innovation, and in the end better batteries for devices like laptops as well as cars.
-
The next major step is probably the virtual reality stuff and it does look interesting, but it has very little to do with laptops. -
There have been commercial attempts to change this, by IBM, AMD, and ARM. But the PC market has been so successfully controlled that ARM has been effectively prevented from entering the laptop market it would have delivered superior products in. IBM's parallel computing engines, based on resurrected tech from the '70s, which would have been superior on the PC market (never mind the "office" space, as it was with the Power Macs), have suffered the same problem. And AMD's attempts to redefine what a "CPU" is have been blocked by lawsuits from Intel.
So it's not that VR tech and input modes, workspace redefinition, gaming/graphics tech, etc., have failed to show up. Far from it; these are all very old ideas that have been put on the back burner because we lack the actual computing power to do anything with them or develop them further.
For example: it was possible with IBM's CellBE to create, on a "PC board", a real-time algorithm for very advanced occlusion handling of lighting sources. And the only place that materialized was in an experiment at an internal Sony studio, and a few commercially released games. It's extremely old tech, and it's well-known tech. But the industry isn't moving towards real-time generation of graphics on one side, or extremely low-power maintenance of low-intensity graphics contexts on the other. So even though any programmer worth anything knows that explicitly parallel assembly languages, parallel programming, and at worst a well-distributed execution diagram that treats the limitations of x86 and linear programming consciously, are the only way to make any sort of improvement in existing programs -- products are not moving in that direction, and customers are not sold on, as mentioned, lower-power-draw graphics contexts or heavy-duty real-time graphics handling (for example for VR applications).
Instead we're getting these repeated "experiments" limited by existing industry standards, where all of the existing ideas are fielded in immature and amateurish ways that essentially present these concepts to the public as little more than bad gimmicks that will never get anywhere.
Which is what people are commenting in this thread as well. People believe that complex real-time graphics contexts, real-time ray-tracing, etc. on the one hand, and very low-power but sufficiently fast computing engines to drive a PC on the other, belong to some sort of futuristic science fiction reality that might not happen for another 100 years.
While the truth is that the tech to develop this exists right now, but simply cannot succeed commercially as long as Intel exists. This might sound like your usual rant about corporate greed, of course, but be aware that Fairchild Semiconductor International did not develop linear computing cores similar to what Intel is specializing in at the moment. And that Moore (as in "Moore's Law"), in his original article "Cramming more components onto integrated circuits", was not talking about increases in processing speed for a single core, but rather about how many computing engines could be placed on a single chip. Their concept, which held true for the next 15 years, as Moore predicted at the time, was based around collapsing the external bus solutions between larger serially produced chips by combining those functions on one chip, without the bus interface in between.
And even after we actually did achieve semiconductors small enough to collapse the entire Intel bus system into one single chip, many years ago, we've still kept the "Industry Standard Architecture" and only improved the production cost of individual components on that ISA. Meanwhile, the actual computing power we could achieve by "cramming more components onto integrated circuits" has failed to materialize. And that's been the reality for over 20 years now.
Like HTWingNut says, we've had improvements in screens, and I suppose you could add that, with the power-saving functionality after Haswell, Intel has achieved laptop power use that finally matches what the IBM-based PowerMacs had over 10 years ago. Batteries with synthetic electrolytes have also finally developed and forced their way into the market - in spite of the tech having been very obviously superior to organic electrolytes for many, many years (also in production cost, by the way).
But outside of that, development in tech, whether on low-power engines or high-performance engines, has been standing utterly still in the commercial space. For the very simple reasons that 1. customers don't know alternative tech exists, and 2. the "experts" in software development and hardware sales are willing to stick to the existing course the industry is on.
Even when there is potentially a more lucrative market to be opened up. Because the existing market is still sustainable, even if it has to be maintained by relatively expensive strong-arm commercial strategies that do not benefit us as customers.
But be aware that if people refused to buy the existing products every year, and demanded that laptop makers make quality products that last, with batteries, processing engines, screens, etc. that are specified - in practice, not just in the sales pitch - to their requirements, this would actually change overnight. Because the tech to develop these products really does exist. -
tilleroftheearth Wisdom listens quietly...
Nipsen, well said.
But there are two requirements for mere consumers to have a choice.
1) The existence of choice.
2) The affordability of that choice vs. the alternatives.
Without those two requirements, we are simply where we are today.
We cannot truly blame Intel, MS or any other entity for making the best choices for themselves and their shareholders (go ahead if you own majority shares).
We can only say 'what if' to the ideas of the people and companies that tried to make a difference but ultimately could not.
But what we must realize is that the reason people are using the offerings as-is is that the products offered are checking off the right boxes for most. That may not be you or me. But it is the reality.
I pick and choose among new offerings and make them bend to fit me. I also show others why accepting most offerings is not in their best long-term interest either.
That is the final choice we have in the long run. And I participate in it fully - and I try not to cut off my nose to spite my face - even when the offerings are downright banal (yeah; I tried to like the fruity company once or twice...).
-
Throughout history, brilliant people have not only provided solutions to things they themselves considered to be problems - they also managed to get the masses to agree with them that there is a problem, and that this fancy new invention will solve this problem that they never knew they had until now.
Sadly, with so many people content with the performance of their PC, the pool of potential innovators is small to begin with, and even if one of those innovators comes up with something that revolutionizes computing performance, there are likewise few sympathetic ears (and wallets) who could help them get their idea off the ground...
I mean, think about it - all the technology you hear about nowadays revolves around computing on the go, whether that be in the form of wireless, wearables, cloud subscription models, or whatever. The Wintel space is hardly even talked about anymore - as far as most people are concerned, the idea of upgrading their computer is about as exciting as the idea of upgrading their fridge or their washing machine. -
But yeah... commoditization has more or less arrived. And part of it is that, for most people, the performance of their PC really is adequate for what they want to do. I can do the tasks a non-technical person would do - browse the web, video chat, watch a DVD, maybe write a letter in Word or update a financial spreadsheet in Excel - just fine on my 2007 laptop. Sure, the hard drive is slow by today's standards, and the battery life with its 4.5-year-old battery sucks, but there'd be little reason to upgrade to anything newer if I were a casual user and bought a replacement battery every few years. Live with the not-as-quiet-as-modern fridge and call a repair dude when it stops cooling effectively; accept a couple of minutes to boot on an old hard drive and buy a battery every 3-4 years. I could see those being pretty much equivalent to someone who's not really that into computers.
You're right about being able to convince people that they had a problem, too. That's part of what Jobs and Apple were amazingly good at in the 2000s. First with the iPod, then the iPhone, and then the iPad. There were mp3 players, smartphones, and tablets beforehand, but most people didn't think they needed one. Apple convinced much larger numbers of people that they had a problem their product could solve. Though BlackBerry had previously done the same thing with smartphones for a different audience, and various PC companies had done it in the past to make the PC more mainstream. Lately though, nothing in the PC space has really jumped out as solving a problem for lots of people. The Surface and hybrids like it are a nice niche, but there haven't been tens of millions of people deciding the Surface will solve one of their problems. SSDs are nice and snappy, but most non-technical people I know with hard drives instead of SSDs don't consider themselves to have a problem with how fast their computer boots and starts programs. Greater-than-HD screens are nice, but again most people don't have a problem with even low-res (1366x768) screens; it hasn't caught on like standard-def to HD TV did.
Now, fitness bands could fit into the solving-a-problem-you-didn't-know-you-had category. It's still young, and if it does take off, I think it will be more along the lines of BlackBerry smartphones than the iPhone in uptake. But between that and related products like the Fitbit, I think they are finding an audience of people who hear about them, think that maybe fitness is a problem for them and the product might help, and buy one. Again though, that's in the wearables category, not Wintel (though the Microsoft Band is somewhat Windows and is among the more popular ones).
ARM has one big issue out there: available software that runs on the platform. Think Surface RT.
-
-
Naturally, you can't expect everyone to be as excited about their hardware as we are here.
Cars, for example, are appliances to me, whereas others here would surely and passionately disagree with that notion.
Still, there's a niche for those performance car lovers, so why not for us mobile computing lovers as well...?
Sent from my Nexus 5 using Tapatalk -
But... let me give you a software engineer's perspective on this. Customers and sales people tend to be on the same page about how it's incredibly important for the entire industry to use the same tools and the same platform, and that portability in general is nice as long as the software is created inside a closed ecosystem owned by a single company. The same often goes for hardware - the idea is that if a company owns the whole ecosystem, it can make changes to the package in ways that are more fundamental and more to specification than if it used hardware that was more general. Software engineers often argue the same thing, saying that software is typically more agile and adaptable if the whole package is developed from scratch in-house, since this affords the company the option to make changes to the software it otherwise wouldn't be able to. One of the professors at my school has written teaching material that fields this as a well-known premise for a sound software engineering philosophy, and presents open-source development as a counterweight to it, with the drawback that it will be software developed on foundations that are less "adaptable". And that's how the industry talks about product development. You recognize the narrative for Intel products just as for Apple products - the idea that if you own the platform, you can make changes to it that are unavailable to third-party product developers.
While the truth is that making these changes to an existing platform is extremely expensive and cumbersome. So any company that actually does develop a platform from scratch will invariably be reluctant to make any changes to it, and will also be reluctant to consider suggesting that anything should be changed at all. In fact, their software foundation is, after all, carefully developed to be the best product on the market, and for all you know it is. So why should any changes be made to it? Really, what would mere customers know about software development? Perhaps that suggestion someone made isn't necessary, or even possible on "current hardware" -- and that's where the ball stops rolling. Meanwhile, very small and incremental changes are made to an existing platform over time, and only if they can renew the product enough to be able to sell another similar product as before. Basically, the company will automatically attempt to protect its own expensively developed product, and is reluctant to actually make the changes to it that would enable it to sell new products.
At the same time, the demand for new functionality fails to show up. Because your customers, who don't know what you're doing, will be easy to sell on small tune-ups, since these small changes are also expensive - after all, any change requires a huge rework. And so we get this loop of small changes that are sold as wide-ranging and groundbreaking, but still fail to actually renew the software foundation. This is where we get internally developed pseudo-standards for database code, for example, specifically developed to work inside a particular software foundation. Where we know as software engineers that this code is never going to be reusable, but we still sell the idea that we're reusing that extremely expensive platform. Meanwhile the actual brunt of the software development takes place inside a very limited scope, which still requires a fairly high degree of retreading as the code is developed. Outsourcing programming jobs to external and cheaper software developers by creating abstractions inside a software foundation is basically where most software development takes place now. And while the actual amount of work being done is quite large, as I said, it retreads fundamental building blocks to a very high degree - even though software is typically developed on the same hardware over and over, and into the same systemic limitations, over and over.
So the idea that a closed system is more agile is actually false. In practice, but also in theory, when you consider that the abstraction set initially is typically based around a known platform or existing hardware, and a framework for the software foundation that is established very early in the process - typically by probably very skilled people, but people who develop a prototype with limited functionality for a specific task.
On top of that come all the smartasses fresh out of school, full of ideas and experience with different kinds of systems. And we have to work inside these extremely limited branches, with very specific tasks that very often have to be re-implemented to fit small changes to the existing foundation. Or otherwise there's a demand to implement what shouldn't be done at that abstraction level, and we're essentially being told to spend a very long time (and a lot of a company's money) on implementing solutions that - if implemented well enough to survive instead of being thrown out - will enforce yet another layer of specification that limits the foundation's "agility". Meanwhile, the people with the wallets are interested in things that they believe they can sell to customers, which - obviously - isn't super-efficient database code or graphical user-interface implementations with guaranteed execution times, as opposed to a button that animates with even more bling than before, etc.
The entire thing takes on epic proportions when Apple people, for example, introduce something they call "visual science", creating an entire scientific discipline out of the need to have UI response faster than what a human brain will notice as a pause - in order to actually insert that concept into the software foundation development process. That's just the ultimate example of the intellectual bankruptcy that dominates commercial software development now, where we sell a button with guaranteed response time for the visual component as a brand new scientific discipline on its own. After which a single company ends up "owning" responsive UI as a concept on the market. Not because it hasn't been done before, or afterwards, but because it's part of the sales package, and because Apple is the only company that treats this concept consciously. To the point where, as people find out, Apple products are often very comfortable to use, and where Apple products actually do create demands for hardware developers as well. But the market demand isn't for the software solution, but for the software foundation package as a whole. As if Apple owns (or owned) immediate response after key input, or a scheduler that requires response to input and visual feedback to be predictable regardless of system load.
But this is something that keeps on creating these small gimmicks for renewal in software foundations, while hardware changes tend to be associated with monstrously expensive re-implementations of existing code, or otherwise simply throwing out all the code and starting over. Meanwhile the most interesting software development happens on the web or in the mobile space, where the development trails are shorter. And Surface RT and that Microsoft trail is 100% about forcing an existing software foundation on top of ARM, to "allow" MS software to be made available for other platforms. Something that doesn't necessarily get us to a point where we can make use of that platform to improve such things as UI response and normalized processor load. In fact, the opposite will happen. And this is in spite of the fact that the RT foundation is actually a pretty fleet system with fairly solid design choices.
All I'm saying is that if there actually were demand for fairly specific and very common problems to be sorted out - from how data is abstracted and shown to the user, to UI response, to how the system functions under normal workloads - and customers were discerning when it comes to battery life and the actual performance of applications (instead of interested in the sales pitch for the latest gimmick), then the market would adapt to this very quickly. Because hardware has become so cheap to develop, and there are so many platforms that theoretically can be used to deploy these solutions - while so many programmers are extremely conscious of these problems. So there's really nothing but a lack of specific user demand that stops a more holistic approach to hardware and software development from being adopted as an "industry standard".
But until that happens, we'll continue to see very solid platforms being developed, along with very complicated and heavy software foundations - that then find absolutely no use or demand in the commercial space; people think what they have is good enough, or that it's the best you can actually get. It's a bit frustrating, to be honest. -
-
And the manufacturers followed suit by slashing both their profit margins and QC beyond the bare minimum, effectively cutting off the branch they were sitting on. I can think of only two brands - Apple and Panasonic - that had management smart enough to resist the temptation. -
Of course, this source of additional demand is completely absent in the PC space, for obvious reasons - think about how society perceives someone who owns a Ferrari, and then think about the stereotypes surrounding someone who owns a big gaming rig. Big difference, no?
-
In the case of the company I worked at, they did believe they needed to move their platform, since the third-party platform it was built on was no longer updated. But indeed, they were building their own internal platform (albeit based on a more open-source - though not entirely - base) when I left, and thus their own internal standards. The standards for their old platform were indeed nice for internal development - and one enterprising developer even used them as a base to develop an internal development tool that became so popular that customers asked for it after seeing it - but it also meant that they had to train new developers on their internal frameworks after hiring them, since unlike Apple they weren't selling to consumers but to businesses, and thus the chance of a new hire being familiar with their methodology was quite low. To their credit, they did do a good job with their training. They've also taken steps to become less dependent on any one database vendor or hardware vendor, but the latter at least is somewhat a response to the market moving away from Unix servers.
They aren't necessarily the best parallel, though. As far as I know they're still considered the market leader in terms of features and user experience, they're still growing rapidly but not near the market dominance of Intel or Apple in mp3 players, and they don't outsource anything. That may be because the CEO was also the original programmer at the company, and the leadership is more techie-heavy than businessperson-heavy. At any rate, it was an interesting mix of having an internal platform, yet also being aware that doubling down too much on the old core wouldn't work in the long term.
As a user, I'm sympathetic to Microsoft's moving-the-platform-to-ARM idea rather than building from the ground up, since while it can hold things back, compatibility is nice. I'd actually be more willing to adopt it if they had x86 emulation on ARM, even though that obviously wouldn't be as efficient - the classic problem of entirely restarting is that you have to gain the customers anew as well. Intel ran into that very problem when they designed the i860 and the Itanium, so it isn't really even fair to say they didn't try to move to something better than x86 - the market just didn't move with them. Essentially then, I'm saying Apple did the smart thing by including fairly good backwards compatibility when moving from 68K to PowerPC, and then PowerPC to Intel, and only dropping that backwards compatibility years later. Microsoft may have better luck moving to ARM long term if they did similarly, though obviously there are other considerations at play as well.
Though I personally disagree with the most exciting software development being on the web/mobile. I'm in software development for the algorithms and the logic, and you usually don't get as much of that on the web or mobile (except perhaps the backend of a web application). The glory may be in web/mobile these days, but I'd still consider most of the interesting work to be on the back end.
And since when has Apple been great at responsive user interfaces? Not arguing that they aren't saying that and talking about "visual science" and whatnot, but I always found their animation-heavy UI to be rather unresponsive, since it was often waiting for some animation to finish, whereas in operating systems like Windows 95 (happy 20th birthday!) things tend to happen nice and quickly with no unnecessary delay for animations. -
-
-
I mean, I hate everything Apple now, but the point was that they had to have this grand epic scheme that people could talk to sales people about with tears in their eyes before it actually was worth it to get fairly simple things implemented. And even then there's no real push to, for example, get Intel to design a bus that has an option for making IO work in a more predictable way. We're seeing things like PCIe, and the rework in PCIe 4.0 - having IO scheduling controlled by high-level programming input - is promising, and would make it possible to design the entire bus differently from the OS. But you really haven't seen any kind of suggestion like "hey, wouldn't it be nice if we could push the entire graphical UI functionality into the graphics array, and then design the UI in such a way that we never have lag from random idiotic HDD and memory IO that the users shouldn't need to wait for", is all I'm saying. And having that drive 3D UI development towards a point where it's not just an independent overlay, but where the functions underneath are actually designed to fit with a pause-less UI...
Isn't it a bit weird when a PC has to run at 2 GHz to /mostly/ avoid feedback blips? Because it's definitely not necessary to run a PC that fast to have guaranteed response time towards the bus. In the same way, unpredictable loads can still cause blips in the right circumstances almost no matter how fast a processor runs, when you don't plan IO well.
-
I'm going to cherish the i7-4900MQ in my beloved MSI GT60 as long as I can. It is the end of an era. Goodbye, socketed mobile CPUs.
It's probably only a matter of time before Intel removes this option for desktops as well.
Thank goodness Clevo still has the option for desktop CPUs in their notebooks; if only MSI and other enthusiast notebook manufacturers could do the same.
To heck with these soldered and throttled HQ processors. -
IBM had a *lot* on their plate - more than any other PC manufacturer - and when the production (of ThinkPads at least, can't say much about the other lines) was moved to China in late 2003 in order to minimize costs, QC issues started piling up big time. When the lead-free-solder regulations hit and turned their premium range (T4x series) into a BGA-related disaster, the writing was on the wall. Their exit was a smart and seemingly dignified move nevertheless, at least IMO. -
-
The story behind moving the production to China is a bit more complicated... As for the warranty support, IBM still runs parts of it both in the U.S. and abroad.
But - at the end of the day - IBM left the PC business a decade ago and is still being talked about... there's something to be said about that state of affairs, IMO. -
IBM still has research and development projects and offices around the planet, as they always have. But their laptop line and ThinkPad brand is basically Chinese, or owned by Lenovo. Not sure that on its own would make any changes to the actual hardware used in the laptops, though... but if you were looking for weird and crazy design inventions that never made any money, that part might have been lost with the brand sale.
But yeah... the thing about structuring the entire business around providing solutions on request from selected customers, instead of developing solutions on their own - that was kind of a bummer. The thing is that IBM has still had a hand in the most interesting development platforms made in the last 30 years: CellBE, the memory architecture with the EIB ("Element Interconnect Bus") with asynchronous reads and writes, the FlexIO interface and design at Rambus, etc. This sort of thing wouldn't have happened without IBM.
Still - didn't make them any money. Just to put the other things about market demand and what is assumed to be possible to sell, etc., in context. It is the case that brilliant solutions like these are impossible to market alongside Intel hardware, simple as that.
And then we get a very specific and specialized product that dominates the entire market, with the effect that has on software development. -
-
And that, my friend, is the story of how they became the world's largest PC vendor...
BTW, IIRC it didn't plummet right after switching to Lenovo. It took a while before Lenovo realized "oh, wait a second, we shouldn't be doing this". -
-
You mean by being a domestic, partially-government-owned company having unrestricted access to the world's most populous - yet anything but free - market? -
^
hehe
So you could perhaps say that ARM, PowerPC, and so on are natural candidates to inherit that role now that Intel is unable to produce improvements on their own platform. That is, without moving into the "combined processing element cores" AMD talked about before they were shut down by lawsuit. Which should be entertaining if Intel ends up receiving injunctions from themselves, or something like that, even if they "acquire" the tech themselves. But the thing is that the way the computer industry has developed simply means there is very little opportunity to compete on hardware or software, or on specific solutions of different kinds. Hardware has a fairly high entry cost, and changes here are only going to turn up if you can shave off costs significantly and still sell at the same price as before. That's not going to happen. Software is the same way - people don't want to pay for specific solutions or specially developed environments (or stuff like Lotus - it's not competitive when you can get similar solutions on the web, or developed externally for three grains of rice). So that we end up with very similar products on the same platform isn't exactly a surprise. And IBM, or anyone developing tech that actually has a future, isn't going to move into that market with a cost-effective asynchronous bus solution that can be fitted onto any Intel platform.
What would have to happen first is that customer demands become different. And that has to do with simple things like "I want my office computer to run at the same processing requirement for 9 hours without fail, regardless of workload". Or "I want a 3D interface in my overlay for a console/tablet/laptop to run with a predictable processor load without impacting the rest of the software - how do I get that to happen?". Or, "I want my videos to play back and stream in HD without the processor running warm".
But as long as we're impressed with synthetic benchmarks and microcode optimisations that are... completely bloody useless in real applications, that's not going to be an element in any sales package at any hardware or software manufacturer. It's as simple as that. In the meantime, when I say things like "well, you know, if we had several full-function computing cores that could access working memory asynchronously, we could program a desktop environment or a phone that actually could have guaranteed execution time for essential services, without halting working threads. Also, I could create occlusion detection algorithms that could predict the shader cost with 100% accuracy, reduce the hardware requirements for current applications to a fraction, as well as create new real-time effects that you don't believe will run on a current computer", people have no idea what I'm talking about, or how any of that relates to hardware or software in any way. It's like saying "hey - you know, if we put a log under that huge stone slab, we could roll it uphill" to someone perfectly happy with cracking the stone into pieces and carrying it off in a knapsack. -
Until it runs real software and handles real workflows, ARM is not it - basically Windows RT as it is now, until the developers step up. PowerPC is a no-go; Apple had to leave that ship when there was no G5 that was cool and battery-efficient enough for mobile use. The G4 platform was the last usable PowerPC platform that was viable in laptops, and the PowerPC consortium could not deliver a G5 solution workable for mobile use, due to the heat and power consumption issues. Which is why Macs switched to Intel. History here.
-
That's one interpretation, at least. Another interpretation is that Intel sold promises about future processor improvements at lower power drain quite well, while putting so much money into the project that people started believing three things:
1. That increases in processor power on x86 are unlimited, and only dependent on production density.
2. That processor clock is essential to all operation of a computer.
3. That software developed on x86 is more open and less subject to restrictions by the manufacturer.
Some of this is true, and IBM's attitude towards proprietary formats is well known. This was also a huge issue with the CellBE - no one will develop software for something that is going to be tied into a specific company, and be reliant on a certain profit afterwards. That's so old fashioned it probably never worked for anyone except Microsoft and IBM... *cough* ...anyway.
The rest was, and still is, pure invention - by people at Intel, but also by people who develop on x86 in different areas and use tools that are very specifically dependent on low-level optimisations for that instruction set. In practice, we're actually struggling a lot with optimisation on x86, and have been since SSE2, because of the increased amount of microcode optimisation. But people keep believing things such as: as long as you buy the proper compiler, things run perfectly the first time, and so on. They don't. You often run into things with timing and memory handling that you can't predict, and that shouldn't happen logically if you base your expectations on the platform specification. It's simple things like syntax form - the compiler favouring one syntax over another.
And I'm not saying that problems like that would immediately disappear if we only had some explicit parallel assembly language in the first interpretation layer, so we could target that across different hardware by different developers, or anything like that. But the same problems still exist on x86, and they always will - in increasing amounts as well, as we rely on microcode optimisation as much as we do now.
Meanwhile, IBM didn't sell PowerPC in the same way, and very few companies had the ability to woo people with a long-term plan. So Intel won by default, and it's not unfair to say that IBM can thank themselves for that as well. We saw it again in the way the bidding process went with the PS3/Xbox 360, and towards the next "generation" afterwards, to take something less obscure - we all know that one kit has more potential from a hardware perspective. But from a marketing and sales perspective, one of the partners has a more comprehensive solution that creates an expectation with the current software partners. One of them is offering a kit that is incredible in every way, but where everything developed on it can be thrown in the bin (at least physically) at the end of the period. The other is offering something that will last in terms of sales, and that ensures (or at least promises to ensure) that partners and software developers can persist with the same development process on the next kit later on.
That's worth money. And that's how a technically sketchy solution (obviously, Intel isn't a power-efficient platform, even if it employs clock schemes that make the power-drain passable when you use the computer "normally", i.e., let it idle for most of the time) ends up beating what is demonstrably a kit with better performance on everything that counts, along with offering potential for miniaturization later on that would make any engineer instantly think about VR and things like having your office-computer inside your pen in your pocket just streaming the image to the nearest terminal, etc.
But it doesn't make money now. So when you have the choice between, on the one hand, making billions of dollars, and then more billions of dollars, at extremely little cost every time you release a new version of the hardware, and on the other hand having to invest in something and then maybe earn the same millions back later on, except you've already spent all of them on research -- what would anyone choose?
As I said, whether you're just a peon with book-smarts like me, or a board director, or an investor - we know that people will buy whatever crap we put in front of their faces. It never fails. And a cynic would say that that's exactly why nothing interesting happens in software and hardware lately. We have to develop something new, and it has to happen now, and there has to be a predictable curve and a goal above the blackboard that we'll never reach. So would you really put the whole company into just spending masses of dollars, if you don't have to? Even smaller side projects become a way to sabotage the industry if they don't fit into the overall picture.
Just imagine someone developing a brilliant solution in car engines... you know, like a triangular rotary cylinder, to essentially drop maintenance costs and increase fuel efficiency. Or a car chassis that doesn't rust and need to be replaced. It's exactly the same thing: you don't let that happen as long as customers buy what makes you more money. It's the simplest and most predictable logic there is. -
Re: PowerPC mobile. The G4 at 1.4 GHz was as fast as laptop solutions got when Apple offered PowerPC. Apple was forced to change to Intel due to the speed issue and the fact that faster mobile solutions and higher clock speeds were not forthcoming. Intel was necessary to keep Apple competitive in the laptop game - and this was Core Solo and Core 2 at that, never mind later developments. Until there is a PowerPC effort to keep up, it's ancient history for laptop users.
-
John Ratsey Moderately inquisitive Super Moderator
And back to the subject of computers: whatever the basic processor architecture, power consumption tends to be roughly proportional to the square of the speed, as the voltage needs to be increased to ensure that the electrons behave themselves when expected to move faster. Hence, more cores running slower is more efficient, provided the workload can be programmed to use all the cores.
John
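To put rough numbers on John's cores-versus-clock point, here is a quick back-of-the-envelope sketch (not from the thread; it assumes the textbook CMOS dynamic-power relation P ≈ C·V²·f plus the simplification that supply voltage has to rise roughly linearly with clock speed, and the constants are made up purely for illustration):

Code:
# Back-of-the-envelope sketch of the cores-vs-clock trade-off described above.
# Assumes the textbook CMOS dynamic-power relation P ~ C * V^2 * f and the
# simplification that supply voltage scales linearly with clock speed.
# The capacitance constant and volts-per-GHz figure are illustrative only.

def dynamic_power(freq_ghz, c=1.0, volts_per_ghz=0.5):
    """Relative dynamic power of one core at the given clock (arbitrary units)."""
    voltage = volts_per_ghz * freq_ghz      # assumption: V rises with f
    return c * voltage ** 2 * freq_ghz      # P ~ C * V^2 * f

one_core_4ghz = dynamic_power(4.0)          # one core at 4 GHz
two_cores_2ghz = 2 * dynamic_power(2.0)     # two cores at 2 GHz, same nominal throughput

print(f"1 x 4 GHz: {one_core_4ghz:.0f} units")
print(f"2 x 2 GHz: {two_cores_2ghz:.0f} units")
# Under these idealized assumptions, the two slower cores do the same nominal
# work for roughly a quarter of the power - but only if the workload can
# actually keep both cores busy.

The exact exponent depends on how aggressively the voltage has to scale, but the direction of the trade-off is the point: the parallel option only wins when the software can actually use all the cores.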