So during 2015 I'm going to be in the market for a new (15.6" gaming) laptop to replace my (excellent, but underpowered) Envy 14... and it got me thinking: what sort of 'tech' or components are out, due out, or most anticipated for 2015?
I'm almost certain that Intel/AMD/Nvidia will have a new range of CPUs and GPUs out (don't they always? haha), but what about other features and hardware? We're getting QHD screens in our phones now, so will that trickle into notebooks too? I'm still using 1366x768, so an upgrade in that department is very much needed!
Batteries/battery life is another area where all our devices could do with a bit of improvement, be it charging time or capacity... and I'm sure the list can go on and on: fanless ultrabooks, smaller and lighter chassis...
tl;dr... If you're buying a laptop in 2015, what sort of hardware should/will you be looking out for?
-
tilleroftheearth Wisdom listens quietly...
Buy sometime in 2015 with no 'need' to buy?
Broadwell platform, if not Skylake. DDR4. Period.
Everything else is not essential. -
Look out for better screens: laptop screens higher than 1080p, and laptop screens using IPS panels.
Everything else is incremental. CPU speed, GPU speed, SSD speed, HDD capacity, battery life... all of those will get slightly better with time (as always).
But the major jump to look out for is for laptop manufacturers to stop putting shoddy low-res (1366x768) panels and/or crappy TN-based panels in laptops to try and save a few dollars. In this day and age, I wouldn't buy a laptop if it didn't have an IPS screen. -
Thanks very much for your replies, guys. It's appreciated!
Higher-resolution panels seem to be climbing my personal list of requirements. This Envy, for instance, is 4 years old now (I think) and, like I say, it shipped with a low-res screen, which is disappointing... but being a 'gamer', it's a never-ending struggle balancing high resolution against a GPU capable of delivering the frames to run games natively haha
So, all in all, is 2015 going to be quite a quiet year for notebook technology? Would I be just as wise to go out tomorrow and buy my P505 from MySN as I would in, say, 6-8 months, if they were to have a refresh of some sort? (Don't worry, I know nobody is Mystic Meg and can predict exactly what Clevo will produce next) -
DDR4 won't be used in mobile till Skylake-H, which will be like late 2016 at this rate... so no point waiting for that. Honestly, as far as going for the P505, I'd suggest getting the U505 Batman instead, due to the i7-4790K desktop processor, 980M MXM GPU, up to 32GB RAM, and two 2.5" slots... you'll be able to upgrade the CPU and GPU. Don't get the Clevo P771ZM from MySN as the price is a rip-off; Scan sells it for cheaper. Graphite LG1521 it is...
-
tilleroftheearth Wisdom listens quietly...
Would it be just as wise? No. Never. Not when two new platforms have launched/are launching this year. -
I don't get it; at any point in time, there will be something to wait for.
-
-
HBM GPU from AMD
That's the only thing I can think of waiting for in 2015 before buying new hardware.
Looks like Broadwell on its own won't really provide any significant performance leaps or, for that matter, better heat management (though that remains to be seen). -
tilleroftheearth Wisdom listens quietly...
One Broadwell platform tested on Anandtech shows ~10% improvement over a comparable Haswell platform.
See:
http://www.anandtech.com/show/8941/gigabyte-gbbxi7h5500-broadwell-brix-review
-
Depends on the workload.
Plus, those numbers are for ULV models, and we still don't have a good idea about potential thermal issues. -
Starlight5 Yes, I'm a cat. What else is there to say, really?
Surprisingly, I hope they won't dump 1366x768 completely, at least on ultraportables. My eyesight is not perfect, and scaling in Windows and Android, both of which I use quite a lot, is even worse. So no thanks; I'd rather hook up an external monitor for productivity than get eye strain.
-
However, there are some upgrades that can make the difference between a system that is rendered obsolete in short order and one that remains useful for many years to come.
Ports are a prime example of this. The transition from USB 2.0 to 3.0 was a huge update, particularly for external flash storage. Screens, on the other hand, add convenience, but won't typically stop you from doing your work.
As for CPUs, they typically only advance incrementally, although quad-core was a significant advancement. GPUs, on the other hand, can have a much larger impact on your long-term computer use and enjoyment, so make sure you choose the best one for the long haul. With 4K just on the horizon, I think GPUs are what's in the biggest flux at the moment.
In fact, that's what rendered my HDX immediately obsolete. Its antiquated 512 MB DDR3 GPU just couldn't handle Adobe's Mercury Playback Engine with GPU acceleration. Since applications such as this comprise my primary computer use, I was forced to upgrade even though the rest of my system was passable. These are the kinds of things you really do need to keep in mind and watch out for. -
I'd love to get my hands on the Aorus X5.
15.6" machine with up to a 4K screen and SLI GTX 965Ms. What's not to love? -
tilleroftheearth Wisdom listens quietly...
A gen 4 platform. -
Just sayin'. I see the argument, but it would be a better point if the system supported such power saving.
-
Going probably just a little bit off topic here, but does anybody remember a movie scene or a sketch where a guy hears about an amazing desktop PC in a commercial and goes to buy it, and by the time he arrives home he hears another commercial that calls the model he just bought garbage and introduces the next-generation super-wow model, so he returns to the shop and buys the new model, and this happens one or two more times?
I cannot even remember if it is from a movie or some funny sketch from a TV show; any help appreciated. -
-
But yeah, if it was a 4K model, I can definitely understand wanting more than just 2GB. -
tilleroftheearth Wisdom listens quietly...
If Optimus doesn't work with today's platforms, that's a hint that it's outdated too. -
-
Name one laptop with dual GPUs, AMD or Nvidia, that actively supports either Enduro or Optimus. -
tilleroftheearth Wisdom listens quietly...
Dual GPUs and either Enduro or Optimus support? I don't care about that and never have.
An Intel CPU and iGPU are all that is needed for my high-performance computing needs. I also don't whine about features I can't buy or used to have once upon a time. I can use what is available, and I always test anything new against what I have to see if the progress offered is in line with what it will cost me to switch/upgrade.
Clevo-class notebooks don't draw me in either. But if they've dropped support for a marketing checkbox that was offered in the dim past... yeah, smell the future of Optimus: it's dying/dead and outdated (or will be soon). And if I'm wrong, well, throw a party. But if that is what the options are right now, whining won't make that change. -
The recent move from Nvidia 8xxM GPUs to 9xxM GPUs was a pretty massive step forward for dedicated mobile GPUs: 40% or more greater speed, whilst also running cooler and at lower power.
-
You can't sit there and claim something is stupid when you've never attempted or cared about anything related to it. That's like me complaining that I don't understand the point of a sports car because it doesn't get good gas mileage... and yet I don't like driving fast; you can't hate on a dual-GPU laptop when you don't even care to immerse yourself in gaming of any sort! Buying a high-performance anything generally means you have to sacrifice something else, like an efficient fuel system or, more to the point, good battery life. I run into this all the time with desktop users. They jab at how horrid the battery life is on all these super high-end laptops... completely missing the POINT of a gaming laptop. It's not there to be used for surfing the web or doing word processing tasks for hours on end. It's there to play games, and the battery life of most of these DTR machines is definitely lacking. However, again, that's not what they're built for. The battery of a big laptop like that is a glorified UPS.
So claiming, in summary, "I don't use it, therefore it's unnecessary in every case and a waste for the platform" is ignorant, biased and immature, especially so on this forum. If you go to any of the boutique brand sub-forums (MSI, Asus, Acer, Gigabyte, Alienware, Toshiba, etc.), I think you'll find you're quite wrong about how it's a waste to use a laptop in such a manner. It's the ultimate design for a slim/small form factor, all-in-one PC (model dependent) that will probably never be replaced, or if it is, it'll be more of a merging of multiple devices rather than a true replacement. -
tilleroftheearth Wisdom listens quietly...
Okay, you win (at missing the point). -
Don't argue with trolls.
-
Uh, G-Sync? Will that be a thing this year? Considering that a single-GPU solution will rarely get games running at a steady 60 FPS at 1080p, that would be a cool feature to have.
I can just never justify an SLI purchase (mainly for price/paranoia that there will be that one game that I REALLY want to play that doesn't support it until ages after launch) and this would be a sweet spot for me. -
G-Sync, or AMD's equivalent of it, could be a major feature. In a way I have more hope for AMD's solution, as it looks like it could work with most monitors with a firmware update, as opposed to requiring an additional chip. While I doubt many existing monitors would get it, it likely would be both easier and cheaper to add support for it to new monitors. And ultimately, for me I'd narrow down my monitor choice by display characteristics first, and G-Sync/AMD's equivalent only after determining I'd be happy with the picture.
I agree that most things will be incremental, including gradually better screens. I also bemoan the great number of 1366x768 monitors, although I can see having them in, say, 12" screen laptops. As someone who no longer needs the best performance money can buy but is somewhat picky about screens, I look forward to the day when I can find a laptop whose specs and price point fit what I'm looking for, but not have the potential purchase canceled due to it only being offered with a low-quality, 1366x768 glossy TN display on a 15" laptop. Which unfortunately was a recurring theme of 2013 and 2014 for me.
4K on a laptop though, I'm skeptical about. I might be convinced after using one, but for now the pixel density seems way too high, and Windows' scaling is not that great. A few weeks ago I was at a friend's place, and he'd recently acquired a 1920x1080, 13" Windows 8 laptop. At 125% scaling, the desktop looked nice and crisp, but he was complaining that Device Manager was blurry and ugly - and it was. The recommended solution? Disable scaling. Which doesn't work very well when 100% scaling is too small to read easily. I reckon 4K would have similar problems. I might like it on a 17" laptop where I could just use 200% scaling and have the non-DPI-aware programs still look good, but at this point I'd still prefer a screen with closer to 120 DPI over a 4K one, even at the same price.
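(Slight tangent for the programmers in the thread: the blurriness comes from programs that never opt into DPI awareness, so Windows renders them at 96 DPI and bitmap-stretches the window at 125%/150%. Below is a rough sketch in Python of what that opt-in looks like; the shcore/user32 calls are the standard Windows ones, but treat the rest as illustration only, and it obviously can't fix Device Manager itself.)

```python
# Rough sketch: a tiny Tkinter app that declares DPI awareness on Windows so
# it is handed real pixels instead of being bitmap-stretched (and blurred).
import ctypes
import sys
import tkinter as tk

if sys.platform == "win32":
    try:
        # Windows 8.1+: 2 = per-monitor DPI awareness (shcore.dll)
        ctypes.windll.shcore.SetProcessDpiAwareness(2)
    except (AttributeError, OSError):
        # Older fallback (Vista and up): system-wide DPI awareness (user32.dll)
        ctypes.windll.user32.SetProcessDPIAware()

root = tk.Tk()
# Once awareness is declared, the app must scale its own UI; Tk reports its
# current pixels-per-point factor via the 'tk scaling' command.
tk.Label(root, text="tk scaling factor: %s" % root.tk.call("tk", "scaling")).pack(padx=20, pady=20)
root.mainloop()
```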
(And yes, I do have somewhat nerdy friends.) -
PCIe SSDs are, I think, worth waiting for, along with USB 3.1 (don't know the date); look for a socketed processor and you will probably be safe.
-
Erm... might be slightly off topic, but since it was mentioned in this topic (and since I am currently looking for a new laptop), when is Broadwell really going to be released, and when will manufacturers start updating their products? I can't find a definitive answer on the web.
-
120-144Hz screens... they make such a major difference in FPS games.
-
Unfortunately, high refresh rates and IPS don't mix, unless you've got a non-Optimus machine with a good panel that can overclock. 120/144Hz laptop screens are in the 'endangered/soon-to-be-extinct' category rather than the 'to look out for' category in 2015.
-
USB 3.1 and Skylake are what I'd be looking for in a future laptop... and I don't see that realistically until 2016. Broadwell isn't a big jump, if it even materializes in full-power laptop form.
Maxwell was a huge step forward in GPUs, much more so than a Broadwell update will provide on the CPU side. If you are wanting a new machine in 2015, I really don't think now is a bad time to buy... just know that Broadwell will offer a line of refreshes but not a big performance improvement. -
Be on the lookout for new Intel quad-core processors that run at 400MHz with a 500MHz turbo boost, 1W TDP, no heatsink, and suffer from thermal throttling issues.
Don't miss the new quad-SLI GTX 910M!
-
I predict(!) that the next couple of years will be mostly uneventful. Tech exists that would allow for thin plate batteries with twice the charge of the "standard". But we're not getting it, because the production lines are still pumping out lithium-ion batteries in huge heaps. They're cheap, and people still buy them, one way or the other. It has got to the point that some manufacturers have started to improve the solution inside the batteries, to something that won't deteriorate so quickly, instead of waiting for the batteries to become obsolete (as they really have been for over ten years).
That happens because power draw on the most popular platforms isn't a priority. There's a fictional demand for "performance" that trumps power management, and that stops genuinely efficient power management from being a focus, which in turn makes longer-lasting and more reliable batteries a "specialist demand". This won't change for years.
Meanwhile, we could have ultrabooks with hyper-resolution, multicoloured "e-ink" screens: non-backlit screens that would not be difficult to read in the sun, and that draw about as much power in a circuit as a very long coated copper wire. But we're not getting that, because 90% of PC and digital-thingy users, and most developers as well, keep swearing by software with absolute references and low-resolution, non-scaling, non-vector-based UIs. I used to scream my head off about this in 2001, when developing for Symbian: absolute sizes for UI elements were laziness and stupidity all at once in the same package. And in 2015, we still get UIs with absolute references, while people develop CSS on web pages and still pick UI elements that don't scale. They program in absolute sizes for the borders, on large, commercial software. It's so extremely ridiculous that I know for a fact that a large IT company sold "scaling to mobile without changing the code" as their own internally developed schema to several large actors, for millions of moneys, in 2013. None of them understood what they were doing.
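To put the "absolute references" complaint in concrete terms, here's a tiny, made-up illustration (hypothetical numbers, not from any real codebase) of the difference between hard-coding a pixel size and deriving it from the display it runs on:

```python
# Toy illustration of absolute vs. scaling UI sizes: the same "2px" border that
# looks right at 96 DPI becomes a near-invisible hairline on a high-DPI panel
# unless the UI derives its sizes from the actual display.

BASE_DPI = 96  # the traditional 100% reference


def absolute_border_px() -> int:
    """Hard-coded size: only correct on the screen the developer tested on."""
    return 2


def scaled_border_px(display_dpi: int) -> int:
    """DPI-relative size: same apparent thickness on every screen."""
    logical_px = 2
    return max(1, round(logical_px * display_dpi / BASE_DPI))


if __name__ == "__main__":
    # ~100 DPI is a 1366x768 15.6" panel; ~282 DPI is a 4K 15.6" panel.
    for dpi in (96, 120, 192, 282):
        print("%3d DPI  absolute=%dpx  scaled=%dpx"
              % (dpi, absolute_border_px(), scaled_border_px(dpi)))
```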
And that's why low-resolution ugly TN-panels, and laptops with 4h battery life, will still be on the market in 2020.
We could also have passively cooled laptops right now. We could have 8W packages running OpenCL simulations at rates that would put Intel to shame for another 20 years, rates that no "internal graphics" solution on the current separate (and IP-protected) system would have the technical capacity to match given any length of time.
But we don't have programs developed that would actually exploit this. We don't have explicitly parallel assembly languages, and we don't have a low-level layer that is compatible with different ALU clusters from different manufacturers. And we will never get that with the way the market works (i.e., not at all).
And that's why linear computing is hitting the wall, and we still have Intel steering the business by the ears. And that's why we have TN panels, even though better tech exists, thanks to how common a certain OS with absolute references and low-resolution UI elements is. And that's why we won't see any interesting developments in the laptop market for another several years.
Instead, we're going to see the improvements happen on smaller devices, on tech that actually exists right now. And it depends on using the existing platforms in a more clever way, while opening the door for something to actually replace them and improve in the areas that actually matter on mobile platforms. Which, of course, is quickly becoming literally everything.
-
I want 1920x1200 or 2560x1600 as standard in 15" and larger models.
But I do agree that they should focus on higher-refresh-rate quality LCDs, like 120 or 144Hz at 1080p or 1200p, over a super-high-res one. For gaming, fast response is all that matters. And current tech can barely drive 4K at 30 FPS, so I don't see the point. -
Do we actually know what the best display configuration is with respect to eye strain when editing documents and programming?
I myself have found that sometimes a cheap low-brightness TN panel does not strain my eyes as much as a nice vivid, colorful, bright, super HD, IPS display.
Sure, the latter is better to look at, but in the long run I think it strains my eyes more.
Has anybody else experienced something like this?
I am usually most happy with a low-brightness matte display, relatively pale in colors, but with good contrast. With anything too vivid or bright I feel my eyes boiling from the inside out after several hours. -
Maybe just reduce the overall brightness, or use it in brightly lit rooms?
I'll agree that my old M17x R2 really had me squinting in dark areas a lot, but it wasn't all that bad when I had an evenly lit room to back it up... -
Starlight5 Yes, I'm a cat. What else is there to say, really?
Picolino, I use devices on low brightness and with warm colors, applying filters to reduce color temperature (EasyEyes for Android, f.lux for Windows). I recently moved to an HP 4740s and it's unbearable without this filter; the bluish colors cause eye strain.
-
Maybe to a degree.
I find even the absolute best LED-lit laptop displays, such as DreamColor, a pinch too "aggressive" (for lack of a better word) when compared to a good CCFL-lit panel.
Do bear in mind that my eyesight is extremely far from perfect in more ways than just one. -
I bought an external IPS monitor recently, and while it looks good, the contrast is too high and the colors are too vibrant. I have it sitting next to my laptop so I use both displays, and I want to spend more time looking at the laptop's display. I just turned down the contrast and brightness, and it helps a bit. I just need to color correct it like I did my laptop, because the difference in colors is driving me nuts and I can't get it to look any good by adjusting it manually.
The largest issue I used to have was with looking at a CRT at 60Hz. It was fine for a while, but after an hour or two it would start to take its toll. I ended up just using high res at 60Hz when playing games, and on the desktop kept it at at least 85Hz. Similar to this is a modern LCD that flickers at low brightness levels, but I find that only annoying, not something that causes eye strain. -
I've recently started playing all my games in 8-bit color 640x480 because anything higher looks too life-like and hurts my eyes. Plus, the performance increase is insane. MY FPS IS OVER 9000!!!!!!!!!!
-
I also use f.lux and keep brightness fairly low. It definitely makes a difference.
I do occasionally notice a slight bit of flicker on my Windows 7 work laptop, which I believe is LED-backlit. I hadn't thought about it, but that may be why I notice it there and not on my Inspiron, which is CCFL-lit.
I want 1920x1200 as well. Not that I plan to buy a laptop in 2015, but I've never bought a screen that wasn't either 4:3 or 16:10. Which means I might have to buy a MacBook eventually and run Linux and Windows on it, just to keep having 16:10 laptops. Scary thoughts. -