Lately, I've seen a lot of talk about future proofing and trying to purchase a laptop to last 3-5 years, if not longer. In today's market, what do you think is actually worth the money because it won't soon be outdated? Are these i7 processors all going to look like crap in 2 years, and are dual cores just embarrassing? Will USB 3.0 actually overtake USB 2.0 anytime soon and become mainstream? Are SSDs anywhere close to reaching their peak performance? Has graphical fidelity reached a level of diminishing returns? Will 4GB of RAM soon become not enough for average users? What about Blu-ray?
So what do you think? What's staying for the long haul, and what's just another feature soon to become obsolete?
-
NotEnoughMinerals Notebook Deity
-
The problem with trying to define it is that "outdated" is a very relative term. What functionality will the laptop need to provide in 3-5 years? I'm sure that in 3-5 years, i7 will perform like crap, dual core will be a thing of the past, usb 2.0 will be garbage, platter hdds will be legacy hardware, physical displays will be replaced by holograms/projector displays, and anything that requires a cable to charge will be considered a burden.
HOWEVER, if the machine you buy satisfies your needs, there won't really be any problems. I'm still running a CoreDuo (not Core2Duo) from 4 years ago with winxp/ubuntu, and I don't plan to upgrade for another year or so, even though I've been browsing new machines pretty heavily. I also use a netbook I bought two years ago (early adopter), and I think it's still got another 2 years in it for the usage I put it to (e-mail, browsing, streaming video), even though it's running an early generation Atom, 1.5GB RAM, and quad-booting xp/vista/7/ubuntu (ubuntu > xp > 7 > vista on netbooks, imo)
If you just plan to browse the web, e-mail, and word process, maybe stream video, anything you buy now will be fine in 5 years, probably even for 8 years as long as the machine is well-maintained physically and reformatted every so often. If you need to play games, render images/video, or perform some other intensive task, you'll upgrade sooner. -
Future proofing is relative to who's buying. Future proofing involves the laptop's use and the projection of those uses into whatever time frame you want the laptop to last.
Personally speaking, an SSD is probably the one upgrade which will last longer than the others. Until flash memory caught on, HDDs weren't getting all that much faster in terms of the end user's experience. Yes, they were getting noticeably faster, but it wasn't "OMG wow". Now that we have SSDs, the main things they're working on are higher-density chips and bringing costs down.
Everything else is related to your uses. -
Except for gaming, computers have been "good enough" for most people for the last 5 years. There are some features that might be nice to have, but USB3 is backwards compatible, 4GB RAM is very hard to use up for most people, and Blu-Ray is a $100 upgrade to almost any laptop sold in the last 5 years. -
Going to add in here again, you're more likely to cause fatal physical damage to the laptop within 5 years than to have it become truly obsolete or have a component fail.
-
NotEnoughMinerals Notebook Deity
-
A 6-year-old Acer here. It has been upgraded a few times; mainly:
RAM from 256MB to 1GB
Celeron M to Pentium M
DVD/CD-RW to DVD-RW
The machine is fine and like new. Bottom line: if your needs stay the same, your laptop can easily last more than 5 years.
-
i would get a core i7... it isn't going to become obsolete, unlike core 2 duos... as for RAM, 4GB is enough for now if you're playing games, but RAM-intensive stuff will require 8GB or more... as for USB 2.0, it's too old... USB 3.0 is better, but eSATA is quite good too... as long as your drive's write speeds are good... in fact, eSATA beat USB 3.0 when data was transferred using the new WD external hard drive... As for SSDs, they're not reaching peak performance... in fact, in laptops they're being bottlenecked by SATA 2... SATA 3 is coming next year in laptops, so that won't be a problem... in fact, SSDs can go way further... -
NotEnoughMinerals Notebook Deity
-
What laptops?
-
Furthermore, I think the best way to future proof any computer is to make it as upgradeable as possible. That's why my next laptop will be from the manufacturer that does that the best.
Designing a configuration that allows easily exchangeable and upgradeable parts is an essential element in devices as short-lived as a computer. That's especially true when it comes to those designed with a no-holds-barred specification specifically outfitted to keep those power-hungry gaming enthusiasts etc. at bay. -
thinkpad knows best Notebook Deity
Forget USB 3.0, Light Peak is the future for the next universal external hardware connector.
-
NotEnoughMinerals Notebook Deity
-
I assume by SATA "3" you mean SATA 6 Gb/s (SATA revision 3.0)? If you keep just saying SATA 3, I might just assume you mean SATA 3 Gb/s (SATA revision 2.0)!
In terms of SATA 6 Gb/s, I don't believe any in-production notebook has a controller of that type yet, so the high sequential read/write speeds of a C300 in a notebook are unreachable at present (although I don't think its 4K reads/writes saturate even a SATA 1.5 Gb/s interface yet, so for "normal" use you'd still be fine). -
Pitabread and Krane have said it well.
One thing to remember is that while hardware performance roughly doubles every year or two, and is thus growing at an exponential rate, the industry cannot simply abandon what was made just 2 years ago. A "lowly" C2D may be obliterated by an i7, but there is little out there designed specifically for an i7 and even less that can take full advantage of it; the industry is still working on the assumption that most people are still using a C2D and will be for some time yet. You have to aim for what is currently being USED by the mainstream, not what is selling at the high end.
The biggest change in the next few years will be the mass adoption of SSDs; they make the biggest leap in performance, and, wonderfully, they can be back-ported to older systems.
As for things others have mentioned...
USB 3 will not be a big issue for some time to come; its only big claim to fame is for external drives. eSATA and ExpressCard SATA adapters can fulfill this function already. Will USB 3 be nice? Certainly, but it's easy to live without. Also, while drive space may increase, most people already have more than they need, which means you only have so much data to move.
Light peak... is years away for most systems.
It's an awesome idea, one I have wanted for a long time, and it makes sense. Therefore it must fail or take forever to become mainstream. For those who forget, USB was highly touted as the great successor to PS/2, parallel and serial ports. We still use them, and it took many years for USB adoption to really take off. Some will say that is B.S. I was there; I remember wondering what was taking so long. It did not take off like the wildfire expected, and we STILL have boards coming with the older connectors. -
and as for light peak, it still is a pipe dream. It's going to be hard to commercialise it and it's going to cost a bomb.. -
Looking at the apps available right now, I seriously think the i7 and similar processors (future AMD multithreaded CPUs come to mind) will last for more than 2 years before being "outdated".
When the dual core CPUs first came out, the single cores were being stressed by most apps to the point that the CPUs were the bottleneck. Right now, I've seen fewer and fewer programs able to stress the quads, let alone a hyperthreaded one like the i7. -
Anything built to handle Vista (dual and quads) should be fine for a bit yet for general computing.
-
4.8Gbit/s vs 3.0Gbit/s.
I know it doesn't work quite like that in the real world, but give it time; I have no doubt USB 3.0 will have no trouble beating SATA 2 and even going a little beyond. Remember, USB 3.0 is brand new; once manufacturers figure out how to best use it, it will perform better. That being said, I'm eSATA cursed and can't get it to work on any machine for the life of me, so I've invested in USB 3.0 for my desktop and it's quite fast. 80MB/s+ on a WD 2TB Green. -
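For a rough sense of scale, here's a back-of-the-envelope sketch in Python using the figures quoted in this thread; the 8b/10b encoding assumption and the ~80 MB/s drive figure come from the discussion above, not from my own measurements:

# Rough conversion of interface line rates to usable MB/s, assuming 8b/10b encoding
# (10 bits on the wire per byte of payload), which both USB 3.0 and SATA use.
def usable_mb_per_s(line_rate_gbit):
    return line_rate_gbit * 1000.0 / 10.0  # Gbit/s -> MB/s after encoding overhead

for name, rate in [("USB 3.0 (~4.8 Gbit/s)", 4.8), ("SATA 2 (3.0 Gbit/s)", 3.0)]:
    print(name, "->", usable_mb_per_s(rate), "MB/s theoretical")

drive_mb_s = 80  # roughly what the WD 2TB Green mentioned above sustains
print("Either interface far exceeds the", drive_mb_s, "MB/s the platter drive itself delivers")

Which is to say, at ~80 MB/s the mechanical drive, not the bus, is the limiting factor.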
-
SATA 2 is 3 Gbps.
SATA 3 is 6 Gbps.
SATA 3 is already on the market, on motherboards and hard drives. Almost every board with USB 3 also has SATA 3. Only a few have USB 3 with SATA 2.
Regardless though...
It doesn't matter one tiny bit if USB outruns SATA. Really!
Because you still have to go through the drive's SATA connection before you can go through the USB connection, it will NEVER be faster than SATA. It simply can't be. That's besides the simple fact that you have CPU overhead and bottleneck issues elsewhere in the system. -
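To illustrate the chain argument with a minimal sketch (the numbers below are illustrative placeholders, not benchmarks):

# An external drive's data path: platter drive -> SATA link -> USB bridge chip -> USB 3.0 link -> host.
# End-to-end throughput can never exceed the slowest link in that chain.
def effective_throughput(*link_speeds_mb_s):
    return min(link_speeds_mb_s)

drive  = 100   # what the mechanical drive itself sustains (illustrative)
sata2  = 300   # SATA 2 link
bridge = 250   # USB/SATA bridge chip (illustrative)
usb3   = 400   # USB 3.0 link after protocol overhead (illustrative)

print(effective_throughput(drive, sata2, bridge, usb3), "MB/s at best")  # bounded by the drive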
See the review below of a hard drive using USB 3.0... The USB 3.0 connection was actually slower than eSATA.. I think we'll only see the full potential of USB 3.0 if we are using an external SSD...
http://www.notebookreview.com/default.asp?newsID=5558&review=western+digital+my+book+3+wd
-
-
inperfectdarkness Notebook Evangelist
here's another take:
what about future proofing against things that WON'T be available in the future?
i, for one, am very glad i have 1920x1200 resolution on my laptop. next year (heck, possibly even next month) this option may not be available. to me, this matters more than what GPU i'll be running; although it IS the reason why i went with the fastest GPU i could buy in this form factor. i refuse to buy another laptop until the industry starts increasing resolutions beyond 1080p for laptops. -
-
You try 1080p on a 13in screen. -
His post relates to 17"+ laptops I think.
-
inperfectdarkness Notebook Evangelist
i don't know of anyone who feels that a 5mp camera is "sufficient"; yet that is precisely what everyone is arguing with regard to screen resolution. even if i have to expand text size, icon size, etc--it will still be a clearer, cleaner picture running at wuxga+ than at 1080p.
the same backward logic that says 1080p is enough is identical to the logic that says a game at 1440x900 w/ full filtering somehow looks better than 1920x1200 w/o filtering. -
-
"Future-proofing" is something that people would like to sell you. For a lot of money. There is no definition as to what "future-proofing" is either today or in the future. It means what the seller decides it means, not what you might want it to mean. And that definition, lacking any legal basis, can and will change depending on what product the seller wants to push out the door at any given time.
"Future-proofing" will never have a warranty or guarantee that will protect you from 'the future'.
You will never be able to claim a refund based on some perceived future deficiency.
You will never be able to resell your item for a premium price based on its "future-proofiness".
Every time you see a piece of technology being touted and sold as being "future-proof", run away. -
inperfectdarkness Notebook Evangelist
-
-
Future proof is a vague statement. It could mean anything from apps 10 years from now to apps literally released later that day. Your best bet is to take your most demanding application and look at its resource growth over a period of time.
If you, say, just surf the web and do mild office work, the resource growth of these apps is low. Even looking back over prior releases of, say, IE and Office, they really only grow in the neighborhood of 3% a year in memory and CPU usage and in the neighborhood of 5% a year in HDD requirements. At these low growth rates your system will be usable for quite a long time.
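To put those growth rates in perspective, a quick compounding sketch (the starting figures are made up for illustration; only the 3%/5% yearly rates come from the estimate above):

# Compound growth of application requirements over a laptop's expected lifetime.
def grown(requirement, yearly_rate, years):
    return requirement * (1 + yearly_rate) ** years

years = 5
ram_footprint_gb = 2.0    # illustrative starting point
hdd_footprint_gb = 20.0   # illustrative starting point
print("RAM/CPU need in %d years: ~%.1f GB" % (years, grown(ram_footprint_gb, 0.03, years)))  # ~2.3 GB
print("Disk need in %d years:    ~%.1f GB" % (years, grown(hdd_footprint_gb, 0.05, years)))  # ~25.5 GB

At 3-5% a year the requirements barely move over five years, which is exactly the point.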
Gaming is the other end of the spectrum. Games are always released that can push the highest-end hardware to a crippling failure. If you want the latest game with the highest res and the most possible eye candy along with the most frantic of action, you will pay dearly today and look for the next best thing tomorrow.
No technology is future proof. Everything just gets better and/or cheaper. This is a fact of any computer's life. Your bomb of a system today may just be a dud tomorrow or the next day.
With laptops you are best just getting by with a system that is at least 50% more than you need for the best price you can. Expect, though, that a new, upgraded system is in your future.............. -
inperfectdarkness Notebook Evangelist
i have tested it. i did it back in 2003 with serious sam on my alienware. higher resolution w/o filtering > lower resolution w/ filtering.
the "pixels" you lament are significantly smaller on a higher resolution--hence they flow together significantly better without filtering than would, say, unfiltered pixels at a lower resolution. the idea, the apex of gaming (if you will) is a high enough dpi screen that an unfiltered rendering bears only pixelation and anomalies which are indistinguishable to the human eye. based on my plethora of experience with fps's, that threshold is somewhere around 200dpi or higher. this resolution is approximately double what many laptops are offered with.
the argument for filtration is essentially the same as suggesting using a 3Mp camera, and processing the crap out of it in photoshop; rather than using a 10Mp camera to begin with.
the only other "side" to this argument is from those who feel the "realism" aspect of a game goes down because the resolution is "too high". there is some validity to this; but it doesn't invalidate the push for better, higher resolutions. while it is true that cranking an older game's settings to the max can expose annoying flaws in rendering--that doesn't invalidate the higher resolution. it simply dates the game. go back and play something like ut2004 or blood2 & you'll see what i mean. the "blockiness" becomes more apparent with higher resolutions because, frankly, it's there. it's a part of a game designed with less polygons.
theoretically, 600x400 looks better than 1080p unlfiltered--provided you filter the out of 600x400--at least with the logic you're espousing. -
On my 42" LCD I can game at 1080p, but just barely. If I lower the resolution to 900p I can enable anti-aliasing and run the games at the same 60fps as with 1080p. I'm getting less pixels, but the image is of higher quality. I no longer see the individual pixels because they have been slightly blurred together, which in my eyes looks much better than a pixelated screen. What is even better is that the TV, unlike laptop and small/cheap desktop LCD's, does an amazing job of anti-aliasing on its own. For some reason setting the TV to soften the image on its own doesn't work all that well. However, running at 900p and selecting to run the TV using all of its pixels has it run anti-aliasing on the image before displaying it at 1080p, and does the most amazing up-scaling job I've ever seen. So I get to up my other graphics settings on top of that. To me having a softer image looks a whole lot better than being able to distinguish the pixels, but if that's what floats your boat..
There is also more to anti-aliasing than just getting rid of pixelated images. Anti-aliasing works by blending the colors of adjacent pixels, thus softening the image. Computer images without AA applied to them are 100% sharp. In real life our eyes cannot see the line so clearly of far images. Our eyes naturally view stuff pretty softly, at least much much softer in the distance as computer graphics can make it look. So softening the hard lines of the computer screen sitting a couple feet from your face, even if you have infinite resolution, makes it appear more life like, so I think AA will never go away.
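As a crude illustration of that blending, here's a toy supersampling sketch (real driver-level AA is more sophisticated than this 2x2 box filter, so treat it as a simplification):

# Toy supersampled anti-aliasing: render at 2x resolution, then average each 2x2 block
# into one output pixel so a hard black/white edge blends into intermediate greys.
def downsample_2x(image):  # image: list of rows of greyscale values, even dimensions
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[0]), 2):
            block_sum = image[y][x] + image[y][x+1] + image[y+1][x] + image[y+1][x+1]
            row.append(block_sum / 4.0)
        out.append(row)
    return out

hi_res = [[0,   0,   0,   255],
          [0,   0,   255, 255],
          [0,   255, 255, 255],
          [255, 255, 255, 255]]
print(downsample_2x(hi_res))  # [[0.0, 191.25], [191.25, 255.0]] -- the edge becomes shades of grey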
Another thing is that you can't bring in digital photography to this argument. A well-taken picture already has the equivalent image quality of a game with infinite polygons, infinite textures, and all graphics settings set infinitely high, including anti-aliasing. The only tool in computer graphics to make a screen look less pixelated in AA, so the analogy can't transfer over to photography. There is aliasing and anti-aliasing in photography, known as sharpening and softening. Any properly focused and steadily taken picture is going to have the perfect degree of sharpness/softness to make it look like the way you would with your own eyes. The only part of that that can make it look any better is usually a little bit of sharpening. That doesn't really matter. Either way, anti-aliasing is the only tool I know of to make images less pixelated, and since with photography that is not doable like it is in computer graphics, there is really no comparison. -
BenLeonheart walk in see this wat do?
The concept of futureproofing IS a concept.
There is just a notion of what it is, but it does not really come into play.
E.g., you can buy a 1995 car, and it's gonna be the best that year... later on, in 2010... the same model, newer year, is 10 times better, faster and even more economical than it was back in 1995... for almost the same price (yes, monetary devaluation comes into play too...)
With a computer, I believe you can future-proof yourself for up to 1 year, 2 years max, with its components...
there's always something new coming around...
Like, say, for example, everyone's going crazy for the ATi 5870... then a new most powerful single GPU will pop up in a couple of months... but you're still able to handle the games coming out for almost a year...
Futureproofing is just that, a concept. -
Higher resolutions also reduce aliasing in and of themselves; the only problem is that this requires additional graphics power.
The "softness" you mention is simply due to imperfection of human vision, and given the opportunity I'd prefer the sharper image. Anti-aliasing is designed to minimize the distortion caused by resolution limitations, not make things look "soft". The latter is merely a side-effect.
If smoothing is what you're looking for, you might have to find dedicated smoothing filters rather than just anti-aliasing. Of course, you could also continue to manually set the resolution lower than the maximum as well. -
inperfectdarkness Notebook Evangelist
IF games are supposed to be more "realistic" they need to be designed that way. i, as a consumer, should not have to resort to lower quality settings and "cheating the system" in order to mimic real life. if a game needs a lack of focus beyond a given threshold--that should be intrinsic to the engine.
i'm also willing to bet that you are not a professional-level gamer. although i am not even close to that level of competition myself--i side with the logic thereof. sharper rendering = better game-play performance. i don't want to hesitate because i'm unsure whether the dot i'm looking at is AA in action, or an actual enemy sniper.
in any case, this thread has derailed. to the OP:
if you want future proof, the best you can do is buy the bleeding-edge of performance, even if you never game. in 4-6 years...you'll be glad you did. -
-
As for the case of the sniper smaller than a pixel, it's more complicated than that. It's true that if you're using some form of supersampling (not all forms of AA do this), a black sniper on a white background might cause the pixel to appear grey. However, increasing the resolution could also make the sniper appear, because it's more likely that the sniper will take up one of the pixels.
Even more importantly, when you actually attempt to shoot that sniper, the game can only resolve your aim down to the pixels. If you shoot at that grey pixel that contains the sniper, there's no guarantee that you'll actually hit the sniper.
-
inperfectdarkness Notebook Evangelist
that's quite true on both counts. i'd actually thought about sniper accuracy when i got up this morning.
while there is no such thing as future proof, the concept will require different strokes for different folks. -
http://www.notebookcheck.net/Review-Sony-Vaio-VPCZ11X9E-B-Notebook.28704.0.html
it has a 1600x900 screen but you can get a full HD one.. -
inperfectdarkness Notebook Evangelist
i use a cooler--heat is not something of utmost concern to me.
inability to fit a performance-tier card in a 13" chassis? that's important to me. -
thinkpad knows best Notebook Deity
Yeah, putting a 5850 in a 13" is like putting a V8 LS1 5.7L engine into a Porsche 944, which by the way has been done, many many times...
-
inperfectdarkness Notebook Evangelist
-
There are these things called Monitors... Yeah, they can be plugged in.
A V8 in a 944, it could be a meek little mouse (some v8's are anemic) or a deathtrap, but a very, very fun deathtrap!
I'll take a BMW 3 series with one instead, the 944 is an expensive Porsche to maintain. -
inperfectdarkness Notebook Evangelist
when you work where i work, desktops are not a viable "road" option. monitors that "plug in" aren't going to cut it.
how the heck did this thread get derailed onto craptastic Porsche cars? go 928 or go home. -
So I didn't spend much more over the course of four years, and I had much better performance the first two years. I'll grant that I bought at an opportune time, and had I been in the market a year later, that may not have been possible. But sometimes you can get better performance in the short term and not actually spend any more in the long term by buying higher-end (not top-of-the-line) components. You just have to be able to not keep upgrading to whatever the next high-end is when it comes out.
It really does depend on your standards, too - if you want to play every game on high settings, you can't really future-proof much. If you are fine with playing on low settings in a couple years, you can future-proof a bit. And even if you did buy an inexpensive laptop every couple years, you'd still be playing low-end on new games. Of course, if you don't do anything particularly demanding, "future-proofing" mainly consists of making sure you get a laptop with decent quality. -
You're missing a couple of important aspects of the calculation. Firstly, your friend would've made at least some money back by selling that 2-year-old laptop. Secondly, the money he saved on the initial purchase is worth more 2 years down the track because it could be invested instead.
I do agree that, say, selling your laptop and buying a new one every year probably isn't worthwhile, because a single year mostly doesn't make enough of a difference to the hardware. Additionally, the best value on the market is not necessarily at a single price point, and can vary from higher values to lower ones whenever there's a large discount, or a manufacturer releases a highly competitive model.
I would say it's most important to focus on bang for your buck and performance now rather than at any point in the future. If there's a more expensive option that still offers a good level of performance/price, then go for it as long as you know that the performance is actually going to help you in the here-and-now.
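One way to frame that comparison is a simple total-cost sketch (the prices and resale values below are made-up placeholders, not the actual figures being discussed):

# Net 4-year cost: one higher-end laptop kept the whole time vs. two cheaper ones with resale.
def net_cost(purchases, resales):
    return sum(purchases) - sum(resales)

high_end_once   = net_cost(purchases=[1500], resales=[300])           # sell it after 4 years
mid_range_twice = net_cost(purchases=[900, 900], resales=[350, 300])  # sell the first after 2 years

print("High-end once:   $", high_end_once)    # $1200
print("Mid-range twice: $", mid_range_twice)  # $1150

With numbers like these the two approaches land in the same ballpark, which is why resale value, timing, and what you actually need right now matter more than the sticker price alone.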