The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    The concept of future proofing

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by NotEnoughMinerals, Apr 6, 2010.

  1. NotEnoughMinerals

    NotEnoughMinerals Notebook Deity

    Reputations:
    772
    Messages:
    1,802
    Likes Received:
    3
    Trophy Points:
    56
    Lately, I've seen a lot of talk about future proofing and trying to purchase a laptop to last 3-5 years if not longer. In today's market, what do you think is actually worth the money because it won't soon be outdated? Are these i7 processors all going to look like crap in 2 years, and are dual cores just embarrassing? Will USB 3.0 actually overtake USB 2.0 anytime soon and become mainstream? Are SSDs anywhere close to reaching their peak performance? Has graphical fidelity reached a level of diminishing returns? Will 4GB RAM soon become not enough for average users? What about Blu-ray?

    So what do you think? What's staying for the long haul, and what's just another feature soon to become obsolete?
     
  2. woofer00

    woofer00 Wanderer

    Reputations:
    726
    Messages:
    1,086
    Likes Received:
    0
    Trophy Points:
    55
    The problem with trying to define it is that "outdated" is a very relative term. What functionality will the laptop need to provide in 3-5 years? I'm sure that in 3-5 years, i7 will perform like crap, dual core will be a thing of the past, usb 2.0 will be garbage, platter hdds will be legacy hardware, physical displays will be replaced by holograms/projector displays, and anything that requires a cable to charge will be considered a burden.

    HOWEVER, if the machine you buy satisfies your needs, there won't really be any problems. I'm still running a CoreDuo (not Core2Duo) from 4 years ago with winxp/ubuntu, and I don't plan to upgrade for another year or so, even though I've been browsing new machines pretty heavily. I also use a netbook I bought two years ago (early adopter), and I think it's still got another 2 years in it for the usage I put it to (e-mail, browsing, streaming video), even though it's running an early generation Atom, 1.5GB RAM, and quad-booting xp/vista/7/ubuntu (ubuntu > xp > 7 > vista on netbooks, imo).

    If you just plan to browse the web, e-mail, and word process, maybe stream video, anything you buy now will be fine in 5 years, probably even for 8 years as long as the machine is well-maintained physically and reformatted every so often. If you need to play games, render images/video, or perform some other intensive task, you'll upgrade sooner.
     
  3. Melody

    Melody How's It Made Addict

    Reputations:
    3,635
    Messages:
    4,174
    Likes Received:
    419
    Trophy Points:
    151
    Future proofing is relative to who's buying. Future proofing involves the laptop's use and the projection of those uses into whatever time frame you want the laptop to last.

    Personally speaking, an SSD is probably the one upgrade which will last longer than the others. Until flash memory caught on, HDDs weren't getting all that much faster in terms of the end user's experience. Yes, they were getting noticeably faster, but it wasn't "OMG wow". Now that we have SSDs, the one thing they're working on is making larger-density chips and trying to bring costs down.

    Everything else is related to your uses.
     
  4. Pitabred

    Pitabred Linux geek con rat flail!

    Reputations:
    3,300
    Messages:
    7,115
    Likes Received:
    3
    Trophy Points:
    206
    I'm currently using a processor that was released in 2006 (T7200). It's still plenty fast for all the work stuff I do, even running virtual machines and heavy Java servers. An i7 will not look like crap in 2 years. It won't even look like crap in 5 years. It'll just be a little slower, but still very fast.

    Except for gaming, computers have been "good enough" for most people for the last 5 years. There are some features that might be nice to have, but USB3 is backwards compatible, 4GB RAM is very hard to use up for most people, and Blu-Ray is a $100 upgrade to almost any laptop sold in the last 5 years.
     
  5. woofer00

    woofer00 Wanderer

    Reputations:
    726
    Messages:
    1,086
    Likes Received:
    0
    Trophy Points:
    55
    Going to add in here again, you're more likely to cause fatal physical damage to the laptop within 5 years than to have it become truly obsolete or have a component fail.
     
  6. NotEnoughMinerals

    NotEnoughMinerals Notebook Deity

    Reputations:
    772
    Messages:
    1,802
    Likes Received:
    3
    Trophy Points:
    56
    That's a really good point. Goes to the thought that if you treat your laptop right, it'll be good to you for the long haul.
     
  7. naton

    naton Notebook Virtuoso

    Reputations:
    806
    Messages:
    2,044
    Likes Received:
    5
    Trophy Points:
    56
    A 6-year-old Acer here. It has been upgraded a few times; mainly:
    Ram from 256MB to 1GB
    Celeron M to Pentium M
    DVD/CD-RW to DVD-RW

    The machine is fine and like new. Bottom line is if your needs are the same your laptop can easily last more than 5 years :).
     
  8. sean473

    sean473 Notebook Prophet

    Reputations:
    613
    Messages:
    6,705
    Likes Received:
    0
    Trophy Points:
    0

    I would get a Core i7... it isn't going to become obsolete, unlike Core 2 Duos... as for RAM, 4GB is enough for now if you're playing games, but RAM-intensive stuff will require 8GB or more... as for USB 2.0, it's too old... USB 3.0 is better, but eSATA is quite good too... as long as your drive's write speeds are good... in fact, eSATA beat USB 3.0 when data was transferred using the new WD external hard drive... As for SSDs, they're not reaching peak performance... in fact, in laptops they're being bottlenecked by SATA 2... SATA 3 is coming next year in laptops, so that won't be a problem... in fact, SSDs can go way further...
     
  9. NotEnoughMinerals

    NotEnoughMinerals Notebook Deity

    Reputations:
    772
    Messages:
    1,802
    Likes Received:
    3
    Trophy Points:
    56
    SATA 3 is available on laptops right now, just insanely expensive
     
  10. cloudbyday

    cloudbyday Notebook Deity

    Reputations:
    50
    Messages:
    706
    Likes Received:
    0
    Trophy Points:
    30
    What laptops?
     
  11. Krane

    Krane Notebook Prophet

    Reputations:
    706
    Messages:
    4,653
    Likes Received:
    108
    Trophy Points:
    131
    True on all accounts save for this one. There are other demanding niche markets in addition to gaming.

    Furthermore, I think the best way to future-proof any computer is to make it as upgradeable as possible. That's why my next laptop will be from the manufacturer that does that the best.

    Designing a configuration that allows easily exchangeable and upgradeable parts is an essential element in devices as short-lived as a computer. That's especially true when it comes to those designed with a no-holds-barred specification specifically outfitted to keep those power-hungry gaming enthusiasts etc. at bay.
     
  12. thinkpad knows best

    thinkpad knows best Notebook Deity

    Reputations:
    108
    Messages:
    1,140
    Likes Received:
    0
    Trophy Points:
    55
    Forget USB 3.0; Light Peak is the future for the next universal external hardware connector.
     
  13. NotEnoughMinerals

    NotEnoughMinerals Notebook Deity

    Reputations:
    772
    Messages:
    1,802
    Likes Received:
    3
    Trophy Points:
    56
    Well, not sure if the laptops themselves support it yet, but there's the Crucial C300 SSD with SATA 3.
     
  14. Judicator

    Judicator Judged and found wanting.

    Reputations:
    1,098
    Messages:
    2,594
    Likes Received:
    19
    Trophy Points:
    56
    I assume by SATA "3" you mean SATA 6 Gb/s (SATA revision 3.0)? If you keep just saying SATA 3, I might just assume you mean SATA 3 Gb/s (SATA revision 2.0)! :p

    In terms of SATA 6 Gb/s, I don't believe any in-production notebook has a controller of that type yet, so the high sequential read/write speeds of a C300 in a notebook are unreachable at present (although I don't think its 4K read/writes saturate even a SATA 1.5 Gb/s interface yet, so for "normal" use you'd still be fine).
     
  15. leslieann

    leslieann Notebook Deity

    Reputations:
    830
    Messages:
    1,308
    Likes Received:
    11
    Trophy Points:
    56
    Pitabred and Krane have said it well.


    One thing to remember is that while performance may roughly double every year or two, and so grows at an exponential rate, the industry cannot simply abandon what was made just 2 years ago. A "lowly" C2D may be obliterated by an i7, but there is little out there designed specifically for an i7 and even less that can take full advantage of it; the industry is still working on the idea that most people are still using a C2D and will be for some time yet. You have to aim for what is currently being USED by the mainstream, not what is selling at the high end.

    The biggest change in the next few years will be the mass consumption of SSDs; they make the biggest leap in performance, and wonderfully, they can be back-ported to older systems.


    As for things others have mentioned...
    USB 3 will not be a big issue for some time to come; its only big claim to fame is external drives. eSATA and ExpressCard SATA adapters can fulfill this function already. Will USB 3 be nice? Certainly, but it's easy to live without. Also, while drive space may increase, most people already have more than they need, which means you only have so much data to move.

    Light Peak... is years away for most systems.
    It's an awesome idea, one I have wanted for a long time, and it makes sense. Therefore it must fail or take forever to become mainstream. For those who forget, USB was highly touted as the great successor to PS/2, parallel, and serial ports. We still use them, and it took many years for USB adoption to really take off. Some will say that is B.S.; I was there, and I remember wondering what was taking so long. It did not take off like the wildfire expected, and we STILL have boards coming with the older connectors.
     
  16. sean473

    sean473 Notebook Prophet

    Reputations:
    613
    Messages:
    6,705
    Likes Received:
    0
    Trophy Points:
    0
    You mean USB 3.0... and no, it is not insanely expensive... you get it in quite cheap notebooks like the Asus N61, etc... but having eSATA is also alright, as eSATA is literally as fast as SATA... USB can't match SATA...

    And as for Light Peak, it is still a pipe dream. It's going to be hard to commercialise, and it's going to cost a bomb...
     
  17. f4ding

    f4ding Laptop Owner

    Reputations:
    261
    Messages:
    2,085
    Likes Received:
    0
    Trophy Points:
    55
    Looking at the apps available right now, I seriously think the i7 and similar processors (future AMD multithread CPUs come to mind) will last for more than 2 years before being "outdated".

    When the dual core CPUs first came out, the single cores were getting stressed by most apps to the point that the CPUs were the bottleneck. Right now, I've seen fewer and fewer programs able to stress the quads, let alone a hyperthreaded one like the i7.
     
  18. leslieann

    leslieann Notebook Deity

    Reputations:
    830
    Messages:
    1,308
    Likes Received:
    11
    Trophy Points:
    56
    Anything built to handle Vista (dual and quads) should be fine for a bit yet for general computing.
     
  19. yuio

    yuio NBR Assistive Tec. Tec.

    Reputations:
    634
    Messages:
    3,637
    Likes Received:
    0
    Trophy Points:
    105
    Well, in theory USB 3.0 should crush SATA 3 Gbit/s...
    4.8 Gbit/s vs 3.0 Gbit/s.

    I know it doesn't work quite like that in the real world, but give it time; I have no doubt USB 3.0 will have no trouble beating SATA 2 and even going a little beyond. Remember USB 3.0 is brand new; once manufacturers figure out how best to use it, it will perform better. That being said, I'm eSATA cursed and can't get it to work on any machine for the life of me, so I've invested in USB 3.0 for my desktop and it's quite fast: 80MB/s+ on a WD 2TB green.
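
    As a rough back-of-the-envelope check of those numbers (a small Python sketch; it assumes the usual 8b/10b line coding on both SATA and USB 3.0 SuperSpeed and ignores protocol overhead, so real-world figures will be lower):

        def usable_mb_per_s(signaling_gbps):
            # 8b/10b line coding: 8 data bits ride on every 10 bits on the wire
            payload_bits_per_s = signaling_gbps * 1e9 * 8 / 10
            return payload_bits_per_s / 8 / 1e6   # bits -> bytes -> MB

        for name, rate in [("SATA 1.5 Gb/s", 1.5), ("SATA 3 Gb/s", 3.0),
                           ("USB 3.0 5 Gb/s", 5.0), ("SATA 6 Gb/s", 6.0)]:
            print(f"{name:15s} ~{usable_mb_per_s(rate):4.0f} MB/s usable")

        # The ~80 MB/s figure above is well below any of these ceilings, so the
        # WD green drive itself, not the USB 3.0 link, is the limit there.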
     
  20. Krane

    Krane Notebook Prophet

    Reputations:
    706
    Messages:
    4,653
    Likes Received:
    108
    Trophy Points:
    131
    That's just because--for the moment--few software manufacturers are willing to take on the labor-intensive task of developing programs optimized for quad-core. It's yet another of those advancements that consumers and software developers alike have been slow to accept.
     
  21. leslieann

    leslieann Notebook Deity

    Reputations:
    830
    Messages:
    1,308
    Likes Received:
    11
    Trophy Points:
    56
    First off:
    SATA 2 is 3 Gbps.
    SATA 3 is 6 Gbps.

    SATA 3 is already on the market, in motherboards and hard drives. Almost every board with USB 3 also has SATA 3. Only a few have USB 3 with SATA 2.



    Regardless though...

    It doesn't matter one tiny bit if USB outruns SATA. Really!
    Because you still have to go through the drive's SATA connector before you can go through the USB connection, therefore it will NEVER be faster than SATA. It simply can't. That's besides the simple fact that you have CPU overhead and bottleneck issues elsewhere in the system.
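
    Put another way, the whole chain is only as fast as its slowest link (a minimal sketch; the throughput numbers are purely illustrative, not measurements):

        # An external drive's effective speed: media -> drive's SATA link ->
        # bridge -> eSATA/USB link. The minimum of the chain wins.
        def effective_throughput(media_mb_s, sata_link_mb_s, external_link_mb_s):
            return min(media_mb_s, sata_link_mb_s, external_link_mb_s)

        print(effective_throughput(100, 300, 500))   # USB 3.0 enclosure -> 100 MB/s
        print(effective_throughput(100, 300, 35))    # USB 2.0 enclosure -> 35 MB/s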
     
  22. sean473

    sean473 Notebook Prophet

    Reputations:
    613
    Messages:
    6,705
    Likes Received:
    0
    Trophy Points:
    0

    See the review below for a hard drive using USB 3.0... The USB 3.0 was actually slower than eSATA... I think we'll only see the full potential of USB 3.0 if we are using an external SSD...

    http://www.notebookreview.com/default.asp?newsID=5558&review=western+digital+my+book+3+wd

    I do hope you meant desktop motherboards, as laptop ones have 0% SATA 3 support. It's only coming next year with Sandy Bridge.
     
  23. leslieann

    leslieann Notebook Deity

    Reputations:
    830
    Messages:
    1,308
    Likes Received:
    11
    Trophy Points:
    56
    Oops, you are correct; it still doesn't change the rest, though.
     
  24. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    here's another take:

    what about future proofing against things that WON'T be available in the future?

    i, for one, am very glad i have 1920x1200 resolution on my laptop. next year (heck, possibly even next month) this option may not be available. to me, this matters more than what GPU i'll be running; although it IS the reason why i went with the fastest GPU i could buy in this form factor. i refuse to buy another laptop until the industry starts increasing resolutions beyond 1080p for laptops.
     
  25. Trottel

    Trottel Notebook Virtuoso

    Reputations:
    828
    Messages:
    2,303
    Likes Received:
    0
    Trophy Points:
    0
    Why? You mean you want 16:10 and not 16:9, not that you want pixels to get even tinier on laptop displays, right? 1080p on a small screen is close to, if not already at, the limit of what a human eye can resolve.
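
    For a rough sense of the numbers (a quick sketch; the ~1 arcminute acuity figure and the ~50 cm viewing distance are round-number assumptions):

        import math

        def ppi(width_px, height_px, diagonal_in):
            # pixels per inch along the panel diagonal
            return math.hypot(width_px, height_px) / diagonal_in

        def acuity_limit_ppi(viewing_distance_cm, arcminutes=1.0):
            # approx. PPI beyond which a ~1 arcminute eye stops resolving pixels
            pitch_cm = viewing_distance_cm * math.tan(math.radians(arcminutes / 60.0))
            return 2.54 / pitch_cm

        for size in (13.3, 15.6, 17.3):
            print(f'1080p at {size}": {ppi(1920, 1080, size):.0f} PPI')
        print(f'Approx. eye limit at 50 cm: {acuity_limit_ppi(50):.0f} PPI')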
     
  26. leslieann

    leslieann Notebook Deity

    Reputations:
    830
    Messages:
    1,308
    Likes Received:
    11
    Trophy Points:
    56
    There are limits to how small of pixels you really need.
    You try 1080p on a 13in screen. :D
     
  27. Melody

    Melody How's It Made Addict

    Reputations:
    3,635
    Messages:
    4,174
    Likes Received:
    419
    Trophy Points:
    151
    His post relates to 17"+ laptops I think.
     
  28. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    i would. happily. can you point me in the direction where i can try this? oh....that's right. IT DOESN'T FREAKING EXIST.

    at a minimum? yes. 1080p on 15" should tell even the most inane of individuals that offerings on 16"+ laptops should be greater than 1080p resolution. i would argue though (since i have a wuxga screen) that i could easily go from a 2.3Mp image to a 2.5-2.6Mp image on a 15.4" chassis and still not be forced to "squint". mfgs should not be allowed to offer <140dpi screens on laptops, imho; unless they're catering specifically to the crowd that reads the XXL size of reader's digest.

    i don't know of anyone who feels that a 5mp camera is "sufficient"; yet that is precisely what everyone is arguing with regards to screen resolution. even if i have to expand text size, icon size, etc--it will still be a clearer, cleaner picture running at wuxga+ than at 1080p.

    the same backward logic that says 1080p is enough is identical to the logic that says a game in 1440x900 w/ full filtering somehow looks better than 1920x1200 w/o filtering.
     
  29. sean473

    sean473 Notebook Prophet

    Reputations:
    613
    Messages:
    6,705
    Likes Received:
    0
    Trophy Points:
    0
    Well, the Vaio Z is 13 inch and has a 1080p screen... looks quite nice... must be very easy to read lol :D
     
  30. newsposter

    newsposter Notebook Virtuoso

    Reputations:
    801
    Messages:
    3,881
    Likes Received:
    0
    Trophy Points:
    105
    "Future-proofing" is something that people would like to sell you. For a lot of money. There is no definition as to what "future-proofing" is either today or in the future. It means what the seller decides it means, not what you might want it to mean. And that definition, lacking any legal basis, can and will change depending on what product the seller wants to push out the door at any given time.

    "Future-proofing" will never have a warranty or guarantee that will protect you from 'the future'.

    You will never be able to claim a refund based on some perceived future deficiency.

    You will never be able to resell your item for a premium price based on its "future-proofiness".

    Every time you see a piece of technology being touted and sold as being "future-proof", run away.
     
  31. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    does bestbuy carry those? i'd like to sample it just to prove to myself that 1080p is a good size for something that small.
     
  32. Trottel

    Trottel Notebook Virtuoso

    Reputations:
    828
    Messages:
    2,303
    Likes Received:
    0
    Trophy Points:
    0
    Maybe you should test this out, because AA makes the lower resolution image look better than the higher resolution. With the 1080p you will see the individual pixels, solid blocks of color that contrast sharply with the ones around them. With AA on the 900p it is much harder to distinguish the pixels.
     
  33. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,548
    Messages:
    9,585
    Likes Received:
    4,997
    Trophy Points:
    431
    Future proof is a vague statement. It could mean anything from apps 10 years from now to apps literally released later that day. Your best bet is to take your most demanding application and look at its resource growth over a period of time.

    If you, say, just surf the web and do mild office work, resource growth of these apps is low. Even across the prior releases of, say, IE and Office, they really only grow in the neighborhood of 3% a year in memory and CPU usage and in the neighborhood of 5% a year in HDD requirements. At these low growth rates your system will be usable for quite a long time.
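
    Compounding those rates out (a quick sketch; it simply takes the ~3%/year and ~5%/year figures above at face value):

        # Project a fixed yearly growth rate forward and see how needs scale.
        def grown(start, yearly_rate, years):
            return start * (1 + yearly_rate) ** years

        for years in (3, 5, 10):
            cpu_ram = grown(1.0, 0.03, years)   # relative CPU/memory requirement
            disk    = grown(1.0, 0.05, years)   # relative HDD requirement
            print(f"after {years:2d} years: CPU/RAM x{cpu_ram:.2f}, HDD x{disk:.2f}")

        # Even 10 years out that is only ~1.34x the CPU/RAM and ~1.63x the disk,
        # which is why light web/office use ages so gracefully.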

    Gaming is the other end of the spectrum. Games are always released that can push the highest-end hardware to a crippling failure. If you want the latest game with the highest res and the most possible eye candy along with the most frantic of action, you will pay dearly today and look for the next best thing tomorrow.

    No technology is future proof. Everything just gets better and/or cheaper. This is a fact of any computer's life. Your bomb of a system today may just be a dud tomorrow or the next day.

    With laptops you are best off just getting a system that is at least 50% more than you need for the best price you can. Expect, though, that a new upgraded system is in your future...
     
  34. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    i have tested it. i did it back in 2003 with serious sam on my alienware. higher resolution w/o filtering > lower resolution w/ filtering.

    the "pixels" you lament are significantly smaller on a higher resolution--hence they flow together significantly better without filtering than would, say, unfiltered pixels at a lower resolution. the idea, the apex of gaming (if you will) is a high enough dpi screen that an unfiltered rendering bears only pixelation and anomalies which are indistinguishable to the human eye. based on my plethora of experience with fps's, that threshold is somewhere around 200dpi or higher. this resolution is approximately double what many laptops are offered with.

    the argument for filtration is essentially the same as suggesting using a 3Mp camera, and processing the crap out of it in photoshop; rather than using a 10Mp camera to begin with.

    the only other "side" to this argument is from those who feel the "realism" aspect of a game goes down because the resolution is "too high". there is some validity to this, but it doesn't invalidate the push for better, higher resolutions. while it is true that cranking an older game's settings to the max can expose annoying flaws in rendering--that doesn't invalidate the higher resolution. it simply dates the game. go back and play something like ut2004 or blood2 & you'll see what i mean. the "blockiness" becomes more apparent at higher resolutions because, frankly, it's there. it's part of a game designed with fewer polygons.

    theoretically, 600x400 looks better than 1080p unfiltered--provided you filter the heck out of 600x400--at least with the logic you're espousing.
     
  35. Trottel

    Trottel Notebook Virtuoso

    Reputations:
    828
    Messages:
    2,303
    Likes Received:
    0
    Trophy Points:
    0
    That's not the logic I'm espousing. And by "filter" I get the feeling you don't know exactly what we are talking about. We are strictly speaking anti-aliasing here, afaik the only tool like it in computer gaming. Also when you mention alienware, you mean alienware laptop? And was it on the laptop's display and taking up the full screen both times? If yes, that invalidates your observations due to not running a 1:1 pixel ratio.

    On my 42" LCD I can game at 1080p, but just barely. If I lower the resolution to 900p I can enable anti-aliasing and run the games at the same 60fps as with 1080p. I'm getting less pixels, but the image is of higher quality. I no longer see the individual pixels because they have been slightly blurred together, which in my eyes looks much better than a pixelated screen. What is even better is that the TV, unlike laptop and small/cheap desktop LCD's, does an amazing job of anti-aliasing on its own. For some reason setting the TV to soften the image on its own doesn't work all that well. However, running at 900p and selecting to run the TV using all of its pixels has it run anti-aliasing on the image before displaying it at 1080p, and does the most amazing up-scaling job I've ever seen. So I get to up my other graphics settings on top of that. To me having a softer image looks a whole lot better than being able to distinguish the pixels, but if that's what floats your boat..

    There is also more to anti-aliasing than just getting rid of pixelated images. Anti-aliasing works by blending the colors of adjacent pixels, thus softening the image. Computer images without AA applied to them are 100% sharp. In real life our eyes cannot see the lines of far-away objects so clearly. Our eyes naturally view things pretty softly, at least much, much softer in the distance than computer graphics can make them look. So softening the hard lines of the computer screen sitting a couple feet from your face, even if you have infinite resolution, makes it appear more lifelike, so I think AA will never go away.

    Another thing is that you can't bring digital photography into this argument. A well-taken picture already has the equivalent image quality of a game with infinite polygons, infinite textures, and all graphics settings set infinitely high, including anti-aliasing. The only tool in computer graphics to make a screen look less pixelated is AA, so the analogy can't transfer over to photography. There is aliasing and anti-aliasing in photography, known as sharpening and softening. Any properly focused and steadily taken picture is going to have the perfect degree of sharpness/softness to make it look the way it would with your own eyes. The only part of that that can make it look any better is usually a little bit of sharpening. That doesn't really matter. Either way, anti-aliasing is the only tool I know of to make images less pixelated, and since with photography that is not doable like it is in computer graphics, there is really no comparison.
     
  36. BenLeonheart

    BenLeonheart walk in see this wat do?

    Reputations:
    42
    Messages:
    1,128
    Likes Received:
    0
    Trophy Points:
    55
    The concept of futureproofing IS a concept.
    There is just a notion of what it is, but it does not really come into play.

    E.g., you can buy a 1995 car, and it's gonna be the best that year... later on, in 2010... the same model, newer year, is 10 times better, faster and even more economical than it was back in 1995... for almost the same price (yes, monetary devaluation comes into play too...)

    With a computer, I believe you can future-proof yourself up to 1 year, 2 years max, with its components...
    there's always something new coming around...
    Like say, for example, everyone's going crazy for the ATi 5870... then a new most powerful single GPU will pop up in a couple of months... but you're still able to handle the games coming for almost a year...

    Futureproofing is just that, a concept.
     
  37. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    Higher resolutions also reduce aliasing in and of themselves; the only problem is that this requires additional graphics power.

    I disagree; with infinite resolution AA makes no sense. In fact, as resolutions continue to increase, you will find that the effects of AA will be less and less noticeable.

    The "softness" you mention is simply due to imperfection of human vision, and given the opportunity I'd prefer the sharper image. Anti-aliasing is designed to minimize the distortion caused by resolution limitations, not make things look "soft". The latter is merely a side-effect.

    If smoothing is what you're looking for, you might have to find dedicated smoothing filters rather than just anti-aliasing. Of course, you could also continue to manually set the resolution lower than the maximum as well.
     
  38. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    this is PRECISELY what my point is. well said sir!

    my "experiment" was run on an aurora desktop with the (then) mighty ti4600 GPU & a maximum resolution of 1920x1440. i played around with both AA and AF. i stand by my conclusions.

    IF games are supposed to be more "realistic" they need to be designed that way. i, as a consumer, should not have to resort to lower quality settings and "cheating the system" in order to mimic real life. if a game needs a lack of focus beyond a given threshold--that should be intrinsic to the engine.

    i'm also willing to bet that you are not a professional-level gamer. although i am not even close to that level of competition myself--i side with the logic thereof. sharper rendering = better game-play performance. i don't want to hesitate because i'm unsure whether the dot i'm looking at is AA in action, or an actual enemy sniper.

    in any case, this thread has derailed. to the OP:

    if you want future proof, the best you can do is buy the bleeding-edge of performance, even if you never game. in 4-6 years...you'll be glad you did.
     
  39. Trottel

    Trottel Notebook Virtuoso

    Reputations:
    828
    Messages:
    2,303
    Likes Received:
    0
    Trophy Points:
    0
    Fine, I guess we just have different preferences. I can't stand seeing jagged images and feel that lowering the resolution one step is more than a fair tradeoff to get rid of them, but you clearly prefer it the other way. Also with anti-aliasing you are more likely to see a sniper if he is smaller than a pixel due to the way the pixel is sampled, as the average of sampled points and not a single point which may or may not include the sniper.

    You must admit that's a bad idea. You should go with the minimum you need today or expect to need in the near future. If you need something more powerful down the line, you can upgrade your laptop or sell it on ebay and buy a new one. You will be happier with a new laptop boasting modern technology every couple of years than suffering with an outdated dinosaur until it bites the dust and you can't pay anyone to take it off your hands.
     
  40. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    The point is that the images will be less and less jagged as resolution increases.
    As for the case of the sniper smaller than a pixel, it's more complicated than that. It's true that if you're using some form of supersampling (not all forms of AA do this), a black sniper on a white background might cause the pixel to appear grey. However, increasing the resolution could also make the sniper appear, because it's more likely that the sniper will take up one of the pixels.
    Even more importantly, when you actually attempt to shoot that sniper, the game can only resolve your aim down to the pixels. If you shoot at that grey pixel that contains the sniper, there's no guarantee that you'll actually hit the sniper.
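
    A minimal sketch of that sampling idea (2x2 supersampling; the scene function and the sniper's position are made-up stand-ins, not anything from a real engine):

        # scene(x, y) returns the color at an exact point: 0.0 = dark sniper,
        # 1.0 = bright background. The sniper covers a quarter of pixel (10, 20).
        def scene(x, y):
            return 0.0 if (10.25 <= x < 10.5 and 20.25 <= y < 20.5) else 1.0

        def pixel_no_aa(px, py):
            return scene(px + 0.5, py + 0.5)      # single sample at the centre

        def pixel_2x2_ssaa(px, py):
            samples = [scene(px + dx, py + dy)
                       for dx in (0.25, 0.75) for dy in (0.25, 0.75)]
            return sum(samples) / len(samples)    # average of 4 sub-samples

        print(pixel_no_aa(10, 20))     # 1.0  -> the sniper vanishes entirely
        print(pixel_2x2_ssaa(10, 20))  # 0.75 -> a greyish pixel hints he's there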

    Agreed. Quite frankly, the very concept of futureproofing a computer is mostly nonsense. My own saying on the matter is quite simple: "The best kind of futureproofing is extra money in your wallet".
     
  41. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    that's quite true on both counts. i'd actually thought about sniper accuracy when i got up this morning.

    while there is no such thing as future proof, the concept will require different strokes for different folks.
     
  42. sean473

    sean473 Notebook Prophet

    Reputations:
    613
    Messages:
    6,705
    Likes Received:
    0
    Trophy Points:
    0
    i don't know... i don't live in the US, but maybe they do... here's a review below where you can see how good the screen is, but generally, not a good option for 13 inch gaming... it's super toasty... worse than my DV5t, which is quite old and runs quite hot... without mods, that is...

    http://www.notebookcheck.net/Review-Sony-Vaio-VPCZ11X9E-B-Notebook.28704.0.html

    it has a 1600x900 screen but you can get a full HD one...
     
  43. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    i use a cooler--heat is not something of utmost concern to me.

    inability to fit a performance-tier card in a 13" chassis? that's important to me.
     
  44. thinkpad knows best

    thinkpad knows best Notebook Deity

    Reputations:
    108
    Messages:
    1,140
    Likes Received:
    0
    Trophy Points:
    55
    Yeah, putting a 5850 in a 13" is like putting a V8 LS1 5.7L engine into a Porsche 944, which by the way has been done, many many times... :p
     
  45. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    i would buy it. it'd be even better if it was a 1080p screen in a 14" w/ a 5870...but the consensus seems to be against viable cooling options in that form.
     
  46. leslieann

    leslieann Notebook Deity

    Reputations:
    830
    Messages:
    1,308
    Likes Received:
    11
    Trophy Points:
    56
    There are these things called Monitors... Yeah, they can be plugged in. :p


    A V8 in a 944, it could be a meek little mouse (some v8's are anemic) or a deathtrap, but a very, very fun deathtrap! :D
    I'll take a BMW 3 series with one instead, the 944 is an expensive Porsche to maintain.
     
  47. inperfectdarkness

    inperfectdarkness Notebook Evangelist

    Reputations:
    100
    Messages:
    387
    Likes Received:
    102
    Trophy Points:
    56
    when you work where i work, desktops are not a viable "road" option. monitors that "plug in" aren't going to cut it.

    how the heck did this thread get derailed on craptastic porsche cars. go 928 or go home.
     
  48. Apollo13

    Apollo13 100% 16:10 Screens

    Reputations:
    1,432
    Messages:
    2,578
    Likes Received:
    210
    Trophy Points:
    81
    Depends on the options at the time, really. I spent about 2.5 times as much on my laptop as one of my friends who bought a couple months after mine; unsurprisingly, mine was much more powerful in CPU, GPU, and HDD. I planned to keep mine for 4 years; he planned to keep his for two and buy another one, also relatively inexpensive, and keep it for two years as well. But two years comes around, and to get a laptop that was equal to mine in both processor and graphics capabilities still cost 1.5 times as much as what he paid for his first, inexpensive one. So on the whole, he would've paid the same amount over four years, but have had a significantly weaker computer the first two years. Perhaps a slightly faster processor the latter two years, but not nearly enough to make up for the performance penalty the first two years. He ended up getting a desktop instead, and still has the weak notebook.

    So I didn't spend much more over the course of four years, and I had much better performance the first two years. I'll grant that I bought at an opportune time, and had I been in the market a year later, that may not have been possible. But sometimes you can get better performance in the short term and not actually spend any more in the long term by buying higher-end (not top-of-the-line) components. You just have to be able to not keep upgrading to whatever the next high-end is when it comes out.

    It really does depend on your standards, too - if you want to play every game on high settings, you can't really future-proof much. If you are fine with playing on low settings in a couple years, you can future-proof a bit. And even if you did buy an inexpensive laptop every couple years, you'd still be playing low-end on new games. Of course, if you don't do anything particularly demanding, "future-proofing" mainly consists of making sure you get a laptop with decent quality.
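
    The arithmetic behind that comparison, with an illustrative base price (the 2.5x and 1.5x ratios come from the post above; the $600 starting figure is just an assumed round number):

        cheap_first  = 600                  # assumed price of the friend's first laptop
        expensive    = 2.5 * cheap_first    # the higher-end machine: $1500
        cheap_second = 1.5 * cheap_first    # equal performance two years later: $900

        print("high-end once:", expensive)                    # 1500
        print("cheap twice:  ", cheap_first + cheap_second)   # 1500
        # Same total outlay over four years, but the high-end machine was the
        # faster one for the first two of them.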
     
  49. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    You're missing a couple of important aspects of the calculation. Firstly, your friend would've made at least some money back by selling that 2-year old laptop. Secondly, the money he saved on the initial purchase is worth more 2 years down the track because it could be invested instead.
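
    Extending the same illustrative figures to include resale and a modest return on the money saved up front (the 30% resale fraction and 5%/year return are assumptions for the sake of the example, not numbers from the thread):

        cheap_first, cheap_second, expensive = 600, 900, 1500   # illustrative prices

        resale_fraction = 0.30   # assume the 2-year-old laptop resells for 30%
        annual_return   = 0.05   # assume the cash saved up front earns ~5% a year

        resale          = resale_fraction * cheap_first                     # 180
        saved_up_front  = expensive - cheap_first                           # 900
        investment_gain = saved_up_front * ((1 + annual_return) ** 2 - 1)   # ~92

        print("high-end once, net cost:", expensive)                        # 1500
        print("cheap twice,   net cost:",
              cheap_first + cheap_second - resale - investment_gain)        # ~1228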

    I do agree that, say, selling your laptop and buying a new one every year probably isn't worthwhile, because a single year mostly doesn't make enough of a difference to the hardware. Additionally, the best value on the market is not necessarily at a single price point, and can vary from higher values to lower ones whenever there's a large discount, or a manufacturer releases a highly competitive model.

    I would say it's most important to focus on bang for your buck and performance now rather than at any point in the future. If there's a more expensive option that still offers a good level of performance/price, then go for it as long as you know that the performance is actually going to help you in the here-and-now.