One thing I've noticed -- 'business' laptops seem to have suffered from far fewer issues with the 'bad' Nvidia video chips than the 'consumer' ones. Could this be because the thermal cycles a typical 'business' laptop experiences are smaller in magnitude, thanks to the greater heatsinking capability of these machines? (i.e. compare the Dell Latitude D630/D830 to the XPS m1330/m1530: the D630/D830 can radiate heat through the entire case, while the XPS uses a plastic case and dissipates heat primarily through the fan system, giving rise to higher peak temperatures and much more severe thermal transients.)
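For what it's worth, the usual back-of-the-envelope model for this is a Coffin-Manson type relation: solder-joint life drops steeply as the temperature swing per cycle grows. A minimal sketch - the exponent and the delta-T figures are purely assumed for illustration, not measured values for these machines:

```python
# Coffin-Manson style estimate: cycles-to-failure for a solder joint scales
# roughly as (delta_T)^-n.  The exponent n (~2-2.5 for many solder alloys)
# and the delta-T values below are illustrative assumptions only.

def relative_life(delta_t_business, delta_t_consumer, n=2.0):
    """Ratio of expected thermal-cycle life, business vs consumer chassis."""
    return (delta_t_consumer / delta_t_business) ** n

# e.g. a metal-cased machine swinging 30 C per cycle vs a plastic-cased one
# swinging 50 C per cycle (hypothetical numbers):
print(relative_life(30, 50))   # ~2.8x more cycles before the joint fails
```

Even a modest difference in the temperature swing compounds into a large difference in expected joint life, which would fit the pattern above.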
-
I owned a Vostro 1500 with the 8600M GT for nearly two years. Sold it to a friend who has owned it for about two years now, and it's still going strong. The cooling is pretty good, and my GPU was decently overclocked, as I recall. Maybe different solder was used, I dunno.
-
Tsunade_Hime such bacon. wow
Massive heatpipe for the GPU, and CPU has its own pretty large heatpipe.
Business laptops are generally better built than consumer-grade ones, but of course it all depends on what you are comparing. -
-
Tsunade_Hime such bacon. wow
I would know, having a gaming laptop myself -- my G71GX is heavy enough with the power brick (close to 11 pounds) and still comes nowhere close to the i7 desktop in my rig.
Desktops will always win on performance-to-price ratio. Laptops...well, you pay that premium just so you can carry it around (or not). -
Meaker@Sager Company Representative
Yeah, a laptop is not going to approach a 5GHz Sandy Bridge with a pair of GTX 580s in SLI.
As for the Nvidia reliability issue, there was a class-action lawsuit and they burned bridges with quite a few companies, so yeah, I think in that instance their reliability was markedly worse than anyone else's. At the time there were plenty of 3650 and 4650/4670 machines in use. -
-
tilleroftheearth Wisdom listens quietly...
Running Windows and a couple of programs (including background programs like A/V, etc.) already uses much more than a dual-core can handle efficiently.
And if I had an 8GHz CPU - you'd better believe I wouldn't be using any of the ~2/~3 (with Turbo) GHz models available now (as long as it still had 4 cores/8 threads).
While I may not notice the difference between 0.005ms and 0.02ms for a single task (like right-clicking on an item), I will notice the speedup at the end of my day (which would be around 3/4pm with an 8GHz system vs. 9/10pm with the 3.4GHz systems I have now).
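The arithmetic roughly works out if most of that time is CPU-bound. A quick sanity check - the start time and the CPU-bound fraction below are assumptions, not measurements:

```python
# Back-of-envelope check of the workday claim above.  The start time and the
# fraction of the day that actually scales with clock speed are assumed.
start_hour   = 9.0      # assume the workday starts at 9am
end_hour_now = 21.5     # finishing around 9-10pm on the 3.4GHz boxes
cpu_fraction = 0.85     # assume 85% of that time is CPU-bound

speedup    = 8.0 / 3.4                                   # ~2.35x clock ratio
hours_now  = end_hour_now - start_hour
hours_fast = hours_now * (cpu_fraction / speedup + (1 - cpu_fraction))
print(start_hour + hours_fast)                           # ~15.4 -> mid-afternoon
```

With numbers in that ballpark the day ends around 3-4pm, so the claim is at least internally consistent.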
Who really cares about 'average users'? Getting work done is the end goal of a powerful system, right? -
-
All the mentioned tasks have bottlenecks somewhere else (disk or network), which actually favours multicore over a single core at high frequency.
If you don't have a fast enough source to feed your 8GHz CPU, you are not really using it.
BTW, the heavily pushed APUs (i.e. from AMD and Nvidia) are in effect massive multicore with pretty slow individual cores, and they are being used in HPC (and supercomputers), where people know a thing or two about performance. -
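That's essentially Amdahl's law: only the compute-bound fraction of a task benefits from a faster clock, while the part stuck waiting on disk or network does not. A minimal sketch with made-up fractions:

```python
# Amdahl-style estimate: only the CPU-bound fraction of a task speeds up
# with a faster clock; the I/O-bound remainder stays the same.  The
# fractions below are made up for illustration.

def overall_speedup(cpu_fraction, clock_speedup):
    io_fraction = 1.0 - cpu_fraction
    return 1.0 / (io_fraction + cpu_fraction / clock_speedup)

# A task that spends 70% of its time waiting on disk/network:
print(overall_speedup(cpu_fraction=0.3, clock_speedup=8.0 / 3.4))   # ~1.2x
```

So a ~2.35x faster clock buys barely 20% on a task that is mostly waiting on I/O.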
tilleroftheearth Wisdom listens quietly...
It takes less than 10 seconds to open and load my (~500MB) image into PS. I then need an hour or three to work on it (filters really slow you down...). With an 8GHz CPU, that time would be cut down to 1/2 hr to 1 1/2 hrs - at least.
Huge difference.
Storage speed in my case is not starving the CPU (16GB RAM...), although it is highly appreciated.
Current 'APUs' are a joke. As for their use in supercomputers: they would also be a bad match for my computational needs (image editing).
If/when they become powerful enough and mainstream enough that Adobe/Nik/Nikon/Bibble/AutoCAD program for them, we can talk about them. Right now? Lol.... -
Again, you are using your own usage pattern, which is perfectly fine for your own decision about whether an 8GHz CPU fits you better, but that was not what we were talking about.
-
The GHz race is over. The future is in cores!
-
tilleroftheearth Wisdom listens quietly...
The future is in efficient programming for all those cores.
See:
ParaSail -- A new language for race-free parallel programming at OSCON 2011 | Lanyrd -
Yes, we need to utilize what we have more efficiently. Hardware can grow exponentially, but without the right software optimizations we're only using a fraction of what the hardware can do.
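A trivial illustration of that gap in Python - the same CPU-bound work run on one core and then spread across all of them (the workload and numbers are arbitrary, and real gains depend entirely on how parallel-friendly the job is):

```python
# Toy example: identical CPU-bound work, run serially and then across all
# available cores.  Purely illustrative.
import time
from multiprocessing import Pool, cpu_count

def busy(n):
    """A deliberately CPU-bound chunk of work."""
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    chunks = [2_000_000] * 8

    t0 = time.time()
    [busy(n) for n in chunks]             # one core
    t1 = time.time()
    with Pool(cpu_count()) as pool:       # all cores
        pool.map(busy, chunks)
    t2 = time.time()

    print(f"serial: {t1 - t0:.2f}s   parallel: {t2 - t1:.2f}s")
```

The hardware is the same in both runs; only the software decides how much of it actually gets used.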
-
niffcreature ex computer dyke
Of course, there is plenty of incentive for manufacturers not to program things efficiently as well.
For all we know, integrated GM965 graphics and a T8100 could play The Witcher 2 maxed out... with a different OS. -
-
niffcreature ex computer dyke
I dunno, people don't do that with graphics cards for obvious reasons.
In fact, marketing is trying to do the opposite - they want us to go by model numbers rather than the actual specs, so we make decisions trusting them completely.
Essentially they want the product to sound more like the name of a car. You get the Ford F-350 for your serious truck use, not because you know what the 350 or anything else means.
It's already been established that these things are intangible. I mean, there are sellers on eBay who are still trying to push P4 CPUs because they run at 3GHz, and some of them succeed at it, so I guess if that's what you mean then sure. -
-
ParamountComputers Notebook Enthusiast
Don -
I just know that if I want occasional portable power, my Shuttle is awesome. It fits in a small duffle bag, and I can tote a 15" or 17" screen with it. Right now it houses an i5-2400 and a GTX 460, mainly due to cost constraints, but there's no reason it can't house an i7-2600 and a GTX 580. Not that you want to have two PCs, but I do for this very reason. For < $1000 I can have a somewhat portable and powerful desktop plus a light gaming laptop.
-
niffcreature ex computer dyke
Oooh, how much does a desktop flatscreen weigh tho?
If I were you, I'd find a 17" assembly from a laptop, and buy an LVDS converter
.....or maybe I just love laptops. -
niffcreature ex computer dyke
My problem with it is that people will still try to upgrade, or buy used laptops... and chances are, if they have a PM965 chipset, they are going to go for a T8100 at 2.1GHz instead of a T7800 at 2.6GHz, or a T7000 instead of a T6700. -
-
I have not found a single 1366x768 panel that even meets "average". Every one of them suffers from at least two of: bad viewing angles, washed-out colors, being overbright or not bright enough, and horrible contrast.
-
ASUS G series laptops have a 120Hz 1366x768 panel. Given it's 120Hz, it should be better than a normal 1366x768 panel. Of course, it's like 15", lol...
Lenovo offers an IPS panel option on the X220 (12.5") at 1366x768. -
Tsunade_Hime such bacon. wow
I think he meant a standard 15.6" or 14" @ 1366x768. The X220's IPS panel is a rare exception.
The point is, cheap laptops typically have awful screens because the makers can't strip away the features/CPUs, so the panel is where they cut costs. -
-
No, I'm saying most 768p screens have horrible qualities compared with their higher-resolution counterparts. I've gone through no fewer than half a dozen laptops with various screens in the last year, and the 768p screens have had horrible viewing angles, poor brightness and contrast, and sometimes a grainy appearance. The 1080p screens have been excellent, with the exception of a 17" 1080p screen (Sager NP8170) whose default panel had poor viewing angles but was otherwise a good screen.
It's just a fact that in order to get laptops as cheap as possible they skimp where they can, and they can drop pricing significantly by using the lowest-end screens. The OEMs will admit to it. -
Oh, well. One bright aspect (literally): it's typically less opaque. Of course, they will counter by using a weaker/cheaper backlight...
As for bad 1080p panels on laptops, I think the only bad one is made by AUO or AOC - I don't remember which one was an actual panel manufacturer.