Why aren't these companies going for stability and lower temperatures before the steady climb to graphics superiority? Anyone wanna comment? I mean, if it runs, shouldn't that be fine for a laptop?
-
Some people want higher performance; higher performance leads to more heat being generated, and higher temperatures lead to less stability.
-
Couldn't agree more; I'd add a rep point to you if I could. "Some people" are making computers into a sad jumble of technological leaps and bounds.
-
Which is why I chose the amazing Intel X3100.
When I got it, I never expected anything from it, so I wasn't that disappointed when I got 6 FPS in Diablo II -
^ Diablo 2 isn't even a 3D game; the Intel X3100 should be able to handle that with ease.
-
Yeah, I wish it would, but when I play with my sorc and cast some Blizzards, or when there's a lot of action on my screen, I get 6 FPS. That's when I run it in Direct3D; I can't run this game in OpenGL.
-
I don't really see what you're saying here, Mr Dsk. If you want low temperature and low power consumption, you get integrated graphics. If you'd like bleeding edge performance, in the form of polygons per second, you get a high end GPU. If you need to render 3D animation or engineering programs you get a workstation card. With proper driver support and cooling systems, these should all be pretty stable.
There are some chocolate-and-peanut-butter solutions here and there: lower-end dedicated GPUs, more powerful integrated GPUs, die shrinks, hybrid graphics solutions, downclocking... whatever.
My advice is to get the GPU for the job. If you'd like your laptop to run cool and play games just functionally, maybe a high-power GPU isn't for you. Different strokes for different folks.
Check out the GPU Guide and maybe do some digging through Google for reviews that mention heat and forum posts that complain about stability - then assess whether the product meets your needs. Someone wants your money, so chances are there's a product out there that's suited just for you.
And I'm sure someone will give us an informative lecture on the benefits of laptop coolers. -
Charles P. Jefferies, Lead Moderator
Most of the 'blame' for the high temperatures can go on the notebook manufacturers who design the heatsinks for their notebooks, not on the chip manufacturers. Now, if the chip manufacturers are not giving correct TDP (Thermal Design Power) figures - the TDP tells heatsink makers how much thermal energy their heatsink has to dissipate in order to keep the chip from reaching its maximum rated temperature - then that, of course, would be the problem. It's happened before, but it would be very difficult to prove in the notebook world on your own.
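As a rough sketch of what a TDP number means to the heatsink designer (the 35 W / 100°C / 35°C figures below are made up for illustration, not any particular chip's specs): the cooler's total chip-to-air thermal resistance has to satisfy θ ≤ (T_max − T_ambient) / TDP, or the chip will exceed its rated temperature at full load.

```python
def max_thermal_resistance(t_max_c, t_ambient_c, tdp_watts):
    """Worst-case chip-to-air thermal resistance (deg C per watt)
    the cooling solution must beat to keep the chip under t_max."""
    return (t_max_c - t_ambient_c) / tdp_watts

# Made-up numbers: a 35 W mobile GPU rated for 100 C inside a 35 C chassis
# needs a heatsink/fan combination better than about 1.86 C/W.
print(max_thermal_resistance(100, 35, 35))  # -> ~1.857
```

Shave grams off the heatsink and that resistance number creeps up, which is exactly the headroom problem described below.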
Note that graphics cards can get hotter than other chips such as the processor; many are okay well over 100°C. The Nvidia 9600M-GT in my HP dv5t runs quite warm at 85°C under full load, but that temperature is under the card's maximum, so the thermal solution employed by HP is doing its job satisfactorily.
The propagator of the whole graphics-cards-getting-too-hot-and-failing deal is The Inquirer, which is known for sensationalist stories. The Nvidia failures were acknowledged by Nvidia, yes. However, the problem with those chips was the packaging material used, not the chip itself.
I'm convinced that notebook manufacturers are skirting too close to the thermal limit of graphics cards, and in some cases the CPU, by using heatsinks that just barely cut it. My HP's GPU, as noted, gets hot, and there is little headroom before the chip's maximum temperature is reached (other dv5t owners have reported the same GPU getting over 90°C). Most of the notebooks I've reviewed had GPU temperatures over 85-90°C, and some even overheated. Without a doubt in my mind, the cooling solutions used by notebook manufacturers need to be beefed up so that they keep components cooler and minimize the risk of overheating. -
"shouldn't it be fine for a laptop?" No. It shouldn't. I needed to replace my desktop with a laptop for college, and I am more than willing to withstand the heat, which is safe unless you're a newborn baby, and didn't have to sacrifice the power since I like to play games. Now if they refused to make cards that were powerful just to make some people comfortable, then I'd be angry. As for stability? How are they very unstable? Mines been fine. If you're not overclocking them at all or past their limits, then they are fine. What exactly is the problem? -
-
Going for superior graphics ability is actually what allows us to make the cards that ARE stable and cool. For example, the 45nm technology used to make the Q9x50s (which run hot because they have four powerful cores) also allows us to make the E8x00 processors, which have only two 45nm cores... a HUGE improvement over the E6x00 series. Making leaps in the hardcore technology lets us bring that technology down to "weaker" parts, which are still way more powerful than last-generation stuff and run cooler. Processors are the best example, but the same goes for GPUs: the 9500 is a powerful, cool card only because it's a cut-down version of the hot 9800...
-
What if people learned to take better care of their laptops? A high-end GPU of course gets quite warm, even hot. Quite a few people also overclock on laptops with inferior heat dissipation; no wonder the GPUs get hot. If you're concerned, you can actually check the temperature/throttling threshold in NiBiTor and see where your card starts to throttle.
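For those who want to keep an eye on it while gaming, here's a minimal sketch that polls the GPU temperature (assuming the nvidia-smi utility is installed; the 90°C throttle point is a made-up placeholder, so check your own card's actual limit in NiBiTor):

```python
import subprocess
import time

THROTTLE_POINT_C = 90  # hypothetical threshold; verify yours in NiBiTor

def gpu_temp_celsius():
    # Ask the NVIDIA driver for the current core temperature.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"])
    return int(out.decode().strip().splitlines()[0])

while True:
    temp = gpu_temp_celsius()
    headroom = THROTTLE_POINT_C - temp
    print(f"GPU: {temp} C ({headroom} C of headroom before throttling)")
    time.sleep(5)  # poll every five seconds
```

Watching the headroom shrink while gaming tells you quickly whether your cooling (or dust situation) is keeping up.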
Just regular maintenance solves a lot of issues; dust bunnies are a huge culprit in high temps. Those more "hardcore" can put on AS5 or even mod the heat dissipation themselves, but the easiest way is just to use a can of air and blow out the dust once a week.
Raising the back of the laptop a little yields a drop of around 5-6 degrees, maybe even more depending on the ambient temp. Simple things like that keep the temps down.
Take care of your laptops and they will last longer -
I think only a few of the posts I've read answer the original question. I wasn't asking for an opinion, but for a serious, definitive answer: does anyone know people in these companies, or has anyone read their customer surveys, who can tell me why this is so frequent?
-
I don't quite understand your question, but I'll try my best to help you out. Each GPU designer/manufacturer offers a wide variety of products to meet their customers' needs.
You might not need a fast high-end gaming card - maybe I do. Maybe you just need a no-frills dedicated GPU that runs cool. Maybe dedicated graphics aren't for you at all. The point is, each of these product tiers has a market it sells in, and if you aren't satisfied with what you're finding, maybe you should re-assess your needs.
What are you trying to do, and what problems have you personally had with nVidia or ATI's hardware?
Have you considered Intel Integrated Graphics? -
The reason for high temperatures is definitely the way notebook manufacturers design their cooling systems. The main problem is that people want it all: they want their computer to be fast, they want it to be thin and lightweight, and they want it to be silent. Heatsinks are solid metal, and metal is heavy. In most cases, manufacturers use barely-large-enough heatsinks in order to keep weight to a minimum and reduce laptop size/thickness. Possibly the biggest cause of high temperatures is the fans in current laptops: not only are they small and bordering on useless, but manufacturers try to keep them from spinning so you don't have to listen to the fan.
If manufacturers had more aggressive fan profiles, temperatures would probably be a lot lower. This may not be true in all cases, but it is in many.
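To make "more aggressive fan profile" concrete, here's a minimal sketch of how a fan curve maps temperature to fan speed (the breakpoints are invented for illustration; real profiles live in the EC/BIOS firmware and vary by model). An aggressive profile just shifts these points down and left, so the fan ramps up sooner:

```python
# A fan profile is a mapping from temperature to fan duty cycle.
# These breakpoints are made up; real ones are set by the manufacturer.
CURVE = [(40, 20), (55, 40), (70, 70), (80, 100)]  # (deg C, % speed)

def fan_duty(temp_c: float) -> float:
    """Linearly interpolate fan speed (%) for a given temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Linear interpolation between the surrounding breakpoints.
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pegged at full speed above the last point

print(fan_duty(62.5))  # -> 55.0 (% speed) under this made-up curve
```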
The Uniwill laptop in my sig has an 80-watt desktop AMD processor from back in the Socket 754 days. Under 100% load 24/7, the processor never breaks 60°C; it is usually around 55°C. I should mention that the Uniwill is a 15.4" laptop, and it isn't extraordinarily thick or heavy (actually, it does weigh a little over 8 lbs, but that is because the entire chassis is metal). The reason it stays so cool with such a high-wattage processor is that the fan actually turns on and moves air. With a decent cooling system, we could probably see load temps well under 40°C on modern laptops, since they use so little power. -
If you want low temps etc., go with an integrated solution (Intel has the best right now).
-
Everyone has been saying the same thing, and it's all correct. I don't understand what the OP is complaining about.
-
I think what the OP meant was:
Nvidia/ATI (especially Nvidia) put graphics superiority first, rather than stability and lower temperatures.
Why not make stability and lower temperatures their main concern first, then steadily climb towards graphics superiority?
Well, in my opinion, for the past few years customers' (us users') main concern has been graphics superiority. Why wouldn't it be, since games demand higher graphics capability all the time?
Naturally, as companies, ATI/Nvidia (especially Nvidia) want to satisfy their customers (in order to gain more profit) by offering greater graphics superiority while making temperatures and stability their second concern.
But ever since the news about the faulty Nvidia GPUs, things might change.
GPU stability and temperature (maybe 35°C max when playing Crysis) have to be ATI/Nvidia's first priority now, or else PC gaming will not have a bright future.
I'm guessing there might be a slowdown in newer, more hardcore, more graphically demanding games, and also in GPU performance gains, for the next couple of years. -
You could say that Nvidia is going for stability: they shrank the GPU from a 65nm to a 55nm process, so there's less heat. Other than that, technology needs to improve, as software requires more powerful hardware to run smoothly.
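As a back-of-the-envelope sketch of why a die shrink means less heat, using the standard CMOS dynamic-power relation P ≈ a·C·V²·f (the capacitance and voltage scaling factors below are illustrative guesses, not Nvidia's actual 55nm numbers):

```python
def dynamic_power(activity, capacitance, voltage, frequency):
    """Classic CMOS switching-power approximation: P = a * C * V^2 * f."""
    return activity * capacitance * voltage ** 2 * frequency

# Illustrative only: suppose the 55nm shrink cuts switched capacitance ~15%
# and lets the core run at 1.05 V instead of 1.10 V at the same clock.
p_65nm = dynamic_power(0.5, 1.00, 1.10, 500e6)
p_55nm = dynamic_power(0.5, 0.85, 1.05, 500e6)
print(f"relative power after shrink: {p_55nm / p_65nm:.0%}")  # ~77%
```

Because voltage enters squared, even a small voltage reduction from the shrink pays off disproportionately in heat.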
-