The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Nvidia/Ati

    Discussion in 'Gaming (Software and Graphics Cards)' started by Chk, Aug 23, 2008.

  1. Chk

    Chk Notebook Consultant

    Reputations:
    3
    Messages:
    130
    Likes Received:
    0
    Trophy Points:
    30
    Why aren't these companies going for stability and lower temperatures before the steady climb to graphics superiority? Anyone wanna comment? I mean, if it runs, shouldn't that be fine for a laptop?
     
  2. 5482741

    5482741 5482741

    Reputations:
    712
    Messages:
    1,530
    Likes Received:
    17
    Trophy Points:
    56
    Some people want higher performance; higher performance leads to more heat being generated, and higher temperatures lead to less stability.
     
  3. Chk

    Chk Notebook Consultant

    Reputations:
    3
    Messages:
    130
    Likes Received:
    0
    Trophy Points:
    30
    Couldn't agree more; I'd add a rep point to you if I could. "Some people" are making computers into a sad jumble of technological leaps and bounds.
     
  4. Satyrion

    Satyrion Notebook Deity

    Reputations:
    123
    Messages:
    1,404
    Likes Received:
    0
    Trophy Points:
    55
    which is why I chose the amazing Intel X3100 :D

    When I got it, I never expected anything from it, so I didn't get that disappointed when I got 6 FPS in Diablo II
     
  5. Bog

    Bog Losing it...

    Reputations:
    4,018
    Messages:
    6,046
    Likes Received:
    7
    Trophy Points:
    206
    ^ Diablo 2 isn't even a 3D game; the Intel X3100 should be able to handle that with ease.
     
  6. Satyrion

    Satyrion Notebook Deity

    Reputations:
    123
    Messages:
    1,404
    Likes Received:
    0
    Trophy Points:
    55
    Yeah, I wish it was, but when I play with my sorc and cast some blizzards, or when there's a lot of action on my screen, I get 6 FPS. This is when I run it in Direct3D. Can't run this game in OpenGL.
     
  7. Mikelx215

    Mikelx215 Notebook Evangelist

    Reputations:
    132
    Messages:
    474
    Likes Received:
    0
    Trophy Points:
    30
    I don't really see what you're saying here, Mr Dsk. If you want low temperature and low power consumption, you get integrated graphics. If you'd like bleeding edge performance, in the form of polygons per second, you get a high end GPU. If you need to render 3D animation or engineering programs you get a workstation card. With proper driver support and cooling systems, these should all be pretty stable.

    There are some chocolate and peanut butter solutions here and there: Lower end dedicated GPUs, more powerful integrated GPUs, die shrinks, hybrid graphics solutions, downclocking... - whatever.

    My advice is to get the GPU for the job. If you'd like your laptop to run cool and play the game just functionally, maybe a high-power GPU isn't for you. Different strokes for different folks.

    Check out the GPU Guide and maybe do some digging through Google for reviews that mention heat and forum posts that complain about stability, then assess whether the product meets your needs. Someone wants your money, so chances are there is a product out there that's suited just for you.

    And I'm sure someone will give us an informative lecture on the benefits of laptop coolers. :)
     
  8. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,084
    Trophy Points:
    931
    Most of the 'blame' for the high temperatures can go on the notebook manufacturers who design the heatsinks for the notebooks, and not the chip manufacturers. Now, if the chip manufacturers are not giving the correct TDP figures (TDP stands for Thermal Design Power; it tells heatsink makers how much thermal energy their heatsink has to dissipate to keep the chip from reaching its maximum rated temperature), then that, of course, would be the problem. It's happened before, but would be very difficult to prove in the notebook world on your own.
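The relationship Chaz describes between TDP and heatsink design can be sketched numerically. This is a rough back-of-the-envelope model only; the wattages, temperatures, and thermal resistances below are hypothetical illustrations, not real chip or notebook specs.

```python
# Rough sketch: sizing a heatsink from a chip's TDP.
# All numbers below are hypothetical, chosen only to illustrate
# why a marginal heatsink leaves so little thermal headroom.

def max_thermal_resistance(tdp_watts, t_max_c, t_ambient_c):
    """Highest chip-to-ambient thermal resistance (C/W) a heatsink
    may have while keeping the chip at or below its rated maximum."""
    return (t_max_c - t_ambient_c) / tdp_watts

def load_temperature(tdp_watts, r_thermal_c_per_w, t_ambient_c):
    """Steady-state chip temperature under full load for a given
    heatsink thermal resistance."""
    return t_ambient_c + tdp_watts * r_thermal_c_per_w

# A hypothetical 35 W GPU rated for 105 C inside a 35 C chassis:
r_max = max_thermal_resistance(35, 105, 35)    # -> 2.0 C/W allowed
# A heatsink that "just cuts it" at 1.8 C/W runs the chip near 98 C,
# only a few degrees under the rated maximum:
t_load = load_temperature(35, 1.8, 35)
```

Under this simple model, a heatsink only 10% better than the bare minimum still leaves the chip within a few degrees of its limit under load, which matches the "little headroom" behavior described for the dv5t.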

    Note that graphics cards can get hotter than other chips such as the processor. Many are okay well over 100*C. The Nvidia 9600M-GT in my HP dv5t runs quite warm at 85*C under full load, but that temperature is under the maximum temperature of the card so the thermal solution employed by HP is doing its job satisfactorily.

    The propagator of the whole graphics-cards-getting-too-hot-and-failing deal is The Inquirer, which is known for sensationalist stories. The Nvidia failures were acknowledged by Nvidia, yes. However, the problem with the chips is [was] the packaging material used, not the chip itself.

    I'm convinced that notebook manufacturers are skirting too close to the thermal limit of graphics cards, and in some cases, the CPU, by using heatsinks that just cut it. My HP's GPU as noted gets hot, and there is little headroom before the maximum temperature for the chip is reached (other dv5t owners have reported the same GPU getting over 90*C). Most of the notebooks I've had to review had GPU temperatures over 85 - 90*C, and some even overheated. Without a doubt in my mind, cooling solutions used by notebook manufacturers need to be beefed up so they keep components cooler, and minimize the risk of overheating.
     
  9. Arquis

    Arquis Kojima Worshiper

    Reputations:
    844
    Messages:
    1,688
    Likes Received:
    0
    Trophy Points:
    55
    Because some people want to play games... there are lower-end cards that do not get hot at all. My 8400M GS doesn't go above 51*C, for example. You just have to know what you're buying. It seems like you're blaming people for high-end laptops existing, as if there were no choice to get a low-end, cool GPU. Pretty weak argument if you ask me.

    "Shouldn't it be fine for a laptop?" No, it shouldn't. I needed to replace my desktop with a laptop for college, and I am more than willing to withstand the heat, which is safe unless you're a newborn baby, and I didn't have to sacrifice the power since I like to play games. Now, if they refused to make powerful cards just to make some people comfortable, then I'd be angry. As for stability? How are they very unstable? Mine's been fine. If you're not overclocking them at all, or past their limits, then they are fine. What exactly is the problem?
     
  10. Mikelx215

    Mikelx215 Notebook Evangelist

    Reputations:
    132
    Messages:
    474
    Likes Received:
    0
    Trophy Points:
    30
    I don't think 'protagonist' is quite the word I'd use to describe the Inquirer, Chaz. Maybe you were thinking of propagator?
     
  11. Charles P. Jefferies

    Charles P. Jefferies Lead Moderator Super Moderator

    Reputations:
    22,339
    Messages:
    36,639
    Likes Received:
    5,084
    Trophy Points:
    931
    Probably, I'll change it just in case. Calling the Inquirer a protagonist is giving them too much credit.
     
  12. Doodles

    Doodles Starving Student

    Reputations:
    178
    Messages:
    880
    Likes Received:
    0
    Trophy Points:
    30
    Going for superior graphics ability is actually what allows us to make the cards that ARE stable and cool. For example, the 45nm technology used to make the Q9x50s, which run hot because they have four powerful cores, also allows us to make the E8x00 processors, which have only 2 cores at 45nm... a HUGE improvement over the E6x00 series. Making leaps at the hardcore end of the technology lets us back that technology down into "weaker" stuff, which is still way more powerful than last-generation stuff and runs cooler. Processors were the best example, but for GPUs: the 9500 is a powerful, cool card only because we crippled the hot 9800...
     
  13. Magnus72

    Magnus72 Notebook Virtuoso

    Reputations:
    1,136
    Messages:
    2,903
    Likes Received:
    0
    Trophy Points:
    55
    What if people could learn to take care of their laptops even more? I mean, a high-end GPU of course gets quite warm/hot. Quite a few people also overclock laptops that have inferior heat dissipation; no wonder the GPUs get hot. Also, if you are concerned, you can actually check the temperature/throttling threshold in NiBiTor and see where your card starts to throttle.


    Just regular maintenance solves a lot of issues; dust bunnies are a huge culprit in temps. The more "hardcore" can put on AS5 or even mod the heat dissipation themselves. But the easiest way is just to use a can of air and blow out the dust once a week.

    Raising the laptop a little in the back yields a drop of around 5-6 degrees, maybe even more depending on the ambient temp. Simple things like that keep the temps down :)

    Take care of your laptops and they will last longer ;)
     
  14. Chk

    Chk Notebook Consultant

    Reputations:
    3
    Messages:
    130
    Likes Received:
    0
    Trophy Points:
    30
    I think only a few of the posts I read answer the original question. I wasn't asking for an opinion, but a serious, definitive answer: does anyone know people at the companies, or has anyone read over their customer surveys, who can tell me why this is so frequent?
     
  15. Mikelx215

    Mikelx215 Notebook Evangelist

    Reputations:
    132
    Messages:
    474
    Likes Received:
    0
    Trophy Points:
    30
    I don't understand your question, but I'll try my best to help you out. Each GPU designer/manufacturer offers a wide variety of products to meet their customer's needs.

    You might not need a fast high-end gaming card; maybe I do. Maybe you just need a no-frills dedicated GPU that runs cool. Maybe dedicated graphics aren't for you at all. The point is, each of these product tiers has its market, and if you aren't satisfied with what you're finding, maybe you should re-assess your needs.

    What are you trying to do, and what problems have you personally had with nVidia or ATI's hardware?

    Have you considered Intel Integrated Graphics?
     
  16. JPZ

    JPZ Notebook Deity

    Reputations:
    339
    Messages:
    966
    Likes Received:
    0
    Trophy Points:
    30
    Chaz, excellent post.

    The reason for high temperatures is definitely the way notebook manufacturers design their cooling systems. The main problem is that people want it all: they want their computer to be fast, they want it to be thin and lightweight, and they want it to be silent. Heatsinks are solid metal, and metal is heavy. Manufacturers are in most cases using barely-large-enough heatsinks in order to keep weight to a minimum and reduce laptop size/thickness. Possibly the biggest cause of high temperatures is the fans in current laptops: not only are they small and borderline useless, but manufacturers try to keep them from spinning so you don't have to listen to the fan.

    If manufacturers had more aggressive fan profiles, temperatures would probably be a lot lower. This may not be true in all cases, but it is in many.

    The Uniwill laptop in my sig has an 80-watt desktop AMD processor from back in the Socket 754 days. Under 100% load 24/7, the processor never breaks 60C. It is usually around 55C. I should mention that the Uniwill is a 15.4" laptop. It isn't extraordinarily thick or heavy (actually, it does weigh a little over 8lbs, but that is because the entire chassis is metal). The reason it stays so cool with such a high-wattage processor is that the fan actually turns on and moves air. With a decent cooling system, we could probably see load temps well under 40C on modern laptops since they use so little power.
     
  17. link1313

    link1313 Notebook Virtuoso

    Reputations:
    596
    Messages:
    3,470
    Likes Received:
    0
    Trophy Points:
    105
    If you want low temps, etc., go with an integrated solution (Intel has the best right now).
     
  18. unknown555525

    unknown555525 rawr

    Reputations:
    451
    Messages:
    1,630
    Likes Received:
    0
    Trophy Points:
    55
    Everyone has been saying the same thing. And it's all correct, I don't understand what the OP is complaining about.
     
  19. LeetPix

    LeetPix Notebook Guru

    Reputations:
    35
    Messages:
    52
    Likes Received:
    0
    Trophy Points:
    15
    I overclocked the 9600M GT in my HP dv5t (500MHz core clock, 1000MHz shader clock and 500MHz memory clock) and it runs at 80C-82C MAX; that's after around 5 hours of continuous TDU/NFS ProStreet on max settings. It's at 50C when I'm not gaming too ;p
     
  20. Valerone

    Valerone Newbie

    Reputations:
    0
    Messages:
    1
    Likes Received:
    0
    Trophy Points:
    5
    I think what the OP meant was:

    Nvidia/ATi (especially Nvidia) put graphics superiority first, rather than stability and lower temperatures.

    Why not place stability and lower temperatures as their main concern first, then steadily climb towards graphics superiority?

    Well, in my opinion, for the past few years, customers' (us users') main concern has been graphics superiority. Why wouldn't it be, since games demand higher graphics capability all the time?

    Naturally, as companies, ATi/Nvidia (especially Nvidia) would want to satisfy their customers (in order to gain more profit) by offering greater graphics superiority while putting temperatures and stability second.

    But ever since the news of the faulty Nvidia GPUs, things might change now.

    GPU stability and temperature (maybe 35C max when playing Crysis :p) have to be ATi/Nvidia's first priority now, or else PC gaming will not have a bright future.

    I'm guessing there might be a slowdown in newer, more hardcore, more graphics-demanding games, and also in the performance of GPUs, for the next couple of years.
     
  21. Thaenatos

    Thaenatos Zero Cool

    Reputations:
    1,581
    Messages:
    5,346
    Likes Received:
    126
    Trophy Points:
    231
    You could say that Nvidia is going for stability, as they shrank the GPU process from 65nm to 55nm, so less heat. Other than that, technology needs to improve, since software requires more powerful hardware to run smoothly.

    I disagree; I've always had better results from ATi IGPs than Intel IGPs.
     
  22. Bog

    Bog Losing it...

    Reputations:
    4,018
    Messages:
    6,046
    Likes Received:
    7
    Trophy Points:
    206
    I disagree; heat output from die shrinks would lessen if manufacturers didn't also use them to cram more transistors onto the die. Truth is, the lower TDPs are probably offset by the extra transistors that CPU and GPU manufacturers add.
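    The tradeoff Bog describes can be illustrated with the standard dynamic-power relation, P ≈ C·V²·f (switched capacitance × voltage squared × clock frequency). The capacitance factors, voltages, and clocks below are made-up illustrations, not real 65nm/55nm figures:

```python
# Sketch: why a die shrink's power savings can be eaten by extra
# transistors. Uses the standard dynamic-power relation
# P ~ C * V^2 * f; all numbers are hypothetical illustrations.

def dynamic_power(capacitance, voltage, frequency_hz):
    """Relative dynamic power dissipation for a CMOS chip."""
    return capacitance * voltage**2 * frequency_hz

# Old process: baseline switched capacitance, 1.2 V, 500 MHz.
p_old = dynamic_power(1.0, 1.2, 500e6)

# Shrink alone: ~30% less switched capacitance, slightly lower voltage.
p_shrunk = dynamic_power(0.7, 1.1, 500e6)

# Shrink plus 50% more transistors and a clock bump: much of the
# saving is clawed back by the larger, faster design.
p_bigger = dynamic_power(0.7 * 1.5, 1.1, 550e6)

assert p_shrunk < p_old      # the shrink itself cuts power
assert p_bigger > p_shrunk   # extra transistors offset the gain
```

    Under these toy numbers the shrunk-but-enlarged chip lands close to the original's power draw, which is exactly the "lower TDPs offset by extra transistors" effect.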