The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Time for a new video card

    Discussion in 'Desktop Hardware' started by tijo, Mar 4, 2015.

  1. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Since we now finally have a desktop section, I might as well ask here.

    It's time for me to get a new video card.

    My current system is as follows:
    i7-2600K
    16 GB 1600 MHz RAM
    GTX570
    700 W PSU (Cooler Master Silent Pro)
    Dual 1920x1080 monitors

    While my CPU still does what I need for now, I feel that my GTX 570, while still respectable, is getting a bit long in the tooth, especially with only 1.25 GB of VRAM among other things, so I'm in the market for a new video card.

    I don't plan on going crossfire or SLI any time soon and I don't have any brand preference either, so AMD or nVidia is fine.

    Let's say my budget is up to $500 CAD. Ideally, I'd like something in the $200-300 range, but I'll take suggestions up to $500.

    I don't need to run the latest games at maxed settings (high would be nice), but I'd at least like to be able to run the Homeworld Remastered Collection with everything cranked to max.

    So here's the killer question, what video card would you suggest I get?

    EDIT: Don't forget that I'm in Canada and that sometimes the prices in Canada are somewhat of a rip-off compared to the US (hence the up to $500). I'll likely be purchasing online from any decent Canadian e-tailer: Newegg.ca, NCIX, Memory Express, etc.
     
  2. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Right now, the best GPU you can get within your budget is the R9 290X. It's about $300 USD.
     
    tijo likes this.
  3. Syndrome

    Syndrome Torque Matters

    Reputations:
    1,765
    Messages:
    1,501
    Likes Received:
    546
    Trophy Points:
    131
    The GTX 970 is pretty much the same price as the R9 290X, so I guess it's between those two.

    On Newegg.ca both can be had for $420. I personally would go with the Nvidia; for years I was an ATI/AMD fan, but my last bout with video cards put me in Nvidia's corner. Although my experience is with lesser cards, more in the GTX 750 range.

    Also, don't forget that the GTX 970 is going to use less power than the R9 290X, if that's a concern for you.

    http://www.tomshardware.com/reviews/sapphire-vapor-x-r9-290x-8gb,3977-5.html
     
    Last edited: Mar 4, 2015
    tijo likes this.
  4. DataShell

    DataShell Notebook Deity

    Reputations:
    47
    Messages:
    777
    Likes Received:
    354
    Trophy Points:
    76
    I'd wait two months for the 3xx series from AMD; otherwise, get the Sapphire R9 290X Vapor-X (it has 8 GB of VRAM).
     
    tijo likes this.
  5. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I would've recommended the 970 as well if it weren't for its neutered specs, which kill it at high resolution:

    [image: benchmark graph]
     
    tijo likes this.
  6. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Yeah, if it weren't for the whole memory thing, this would have been a no-brainer. I can definitely afford to wait 2 months, though.

    Regarding power consumption, it's not an issue as long as my PSU can handle it.
     
  7. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Your PSU will handle a 290X just fine.

    If you can wait a couple months, GM200 and Fiji should both drop. I'd expect to see a 390 (Fiji Pro) card right around the $500 CAD mark with awesome perf/price just like the 290 currently has.

    BTW here are some benchmarks of the Homeworld Remastered Collection: http://www.techspot.com/review/970-homeworld-remastered-benchmarks/
     
    tijo likes this.
  8. killkenny1

    killkenny1 Too weird to live, too rare to die.

    Reputations:
    8,268
    Messages:
    5,258
    Likes Received:
    11,615
    Trophy Points:
    681
    I would also wait, because the R9 290X is too power-hungry (though not a big problem for you), and as for the GTX 970, well, need I even mention the whole VRAM controversy. That 570 of yours should last a few more months. See if anything interesting comes out; if not, you'll still be able to pick up older tech (by then), probably for cheaper too.
     
    tijo likes this.
  9. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    Thanks to everyone who replied. Looks like I'll wait for the 300 series from AMD and see. Chances of an AMD video card are high, though, given the 970 memory problem.
     
  10. killkenny1

    killkenny1 Too weird to live, too rare to die.

    Reputations:
    8,268
    Messages:
    5,258
    Likes Received:
    11,615
    Trophy Points:
    681
    The 970, despite its memory problems, is still a great card. However, I too would lean more towards AMD if I were in the market for a new graphics card. False advertising should be punishable!
     
    TomJGX likes this.
  11. kent1146

    kent1146 Notebook Prophet

    Reputations:
    2,354
    Messages:
    4,449
    Likes Received:
    476
    Trophy Points:
    151
    The 970 "3.5GB controversy" is way overblown.

    It is only a problem when you are running at resolutions that actually use more than 3.5GB of VRAM, which will only occur at 4K. And a single GTX 970 can't reliably drive 4K anyway.

    You're not going to find a real-world scenario where the GTX 970's 3.5GB partition becomes an issue. If you notice, the entire "controversy" relies on people intentionally trying to replicate the problem by running the card at unrealistic resolutions and graphics settings.

    In the real world, you will not run into a situation where you use more than 3.5GB of VRAM on a single GTX 970. And despite knowing what we now know about the 3.5GB partition, a GTX 970 is still a fantastic card for $330.
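    The segmented memory being argued over in this thread can be sketched mechanically. This is a hypothetical model, not NVIDIA's actual allocator; the only figures taken as given are the widely reported 3.5 GB full-speed + 0.5 GB slow split:

```python
# Hypothetical model of the GTX 970's segmented VRAM (not NVIDIA's actual
# allocator): 3.5 GB runs at full speed, the last 0.5 GB sits on a much
# slower segment, and anything past 4 GB would spill to system RAM.

FAST_GB = 3.5
SLOW_GB = 0.5

def vram_placement(demand_gb):
    """Split a VRAM demand (in GB) across the fast and slow segments."""
    fast = min(demand_gb, FAST_GB)
    slow = min(max(demand_gb - FAST_GB, 0.0), SLOW_GB)
    overflow = max(demand_gb - FAST_GB - SLOW_GB, 0.0)  # spills past 4 GB
    return {"fast": fast, "slow": slow, "overflow": overflow}

print(vram_placement(3.2))  # fits entirely in the fast segment
print(vram_placement(3.8))  # ~0.3 GB lands on the slow segment
```

    Whether a given game crosses the 3.5 GB line at a given resolution is exactly what the posters above disagree about; the model only shows what happens once it does.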
     
  12. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Uh, I posted a graph of it limiting the 970 at 2.5K in Evolve. High levels of AA can also induce it. And you can forget about 970 SLI.
     
    Jarhead and TBoneSan like this.
  13. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Watch Dogs, Shadow of Mordor, modded Skyrim, and Titanfall (in some instances) all chew up more than 3.5GB of VRAM at 1080p. There are probably more games than that too.
    What Nvidia did was criminal in every sense of the word.
    A lot of publications play down the seriousness of the problem based on today's games... What about tomorrow's games? Not to mention SLI setups being maxed out.

    970 is still a good performer though.
     
  14. LaptopNut

    LaptopNut Notebook Virtuoso

    Reputations:
    1,610
    Messages:
    3,745
    Likes Received:
    92
    Trophy Points:
    116
    Either way, it's not good for the customer to send the message that it's OK to be lied to as long as real-world scenarios don't show that it matters. They advertised something that was not true and lied to all of their loyal customers, and many seem to think it is OK because it doesn't affect performance. This is not something that should ever be taken lightly. I'm pretty sure that GTA IV with modded textures (or even without) can hit the 3.5GB limit.
     
    Jarhead and TBoneSan like this.
  15. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    The partitioned memory does make a difference in some games which would otherwise be playable, which is why so many people are upset. Nvidia not only lied about the specs, they also lied about the performance. And when defending the 970, Nvidia and many review outlets deliberately twisted facts and downplayed the issue by showing FPS instead of frame pacing/frame time.
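    The FPS-versus-frame-time point can be illustrated with a toy example; the frame-time numbers below are invented for illustration, not taken from any benchmark:

```python
# Toy illustration of why average FPS can hide bad frame pacing: two runs
# with identical average FPS can feel completely different to play.

def avg_fps(frame_times_ms):
    """Average FPS over a run: frames divided by total elapsed seconds."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

smooth  = [20.0] * 10             # ten steady 20 ms frames
stutter = [12.0] * 9 + [92.0]     # same 200 ms total, but one 92 ms spike

# Both runs average exactly 50 FPS, yet one has a very visible hitch.
print(avg_fps(smooth), avg_fps(stutter))   # 50.0 50.0
print(max(smooth), max(stutter))           # worst frame: 20.0 ms vs 92.0 ms
```

    An FPS chart shows the first line; a frame-time chart shows the second, which is why the two kinds of review can tell different stories about the same card.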
     
    Jarhead, bnosam and TBoneSan like this.
  16. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    They need to just disable that upper 512MB. Although I still find it funny that mobile GPUs are getting 6GB and 8GB of VRAM while desktops are still at 3GB and 4GB. That tells you something, like 3GB and 4GB are more than sufficient. You have to get a $1000 desktop video card before you can go over 4GB. The only reason the 970M or 980M has 4GB is because they're 256-bit; otherwise I can guarantee that if they were 384-bit they would be 3GB. So it's a conundrum.
     
  17. darkydark

    darkydark Notebook Evangelist

    Reputations:
    143
    Messages:
    671
    Likes Received:
    93
    Trophy Points:
    41
    Why not go with a regular R9 290? It still has awesome performance, and not to mention you can get it cheaper. Just pick a model with decent cooling.

    Sent from my C1905 using Tapatalk
     
  18. floydstime

    floydstime Notebook Consultant

    Reputations:
    6
    Messages:
    182
    Likes Received:
    30
    Trophy Points:
    41
    You need to think this through a little more. Right now an R9 290X is $300. It's a fabulous card. You have no idea how much the 300 series is going to cost. The R9 290X was the flagship card, and it cost somewhere around $600-700 when it came out. An R9 390X, or whatever the flagship will be called, will probably cost in the $600-700 range again, and it will only be marginally better than the 290X. The 290X can run any game on Ultra and will do so for a few more years. Unless you are going to wait to buy one once the 300 series comes out. Also, you can get the 290, which is spec'd just below the X. Buy two of these http://www.newegg.com/Product/Product.aspx?Item=N82E16814202146 and you'll be set for 5 years.
     
  19. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    I wouldn't exactly call +60% "marginal"

    [image: benchmark chart]
     
  20. floydstime

    floydstime Notebook Consultant

    Reputations:
    6
    Messages:
    182
    Likes Received:
    30
    Trophy Points:
    41
    Sorry, but it's all speculation till it hits the streets, and even so, if the card is as good as its "leaked" specs state, with a water cooler and all, you're going to have to drop a grand just to get it. Run 2x 290s or 290Xs and I bet it hits the same framerates, or VERY close. So I stand by my statement that it is marginally better for the price. If you're into measurebation then that's great, but what the OP is looking for is a decent card to upgrade his older Nvidia 500 series. You don't need a supercomputer to run BF4... You don't even need a current Tier 1 card. People have all these charts and graphs for what they "think" it will be like... Get real and come back with this once it's released, then we'll talk. Also, the current flagship model is the R9 295X2, which costs $700 and beats the 390X specs right now... If I were him, I'd wait for the 390X to hit the streets; then the price of the 295X2 will plummet and he will get a better card for a better price.
     
  21. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    1. OP's PSU can't handle a 295X2 or 2-way 290/290X CrossFire
    2. SLI is bad enough as it is (Y500 650M SLI user here), but CrossFire is a crapshoot. How often does AMD release a new driver, like once every few months? Oh right, I would know, because I also have a 7950 in my desktop. Good luck waiting that long for CrossFire profiles if you keep up with AAA releases. Oh, and what about the non-AFR-friendly games that never get a profile? Yeah, be my guest. Golf claps for AMD's wonderful developer relations team.
    3. Going off of the above, a single stronger GPU is always better than multiple weaker GPUs. Guaranteed performance/scaling, no microstutter, lower heat and power consumption, etc.
    4. AMD would be mental to price 390X same as Titan X at a grand. Nobody would buy 390X then.
    5. I never recommended 390X. I specifically singled out 390 as most likely being the fastest single GPU within OP's budget.
    6. This is getting OT
     
  22. floydstime

    floydstime Notebook Consultant

    Reputations:
    6
    Messages:
    182
    Likes Received:
    30
    Trophy Points:
    41
    1. Recommended wattage for his setup with a set of 290s is 696 watts, and I made sure to add a bunch of accessories, so it's probably lower. 1(a). PSUs aren't that expensive these days.
    2. I have a CrossFire 290X system and have had zero problems with any of your above-mentioned issues. My current Fire Strike score is 11,568. Your Y500 has two midrange cards; you should have expected microstutter at some point. You don't need to update your driver if there is nothing wrong with your setup. I update my drivers probably once a year and they run games fine. You are correct about heat and power consumption, but with performance comes heat and power consumption; it's the nature of gaming. Counter it with fans/airflow or liquid cooling.
    3. Not true; very few games are SLI/CrossFire-incompatible these days, and microstutter problems occur with lower/midrange cards or really old high-end cards that need to be updated anyway. The concept behind SLI/CrossFire is to increase performance, and it does just that. The solution to microstutter or CrossFire/SLI incompatibility: disable SLI/CrossFire and lower your game settings to medium. You will still have fun playing your game. Promise.
    4. If it comes with a water cooler, you can be darn sure it's gonna have a hefty price tag on it. Preliminary "rumors" were pricing it at $700 before it was "leaked" that there would be a water cooler. 2x 290s for less than $500 is still cheaper with better spec'd performance.
    5. No, you took it upon yourself to post theoretical charts and graphs to oppose my statement of it being marginally better.
    6. No, it's on topic. He is asking for advice on what card to get, and you can be sure that there will be people who disagree on a public forum, as we obviously do.
     
  23. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    1. Recommended based on what? Here's my simple math: the 2600K is 100W, and a couple of 290X/290s (300W each) brings it to 700W total already. At stock. With nothing else.
    2. Microstutter has nothing to do with mid-range or high-end cards. Did you already forget AMD's frame pacing woes? And they still haven't fixed it in DX9 games and never will.
    3. Let me list games in the past year which don't work or have graphical glitches: Titanfall, CoD: AW, Middle-earth: SoM, ArcheAge, Risen 3, Wolfenstein: TNO, The Evil Within, Dead Rising 3, FIFA 15, NBA 2K15, Dragon Age: Inquisition, Cities: Skylines, almost every Ubisoft title
    4. OK, let's follow your own advice and stop the speculating mkay? ;)
    5. My bad, didn't mean to start a semantics argument. I just assumed "marginal" meant under 10%.
    6. OP already stated he/she is not interested in SLI/CrossFire
    P.S. Has AMD gotten CrossFire to work in windowed mode yet? ;)
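    The power-budget arithmetic in point 1 can be laid out explicitly. The per-component wattages below are the post's round figures, not measured values:

```python
# Rough power-budget check for a 2-way 290X setup on a 700 W PSU, using
# the round per-component figures from the post (stock clocks; drives,
# fans, and RAM deliberately not counted, as in the original argument).

PSU_WATTS = 700  # OP's Cooler Master Silent Pro

draws_watts = {
    "i7-2600K": 100,          # assumed round figure from the post
    "R9 290X (card 1)": 300,
    "R9 290X (card 2)": 300,
}

total = sum(draws_watts.values())
headroom = PSU_WATTS - total
print(f"total draw: {total} W, headroom: {headroom} W")  # 700 W, 0 W left
```

    Whether real-world draw ever reaches these stock maximums is exactly the point the next post disputes; the sketch only shows the worst-case sum.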
     
    Last edited: Apr 13, 2015
  24. floydstime

    floydstime Notebook Consultant

    Reputations:
    6
    Messages:
    182
    Likes Received:
    30
    Trophy Points:
    41
    TDP (Thermal Design Power) is usually only 85-90% utilized, since it is very rare that a CPU will use 100% of its TDP unless you are stress testing, which you don't need to do if you aren't overclocking. I'm also accounting for a maximum of 90% system usage, which will pretty much never happen, because 100% (peak load) is only attainable when all components are at 100% load, including start-up surge current compensation. Really, processors run at 50-90% power efficiency, and your system is never going to achieve 100% system load. The only way to achieve that is to ensure that EVERY device in the computer is operating at 100%, so you would need Prime95, FurMark, Memtest, playing music, using your printer, charging your iPhone, and the list goes on... That's what you'd need to hit that full 700 watts you are so concerned about.

    Ah... yes, microstutter does have to do with midrange cards. Microstutter happens more often at low framerates, and when it does happen at high framerates it is sometimes not even recognizable to the human eye. Lower-tech gear gets lower frame rates, and in turn gets more noticeable microstutter. I didn't forget the frame pacing woes from years ago. It's been fixed, yo, where you been? If you are still playing DX9 games, why on earth would you need SLI? A low-end R7 or R9 series card can run DX9 games on great settings... Again, to fix microstutter or CrossFire/SLI incompatibility: disable CrossFire/SLI and reduce your game settings to medium. You'll still have fun playing your game. Promise.

    Every game has issues and graphical glitches to some degree. One person might have a 980 SLI setup and have issues while another person has a single R9 290X and doesn't, and vice versa. Windows is not proprietary and is thus subject to glitches and compatibility issues. Wow, really? Stop speculating? I'm talking about your theoretical prices here... Not mine. And no, you meant to start an argument. Otherwise you would have asked, "What do you mean by marginal?" And I would have answered, "Price vs. performance; more pipelines or a higher clock speed does not translate into a proportional increase in performance, and there are a lot more variables than just these. Take, for instance, the 780 vs. the 980. You maybe get a 20-30 FPS boost if you're lucky, 10 if you're not... that's about 5-25% better. Real-world performance after upgrades is generally not as amazing as people expect." And you might have been like, "Oh, I see what you mean now..." So what if the OP was not interested in SLI? I wasn't interested in a Mercedes-Benz when I was looking for a car, I was interested in a Toyota... but guess which one I own?

    P.S. I don't care... I play in full screen...
     
  25. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,077
    Trophy Points:
    581
    I deleted some posts since things got out of hand a little bit, even though people seem to have cooled down. I left some since they are somewhat relevant to the discussion.

    Speaking of which, I almost pulled the trigger on an R9 290 this weekend. I'll also mention again that I'm in Canada and that we are somewhat getting the shaft at the moment. I mean, the R9 290 is listed at almost $450 CAD.

    Anyway, I'm re-evaluating what exactly I need given that I haven't gotten any new games recently, so I might go down to something like an R9 280 or 280X if the price is right and I can find one with a decently silent cooler (Asus DCUII, MSI Twin Frozr, etc.)
     
  26. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Except for the additional VRAM, the 280 and 280X (AKA 7950 Boost and 7970 GHz) aren't a huge step up from the 570. If I were upgrading, I'd want a 290 at minimum.
     
  27. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Keep an eye on the 290/290X listings on NCIX. A while back there was a PowerColor 290X going for $235 after rebate. :eek:

    But if you're in a hurry:

    Visiontek 290X -- $399.99
    Gigabyte 290 OC -- $349.99 after rebate
    Asus 290 DCUII -- $389.99 after rebate

    (and yes, all the above in CAD instead of USD :D, also promotion price on Asus and Gigabyte ends on Apr 15 so better act fast)

    I don't know anything about the VisionTek brand; maybe octiceps can comment on that. According to KitGuru, who tested both the Asus and Gigabyte 290 cards, the Gigabyte is the quieter of the two, if that makes any difference to you.
     
    Last edited: Apr 13, 2015
  28. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Gigabyte and ASUS both have 3-year warranties; PowerColor and VisionTek only have 2.
     
  29. floydstime

    floydstime Notebook Consultant

    Reputations:
    6
    Messages:
    182
    Likes Received:
    30
    Trophy Points:
    41
    I found an OC 290 for $429 CAD: http://www.memorydepot.com/detail/GV-R929OC-4GD.html
    I am sure that if you wait a little bit a killer deal will come around...
     
  30. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    That same card is going for $349.99 after rebate at NCIX per my post above.
     
  31. killkenny1

    killkenny1 Too weird to live, too rare to die.

    Reputations:
    8,268
    Messages:
    5,258
    Likes Received:
    11,615
    Trophy Points:
    681
    Gonna hijack this thread a bit. I'm currently in the process of deciding what to get. If I go the desktop way, the 970, due to its high cost here (430EUR), is a no-go. As much as I would want to support AMD, the 280X isn't exactly the best choice either (heat, power consumption, old architecture). The only option I'm left with is the 960, which is also a disappointment. Dunno what to do, really.
     
  32. floydstime

    floydstime Notebook Consultant

    Reputations:
    6
    Messages:
    182
    Likes Received:
    30
    Trophy Points:
    41
    The 290 is a decent card. If I remember correctly, it's priced and spec'd just below the 970 and better than the 960. Otherwise you're stuck with the 960, or save up for the 970.

    Oh, and Kilkenny is the best beer in the world, IMO. Don't know if you're referring to that or South Park...
     
  33. killkenny1

    killkenny1 Too weird to live, too rare to die.

    Reputations:
    8,268
    Messages:
    5,258
    Likes Received:
    11,615
    Trophy Points:
    681
    Isn't the 290 power-hungry and hot as well? I'm thinking about a mini-ATX build, and I don't want it to get hot in there.
    Otherwise the 290 looks nice from a performance standpoint.

    EDIT: wow, up to 350W while gaming. How about no...
     
  34. octiceps

    octiceps Nimrod

    Reputations:
    3,147
    Messages:
    9,944
    Likes Received:
    4,194
    Trophy Points:
    431
    Don't you mean mini-ITX? In that case (pun intended), a reference 970 is the way to go.
     
  35. killkenny1

    killkenny1 Too weird to live, too rare to die.

    Reputations:
    8,268
    Messages:
    5,258
    Likes Received:
    11,615
    Trophy Points:
    681
    Yeah, I meant mini ITX. Forgot how to Engrish there for a moment.
     
  36. floydstime

    floydstime Notebook Consultant

    Reputations:
    6
    Messages:
    182
    Likes Received:
    30
    Trophy Points:
    41
    95C is the operating temperature of the 290X, and I think it is the same for the 290. Something to consider involving temps: the recommended operating temperatures of cards are going up every year. Video cards run better at higher temps. The 290X at 95C performs better than if it is water-cooled down to 85C; I read an article on Tom's Hardware about it a while back.

    For example, I have a 980M in my notebook, and the temps were higher than everyone else's, so I repasted it. It was running at 85C and I was getting benchmarks at around 8400. When I repasted with GC Extreme, my temps dropped to a 75C average, but my benchmarks went back down to 8100 and dipped below 8000 on one test I ran. So heat on video cards is really not a big deal these days, and aside from AMD, high temps on processors are OK as well. I believe the new Intel processors have a thermal limit of around 100C, whereas for the FX series it is somewhere around 70C. It's like the tables flipped: you used to get better performance with really good cooling, but it seems to be the opposite now.
     
  37. floydstime

    floydstime Notebook Consultant

    Reputations:
    6
    Messages:
    182
    Likes Received:
    30
    Trophy Points:
    41
    So I just bought a pair of HD 7970s. I got them off eBay, but I had to buy the set because it was such a good deal... I only need one... If you want, I will sell you the other one for $100 USD...?

    EDIT: It seems to be a decent bump up from the 570, not as good as the 290, but it also won't hit your pocketbook too hard...
     
    Last edited: Apr 19, 2015
  38. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    Hey, a fellow GT upgrader! Wow, that's a generous offer! Hope OP considers you.

    That's weird; I'm pretty sure that shouldn't be happening, as the higher the temperature, the worse the card's conductivity. My own experience was also the opposite: I repasted and then got way better benchmarks and performance at a lower temp. Also on a 980M.
     
    Last edited by a moderator: May 18, 2015
  39. floydstime

    floydstime Notebook Consultant

    Reputations:
    6
    Messages:
    182
    Likes Received:
    30
    Trophy Points:
    41
    Never responded, so I sold it.
    Don't know what to tell you. Higher temps = higher benchmarks for me. The AMD 290X for sure performs best at 95C. Put a water cooler on it and you'll see a dip in performance. However, better cooling allows for overclocking, which can then get you better performance, but that results in higher temps. Do a little research into TDP and it makes a little more sense: higher power draw creates more heat, and higher performance generates more heat. The 980M's performance is affected by the amount of power the card is drawing. MSI's Shift is a good example of that: initially, they came out with a temperature that each Shift mode would allow the card to operate at. The higher the Shift mode, the greater the power draw and the higher the temperature limit. Something to note: it is next to impossible to find the thermal limit of the 980M chip... Do some of your own tests. Turn the computer off and leave it in a cold room, then turn it on and run Fire Strike after everything boots. Then run the Valley benchmark or FurMark until you reach a max temp on normal fans, not boosted, then run the benchmark again and compare the results. Mine goes up every time when it is hot.
     
  40. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    Fire Strike 8170 @ 97C... 8480 @ 84C with turbo fan on... I get what you're saying and how it generates heat, but cooling that heat shouldn't ruin its performance. It should be negligible, I would think, under ideal conditions.
     
  41. floydstime

    floydstime Notebook Consultant

    Reputations:
    6
    Messages:
    182
    Likes Received:
    30
    Trophy Points:
    41
    Very interesting... When I get some time, I'll post mine.
     
  42. Galm

    Galm "Stand By, We're Analyzing The Situation!"

    Reputations:
    1,228
    Messages:
    5,696
    Likes Received:
    2,949
    Trophy Points:
    331
    Lol, like, I get what both of us are saying; maybe it's just that our conditions are different.
     
  43. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    Temperature is completely independent of performance unless the card is downclocking because it has hit its thermal target limit (87C on my 980Ms). It makes absolutely no sense that performance would increase as temperatures go up. The same load with the same power draw generates the same amount of heat; the efficiency of the cooling system doesn't affect that. If that were the case, we wouldn't see liquid nitrogen overclocking and AIO water coolers on graphics cards. Heat is a complication of high-powered chips, and it is their single largest enemy. The reason you are seeing benchmark variation is that the benchmark itself has linearity issues. There is no way, in an operating system that runs background tasks taking up system resources, to get consistent benchmark results. This is especially true in Windows 8/8.1
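    The throttling behavior described here can be written as a toy model. The 87C thermal target comes from the post; the boost clock and step size are illustrative assumptions, not real 980M firmware values:

```python
# Toy model of thermal throttling: clocks are flat until the thermal
# target, then step down. Below the target, temperature has no effect
# on performance; that is the whole point being made above.

THERMAL_TARGET_C = 87   # target cited in the post for a 980M
BOOST_MHZ = 1127        # assumed boost clock
STEP_MHZ = 13           # assumed size of each throttle bin

def effective_clock_mhz(temp_c):
    """Clock as a function of core temperature: constant below the target."""
    if temp_c < THERMAL_TARGET_C:
        return BOOST_MHZ  # cooler or warmer makes no difference here
    bins_over = temp_c - THERMAL_TARGET_C + 1
    return BOOST_MHZ - bins_over * STEP_MHZ

print(effective_clock_mhz(60), effective_clock_mhz(85))  # identical: 1127 1127
print(effective_clock_mhz(90))                           # throttled: 1075
```

    In this model, run-to-run benchmark differences below the target have to come from somewhere other than temperature, such as background load, which matches the closing argument.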