The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums was preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.

    Pascal: What do we know? Discussion, Latest News & Updates: 1000M Series GPU's

    Discussion in 'Gaming (Software and Graphics Cards)' started by J.Dre, Oct 11, 2014.

  1. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    There's the 200W 980 in the P870DM [This one is longer and wider than standard MXM3.0b, Eurocom calls it MXM3.0c, whether that's just a placeholder or it will be an actual thing, we don't know] (No SLI)

    There's the 180W 980 in the P775DM1 which should have the same clocks as the 200W 980 but it won't overclock as high due to fewer power phases, etc. [This one is almost standard MXM3.0b but is slightly longer] (Supports SLI since it has the connectors)

    There's the 150W 980 in the GT72S, which has lower clocks and 5GHz memory I believe, and even fewer power phases than the 180W 980, but should be within 5-10% performance of the 200/180W models stock vs stock. [This one has a weird shape and is longer left to right on one corner] (SLI Unknown)

    There's a "120W" version apparently that we have not seen yet. Whether it is real or not we will see. This one is rumored to be the one in the GT80 and would most likely be the standard MXM3.0b format. Should have same clocks as the 150W version but no less than 1064Mhz core/5Ghz memory. So still should be much faster than 980M. (Supports SLI)
     
  2. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    The laptops do look more uhh, modern without that 90's black box look. Although I have to say they most certainly ripped off... er, took design cues from Alienware.

    Well Maxwell is efficient due to aggressive dynamic throttling as well as being completely FP64 gimped. If you follow that Tom's link I posted, this is what they had to say:

    There you have it, GM204 is really a 250W card in disguise. This also explains why compared to GK110, GM200 runs ass hot, throttles at stock, and basically pushes the reference blower beyond its limits even at stock. Hell with a modded vbios I can get my 980 Ti to suck down 400W just playing BF4, that's some insanity.

    Although this throttling algorithm works most of the time and without issues, it completely breaks SLI if you have cards with different ASICs, or any game that doesn't push the GPU hard enough. I've had games crash at stock because the GPU was trying to boost to a certain clock state without enough juice due to voltage crossover. Just ask Ethrem, he knows what I'm talking about.

    So if nVidia decided to completely eliminate this throttling nonsense for Pascal, then I can see how Pascal has the potential to become Thermi 2.0
     
    Last edited: Dec 8, 2015
    triturbo, TBoneSan and jaybee83 like this.
  3. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    theres no universal answer to that. u add more cores, u need more memory bandwidth to keep up with that and vice versa. also depends on the arch and the software situation were talking about. what ive seen with the 980M is that increasing core clocks brings more of a performance boost than increasing memory clocks, albeit both DO give boosts. with core clocks being the same and increasing the number of cores by 33% id bet you that the resulting boost is closer to 33% than 10% by far!
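
    A minimal back-of-the-envelope sketch of that reasoning in Python (the 70/30 compute/bandwidth weighting here is purely an assumption for illustration, not a measured figure):

```python
# Rough estimate only: assume performance is a weighted blend of shader
# throughput (cores * clock) and memory bandwidth. The 70/30 split is an
# assumed weighting for illustration, not benchmark data.
def estimated_speedup(core_ratio, clock_ratio, bandwidth_ratio, bandwidth_weight=0.3):
    compute = core_ratio * clock_ratio
    return (1 - bandwidth_weight) * compute + bandwidth_weight * bandwidth_ratio

# 33% more cores, same core clock, same memory clock:
print(estimated_speedup(1.33, 1.0, 1.0))  # ~1.23, i.e. much closer to +33% than to +10%
```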
     
    Last edited: Dec 8, 2015
  4. Omnomberry

    Omnomberry Notebook Evangelist

    Reputations:
    45
    Messages:
    556
    Likes Received:
    497
    Trophy Points:
    76
    Nice autocorrect
     
  5. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    LOL, took me a bit to discover what u mean, HAHAHA! gonna change that right away :p :D and no autocorrect, rather a "Freudscher Versprecher" (Freudian slip), probably thinking about my boss at that moment ;)
     
    Omnomberry likes this.
  6. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
  7. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
  8. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    28nm cards don't really benefit too much from HBM because the GPUs aren't bandwidth starved. Hell, overclocking the memory from 1750 to 2000 (7 to 8 GHz effective) gives me a whopping 5-7% performance increase, and this is with a GM200 (980 Ti) overclocked past 1500.
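
    For anyone who wants the arithmetic behind that, here's a small sketch (assuming the 980 Ti's stock 384-bit bus) showing how little extra bandwidth that overclock buys relative to the frame-rate gain:

```python
# Quick bandwidth arithmetic for the 980 Ti example above (GM200, 384-bit bus).
BUS_WIDTH_BITS = 384

def bandwidth_gb_s(effective_gbps_per_pin):
    # effective data rate per pin (Gb/s) * bus width (bits) / 8 bits per byte
    return effective_gbps_per_pin * BUS_WIDTH_BITS / 8

stock = bandwidth_gb_s(7.0)  # 1750 MHz GDDR5 -> 7 Gb/s effective -> 336 GB/s
oc = bandwidth_gb_s(8.0)     # 2000 MHz       -> 8 Gb/s effective -> 384 GB/s
print(f"{stock:.0f} -> {oc:.0f} GB/s (+{(oc / stock - 1) * 100:.0f}% bandwidth)")
# ~14% more bandwidth buying only 5-7% more performance is why the card
# doesn't look bandwidth-starved.
```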

    One benefit of HBM though is its much lower power consumption compared to GDDR5. Not just the memory chips, but the memory controller can be simplified as well, lowering power consumption/TDP further. I suspect the prime motivation for using HBM on Fury X isn't so much for the bandwidth, but rather to keep TDP under control, as I imagine a hypothetical GDDR5 based Fury X would easily go over 300W even with an AIO. That and probably to ease the transition into HBM2, since the board design would be so radically different from GDDR5, and a node shrink, arch update, and memory change all at once might be too daunting. Just FYI the last time nVidia tried to do all 3 things in one go we got Fermi.
     
    jaybee83, hmscott and triturbo like this.
  9. Ethrem

    Ethrem Notebook Prophet

    Reputations:
    1,404
    Messages:
    6,706
    Likes Received:
    4,735
    Trophy Points:
    431
    Umm... Before Dell, Alienware was a reseller. It is because of Clevo that Alienware ever even became a thing in the first place.
     
    TomJGX, tgipier, jaybee83 and 2 others like this.
  10. Robbo99999

    Robbo99999 Notebook Prophet

    Reputations:
    4,346
    Messages:
    6,824
    Likes Received:
    6,112
    Trophy Points:
    681
    Haha, hopefully your boss is not a notebook enthusiast! :)
     
    jaybee83 likes this.
  11. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Weren't the Clevo Alienwares criticized for being nothing more than "reskinned, fancy Clevos" that had aesthetic changes only with no functional improvements? Or am I thinking of something else.

    Either way though, you have to admit the P870DM does bear some resemblance to the 2013 AW machines, particularly the lid design. That said, the same could be said of the GT80's lid design as well. But speaking of which, I just realized Clevo may have taken some cues from GT72 as well.

    Exhibit A:
    [image]

    Exhibit B:
    [image]

    I'm not trying to crap on Clevo here, but there's just something about those sleeper laptops that I find very appealing. It was also one of the defining characteristics of a Clevo laptop. I actually do like the new, modern Clevo laptops better, but again, I'd much prefer if Clevo came up with their own design instead of imitating what others have done. Plus there's nothing wrong with a black, rectangular laptop, it's a style of its own. I mean anyone who's thinking about buying a P870DM values performance first and foremost, and probably doesn't care all too much about flash. If the P870DM had kept the P370SM's design, I doubt many would walk away in disgust because the laptop just looked too plain.
     
    hmscott likes this.
  12. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    I see no resemblance there except that they both have lots of vent holes which are required to cool the components. /shrug/. It looks pretty different to me. And I can't blame them for adding a little more character to the chassis design to reel in AW customers that are pretty irritated with AW considering they screwed the pooch and are nothing more than soldered generic laptops now.
     
    TomJGX and i_pk_pjers_i like this.
  13. PrimeTimeAction

    PrimeTimeAction Notebook Evangelist

    Reputations:
    250
    Messages:
    542
    Likes Received:
    1,138
    Trophy Points:
    156
    What really pisses me off is that this Nvidia GTX 980 for notebooks not only screwed up the mobile GPU naming but also the MXM naming. There is a "standard MXM3.0b", a "wider than MXM3.0b", an "almost MXM3.0b" and a "God knows what MXM". Good luck trying to find a proper card for replacement/upgrade 5 years down the line.
     
    i_pk_pjers_i and jaybee83 like this.
  14. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,311
    Trophy Points:
    431
    I think the special mobile GTX 980 is MXM 3.0c

    Meanwhile, AMD has completely withdrawn from the mobile graphics space. No MXM release since the R9 M290X.....in 2013. Here's to hoping Pascal will be something great and not half assed since there is no competition.
     
  15. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    there is no mxm 3.0c standard though....especially since each and every single 980 mobile variant comes in a different form factor with different dimensions :D
     
  16. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    I think this will be a one time thing with the desktop card being all of these different sizes. I think for pascal even if a desktop 1080 came out, it should fit a standard MXM board IF they use HBM.

    AMD should be back in the game when they launch Arctic Islands.

    2016 will be exciting for GPUs AND CPUs. ;)
     
    jaybee83 likes this.
  17. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    No, thats what eurocom calls the 200w version of GTX 980. MXM3.0c. As for the 150w, 180w, or possibly the 120w, no idea.
     
  18. Zymphad

    Zymphad Zymphad

    Reputations:
    2,321
    Messages:
    4,165
    Likes Received:
    355
    Trophy Points:
    151
    I don't see resemblance of P870DM to any Alienware model, current or past. The lid hinge is very different. The lid cover has different shape and angles. Granted the lighting has some resemblance. The rear panel, just see a lot of vents. Did you want Clevo to have squiggle shaped vents?
     
  19. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Compare the lid hinges of the P570WM, P370SM, P770ZM, and P870DM, then take a look at the lid hinge of the 2013 AW machines. Tell me the P870DM doesn't at least bear more resemblance than the other 3 Clevos. As for lid design, this was my basis for comparison:

    Alienware 18
    [image]

    P870DM
    [image]

    Yeah it's definitely the lighting as you said.

    As far as vents go, why not make them straight lines? Why do they have to be angled, and in the same direction as another laptop no less?
     
  20. Splintah

    Splintah Notebook Deity

    Reputations:
    278
    Messages:
    1,948
    Likes Received:
    595
    Trophy Points:
    131
    Yea it definitely does take a lot of design cues from its competitors, doesn't take much to see it.
     
  21. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,708
    Messages:
    5,820
    Likes Received:
    4,311
    Trophy Points:
    431
    Who cares about cosmetics....give me good internals and structural integrity!
     
    jaybee83 likes this.
  22. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Well, that's just it, isn't it? Up till a few years ago Clevo was never really known for its aesthetics, but as the only viable alternative to Alienware machines if you wanted all out performance. Heck, Clevo is still the only OEM to make laptops that use desktop chips and that counts for a lot.

    Clevo built its reputation as an OEM that "can and will shove anything into anything and make it work". Sure some of their early models weren't anything to look at, but as you said, who cares as long as the performance was up to par. Besides I'd argue anybody looking for a Clevo likely didn't care too much about aesthetics (or more accurately, flash) in the first place, else they would've got an Alienware.
     
    jaybee83 likes this.
  23. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    shove anything into anything and make it WORK! thats what im talking about baby! *rofl* loving it :D
     
  24. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    You really gotta hand it to them with the P570WM -- a cooling system that can handle an overclocked 4930K is no joke. And that's why I also like Clevo, because they seem to do the opposite of what everyone else does: all out performance as top priority, not shying away from making big laptops bulky and heavy, and not giving a flying Peking duck about BGA crap.
     
    TomJGX, jaybee83 and TBoneSan like this.
  25. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    if only clevo could offer that 17 inch 4k ips screen with 980ms in sli....
     
  26. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Well you can get it with 980m SLI. 4K is still out of its reach though. It needs to be 2560x1440. Forget about 4k.
     
  27. DataShell

    DataShell Notebook Deity

    Reputations:
    47
    Messages:
    777
    Likes Received:
    354
    Trophy Points:
    76
    *Slams fist on table*
    THEN WE SHALL SLI DESKTOP GTX 980S. NO OBSTACLE SHALL DENY US OUR 4K.
     
    TomJGX, jaybee83 and hmscott like this.
  28. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Glorious 4KMR amirite

    But seriously for the moment, 144Hz 1440p IPS is the sweet spot for me. Build quality issues and broken OSD aside, this Acer XB270HU really has given me a very good experience.
     
    TomJGX, TBoneSan and DataShell like this.
  29. Ionising_Radiation

    Ionising_Radiation ?v = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,667
    Trophy Points:
    231
    I was just reading through this thread. Can someone tell me why components are still so inefficient? I mean, 200 W for a piece of silicon to throw around electrons and display 120 million coloured dots every second (FHD 60 FPS)? If someone games for 5 hours every day, and only the GPU is taxed, that's 1 kWh every day. I don't know how much electricity costs elsewhere, but where I live it's ruddy expensive. One could end up paying $75-100 extra every year, besides the initial costs of the extra-powerful laptop. My ceiling fan uses less power than that (it's just 75 W). We can completely forget monstrous 800-1000W desktops.
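
    The arithmetic above, worked out explicitly (the $/kWh rates are illustrative assumptions chosen to reproduce the quoted ballpark; actual tariffs vary a lot by country):

```python
# Worked version of the estimate above. The electricity rates are assumptions
# picked only to land in the quoted $75-100/year range.
gpu_watts = 200
hours_per_day = 5

kwh_per_day = gpu_watts * hours_per_day / 1000  # 1.0 kWh/day
kwh_per_year = kwh_per_day * 365                # 365 kWh/year

for rate in (0.20, 0.27):                       # assumed $/kWh
    print(f"at ${rate:.2f}/kWh: ${kwh_per_year * rate:.0f}/year")
# -> about $73 and $99 per year, matching the $75-100 figure
```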

    Not to mention the thick, heavy cooling apparatus needed to keep everything at a reasonably low operating temperature. I wonder what data centres and workstations make of the costs. Probably 50% goes to paying for the air-conditioning.

    Also, speaking of 4K - I hooked up my laptop to my 4K TV and started up Witcher 3, cranked up the resolution and attempted to stumble through the game, at 10 FPS. It was fun. Oh, the lag.
     
    Last edited: Dec 10, 2015
    TomJGX and hmscott like this.
  30. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Is $75-100/year really that much? Adding an extra $6-8/mo to your electric bill? I mean people don't flinch at paying $50/mo or more for a smartphone bill. And gaming 5 hrs/day, that's like a part time job. Sure, even in my heyday with no kids and only an apartment I'd have sessions where I'd play 5-6 hours a day over a weekend, but never *every day*. Anyone with a full time or even part time job couldn't manage that. At my peak I was probably only playing 25 hrs/week. 150 hrs/mo gaming seems a bit excessive. Just compare the performance per watt with where we were five years ago. It's staggering how much better power consumption has become.

    Plus the TDP is the maximum cooling capacity required; the GPU won't necessarily draw that much power. Especially if you use G-Sync and FPS is limited to 75, which the GPU can handle easily at 1080p.
     
    TomJGX, hmscott and TBoneSan like this.
  31. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    Honestly, I haven't even noticed any difference in price no matter what system I've used. That's based on exclusively using devices ranging from 100W to 1000W in literally the most expensive place in the world for electricity, Adelaide (Oz), and now #5 most expensive, Japan.
    Gaming and general computing won't always draw 100%. On the other hand, heaters and aircons are usually going full bore all the time, and yes, come winter / summer I do notice that.
    It's worth it to me if it costs an extra $100 a year to give an awesome gaming experience.
     
    hmscott and HTWingNut like this.
  32. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,877
    Trophy Points:
    931
    Spend $3k on gaming laptop and $50 per big release game... worry about $100/yr electricity? Nope. It's a hobby. You suck up / budget for the costs.
     
  33. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,460
    Messages:
    5,558
    Likes Received:
    5,798
    Trophy Points:
    681
    That's it. Spending big $'s on a system then begrudging oneself to get value from using it doesn't make sense.
     
  34. Ionising_Radiation

    Ionising_Radiation ?v = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,667
    Trophy Points:
    231
    Well, to each his own. I haven't got independent means (like probably many others here), which means I don't exactly have the right to use as much electricity as I need/want. Also, my smartphone bill is $23 a month. The 5-hours-of-gaming-a-day thing was simply a mathematical estimate - an extreme case. Nevertheless, AAA gaming on a high-end desktop is likely to suck up lots of power at any rate.
     
    i_pk_pjers_i likes this.
  35. Omnomberry

    Omnomberry Notebook Evangelist

    Reputations:
    45
    Messages:
    556
    Likes Received:
    497
    Trophy Points:
    76
    Yea, but on the other hand my new gaming laptop displays far more advanced graphics at max 230W than my 5 year old PC with an HD6700 (<- not sure anymore which model) @ 300W.
    To me this was mindblowing when I switched.
    I can have more performance for less power with less noise in a far smaller package( which also powers the screen).

    It can feel slow, yes, but it still moves forward. (fast enough for me to still be amazed)
     
    Ionising_Radiation likes this.
  36. i_pk_pjers_i

    i_pk_pjers_i Even the ppl who never frown eventually break down

    Reputations:
    205
    Messages:
    1,033
    Likes Received:
    598
    Trophy Points:
    131
    wat

    You're concerned about $75 extra a year when you likely spent over $2000 on your laptop? Color me a bit confused.
     
  37. Ionising_Radiation

    Ionising_Radiation ?v = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,667
    Trophy Points:
    231
    I suppose I ought to clarify - I did not spend $2000 on my laptop - someone else bought it for me. Just because someone spends a lot of money in one go doesn't mean they can afford to slowly leak money...
     
  38. i_pk_pjers_i

    i_pk_pjers_i Even the ppl who never frown eventually break down

    Reputations:
    205
    Messages:
    1,033
    Likes Received:
    598
    Trophy Points:
    131
    Oh, alright. I assumed that if you were concerned about the cost of electricity, that likely meant you had bought your own laptop as well, since you'd be the one handling the finances.

    That makes sense and that's a bit of a different story, however, with a powerful laptop/computer it should not be too surprising that it will cost a lot for electricity.
     
  39. Ionising_Radiation

    Ionising_Radiation ?v = ve*ln(m0/m1)

    Reputations:
    757
    Messages:
    3,242
    Likes Received:
    2,667
    Trophy Points:
    231
    Fair enough. And that brings us back to where we started - powerful components, or computer components in general, appear inefficient.

    Certainly, things have improved a lot from 2010 and even more from 2000, but there's a long, long way to go.

    I'm hoping that Pascal makes add-on board sizes smaller, so we can have MXM cards at most 3x the size of a SO-DIMM, what with HBM and stacked memory and all.

    Actually, is stacked memory for Volta, i.e. SoC GPUs?
     
  40. Phase

    Phase Notebook Evangelist

    Reputations:
    7
    Messages:
    483
    Likes Received:
    100
    Trophy Points:
    56
    would big pascal or entry level pascal be released first? i know the 750 ti and 860m were the first maxwell cards released as a teaser. is nvidia going to do that crap again or will they just release the new titan or 1080 first
     
  41. i_pk_pjers_i

    i_pk_pjers_i Even the ppl who never frown eventually break down

    Reputations:
    205
    Messages:
    1,033
    Likes Received:
    598
    Trophy Points:
    131
    I would probably say high end Pascal would be released first.
     
  42. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    your guesses are as good as anyone's at this point in time :) it really depends on how high the yield of the new chips is. the lower the yield, the higher the likelihood that theyre gonna release the lower-tier gpus first.
     
  43. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Mxm 3.0a 1080M would be tasty!

    Sent from my SM-A500FU using Tapatalk
     
    jaybee83 likes this.
  44. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    If GP100 (big Pascal) gets released first expect to pay through the nose ($1500+) for them. Plus you don't want to show your entire hand from the get go, so I strongly suspect nVidia will continue their model of releasing the fake medium die "flagship" first, then releasing the true big die flagship 6-12 months down the road for double dipping.
     
    TBoneSan, Robbo99999 and jaybee83 like this.
  45. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    that would be insaaaaane! *lol*

    Sent from my Nexus 5 using Tapatalk
     
  46. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    No, it is unrealistic. However, it's not impossible for the 1070M to be MXM 3.0a. If they use HBM AND the card consumes less power and therefore produces less heat, it is feasible to use the MXM 3.0a form factor!

    Then I will be able to link my CPU and GPU cooling modules together to help cool the stupidly hot 920xm!

    We all are allowed to dream... :rolleyes:
     
  47. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    "No it is unrealistic"

    look whos talking! the guy who managed to clock his "stupidly hot 920xm" up to 4.52 Ghz!!! :D
     
    Last edited: Dec 11, 2015
    TomJGX and triturbo like this.
  48. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    Lol that was all in the name of science with +200mv overvolt that was disgustingly unstable. I did that air-cooled though :D

    Under load I can "only" run 3.7ghz all cores stock air-cooled.
     
    hmscott, TomJGX and jaybee83 like this.
  49. jaybee83

    jaybee83 Biotech-Doc

    Reputations:
    4,125
    Messages:
    11,571
    Likes Received:
    9,149
    Trophy Points:
    931
    cheap excuses!

    (over here its 5.0ghz "for science" and 4.7 under load, 4.3 for everyday :D)

    Sent from my Nexus 5 using Tapatalk
     
    hmscott and TomJGX like this.
  50. King of Interns

    King of Interns Simply a laptop enthusiast

    Reputations:
    1,329
    Messages:
    5,418
    Likes Received:
    1,096
    Trophy Points:
    331
    I envy you. Your chip ain't a 6-7 year old chip at 45nm sucking more than 150W at high load ;)

    If I were in your shoes I would be shooting for 5.5ghz :D the crazy guy I am..
     
    hmscott and TomJGX like this.