octiceps, I used the estimate someone else created, and multiplied it by 4 way SLI/CR.
If the original estimate was out of proportion, then the 4 way SLI/CR is way out of proportion, but I used their number, I didn't make it up.
-
Are you talking about the actual card cost, and the difference in cost between the cards, totally overshadowing the electricity cost per month?
The card cost delta happens once, at purchase, but the electricity cost goes on for years and accumulates.
That's why I got rid of all my "large" computers. I used to spend $100's per month on electricity just for the computers and support hardware - running high speed internet connections into my house feeding routers, switches, mux's, etc, and powering racks of UNIX / Linux / Solaris equipment, and lots of desktop/deskside PC's running various OS's.
After I took accurate inventory of the costs, I decided to downsize and saved lots of recurring monthly costs.
If you measure the actual power used at the plug(s) / wall / distribution box for a deskside computer with all these high power draw components and multiply it out over all your usage, it starts adding up to real money.
But, enough of that. That's not fun to think about -
- You're not running your GPUs full bore 24/7
- Performance scaling, hence power consumption, per additional GPU decreases as you add more cards to a multi-GPU setup
-
Why let all that computing power go to waste, just sitting there idling? -
Let's make a few basic assumptions:
- 390X will cost $749 upon release, and will match Titan X in performance
- 390X achieves that performance using 300W, while Titan X does so with 250W
- Electricity costs $0.30 per kWh (I think this very nicely compensates for peak-hour/tiered charges, because the average cost will still be far below that number, at least in California)
- One runs their GPU 24/7 indefinitely till the cards go belly up
So your initial starting price delta is $250, since the Titan X retails for $999, and your power delta is 50W.
SINGLE GPU SETUP
Over an entire year, the extra cost of electricity of using the 390X vs Titan X = (50W/1000W) * $0.30 per kWh * 24 hours per day * 365 days per year = $131.40 per year
Which means you'd have to run such a machine at 24/7 load for almost 2 years straight before the 390X's extra electricity cost catches up to the $250 premium you paid for the Titan X (to be really precise, the break-even point is 22.8 months).
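If anyone wants to sanity-check the arithmetic, here's a quick Python sketch of that break-even calculation (the $250 price delta, 50W power delta, and $0.30/kWh rate are just the assumptions listed above):
```python
# Break-even sketch: cheaper, higher-power 390X vs pricier, lower-power Titan X,
# using the assumed numbers from this post.

PRICE_DELTA = 250.0    # extra upfront cost of the Titan X ($)
POWER_DELTA_W = 50.0   # extra power draw of the 390X (W)
RATE_PER_KWH = 0.30    # electricity cost ($/kWh)
HOURS_PER_DAY = 24     # assumed 24/7 load

def extra_cost_per_year(power_delta_w, rate, hours_per_day):
    """Extra electricity cost per year caused by the power delta."""
    return (power_delta_w / 1000.0) * rate * hours_per_day * 365

yearly = extra_cost_per_year(POWER_DELTA_W, RATE_PER_KWH, HOURS_PER_DAY)
print(f"Extra electricity per year: ${yearly:.2f}")                 # $131.40
print(f"Break-even point: {PRICE_DELTA / yearly * 12:.1f} months")  # ~22.8
```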
QUAD GPU SETUP
Since you're buying 4x as many cards, the price delta is 4x as much = 4x $250 = $1000. This is the key thing I was trying to point out -- your initial upfront cost delta has now quadrupled as well.
Over an entire year, the extra cost of electricity = 4 * (50W/1000W) * $0.30 per kWh * 24 hours per day * 365 days per year = $525.6
BUT since you paid $1000 extra upfront for 4x Titan X vs 4x 390X GPUs, you'll still need to run this quad GPU machine for almost 2 years (22.8 months) straight at 24/7 load before you can offset the initial extra cost of the Titan X GPUs.
This is what I meant by it doesn't matter how many GPUs you have in your rig, you only need to consider the simplest case of running a single GPU because the math automatically scales itself with each additional card.
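To see why the GPU count drops out, here's the same arithmetic with the number of cards as a parameter (again just a sketch built on the assumptions above):
```python
def break_even_months(n_gpus, price_delta=250.0, power_delta_w=50.0,
                      rate=0.30, hours_per_day=24):
    """Months of continuous use before the cheaper cards' extra electricity
    cancels out the pricier cards' extra upfront cost."""
    upfront = n_gpus * price_delta
    yearly = n_gpus * (power_delta_w / 1000.0) * rate * hours_per_day * 365
    return upfront / yearly * 12

for n in (1, 2, 4):
    print(n, "GPU(s):", round(break_even_months(n), 1), "months")
# n_gpus multiplies both deltas, so it cancels out: ~22.8 months every time.
```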
I mean yes, if you run your GPUs 24/7 till they croak then the electricity cost due to extra power usage will catch up to you eventually. That's a given. What I'm saying is that 95% of the users out there don't run their GPUs anywhere near 24/7, and even if we take half of that, i.e. 12 hours per day for the foreseeable future, it would take almost 4 years to break even. And that break-even point only moves further out as electricity cost goes down -- if you happen to live in Washington state where electricity costs $0.0822 per kWh, it'll take nearly 4 times as long to break even.
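Plugging the 12-hours-a-day case and the Washington rate into the same break_even_months sketch from above gives roughly those numbers:
```python
print(break_even_months(1, hours_per_day=12))  # ~45.7 months, i.e. almost 4 years
print(break_even_months(1, rate=0.0822))       # ~83.3 months, even at 24/7 load
```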
I'm simply pointing out that trying to "eventually" break even and cancel out the initial upfront cost by saving on electricity and buying a more expensive card with a marginally lower power consumption number is ultimately self-defeating, UNLESS you run your GPUs 24/7, the initial price delta is less than $150, and you live where electricity costs more than $0.30 per kWh. (that was a long run-on sentence...)
For most users, extra electricity cost due to increased power consumption is essentially a non-issue. They'd be much better off figuring out if other performance metrics are up to their expectations when making a purchasing decision.
And now I feel like a goddamn nerd, and understand why people hated us back in school. (well no not me personally because I knew better than to nerd out like that)
@D2 Ultima: We should start a book club. -
College life hack: Join the book club and do exactly the above if you want to meet cute brainy girls. Bonus points for acting dumb; smart chicks dig that. -
Imagine all the good you can do with those high-power compute resources sitting idle at your fingertips, slowly becoming irrelevant and eventually useless compared to the future technology that replaces them.
Why not run it 24/7 now, while it can do some good? As n=1 pointed out at great length, the energy costs are irrelevant compared to initial costs, so why not run them at 100% 24/7 and let the resources earn back their creation costs? During the warranty period, all failures along the way are covered.
If you let it all sit there powered off and only turn it on to game, there is no amount of gaming that could be worth the sunk cost of a 4 x $1000 or 4 x $750 GPU system, which comes to about $8000 in total costs over 2 years.
Even if you played 2 hours a day, 365 days a year, for 2 years, an $8000 system (overall initial cost and expenses counted in) works out to about $5.50/hr while gaming, or roughly $330/month and $4000/yr.
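For what it's worth, the per-hour math is just this (a quick sketch using the $8000 two-year total assumed above):
```python
TOTAL_COST = 8000.0              # assumed 2-year total for the quad-GPU system ($)
hours_gamed = 2 * 365 * 2        # 2 hours a day for 2 years
print(TOTAL_COST / hours_gamed)  # ~5.48 -> about $5.50 per hour of gaming
print(TOTAL_COST / 24)           # ~333  -> roughly $330 per month
print(TOTAL_COST / 2)            # 4000  -> $4000 per year
```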
And, while it is neither here nor there, my wife was cured of cancer over a 5-year period, about 15 years ago. No matter how much we contribute back to society, it will be a fraction of the value we have already cashed out.
I was working with distributed computing many years before my wife's health problems. But, it does put that effort into a different light, when it hits home, and the doctors at Stanford tell you it makes a difference in research results. Sure there are now lots of huge compute farms coming on line for research, but there are still many times more distributed compute cycles available. -
That's great, I guess? Sorry, this thread has gone so far off-topic...
I suggest you read n=1's dissertation carefully one more time. His point wasn't that "the energy costs are irrelevant compared to initial costs." He was talking about the break-even point between buying a GPU with higher power consumption and lower upfront cost (390X) vs. buying a GPU with lower power consumption and higher upfront cost (Titan X). Basically, when choosing a GPU, power consumption in the long run doesn't matter. Perf/price in the short term is what matters. -
Your conclusion is "Basically, when choosing a GPU, power consumption in the long run doesn't matter". - which is another way of saying what I said, "the energy costs are irrelevant compared to initial costs".
@n=1, you also have to consider the situation where the cheap card is also the cheapest to run.
Costing less to purchase and costing less to run makes both elements relevant, and both contribute to the overall price/performance result.
That was what I was trying to say before your long explanation of a particular instance where the card that costs more to purchase costs less to run, somehow balancing both sides of the equation.
My point was that the energy consumption is relevant regardless of initial costs. It is the variable cost, plus the marginal cost of 24/7 operation, continually accruing.
And, I run my GPU's a lot longer than 2 years, and my guess is you all do too.
Thanks for the off topic exchange. -
You're making a LOT of assumptions about how other people use or should use their hardware.
-
So is energy consumption relevant or irrelevant compared to initial costs? Either way, I think the below 4 scenarios should be adequate to cover all our bases and end this OT.
Assuming both cards are similar in performance and you don't have a preference either way, we can break it down like this:
1. Card A more expensive than card B, also uses more power
- You'll never break even regardless of what you do if you buy card A
2. Card A more expensive than card B, but uses less power
- You'll theoretically break even at some point, but for all practical purposes, short of running your rig 24/7, your GPU would've gone to silicon Heaven, or you would've upgraded, long before you'd reach that break-even point. Card A might be more financially sound in the long run IF you run your GPU 24/7 AND your GPU doesn't die prematurely AND the cost of electricity + initial price delta + power consumption delta is such that the math works out in your favor financially.
3. Card A less expensive than card B, but uses more power
- Equivalent to scenario 2
4. Card A less expensive than card B, also uses less power
- You'll never have to worry about breaking even if you buy card A, so go do whatever you want with your GPU.
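The same four cases boil down to a couple of sign checks; here's a toy sketch of that logic (the card prices and power numbers in the example are just the figures assumed earlier in this thread):
```python
def which_card(price_a, price_b, power_a, power_b):
    """Toy version of the four scenarios above; ignores performance
    differences and assumes you care about long-run electricity at all."""
    if price_a >= price_b and power_a >= power_b:
        return "Card B: card A can never break even"      # scenario 1
    if price_a <= price_b and power_a <= power_b:
        return "Card A: nothing to break even against"    # scenario 4
    return "Run the break-even math (scenarios 2 and 3)"

print(which_card(999, 749, 250, 300))  # Titan X vs 390X -> break-even math
```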
I think that covers all scenarios and should hopefully conclude this tangent we went on. -
LOL @hmscott, lest you contradict yourself again and confuse us even more, I think you should just buy the Titan X and be done with it since it clearly works better for your particular use case. However, there are those of us who would prefer to wait and see how AMD responds before making a decision. That is all.
-
I expected more from the Titan X. It has 50% more cores but only provides about 35% more performance than the 980.
Can't wait to see how AMD responds. I am expecting disappointment. -
The performance of both cards is still up in the air, it will take a while after release followed by new drivers to decide which to get
-
390X is supposed to have 4096 GCN cores vs 290X's 2816, which represents a 45% increase, very similar to Titan X's 50% core count increase over the 980. Then factor in architectural improvements from GCN 1.1 (290X) to GCN 1.3 (390X), and bandwidth gain from HBM vs GDDR5, and I'd say 50% over 290X is very much in the realm of possibility.
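Quick check on those percentages, using the core counts quoted above plus the published counts for the 980 (2048) and Titan X (3072):
```python
print((4096 - 2816) / 2816)  # ~0.455 -> ~45% more cores, 390X vs 290X
print((3072 - 2048) / 2048)  # 0.5    -> 50% more cores, Titan X vs 980
```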
If the 980 Ti is actually supposed to compete with the 8GB 390X, then it'll be in a very awkward position. They can't release a full 980 Ti with 12GB because that's what the Titan X is. They could disable 1 SMM and slap on 12GB of vram, but that would cannibalize their own Titan X sales. If they stick with 12GB but start cutting more SMMs, then they risk losing the competitive edge to 390X.
So nVidia would have to price the 980 Ti extremely aggressively, and hope the 8GB 390X is $700+. This way the 6GB 980 Ti would be competing against the 4GB 390X instead of the 8GB version.
Either way, if 390X delivers it'll be a huge win for AMD and put significant pressure on nVidia. I for one hope the 390X ***** slaps some sense into nVidia. -
It's relevant because it is a cost that accrues after you purchase the card; it is a marginal cost that varies based on how you use it. If you are trying to make a purchase cost justification, it is relevant.
And it's irrelevant because it is a minor expense relative to the initial cost. Find a good sale on the original purchase and you can save the energy cost over the usage life.
The fractional savings is likely not enough to justify getting one card over the other, there are likely many other factors to consider that would point to the card with the highest energy usage.
Higher performance trumps energy efficiency; in a performance purchase justification, energy cost is irrelevant.
If the highest energy usage device gets you where you want to go, when you want to get there, and the energy efficient device doesn't, then the energy cost is irrelevant to the choice of the device.
This shouldn't be confusing, it is a common occurrence when deriving cost benefit justifications. There are lots of factors that are relevant on both sides of a comparison. -
Yes that's pretty much what I said above as well as in a previous post. Now can we please stop this OT?
-
Can't wait for the SC model to be in stock from EVGA. My LG 34UC97 monitor can use one of these. The GTX 780 SC isn't cutting it anymore.
-
Has anyone tested this card at 1440p yet?
-
It's a great card at that res.