NVIDIA GeForce GTX Titan X: Discussion, Latest News & Updates.

Discussion in 'Desktop Hardware' started by J.Dre, Mar 8, 2015.

Thread Status:
Not open for further replies.
  1. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    6,195
    Messages:
    19,420
    Likes Received:
    24,169
    Trophy Points:
    931
    octiceps, I used the estimate someone else created and multiplied it by 4 for a 4-way SLI/CrossFire setup.

    If the original estimate was out of proportion, then the 4-way figure is way out of proportion too, but I used their number; I didn't make it up.
     
  2. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    6,195
    Messages:
    19,420
    Likes Received:
    24,169
    Trophy Points:
    931
    n=1, not sure if I get your intended meaning...

    Are you talking about the actual card cost, and the difference in cost between the cards, totally overshadowing the electricity cost per month?

    The card costs / delta happens upon purchase, but the electricity cost goes on for years, and accumulates.

    That's why I got rid of all my "large" computers. I used to spend hundreds of dollars per month on electricity just for the computers and support hardware: high-speed internet connections running into the house feeding routers, switches, muxes, etc., plus racks of UNIX / Linux / Solaris equipment and lots of desktop/deskside PCs running various OSes.

    After I took accurate inventory of the costs, I decided to downsize and saved lots of recurring monthly costs.

    If you measure the actual power used at the plug(s) / wall / distribution box for a deskside computer with all these high power draw components and multiply it out over all your usage, it starts adding up to real money.

    But, enough of that. That's not fun to think about ;)
     
  3. octiceps

    octiceps Nimrod

    Reputations:
    3,146
    Messages:
    9,956
    Likes Received:
    4,193
    Trophy Points:
    431
    1. You're not running your GPUs full bore 24/7
    2. Performance scaling, hence power consumption, per additional GPU decreases as you add more cards to a multi-GPU setup
     
  4. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    6,195
    Messages:
    19,420
    Likes Received:
    24,169
    Trophy Points:
    931
    octiceps, actually I do. I don't mine, but I do run distributed computing projects, so for me it is a consideration.

    Why let all that computing power go to waste, just sitting there idling?
     
  5. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Maybe I'll use some numbers again so it makes more sense.

    Let's make a few basic assumptions:

    - 390X will cost $749 upon release, and will match Titan X in performance
    - 390X achieves that performance using 300W, while Titan X does so with 250W
    - Electricity costs $0.30 per kWh (I think this very nicely compensates for peak hour/tiered charges, because the average cost will still be far below that number, at least in California anyway)
    - One runs their GPU 24/7 indefinitely till the cards go belly up

    So your initial starting price delta is $250, since the Titan X retails for $999, and your power delta is 50W.

    SINGLE GPU SETUP
    Over an entire year, the extra cost of electricity of using the 390X vs Titan X = (50W / 1000 = 0.05 kW) * $0.30 per kWh * 24 hours per day * 365 days per year = $131.40 per year

    Which means you'll have to run such a machine for almost 2 years straight at 24/7 load before the extra electricity cost catches up with the $250 price delta incurred by buying the Titan X over the 390X. (If you want to be really precise, the break-even point is 22.8 months.)
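
    Here's that single-card arithmetic as a quick Python sketch, using the assumed prices and wattages from the list above (assumptions, not measured numbers):

        # Assumed: $999 Titan X at 250W vs. a $749 390X at 300W,
        # running 24/7 at $0.30 per kWh.
        price_delta_usd = 999 - 749       # extra upfront cost of the Titan X
        power_delta_w = 300 - 250         # extra draw of the 390X, in watts
        rate_usd_per_kwh = 0.30

        extra_per_year = (power_delta_w / 1000) * rate_usd_per_kwh * 24 * 365
        break_even_months = price_delta_usd / (extra_per_year / 12)

        print(round(extra_per_year, 2))      # 131.4 -> ~$131.40/year extra for the 390X
        print(round(break_even_months, 1))   # 22.8  -> months to recoup the Titan X premium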

    QUAD GPU SETUP
    Since you're buying 4x as many cards, the price delta is 4x as much = 4x $250 = $1000. This is the key thing I was trying to point out -- your initial upfront cost delta has now quadrupled as well.

    Over an entire year, the extra cost of electricity = 4 * (50W / 1000 = 0.05 kW) * $0.30 per kWh * 24 hours per day * 365 days per year = $525.60

    BUT since you paid $1000 extra upfront for 4x Titan X vs 4x 390X GPUs, you'll still need to run this quad GPU machine for almost 2 years (22.8 months) straight at 24/7 load before you can offset the initial extra cost of the Titan X GPUs.

    This is what I meant when I said it doesn't matter how many GPUs you have in your rig: you only need to consider the simplest case of a single GPU, because the math scales automatically with each additional card.

    I mean yes, if you run your GPUs 24/7 till they croak, then the electricity cost due to the extra power usage will catch up with you eventually. That's a given. What I'm saying is that 95% of users out there don't run their GPUs anywhere near 24/7, and even if we take half of that, i.e. 12 hours per day for the foreseeable future, it would take almost 4 years to "break even" so to speak. And that break-even point only stretches out as electricity gets cheaper -- if you happen to live in Washington state, where electricity costs $0.0822 per kWh, it'll take nearly 4 times as long to break even.
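
    Since the question keeps coming back to card count and usage patterns, here's a small sketch of how the break-even point moves (or doesn't) with the number of GPUs, hours per day, and electricity rate -- same assumed $250 / 50W deltas as above:

        def break_even_months(n_gpus=1, hours_per_day=24, rate=0.30,
                              price_delta=250, power_delta_w=50):
            # Months until the cheaper-but-hotter cards have burned through
            # the upfront price difference in extra electricity.
            upfront = n_gpus * price_delta
            extra_per_month = n_gpus * (power_delta_w / 1000) * rate * hours_per_day * 365 / 12
            return upfront / extra_per_month

        print(round(break_even_months(), 1))                  # 22.8 -- single GPU, 24/7
        print(round(break_even_months(n_gpus=4), 1))          # 22.8 -- quad GPU, unchanged
        print(round(break_even_months(hours_per_day=12), 1))  # 45.7 -- almost 4 years
        print(round(break_even_months(rate=0.0822), 1))       # 83.3 -- Washington-level rates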

    I'm simply pointing out that trying to "eventually" break even and cancel out the initial upfront cost by saving on electricity and buying a more expensive card with a marginally lower power consumption number is ultimately self-defeating, UNLESS you run your GPUs 24/7, the initial price delta is less than $150, and you live where electricity costs more than $0.30 per kWh. (that was a long run on sentence...)

    For most users, extra electricity cost due to increased power consumption is essentially a non-issue. They'd be much better off figuring out if other performance metrics are up to their expectations when making a purchasing decision.

    And now I feel like a goddamn nerd, and understand why people hated us back in school. (well no not me personally because I knew better than to nerd out like that :D)

    @D2 Ultima: We should start a book club.
     
    Last edited: Mar 21, 2015
    octiceps and hmscott like this.
  6. octiceps

    octiceps Nimrod

    Reputations:
    3,146
    Messages:
    9,956
    Likes Received:
    4,193
    Trophy Points:
    431
    Fine, you're not most people. Most people turn their PC on to game or do work for a few hours a day, then turn it off. They're not running F@H or SETI@H or whatever 24/7. Since energy efficiency and not actual perf/price is your main concern, I hope you are being compensated handsomely for the increase on your electricity bill in the name of curing cancer or finding extraterrestrial life forms. :D

    Can I be the one who shows up at the meeting every week but never actually reads the book? :p

    College life hack: Join the book club and do exactly the above if you want to meet cute brainy girls. Bonus points for acting dumb; smart chicks dig that. :D
     
    Last edited: Mar 21, 2015
  7. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    6,195
    Messages:
    19,420
    Likes Received:
    24,169
    Trophy Points:
    931
    octiceps, there are millions of people who use their idle computing resources to contribute positively to society. I wouldn't go so far as to say it is unusual, or not what other people would do, and I wouldn't rule you out as someone who might do so in the future either :)

    Imagine all the good you can do with those high-powered idle compute resources at your fingertips, resources that are slowly becoming irrelevant and will eventually be useless compared to the future technology that replaces them.

    Why not run it 24/7 now, while it can do some good? As n=1 pointed out -- at great length :) -- the energy costs are irrelevant compared to the initial costs, so why not run them at 100%, 24/7, and let the resources earn back their creation cost? During the warranty period, any failures along the way are covered.

    If you let it all sit there powered off, and only turn it on to game, there is no amount of gaming that could be worth the sunk cost of a 4 x $1000 or 4 x $750 GPU system that comes to about $8000 in total costs over 2 years.

    Even if you played 2 hours a day, 365 days a year, for 2 years, an $8000 system (overall initial cost and expenses counted in) works out to about $5.50/hr while gaming -- $330/month, $4000/yr.
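
    A quick sanity check on that per-hour figure, taking the $8000 two-year total as a given:

        total_cost_usd = 8000          # assumed all-in cost over the period
        years = 2
        gaming_hours = 2 * 365 * years # 2 hours/day, every day, for 2 years = 1460 hours

        print(round(total_cost_usd / gaming_hours, 2))  # 5.48 -> roughly the $5.50/hr above
        print(round(total_cost_usd / (years * 12)))     # 333  -> ~$330/month
        print(round(total_cost_usd / years))            # 4000 -> $4000/year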

    And, while it is neither here nor there, my wife was cured of cancer over a 5 year period, about 15 years ago. No matter how much we contribute back to society, it will be a fraction of the value we have already cashed out :)

    I was working with distributed computing many years before my wife's health problems. But it does put that effort in a different light when it hits home, and the doctors at Stanford tell you it makes a difference in research results. Sure, there are now lots of huge compute farms coming online for research, but there are still many times more distributed compute cycles available.
     
    Last edited: Mar 21, 2015
  8. octiceps

    octiceps Nimrod

    Reputations:
    3,146
    Messages:
    9,956
    Likes Received:
    4,193
    Trophy Points:
    431
    That's great, I guess? Sorry, this thread has gone so far off-topic...

    I suggest you read n=1's dissertation carefully one more time. His point wasn't that "the energy costs are irrelevant compared to initial costs." He was talking about the break-even point between buying a GPU with higher power consumption and lower upfront cost (390X) vs. buying a GPU with lower power consumption and higher upfront cost (Titan X). Basically, when choosing a GPU, power consumption in the long run doesn't matter. Perf/price in the short term is what matters.
     
  9. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    6,195
    Messages:
    19,420
    Likes Received:
    24,169
    Trophy Points:
    931
    octiceps, re-read what you said: you validated my conclusion, even as you point out it wasn't the focus of what n=1 was trying to say :)

    Your conclusion is "Basically, when choosing a GPU, power consumption in the long run doesn't matter" -- which is another way of saying what I said, "the energy costs are irrelevant compared to initial costs".

    @n=1, you also have to consider the situation where the cheap card is also the cheapest to run.

    Costing less to purchase and costing less to run makes both elements relevant, and both contribute to the overall price/performance result.

    That was what I was trying to say before your long explanation of the particular case where the card that is more expensive to purchase costs less to run, somehow balancing both sides of the equation.

    My point was that energy consumption is relevant regardless of initial costs. And it is the variable cost, the marginal cost of running 24/7, that keeps accruing.

    And I run my GPUs a lot longer than 2 years, and my guess is you all do too.

    Thanks for the off topic exchange. :)
     
  10. octiceps

    octiceps Nimrod

    Reputations:
    3,146
    Messages:
    9,956
    Likes Received:
    4,193
    Trophy Points:
    431
    You're making a LOT of assumptions about how other people use or should use their hardware.
     