NVIDIA GeForce GTX Titan X: Discussion, Latest News & Updates.

Discussion in 'Desktop Hardware' started by J.Dre, Mar 8, 2015.

Thread Status:
Not open for further replies.
  1. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Ummm what?!

    So is energy consumption relevant or irrelevant compared to initial cost? Either way, I think the four scenarios below should cover all our bases and end this OT.

    Assuming both cards are similar in performance and you don't have a preference either way, we can break it down like this:

    1. Card A more expensive than card B, also uses more power
    - You'll never break even regardless of what you do if you buy card A

    2. Card A more expensive than card B, but uses less power
    - You'll theoretically break even at some point, but for all practical purposes, short of running your rig 24/7, your GPU would've gone to silicon Heaven, or you would've upgraded, long before you'd reach that break-even point. Card A might be more financially sound in the long run IF you run your GPU 24/7, AND your GPU doesn't die prematurely, AND the electricity cost, initial price delta, and power consumption delta work out in your favor financially (see the sketch after this list).

    3. Card A less expensive than card B, but uses more power
    - Equivalent to scenario 2

    4. Card A less expensive than card B, also uses less power
    - You'll never have to worry about breaking even if you buy card A, so go do whatever you want with your GPU.

    I think that covers all scenarios and should hopefully conclude this tangent we went on.
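
    To put a rough number on scenario 2, here's a minimal sketch of the break-even math. Every input (price difference, power draw difference, electricity rate, daily usage) is an assumed illustrative value, not a figure for any actual card:

    ```python
    # Hypothetical break-even estimate for scenario 2: card A costs more
    # up front but draws less power under load. All numbers are assumed
    # example values, not specs for any real card.
    price_delta = 50.0      # extra up-front cost of card A, in dollars
    power_delta_w = 60.0    # watts saved by card A under load
    rate_per_kwh = 0.12     # electricity price, in dollars per kWh
    hours_per_day = 4.0     # average daily hours of gaming load

    savings_per_day = (power_delta_w / 1000.0) * hours_per_day * rate_per_kwh
    break_even_days = price_delta / savings_per_day

    print(f"Savings per day: ${savings_per_day:.4f}")
    print(f"Break-even: {break_even_days:.0f} days "
          f"(~{break_even_days / 365:.1f} years)")
    # -> roughly 1736 days, about 4.8 years at 4 hours/day: long enough
    #    that most people upgrade (or the card dies) before breaking even.
    ```

    With these assumed numbers, the daily saving is under 3 cents, which is why the break-even point lands years out for anything short of a 24/7 workload.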
     
    Last edited: Mar 21, 2015
    hmscott likes this.
  2. octiceps

    octiceps Nimrod

    Reputations:
    3,146
    Messages:
    9,956
    Likes Received:
    4,193
    Trophy Points:
    431
    LOL @hmscott, lest you contradict yourself again and confuse us even more, I think you should just buy the Titan X and be done with it, since it clearly works better for your particular use case. However, some of us would prefer to wait and see how AMD responds before making a decision. That is all.
     
    hmscott likes this.
  3. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,316
    Likes Received:
    3,819
    Trophy Points:
    431
    I expected more from the Titan X. It has 50% more cores but only delivers about 35% more performance than the 980.

    Can't wait to see how AMD responds. I am expecting disappointment.
     
    Last edited: Mar 21, 2015
  4. DataShell

    DataShell Notebook Deity

    Reputations:
    47
    Messages:
    777
    Likes Received:
    354
    Trophy Points:
    76
    Why are you expecting disappointment? The 390X is rumored to be 50-70% faster than the 290X, and the 290X still gives the 980 a run for its money, especially at 4K and/or in multi-GPU configurations. I wouldn't be surprised if the 390X actually outperforms the Titan X by a bit (then again, that wouldn't be my first guess).
     
  5. J.Dre

    J.Dre Notebook Nobel Laureate

    Reputations:
    3,700
    Messages:
    8,316
    Likes Received:
    3,819
    Trophy Points:
    431
    Because of those words in bold, I'm expecting to be disappointed. I'm hoping it will be better than the Titan X, because then NVIDIA will have to do even better with the 980 Ti. We'll just have to wait and see.
     
  6. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    6,477
    Messages:
    19,766
    Likes Received:
    24,594
    Trophy Points:
    931
    The performance of both cards is still up in the air; it will take a while after release, plus new drivers, before we can decide which to get :)
     
    Last edited: Mar 21, 2015
  7. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Core count doesn't scale 1:1 with performance. You also have to remember the Titan X is clocked lower (1076 MHz boost) than the 980 (1216 MHz boost). Based on Maxwell's OC scaling, I estimate that with clocks matched the Titan X is around 42-45% faster than the 980, which is very respectable.

    The 390X is supposed to have 4096 GCN cores vs the 290X's 2816, a 45% increase, very similar to the Titan X's 50% core-count increase over the 980. Then factor in the architectural improvements from GCN 1.1 (290X) to GCN 1.3 (390X), plus the bandwidth gain from HBM over GDDR5, and I'd say 50% over the 290X is very much in the realm of possibility (rough numbers in the sketch below).
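
    To make the core-count and clock arithmetic concrete, here's a quick sketch using the figures quoted in this thread. It only computes theoretical shader throughput (cores x clock); real games scale less than linearly with either number, so treat the results as upper bounds:

    ```python
    # Theoretical shader throughput = core count x boost clock.
    # Specs are the ones quoted in the thread; actual performance
    # scales less than linearly, so these are upper bounds.
    titan_x = {"cores": 3072, "boost_mhz": 1076}  # full GM200
    gtx_980 = {"cores": 2048, "boost_mhz": 1216}  # GM204

    def throughput_ratio(a, b):
        return (a["cores"] * a["boost_mhz"]) / (b["cores"] * b["boost_mhz"])

    print(f"Titan X vs 980, stock clocks: {throughput_ratio(titan_x, gtx_980):.2f}x")
    # ~1.33x -- close to the ~35% real-world gap noted earlier; with
    # clocks matched, the core ratio alone is 3072 / 2048 = 1.50x.

    print(f"390X vs 290X, core count only: {4096 / 2816:.2f}x")
    # ~1.45x before counting any GCN 1.3 or HBM bandwidth gains.
    ```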

    There's no way to do better with the 980 Ti, since the Titan X is already the full GM200 chip. If the 390X delivers Titan X-like performance for less money, nVidia will be under significant pressure to price the cut-down GM200 chip very aggressively.

    If the 980 Ti is actually supposed to compete with the 8GB 390X, then it'll be in a very awkward position. They can't release a full 980 Ti with 12GB, because that's what the Titan X is. They could disable one SMM and slap on 12GB of VRAM, but that would cannibalize their own Titan X sales. If they stick with 12GB but start cutting more SMMs, they risk losing the competitive edge to the 390X.

    So nVidia would have to price the 980 Ti extremely aggressively and hope the 8GB 390X lands at $700+. That way the 6GB 980 Ti would be competing against the 4GB 390X instead of the 8GB version.

    Either way, if the 390X delivers, it'll be a huge win for AMD and put significant pressure on nVidia. I for one hope the 390X ***** slaps some sense into nVidia.
     
    Last edited: Mar 21, 2015
    hmscott likes this.
  8. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    6,477
    Messages:
    19,766
    Likes Received:
    24,594
    Trophy Points:
    931
    n=1/octiceps, it's both. Why does either of you think it has to be one or the other?

    It's relevant because it's a cost that accrues after you purchase the card: a marginal cost that varies with how you use it. If you're trying to justify the purchase cost, it's relevant.

    And it's irrelevant because it's a minor expense on top of the initial cost. Find a good sale on the original purchase and you can save the energy cost over the usage life.

    The fractional savings likely isn't enough to justify one card over the other; there are likely many other factors to consider, and they may well point to the card with the higher energy usage.

    Higher performance trumps energy efficiency: in a performance purchase justification, energy cost is irrelevant.

    If the higher-energy device gets you where you want to go, when you want to get there, and the energy-efficient device doesn't, then energy cost is irrelevant to the choice of device.

    This shouldn't be confusing; it's a common situation when working through cost-benefit justifications. Lots of factors are relevant on both sides of a comparison.
     
  9. n=1

    n=1 YEAH SCIENCE!

    Reputations:
    2,544
    Messages:
    4,346
    Likes Received:
    2,600
    Trophy Points:
    231
    Yes, that's pretty much what I said above, as well as in a previous post. Now can we please stop this OT?
     
    katalin_2003 and hmscott like this.
  10. ssj92

    ssj92 Neutron Star

    Reputations:
    1,862
    Messages:
    3,735
    Likes Received:
    4,563
    Trophy Points:
    331
    Can't wait for the SC model to be in stock from EVGA. My LG 34UC97 monitor can use one of these. The GTX 780 SC isn't cutting it anymore. :)
     