***The Official MSI GT80S Titan (w/desktop 980 GPU's) Owner's Lounge***

Discussion in 'MSI Reviews & Owners' Lounges' started by -=$tR|k3r=-, Dec 15, 2015.

  1. Kevin@GenTechPC

    Kevin@GenTechPC Company Representative

    Reputations:
    1,014
    Messages:
    8,501
    Likes Received:
    2,098
    Trophy Points:
    331
    Sorry, my reply was too brief; thanks for pointing it out.
     
    hmscott likes this.
  2. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,089
    Messages:
    20,398
    Likes Received:
    25,150
    Trophy Points:
    931
    "Best" is subjective, but the GT80's are Awesome :)

    You won't find another 980 SLI model; this is the only one.

    It's got limitations, but all laptops have limitations.

    The GT80S 980 SLI model will outperform the GT80(S) 980m SLI model. Even OC'd, the GT80(S) 980m SLI will come close, but it won't surpass a similarly tuned 980 SLI GT80S.
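
    If you want a rough feel for why, here's a back-of-the-envelope sketch using NVIDIA's published core counts and reference clocks (raw shader throughput only, not benchmark data, so treat it as illustrative):

        # Crude shader-throughput comparison, desktop GTX 980 vs GTX 980M (per GPU).
        # Published core counts and reference base clocks; real performance also
        # depends on memory bandwidth, boost behavior, drivers, and thermals.
        gtx_980  = {"cores": 2048, "base_mhz": 1126}
        gtx_980m = {"cores": 1536, "base_mhz": 1038}

        def throughput(gpu, clock_offset_mhz=0):
            # cores * clock is a rough proxy for FP32 throughput
            return gpu["cores"] * (gpu["base_mhz"] + clock_offset_mhz)

        print(f"Stock 980 vs stock 980M:   {throughput(gtx_980) / throughput(gtx_980m):.2f}x")
        print(f"Stock 980 vs +200MHz 980M: {throughput(gtx_980) / throughput(gtx_980m, 200):.2f}x")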
     
    Last edited: Mar 29, 2016
    Kevin@GenTechPC likes this.
  3. sticky

    sticky Notebook Consultant

    Reputations:
    16
    Messages:
    200
    Likes Received:
    3
    Trophy Points:
    31
    Somehow I feel like you're speaking to me like I'm a child :p

    I understand your point though. For overclocking dual 980's I'll need more power.

    I don't need to overclock so I'm cool. Pun intended.
     
  4. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,089
    Messages:
    20,398
    Likes Received:
    25,150
    Trophy Points:
    931
    Quite often posters here have English as a second language, or even a 3rd or 4th, so I try to keep the wording simple and break points out into separate, straightforward statements.

    I also write for other readers, now and in the future. Even if you catch on quickly, someone else might not be able to follow if I shorten things up and lean on local slang.

    That's a good point: with 2 x 980 SLI, who needs to OC?

    As it turns out, just about everyone likes free performance. :cool:

    Once people see how easy and safe it is to tune for more performance, by watching others do it and get better results, they want to do it too.

    More FPS really isn't a problem with the GT80S 980 SLI, or even the GT80(S) 980m SLI, and with the awesome GT80(S) cooling systems, either way it's cool :D
     
    Prostar Computer likes this.
  5. Porter

    Porter Notebook Virtuoso

    Reputations:
    758
    Messages:
    2,120
    Likes Received:
    930
    Trophy Points:
    131
    I don't want to confuse the conversation any more, but I wanted to say that the GT80S does work with dual 330W adapters. It doesn't split the load evenly, though (to be fair, I don't think any laptop can do that), so on mine one adapter gets warmer than the other.

    That is not the issue for me at all. The issue is that the EC limits the total draw to about one power supply's worth, so dual adapters don't buy you anything except maybe extending the life of each supply and keeping them cooler. Nothing else.

    I did get a "trial" update, but honestly it made zero difference in anything. If it's drawing more power, you sure can't tell, because it performs the same, and in benchmarks it made no difference at all, stock or overclocked.

    I didn't expect any huge gains, but something measurable was expected.
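
    If anyone else wants to sanity-check their own "trial" EC update, this is the kind of comparison I mean; a minimal sketch, assuming two CSV logs exported from something like HWiNFO with a total system power column (the column name and file names below are just placeholders):

        # Compare mean/peak power between two logged runs (e.g. stock EC vs "trial" EC).
        # Assumes CSV logs with a numeric power column; adjust the column name to
        # whatever your logging tool actually exports.
        import csv

        def power_stats(path, column="Total System Power [W]"):
            values = []
            with open(path, newline="") as f:
                for row in csv.DictReader(f):
                    try:
                        values.append(float(row[column]))
                    except (KeyError, ValueError):
                        continue  # skip rows without a usable reading
            return sum(values) / len(values), max(values)

        for label, path in [("stock EC", "stock_run.csv"), ("trial EC", "trial_run.csv")]:
            mean_w, peak_w = power_stats(path)
            print(f"{label}: mean {mean_w:.0f} W, peak {peak_w:.0f} W")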
     
    GenTechPC and hmscott like this.
  6. lichensoul

    lichensoul Notebook Evangelist

    Reputations:
    90
    Messages:
    319
    Likes Received:
    105
    Trophy Points:
    56
    So, still nothing on the power draw issue. I think I am getting a bit disappointed in this laptop. I love it and hate it at the same time: I love the performance, I hate the power draw issue. It is almost like MSI added the battery draw behavior so you destroy your battery faster with all the charging and recharging, and have to replace it more often. :/
     
    hmscott likes this.
  7. Porter

    Porter Notebook Virtuoso

    Reputations:
    758
    Messages:
    2,120
    Likes Received:
    930
    Trophy Points:
    131
    I really am more of an optimist, but this is going to sound very pessimistic. I honestly think they set such an extreme* limit in order to keep the performance only slightly above the 980m SLI and 980 (200W) model notebooks. If they truly unleashed this thing, allowing us to pull well past 330W (even 660W for us dual power supply folks), then this notebook would set the bar "too" high. It doesn't pay for them to allow large jumps that mean we won't upgrade for many years. It pays much better for them to offer only 10% or 20% speed bumps each revision and have some of us upgrade more often.

    *What I call an extreme limit is about 10% below the rated adapter output, while many other notebooks don't set ANY limit at all; you can pull 400W from a 330W adapter for periods of time with no problems. They don't need to fully support dual adapters, but they sure shouldn't block it by limiting overall power to less than one adapter's worth!
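
    To put rough numbers on that (simple arithmetic, assuming the ~10% figure above):

        # Rough numbers: EC cap vs. the adapter hardware actually available.
        adapter_rated_w = 330              # single 330W adapter
        ec_cap_w = adapter_rated_w * 0.9   # ~10% under the rating, per the estimate above
        dual_adapter_w = 2 * adapter_rated_w

        print(f"EC cap:               ~{ec_cap_w:.0f} W")
        print(f"Single adapter:        {adapter_rated_w} W")
        print(f"Dual adapters:         {dual_adapter_w} W")
        print(f"Headroom left unused:  ~{dual_adapter_w - ec_cap_w:.0f} W")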
     
    hmscott likes this.
  8. Q937

    Q937 Notebook Deity

    Reputations:
    393
    Messages:
    1,040
    Likes Received:
    1,607
    Trophy Points:
    181
    I tested it and it doesn't solve the problem. MSI refused to increase the limit further. I'm also fairly convinced from my testing that the battery drain happens well under 324W, but without more accurate testing equipment I can't say for certain.


    The 400W numbers cited are from the wall, which is 340W to the machine at 85% efficiency. That said, the EC doesn't even allow that. A limit is necessary if they don't want to support dual adapters, but the number they use is far too conservative.
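
    In other words (rough arithmetic, assuming ~85% conversion efficiency as above):

        # Wall draw vs. DC power actually delivered to the notebook.
        wall_w = 400
        efficiency = 0.85   # assumed adapter efficiency
        print(f"{wall_w} W at the wall is roughly {wall_w * efficiency:.0f} W at the machine")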
     
    hmscott likes this.
  9. Porter

    Porter Notebook Virtuoso

    Reputations:
    758
    Messages:
    2,120
    Likes Received:
    930
    Trophy Points:
    131
    I will argue a limit is not needed, or put it this way: I've never seen a limit used on any other laptop, other than my old MSI GT70, which I sold because of the battery boost and power limit. I bought the GT80S because it did NOT have battery boost, and I assumed they had fixed the EC power limit issue, since none of my other laptops suffer from this problem. After I talked to tech support, I kindly informed them that this is not a feature mentioned anywhere, and that if they are indeed saying it's normal operation, they need to document it like they used to for the older models. Otherwise people will assume they fixed it in newer revisions.

    Many times SLI GPUs and high-end CPUs plus overclocking will take a system well past what the power bricks are rated for (and dual adapter setups have been around forever too, which BTW work fine even on the GT80S; unfortunately, with the ~300W limit you can't really get any benefit out of them). None of that would have been possible if they had limited it before, so obviously those systems are not limited. Why the sudden need for this extra "limit"? I'm not sure why MSI chose to do that on a couple of their models, since there are plenty of other ways users can screw up their systems; there's no need to add such tight constraints that really only affect the enthusiast crowd these machines are designed for.

    There are plenty of built-in power limitations inside the CPU and the GPUs already; there's no need for yet ANOTHER limiting factor, especially when it's well below the rated output of the adapter.
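
    For what it's worth, the CPU-side limits are easy to see for yourself. Here's a minimal sketch for a Linux box that exposes Intel RAPL through the powercap interface (paths and permissions vary by kernel and platform, so treat it as illustrative only):

        # Read the package power limits (PL1/PL2) the CPU already enforces on itself,
        # via the Linux powercap/RAPL sysfs interface. May require root; paths vary.
        from pathlib import Path

        pkg = Path("/sys/class/powercap/intel-rapl:0")
        if pkg.exists():
            name = (pkg / "name").read_text().strip()
            for constraint in ("constraint_0", "constraint_1"):  # long-term / short-term
                limit_uw = int((pkg / f"{constraint}_power_limit_uw").read_text())
                print(f"{name} {constraint}: {limit_uw / 1_000_000:.1f} W")
        else:
            print("RAPL powercap interface not available on this system")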
     
    CaerCadarn and hmscott like this.
  10. Q937

    Q937 Notebook Deity

    Reputations:
    393
    Messages:
    1,040
    Likes Received:
    1,607
    Trophy Points:
    181
    A power limit of some sort is necessary because there are no restrictions on CPU power whatsoever besides what the cooling is capable of dissipating. The only time there is a power limit is when the battery is below 30% or absent. With 2x130W on the GPUs, that leaves around 70W for the CPU and remaining components, and I assure you that they are capable of exceeding that if pushed.

    That said, it's incredibly stupid that MSI doesn't offer an option to disable battery boost, with the caveat that any damage resulting from a fried PSU is entirely your responsibility. What makes it even more absurd is that when the GPUs are running at max, you can still make the CPU pull >70W, and it'll happily start pulling 350W from the adapter on top of the 15W it's using from the battery, and this is with the stock EC. So why can't it just use 335W from the wall under normal load conditions?
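
    A rough budget, using the figures above (the 350W/15W split is what I observed on the stock EC, not a spec):

        # Back-of-the-envelope power budget for the GT80S under the stock EC behavior.
        gpu_w = 2 * 130          # two GTX 980s at ~130W each
        ec_cap_w = 330           # roughly one adapter's worth
        print(f"Left for CPU + everything else under the cap: ~{ec_cap_w - gpu_w} W")

        # Observed case with GPUs maxed and the CPU pushed past that budget:
        adapter_draw_w = 350     # observed draw from the adapter
        battery_draw_w = 15      # observed battery assist
        print(f"Total draw in that case: ~{adapter_draw_w + battery_draw_w} W")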
     