GTX 980 vs GTX 970M SLI? Which one wins?

Discussion in 'Gaming (Software and Graphics Cards)' started by Esaelias187, Jul 18, 2016.

Thread Status:
Not open for further replies.
  1. Esaelias187

    Esaelias187 Notebook Geek

    Reputations:
    5
    Messages:
    92
    Likes Received:
    15
    Trophy Points:
    16
Which will be better for running most games near ultra at 1080p?

Which gets the highest FPS?

I personally believe a 970M SLI setup with 12 GB of VRAM could last 3-4 years.
     
  2. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,565
    Messages:
    5,607
    Likes Received:
    4,040
    Trophy Points:
    431
You mean the mobile 980? Definitely mobile 980 > 970M SLI.
     
  3. HaloGod2012

    HaloGod2012 Notebook Deity

    Reputations:
    672
    Messages:
    1,742
    Likes Received:
    1,412
    Trophy Points:
    181
Go for the single 980. I have two 980s in my laptop and the power is complete overkill; one is perfect. 970M SLI is great, but you need to make sure all your games play nice with SLI. Once explicit multi-adapter is the norm in big games, I can see SLI rigs flying.
     
  4. saturnotaku

    saturnotaku Notebook Nobel Laureate

    Reputations:
    4,310
    Messages:
    7,842
    Likes Received:
    3,356
    Trophy Points:
    431
SLI doesn't double the amount of video RAM you have - the memory is mirrored across both cards, so it's still a 6 GB frame buffer.
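The point above is quick arithmetic. A minimal sketch, assuming traditional SLI rendering (AFR), where each GPU mirrors the full working set (the 6 GB per-card figure matches the 970M configuration discussed in this thread):

```python
# Under traditional SLI (AFR), each GPU holds a full copy of the working
# set, so the usable frame buffer is the per-card amount, not the sum.
# Illustrative sketch only.

def usable_vram_gb(per_card_gb: float, num_cards: int) -> tuple:
    """Return (advertised total, actually usable) VRAM in GB."""
    return per_card_gb * num_cards, per_card_gb

advertised, usable = usable_vram_gb(6.0, 2)
print(advertised, usable)  # 12.0 on the box, 6.0 in practice
```

So a "12 GB" 970M SLI machine still behaves like a 6 GB card as far as games are concerned.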
     
  5. Porter

    Porter Notebook Virtuoso

    Reputations:
    736
    Messages:
    2,071
    Likes Received:
    897
    Trophy Points:
    131
The 980 (200 W) is better. SLI is fine for two top-level cards, but they don't make an SLI configuration for the 980 (200 W) and likely never will. Plus you don't have to deal with games that don't play nice with SLI.
     
    TBoneSan likes this.
  6. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,334
    Messages:
    11,801
    Likes Received:
    9,734
    Trophy Points:
    931
    The day THAT happens... call me skeptical, but devs aren't coding for multi-GPU now. Asking them to code in optimizations for multi-GPU in DX12/Vulkan when it's MORE work than it is now? That's gonna be the day =D
     
    TBoneSan likes this.
  7. Raidriar

    Raidriar ლ(ಠ益ಠლ)

    Reputations:
    1,565
    Messages:
    5,607
    Likes Received:
    4,040
    Trophy Points:
    431
This. SLI/CF support has pretty much been eliminated from games, to many people's chagrin. Only benchers really get much out of SLI/CF any more. I split up my 980M SLI and put one card in the M18x R2 and the other in the Alienware 18, since one 980M is more than enough to handle 1080p.
     
    Kade Storm likes this.
  8. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,087
    Messages:
    20,399
    Likes Received:
    25,150
    Trophy Points:
    931
I like laptop SLI, and most of the games I play have SLI support that puts performance high enough that I can run 100 FPS at my 100 Hz refresh rate.

Some games gain SLI support after release or work with simple manual tuning; otherwise, all my games work with SLI without fiddling.

Will new games do that? IDK. I am not going to use Windows 10 or DX12 for a long time, so it doesn't matter to me; I will continue to play games on DX11 / Vulkan.

But I would only SLI the highest single GPUs available, like the 980M, or go with the 980 (the desktop chip in mobile form) - you don't need 200 W to get great performance well above a single or SLI 980M, and there are several models available.

The sweet spot is 980M SLI, for now.

A single 1080 or 1080 SLI will replace it, but who knows when :)

Either a single 980 or 980M SLI is what I would go with right now, but only if it were imperative that I game now and had nothing else to use.

    Otherwise, game on what you have and wait for the 1080/1080m releases.
     
    Kade Storm and TBoneSan like this.
  9. HaloGod2012

    HaloGod2012 Notebook Deity

    Reputations:
    672
    Messages:
    1,742
    Likes Received:
    1,412
    Trophy Points:
    181
Didn't Microsoft just release something to make multi-GPU coding extremely easy?
Edit:
See below - this supposedly makes multi-GPU much easier for devs and is coming out very soon. We can hope they make use of it. We've seen how awesome it can be in the new 3DMark and Ashes!

    http://m.hexus.net/tech/news/software/94249-microsoft-makes-multi-gpu-support-easier-dx12-devs/
     
    Last edited: Jul 19, 2016
  10. D2 Ultima

    D2 Ultima Livestreaming Master

    Reputations:
    4,334
    Messages:
    11,801
    Likes Received:
    9,734
    Trophy Points:
    931
    It's even at the point that when games DO work, they rarely if ever hit over 90% utilization. I couldn't get BO3/GTA V/Dying Light/etc to pass 90% util. MGS V Ground Zeroes could not pass 70% (rare spikes to maybe 80%). Etc etc. Dark Souls 3 generally sits UNDER 50% util on my system for reasons unknown; I've heard of it getting better util on desktops but the limit is around 60%.

    And please note, this isn't even considering "scaling". 95-99% util on each card could STILL only be 90% scaling, for example. So when I'm at 80-90% range for util, what does that leave scaling to be? 70-75%?
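The utilization-versus-scaling arithmetic above can be sketched in a few lines. This is a toy model, not how drivers actually report scaling: it just assumes the two-card speedup is roughly 1 plus (the second card's utilization times the profile's scaling efficiency), and all the numbers are illustrative:

```python
# Rough illustration of the utilization-vs-scaling point above.
# Toy model: effective two-card speedup ~= 1 + (second card's utilization
# x the SLI profile's scaling efficiency). Not a real driver metric.

def effective_speedup(util_second_card: float, profile_scaling: float) -> float:
    """Speedup over a single card; both inputs are fractions in 0.0-1.0."""
    return 1.0 + util_second_card * profile_scaling

# 95-99% util with a 90%-scaling profile still isn't 2x:
best_case = effective_speedup(0.97, 0.90)  # ~1.87x

# The 80-90% util range described for BO3 / GTA V / Dying Light:
typical = effective_speedup(0.85, 0.90)    # ~1.77x

# Dark Souls 3 sitting under 50% util:
ds3 = effective_speedup(0.50, 0.90)        # ~1.45x

print(best_case, typical, ds3)
```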

And sometimes the default profiles aren't even that good. Black Ops 3 apparently scales better if you use the BattleForge bits... even though it has had its own profile since launch.

That still uses AFR, and games must be AFR-friendly to use it. The problem is that games either aren't AFR-friendly or require too much bandwidth for forced AFR to work. Unreal Engine 4 is one engine that almost always needs more bandwidth. The high-bandwidth bridges will basically fix that, but laptops pretty much can't use them. The HB bridge is a band-aid on the bandwidth problem, essentially two SLI bridges in one, because NVLink can't be sold effectively in the consumer market and NVIDIA didn't bother with tech similar to AMD's XDMA (using the PCIe interface bandwidth to transfer memory data between the cards; the memory needs to be designed to allow it, or the transferable data rate is very low), since they wouldn't make extra money off it.
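To put rough numbers on the bandwidth argument: a back-of-envelope sketch, where the bridge throughputs are commonly cited ballpark figures (classic SLI bridge around 1 GB/s, HB bridge roughly double) and the per-frame data size is a pure assumption for how much an AFR-unfriendly engine might need to shuttle between GPUs:

```python
# Back-of-envelope: how quickly inter-GPU traffic saturates an SLI bridge
# under AFR. All figures are illustrative assumptions, not measurements.

def required_bandwidth_gbs(mb_per_frame: float, fps: float) -> float:
    """Data that must cross between GPUs per second, in GB/s."""
    return mb_per_frame * fps / 1024.0

CLASSIC_BRIDGE_GBS = 1.0  # assumed classic SLI bridge throughput
HB_BRIDGE_GBS = 2.0       # assumed high-bandwidth (doubled) bridge

# Suppose an AFR-unfriendly engine shares ~20 MB of render-target
# data per frame and targets 120 FPS:
need = required_bandwidth_gbs(20.0, 120.0)  # ~2.34 GB/s

print(need, need <= CLASSIC_BRIDGE_GBS, need <= HB_BRIDGE_GBS)
```

Even the doubled bridge falls short under these assumed numbers, which is the sense in which the HB bridge is a band-aid rather than a fix.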

Basically, devs are either ignoring multi-GPU or picking engines and coding techniques that are AFR-unfriendly. Even if EMA becomes a thing, the bandwidth between the cards will still be low and the connector can't carry enough data. Anything that can't effectively use multi-GPU now won't use EMA any better.

Now, don't get me wrong. As pessimistic as I sound about these things, I'm really just being a realist. What I *WANT* is for multi-GPU to return. Two 980s would be "about" enough for me at 1080p/120 Hz - partly because of unoptimization creep, partly because I really like eye candy, and partly because newer titles don't like SLI as much, so I'd often need a strong single GPU. I would love to tell people a couple of midrange cards are good again. I'd love to see games working well and bug-free, without having to worry about the tech - pretty much what every game I owned was like when I got this SLI laptop in 2013. If a game existed, it either used SLI or could be maxed on an 8800GTS, or both. I'd like that kind of thing to return; it made owning multi-GPU a real joy.
     
    Porter, Ashtrix and TBoneSan like this.