No more SLI laptops?

Discussion in 'Gaming (Software and Graphics Cards)' started by paulofeg, Jan 7, 2019.

  1. CGornet

    CGornet Newbie

    Reputations:
    0
    Messages:
    7
    Likes Received:
    4
    Trophy Points:
    6
    Most don't want to spring for SLI, and it's kind of a dying tech. For me, it was never worth the expense.
     
  2. cj_miranda23

    cj_miranda23 Notebook Evangelist

    Reputations:
    312
    Messages:
    535
    Likes Received:
    509
    Trophy Points:
    106
    Let's pray and hope that NVLink technology improves over time and gets implemented in laptops.
    https://www.nvidia.com/en-us/data-center/nvlink/

    https://www.pcgamesn.com/nvidia-rtx-2080-ti-nvlink-performance

    "Essentially, the GPUs are much closer together now, and that will allow game developers to actually see the NVLink connection and use it to render their games, hopefully in a more elegant, repeatable way than with SLI.

    “That bridge is visible to games, that means that maybe there’s going to be an app that looks at one GPU and looks at another GPU and does something else.” explains Petersen. “The problem with multi-GPU computing is that the latency from one GPU to another is far away. It’s got to go across PCIe, it’s got to go through memory, it’s a real long distance from a transaction perspective.

    “NVLink fixes all that. So NVLink brings the latency of a GPU-to-GPU transfer way down. So it’s not just a bandwidth thing, it’s how close is that GPU’s memory to me. That GPU that’s across the link… I can kind of think about that as my brother rather than a distant relative.”
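
    To make that "distance" argument a bit more concrete, here is a minimal compute-side sketch using CUDA's peer-to-peer API (this is not the path game drivers take for SLI/NVLink rendering, and the device IDs and 64 MiB buffer size are just placeholders for illustration). The point is that one GPU can address another GPU's memory directly, and NVLink simply makes that same transfer much lower-latency and higher-bandwidth than PCIe:

    Code:
    // Rough sketch only: CUDA peer-to-peer access between two GPUs.
    // Device IDs 0/1 and the buffer size are placeholder assumptions.
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int deviceCount = 0;
        cudaGetDeviceCount(&deviceCount);
        if (deviceCount < 2) {
            printf("Need at least two GPUs for a peer-to-peer demo.\n");
            return 0;
        }

        // Ask whether GPU 0 can directly address GPU 1's memory.
        int canAccess = 0;
        cudaDeviceCanAccessPeer(&canAccess, 0, 1);
        printf("GPU0 -> GPU1 peer access possible: %d\n", canAccess);
        if (!canAccess) return 0;

        // Let GPU 0 map GPU 1's memory so transfers skip the host entirely.
        cudaSetDevice(0);
        cudaDeviceEnablePeerAccess(1, 0);

        // One buffer on each GPU.
        const size_t bytes = 64 * 1024 * 1024;  // 64 MiB, arbitrary
        void *buf0 = nullptr, *buf1 = nullptr;
        cudaSetDevice(0);
        cudaMalloc(&buf0, bytes);
        cudaSetDevice(1);
        cudaMalloc(&buf1, bytes);

        // Direct GPU1 -> GPU0 copy. The call is the same whether the data
        // rides over NVLink or PCIe; the link only changes latency/bandwidth.
        cudaMemcpyPeer(buf0, 0, buf1, 1, bytes);
        cudaDeviceSynchronize();

        cudaFree(buf1);      // current device is still GPU 1 here
        cudaSetDevice(0);
        cudaFree(buf0);
        return 0;
    }

    The API is identical with or without NVLink; the link just changes how "far away" the other GPU's memory effectively is, which is exactly Petersen's point above.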
     
  3. Lumlx

    Lumlx Notebook Consultant

    Reputations:
    33
    Messages:
    198
    Likes Received:
    152
    Trophy Points:
    56
    The problem is not with SLI or NVLink bridges. The problem is that game devs see no point in spending extra money to properly implement SLI/NVLink. There's a really small percentage of people who use it, so they just don't care.
     
  4. paulofeg

    paulofeg Notebook Geek

    Reputations:
    10
    Messages:
    88
    Likes Received:
    13
    Trophy Points:
    16
    I understand people's opinions on this, but whenever and wherever I discussed my support for SLI and NVLink I was almost always attacked (not on these forums, but on other ones).
    Some people seem to have the viewpoint that if SLI didn't exist, games would function better with fewer bugs. I was almost always made to feel like a fifth wheel who shouldn't exist. You guys on this forum are respectful, but the vast majority of the community is hateful toward SLI. I guess all the SLI builds that made PC gaming famous and showed its true horsepower don't matter and should just roll over and die. You pretty much can't have enthusiast builds without SLI or NVLink. Please don't take my opinion as a personal attack against anyone; it's just massive frustration.
     
    TheDantee likes this.
  5. Lumlx

    Lumlx Notebook Consultant

    Reputations:
    33
    Messages:
    198
    Likes Received:
    152
    Trophy Points:
    56
    Poor console ports are what make games run like ****. Definitely not SLI or NVLink.
     
  6. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    14
    Messages:
    32
    Likes Received:
    14
    Trophy Points:
    16
    I used a laptop with GTX 1070 SLI for a while before swapping it for one with a GTX 1080. Would I go the SLI way again? No! Not if I can help it. Was the SLI setup faster than the single 1080? Yes, it was. Objectively. But a 20% performance increase is not justifiable given the consistent microstuttering I had to deal with on the SLI setup. I don't play games... maybe once in a blue moon, but not regularly... so I won't speak to that much. But for all my lab work, the SLI setup did not justify the headache w.r.t. the performance gains!

    Maybe NVLink will improve... maybe SLI is dead... but I highly doubt it really matters (at least from the manufacturer's viewpoint)! My lab orders the highest-spec laptops available at any given time, and I have not heard any complaints about the lack of dual GPUs since the Turing cards came out. Part of that relates to the stuttering issues, I am sure. But part of it also has to do with the fact that most of us are already lugging around two power bricks with a 2080 and an i9 K-series CPU!
     
  7. aarpcard

    aarpcard Notebook Deity

    Reputations:
    595
    Messages:
    1,101
    Likes Received:
    259
    Trophy Points:
    101
    In games that support SLI, I consistently get a 60-80%+ performance increase. I also found that microstuttering is effectively eliminated when using the modified, stripped-down drivers.

    Also, a single RTX 2080 won't cut it for 60 fps 4K gaming. If I could do that on a single card, I would, but for that, SLI GTX 1080s are the only option.
     
    TheDantee and bennyg like this.
  8. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    14
    Messages:
    32
    Likes Received:
    14
    Trophy Points:
    16
    See, it's not that I disagree with you per se. But I was talking about a more delicate cost-to-gain ratio w.r.t. what is being gained. I think that reflects in your post as well. The first major nuance is "in games that support SLI", meaning that in games that don't actively support it (which would be quite a lot of them) the gains would be much smaller (if there are any at all). Also, I am sure your description of your experience is accurate, but I have hardly ever heard of people getting an 80% boost. Second, as best I can recall, there is very little evidence from vision neuroscience that gains above 60 fps are meaningfully perceptible. On the other hand, there is plenty of evidence for an induced placebo effect: an illusory sense of advantage that comes from fixating on fps counts.

    The next major nuance is using "modified stripped down drivers". It's extra work, and for a 'maybe' advantage in an area where the realism of the advantage attained is suspect. In many ways this is analogous to the deep-learning hoopla that gets passed around as "A.I." in popular literature. A deep look into any DL algorithm reveals it to be a variation on a single learning algorithm that requires terabytes of data to even function properly, while organic intelligence can attain better performance with mere bytes of data. The point is not that DL does not work... it is that it does not function intelligently, and given Occam's Razor it is actually a step backward in terms of attaining truly "intelligent" operation. So, to return to your statement, what do I gain by hunting down modified, stripped-down drivers and dealing with all the hassles that will inevitably accompany the modifications? And in what way is that "gain" truly beneficial? It is a lot of work to satisfy a hunger for fps, and not in any (objectively) demonstrably advantageous way. Without that, there is absolutely no motivation for an engineering department to devote additional resources to this problem.

    Finally, "60fps at 4K" itself is an extension of the previous point. I fail to see what the point of 4K gaming is. It will not add to performance (either human or machine) in any meaningful sense, and any gains would be purely cosmetic. Furthermore, even from a cosmetic standpoint, with the usual distance between your eyes and the monitor for gaming it is rather dubious how much visual advantage is attained by bumping from 1080p to 4K. 4K can be useful for some purposes, for instance when pixel-clarity is a concern for editors trying to attain a certain end-goal for a visual depiction of details. Data embedding in images through pixel manipulation comes to mind as an example. But for gaming, when you won't be focussing on any particular scene-fraction for more than a few seconds (at most), I fail to see why anyone would want 4K. In fact, if anything I would avoid it actively. I will give you an example -- I am currently preparing to order an Eurocom SkyX7C with a RTX 2080 and an i9 K series through my lab. I have spec'd out pretty much everything to the max, including 2X 2 TB NVME drives. But one thing I actively ensured was to order the 1080P display @ 120 HZ. The refresh rate is the most ambient, and bumping that up from 60HZ makes a lot more sense than making it 4K, IMO, because (a) I fail to see any objective visual degradation, and (b) the GPU can dedicate itself to maximum data crunching without being bogged down by pixel counts. For a personal experience, I remember playing the first Max Payne on a Core Duo desktop with integrated graphics. I didn't mess with the settings, the game ran fine enough for me to follow the story and immerse myself in the world... and frankly that's the most fun I have had playing games. Like I mentioned, I have not touched any games in the last ten years or so (unless you count the odd few minutes on someone else's machine)... and it could be that I enjoy it less these days because I am not as young as I used to be... but it seems to me that people these days focus a lot more spec-sheets than on actually experiencing the story. The best and most powerful GPU is in your head, and that's where most of the details of the story come to life.

    Now, all of this is not to say that you are wrong and should do what I do. Rather, these are the concerns that would keep me from devoting time to optimizing SLI/NVLink on modern GPUs! I would rather have a (still somewhat) portable powerhouse over an SLI machine with dubious aesthetic benefits.
     
    Prototime likes this.
  9. aarpcard

    aarpcard Notebook Deity

    Reputations:
    595
    Messages:
    1,101
    Likes Received:
    259
    Trophy Points:
    101
    My only response is that for my use case, combined with my personal preferences, SLI is my only option.

    I use a 42" 4K TV as my primary monitor and sit no more than 30" from the screen. At that distance, the advantage 4K has in producing a more detailed image (and removing the need for AA) is very obvious.

    I also largely play single-player games with immersive, realistic graphics. I don't have much interest in competitive shooters or similar games which benefit from greater than 60 fps.

    It also happens to be the case that the majority of the games I play do support SLI and support it well. Tomb Raider, GTA V, and The Witcher all have 80%+ scaling. In all of these games, the only way to get greater than 60 fps at 4K with max settings in a laptop is with SLI GTX 1080s. A single RTX 2080 won't cut it.

    Cost doesn't matter as much to me as it used to. I've also recycled my sunk costs back into my machine consistently for the past decade. After selling the parts from my old laptop, the laptop in my sig cost about $2k.

    My use case is more specific than many people's, but there's obviously a market for SLI if Nvidia is supporting it on the RTX 2080, 2070 Super, and 2080 Ti. I'm concerned that in the future SLI will be killed off completely, alienating those of us who do use it and do benefit from it.
     
    TheDantee and Prototime like this.
  10. Deks

    Deks Notebook Prophet

    Reputations:
    1,115
    Messages:
    4,697
    Likes Received:
    1,856
    Trophy Points:
    231
    SLI is not well supported by game devs, and it seems to be going the way of the dinosaur.
    What's likely going to replace it is AMD's approach: taking the Infinity Fabric and chiplet design from the Zen CPUs and applying it to their GPUs, moving away from the monolithic-die approach and interconnecting GPU chiplets to create more powerful parts that register as a single GPU.
     
    x-pac likes this.