No more SLI laptops?

Discussion in 'Gaming (Software and Graphics Cards)' started by paulofeg, Jan 7, 2019.

  1. ssj92

    ssj92 Neutron Star

    Reputations:
    1,912
    Messages:
    3,839
    Likes Received:
    4,667
    Trophy Points:
    331
    Thanks to SLI, my Alienware M18xR2 is keeping up even today in many games that support it.

    I do agree that it is a dying tech though. mGPU with DX12 is supposed to be the new thing but almost nothing uses it.

    400 series to 600 series were the good days for SLI. After that it just kept going downhill.
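
    On the mGPU point, part of why almost nothing uses it: DX12 explicit multi-GPU pushes all of the scheduling work onto the game itself, where SLI hid it inside the driver. As a minimal sketch (my own illustration, assuming a Windows machine with the D3D12/DXGI headers, not code from any particular title), this just asks D3D12 what an SLI pair actually looks like:

    #include <windows.h>
    #include <d3d12.h>
    #include <dxgi.h>
    #include <wrl/client.h>
    #include <cstdio>

    #pragma comment(lib, "d3d12.lib")
    #pragma comment(lib, "dxgi.lib")

    using Microsoft::WRL::ComPtr;

    int main() {
        // Enumerate hardware adapters. With SLI enabled, the driver exposes the
        // bridged cards as ONE adapter whose D3D12 device reports multiple nodes.
        ComPtr<IDXGIFactory1> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip the WARP software adapter

            ComPtr<ID3D12Device> device;
            if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                            IID_PPV_ARGS(&device)))) {
                // GetNodeCount() > 1 means linked-node mGPU is available, but the
                // engine still has to set node masks on every queue, heap and
                // command list itself to spread work across both GPUs.
                std::wprintf(L"%ls: %u node(s)\n", desc.Description, device->GetNodeCount());
            }
        }
        return 0;
    }

    Under DX11 the driver did alternate-frame rendering behind the game's back; under DX12 all of that node-mask bookkeeping becomes the engine's problem, which is the main reason uptake has been so thin.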
     
  2. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    14
    Messages:
    31
    Likes Received:
    14
    Trophy Points:
    16
    I agree. Personally I can't see any reason to use 4K for gaming, nor to lug around an environmentally unfriendly, arthritis-inducing hunk of a machine that requires multiple power bricks. With the performance available from the RTX 20 series, it makes very little sense to keep putting work into SLI. The primary engineering concern is optimization, and that is much better attained by limiting the pixel count to sensible numbers.

    I usually don't argue with people on these issues because they take it personally... but from my own experience, I have honestly never been able to tell what the big whoop with 4K is all about. I don't see any noticeable difference in everyday use, and aesthetics are much better attained by tuning the color calibration, contrast ratios, etc. I can see the case for 1440p, but I will never bog down my GPU with anything with a higher pixel count. It just makes no sense, and I do not give in to marketing gimmicks about "retina" displays or "infinity" edges. Trying to reinvent the wheel is a pointless effort: just make sure the display is pleasing and crisp and there is no pixelation. And as I keep saying, I can't tell the difference with 4K even by pressing my nose against the monitor.

    AMD's approach has huge advantages simply by drastically reducing the power consumption. On that concern alone, SLI should be extinct (and the faster the better).
     
  3. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    14
    Messages:
    31
    Likes Received:
    14
    Trophy Points:
    16
    30" from a 42" 4K TV? I doubt that's very good for your eyes! I currently use a 60" Sony Bravia smart LED TV at 1440p (though not all the time; I use a regular 25" 1080p monitor most of the time), and I can barely stand to stare at it unless I am sitting right up against the opposite wall. But if it works for you...
     
  4. aarpcard

    aarpcard Notebook Deity

    Reputations:
    590
    Messages:
    1,097
    Likes Received:
    256
    Trophy Points:
    101
    Damaging your eyes by sitting too close to a TV is an old wives' tale. Sitting too close may result in eyestrain (which goes away after you focus on something farther away), and even that isn't something everyone is susceptible to.

    At farther viewing distances there is no real advantage to 4K over 1080p on the same-sized screen, but up close it's night and day.

    That's not a great argument. The laptop won't draw any more power than a similarly specced desktop, and if someone is willing to carry around the weight, then more power to them. If saving the environment is a high priority, then maybe the answer is not to game at all. Realistically, the few extra kWh per month that gaming adds to your energy consumption is background noise in the scheme of things. Keeping your house 10 degrees cooler in the winter will have a far greater impact.
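
    To put a number on the "background noise" point, here's a rough back-of-the-envelope (the wattage and hours are assumptions for illustration, not measurements):

    #include <cstdio>

    int main() {
        // Illustrative figures only: assume an SLI laptop pulling ~250 W at the
        // wall while gaming, played about 2 hours a day.
        const double watts          = 250.0;
        const double hours_per_day  = 2.0;
        const double days_per_month = 30.0;

        const double kwh_per_month = watts * hours_per_day * days_per_month / 1000.0;
        std::printf("Gaming adds roughly %.0f kWh per month\n", kwh_per_month); // ~15 kWh
        return 0;
    }

    Even doubling those assumptions keeps it in the tens of kWh a month, which is small next to what heating or cooling a home typically consumes.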

    But regardless, when it comes to personal preferences, neither side will win the argument, because there really is no argument to win.
     
    Last edited: Jul 12, 2019
    Prototime likes this.
  5. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    14
    Messages:
    31
    Likes Received:
    14
    Trophy Points:
    16
    You misunderstood me! I was trying to put myself in the OEMs' shoes. They would only be concerned with ticking checkboxes, and it seems they find it easier to justify moving away from SLI. True, personally I wouldn't care for anything more than 1440p. But I wasn't making a personal statement here...
     
  6. Deks

    Deks Notebook Prophet

    Reputations:
    1,105
    Messages:
    4,661
    Likes Received:
    1,836
    Trophy Points:
    231

    I never saw a point in AMD Crossfire or Nvidia SLI laptops. For one, the thermal constraints of existing cooling solutions are too large to ignore, and dual-GPU configurations jack up the price by A LOT. Not to mention that Crossfire and SLI were never well supported and had really bad scaling (Crossfire was actually slightly better than SLI in this regard).
    Still, if one had a mid- to high-end GPU inside a laptop to begin with, by the time people hit its limits they would likely just move on to a new system anyway, so it never made sense to spend that much money on a single laptop.
     
  7. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    14
    Messages:
    31
    Likes Received:
    14
    Trophy Points:
    16
    I agree! It all comes down to the cost/effort/benefit balance. High price, questionable performance gains, thermal issues, power issues, and the pointlessness of pixel-chasing have sent SLI/Crossfire the way of the dodo. I am not shedding any tears for it personally.
     
    Deks likes this.
  8. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,538
    Messages:
    2,339
    Likes Received:
    2,316
    Trophy Points:
    181
    SLI wouldn't be needed if Nvidia gave permission and OEMs had the desire to put 102-class GPUs in the big laptops. They don't, and the throttled mobile 2080 isn't enough for Ultra 1080p 120Hz in some modern titles... let alone RTX on.

    It is a shame. Imagine the hate that would flow if car makers all decided that 300 hp was the most any future engine in a passenger car would ever have, and that any more than that was unnecessary (well, until next year's model had 320 hp anyway).

    That's fine if you don't see the point; you're free not to buy into it. But advocating that it shouldn't exist for anybody is approaching selfishness.

    I used to rag on idiots who bought huge giant laptops and peasants who bought old BMWs because they couldn't afford a new one. Now I own and love both for my own reasons, and dilligaf what the world thinks.
     
    Last edited: Jul 13, 2019
    aarpcard likes this.
  9. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    14
    Messages:
    31
    Likes Received:
    14
    Trophy Points:
    16
    I agree that Nvidia limiting things is bad. That's unjustifiable! Personally, I have yet to come across anything that wouldn't do 60+ FPS on a mobile RTX 2080... of course, ray tracing is a whole other issue. How many games (as of now) actually take advantage of it? My reason for using the RTX is not ray tracing, but to take advantage of the tensor cores!

    That's not quite a proper analogy, though. I'll use motorcycles for my example since I don't use cars. It used to be that you could buy a Hayabusa (my previous bike was an example) that came unrestricted and could be boosted to upwards of 480 hp. But there are strict limits on how fast motorcycles can be these days. My 2000 Busa could touch 350 km/h with a little coaxing of the gearbox, and without any super- or turbocharging. No modern Busa comes unrestricted. You could disable the rev limiter, but doing so is (mostly) illegal and immediately voids the warranty, and Suzuki resolutely refuses to service any super/hyper-bike that has the rev limiter disabled. Even the factory-supercharged Kawasaki H2 is limited to under 330 hp from the factory, and the H2R, still under 400 hp, is not street legal at all. So the idea of imposing limits is very common, and very necessary. The issue is whether it is being done for the right reasons or to introduce planned obsolescence.

    I am not at all recommending that SLI should be made "taboo"! I am just saying that from my perspective the decision to move away makes perfect sense. It doesn't have to be your view at all!
     
    Last edited: Jul 13, 2019
  10. x-pac

    x-pac Notebook Enthusiast

    Reputations:
    14
    Messages:
    31
    Likes Received:
    14
    Trophy Points:
    16
    Just FYI: Eurocom now offers you the ability to configure the SkyX9C with either a GTX 1080 SLI or two RTX 2080s for PhysX processing.
     