Video transcoding, 1080 vs. 1070 ti vs 2070

Discussion in 'Gaming (Software and Graphics Cards)' started by rlk, Jan 1, 2019.

  1. rlk

    rlk Notebook Evangelist

    Reputations:
    97
    Messages:
    478
    Likes Received:
    233
    Trophy Points:
    56
    I'm looking to buy a (desktop) GPU for video transcoding (not gaming). I'm running Linux on an i7-5820K, and I want to be able to transcode multiple videos (up to maybe 4-5) concurrently at better than real-time performance, although not for actual real-time use. The goal here is reasonably high quality (constant quality), not streaming. In general, the vast majority of published reviews don't go into video encoding capabilities at all.

    I've run some tests on my laptop, a Lenovo ThinkPad P70 with a Xeon E3-1505M v5 and a Quadro M4000M. H.264 lets me get FHD frame rates up to about 11x real time, or 330 fps, but the quality isn't there. Using HEVC I can get good quality, but only about 140-150 fps. Running the i7-5820K overclocked to 4.2 GHz, I can get slightly better than 4x net performance.
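
    For anyone wanting to try the same kind of constant-quality HEVC test, here's a sketch of an ffmpeg NVENC invocation. The file names, the qp value, and the preset are placeholder assumptions, not my exact settings:

    ```shell
    #!/usr/bin/env bash
    # Sketch: constant-quality HEVC encode via NVENC in ffmpeg.
    # Requires an ffmpeg build with NVENC support; qp 23 and preset slow
    # are illustrative values, not tuned recommendations.
    in="input_fhd.mkv"      # placeholder input
    out="output_hevc.mkv"   # placeholder output
    cmd=(ffmpeg -y -i "$in" -c:v hevc_nvenc -rc constqp -qp 23 -preset slow -c:a copy "$out")
    printf '%s\n' "${cmd[*]}"   # dry run: print the command
    # "${cmd[@]}"               # uncomment to actually run (needs an NVENC-capable GPU)
    ```

    ffmpeg prints the achieved fps on the status line while it runs, which is where my numbers above come from.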

    I'm having trouble sorting through the various claims; there's seemingly contradictory information between the NVIDIA support matrix (https://developer.nvidia.com/video-encode-decode-gpu-support-matrix) and a test that somebody ran (https://devtalk.nvidia.com/default/topic/987460/nvdec-cuda-nvenc-speed-comparison/) with respect to the GTX 1070 (and no clear mention of the 1070 Ti). The latter claims the 1080 has 2x the performance of the 1070 and 1060, again with no mention of the 1070 Ti. The former claims that the 1070, 1080, and 1080 Ti all have two NVENC units vs. one on the 1060. If the 1070 (or 1070 Ti) has the same encoding performance as the 1080, then that's the way to go. The latter chart does list the Quadro M4000, but that's not the same as the M4000M.

    The RTX 2070 might be an option, but those are still expensive.

    I'm aware that the non-Quadro cards are limited (artificially) to two concurrent sessions, but that can be handled by simply sequencing the operations.
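
    A sketch of how that sequencing might look as a batch script, capping concurrency at two to match the session limit. The sleep/echo is a stand-in for the real ffmpeg call, and `wait -n` needs bash 4.3+:

    ```shell
    #!/usr/bin/env bash
    # Sketch: run transcode jobs at most two at a time, matching the GeForce
    # NVENC concurrent-session cap. The sleep/echo is a placeholder for the
    # real ffmpeg invocation.
    transcode_all() {
      local max_jobs=2 f
      for f in "$@"; do
        while [ "$(jobs -pr | wc -l)" -ge "$max_jobs" ]; do
          wait -n    # block until one background job exits (bash 4.3+)
        done
        { sleep 0.1; echo "done $f"; } &   # placeholder transcode job
      done
      wait           # let the remaining jobs drain
    }

    transcode_all a.mkv b.mkv c.mkv d.mkv
    ```
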

    It sounds like the nvenc capabilities are significantly better on Pascal than Maxwell, so getting an older GTX 980 wouldn't be a very good solution.

    Does anyone have experience with nvenc (particularly generating HEVC, or is the H.264 quality significantly better) on the Pascal GTX adapters?
     
  2. yrekabakery

    yrekabakery Notebook Virtuoso

    Reputations:
    790
    Messages:
    2,447
    Likes Received:
    2,319
    Trophy Points:
    181
    I’d get the 2070. NvENC on Turing seems like a significant improvement over Pascal, which in turn was a significant improvement over Maxwell.
     
  3. rlk

    rlk Notebook Evangelist

    I've seen some conflicting information on that -- do you have personal experience with it?
     
  4. yrekabakery

    yrekabakery Notebook Virtuoso

    Personal experience with Turing, no, just reading the whitepapers.
     
  5. pete962

    pete962 Notebook Evangelist

    Reputations:
    112
    Messages:
    473
    Likes Received:
    211
    Trophy Points:
    56
    According to the link you provided, all RTX cards have one NVENC chip, the GTX 1060 has one, the 1070 and 1080 have two, and the Titan V has three, but the footnote says RTX NVENC is about twice as fast as Pascal's. Based on specs alone, the Titan V should be the fastest, the 1080/1070 more or less as fast as the new RTX cards, and the GTX 1060 the slowest. The most cost-efficient seems to be the GTX 1070; you could probably buy two for the price of one RTX card. They all have one decoder, but I think encoding is most of the work.

    If you posted a link to a sample video along with the encoder settings you're using, maybe members with particular systems could run a quick test for you, to give you some idea of real-life performance and confirm the numbers.

    Also, I read something about the Bitcoin-mining PC market collapsing, with systems that used to cost $600 now going for $100. I don't know much about it, but I think those machines use NVIDIA cards, so it may be worth looking into.
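
    For a shared test like that, something throughput-only would keep results comparable across machines: encode to the null muxer so disk I/O doesn't skew the fps figure ffmpeg reports. The sample file name and qp value here are placeholder assumptions:

    ```shell
    #!/usr/bin/env bash
    # Sketch: measure raw NVENC encode throughput by discarding the output
    # (-f null -) so disk speed doesn't affect ffmpeg's reported fps.
    # -benchmark adds timing stats at the end of the run.
    cmd=(ffmpeg -benchmark -i sample_fhd.mkv -c:v hevc_nvenc -rc constqp -qp 23 -f null -)
    printf '%s\n' "${cmd[*]}"   # dry run; run "${cmd[@]}" on a machine with NVENC
    ```
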
     
  6. rlk

    rlk Notebook Evangelist

    One possible wildcard is the 2060, when it comes out soon (this is not especially urgent). If that has the same nvenc/nvdec hardware as the 2070/2080, that would be an obvious choice, since I don't otherwise need that kind of power.

    With my Quadro M4000M -- again, that's rather old by now -- I get very poor quality (mostly blockiness) at any reasonable bit rate or CQ setting using H.264. Using H.265 I do get better quality that will likely be usable, but the performance is quite low compared to H.264 (150 fps FHD vs. 300).

    So I guess what it comes down to is how much better the Pascal NVENC is (both performance- and quality-wise) than Maxwell's, and ditto Turing vs. Pascal, if anyone has tried it. NVENC/NVDEC seems to be a real afterthought in the reviews I've found.
     