Graphics going to GDDR 6 next year

Discussion in 'Gaming (Software and Graphics Cards)' started by 3Fees, Apr 24, 2017.

Thread Status:
Not open for further replies.
  1. Ashtrix

    Ashtrix ψυχή υπεροχή

    Reputations:
    2,366
    Messages:
    2,079
    Likes Received:
    3,247
    Trophy Points:
    281
    This was expected; it's no surprise, the news came long back in August 2016. HBM2 made it only onto the GP100 chip, which is not a mass-market part, and HBM can hardly be overclocked at all. Also, with the memory stacks sitting right next to the die, more heat may transfer into the heatsink, hurting overclocking headroom. That alone is reason enough to skip HBM in the mobile market, where we are now going in reverse (look at that Triton 700 - a gimped machine that banks on its 19mm thickness with a GTX 1080, lol. I bet it'd perform at 1070 level or worse depending on the temps), advertising thinness and buzzwords over performance stability and holding clocks at optimal temps, sigh.. (Side note - power consumption is one of HBM's advantages, but AMD's Fiji needs a power grid to run it lol despite HBM; that's the reason AMD is on this path.)

    G6X will come too, and iirc/afaik G6 beats first-gen HBM by a huge margin: 512 GB/s for HBM vs. around 760 GB/s for G6 on a 384-bit bus. Another point is heatsink and OEM R&D costs, which they always want to keep as low as possible. The mobile market has followed the same standard design principles for a decade now, but the desktop market is a whole other case. Changing that would mark a new era, but at the expense of a lot of resources it doesn't seem plausible - this is the corporate POV, of course. If they can cool it with more robust cooling, efficient non-BGA machines with better power delivery systems would pan out far better for consumers in terms of servicing and upgrades, and might show us a new standard, with caveats. All of that was thought of when the MXM standard changed, along with the speculation about the future of GPUs after Pascal. But let's save this talk for another day..

    Bring it on !!
     
    Last edited: Apr 29, 2017
    Cloudfire likes this.
  2. Cloudfire

    Cloudfire (Really odd person)

    Reputations:
    7,279
    Messages:
    10,304
    Likes Received:
    2,878
    Trophy Points:
    581
    You don't need HBM for any graphics card that renders games.
    GDDR6 will not only be cost effective but will be readily available compared to HBM, which, by the way, requires a complete overhaul of existing PCB and package design. It requires changes from almost everyone involved, which makes it a difficult transition when you ship millions upon millions of graphics cards to the market.

    448 GB/s on a 256-bit GDDR6 card should be more than enough when the GTX 1080 offers 320 GB/s with GDDR5X.
    For 384-bit gamer cards with GDDR6 we are talking around 740 GB/s. No way you will ever need more than that; it's overkill.
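    The figures above all fall out of the same formula: peak bandwidth = (bus width in bits / 8) × per-pin data rate. A minimal sketch, assuming 10 Gbps per pin for the GTX 1080's GDDR5X and ~14-15.4 Gbps for GDDR6 (the helper name is hypothetical, just for illustration):

    ```python
    # Hypothetical helper: peak memory bandwidth from bus width and per-pin rate.
    def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
        """GB/s = (bus width in bits / 8 bits per byte) * per-pin rate in Gb/s."""
        return bus_width_bits / 8 * data_rate_gbps

    print(peak_bandwidth_gbs(256, 10))    # GTX 1080 GDDR5X: 320.0 GB/s
    print(peak_bandwidth_gbs(256, 14))    # 256-bit GDDR6:   448.0 GB/s
    print(peak_bandwidth_gbs(384, 15.4))  # 384-bit GDDR6:   ~740 GB/s
    ```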

    HBM is better in bandwidth than GDDR5X and GDDR6, but it belongs on professional cards that actually need the bandwidth and are produced in far smaller numbers than gamer cards for the average Joe.
     
  3. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,867
    Trophy Points:
    931
    Stop posting sensible facts. This is the internet.
     
    Starlight5, TBoneSan and Ashtrix like this.
  4. Ionising_Radiation

    Ionising_Radiation ?v = ve*ln(m0/m1)

    Reputations:
    750
    Messages:
    3,242
    Likes Received:
    2,646
    Trophy Points:
    231
    See, I never said anything about the bandwidth. I was talking about PCB footprint, where HBM certainly wins out. Especially in the mobile gaming space where, well, space is at a premium, that's all the more reason we ought to be pushing towards HBM instead of using oddly-shaped MXM cards like Clevo, MSI and Asus are doing just to fit desktop power into a mobile device. There have also been attempts to introduce low-cost HBM into the mass market. I, for one, would like to see an MXM-A GPU as powerful as a GTX 1070, for example, which cannot be done with the current GDDR standard.
     
    Starlight5 likes this.