AMD's Ryzen CPUs (Ryzen/TR/Epyc) & Vega/Polaris/Navi GPUs

Discussion in 'Hardware Components and Aftermarket Upgrades' started by Rage Set, Dec 14, 2016.

  1. Papusan

    Papusan JOKEBOOK's Sucks! Dont waste your $$$ on FILTHY

    Reputations:
    24,194
    Messages:
    24,298
    Likes Received:
    42,469
    Trophy Points:
    931
    AMD Shares Details on Radeon GM's Departure Tomshardware.com | December 14, 2018
    AMD's Radeon Technology Group (RTG) has seen plenty of shakeups this year as it has lost several key players to Intel, like Raja Koduri and Chris Hook, among others. Now Mike Rayfield, the Senior Vice President and General Manager of RTG, has announced his retirement.


    EXCLUSIVE: AMD RTG Boss Mike Rayfield Retires Amidst Chatter About ‘Disengaged Behavior’ wccftech.com | December 14, 2018

    Mike Rayfield will be officially retiring at the end of the year and David Wang is going to be taking the interim leadership position for Radeon Technologies Group while the company searches for new leadership. This would mean interesting things for RTG as the leadership dynamics change just over a year after Raja Koduri left for Intel.

When will David Wang leave the AMD ship? :rolleyes:



     
    Robbo99999 likes this.
  2. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,520
    Messages:
    9,502
    Likes Received:
    4,919
    Trophy Points:
    431
I just do not get it. For Intel, these hires are presumably just meant to boost its iGPs, not mainstream GPU graphics. Even assuming they can accomplish this, they still have a ways to go to compete with AMD. Intel sorely needs to concentrate on its core business, CPUs, as AMD is rounding the corner.
     
    hmscott likes this.
  3. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    6,862
    Messages:
    20,217
    Likes Received:
    25,040
    Trophy Points:
    931
Exactly, Intel is hitting a rough patch that only CPU innovation and investment can help Intel dig its way out of and back to a leading position in the market. Success at 10nm and 7nm is crucial.

    For Intel making a new dGPU isn't crucial.

Intel iGPUs / dGPUs as investments are a distraction and a financial burden with negligible payoff. Even if Intel could pull a rabbit out of a hat and leapfrog AMD GPU performance - past the new releases from AMD *next* year, not just the existing AMD GPUs - Intel GPUs would still land short of Nvidia GPU performance.

What AMD splits off from Nvidia's market share is so minuscule as to be negligible to a company the size of Intel - that small low-end market keeps AMD going, but it's not going to be large enough to give Intel the ROI it needs for such a long, sustained, expensive development effort.

Most of AMD's GPU market is based on investment gained through custom development for console companies, and AMD then uses both that and the discrete GPU market to spread the costs and profit. Intel won't have that console market to sustain the development of the low-end GPUs it could produce.

Just like Intel's Larrabee, the new Intel Arctic Sound GPUs will underperform, arrive incomplete on the software side, cost too much to build on the hardware side, and still be a year or two late to market - well behind the competition and not worth the investment to take to production.

Intel won't have the "cojones" needed to stick with this GPU effort and sustain it for the 2-3 generations of gradual catch-up Intel would need to cycle through to gain the IP for a truly outstanding, class-leading GPU that competes on features, performance, and cost. That's a lot of years of losses for Intel to eat before a payout happens.

Maybe the effort will generate a co-processor die as an iGPU chiplet for CPU substrates - greater in size, performance, and power draw than the current iGPU - but to make the leap to a competitive full discrete GPU, I just don't see Intel being able to sustain the effort long enough.

    Intel will fold their GPU effort long before it could pay off in the dGPU market.
     
    Last edited: Dec 15, 2018
  4. bennyg

    bennyg Notebook Virtuoso

    Reputations:
    1,552
    Messages:
    2,347
    Likes Received:
    2,344
    Trophy Points:
    181
Since it's being discussed as a theory, it must at least be possible to physically cut a die into four through the middle of the active logic.

    Is this a common thing?

Any idea what kind of tolerances would have to be built in - say, a big gap where the physical cut would occur, or extra logic to tolerate now-incomplete links to other parts of the die?
     
  5. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    6,862
    Messages:
    20,217
    Likes Received:
    25,040
    Trophy Points:
    931
    AMD Adrenalin 2019 Drivers - Benchmarked & Explained!
    HardwareCanucks
    Published on Dec 13, 2018
    AMD is launching their new Adrenalin drivers today which include performance improvements, a new game streaming feature, updates to AMD Link and a ton of other cool stuff. Let's run some benchmarks and see what kind of improvements have been made! Download the drivers here: http://bit.ly/adrenalinAMD


    The Bring Up: Episode 4: AMD Radeon™ Software Adrenalin 2019 Edition
    AMD
    Published on Dec 13, 2018
    We Bring Up: 2019 Radeon™ driver updates, a trip down memory lane and a trip up north to AMD Markham
    00:55 What is a driver?
    02:40 Cavin and Bridget take a trip down 2013 Memory Lane
    05:19 Radeon™ Software Adrenalin 2019 Edition with Terry Makedon
    05:48 Radeon™ ReLive updates
    09:21 Radeon™ ReLive VR
    11:42 In Game Replay and GIF Support
    13:38 In-Scene Editor
    14:09 Radeon™ WattMan updates
    15:59 Voice Command in AMD Link
    17:16 Built-in Radeon™ ReLive Gallery

    How To Use Radeon Adrenalin Wattman Auto Overclocking
    WccftechTV
    Published on Dec 14, 2018
    In this quick tutorial we show you how to find and use the new AMD Radeon Adrenalin Wattman Auto Overclocking Utility. Performance Results: https://wccftech.com/amd-radeon-softw...
     
    Last edited: Dec 15, 2018
  6. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,586
    Messages:
    5,796
    Likes Received:
    8,189
    Trophy Points:
    681
It is, to a degree. You have to engineer traces that don't kill the chip when cut, leave clear setbacks along where you cut, engineer power delivery so it can be supplied to each quadrant both collectively and independently, etc. Doing it properly takes really good engineering, and it isn't done often because of what a pain in the butt it is.
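To make those constraints concrete, here's a toy sketch in Python - purely hypothetical, not AMD's actual flow or any real EDA tool. All of the coordinates, cut positions, the setback value, and the names (Trace, PowerPad, vdd0, etc.) are invented for illustration. It just flags traces that cross a planned cut line (or its keep-out setback) and checks that every quadrant has its own power connection:

```python
# Toy floorplan check (hypothetical): no trace may cross a planned cut line or
# its keep-out setback, and each quadrant needs its own power pad.
# Everything here (dimensions, names, values) is invented for illustration.
from dataclasses import dataclass

DIE = 100.0            # hypothetical 100 x 100 unit die
CUT_X = CUT_Y = 50.0   # planned cut lines through the middle
SETBACK = 2.0          # keep-out margin on each side of a cut line

@dataclass
class Trace:
    name: str
    x1: float
    y1: float
    x2: float
    y2: float

@dataclass
class PowerPad:
    name: str
    x: float
    y: float

def crosses_cut(t: Trace) -> bool:
    """True if the trace spans a cut line or enters its keep-out band."""
    if min(t.x1, t.x2) < CUT_X + SETBACK and max(t.x1, t.x2) > CUT_X - SETBACK:
        return True  # overlaps the vertical cut region
    if min(t.y1, t.y2) < CUT_Y + SETBACK and max(t.y1, t.y2) > CUT_Y - SETBACK:
        return True  # overlaps the horizontal cut region
    return False

def quadrant(x: float, y: float) -> int:
    """Quadrant index 0-3 for a point on the die."""
    return (1 if x >= CUT_X else 0) + (2 if y >= CUT_Y else 0)

def check(traces: list[Trace], pads: list[PowerPad]) -> None:
    for t in traces:
        if crosses_cut(t):
            print(f"violation: trace {t.name} crosses a cut line / setback")
    powered = {quadrant(p.x, p.y) for p in pads}
    for q in range(4):
        if q not in powered:
            print(f"violation: quadrant {q} has no independent power pad")

if __name__ == "__main__":
    check(
        traces=[Trace("clk_a", 5, 5, 45, 45), Trace("bus_b", 40, 10, 60, 10)],
        pads=[PowerPad("vdd0", 10, 10), PowerPad("vdd1", 90, 10), PowerPad("vdd2", 10, 90)],
    )
```

Real floorplanning obviously involves far more than this (clock domains, disabling dangling links, test access), but it shows why every cross-cut trace and per-quadrant power rail has to be planned up front.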
Dude, are you just letting a troll troll?

Literally, he is asking when David Wang will leave and calling AMD a sinking ship, when the article he posted gave a viable alternative explanation, all after trying to troll the same way yesterday. This is exactly why I left and don't visit this forum as much!

And the reason Intel is doing it is to diversify the product portfolio. They stayed on x86 too long, missed out on mobile, have Qualcomm - the beast that they are - competing on radios in that segment, are late and over budget on the Optane product lines, see a re-emergence of AMD in CPUs, see the PC market shrinking, have Apple possibly deciding to go in-house on processors by 2020, have Amazon designing their own ARM chip for AWS, have RISC-V chips made to order, have fallen behind on fab processes, etc. They didn't react to the warning signs when they should have. Now, instead of trying to predict the market, they are trying to disrupt a mature market which can bring meaningful capital because of brand loyalty and fanboys. It isn't hard to understand why the change.

It is also a plan to go after the embedded market, which kept AMD going in the hard years - trying to cut the feet out from under them - but AMD already has the two large contracts, which run something like 5-6 years. So...

Edit: And no, they had to go to AMD to get the performance necessary for their Hades Canyon NUC, since their iGP wasn't cutting it. But aside from iGPs, GPUs are taking over the computational processing in the server segment, which means it is a huge threat to their marketability down the road. Computing at that level means that if Intel doesn't do something, AMD with its superior PCIe, or ARM server chips, will eat away at them over time as the transition to GPGPU is already underway. Not only that, you have the rumors of Samsung exploring the GPU space - granted, for mobile - which hints at where everyone thinks things are going. Embedded and server are why Intel is doing it - both very large markets that are growing.

In fact, that is why I pointed to the list of people poached from AMD. If you look, most of it was AI and deep learning engineers. According to PricewaterhouseCoopers, one of the largest auditing firms in the world, using OECD methodology, 38% of US jobs, and around 30% of all jobs globally, will be replaced by AI and robotics by 2032. That is about 13 years from now. A Harvard professor estimated AI and robotics could take as much as 80% of jobs by 2050 (non-OECD methodology, though). What is driving that? GPUs in part, AI ASICs in another part, etc.

    So bringing it back to your question on Intel, they are playing catch up to one of the largest markets that is about to boom after missing out on mobile, etc.

Now I'm going to repost my comments from the Anand article on Intel's stacked chips, or at least the links to the numerous sources showing why Intel is headed where practically the entire industry is on 2.5D and 3D integration, then not visit this site for a while, because otherwise I'm fully de-activating my account. Peace.

    Sent from my SM-G900P using Tapatalk
     
    Last edited: Dec 15, 2018
    bennyg likes this.
  7. TANWare

    TANWare Just This Side of Senile, I think. Super Moderator

    Reputations:
    2,520
    Messages:
    9,502
    Likes Received:
    4,919
    Trophy Points:
    431
I am not saying AMD has issues, but why is Intel gobbling up GPU people? Other than for the sake of saying "we are gobbling up AMD talent wherever we can" to satisfy stockholders in the short term - since it does make it look like they are at least doing something - no one seems to know.

So you are both falling into their trap: arguing the actions, not the intent or consequence.
     
  8. jclausius

    jclausius Notebook Virtuoso

    Reputations:
    4,013
    Messages:
    3,037
    Likes Received:
    2,070
    Trophy Points:
    231
Hmmm... Before poo-poo'ing or dismissing the Intel GPU move, one should try to look 3-4 years out. Maybe it's not about gaming after all...

    "In The Era Of Artificial Intelligence, GPUs Are The New CPUs"
    - https://www.forbes.com/sites/janaki...elligence-gpus-are-the-new-cpus/#205ebba65d16

    "Intel's GPU is not what you think"


    Hmmmm...
     
    Last edited: Dec 15, 2018
    hmscott likes this.
  9. ajc9988

    ajc9988 Death by a thousand paper cuts

    Reputations:
    1,586
    Messages:
    5,796
    Likes Received:
    8,189
    Trophy Points:
    681
Did you guys even read what the hell I wrote?

"Edit: And no, they had to go to AMD to get the performance necessary for their Hades Canyon NUC, since their iGP wasn't cutting it. But aside from iGPs, GPUs are taking over the computational processing in the server segment, which means it is a huge threat to their marketability down the road. Computing at that level means that if Intel doesn't do something, AMD with its superior PCIe, or ARM server chips, will eat away at them over time as the transition to GPGPU is already underway. Not only that, you have the rumors of Samsung exploring the GPU space - granted, for mobile - which hints at where everyone thinks things are going. Embedded and server are why Intel is doing it - both very large markets that are growing.

In fact, that is why I pointed to the list of people poached from AMD. If you look, most of it was AI and deep learning engineers. According to PricewaterhouseCoopers, one of the largest auditing firms in the world, using OECD methodology, 38% of US jobs, and around 30% of all jobs globally, will be replaced by AI and robotics by 2032. That is about 13 years from now. A Harvard professor estimated AI and robotics could take as much as 80% of jobs by 2050 (non-OECD methodology, though). What is driving that? GPUs in part, AI ASICs in another part, etc.

    So bringing it back to your question on Intel, they are playing catch up to one of the largest markets that is about to boom after missing out on mobile, etc.

Now I'm going to repost my comments from the Anand article on Intel's stacked chips, or at least the links to the numerous sources showing why Intel is headed where practically the entire industry is on 2.5D and 3D integration, then not visit this site for a while, because otherwise I'm fully de-activating my account. Peace."

    This is LITERALLY the reason I'm moving on.
     
  10. jclausius

    jclausius Notebook Virtuoso

    Reputations:
    4,013
    Messages:
    3,037
    Likes Received:
    2,070
    Trophy Points:
    231
    Uh. Yes. That is why I used your post.

    You just said the ppl leaving AMD were mostly "AI and deep learning engineers", correct?

And why do you think those people were targeted? Also, didn't Intel snatch up a bunch of AMD folks? Does anyone really think Intel is interested in the gaming GPU market? Especially with the shrinking PC marketplace... My guess is home consoles can't be faring much better.

However, reading my own tea leaves, the computational needs of AI and larger data centers will emerge as a new, growing market. And right now it seems adding GPUs is one way to design systems that meet those computational needs. That is *my* best guess to explain the Intel GPU move.

    What in the...???

If the text in my post is causing you this much frustration, remember they're only words. I didn't attack you by calling you a fanboy, troll, senile, or even obtuse. They represent my own thoughts taking part in this discussion; they are not meant to inflict emotional or mental harm, but rather to transmit ideas.
     
    Last edited: Dec 15, 2018