Intel-AMD Multi-Chip Module (MCM) Laptops and Desktops

Discussion in 'Hardware Components and Aftermarket Upgrades' started by hmscott, Nov 11, 2017.

  1. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    3,627
    Messages:
    13,511
    Likes Received:
    15,627
    Trophy Points:
    931
    First photos of the new Intel-AMD multi-chip module
    The first images of the new Intel-AMD multi-chip module (MCM) have surfaced in the wild, and going by the looks of it, the package appears to be exactly what ultrabooks and mini-PCs have been longing for all this while.
    https://www.notebookcheck.net/First...l-AMD-multi-chip-module-surface.263224.0.html
    [Images: Intel-AMD MCM package photos]
    Source(s)
    Bits and Chips on Twitter
    Guru3D
    ChipHell

    Rivals with benefits — Intel and AMD collaborate to create a new entrant to the 8th generation CPU family
    In an interesting development, CPU arch-rivals Intel and AMD have collaborated to create a unique new entrant to Intel's 8th generation CPU family: a combination of an Intel H-series chip coupled to...wait for it...a custom AMD Radeon GPU with HBM2 memory!
    https://www.notebookcheck.net/Rival...o-the-8th-generation-CPU-family.262554.0.html

    "Even the worst of rivalries can end in friendships. Intel's latest announcement just proves this point. After a number of rumors and subsequent denials, Intel's Vice President of Client Computing Group and General Manager of the Mobile Computing Platform, Christopher Walker, finally announced that Intel and AMD have collaborated together to create a new die design that combines an Intel H-series CPU and a custom-to-Intel AMD Radeon GPU along with High Bandwidth Memory 2 in a single package. This new design is facilitated by Intel's Embedded Multi-Die Interconnect Bridge (EMIB) technology that allows heterogeneous silicon to exchange data in extremely close proximity.

    EMIB has traditionally been a server-class FPGA packaging technology, and ever since its benefits became clear, there has been speculation about how it could make its way to the consumer side of things. Intel's integrated graphics solutions, though good, are clearly not enough to handle escalating modern workloads combined with the necessity for a thin form factor. AMD's new Vega architecture has been successfully implemented in the new Ryzen Mobile CPUs, so this unique combination could be a win-win for both companies and customers."

    "Walker said the following about the new die —

    "The new product, which will be part of our 8th Gen Intel Core family, brings together our high-performing Intel Core H-series processor, second generation High Bandwidth Memory (HBM2) and a custom-to-Intel third-party discrete graphics chip from AMD’s Radeon Technologies Group* – all in a single processor package.

    ... Now, we’re opening the door for thinner, lighter devices across notebooks, 2 in 1s and mini desktops, while delivering incredible performance and graphics for enthusiasts."

    Vice President and General Manager of AMD's Radeon Technologies Group, Scott Herkelman, said,

    "Our collaboration with Intel expands the installed base for AMD Radeon GPUs and brings to market a differentiated solution for high-performance graphics. Together we are offering gamers and content creators the opportunity to have a thinner-and-lighter PC capable of delivering discrete performance-tier graphics experiences in AAA games and content creation applications."

    Supposed Intel-AMD Core i7-8705G benchmarks rival the GTX 1050 in performance
    The Intel-AMD Kaby Lake-R and Polaris/Vega combo could potentially outperform the current console generation and the GeForce MX150 based on unconfirmed reports.
    https://www.notebookcheck.net/Suppo...val-the-GTX-1050-in-performance.262790.0.html

    "With the initial launch supposedly just months away, more concrete leaks are now hinting at what we can expect to come. The latest is a set of unidentified 3DMark 11 scores that are purportedly the upcoming Core i7-8705G and i7-8809G with the unnamed AMD Radeon Polaris/Vega graphics. The 3DMark 11 Graphics Performance scores sit at 3777 points and 3960 points for the i7-8705G and i7-7709G, respectively, to be a step below the average GTX 950M at 4357 points but well above the average GeForce 940MX at 2560 points. These scores, of course, are lower than expected given that the integrated Iris Plus Graphics 640 is already comparable to the GTX 850M and 950M when gaming. Nonetheless, the scores show we can expect just a minor GPU performance dip of only about 5 percent between the higher-end i7-7709G and lower-end i7-8705G.

    The next set of unconfirmed benchmarks is potentially more interesting, as it involves a repeatable scene in a shipped game. As leaked on Reddit and reported by Hot Hardware, the i7-8705G returns an average of 33.5 FPS in the built-in Ashes of the Singularity benchmark at 1080p High settings. Compared to our own benchmark tests, the result is nearly identical to the GTX 1050 (29.5 FPS) and, by extension, the GTX 965M. The more recent MX150 runs significantly worse at 17.1 FPS in the same benchmark test. Ashes of the Singularity is quite CPU-intensive as well, since it is an RTS title that benefits directly from higher processor core counts. Nonetheless, a wider variety of benchmarks will be necessary before drawing any bigger-picture conclusions."
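
    And the same sort of arithmetic for the frame rates (again just a sketch over the unconfirmed numbers quoted above):

        # Normalize the quoted Ashes of the Singularity averages (1080p High,
        # built-in benchmark) against the GTX 1050. Leaked/unconfirmed numbers.
        fps = {"i7-8705G (leak)": 33.5, "GTX 1050": 29.5, "GeForce MX150": 17.1}
        baseline = fps["GTX 1050"]
        for name, value in fps.items():
            print(f"{name}: {value:.1f} FPS ({value / baseline:.0%} of GTX 1050)")
        # -> the leaked chip lands ~14% ahead of the GTX 1050 and roughly 2x the MX150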
    [Images: leaked benchmark screenshot; Ashes of the Singularity 1080p High comparison chart]
     
    Last edited: Nov 14, 2017
  2. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    4,330
    Messages:
    11,844
    Likes Received:
    1,874
    Trophy Points:
    631
    Thank you for the non-video reporting! :)

    What this will surely bring is the death of 1366x768 displays. Finally.

    This may be what my next 'digital' notebook will be based on. ;)

    The author is dreaming and/or out of touch with where Intel wants to take this. This isn't the end of their own iGPUs - not by a long shot.

    He miscalled this by a mile. :rolleyes:

    From the link in the first post:
     
    Last edited by a moderator: Nov 13, 2017
  3. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    3,627
    Messages:
    13,511
    Likes Received:
    15,627
    Trophy Points:
    931
    Let's hope he got it right, and Intel goes back to growing the CPU silicon - dropping that parasitic iGPU for good.
     
  4. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    4,330
    Messages:
    11,844
    Likes Received:
    1,874
    Trophy Points:
    631
    lol... now you're dreaming too.

    That 'parasitic' iGPU is what made mobile computers mobile for me and many, many others.

    A dGPU is very much unneeded - along with its battery-taxing and throttle-inducing antics - in the majority of mobile use cases.

    And yeah; gaming isn't even close to being 'some', let alone 'most'. :)

     
  5. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    3,627
    Messages:
    13,511
    Likes Received:
    15,627
    Trophy Points:
    931
    That Intel iGPU was the biggest mistake Intel ever made, and if they hadn't taken on that folly we'd all be a lot better off.

    The iGPU stole power and thermal budget from the gem of Intel, the CPU - causing Intel to putter in place for years.

    Intel lost their way for a long time; with this MCM project, Intel may be back on the right path.
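
    To put toy numbers on that budget-sharing claim (a minimal sketch; every figure below is invented for illustration, not measured):

        # Toy model of a fixed package power budget shared between CPU cores
        # and the iGPU. All numbers are made up for illustration; real Intel
        # power management (RAPL, turbo budgets) is dynamic and far more
        # sophisticated than a simple subtraction.
        PACKAGE_LIMIT_W = 15.0  # hypothetical U-series package power limit

        def cpu_budget(igpu_draw_w: float) -> float:
            """Watts left for the CPU cores once the iGPU takes its share."""
            return max(PACKAGE_LIMIT_W - igpu_draw_w, 0.0)

        for igpu_w in (0.0, 5.0, 10.0):
            print(f"iGPU drawing {igpu_w:>4.1f} W leaves {cpu_budget(igpu_w):.1f} W for the CPU")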
     
    Last edited: Nov 11, 2017
    triturbo likes this.
  6. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    4,330
    Messages:
    11,844
    Likes Received:
    1,874
    Trophy Points:
    631
    Ha! Take the best CPU you can (yeah; go ahead, pick the TR or EPYC ones...) and put it in a mobile system.

    UNUSABLE.

    Even Intel's silly little igpu would make it a real computer.

    But you'll tell me you'd pick something better than an igpu anyway. Good luck powering it for more than a few minutes. ;)

    There isn't one right 'path', nor one right platform.

    Give credit where credit is due. :)

     
  7. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    3,627
    Messages:
    13,511
    Likes Received:
    15,627
    Trophy Points:
    931
    At least you got that part right :)

    There were and would have been many other solutions for mobile GPUs, more innovation in different directions - better ones would prevail over time.

    Intel stifled innovation in mobile GPUs by taking them on themselves and blocking integration of higher-performance GPUs with the CPU. It's clear they now know they made a big mistake, and as hard as it is to admit it and turn that big lumbering mistake around, Intel is doing it.

    Let's stop talking about it, and see what happens, ok? :)
     
    Last edited: Nov 13, 2017
  8. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    4,330
    Messages:
    11,844
    Likes Received:
    1,874
    Trophy Points:
    631
    But we're here to talk about stuff. ;)

    That silly little GPU that Intel has been making a fool of themselves with has probably sold more units and made more money for Intel than what other pure GPU makers have made. You can laugh; Intel is laughing too, all the way to the bank.

    I sincerely hope Intel keeps stifling innovation like you seem to think. We're all the better off for it.

    It truly is amazing how people can look at the same thing and see completely opposite things, huh?

    Like I already said; the better one has already prevailed. Best CPU with just enough graphics horsepower to give humans the video output they need. They knew how slow it would be at the start, but they're going to slingshot past anything in the solar system as they pick up exponentially more speed.

    How are they doing this? Because they have a plan. Their plans don't span the timelines consumers might like or appreciate, but they do follow through where and when they can.

    They plan long and they plan cohesively. Sure; not all work out, but when they do, it isn't a fluke. It is their hard work and dedication paying off.

    Dedication to their investors and their customers - even when those seem to be at opposite ends sometimes.

    I can easily see how the power of a dGPU is needed for certain workflows/payloads - but I can also see it being implemented much better than what anyone up to now has offered us.

    I trust Intel to be going into this venture with Raja with not only open eyes, but also a clear goal of what needs to be accomplished by him and the team they let him build.

    Just as FPUs were once optional on a platform, GPU 'compute' power is optional right now for many, many workloads. Where Intel is headed is to change that. Disruptively.
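
    A minimal sketch of that 'optional capability' point, using PyTorch purely as a familiar stand-in (my choice of library, not anything from the thread); the pattern is the same with any compute API:

        # Code still has to ask whether GPU compute exists, much like old code
        # once asked whether an FPU was present. PyTorch is used here only as
        # a familiar example of the runtime-capability-check pattern.
        import torch

        device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
        x = torch.randn(1024, 1024, device=device)
        y = x @ x  # runs on the GPU if one is available, otherwise on the CPU
        print(f"matmul ran on: {y.device}")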

    Combined with all the other parts of the platform that Intel-based systems offer consumers, this will be an order-of-magnitude different computing world for mere humans - once the programming and the O/S subsystems have been put in place.

    You can look at Intel's slow, methodical journey with iGPUs as a mistake. But you'll only be right if Intel stops producing and innovating with GPUs today.

    Given the environment right now over at Intel; that isn't happening soon.

     
  9. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    3,627
    Messages:
    13,511
    Likes Received:
    15,627
    Trophy Points:
    931
    Some interesting takes on Raja's move, Intel's MCM use of Vega / Polaris GPUs, etc. I'll add more here as I find them.

    HW News: Radeon Chief Leaves AMD for Intel, RAM Supply Surge

    The Raja Saga: From ATI, AMD, Apple and back again to the Stars (Vega & Navi) & Intel

    AMD Inside - The Beginning of the End for Nvidia in PC?


    Raja Koduri Joins Intel as Chief Architect to Drive Unified Vision across Cores and Visual Computing
    Intel to Expand Strategy to Deliver High-End, Discrete Graphics Solutions
    https://newsroom.intel.com/news-releases/raja-koduri-joins-intel/
     
    Last edited: Nov 14, 2017
  10. tilleroftheearth

    tilleroftheearth Wisdom listens quietly...

    Reputations:
    4,330
    Messages:
    11,844
    Likes Received:
    1,874
    Trophy Points:
    631
    You posted an hour of video when I'm heading out the door soon...

    Along with the video, can you at least post your reasons for the 'interesting' part(s) in a few short sentences? :)

    btw, even when I get back to my computer - I still won't have time to watch an hour of fluff for 10 seconds' worth of 'ahhh!'.

     
    hmscott likes this.