Google TPU vs. Nvidia V100

Discussion in 'Gaming (Software and Graphics Cards)' started by hmscott, May 19, 2017.

  1. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    2,595
    Messages:
    10,583
    Likes Received:
    11,839
    Trophy Points:
    931
    Google as the GPU maker? GPU => Google Processing Unit? :eek:

    Google 'Cloud TPU' takes machine learning lead from Tesla V100
    Mark Tyson on 18 May 2017, 10:01
    http://hexus.net/tech/news/industry/105838-google-cloud-tpu-takes-machine-learning-lead-tesla-v100/

    "...As CNBC reports, the reveal of the Google Cloud TPU last night is "potentially troubling news for Nvidia, whose graphics processing units (GPUs) have been used by Google for intensive machine learning applications."

    In its most recent financial report Nvidia pointed to fast growth in revenues from AI and deep learning, and even cited Google as a notable customer.

Now Google has indicated that it will use its own TPUs more heavily in its core computing infrastructure. Google is also creating the TensorFlow Research Cloud, a cluster of 1,000 Cloud TPUs that it will make available to top researchers for free...

    Last but not least, Google is happy to help with software and will bring second-generation TPUs to Google Cloud for the first time, as Cloud TPUs on GCE, the Google Compute Engine. This will let customers mix and match Cloud TPUs with Skylake CPUs, Nvidia GPUs, and the rest of Google's infrastructure and services to build ML systems.

So how do Google's new Cloud TPUs perform? Google says each TPU module (pictured) can deliver up to 180 teraflops of floating-point performance. Each module carries 4x Cloud TPU chips (45 teraflops each). These devices are designed to work in larger systems; for example, a 64-module 'TPU pod' can apply up to 11.5 petaflops of computation to a single ML (machine learning) training task.

Roughly comparing a Cloud TPU module against the Tesla V100 accelerator, Google wins on paper: six times the V100's FP16 half-precision throughput, and 50 per cent more than its 'Tensor Core' throughput. Google has yet to share inference performance figures for the new Cloud TPU.

    Furthermore, Cloud TPUs "are easy to program via TensorFlow, the most popular open-source machine learning framework," says Google."
    [images: Cloud TPU module photos]
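The quoted figures can be sanity-checked with a little arithmetic. The TPU numbers below come straight from the article; the V100 baseline figures (roughly 30 TFLOPS general FP16, 120 TFLOPS Tensor Core) are an assumption based on Nvidia's GTC 2017 announcement, not something the article states:

```python
# Sanity-check of the throughput figures quoted above (all values in TFLOPS).
TPU_CHIP = 45.0             # per Cloud TPU chip, per Google
TPU_MODULE = 4 * TPU_CHIP   # four chips per module -> 180 TFLOPS
POD = 64 * TPU_MODULE       # 64-module pod

print(f"module: {TPU_MODULE:.0f} TFLOPS")        # 180 TFLOPS
print(f"pod: {POD / 1000:.1f} PFLOPS")           # 11.5 PFLOPS

# V100 figures as announced at GTC 2017 (assumption, not from the article):
V100_FP16 = 30.0      # general-purpose FP16 throughput
V100_TENSOR = 120.0   # Tensor Core mixed-precision throughput

print(f"vs V100 FP16: {TPU_MODULE / V100_FP16:.0f}x")         # 6x
print(f"vs V100 Tensor Core: {TPU_MODULE / V100_TENSOR:.1f}x")  # 1.5x
```

Both of the article's claims check out against those assumed V100 numbers: 180/30 gives the "six times" FP16 figure, and 180/120 gives the "50 per cent faster" Tensor Core figure.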

    Googleflops?
     
    Last edited: May 19, 2017
  2. hmscott

    hmscott Notebook Nobel Laureate

    Ashtrix and jaybee83 like this.
  3. Vasudev

    Vasudev Notebook Deity

    Reputations:
    121
    Messages:
    1,245
    Likes Received:
    397
    Trophy Points:
    101
Maybe it's Vega licensed to Google.
     
    hmscott likes this.
  4. Prema

    Prema Little Bios Mod

    Reputations:
    5,850
    Messages:
    4,902
    Likes Received:
    10,147
    Trophy Points:
    581
    AI vehicles
    AI advertisement
    AI designers
    AI politicians
    AI judges
    AI doctors
    ...

Looks like society in general will be less and less capable of thinking for itself, and more and more reliant on the tech doing it for them...

    I for one like to drive vehicles myself and make my own decisions...and so does brother Jen. ;)
     
    Ashtrix, Vasudev, jaybee83 and 2 others like this.
  5. hmscott

    hmscott Notebook Nobel Laureate

    AI BIOS Builders / Tuners? ;)
     
  6. Prema

    Prema Little Bios Mod

Not far away from that; just wait for the next version of Turbo Boost to train itself according to each specific chip's potential instead of following fixed variables...
    Remember my words: it'll be even more boring for the user to take the back seat, as there will be no more room for manual tweaking... the silicon lottery will be the only variable left.
     
    Ashtrix, jaybee83 and hmscott like this.
  7. hmscott

    hmscott Notebook Nobel Laureate

    Soooo... it'll all just be "fun and games" moving forward? :confused:

    Sounds frighteningly horrible :D
     
  8. TBoneSan

    TBoneSan Laptop Fiend

    Reputations:
    4,059
    Messages:
    5,324
    Likes Received:
    5,276
    Trophy Points:
    681
A part of me would die if I had an AI-driven car. Give me a stick shift and a winding road any day.
     
    Ashtrix, triturbo and hmscott like this.
  9. ChanceJackson

    ChanceJackson Notebook Evangelist

    Reputations:
    25
    Messages:
    380
    Likes Received:
    155
    Trophy Points:
    56
I believe I read that it was an ASIC.
     
  10. Vasudev

    Vasudev Notebook Deity

Maybe an ASIC/GPU designed for one thing: machine learning.
     