Clevo 2019

Discussion in 'Sager and Clevo' started by steberg, Jan 6, 2019.

  1. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    7,956
    Messages:
    50,236
    Likes Received:
    14,146
    Trophy Points:
    931
Which notebook is Apple putting the Radeon VII in?
     
    raz8020 likes this.
  2. redbytes

    redbytes Notebook Consultant

    Reputations:
    33
    Messages:
    196
    Likes Received:
    54
    Trophy Points:
    41
Not the Radeon VII specifically, but they're using AMD cards in their current lineup, aren't they? And I see no sign of them switching back to NVIDIA anytime soon...
     
    hmscott likes this.
  3. Meaker@Sager

    Meaker@Sager Company Representative

A 300 W chip is unusable given the market conditions; the smaller ones, not so much.

I can understand why they do it, too.
     
  4. m1key

    m1key Newbie

    Reputations:
    0
    Messages:
    8
    Likes Received:
    1
    Trophy Points:
    6
    Hi

Just wondering how come the Clevo 775TM-G with the i9 9900K and RTX 2080 only needs 1x 330 W power supply? I would have thought that wouldn't be enough to power the beast. I currently have an MSI GT75 and replaced the 2x 230 W with the Eurocom 780 W (which I know will work with the Clevo I plan to buy).
     
    hmscott likes this.
  5. Meaker@Sager

    Meaker@Sager Company Representative

Because at stock, and even with a tweak, it's enough.
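A rough power budget backs this up. All the wattages below are assumptions based on typical published TDP/TGP figures for these parts, not measurements of this specific machine:

```python
# Back-of-the-envelope power budget for a Clevo 775TM-G
# with an i9-9900K and a notebook RTX 2080.
# Every figure below is an assumed estimate, not a measured value.
ESTIMATED_DRAW_W = {
    "i9-9900K (sustained package power)": 95,   # assumed stock PL1
    "RTX 2080 (notebook TGP)": 150,             # assumed; not the 215 W desktop card
    "board, RAM, drives, fans": 40,             # assumed platform overhead
    "display panel": 15,                        # assumed
}

PSU_W = 330  # the single power brick from the post

total = sum(ESTIMATED_DRAW_W.values())
headroom = PSU_W - total
print(f"estimated total draw: {total} W, headroom: {headroom} W")
```

Under these assumptions the stock machine sits around 300 W, so a single 330 W brick covers it; heavy overclocking of both CPU and GPU at once is what pushes past it.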
     
  6. redbytes

    redbytes Notebook Consultant

I actually don't. Without CUDA, they missed the scientific-computing train. Nowadays you see far fewer Macs in computer science departments than you used to.
     
  7. Chastity

    Chastity Company Representative

    Reputations:
    1,201
    Messages:
    6,355
    Likes Received:
    203
    Trophy Points:
    231
Applications that only support CUDA and not OpenCL are the poorly supported ones, especially since AMD cards are compute beasts.
     
  8. Meaker@Sager

    Meaker@Sager Company Representative

The hassle with NVIDIA was just not worth it.
     
  9. redbytes

    redbytes Notebook Consultant

Erm, what? Probably we're talking about different things, because I don't know of any decent machine learning/deep learning library that's OpenCL based. TensorFlow, Theano, Caffe, etc. all require CUDA, and the ports to OpenCL are experimental at best. AMD cards may be compute beasts, but if you go and buy cloud GPU instances on AWS or Azure, you'll find only NVIDIA GPUs available. Why? No CUDA, no game, boy.

Don't get me wrong, I would absolutely love AMD-based competition in the deep learning field, and I would love even more to see an open standard like OpenCL beating the crap out of CUDA. But (and I speak for my field) I think they're years behind, both in terms of performance and of library support.

    (we're going off topic by the way)
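The split described above shows up at the API level: a CUDA-only library simply refuses to run without an NVIDIA device, while one with an OpenCL fallback can degrade gracefully. Here's a minimal sketch of that dispatch problem; the probe functions and backend names are illustrative stand-ins, not any real framework's API:

```python
# Illustrative sketch of backend dispatch in a hypothetical compute library.
# probe_cuda()/probe_opencl() stand in for real driver queries
# (e.g. cudaGetDeviceCount() or clGetPlatformIDs()); here they are
# hard-coded to simulate a machine with neither runtime installed.

def probe_cuda() -> bool:
    """Stand-in for an NVIDIA driver query."""
    return False  # simulated: no NVIDIA GPU present

def probe_opencl() -> bool:
    """Stand-in for an OpenCL platform query."""
    return False  # simulated: no OpenCL runtime present

def select_backend(require_cuda: bool) -> str:
    """Pick a compute backend, or fail if the library hard-requires CUDA."""
    if probe_cuda():
        return "cuda"
    if require_cuda:
        # The CUDA-only-library case: no fallback, just an error.
        raise RuntimeError("CUDA required but no NVIDIA device found")
    if probe_opencl():
        return "opencl"
    return "cpu"

print(select_backend(require_cuda=False))  # portable library: falls back to CPU
```

With `require_cuda=True` the same call raises instead of falling back, which is effectively what a CUDA-only framework does on AMD hardware.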
     
  10. redbytes

    redbytes Notebook Consultant

    What kind of hassle? Serious question, I'm not trolling.
     