*** Official Clevo P870DM/Sager NP9870-G Owner's Lounge - Phoenix has arisen! ***

Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by NordicRaven, Sep 22, 2015.

  1. Johnksss

    Johnksss .

    Reputations:
    11,536
    Messages:
    19,461
    Likes Received:
    12,842
    Trophy Points:
    931
    1: The problem with reading benchmark scores is that 99 percent of everyone reads them wrong. When you're chasing numbers, you look only at the overall score.

    How do those numbers relate to anything?

    2: When gaming is the main focus, you are only concerned about FPS.

    You are concerned with FPS ONLY, nothing else. That gives a far more useful number when you're trying to see where you stand or compare performance.


    Test 1 shows 120 FPS & Test 2 shows 101 FPS. That is the gamer's main concern when trying to see where one's machine stacks up.

    http://www.3dmark.com/fs/7059595

    Test 1 shows 80 FPS & Test 2 shows 66 FPS

    That is how you get more of a real-world idea. A 19k score means nothing other than going for records; the actual FPS average is what you really want to pay attention to.
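The point above can be sketched in a few lines: compare two runs by their graphics-test FPS averages rather than by the combined score. This is only an illustrative sketch using the FPS figures quoted in the post; `avg_fps` is a hypothetical helper, not anything from 3DMark itself.

```python
def avg_fps(test1_fps, test2_fps):
    """Simple mean of the two graphics tests' FPS for one run."""
    return (test1_fps + test2_fps) / 2

run_a = avg_fps(120, 101)  # first result quoted above
run_b = avg_fps(80, 66)    # linked result above

print(f"Run A averages {run_a:.1f} FPS, Run B averages {run_b:.1f} FPS")
print(f"Run A delivers {run_a / run_b:.2f}x the frame rate of Run B")
```

Comparing the FPS averages directly (110.5 vs 73.0 here) answers the gamer's question "how much faster will my games actually run?", which the combined score obscures.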
     
    Mr. Spock, hmscott, USMC578 and 5 others like this.
  2. tgipier

    tgipier Notebook Deity

    Reputations:
    203
    Messages:
    1,603
    Likes Received:
    1,578
    Trophy Points:
    181
    I wonder when someone will start selling "binned" MXM GPUs. They could make a big buck off the benchmarking community, though there is only a small demand. I know I would pay an extra premium to avoid lemon clockers. :D

    @Mr. Fox If each MXM slot can provide 100W and the P870DM has an additional power connector, what's preventing Nvidia from launching a dual-slot, large-PCB MXM card with a GM200 on it, besides being ngreedia?
     
    jaybee83, hmscott and Mr. Fox like this.
  3. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,651
    Trophy Points:
    931
    The problem with the idea of binning MXM cards is what to do with the ton of trash left over, and who absorbs the loss on the leftover junk that nobody wants. Somebody has to buy the cards in order to bin them, and if they lose money in the process there is no reason to do it. But, yes, definitely, I would be willing to pay extra for the best.

    The danger here is assuming they don't already have that ready to go once they are done milking their customers with the current product line. NGREEDIA does everything deliberately and releases greatness in tiny bite-sized chunks that make them a ton of money off of customers driven by owning the best. Rest assured that they have a basis rooted in profit for not showing all their cards at once. They always save the best for later and retroactively cripple what they sold yesterday using cancer drivers.
     
    Mr. Spock and tgipier like this.
  4. Firebat246

    Firebat246 Notebook Deity

    Reputations:
    50
    Messages:
    764
    Likes Received:
    510
    Trophy Points:
    106
    Thank you for the extra clarification, Fox... this pretty much reassures me that I made the right choice. I would love to OC the crap out of the single 980 card. I have two PSUs coming anyway, so that part doesn't even bother me. At the end of the day, if I can run any game maxed out at 45-60+ FPS, I think I will be happy. Hopefully that's not an unreasonable thing to ask for... especially with what I'm spending!
     
    Mr. Fox likes this.
  5. marios50

    marios50 Notebook Evangelist

    Reputations:
    62
    Messages:
    456
    Likes Received:
    337
    Trophy Points:
    76
    @Mr. Fox If you had to choose one at this point, and your only desire was managing a decent 4k experience and going for raw gaming power, would you go for the 980M's?
     
    Mr. Fox likes this.
  6. pathfindercod

    pathfindercod Notebook Virtuoso

    Reputations:
    1,940
    Messages:
    2,343
    Likes Received:
    2,345
    Trophy Points:
    181
    4K resolution will tax even a desktop system with SLI 980 Tis or Titan Xs, especially in an AAA title. So I would definitely go SLI if you plan to play at 4K.
     
    marios50 likes this.
  7. cibass

    cibass Notebook Enthusiast

    Reputations:
    0
    Messages:
    17
    Likes Received:
    7
    Trophy Points:
    6
    My setup was getting just over 4000 (maybe a bit more) on the Fire Strike Ultra benchmark, and the results page showed what they considered a rough 4K gaming PC (desktop), which was SLI 980s at about 5000. A logical result, since the 980M is seen as roughly 80% the power of a 980. Some of the guys on this forum, I think Fox, were getting above that 5000 score by overclocking. I think you'd have to go 980M SLI if you have high-res ambitions. Sorry I can't be more accurate with the scores; I don't have them to hand.
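The rough arithmetic behind that post can be checked in one line. This is only a hypothetical sanity check using the approximate scores quoted above (4000 and 5000), not official 3DMark data.

```python
# Approximate Fire Strike Ultra scores quoted in the post above.
mobile_sli_score = 4000    # SLI GTX 980M (the poster's setup)
desktop_sli_score = 5000   # SLI GTX 980 desktop reference

# Ratio of the two, to compare against the "980M is ~80% of a 980" rule of thumb.
ratio = mobile_sli_score / desktop_sli_score
print(f"980M SLI scores about {ratio:.0%} of 980 SLI")
```

The ratio comes out to 80%, consistent with the rule of thumb mentioned in the post, at least at the level of these rough numbers.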
     
    hmscott and marios50 like this.
  8. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,651
    Trophy Points:
    931
    Probably so, since playing games at such a high resolution with good quality settings is going to choke most systems. You'll need every ounce of GPU muscle you can scrape together to avoid a slide show.
     
    hmscott and marios50 like this.
  9. marios50

    marios50 Notebook Evangelist

    Reputations:
    62
    Messages:
    456
    Likes Received:
    337
    Trophy Points:
    76
    Hmm, it seems SLI might be what I'll be going with. I have never overclocked at all, so naturally I am afraid to do anything over the top. Will I be able to do some light OC'ing on the 980s without pushing things too far? I have no idea about voltages and such; I just want the best gaming experience I can get without risk.
     
    Mr. Fox likes this.
  10. Mr. Fox

    Mr. Fox BGA Filth-Hating Elitist

    Reputations:
    37,235
    Messages:
    39,339
    Likes Received:
    70,651
    Trophy Points:
    931
    Yeah, just overclock them as far as you can on stock voltage and you should be fine. You can't go that high without extra voltage anyway, and even if you could, some games will pitch a huge fit about overclocked GPUs, even when they're not overclocked very far.
     
    marios50 likes this.