Is it true that playing on integrated/discrete graphics is bad for the computer/processor?

Discussion in 'Hardware Components and Aftermarket Upgrades' started by richie1989, Mar 13, 2012.

Thread Status:
Not open for further replies.
  1. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,424
    Messages:
    58,141
    Likes Received:
    17,824
    Trophy Points:
    931
    If you don't game on it, the transistors will never be used and will go to waste. The likelihood is that the CPU transistors will fail before the GPU ones.
     
  2. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
  3. Bog

    Bog Losing it...

    Reputations:
    4,018
    Messages:
    6,046
    Likes Received:
    7
    Trophy Points:
    206
    Yes and no.

    I'm surprised that nobody here has mentioned the cooling system, which is critical to keeping any component from overheating and shutting down the computer.

    Laptop manufacturers look to cut costs any way they can, and one way is to use aluminum (cheaper) heat pipes and sinks instead of pure copper (more expensive), to use less material, and to fit cheaper and fewer fans.

    So if you have a gaming laptop with high-performance parts (which is advertised) and a low-quality cooling system (which is never advertised), then the answer to your question is yes; the components will face a much shorter lifespan under constant high-stress usage, and they will often fail prematurely.

    On the other hand, manufacturers who focus specifically on making gaming laptops generally do well in this area by including robust cooling systems: Sager, Alienware, and MSI are examples of such brands.

    EDIT: I read your post too quickly and misunderstood your question. No, a more powerful GPU will not place greater stress on the CPU unless (again) the cooling system is pushed beyond its limits when the system is stressed.
     
  4. WARDOZER9

    WARDOZER9 Notebook Consultant

    Reputations:
    35
    Messages:
    282
    Likes Received:
    8
    Trophy Points:
    31
    Short answer: yes and no.

    Long answer:

    If the GPU/CPU repeatedly comes within 5-10 °C (sometimes not even that close) of TJmax for extended periods, then even though the part may never reach its critical temperature, over time it will fail prematurely. The failure often stems not from the die itself so much as from a weakening of the solder, which is a real problem with the newer RoHS lead-free solders. Over time the solder becomes brittle, much the way a piece of baking clay will if you heat it and let it cool repeatedly; eventually it becomes brittle and prone to cracking/breaking. There is also the threat of permanent failure of capacitors and resistors that run too close to their maximum operating temperature for too long.
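
    A rough back-of-the-envelope sketch of that cycling effect (Python, using a Coffin-Manson-style fatigue relation with made-up constants, since the real values depend on the specific solder alloy):

        # Illustrative only: Coffin-Manson-style model showing the trend
        # described above -- larger heat/cool swings mean far fewer cycles
        # before a solder joint cracks. Constants here are hypothetical.

        def cycles_to_failure(delta_t_c, c=1.0e7, exponent=2.0):
            """Estimated thermal cycles before solder fatigue failure.

            delta_t_c:   temperature swing per cycle, in deg C
            c, exponent: hypothetical, alloy-dependent fatigue constants
            """
            return c / (delta_t_c ** exponent)

        for swing in (20, 40, 60, 80):  # deg C swing from idle to load
            print(f"{swing} C swing -> ~{cycles_to_failure(swing):,.0f} cycles")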

    On the other side, if the parts are kept cool, for example through reapplication of quality TIMs (Thermal Interface Materials), then you reduce the likelihood of premature failure due to heat. The use of shims in many instances also helps eliminate the low clamp pressure caused by poor heatsink design from some manufacturers. Think of clamp pressure like this: if you have the world's best heatsink held onto a CPU with a rubber band, the heatsink is not being forced close enough to the CPU to be effective, regardless of the TIM you use. If, however, you put a large copper shim between the heatsink and the CPU, the rubber band gets tighter and decreases the amount of space between the CPU and heatsink, causing the TIM to be spread out more evenly and thus become more effective.

    The best way to see whether you need a shim due to improper clamp pressure is to apply fresh TIM between the heatsink and the chip to be cooled, tighten the heatsink all the way down, and then remove the heatsink. If the TIM is all globbed (technical term, I know) on the chip and the heatsink, and you cannot see anywhere that it has been spread so thin that you can nearly see the heatsink or chip underneath, then you probably need a shim. If, however, you remove the heatsink and can see toward the center of the TIM that it has been squished (another tech term) to the point that you can nearly see the heatsink or chip through it, then you have good clamp pressure and just need to remove the old TIM, reapply, reassemble, and be done. The sketch below shows why that thin, even spread matters.
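
    Another quick sketch (Python, with an assumed paste conductivity and die size, since none were given): the TIM layer's conductive resistance is R = L / (k * A), so squeezing the bond line thinner directly cuts the temperature penalty.

        # Illustrative only: thermal resistance of a flat TIM layer via
        # Fourier's law, R = L / (k * A). Thinner bond line -> lower
        # resistance, which is what good clamp pressure (or a shim) buys you.

        K_TIM = 5.0        # W/(m*K), decent paste (assumed value)
        DIE_AREA = 1.5e-4  # 15 mm x 10 mm die, in m^2 (assumed value)

        def tim_resistance(thickness_m, k=K_TIM, area=DIE_AREA):
            """Thermal resistance of the TIM layer in K/W."""
            return thickness_m / (k * area)

        for microns in (25, 50, 100, 200):  # bond-line thickness
            r = tim_resistance(microns * 1e-6)
            print(f"{microns} um TIM -> {r:.3f} K/W "
                  f"({r * 50:.1f} C extra at 50 W)")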


    The layman's answer always comes down to this, whether it be a laptop, desktop, phone, or any other electronic device: as long as the parts stay cool, it's safe. If, however, it gets too hot, too often, for too long, then it will most certainly fail prematurely.
     
  5. miro_gt

    miro_gt Notebook Deity

    Reputations:
    433
    Messages:
    1,748
    Likes Received:
    4
    Trophy Points:
    56
    my G86: OCed that thing by over 50%, and for 3.5 years and counting I've beaten on it hard with gaming ... no problems :D


    That is not true; heat pipes transfer heat much faster than solid copper does and are harder to manufacture, so the overall cost of a heat pipe can easily surpass that of a copper rod.

    ----

    to the OP's question: chips usually last a really long time, unless there's a defect of some sort and/or the chip is not used the way it's supposed to be. And even then, some just keep working :)

    play your games and be happy
     
  6. Bog

    Bog Losing it...

    Reputations:
    4,018
    Messages:
    6,046
    Likes Received:
    7
    Trophy Points:
    206
    I merely stated that heatsink assemblies using aluminium are commonly chosen to cut costs instead of copper, even though the latter metal offers superior heat transfer.

    As far as I know, what I said is not incorrect at all. If you are referring to metal pipes being superior to rods then you'd be correct, but I wasn't talking about that.
     
  7. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,076
    Trophy Points:
    581
    Yes, aluminum fins are often used to reduce cost; however, if they are sized properly, they will get the job done just fine. Both my Asus laptops have copper (or some copper alloy) heat pipes with aluminum fins, and they run quite cool.

    Anyways, we're going off-topic here. The OP got his answer: as long as the cooling is adequate, there is absolutely no harm in playing games on the Intel integrated graphics.
     
  8. Meaker@Sager

    Meaker@Sager Company Representative

    Reputations:
    9,424
    Messages:
    58,141
    Likes Received:
    17,824
    Trophy Points:
    931
    Aluminium is lighter and gives out heat better than copper (though absorbs it slower) so I can see them being a good combination with heat pipes.
     
  9. TheBluePill

    TheBluePill Notebook Nobel Laureate

    Reputations:
    636
    Messages:
    889
    Likes Received:
    0
    Trophy Points:
    30
    Magnesium has about 2/3rds the conductivity of Aluminum. I wonder how it handles heat dissipation and absorption in comparison?
     
  10. tijo

    tijo Sacred Blame

    Reputations:
    7,588
    Messages:
    10,023
    Likes Received:
    1,076
    Trophy Points:
    581
    Thermal conductivity of copper at near-ambient temps: ~380 W/(m*K)
    Thermal conductivity of aluminum at near-ambient temps: ~240 W/(m*K)

    Convection coefficients are independent of the metal used.

    I don't see where you got the info that aluminum gives out heat better than copper; copper is a much better heat conductor than aluminum, and that's what matters. The choice of aluminum is purely cost- and/or weight-driven, though it gets the job done.
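
    To put those numbers in perspective, a quick sketch (Python, with a hypothetical heatsink-base geometry and load, since none were given) of Fourier's law, delta_T = Q * L / (k * A):

        # Temperature drop across a heatsink base for the two conductivities
        # quoted above. Geometry and wattage are assumed for illustration.

        Q = 45.0      # watts conducted through the base (assumed CPU load)
        L = 0.003     # 3 mm thick base (assumed)
        A = 9.0e-4    # 30 mm x 30 mm contact area, in m^2 (assumed)

        for metal, k in (("copper", 380.0), ("aluminum", 240.0)):
            delta_t = Q * L / (k * A)
            print(f"{metal}: k = {k:.0f} W/(m*K) -> "
                  f"{delta_t:.2f} C drop across the base")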
     