Hello, I have just gotten the PowerColor 5700 XT Red Devil from my son's PC (I gave him my 3070, as he plays more FPS games than me). He has always complained that it gets too hot and/or noisy. Both his case and my PC case have 6 x 120mm case fans.

I "tested" with Far Cry 5 New Dawn on a 4K monitor:

1. 220W: 65 FPS
2. 165W: 60 FPS
3. 110W: 53 FPS

These numbers are not constant everywhere, but they are a good indication of overall performance. At 110W the card is silent; at 220W it is loud. This card's fans are very noisy, and obviously expelling 220W of heat is a much bigger ask than expelling 110W.

Anyway, what I cannot understand is how we get a 100% increase in power draw (from 110W to 220W) with relatively little FPS gain (from 53 FPS to 65 FPS). Could someone please explain this? Do all GPUs have the same steeply diminishing returns regarding power vs FPS? Is this more of an AMD problem? And how about the new GPUs: are the Nvidia cards better than or the same as AMD in this respect?
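For what it's worth, the diminishing returns are easy to see if you work out FPS per watt from my three data points. A quick Python sketch (just arithmetic on the numbers above, nothing GPU-specific):

```python
# My Far Cry measurements: power limit (W) -> average FPS
results = {220: 65, 165: 60, 110: 53}

# Efficiency drops as the power limit rises
for watts, fps in sorted(results.items()):
    print(f"{watts}W: {fps} FPS -> {fps / watts:.2f} FPS per watt")
# 110W: 53 FPS -> 0.48 FPS per watt
# 165W: 60 FPS -> 0.36 FPS per watt
# 220W: 65 FPS -> 0.30 FPS per watt
```

So by this measure the card is roughly 60% more efficient at 110W than at 220W, which is why the last few FPS cost so much power and noise.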