*OFFICIAL* Alienware "Graphics Amplifier" Owner's Lounge and Benchmark Thread (All 13, 15 and 17)

Discussion in '2015+ Alienware 13 / 15 / 17' started by Mr. Fox, Dec 10, 2014.

  1. Juan Phoenix

    Juan Phoenix Notebook Enthusiast

    Reputations:
    7
    Messages:
    28
    Likes Received:
    13
    Trophy Points:
    6
    As I commented in the post above, I have the Alienware 13 R3 (i7-7700HQ, 2 TB SSD, 32 GB DDR4-2667) with the AGA, first with a 2080 SUPER and currently with a 2080 Ti, and I have never had a CPU bottleneck. I haven't tested Cyberpunk yet, but I always set everything to ultra at 1440p, and in all games the GPU usually sits around 90% while the CPU stays between 45 and 70% at most, usually between 55 and 65%.

    Obviously an 8C/16T CPU is ideal, and a newer CPU might get you about 10-12 more FPS, but I don't expect any real bottleneck until the 3080, or at least the 3090.
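    A rough way to read utilization numbers like these: if the GPU sits near its cap while the CPU has headroom, the GPU is the limiter, and vice versa. A minimal sketch of that heuristic (the function name and the 90% threshold are my own assumptions, not from any monitoring tool):

    ```python
    def limiting_factor(gpu_util, cpu_util, threshold=0.90):
        """Crude classifier: whichever unit is pinned near 100% is the limiter.

        gpu_util: overall GPU utilization, 0.0-1.0
        cpu_util: overall CPU utilization, 0.0-1.0
        threshold: utilization treated as "saturated" (assumed 90%)
        """
        gpu_bound = gpu_util >= threshold
        cpu_bound = cpu_util >= threshold
        if gpu_bound and not cpu_bound:
            return "GPU-bound"
        if cpu_bound and not gpu_bound:
            return "CPU-bound"
        if gpu_bound and cpu_bound:
            return "both saturated"
        return "neither saturated (other limit, e.g. frame cap or I/O)"

    # The figures quoted above: GPU ~90%, CPU 55-65%
    print(limiting_factor(0.90, 0.60))  # GPU-bound
    ```

    Note this only looks at overall averages; it misses the single-core saturation case, which is exactly the subtlety argued about later in the thread.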
     
    Suxel and etern4l like this.
  2. Mike 06

    Mike 06 Notebook Enthusiast

    Reputations:
    7
    Messages:
    18
    Likes Received:
    12
    Trophy Points:
    6
    Start by installing MSI Afterburner with RivaTuner and launch 3DMark, with RivaTuner displaying the FPS overlay.
    You will see that your GPU score is near the top while the CPU part is low.
    My i7-6820HK is about 1% more powerful than your i7-7700HQ, so I know what I'm talking about.
    I am also all-SSD with 32 GB of RAM, and when a game demands a lot from the CPU, I drop FPS.
    My GPU is rarely at 100% except with ray tracing, and I hold 60 FPS at Ultra on all recent titles.
    Except when the CPU is the limit, as in Cyberpunk or certain open-world passages of Star Wars Jedi: Fallen Order or Shadow of the Tomb Raider, for example.
    The games stay smooth but drop to 45-55 FPS, and I can see it in the monitoring: the CPU is at 100% on some cores.
    Maybe you haven't noticed, or you aren't sensitive to small stutters, or you haven't gotten far into the big AAA games, but I guarantee you that your configuration is CPU-limited.
    I am CPU-limited myself with an i7-6820HK and an RTX 2070S OC.
    Launch RivaTuner with the FPS overlay, start Cyberpunk on Ultra, play a bit, and once in town grab a car. Drive around and watch your CPU and FPS; you'll see what CPU-limited means.
    It's nothing serious ;).
    You just have to accept that the 4C/8T laptop i7 is at the end of its life :)
    https://pc-builds.com/calculator/Core_i7-7700HQ/GeForce_RTX_2080_Ti/0KS12nlv/
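    The "100% on some cores" observation matters because the overall CPU percentage that task managers report is an average across all logical cores: a game pinned on one render thread can show a modest total on a 4C/8T chip while that one core is the bottleneck. A hypothetical sketch (sample values invented) of why the average hides it:

    ```python
    def overall_utilization(per_core):
        """Overall CPU % as task managers report it: the mean across logical cores."""
        return sum(per_core) / len(per_core)

    def has_saturated_core(per_core, threshold=95):
        """True if any single core is pinned, even when the average looks low."""
        return any(u >= threshold for u in per_core)

    # 8 logical cores (4C/8T): one render thread pinned, the rest lightly loaded
    samples = [100, 40, 35, 30, 25, 20, 20, 18]
    print(overall_utilization(samples))  # 36.0 -- looks like plenty of headroom
    print(has_saturated_core(samples))   # True -- but one core is the bottleneck
    ```

    This is why "my CPU is only at 60%" and "I am CPU-limited" can both be true at the same time.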
     
    Last edited: Feb 12, 2021
  3. Juan Phoenix

    Juan Phoenix Notebook Enthusiast

    Reputations:
    7
    Messages:
    28
    Likes Received:
    13
    Trophy Points:
    6
    I think there are still about two years left to play decently on 4C/8T.

    As I said, no game has used 100% of my CPU, partly because I play at 1440p: the higher the resolution, the less the CPU bottlenecks and the more load shifts to the GPU, especially at 4K. And my FPS never drop below 65, even with ray tracing. Cyberpunk plays in the major leagues, though; I haven't tested it yet, but I know I won't hold 60 FPS with everything on ultra at 1440p.

    But I still think 4C/8T has no bottleneck problem with a 3070 or below.

    As long as you aren't using 100% of the CPU, there is no bottleneck. It's true that newer CPUs give more FPS, but that isn't a "bottleneck" being removed; it's simply more cores, more GHz, and newer technology.
     
    Last edited: Feb 12, 2021
  4. FXi

    FXi Notebook Deity

    Reputations:
    345
    Messages:
    1,053
    Likes Received:
    127
    Trophy Points:
    81
    I wonder (thinking of new connectors) if some maker might figure out how to chain two TB4 ports into a single effective x8 connection. Now that Maple Ridge is out, there are many designs with dual TB ports. Probably not technically easy, but once you've reached x8, in the current world you've got very little bottlenecking left anyway.
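    Back-of-the-envelope on why x8 would help: the arithmetic below uses the 8 GT/s per-lane rate and 128b/130b encoding of PCIe 3.0 to compare an x4-class link (roughly what one Thunderbolt port can tunnel, before protocol overhead, which in practice reduces it further) against a hypothetical bonded x8:

    ```python
    # PCIe 3.0: 8 GT/s per lane, 128b/130b encoding
    GT_PER_S = 8.0
    ENCODING = 128 / 130

    def pcie3_bandwidth_gbps(lanes):
        """Theoretical one-direction payload bandwidth in GB/s for PCIe 3.0."""
        return lanes * GT_PER_S * ENCODING / 8  # /8: bits -> bytes

    x4 = pcie3_bandwidth_gbps(4)  # ~3.94 GB/s: single-link class
    x8 = pcie3_bandwidth_gbps(8)  # ~7.88 GB/s: what a dual-link bond could approach
    print(round(x4, 2), round(x8, 2))
    ```

    Real Thunderbolt carries less than full x4 because display and protocol traffic share the link, so these are ceilings, not measurements.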
     
    etern4l likes this.
  5. Papusan

    Papusan TURDBOOKs Sucks! Dont waste your $$$ on FILTHY

    Reputations:
    35,607
    Messages:
    27,771
    Likes Received:
    52,682
    Trophy Points:
    931
    The new and shiny 4C/8T @ 5.0 GHz is a bottleneck even for the castrated 85 W RTX 3070 laptop GPU. And this 4C/8T has a lot higher IPC and clock speed than the old-gen mobile chips with the same number of cores/threads.

    http://forum.notebookreview.com/thr...cale-on-laptops.834039/page-104#post-11077763

    Tiger Lake-H35 is too weak for RTX 3070
     
    Last edited: Feb 12, 2021
  6. Juan Phoenix

    Juan Phoenix Notebook Enthusiast

    Reputations:
    7
    Messages:
    28
    Likes Received:
    13
    Trophy Points:
    6
    That's in benchmarks and synthetic tests; in practice (in games) you barely lose a few FPS, depending on the title, resolution, etc. On average it's about 8-10 FPS. I have a 2080 Ti and I have no bottleneck, and in performance the 2080 Ti and the 3070 are practically identical. (I know I would gain more FPS with an 8C/16T CPU, but as I said, that isn't a "bottleneck"; newer just means more cores, more GHz, better technology, etc.)
     
    Last edited: Feb 13, 2021
    etern4l likes this.
  7. Mike 06

    Mike 06 Notebook Enthusiast

    Reputations:
    7
    Messages:
    18
    Likes Received:
    12
    Trophy Points:
    6

    I think, without judging, that he doesn't play AAA Ultra games such as Assassin's Creed Valhalla, Watch Dogs: Legion, Cyberpunk, etc., which are CPU-intensive. I think if he tested those games in FHD or 4K he would understand that a game's physics depends on the CPU! Even if he put an RTX 3090 on his i7-7700HQ laptop, he would see a big drop in FPS on arriving in an area with a lot of NPCs to manage. Same thing on arriving in a medieval village with a lot of activity to handle, or driving through a crowded megalopolis processed by the CPU. Once again, I know what I'm talking about. On my i7-6820HK OC'd to 4 GHz ( ;) ), when I am CPU-limited in the cases listed above, whether at 720p or 4K, the game stutters (45-55 FPS). I have one or two cores at 100%, so the FPS drops despite an RTX 2070S at 70-80%. 4C/8T CPUs are no longer a viable long-term pairing with a big GPU; there will always be FPS drops in open areas or with a lot of physics on screen. I really like my 17 R3 combo with the RTX 2070S, but I feel the CPU limit more and more in FHD, and at 4K on the TV. I wanted to upgrade this year, but prices are crazy because of Covid. I hope that in 2022 we will have stock of the Nvidia 4-series, and desktop CPUs with 8C/16T or 16C/32T at 5 GHz.
     

    Last edited: Feb 13, 2021
  8. Papusan

    Papusan TURDBOOKs Sucks! Dont waste your $$$ on FILTHY

    Reputations:
    35,607
    Messages:
    27,771
    Likes Received:
    52,682
    Trophy Points:
    931
    I know the 7700HQ is a bottleneck, because even an overclocked 4-core 3770K can't keep up with a 2080 Ti.
    https://www.3dmark.com/fs/22427749

    https://hwbot.org/submission/4365180_papusan_cinebench___r15_core_i7_3770k_922_cb

    https://hwbot.org/submission/4090770_krzyslaw_cinebench___r15_core_i7_7700hq_801_cb

    https://hwbot.org/benchmark/cineben...Id=processor_5361&cores=4#start=0#interval=20
     
    Last edited: Feb 13, 2021
  9. Juan Phoenix

    Juan Phoenix Notebook Enthusiast

    Reputations:
    7
    Messages:
    28
    Likes Received:
    13
    Trophy Points:
    6
  10. Mike 06

    Mike 06 Notebook Enthusiast

    Reputations:
    7
    Messages:
    18
    Likes Received:
    12
    Trophy Points:
    6
    [QUOTE="Juan Phoenix, post: 11078253, member: 737230"]The performance loss at 1440p is 10% and at 4K it is 5%; only at 1080p can there be a considerable loss...

    Test FPS in games! Not in benchmarks and synthetic tests.

    The source:

    https://www.techpowerup.com/review/intel-core-i9-10900k/21.html[/QUOTE]

    Be specific, guys!
    High resolution takes the strain off the processor, okay.
    Why? Because the GPU works hard computing the high-definition (4K) textures.
    But the game mechanics and physics (NPCs, visual depth, shadows, object movement, anything that moves...) use the CPU!
    In the long run, only a "good" processor will keep up with a good GPU 100% of the time.
    Otherwise it looks like DIY...
    Obviously, you will see FPS drops in the big AAA Ultra games!
    In short: today, for 60 FPS non-stop in FHD/4K without drops, an 8C/16T CPU minimum :)
    Tomorrow, and for the next 3-5 years, 16C/32T with an RTX 3090 or RTX 4...
     
    Last edited: Feb 13, 2021