Acer Predator Helios 500 (Vega 56 + Ryzen 2)

Discussion in 'Acer' started by ThatOldGuy, Jun 3, 2018.

  1. TheReciever

    TheReciever D! For Dragon!

    Reputations:
    281
    Messages:
    1,893
    Likes Received:
    1,020
    Trophy Points:
    181
    The video is talking about using an eGPU as the primary means of handling 3D applications; PhysX alone won't come close to hogging that much bandwidth.
     
    hmscott likes this.
  2. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    4,966
    Messages:
    17,551
    Likes Received:
    21,547
    Trophy Points:
    931
    The eGPU video explaining their results and experiences using eGPUs wasn't just for you either; others have expressed interest as well, and it's of general interest. I'll remove your quote from my original post, since you aren't interested.
    This says to me you want an external GPU box to support a 2nd Nvidia GPU.

    Have you ever set this up in a desktop? A 2nd GPU for PhysX off-load?

    I have, a few times - years ago - and it's really not worth the effort. You need a 2nd card at least as good as your primary GPU to match PhysX performance, otherwise you actually get worse FPS. And the PhysX load isn't enough to utilize more than a fraction of a 2nd GPU, so the off-load from the primary GPU isn't enough to make it worthwhile.

    I also tried splitting SLI into 2 cards, one dedicated to PhysX, which was briefly better, but in the long run it wasn't worth it once the game supported SLI: it's better to let the driver balance the load across the SLI'd cards and get the benefits of SLI as well as PhysX. For games that didn't support SLI there wasn't much benefit either.

    The worst part is that few games really benefit from PhysX; the Batman games come to mind as an exception. Once those games started supporting SLI sometime after release, it was better to enable SLI.

    If you were talking about CUDA work, then a 2nd GPU or more would be worthwhile if you coded for multiple discrete GPUs. But then again, the workload would need to not depend on high bandwidth over the bus (PCIe/TB3) or low latency.
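    Just to illustrate the pattern I mean - a rough host-side sketch against the CUDA runtime API (the actual kernel work is elided, and the chunk size is a placeholder), splitting one workload across whatever discrete GPUs are present. Take it as a sketch, not a recipe:

    Code:
    #include <cuda_runtime.h>
    #include <cstdio>
    #include <vector>

    int main()
    {
        int deviceCount = 0;
        cudaGetDeviceCount(&deviceCount);   // e.g. internal dGPU + eGPU = 2

        const size_t chunkBytes = 64 << 20; // 64 MB slice of work per GPU
        std::vector<float> host(chunkBytes / sizeof(float), 1.0f);

        for (int dev = 0; dev < deviceCount; ++dev) {
            cudaSetDevice(dev);             // subsequent calls target this GPU
            float* d = nullptr;
            cudaMalloc(&d, chunkBytes);
            // This copy is the part that hurts over TB3 (roughly a quarter
            // of PCIe x16), so you keep data resident and copy sparingly.
            cudaMemcpy(d, host.data(), chunkBytes, cudaMemcpyHostToDevice);
            // ... launch kernels on this device's chunk here ...
            cudaFree(d);
        }
        printf("dispatched work to %d GPU(s)\n", deviceCount);
        return 0;
    }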

    PhysX Cards - 10 years later do they still suck??
    Linus Tech Tips
    Published on Aug 28, 2016
    Dedicated PhysX cards have been a thing for almost a decade now. But do they make any more sense now than they did when they launched?


    Dedicated PhysX Card Experiment - How Powerful Does it Have to Be?
    Linus Tech Tips
    Published on Apr 4, 2011
    NVIDIA PhysX is a neat technology that exists to introduce a new element to PC gaming. Right now it's mostly special effects, but hopefully we'll see some games over the years that really use physics technologies to introduce new styles of gameplay.



    Interestingly, the PhysX test in the Futuremark benchmarks is run on the CPU, as a CPU benchmark. With an 8-core Ryzen CPU, it might be worth testing running PhysX on the CPU.

    PhysX in 2018: GPU vs CPU
    Two Tech Tutors
    Published on Apr 7, 2018
    Can a CPU deliver better PhysX performance than a GPU?


    2017: Nvidia PhysX on CPU or GPU? 1080 vs 1800x
    Two Tech Tutors
    Published on May 31, 2017
    Nvidia Control Panel has a setting to move PhysX processing to the CPU, but should you make the change? It's Ryzen R7 1800X versus GTX 1080 time!
    In order, the benchmarks used OpenGL and DirectX 11 & 12, with the last two both using Unreal Engine.
     
    Last edited: Jun 26, 2018
    undervolter0x0309 likes this.
  3. TheReciever

    TheReciever D! For Dragon!

    Reputations:
    281
    Messages:
    1,893
    Likes Received:
    1,020
    Trophy Points:
    181
    I'm interested in this machine for getting into game dev, and I want to learn how to incorporate PhysX. I would prefer not to get a desktop for this, as this laptop checks all the boxes save for just one thing, which can be resolved with an eGPU.

    No game to date has had fluid water as a local asset, and it's been bothering me since Mass Effect 2 (despite it being a great game).

    In your second quote of me I stated what I wanted the eGPU for, which is PhysX, and that alone won't need much bandwidth. I can understand why you would think it's useless, as it's limited in titles, and which GPU would suffice to complement the primary GPU is hardly a hard science either.

    Until we have PhysX-like water in game engines, PhysX is what I have to look at for that particular asset. Which is sad, as that tech was demoed 5 years ago.
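    To be concrete about what "incorporating PhysX" means to me, here's a minimal sketch assuming the PhysX 3.x SDK (the last branch with the SPH particle fluid API) - the parameter values are placeholders, not tuned:

    Code:
    #include <PxPhysicsAPI.h>

    using namespace physx;

    static PxDefaultAllocator gAllocator;
    static PxDefaultErrorCallback gErrorCallback;

    int main()
    {
        // Foundation + physics objects are the entry point to the whole SDK.
        PxFoundation* foundation =
            PxCreateFoundation(PX_FOUNDATION_VERSION, gAllocator, gErrorCallback);
        PxPhysics* physics =
            PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

        // PhysX 3.x exposes SPH fluids through PxParticleFluid.
        PxParticleFluid* fluid = physics->createParticleFluid(10000); // max particles
        fluid->setStiffness(50.0f); // gas constant: higher = less compressible
        fluid->setViscosity(5.0f);  // resistance to flow
        fluid->setParticleBaseFlag(PxParticleBaseFlag::eGPU, true); // simulate on GPU

        // ... add the fluid to a PxScene, feed it particles, simulate() each frame ...

        fluid->release();
        physics->release();
        foundation->release();
        return 0;
    }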
     
    hmscott likes this.
  4. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    4,966
    Messages:
    17,551
    Likes Received:
    21,547
    Trophy Points:
    931
    It's an interesting idea, but quite the investment in eGPU hardware plus external monitor, keyboard, mouse, etc. As stated in that Hardware Unboxed video, it's better to invest the same $ in a standalone desktop - which can run without the laptop - than to sink a large investment into add-on hardware that's useless without the laptop.

    In the past I've suggested buying a NUC or other compact desktop PC with TB3 to round out the eGPU components into a full standalone desktop PC that's useful without the laptop. That way you have 2 systems available - usually a good idea, since laptops / desktops do fail from time to time, so you'd have a backup system.

    Given that people have discovered TB3 isn't fast enough - and it will likely be even worse for the next generation of GPUs - it's not a good time to invest in TB3 solutions. Hopefully not too far down the road you can get a faster solution; of course that will likely require a new laptop with the new IO connector too. :)

    Also, why not develop using a technology that can run on both AMD and Nvidia? Why buy an AMD laptop - supporting AMD - when your goal is to develop software that only supports Nvidia? It's just weird. ;)

    In your situation maybe it's best to get the Helios 500 with Nvidia GPU?

    Acer Predator Helios 500 (Intel + nVidia)
    http://forum.notebookreview.com/threads/acer-predator-helios-500-intel-nvidia.818249/
     
    Last edited: Jun 26, 2018
  5. TheReciever

    TheReciever D! For Dragon!

    Reputations:
    281
    Messages:
    1,893
    Likes Received:
    1,020
    Trophy Points:
    181
    I agree TB3 isn't enough bandwidth, but it also works well for NVMe, and with only a PhysX workload it shouldn't need a monitor attached, which saves bandwidth.

    Also, I am never at a static location for long; if I were, I would still have the SFF desktop I had a couple of years back. On dimensions alone the PC would've been OK, but it was overweight and would've cost me more than it was worth to transport. Airlines almost never check the weight of laptops, though - I was able to bring 4 laptops in 1 bag with no issues, but a desktop brings immediate problems in transport.

    I want to support AMD, and I also want to try using the Vulkan API and, if possible, incorporate PhysX for water. If I find something else that can do it by then, I'll be happy to use that technology; I just haven't seen it yet.
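    To give an idea of why Vulkan appeals to me here, a minimal sketch (assuming the LunarG SDK headers; the app name is a placeholder) that just brings up an instance and lists every GPU the loader can see - a Vega 56 and a GeForce eGPU would both show up through the same vendor-neutral API:

    Code:
    #include <vulkan/vulkan.h>
    #include <cstdio>

    int main()
    {
        VkApplicationInfo app{};
        app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
        app.pApplicationName = "water-demo";  // placeholder name
        app.apiVersion = VK_API_VERSION_1_1;

        VkInstanceCreateInfo info{};
        info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
        info.pApplicationInfo = &app;

        VkInstance instance = VK_NULL_HANDLE;
        if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) {
            fprintf(stderr, "no Vulkan driver found\n");
            return 1;
        }

        // Enumerate every Vulkan-capable GPU, AMD and Nvidia alike.
        uint32_t count = 0;
        vkEnumeratePhysicalDevices(instance, &count, nullptr);
        printf("%u Vulkan-capable GPU(s)\n", count);

        vkDestroyInstance(instance, nullptr);
        return 0;
    }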
     
    hmscott likes this.
  6. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    4,966
    Messages:
    17,551
    Likes Received:
    21,547
    Trophy Points:
    931
    An SFF case would be the same size as or smaller than the smallest eGPU I have seen - not the bare-bones PCIe-cable jobbers of course, but the eGPUs with a case / power supply / etc.

    I used to fly with UNIX, Mac, and PC development systems in shipping cases, or better yet shipped them ahead for working remotely - what a PITA.

    You could get a Pelican or Zero case with full foam to do your own cut-outs, or pay for a custom one that fits the dimensions of your SFF / components, and have it fly in cargo while you only worry about boarding with the laptop.

    You might be better served (pun intended) by a nice desktop at home or the office that you connect to over the internet from your remote location - from the laptop, an internet cafe, or a client PC - rather than carrying it all with you, which is what I ended up doing instead of lugging everything around.

    There are still situations where I need the physical hardware, but these days client sites can set up what I need before I get there. Way back when, hardly any client had spare hardware in the range I needed for development work, so it was a requirement to ship my hardware in - or, with enough lead time, to ask the client to order what was needed.

    Glad to hear you want to support AMD as well as cover all the options development-wise; that's the way to do it. :)
     
    Last edited: Jun 26, 2018
  7. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    4,966
    Messages:
    17,551
    Likes Received:
    21,547
    Trophy Points:
    931
    Something else just came to mind: Nvidia is a jerk. ;)

    Well, more specifically, Nvidia disables functions of its own cards if it finds an AMD GPU in the system too. I've seen this come up a number of times, recently as well.

    I'll have to look it up, but I wouldn't be surprised if PhysX / CUDA / AI / Tensor support is one of those things nowadays...

    Edit: yup, according to this post from 2 years ago, Nvidia disables PhysX when it detects an AMD card. Maybe there is a workaround; the thread isn't that old at 2 years, but it's already archived:

    Is what nVidia doing legal? (PhysX)
    https://www.reddit.com/r/Amd/comments/4gcjx1/is_what_nvidia_doing_legal_physx/

    "So I just found out today that nVidia disables PhysX when an AMD card is detected. How is this even allowed? Wouldn't that be like a car maker disabling features you paid for because you bought Brand A tires instead of B?

    I did some testing and PhysX remains enabled when an Intel IGP is used as well. Seems real fishy that they only disable for AMD. How can this even be legal to disable a feature when a certain branded device is detected?

    EDIT: Let me be clear. I'm not saying that AMD can/should run PhysX. I'm saying that nVidia is disabling PhysX on their cards when it detects AMD GPUs. I think it's called Hybrid PhysX."

    SillentStriker 2 years ago
    "Am I the only one seeing that almost every comment here is missing OP's point? There was a point in time previously where you could have an AMD card rendering "the game" and the Nvidia card dedicated to PhysX only within the game, which Nvidia then updated their drivers to no longer allow this to happen because it "might break PhysX" or whatever their excuse was. I'm not sure if its legal but, if it was illegal then we would have already heard something about it."

    Even if this isn't currently the case, Nvidia has pulled things like this before, so you can't really rely on stable AMD GPU + Nvidia GPU operation with coexistent driver compatibility long term. Updates to either the AMD or Nvidia driver can cause problems, as I found years ago - there was a bit of an art to getting them to work through driver updates back then, as I recall.

    Edit: An unofficial solution, and they say Nvidia lifted the lockout in Summer 2016:

    "In the summer of 2016 though, NVIDIA must have had a change of heart and allows the pairing with AMD cards ever since."

    Hybrid PhysX
    http://physxinfo.com/wiki/Hybrid_PhysX

    "Hybrid PhysX is an name of unofficial configurations, where AMD Radeon and NVIDIA GeForce cards are used simultaneously, AMD GPU for graphics, and NVIDIA GPU - for PhysX.

    Hybrid PhysX configurations are not supported and were even chased by NVIDIA (if AMD card is detected in the system, NVIDIA GPU will lock PhysX processing capabilities), thus were available only through special driver hack, known as Hybrid PhysX Mod, developed by a user with nickname "GenL". In the summer of 2016 though, NVIDIA must have had a change of heart and allows the pairing with AMD cards ever since."

    "Important note.
    Hybrid PhysX is not officially supported by NVIDIA and AMD. Use it for your own risk."
     
    Last edited: Jun 26, 2018
  8. TheReciever

    TheReciever D! For Dragon!

    Reputations:
    281
    Messages:
    1,893
    Likes Received:
    1,020
    Trophy Points:
    181
    Yeah, the size comparison is right, but the difference is that if I had to sell it dirt cheap to get rid of it, it would just be an eGPU as opposed to a whole PC - a much easier pill to swallow. I had to trade my SFF desktop for a y510p SLI, which failed months after I got it.

    I still game a lot, but with gaming in the last 3-ish years going full multiplayer, with hardly any engaging storylines, it has me wanting to learn game dev so I can make my own story to play in.

    I hate Nvidia's business practices, and it's why I got the Ranger; I'll likely get the RX 480 MXM card for this machine if it works with eDP, and the Acer will be the step-up upgrade that basically does everything for me: 8-core CPU and Vega 56. I just hope it doesn't have a 330W limit.

    EDIT:

    Yeah, that has been an issue in the past, but people found workarounds, and at some point Nvidia stopped disabling the GPU when an AMD card was present. Have they restarted that practice?
     
    hmscott likes this.
  9. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    4,966
    Messages:
    17,551
    Likes Received:
    21,547
    Trophy Points:
    931
    I followed the trail through and found that Nvidia lifted the lockout in summer 2016 - refresh and re-read my last post. As I said, it's been problematic to run both AMD + Nvidia GPUs in the past, and Nvidia could change their mind and lock out any features they like at any time.

    With Windows 10 updates plus Nvidia and AMD driver updates each lagging at a different pace, that would seem to be a large multiplier for problems.
     
  10. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    4,966
    Messages:
    17,551
    Likes Received:
    21,547
    Trophy Points:
    931
    Yeah, SFF pre-builts are problematic; I've gone through a few before getting a good one, and even then I'm not so sure it was worth it. Better to build a micro-ATX on your own from parts. There are eGPU-sized cases for those builds now.

    The good thing is the SFF or small PC will stay useful for years, whereas the dedicated eGPU will be worthless to most people as soon as the new IO interface - TB4 or whatever - ships.

    Only older laptops with TB3 would want an eGPU locked into TB3, and given the lack of real performance improvement over a 1060 6GB, there isn't much of an advantage for many people - and even fewer are looking for PhysX development hardware. ;)

    But, wth, you only live once, and you could always get a NUC to add to the eGPU to make use of it down the road.
     