Let's figure out how to convert an internal MXM connector to an external PCI-E x16 box

Discussion in 'e-GPU (External Graphics) Discussion' started by toshiki, Aug 9, 2009.

Thread Status:
Not open for further replies.
  1. tobynextdoor

    tobynextdoor Newbie

    Reputations:
    0
    Messages:
    4
    Likes Received:
    1
    Trophy Points:
    5
    Thanks for the help and all of your previous work here. I hope it'll work.
     
  2. Sompom

    Sompom Notebook Enthusiast

    Reputations:
    25
    Messages:
    48
    Likes Received:
    31
    Trophy Points:
    26
    One late thought for you, which you may already know: you may have trouble with a Mac because Apple strictly controls its hardware. Make sure you get an eGPU that is "Apple Certified". (Or maybe I'm totally wrong; I haven't worked with Mac hardware much.)
     
  3. tobynextdoor

    tobynextdoor Newbie

    Reputations:
    0
    Messages:
    4
    Likes Received:
    1
    Trophy Points:
    5
    Well, I guess nobody knows what's gonna happen. There is only one way to find out...
     
    jackie89 likes this.
  4. I Hunt Demons

    I Hunt Demons Notebook Enthusiast

    Reputations:
    5
    Messages:
    10
    Likes Received:
    7
    Trophy Points:
    6
    Sompom is correct. Apple is in fact a very closed platform, and it chooses very specific hardware which is then given proprietary software (I'm not sure if it's custom BIOSes, drivers, or both) so that it works with OS X. I believe that if you want to run an OS other than OS X, you may be able to get away with using any card. If you do choose OS X, you may need to do some extra research. I highly recommend researching the topic so that you don't waste any time or money. One thing that will definitely help you is researching Hackintosh machines. Hackintoshes are computers that aren't from Apple but still run OS X. The Hackintosh community has tutorials, build guides, lists of compatible hardware and software, and other information that will prove helpful.

    I hope that this helps! :)
     
  5. allenpan

    allenpan Notebook Enthusiast

    Reputations:
    0
    Messages:
    41
    Likes Received:
    2
    Trophy Points:
    16
    I want to do it the other way around: use an MXM card in a desktop, or convert a standard PCIe slot to an MXM slot for a smaller build.
     
    ameixoeiro likes this.
  6. ShadowsNight

    ShadowsNight Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    22
    Trophy Points:
    16
    I'm working on a similar idea. I'll be using Mini DisplayPorts instead and running the plugs to where the heat sinks were in my Alienware m18x. The main thing I'm hoping for is to make it so that while the eGPU PCB isn't connected, it will just run the integrated GPU. I'll let you guys know as I get further into the project :)
     
    jackie89 likes this.
  7. Sompom

    Sompom Notebook Enthusiast

    Reputations:
    25
    Messages:
    48
    Likes Received:
    31
    Trophy Points:
    26
    Actually, that brings up an interesting question that I had just never thought to ask before. Do laptops with removable GPUs and Optimus work with the GPU removed? I would assume so, provided the BIOS doesn't stop you...

    Another interesting thought occurred to me while I was chatting with a friend today. Most modern motherboards/processors don't support enough PCIe lanes to do 2x16 for SLI, so they run at 2x8 instead. Also, PCIe lanes are splittable, meaning one could run two x8 connections from a single x16 slot.

    No promises, but winter break is coming soon, so hopefully I'll get motivated enough to put together a new version of the MXM converter and fix the various problems I've identified. Maybe I'll also think about a way to split the PCIe connection in two, but that starts getting into the problem that SLI/Crossfire-capable motherboards need some chip or another to tell the cards they are allowed to do SLI. There are software ways around that, but they are unreliable.
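    If anyone wants to check what link width their card actually negotiated (x16, x8, or x1), here's a rough sketch that just reads it out of sysfs. This is only my own illustration: it's Linux-only, assumes the kernel exposes current_link_width / current_link_speed for the device, and isn't specific to any adapter discussed in this thread.

    ```python
    # Rough sketch (Linux only): list the link width/speed each PCIe device
    # actually negotiated, read straight from sysfs. Not every device
    # exposes these attributes, so missing ones are skipped.
    from pathlib import Path

    def pcie_links():
        for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
            width = dev / "current_link_width"
            speed = dev / "current_link_speed"
            if width.exists() and speed.exists():
                yield dev.name, width.read_text().strip(), speed.read_text().strip()

    if __name__ == "__main__":
        for addr, width, speed in pcie_links():
            print(f"{addr}: x{width} @ {speed}")
    ```

    Run it with the card plugged in; the GPU's entry should show whether the link trained at x16, x8, or something narrower.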
     
    jackie89 likes this.
  8. ShadowsNight

    ShadowsNight Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    22
    Trophy Points:
    16
    Well, from memory the m18x ran both cards at x8. Considering that most motherboards do that, I would expect laptops to just do the same, but it was about a year ago that I checked, and it's an "old" laptop now. It seems to have an issue with frying graphics cards; I've gone through 5 of them with that laptop :/ I think it's just a power issue... because it works with a new card for a while before destroying it XD

    As far as Optimus goes, when my laptop doesn't detect a GPU at power-up it switches to Intel's GPU. I don't think Optimus would work with a desktop card, because I don't know if it can send the data back through the PCIe lanes for the Intel GPU to display :/ Happy to be wrong, though :p
     
  9. Sompom

    Sompom Notebook Enthusiast

    Reputations:
    25
    Messages:
    48
    Likes Received:
    31
    Trophy Points:
    26
    Yikes. That sounds like an expensive laptop to operate!
    As I said, most modern desktop processors do not support more than 16 PCIe lanes, meaning two cards will each run at x8. It would not be surprising if laptop platforms do the same, since laptop processors usually cut corners compared to desktop processors.

    Interestingly, they absolutely can. I can't post a link, because NotebookReview really does not like the other website, but search Google for "eGPU experience [Version 2]". There you can find a list of benchmarks and recommendations, and you'll notice that laptops with an integrated Intel GPU that can do Optimus with an NVidia card have noticeably higher performance than those that can't.
    This is actually the easiest way to get the eGPU working on the laptop's internal screen. For reference, you can see a picture of a laptop/eGPU I set up a while ago: https://scontent-ord1-1.xx.fbcdn.ne...=b3cfe5939ae101007c11618a43f3ef87&oe=56E98ACD (Let me know if that link doesn't work. It was just easiest to link directly from my Facebook upload :p )
    The setup there is to remove the WiFi card and connect the eGPU in its place. That's only an x1 connection, but it still lets the GPU operate at about 80% of its potential performance.
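    To put some very rough numbers on that x1 link (my own back-of-the-envelope, assuming PCIe 2.0 signalling with 8b/10b encoding and ignoring protocol overhead, so treat these as upper bounds, not measured throughput):

    ```python
    # Back-of-the-envelope PCIe bandwidth per direction, after 8b/10b
    # encoding overhead (PCIe 1.x / 2.0). Protocol overhead is ignored,
    # so these are upper bounds rather than measured throughput.
    TRANSFERS_PER_LANE = {"1.x": 2.5e9, "2.0": 5.0e9}  # transfers/s per lane

    def bandwidth_mb_s(gen: str, lanes: int) -> float:
        bits_per_s = TRANSFERS_PER_LANE[gen] * lanes * (8 / 10)  # 8b/10b encoding
        return bits_per_s / 8 / 1e6  # bits -> bytes -> MB/s

    for lanes in (1, 4, 8, 16):
        print(f"PCIe 2.0 x{lanes}: ~{bandwidth_mb_s('2.0', lanes):.0f} MB/s each way")
    ```

    So an x1 PCIe 2.0 link is roughly 500 MB/s each way versus about 8000 MB/s for x16. My guess as to why the hit is often only around 20% rather than 16x is that games keep most of their textures in VRAM and don't saturate the bus every frame.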

    Interestingly, the reverse does not seem possible. I have not seen a laptop with a desktop processor doing Optimus with a laptop GPU, nor have I heard of Optimus being used in a desktop to allow the motherboard's video connectors to be used for high-performance graphics output.
     
  10. ShadowsNight

    ShadowsNight Notebook Enthusiast

    Reputations:
    0
    Messages:
    33
    Likes Received:
    22
    Trophy Points:
    16
    That's really cool. There'd still be a bottleneck with using Optimus, though; as far as I know it's only really using the eGPU for processing, and the output then gets filtered through the iGPU (it has been a while since I've researched this). I know it's not much for laptop cards, but it is still noticeable. I don't know exactly what slows it down, so I can't say whether it would be worse with an eGPU, but the easiest way to prove the difference with and without Optimus is to run a benchmark on just your laptop display and then again over HDMI with the laptop display disabled; you'll get a better score the second time.
    It's cool that that works, though; I didn't think it would. It'd still be nice to have the option of running the GPU display back through the laptop. The only downside would be that you'd need to turn your laptop off and back on again to switch between the iGPU and eGPU, but that little bit extra never hurts :p I suppose I don't need to worry about that until after I get it working :)
    I've just finished fixing up my Inspiron 1720 so it can play Final Fantasy ARR, so this is my next project.
    Merry Christmas, people :D
     