
DIY eGPU experiences

Discussion in 'e-GPU (External Graphics) Discussion' started by master blaster, Sep 18, 2009.

  1. bjorm

    bjorm Notebook Consultant

    Reputations:
    39
    Messages:
    110
    Likes Received:
    0
    Trophy Points:
    30
    To make it clear: I keep 80% of the GPU's performance through the eGPU, not lose 80%, right? ;)

    And one more question: what would give more of a boost to eGPU performance, a new QM processor or a new GPU (for example a GTX580), at approximately the same cost?
     
  2. User Retired 2

    User Retired 2 Notebook Nobel Laureate NBR Reviewer

    Reputations:
    4,127
    Messages:
    7,897
    Likes Received:
    2
    Trophy Points:
    0
    Intel's Series-7 chipset has a x4 3.0 capable northbridge

    Lookie lookie what have we here... Intel managed to sneak x4 pci-e 3.0 lanes onto the Northbridge of their Series-7 mobile chipset. A Thunderbolt link there would give the equivalent of x16 1.0 or x8 2.0 bandwidth, half the x16 2.0 of current desktops. That gives about 96% of x16 2.0 desktop performance.

    So it may well be true that Intel is holding back the release of Thunderbolt to work out all the bugs needed to give us x4 3.0 (x16 1.0) near desktop-class eGPU bandwidth. How nice of them, if that is the reason for the delay in releasing the Thunderbolt chips to developers.

    Otherwise, our current DIY eGPU hardware uses mPCIe/expresscard slots connected to the Southbridge. The Series-7 chipset uses the same pci-e 2.0 signalling there as the Series-6 chipset did, so it offers no advantage other than the faster and more efficient Ivy Bridge CPUs it hosts.

    [​IMG]
    REF:
    (1) Core i7-3720QM: Ivy Bridge Makes Its Mark On Mobility : Understanding Ivy Bridge's Real Target
    (2) PCI-SIG - FAQ - PCI Express 3.0
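    The lane/generation equivalences above (x4 3.0 ≈ x16 1.0 ≈ x8 2.0) can be sanity-checked with rough per-lane arithmetic. A minimal sketch, assuming only line-encoding overhead is counted (8b/10b for Gen 1/2, 128b/130b for Gen 3) and ignoring packet/protocol overhead:

```python
# Approximate usable bandwidth of one PCIe lane, in MB/s, from the raw
# signalling rate and the line encoding. Gen 1/2 use 8b/10b; Gen 3 uses
# 128b/130b. Divide by 8 to go from bits to bytes, by 1e6 for MB.
LANE_MBPS = {
    1: 2.5e9 * (8 / 10) / 8 / 1e6,     # 250 MB/s
    2: 5.0e9 * (8 / 10) / 8 / 1e6,     # 500 MB/s
    3: 8.0e9 * (128 / 130) / 8 / 1e6,  # ~985 MB/s
}

def link_mbps(gen, lanes):
    """Raw bandwidth of a PCIe link with the given generation and width."""
    return LANE_MBPS[gen] * lanes

print(link_mbps(3, 4))    # ~3938 MB/s
print(link_mbps(1, 16))   # 4000 MB/s
print(link_mbps(2, 8))    # 4000 MB/s
print(link_mbps(2, 16))   # 8000 MB/s (a full desktop x16 2.0 slot)
```

    So x4 3.0 carries essentially the same raw bandwidth as x16 1.0 or x8 2.0, about half of a desktop's x16 2.0, which is where the "half of current desktops" figure comes from.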

    Anybody have GTX670 x1.2Opt benchmark results?

    The GTX670 is the most affordable Kepler card for the time being... still a bit rich for some, checking in at ~US$400.
     
  3. SimoxTav

    SimoxTav Notebook Evangelist

    Reputations:
    273
    Messages:
    443
    Likes Received:
    0
    Trophy Points:
    30
    I'm sending back my GTX560 next Monday (the upgrade procedure was unlocked for me yesterday), so I should have the GTX670 by Friday of the same week.

    Yes, given 100FPS @x16 you get 80FPS @x1 + Optimus compression.

    About the CPU, I would say that for the Optimus compression plain clock speed matters more than the number of cores or other extra features (e.g. cache), so I would aim for a high-clocked dual core instead of an equally "daily performing" quad.

    Simone
     
  4. kizwan

    kizwan Lord Pringles

    Reputations:
    1,500
    Messages:
    3,229
    Likes Received:
    5
    Trophy Points:
    106
    Diablo III

    I just received this game today and haven't done any game benchmarks yet, but here are two screenshots of it running on the eGPU.

    Outdoor:-
    [​IMG]

    Indoor:-
    [​IMG]

    EDIT: I forgot to set FRAPS to capture the FPS when taking the screenshots. :(

    EDIT 2: I also received BF3 today, a gift from my friend. I noticed it needs more than 4GB of RAM. I only have 4GB and BF3 utilized it at almost 100%, which makes the game stutter a bit.
     
    Last edited by a moderator: Feb 6, 2015
  5. Silberfuchs

    Silberfuchs Newbie

    Reputations:
    0
    Messages:
    2
    Likes Received:
    0
    Trophy Points:
    5
    I wasn't talking about Thunderbolt, I was asking about the MSI GUS II.
    So it replaces...
    ...the cable.
    ...an enclosure.
    ...a PSU.
    Right? For ~$150 this seems like something you could easily do yourself, with more customization, more money saved, and the option to use faster GPUs.
    And the most important issue: You don't have to wait.
     
  6. Fabi3n

    Fabi3n Notebook Geek

    Reputations:
    40
    Messages:
    99
    Likes Received:
    0
    Trophy Points:
    15
    @kizwan: Yes, I play BF3 with my eGPU too (on the internal screen), and to get along with my 4GB of RAM I have to set the video res to 768p and detail to "normal" for it to be smoothly playable on big multiplayer servers.

    SimoxTav has good results with BF3 and his 8GB of RAM.
     
  7. bjorm

    bjorm Notebook Consultant

    Reputations:
    39
    Messages:
    110
    Likes Received:
    0
    Trophy Points:
    30
    I also play BF3: FHD, high, 64x64, Canals - FPS ~40-50, dropping to 25-30 when killed.

    Another question: do you think it will be necessary to reinstall anything if I swap the 560 Ti for, for example, a GTX580?
     
  8. Fabi3n

    Fabi3n Notebook Geek

    Reputations:
    40
    Messages:
    99
    Likes Received:
    0
    Trophy Points:
    15
    Yes bjorm, in multiplayer RAM is very important.
     
  9. ciccio64

    ciccio64 Notebook Enthusiast

    Reputations:
    39
    Messages:
    43
    Likes Received:
    0
    Trophy Points:
    15
    Hi guys. Helppp!!
    After two successful implementations, on the Ferrari One and on a friend's computer, now I need help. I am trying to implement an eGPU on my old laptop, a Toshiba Satellite A135-S2286 with a T2060 overclocked to 2.4GHz via a pin mod, and an ATI Radeon Xpress 200M. The problem is that the eGPU isn't recognized at all. If I standby the system and then resume with the eGPU connected, I just get the card's fan spinning and nothing else. The same happens if I boot with the eGPU powered and connected (sometimes I get a black screen). Any advice? The eGPU is a GeForce GT430. Please help!
     
  10. kizwan

    kizwan Lord Pringles

    Reputations:
    1,500
    Messages:
    3,229
    Likes Received:
    5
    Trophy Points:
    106
    FPS: 20 - 45

    This is my settings:-
    [​IMG]

    [​IMG]
     
    Last edited by a moderator: Feb 6, 2015
  11. Fabi3n

    Fabi3n Notebook Geek

    Reputations:
    40
    Messages:
    99
    Likes Received:
    0
    Trophy Points:
    15
    on internal screen ?
    (i added you as friend on battlelog ;) )
     
  12. bjorm

    bjorm Notebook Consultant

    Reputations:
    39
    Messages:
    110
    Likes Received:
    0
    Trophy Points:
    30
    Probably external - the screenshot is 1080p.
     
  13. EpicBlob

    EpicBlob Notebook Evangelist

    Reputations:
    49
    Messages:
    410
    Likes Received:
    16
    Trophy Points:
    31
    Intel has not yet released the TB chips, so until they do, we can only wait :(. The TB enclosures have been ready for release long enough that when Intel ships the chips, the enclosures should come out very soon after.
     
  14. Fabi3n

    Fabi3n Notebook Geek

    Reputations:
    40
    Messages:
    99
    Likes Received:
    0
    Trophy Points:
    15
    Yes! I think so too! But you never know!
     
  15. Big Lebowsky

    Big Lebowsky Newbie

    Reputations:
    0
    Messages:
    5
    Likes Received:
    0
    Trophy Points:
    5
    Hi,

    every now and then, when I have some time, I try to pick up some knowledge on the whole eGPU thing.

    Something I am still confused about is the PCIe 1.0/2.0 x1/x2/x4 configurations... Is there a link with information on this, so I can understand what's what and what speed you can get on each connection? I had something in mind about a 15% performance drop of x1 vs x16, but I don't get the 1.0 vs 2.0 comparison. What is the best connection you can get off an ExpressCard slot?

    Is there a difference between using ExpressCard/34 and ExpressCard/54?

    I'm running a Lenovo W520 - has someone already managed an eGPU setup with that machine? Is there something like a step-by-step DIY guide for it?

    Will the newest desktop graphics cards give a performance boost, or is the whole eGPU setup already at its maximum performance?

    Last but not least... is it a plug-and-play setup, since nvidia got Optimus on their latest mobile GPUs?

    Thx for replies,
    Cheers =)
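    For reference, the speeds behind those configurations can be worked out from per-lane rates. A rough sketch, assuming only line-encoding overhead and that the ExpressCard slot carries a single x1 link (the usual case):

```python
# Per-lane PCIe rates in MB/s after 8b/10b encoding (Gen 1 and Gen 2).
GEN1_LANE, GEN2_LANE = 250, 500

# An ExpressCard slot exposes one x1 lane, so the best case is x1 2.0,
# if the notebook's chipset runs the slot at Gen 2 speeds.
x1_10 = 1 * GEN1_LANE    # 250 MB/s
x1_20 = 1 * GEN2_LANE    # 500 MB/s
x16_20 = 16 * GEN2_LANE  # 8000 MB/s, a desktop Gen 2 x16 slot

print(f"x1 2.0 carries {x1_20 / x16_20:.2%} of x16 2.0 raw bandwidth")
```

    The often-quoted ~15% performance drop at x1 is much smaller than the raw bandwidth gap suggests, because games rarely saturate the bus once textures are resident in video memory.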
     
  16. kizwan

    kizwan Lord Pringles

    Reputations:
    1,500
    Messages:
    3,229
    Likes Received:
    5
    Trophy Points:
    106
    I have a non-Optimus eGPU: an ATI 5870 (@x2). (Thanks. :))

    Additional Diablo III screenshots.
    [​IMG]

    [​IMG]

    [​IMG]
     
    Last edited by a moderator: Feb 6, 2015
  17. LoneNF

    LoneNF Notebook Consultant

    Reputations:
    0
    Messages:
    113
    Likes Received:
    0
    Trophy Points:
    30
    No. The ExpressCards for the ViDock and the PE4L are 34mm, but they have an adapter piece, so the 34mm card sits firmly in the 54mm slot.
    A real ExpressCard/54 is only possible if you solder the adapter yourself (as I am doing at the moment).
    Only on some systems. For example, on a Dell Latitude E6520 without a dedicated graphics card.
    I have an E6520 with a dedicated graphics card, so I will have to use Setup 1.x to disable it before starting Windows.
     
  18. mattbuzz

    mattbuzz Newbie

    Reputations:
    0
    Messages:
    4
    Likes Received:
    0
    Trophy Points:
    5
    Hi all

    I am a forum post virgin, so please be gentle! I've been reading lots of posts and got quite far given my very limited knowledge and skills, but now I have really run out of talent and I'm stuck.

    I have a PE4H and HD5770 connected to an MSI CR620 laptop (the older one with the P6100 processor) with 4GB of RAM, which means I can't use Setup 1.x to overcome the Code 12 error.

    I've managed to update the DSDT table following the awesome information on here, and the Large Memory entry is now in Device Manager. However, I'm still getting the Code 12 error, and from the looks of it the TOLUD is still at 3.5GB (although I'm not sure). (Screengrab attached.)

    I'm not sure where to go from here, but I'm guessing that I need to get the TOLUD down to 3.25GB.
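    For context, the reason 3.25GB is the usual target can be shown with some illustrative 32-bit address-space arithmetic. This is a sketch of the underlying math, not the fix itself; the actual usable space also depends on what the other devices in the system have already claimed:

```python
# Everything between TOLUD (Top Of Lower Usable DRAM) and the 4 GB line is
# the 32-bit MMIO window that PCI device BARs, including the eGPU's large
# memory BAR, must fit into.
FOUR_GB_MB = 4096

def mmio_window_mb(tolud_gb):
    """MMIO space left between TOLUD and 4 GB, in MB."""
    return FOUR_GB_MB - int(tolud_gb * 1024)

print(mmio_window_mb(3.5))   # 512 MB: often too small once the iGPU and
                             # other devices have claimed their BARs -> Code 12
print(mmio_window_mb(3.25))  # 768 MB: the extra 256 MB leaves room for a
                             # typical video card's large memory BAR
```

    Lowering TOLUD by 0.25GB frees another 256MB of MMIO space, which is why the DSDT override (or Setup 1.x's PCI compaction) targets that value.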

    Any help would be massively appreciated.
    Matt
    [​IMG]
     
  19. Kuro-D

    Kuro-D Notebook Enthusiast

    Reputations:
    55
    Messages:
    36
    Likes Received:
    0
    Trophy Points:
    15
    @kizwan

    I had the QWORD at the end originally too, but either way it BSODs.

    So, an update: I put the QWORD at the end of that area. I used the newest version of iasl, but that just revealed more errors.

    I fixed a few with the buffer/package replacement tip from your FAQ, and also changed one of the _IRQ entries to IRQ (like you recommended to someone else in the thread a while back).

    This resulted in 0 errors and just a couple of warnings.

    But even after doing all that and loading the table with asl, there is still no Large Memory entry, and a BSOD on restart.

    Attached are my original .dsl and my newest modified one, if you could take another look through them... or perhaps, if you were so kind, modify the original .dsl with the edits I need to get this working.

    Thanks so much kizwan, you've been helpful so far, but for some reason it just isn't taking the modified DSDT.

    I've also noticed that even if I take my original .dsl and run it through iasl I get errors (without changing a thing). And if I take the .aml that is produced using the latest iasl and load it back in, I get a BSOD as well.

    Is it possible that my original DSDT isn't correct somehow? Or that Windows Server 2008 R2 has some issue with loading a modified DSDT?
     

    Attached Files:

  20. carage

    carage Notebook Consultant

    Reputations:
    9
    Messages:
    224
    Likes Received:
    2
    Trophy Points:
    31
    All I can tell you is that it is generally a plug-and-play setup if you use a ViDock. Unfortunately, the same cannot be said for the DIY eGPU solution.
    I have a W520 and a ViDock 4+, and I was able to pair it effectively with a GTX 460, a GTX 560 Ti, and now a GTX 670. The experience is mostly plug and play. Nando's eGPU Setup software is not a necessity on the W520 unless you want to use an AMD video card. I had a lot of headaches when I tried to install a Radeon 7950, and gave up after two weeks of trouble.
    I also have a PE4H-PM3N setup for troubleshooting. It had trouble with the GTX 670: I don't know why, but I got a Code 10 in Windows 7 x64, and the HDMI device on the GTX 670 refused to load regardless of whether I disabled the onboard Quadro 1000M or not.
    Ironically, the ViDock box does not have the same issues, and the eGPU, dGPU, and iGPU all coexist peacefully. However, the ViDock did have some issues regarding placement: the fan whines loudly if I do not flip the ViDock over and let the air holes face up. ViDock support claims it might be a problem with the weight of the card bending the PCB.
     
