M.2 SLI GPU

NADRIGOL

Caliper Novice
Original poster
Sep 25, 2017
23
23
Hi everyone, I've searched the forum and done some googling, but haven't found an example of this yet. I've seen people using a series of adapters to run a GPU from an M.2 port. Mostly the interest seems to be in doing this with a NUC, where PCIe is otherwise unavailable. But has anyone added a second GPU to an ITX board and successfully run it in SLI with the PCIe card?
 

NADRIGOL

Caliper Novice
Original poster
Sep 25, 2017
23
23
I don't think Nvidia GPUs will run at x4. Perhaps CrossFire would work though.

Hmm, yes... good point. I wonder if there's a way to bridge multiple M.2 ports. The new ASRock board has two on the back. It would be very cool if there were a way to convert them, bridge them, and have them operate as a single eight-lane PCIe link.
 
Mar 6, 2017
501
454
It depends on whether they come off the same source or not; if four lanes are coming from the CPU and four from the PCH, I highly doubt it'll work.
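For what it's worth, on Linux you can usually tell which side a slot hangs off by looking at the device's sysfs path (`readlink /sys/bus/pci/devices/<addr>`). Here's a rough heuristic sketch; the root-port addresses are assumptions based on typical Intel consumer platforms (00:01.x for the CPU's PEG ports, 00:1b/1c/1d.x for the PCH), so check your board's block diagram:

```python
# Heuristic sketch: guess whether a PCIe device hangs off CPU or PCH (chipset)
# lanes from its sysfs path.  Root-port addresses are assumptions for typical
# Intel consumer platforms -- not authoritative.

CPU_ROOT_PORTS = ("0000:00:01.",)                               # CPU PEG root ports
PCH_ROOT_PORTS = ("0000:00:1b.", "0000:00:1c.", "0000:00:1d.")  # PCH root ports

def lane_source(sysfs_path: str) -> str:
    """Classify a path like /sys/devices/pci0000:00/0000:00:1c.4/0000:03:00.0."""
    for segment in sysfs_path.split("/"):
        if segment.startswith(CPU_ROOT_PORTS):
            return "CPU"
        if segment.startswith(PCH_ROOT_PORTS):
            return "PCH"
    return "unknown"

# Hypothetical example paths:
print(lane_source("/sys/devices/pci0000:00/0000:00:01.0/0000:01:00.0"))  # CPU
print(lane_source("/sys/devices/pci0000:00/0000:00:1c.4/0000:03:00.0"))  # PCH
```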
 

Chrizz

Average Stuffer
Jan 23, 2017
74
81
I don't know the answer to your question, but you may want to look into PCIe bifurcation. It allows you to run two cards in SLI from one x16 slot.
 

NADRIGOL

Caliper Novice
Original poster
Sep 25, 2017
23
23
I don't know the answer to your question, but you may want to look into PCIe bifurcation. It allows you to run two cards in SLI from one x16 slot.

Yes, I've been reading about it this morning. Does anyone have a good article or recent summary thread with solid info on bifurcation? I see a few threads here and links to [H] threads, but they span several years, so information about specific boards and BIOSes is lacking for recent Z370 products.
 

ChainedHope

Airflow Optimizer
Jun 5, 2016
306
459
The newest bifurcation thread here has the most up-to-date information as far as I'm aware. @LukeD is also pretty informative on the subject, as he has tested multiple setups and done some BIOS-level hacking to get it working correctly with SLI.

As to your M.2 SLI question, I wouldn't try it. M.2 runs at x4, which limits you to AMD CrossFire; Nvidia requires x8 (even though it technically could run at x4). If you did go this route, you'd need to consider two things.

1) Are both M.2 slots running through the CPU or through the chipset? If both go through the chipset, it's not possible. If one goes through the chipset and the other through the CPU, it's also not possible. They both need to be connected directly to the CPU. This is because the chipset's link is only x4, and it's shared between SATA/networking/USB/etc., so you don't actually get x4.

2) Adapter clutter. It's not a clean solution and can cause some headaches, since you're introducing new variables that can stop it from working. You'll end up using a lot of space for adapters and PCIe extension cables. Now, if this were for an external eGPU... maybe? It's still a messy solution, and to my knowledge multi-GPU doesn't work that way. I could be wrong about that.
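To put rough numbers on the x4 limitation: PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, so raw one-direction bandwidth (ignoring packet and protocol overhead) scales with lane count like this quick sketch shows:

```python
# Approximate one-direction bandwidth of a PCIe 3.0 link:
# 8 GT/s per lane, 128b/130b encoding, protocol overhead ignored.

def pcie3_bandwidth_gbps(lanes: int) -> float:
    """Rough usable bandwidth in GB/s for a PCIe 3.0 link of `lanes` lanes."""
    return lanes * 8 * (128 / 130) / 8  # GT/s (bits) -> GB/s (8 bits/byte)

for lanes in (4, 8, 16):
    print(f"x{lanes}: ~{pcie3_bandwidth_gbps(lanes):.2f} GB/s")
# x4: ~3.94 GB/s, x8: ~7.88 GB/s, x16: ~15.75 GB/s
```

So an M.2 slot tops out at roughly half of what Nvidia's x8 minimum would provide.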
 

LukeD

Master of Cramming
Case Designer
Jun 29, 2016
498
1,305
Yeah, I would stay away from M.2 SLI, as mentioned above, because of the x4 limitation.
If you want a really small system, have a look at ITX + PCIe bifurcation, and look at getting some mini-ITX GPUs with waterblocks and putting everything under water :) Also, I think I saw Sliger had a case design going on this forum for such a case :)
 

Thestarkiller32

Cable-Tie Ninja
Aug 13, 2017
152
102
The newest bifurcation thread here has the most up-to-date information as far as I'm aware. @LukeD is also pretty informative on the subject, as he has tested multiple setups and done some BIOS-level hacking to get it working correctly with SLI.

As to your M.2 SLI question, I wouldn't try it. M.2 runs at x4, which limits you to AMD CrossFire; Nvidia requires x8 (even though it technically could run at x4). If you did go this route, you'd need to consider two things.

1) Are both M.2 slots running through the CPU or through the chipset? If both go through the chipset, it's not possible. If one goes through the chipset and the other through the CPU, it's also not possible. They both need to be connected directly to the CPU. This is because the chipset's link is only x4, and it's shared between SATA/networking/USB/etc., so you don't actually get x4.

2) Adapter clutter. It's not a clean solution and can cause some headaches, since you're introducing new variables that can stop it from working. You'll end up using a lot of space for adapters and PCIe extension cables. Now, if this were for an external eGPU... maybe? It's still a messy solution, and to my knowledge multi-GPU doesn't work that way. I could be wrong about that.



...could that work over the PLX Chip?
 
Mar 6, 2017
501
454


...could that work over the PLX Chip?

In theory yes, but only if the PLX chip is providing an x16 connection. To my knowledge, because this is a dual-GPU card, if it is fed an x8 connection each GPU will only have four lanes and will not work. I wouldn't do this anyway, because IIRC the 1080 beats a Titan Z.
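If anyone does experiment with adapters or a PLX riser, one quick sanity check on Linux is the `LnkSta:` line in `lspci -vv` output, which shows the width the link actually trained at (as opposed to `LnkCap:`, the device's maximum). A minimal parsing sketch, with the sample text made up for illustration:

```python
import re

# The sample below is a hypothetical fragment of `lspci -vv` output; on a real
# system you'd capture it with something like: lspci -vv -s <bus:dev.fn>
SAMPLE = """\
LnkCap: Port #0, Speed 8GT/s, Width x16
LnkSta: Speed 8GT/s, Width x4 (downgraded)
"""

def negotiated_width(lspci_text: str) -> int:
    """Return the lane count from the LnkSta line of `lspci -vv` output."""
    match = re.search(r"LnkSta:.*\bWidth x(\d+)", lspci_text)
    if match is None:
        raise ValueError("no LnkSta line found")
    return int(match.group(1))

print(negotiated_width(SAMPLE))  # -> 4
```

A GPU behind an x4 M.2 adapter should report `Width x4` here no matter what the card itself is capable of.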