PCI-E Bifurcation

Biowarejak

Maker of Awesome | User 1615
Platinum Supporter
Mar 6, 2017
1,744
2,262
As far as I'm aware most GPUs with external power delivery draw a negligible amount from the motherboard itself :)
 

aquelito

King of Cable Management
Piccolo PC
Feb 16, 2016
952
1,122
Depends on the GPU. Some 980 Ti cards drew almost nothing from the PCIe slot.
No longer the case with the 1080 Ti!

The safest way to do things would be to cut the +12V line from the C_Payne riser PCB and have it provided by the PSU through a single 14AWG wire.
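For anyone sizing that wire, here's a rough back-of-the-envelope check in Python. The 14AWG resistance figure is the usual copper value, the 0.3 m run length is my own guess, and 66 W is the +12V share of the 75 W slot budget; treat all three as assumptions:

```python
# Rough sanity check: can a single 14AWG wire carry the PCIe slot's +12V budget?
AWG14_OHMS_PER_M = 0.008286  # typical resistance of 14AWG copper at 20 C

def wire_drop(power_w, volts, length_m, ohms_per_m=AWG14_OHMS_PER_M):
    """Return (current_a, voltage_drop_v, dissipated_w) for one conductor."""
    current = power_w / volts            # slot +12V allocation: 66 W -> 5.5 A
    resistance = ohms_per_m * length_m   # total resistance of the run
    drop = current * resistance          # voltage lost along the wire
    return current, drop, current * drop # heat dissipated in the wire itself

# 66 W at 12 V over an assumed 0.3 m run:
i, v, p = wire_drop(66, 12, 0.3)
print(f"{i:.2f} A, {v * 1000:.1f} mV drop, {p:.2f} W in the wire")
```

The dissipation comes out well under a watt, which matches the suggestion that one 14AWG conductor is plenty for the slot's +12V share.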
 

Elerek

Cable-Tie Ninja
Jul 17, 2017
228
165
Has anyone tried SLI with ASRock's X370 mini-ITX board yet? The board is supposed to support bifurcation and the X370 chipset is supposed to support SLI, so I would think it should be possible.
 

Elerek

Cable-Tie Ninja
Jul 17, 2017
228
165
Does Ryzen put bifurcated cards in separate IOMMU groups?
My current setup uses VFIO to run VMs for myself and my wife, each with our own GPUs, and I'm wondering how small I could make a setup like that.
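For whoever tries this: the grouping is easy to check from Linux before committing to a build. A small sketch; the sysfs layout below is the standard `/sys/kernel/iommu_groups` one, but verify against your kernel's docs:

```python
# List which PCI devices share an IOMMU group (Linux sysfs layout assumed).
from collections import defaultdict
from pathlib import Path

def iommu_groups(base="/sys/kernel/iommu_groups"):
    """Map IOMMU group number -> list of PCI addresses in that group."""
    groups = defaultdict(list)
    # Layout: <base>/<group>/devices/<pci-address>
    for dev in Path(base).glob("*/devices/*"):
        groups[dev.parent.parent.name].append(dev.name)
    return dict(groups)

if __name__ == "__main__":
    # Two GPUs can only go to separate VMs if they land in separate groups
    # (or can be split with something like the ACS override patch).
    for group, devices in iommu_groups().items():
        print(group, *devices)
```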

I've got a build log of my current rig on linustechtips.com, but it's not for the faint of heart. It uses a used dual socket server motherboard, so it's massive :(
(about 2x2... feet)

I had a ton of fun building it, but my dream build is a Ryzen 7 ITX system (or Threadripper, if that ever comes to ITX...) with waterblocked GPUs in a slim-style case (like the Sentry, Node 202, S4 Mini, etc).
 

jeshikat

Jessica. Wayward SFF.n Founder
Silver Supporter
Feb 22, 2015
4,969
4,780
I would have thought I'd feel the cable get warm if it was doing that. The 1080 Ti can't exclusively draw power from the PCIe connectors if it doesn't get enough power from the motherboard, can it?

The cables could be capable of enough amperage that they don't warm up. And as we found out with the RX 480 launch power-draw fiasco, motherboards don't have fuses or anything else to prevent overdraw from the PCIe slot.
 

LukeD

Master of Cramming
Case Designer
Jun 29, 2016
498
1,305
On a positive note ...

I got Bifurcation and SLI working on the Asrock Z270 ITX/ac variant too :)



It was a little harder than the x370 Asrock ITX/ac board, but it works without a hitch.

 

msystems

King of Cable Management
Apr 28, 2017
781
1,366
On a positive note ...

I got Bifurcation and SLI working on the Asrock Z270 ITX/ac variant too :)



It was a little harder than the x370 Asrock ITX/ac board, but it works without a hitch.


Wow, that performance is really quite impressive...
This can only get better from here. I hope this concept will get more industry support and products.

Tom's Hardware's testing of the 1080 Ti showed it drawing 40-50W from the PCIe slot's 12V line under non-overclocked max load.
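For scale, that slot draw is well inside the slot's nominal rating. A quick sketch of the arithmetic; the 5.5 A / 66 W figure is the x16 slot's +12V allocation as I understand the CEM spec, so treat it as an assumption:

```python
# How close does a measured slot draw get to the x16 slot's +12V limit?
SLOT_12V_LIMIT_A = 5.5  # x16 slot +12V allocation: 5.5 A, i.e. 66 W (assumed)

def slot_headroom(watts):
    """Return (current_a, remaining_headroom_a) for a given +12V slot draw."""
    amps = watts / 12.0
    return amps, SLOT_12V_LIMIT_A - amps

for w in (40, 50):
    a, spare = slot_headroom(w)
    print(f"{w} W -> {a:.2f} A, {spare:.2f} A of headroom")
```

So even the 50 W reading leaves over an amp of headroom on a spec-compliant slot; the riser question is what happens when the slot's share can't actually be delivered.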

I researched this a little when I was looking at risers.
Here is the short version:

What I found is that the mainboard 12V draw seen during review testing isn't indicative of the actual (mandatory) power burden placed on a riser. The reason seems to be that a 6- or 8-pin PCIe cable can provide much more than its 75W/150W "specification" to make up for the resistance in the riser. The PSU doesn't seem to limit this draw; it's up to the graphics card to self-enforce the spec. So when using a riser, a graphics card can end up drawing much more than the rated spec through this connector.

The problems seem to occur in two situations:
1. There is some circuitry in the graphics card which relies on power draw from the PCIe slot only, and which can't otherwise be fed from the 6/8-pin connector.
2. The draw from the 6/8-pin connector is pushed so far out of spec (at least 150% of the rated figure) that it triggers some enforcement mechanism in either the PSU or the graphics card circuitry.
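The second condition above can be sketched as a toy checker. The 150% trip point is this post's estimate, not a documented limit, and the ratings are the nominal +12V portions:

```python
# Toy "out of spec" checker for the enforcement idea described above.
RATED_W = {"slot": 66, "6pin": 75, "8pin": 150}  # nominal +12V ratings

def over_spec(draw_w, source, trip=1.5):
    """True if a source is pushed past `trip` times its nominal rating."""
    return draw_w > trip * RATED_W[source]

# e.g. a riser shifting the slot's ~40 W share onto one 8-pin cable:
print(over_spec(150 + 40, "8pin"))  # 190 W on an 8-pin: under the 225 W trip
print(over_spec(240, "8pin"))       # past the assumed 150% trip point
```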

Any card without an external power connector is asking for big trouble. Other than that, the worst problem card would be a 6-pin RX 480.

The GTX 1080 Ti shouldn't have any problem because it has two 8-pin connectors.
 

EdZ

Virtual Realist
May 11, 2015
1,578
2,107
If there is some circuitry in the graphics card which relies on power draw from the PCI slot only, which can't otherwise be alleviated from the 6/8-pin connector.
This is basically the case. Few if any GPUs (I am not aware of any, but it's possible one was made in the past) have switching circuitry to choose which power phase is connected to which 12V supply rail, and adding 'optional' power phases - e.g. one phase connected to the card edge and another to the PEG connectors, with only one used at a time - is a waste of components and thus money down the drain.
The RX 480 power-draw incident shows that, with a BIOS update, you can implement an unbalanced phase scheme to shift power draw without causing instability in normal use; notably, though, once you start to overclock, the cards fall back to a balanced phase scheme and draw high power from the card-edge rails. A card manufacturer could implement a 'dynamic phase unbalancing' scheme to reduce demand on the PCIe card-edge power rails when needed, but there's not much reason to do so (x16 slots can be assumed to be spec-compliant and able to supply the full spec current), and there's no real way to sense that a particular slot can't provide the spec current - especially when the motherboard reports that it can - other than ramping up the current until the Magic Blue Smoke is emitted.
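The hardwired-phase point can be illustrated with a toy model: each VRM phase is tied to one 12V input, so shifting load weighting between phases shifts the input draw. The phase count and wiring here are invented purely for illustration; real VRM controllers are more subtle:

```python
# Toy model of phase (un)balancing: each phase is hardwired to one 12V input.
def input_draw(total_w, phase_inputs, weights):
    """Sum per-phase power into the 12V input each phase is wired to."""
    draw = {}
    scale = total_w / sum(weights)  # watts carried per unit of weight
    for inp, w in zip(phase_inputs, weights):
        draw[inp] = draw.get(inp, 0.0) + w * scale
    return draw

# Hypothetical 6-phase card: 2 phases on the slot, 4 on the PEG connectors.
wiring = ["slot", "slot", "peg", "peg", "peg", "peg"]
print(input_draw(180, wiring, [1, 1, 1, 1, 1, 1]))        # balanced: 60 W slot
print(input_draw(180, wiring, [0.5, 0.5, 1, 1, 1, 1]))    # unbalanced: 36 W slot
```

Down-weighting the slot-wired phases moves load onto the PEG cables without any switching hardware, which is essentially what the RX 480 BIOS fix did.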
 

LukeD

Master of Cramming
Case Designer
Jun 29, 2016
498
1,305
PCIe bifurcation + SLI is also possible on the Gigabyte GA-Z270N-Gaming 5 (F4 BIOS).
What's more interesting is that performance on the Gigabyte board is a little better than on the ASRock ITX, using the exact same CPU, memory, and settings.

 

jeshikat

Jessica. Wayward SFF.n Founder
Silver Supporter
Feb 22, 2015
4,969
4,780
So when using a riser, a graphics card will be drawing much higher than the rated spec through this connector.

Does the GPU change its power draw from the PCIe slot when on a riser though? I would assume the riser is basically transparent to the GPU.
 

jeshikat

Jessica. Wayward SFF.n Founder
Silver Supporter
Feb 22, 2015
4,969
4,780
What's the mechanism for that? The motherboard negotiates with the video card to allow full power on an x16 link, but I'm not aware of any way for the motherboard to know to limit power with an x16 riser.
 

Biowarejak

Maker of Awesome | User 1615
Platinum Supporter
Mar 6, 2017
1,744
2,262
I really don't know, but could it be something as simple as electrical resistance? I noticed my Sintech riser has resistors near the power pins.
I figured it was something like that. Risers are fairly common, and I imagine GPU manufacturers know they may not want to pull a full load through one. I'm surprised we haven't stumbled on a specification standard. Anyone have a contact in the engineering department of one of the partner companies?
 