aidenn0 2 days ago

Author has an article on that too[1]

To explain what I learned on my own:

If you want to (for example) put 4 NVMe drives (4 lanes each) in a 16x slot, then you need two things:

1. The 16x slot actually needs to have 16 lanes (on consumer motherboards there is typically only one slot like this, and on many boards the second 16x slot shares its lanes with the first, so the second will need to be empty)

2. You need to configure the PCIe controller to treat that slot as 4 separate x4 slots (this is called PCIe bifurcation).

For recent Ryzen CPUs, the first 16x slot usually goes directly to the CPU, and the CPU supports bifurcation, so (assuming your BIOS allows enabling bifurcation; most recent ones do) all you need to do is figure out which PCIe lanes go to which slots (the motherboard manual will have this).
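Once bifurcation is enabled, you can sanity-check from Linux that each drive actually trained at x4. This is not from the comment, just a minimal sketch that reads the standard sysfs `current_link_width` attribute for every PCIe device:

```python
# Sketch: list each PCIe device's negotiated link width from Linux sysfs.
# Useful to confirm that e.g. each NVMe drive on a bifurcated 16x slot
# trained at x4. Assumes the standard /sys/bus/pci/devices layout.
from pathlib import Path

def link_widths(root="/sys/bus/pci/devices"):
    widths = {}
    for dev in sorted(Path(root).glob("*")):
        attr = dev / "current_link_width"
        try:
            widths[dev.name] = attr.read_text().strip()
        except OSError:
            continue  # attribute missing or unreadable for this device
    return widths

if __name__ == "__main__":
    for addr, width in link_widths().items():
        print(f"{addr}: x{width}")
```

If a drive shows x1 or x2 instead of x4, the slot is likely not bifurcated (or the lanes are mapped to a different slot than you expected).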

If you aren't going to use the integrated graphics, you'll also need a slot for your GPU. With the real 16x slot taken by the NVMe card, the GPU's slot will at best have 4 lanes, and on most motherboards (all 800 series chipset motherboards from all major manufacturers) those lanes will be multiplexed over the chipset, so now your GPU is sharing bandwidth with e.g. USB and Ethernet, which seems less than ideal (I've not benchmarked, maybe someone else has?).

In the event that you want to do a 4x NVMe card in a 16x slot, I found that the "MSI X670E ACE" has a 4x slot that does not go through the chipset, as does the "ASRock B650M Pro x3D RS WiFi"; either of those should work with a 9000 series Ryzen CPU.

Threadripper CPUs have like a gajillion PCIe lanes, so there shouldn't be any issues there.

I have also been told that there are some (expensive) PCIe carrier cards with an onboard PCIe switch, which don't require bifurcation support from the motherboard. I haven't tried them.

1: https://utcc.utoronto.ca/~cks/space/blog/tech/PCIeBifurcatio...