I have finally upgraded the name of the thread to "3090" - after regaining my confidence that this can actually work, which got shaken a little over the past week.
I had a mild heart attack after getting repeated blackout/reboot cycles as soon as I'd do anything beyond just showing the desktop with the 3090 plugged into the system - and for a while I was concerned I had got my math for the PSU wrong, as typically this indicates that the GPU doesn't get enough power...
As I still haven't seen anything packing more power into something that fits the S4M than the Supermicro 804p, this would have been a hard stop for this project. Eventually I got things to run stable with the GPU at an 80% power limit, which brings its TDP down from 350W (or 347, to be precise) to the 280W I had running successfully with my 2080 Ti - but that can't be the solution, and 800W on a single 12V rail should indeed be able to cope with what I want to throw at it.
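For the record, here's the back-of-envelope budget I'm working from (the CPU package power and the "rest of system" figure are assumptions on my part, not measurements):

```python
# Back-of-envelope power budget - CPU package power and "rest of system"
# are assumptions, not measurements.
GPU_LIMIT_W = 350     # 3090 at 100% power limit (347W, rounded)
CPU_PPT_W   = 142     # assumed stock package power limit of a 5950X
REST_W      = 50      # rough guess for fans, pump, drives, board
RAIL_W      = 800     # single 12V rail of the Supermicro 804p

total = GPU_LIMIT_W + CPU_PPT_W + REST_W
print(f"Estimated peak draw: {total}W of {RAIL_W}W "
      f"({total / RAIL_W:.0%} of the rail)")
print(f"GPU at the 80% limit that ran stable: {0.8 * GPU_LIMIT_W:.0f}W")
```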
So, next hypothesis: the wiring has too much resistance. There are a lot of sources on the web explaining why you should NOT wire a 3090 to a single power feed that just doubles up to two PCIe connectors at the GPU end - which is exactly what I did for my 2080 Ti and have recycled here. Most sources say the point is to use two independent power rails at the PSU, which makes sense for an ATX PSU, but in my case, with a single 800W 12V rail, that should not make a difference - unless there is simply not enough cross section to get the power through.
I had dimensioned my wiring with a single pair of AWG 14 as the bottleneck, which in theory is rated for 80A - that's 960W at 12V, so this can't be it, really.
The 3.5mm bullet connectors, however, are much closer to their limits - rated for 40A, they max out at 480W.
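Putting the two ratings side by side makes the difference obvious - this is just the ratings quoted above multiplied out, nothing measured:

```python
# Sanity check of the 12V feed, using the ratings quoted above.
V_RAIL = 12.0

def max_power_w(rated_amps):
    """Maximum power a link can carry at its rated current on the 12V rail."""
    return rated_amps * V_RAIL

links = {
    "AWG 14 pair":            80,  # A - the rating I dimensioned the loom with
    "3.5mm bullet connector": 40,  # A - per the connector spec
}

GPU_DRAW_W = 350  # 3090 at 100% power limit

for name, amps in links.items():
    cap = max_power_w(amps)
    print(f"{name}: {cap:.0f}W capacity, "
          f"{GPU_DRAW_W / cap:.0%} utilised at full GPU load")
```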
Whilst this is all within spec, of course, temperature and resistance go up with load, and driving something close to capacity for a long period is never a good idea.
Looking at the bullet connectors of my original wiring confirms this - they used to be golden, but now look like they have spent some time in an oven, which certainly does not help conductivity; and of course they age mechanically too, contributing to what may well be a resistance problem.
To test my hypothesis, I soldered a new set of wires together, using twice the cross section for most of the run to bring resistance down (that helps regardless of rating), with a set of brand-new bullet connectors at the end - for the moment avoiding soldering additional connectors onto the PSU before I know that this is actually the problem.
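The reasoning behind the fatter wires is just the plain resistance formula - double the cross section, halve the resistance. A quick sketch with illustrative numbers (the run length is an assumption, not a measurement of my actual loom):

```python
# R = rho * L / A - illustrative numbers; only the AWG 14 cross section is a real spec.
RHO_COPPER = 1.68e-8     # Ohm*m, resistivity of copper
LENGTH_M   = 0.4         # assumed one-way run from PSU to GPU
AWG14_MM2  = 2.08        # cross section of AWG 14 in mm^2

def resistance_mohm(area_mm2):
    """Resistance of a copper run of LENGTH_M, in milliohms."""
    return RHO_COPPER * LENGTH_M / (area_mm2 * 1e-6) * 1000

print(f"single AWG 14:         {resistance_mohm(AWG14_MM2):.1f} mOhm")
print(f"doubled cross section: {resistance_mohm(2 * AWG14_MM2):.1f} mOhm")
```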
And voila - things run stable at 100% power limit!
As a general precaution, and in preparation for the next challenge in the form of cooling, I have read up a bit on undervolting (OptimumTech has great guides on both the 5950X and the 3080/3090). With my current settings - 15 (strangely unit-less) "counts" down in Precision Boost Overdrive (that's anywhere between -36 and -60 mV) for the 5950X, and 862mV @ 1866 MHz for the GPU - I beat stock performance on both (north of 20,000 GPU score in Time Spy), whilst saving 49W of TDP on the GPU. I cannot quantify the power savings on the CPU, but even disregarding those, I am now within 14(!)W of the 2080 Ti/3950X setup which I have successfully cooled with my current radiator configuration.
The fan upgrade to a Noctua NF-A14 industrialPPC should be able to deal with the delta - that's the plan, at least!
There is one new item on my shopping list for this build as a result of this detour, though: a second set of bullet connectors for the PSU-GPU link.
Whilst things now work, the bullet connectors run too close to maximum capacity for comfort, and taking temperature readings whilst running benchmarks confirms it:
83 degrees at the connector! No wonder the last set started looking a bit sad after a while...
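For a rough feel of why a connector gets this hot: even a few milliohms of contact resistance dissipate real heat at these currents. The contact resistance below is a purely illustrative assumption, not something I measured:

```python
# Rough I^2*R estimate for one bullet connector - the contact resistance is assumed.
V_RAIL = 12.0
GPU_DRAW_W = 301           # undervolted 3090, see above
CONTACT_RES_OHM = 0.005    # 5 mOhm, illustrative figure for an aged connector

current_a = GPU_DRAW_W / V_RAIL
loss_w = current_a ** 2 * CONTACT_RES_OHM
print(f"{current_a:.0f}A through the feed -> roughly {loss_w:.1f}W "
      "dissipated in a single contact")
```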
And that whilst the PSU is not crossing 40 degrees - so the connectors indeed seem to be the bottleneck in power delivery. Clearly, there is some more soldering ahead... and hopefully some custom parts in the post soon, to tackle the GPU mod!