All photo credit: Cooler Master

Not content with simply having the most powerful SFX PSUs, Cooler Master has shown off its new SFX-L V Series models. Like their smaller SFX siblings, the V1100 SFX-L Platinum and V1300 SFX-L Platinum are both 80 Plus Platinum certified, rated at 1100 and 1300 watts respectively. Both also feature three PCI-E cables with dual 6+2 connectors (six connections in total), plus a PCI-E 5.0 connector. What separates them from the SFX units is that the SFX-L units use a 120mm FDB fan instead of a smaller 92mm unit. In theory, this should allow for quieter operation under load.
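For a sense of what Platinum certification implies at these wattages, here's a rough back-of-the-envelope sketch. It assumes the standard 80 Plus Platinum minimum-efficiency thresholds at 115V (90/92/89% at 20/50/100% load), not Cooler Master's actual measured curves:

```python
# Rough wall-draw estimate for the V1300 SFX-L Platinum, using the
# 80 Plus Platinum *minimum* efficiency thresholds (115V internal:
# 90% at 20% load, 92% at 50%, 89% at 100%). Real units usually do
# a bit better; these are illustrative numbers, not measured data.

RATED_WATTS = 1300
PLATINUM_FLOOR = {0.20: 0.90, 0.50: 0.92, 1.00: 0.89}

for load_frac, efficiency in PLATINUM_FLOOR.items():
    dc_out = RATED_WATTS * load_frac
    wall_draw = dc_out / efficiency  # AC input = DC output / efficiency
    print(f"{dc_out:6.0f} W out -> ~{wall_draw:6.0f} W from the wall "
          f"({wall_draw - dc_out:.0f} W lost as heat)")
```

Even at Platinum efficiency, a full 1300 W load means roughly 1460 W pulled from the wall, with over 150 W dumped as heat inside the case.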

Currently, the most powerful SFX-L PSU is the SilverStone SX1000 Platinum, which we recently used to build our 12900K / RTX 3080 SFF machine. The SX1000 is dead quiet up to about 200 watts. It will be interesting to see whether Cooler Master can push that silence barrier further with the V Series.

[Gallery: V SFX-L Platinum product photos]

Here are the full specs for the PSUs:

V1100 SFX-L Platinum

[spec sheet image]

V1300 SFX-L Platinum

[spec sheet image]
You can check out the official website by clicking HERE. However, it contains some errors at the time of writing, so I recommend downloading the spec sheets located HERE.
 

Skripka

Cat-Dog Perch Manager
May 18, 2020
461
567
What would be funny and ironic... if, by the time there are finally any GPUs available to buy (2023 at the earliest), GPUs go back down to sane power-draw levels and 1 kW isn't necessary anymore.
 

Phuncz

Lord of the Boards
SFFn Staff
May 9, 2015
5,937
4,951
What would be funny and ironic... if, by the time there are finally any GPUs available to buy (2023 at the earliest), GPUs go back down to sane power-draw levels and 1 kW isn't necessary anymore.
It seems like GPUs are becoming more power-hungry, with rumors of the RTX 4080 drawing way over 400W and AMD having considered multi-GPU designs for a while now. Maybe these high-wattage SFF PSUs are a sign that this is true.
 

GuilleAcoustic

Chief Procrastination Officer
SFFn Staff
LOSIAS
Jun 29, 2015
2,981
4,416
guilleacoustic.wordpress.com


I can't help but think that double the performance at the cost of double the power draw is not progress...
 

tinyitx

Shrink Ray Wielder
Jan 25, 2018
2,279
2,338
It seems like GPUs are becoming more power-hungry, with rumors of the RTX 4080 drawing way over 400W and AMD having considered multi-GPU designs for a while now. Maybe these high-wattage SFF PSUs are a sign that this is true.
I am afraid this is going in that direction. Another hint is Nvidia's 12-pin PCIe connector.
A thermal challenge down the road for all SFFers, for sure.
 

Skripka

Cat-Dog Perch Manager
May 18, 2020
461
567
It seems like GPUs are becoming more power-hungry, with rumors of the RTX 4080 drawing way over 400W and AMD having considered multi-GPU designs for a while now. Maybe these high-wattage SFF PSUs are a sign that this is true.

My thought there is that the next immediate generation of GPUs will be as much of a total loss as RDNA2 and Ampere were. Only people who win a lottery for the chance to buy them above MSRP will get them.

Going back to multi-GPU would be funny... given how abysmally it worked. I don't think anyone who ever ran SLI/CrossFire would say 'You know, I want to experience that again'. 🤣
 

Revenant

Christopher Moine - Senior Editor SFF.N
Original poster
Revenant Tech
SFFn Staff
Apr 21, 2017
1,733
2,806
My thought there is that the next immediate generation of GPUs will be as much of a total loss as RDNA2 and Ampere were. Only people who win a lottery for the chance to buy them above MSRP will get them.

Going back to multi-GPU would be funny... given how abysmally it worked. I don't think anyone who ever ran SLI/CrossFire would say 'You know, I want to experience that again'. 🤣

I ran SLI in the Voodoo2 era, and again in the GeForce era with 570s and 770s. My father has been running SLI 1080s since the 1080 came out.

I NEVER want to go back to that. Micro-stutter is a very real and very annoying thing.
 

Phuncz

Lord of the Boards
SFFn Staff
May 9, 2015
5,937
4,951
Oh, I wasn't referring to multiple GPU chips on a single card that are basically two GPUs in CrossFire, but to multiple GPU chiplets on one substrate, like AMD has done with its Zen-based CPUs.
At that stage they could choose between cutting costs (2x 24 working CUs from binned 32 CU chiplets to form 48 CUs) or increasing performance (2x 32 CU chiplets to form 64 CUs) with just one chiplet design. That would allow a whole slew of performance classes from a single design, just like their CPU chiplets.
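As a minimal sketch of that idea, here's the arithmetic of such a product stack using the CU counts from the post above; the chiplet counts and bins are hypothetical, not an actual AMD roadmap:

```python
# Sketch of the product stack described above: one 32 CU chiplet
# design, with dies binned down to 24 working CUs reused in cheaper
# SKUs. CU counts come from the post; this is a hypothetical
# illustration, not an actual AMD roadmap.

FULL_CU = 32      # fully-enabled chiplet
BINNED_CU = 24    # chiplet with defective CUs fused off

def stack(chiplets_per_package):
    """Enumerate CU totals reachable from one chiplet design."""
    skus = set()
    for n_full in range(chiplets_per_package + 1):
        n_binned = chiplets_per_package - n_full
        skus.add(n_full * FULL_CU + n_binned * BINNED_CU)
    return sorted(skus)

# Two chiplets per package gives 48, 56 or 64 CUs from a single die:
print(stack(2))   # -> [48, 56, 64]
```

Note the mixed full-plus-binned option also falls out of the same enumeration, giving a middle SKU between the two cases named above.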
 

GuilleAcoustic

Chief Procrastination Officer
SFFn Staff
LOSIAS
Jun 29, 2015
2,981
4,416
guilleacoustic.wordpress.com
Oh, I wasn't referring to multiple GPU chips on a single card that are basically two GPUs in CrossFire, but to multiple GPU chiplets on one substrate, like AMD has done with its Zen-based CPUs.
At that stage they could choose between cutting costs (2x 24 working CUs from binned 32 CU chiplets to form 48 CUs) or increasing performance (2x 32 CU chiplets to form 64 CUs) with just one chiplet design. That would allow a whole slew of performance classes from a single design, just like their CPU chiplets.

I'm pretty sure I've read somewhere that Intel is also heading that way... at least for HPG.
 

thelaughingman

SFF Guru
Jul 14, 2018
1,413
1,566
Can never have enough wattage, I suppose, with 12900K power draw and next-gen GPUs' expected TDPs, LOL. The PCIe Gen 5 cable is quite sweet actually, much better than 3x 8-pin cables IMO.
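For a sense of why one Gen 5 cable beats three 8-pins, here's a quick comparison using the per-connector ratings from the PCIe/ATX specs (150 W per 8-pin, up to 600 W for the 12VHPWR connector); actual limits depend on the PSU and card:

```python
# Quick comparison of cable power budgets per the PCIe/ATX specs:
# each 8-pin PCIe connector is rated for 150 W, while the PCIe 5.0
# 12VHPWR (12+4 pin) connector scales up to 600 W on one cable.
# (The slot itself adds up to 75 W either way.)

EIGHT_PIN_W = 150
TWELVE_VHPWR_W = 600   # top sideband-signalled rating
SLOT_W = 75

three_eight_pin = 3 * EIGHT_PIN_W + SLOT_W   # 525 W total
gen5_cable      = TWELVE_VHPWR_W + SLOT_W    # 675 W total

print(f"3x 8-pin + slot : {three_eight_pin} W")
print(f"12VHPWR + slot  : {gen5_cable} W")
```

One cable, fewer connectors to route, and a higher ceiling than three 8-pins combined.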
 

XNine

Cable-Tie Ninja
Jan 11, 2022
180
254
Also, will these have a bunch of double wires? Why, WHY can't we get 1-to-1 cables for pinouts? Double wires just make sleeving even more of a chore than it already is.