GPU RX 480 Powergate: a non-concern for early adopters

ITXtended

Efficiency Noob
Original poster
Jul 2, 2016
itxtended.com
Here's a tidbit of info I've come to realize that absolutely needs to be shared. People are worried that this graphics card is going to fry their motherboards because of all the reports that sprang up after the NDA was lifted on the 29th. Well, I have news for you: the card only draws 16-50W at the PCIe slot while at idle and under light loads (i.e. web surfing), which is well within PCI-SIG specs. So the notion that this particular card is going to fry your motherboard upon installation and first boot is ridiculous!

Secondly, this quote comes directly from Tom's Hardware (the culprit of this whole powergate fiasco): "AMD's Radeon RX 480 draws 90W through the motherboard’s PCIe slot during our stress test. This is a full 20 percent above the limit. To be clear, your motherboard isn't going to catch fire."
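For what it's worth, Tom's "20 percent" figure is easy to check against the 75W total slot limit. A quick back-of-the-envelope in Python (the 75W and 90W figures are taken from the quote above):

```python
# Sanity check of Tom's Hardware's "20 percent above the limit" claim.
SLOT_LIMIT_W = 75   # PCI-SIG total power limit for a PCIe x16 slot
measured_w = 90     # Tom's stress-test reading from the slot

overdraw_pct = (measured_w - SLOT_LIMIT_W) / SLOT_LIMIT_W * 100
print(f"{overdraw_pct:.0f}% above the {SLOT_LIMIT_W} W slot limit")  # 20%
```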

I have a really big problem with this, because the game they tested was Metro Last Light at 4K resolution, and it's vague because they didn't specify the settings (to really stress the card to its max, I'm sure it was Ultra). Now people are saying the target audience for the RX 480 is very cheap and will throw this card in a $50 motherboard and burn their house down (lmao)! Here comes the kicker: this cheap class of PC gamers (referring to us) knows that this particular card is NOT a 4K-playable GPU! So why on earth would we buy a $200 GPU and a $50 motherboard to try and game in 4K??? SERIOUSLY???

I, for one, would have loved to see what the 480 pulled from the PCIe slot in any given game at 1080p on Ultra settings in Tom's findings; I guarantee it's not 90W! The vast majority of the people buying this card don't even own hardware capable of displaying 4K resolution (a 4K monitor)! I feel like they set out to kill this card well before it got into the hands of consumers because they were worried about how blown away we would all be by the FPS-per-dollar value in this "cheap segment"! Also, if you are like me and have lots of experience with system building, you won't even hesitate to get a 480; and if you are new to all this, you shouldn't worry about killing off hardware that you don't even have the capability to destroy for lack of high-end equipment (again, a 4K monitor).

The problem lies with gaming at 4K under heavy settings. Seeing that the 4K spectrum exists, AMD has to address it, and they've already stated that they are providing a fix via BIOS in the coming days. Anyone gaming at 1440p and lower is not going to experience the heavy watt pulls from the PCIe slot that would destroy a motherboard! MOST people aren't going to be stress testing their card at 4K; they are going to be casually gaming at 1080p, with the occasional 1440p if they have the monitor to support it! The problem doesn't lie within these confines, so those of you raising pitchforks need to STOP with the notion that this card is a constant power hog that will unleash the gates of Hades on your motherboard!

I continue to debunk the claims that this card is going to cause a motherboard failure, because it just isn't going to happen under normal circumstances. I've purchased three cards from three different manufacturers, and will be concluding my own testing in the coming days with various system designs and benchmarks! Every claim I have made has a basis in truth from research, testing, and YEARS of system building!

Before anyone comments saying that these remarks are coming from a fanboy, please be advised that I advocate for ALL GPUs by both Nvidia and AMD; it just so happens that AMD is under scrutiny at the moment. If you want a 1070 or 1080 FE, I'm all for it! You want a Maxwell or Grenada GPU because of the price reductions? I'm all for that as well; there is still lots of value in the previous generation! I say Go Red or Go Green, and we should all reap the benefits of competition for years to come!
 

confusis

John Morrison. Founder and Team Leader of SFF.N
SFF Network
SFF Workshop
SFFn Staff
Jun 19, 2015
sff.network
Well written! I expect this issue to be as easy for AMD to fix as Founders Edition fangate was for Nvidia. How they are going to do it, nfi, but software is a lot more in control than it used to be.
 

EdZ

Virtual Realist
May 11, 2015
I don't expect there to be a software fix for this issue. The core power delivery phases are physically connected to the PCIe slot and PEG connector: half to the PEG connector, and half to the PCIe slot. This is why the power draw from both rises in lock-step rather than PEG power draw rising while PCIe draw stays steady. It is a hardware-level issue; without a board revision there is no way to limit PCIe current other than limiting core power draw (a hard power cap, below the current stock power draw) or killing one or more of the phases drawing from the PCIe slot and hoping the card remains stable with fewer phases.
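A rough sketch of why the two draws rise together under that topology (the phase counts here are my illustrative assumption, not measured from the card):

```python
# With the core VRM phases split evenly between the PCIe slot and the
# 6-pin PEG connector, each source carries about half the core power,
# so the two draws rise in lock-step as load increases.
def source_draw(core_power_w, slot_phases=3, total_phases=6):
    """Split core power across the slot and PEG by phase count."""
    slot_w = core_power_w * slot_phases / total_phases
    return slot_w, core_power_w - slot_w

slot_w, peg_w = source_draw(160)  # e.g. ~160 W core power under load
print(slot_w, peg_w)  # 80.0 80.0 -- slot far above its 66 W 12V budget
```

Under this split, the only way to reduce slot draw is to reduce total core power or change the phase assignment, which is exactly the hardware-level bind described above.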
Rendering the situation even more bizarre, the PEG connector is also non-compliant: pin 5 is not wired as a Sense connection per the standard, but is directly connected to the same ground plane as pins 4 and 6. This was clearly done to increase the current availability at that connector, yet the PCB was not designed to take advantage of it by drawing the majority of power from the bodged PEG connector. I truly have no clue what on earth AMD were thinking with this PCB design.

I, for one, would have loved to see what the 480 pulled from the PCIe slot in any given game at 1080p on Ultra settings in Tom's findings; I guarantee it's not 90W!
PCPer also ran power tests. At stock clocks at 1080p, on multiple games, the 12V MB power alone was either pegging 70W, or fluctuating between 70W and 75W. Both cases are above the 66W 12V max specified by PCI-SIG (12V @ 5.5A). With the contribution of the 4W/5W from the 3.3V line, this pushes the card well over the total slot draw spec of 75W in every case.
Overclocking, of course, pushes that way up, over 100W from the slot in some situations.

As for danger to boards: for any 'gamer' board, even a budget one, the chance of actual damage to hardware is low. The chance of instability on a budget gamer board is somewhat higher: high current draw means voltage droop, and this can cause strange things to happen to components that are not designed to tolerate that abuse. I've heard reports of odd audio issues (dropouts and 'snaps') and general instability.

But that's not the real problem. The real problem is for OEMs: aftermarket motherboard manufacturers overbuild boards because they expect the end user to do strange things with them (overclocking, shoving in extra cards, etc.), and they'd rather the board not fail immediately. A board designed for a pre-built system (i.e. not a standard ATX board), however, hews much closer to the letter of the spec, because it can: the manufacturer knows that any card they put in will conform to spec, they can deny warranty for end-user modifications, and they save cost by avoiding overbuilding.
That means a card that does not conform to spec is not a great prospect for incorporation into existing pre-built machines, and future machines intended to include the RX 480 need a motherboard deliberately designed to tolerate overdraw, which means more expense (both in a new design and in increased component cost). Even if the component cost increase is marginal, the upfront cost of a redesign may cancel out the price/perf savings of the RX 480.
It's also bad news for someone who has bought a pre-built machine and wants to upgrade: they may need to be more wary of the RX 480 than they would be otherwise. Even if the PC does not use semi-custom power connectors (allowing an uprated PSU to be installed), there is little they can do to increase power delivery to the slot.
 