How To PROPERLY Power High End GPUs (With 12vhpwr Or 3x PCIe Connectors)
In a previous video I discussed how it’s usually not advisable to power your low or mid range GPU with a single PCIe cable, due to inherent limitations of the cable – especially if it’s a budget (low gauge) one. But what about high end graphics cards, like NVIDIA’s RTX 4080 or RTX 4090, or AMD cards like the RX 7900 XTX or RX 6900 XT?
And what should you do if your PSU doesn’t have enough CPU/PCIe ports free (maybe it only has two – one for the CPU, one for the GPU)? I answer these questions and more in this video.
If you prefer text over video, please read on for the guide/transcript version of this video.
Video Transcript And Guide
Hey everyone, in a previous video I discussed why you should (usually) not power a mid range GPU with a single “devilspawn” PCIe cable – or a pigtail cable to use its proper name. And essentially that boils down to the fact that a single PCIe cable can sometimes only deliver 150 watts of power, which isn’t sufficient for many mid range GPUs like my own AMD RX 6700 XT or the NVIDIA RTX 3070… well, the versions of it that still accept 8 pin connectors, I guess. Many of NVIDIA’s recent cards actually use a DIFFERENT cable altogether which is a topic I will discuss a LOT in this video.
But I ALSO wanted to discuss how you should ideally power higher end cards that require 3 PCIe connectors, like the Asus TUF version of the RX 7900 XTX. Do you really need an 850 watt power supply and THREE SEPARATE PCIe cables like they recommend?
And whether you have a higher end NVIDIA or AMD card, what should you do if your PSU simply doesn’t support all these new power requirements? Well I’ll answer all these questions in this video, and point out where you CAN ‘cut corners’ – and also where you definitely should NOT.
ATX 2.0 PCIe Power Limits
So let’s start at the beginning… because starting at the end would be confusing I guess. THIS is a “standard” PCIe cable, one that started to get popular just after the millennium, when the ATX 2.0 power standards got formalized. It can have 6 pins or 8 pins at its end, depending on what the actual PCIe device needs. Some GPUs just need 8 pin connectors, some just need 6 pins, others need a mix – like my RX 6700 XT that needs one “8 pin” connector and one “6 pin” connector.
When these cables first started coming out, which was more than two decades ago, a 6 pin PCIe cable would typically only provide 75 watts of power – especially if it was a cheaper gauge cable (known as a 20 to 22 AWG cable). And an 8 pin variant would provide up to 150 watts of power. ALSO the ATX standards specify that a motherboard PCIe slot should supply 75 watts of power to plugged in cards. All this means that if I was looking to power a lower range GPU that needed a single 6 pin PCIe cable, and I had a cheap PSU with cheap cables, this might only be supplying 150 watts of power in total – 75 watts from the motherboard and 75 watts from the cable. This would naturally rise to 225 watts of power for an 8 pin card.
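If it helps to see that arithmetic written out, here’s a minimal sketch that just adds up the worst-case ATX 2.0 numbers from above (the connector names and the baseline_budget helper are purely my own illustration, not anything from a spec):

```python
# Rough sketch of the baseline (worst-case) ATX 2.0 power budget described above.
# The 75 W / 150 W figures are the old spec minimums, not what quality cables can deliver.

CONNECTOR_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def baseline_budget(pcie_connectors):
    """Total watts available from the motherboard slot plus the listed PCIe connectors."""
    return CONNECTOR_WATTS["slot"] + sum(CONNECTOR_WATTS[c] for c in pcie_connectors)

print(baseline_budget(["6-pin"]))           # 150 W  (slot + one 6 pin)
print(baseline_budget(["8-pin"]))           # 225 W  (slot + one 8 pin)
print(baseline_budget(["8-pin", "6-pin"]))  # 300 W  (slot + one 8 pin + one 6 pin)
```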
Hopefully that makes sense, so now I’m going to make one final point about low and mid range cards, because it’s important to understand how we can properly power HIGHER end cards. So what happens if a GPU needs multiple PCIe connectors, for example in the case of my own GPU that has a TDP of 230 watts? Well since it requires a 6 and 8 pin connector, couldn’t I just use the devilspawn connector that Corsair gave me? I could plug one 8 pin from this end, and the 6 pin connector from this end, right? Pfft. Now here’s where things get confusing. If I had a super budget PSU then I probably would NOT be able to reliably power my RX 6700 XT card using the single devilspawn cable, because it might max out at 225 watts of power delivery – below my card’s rated maximum. This would result in crashes and freezes, or just general poor performance, especially in very demanding games that push my card’s power use to the limits.
However the reason I said that this is confusing is because a QUALITY power supply unit will often supply BETTER cables than the bare minimum that the two decade old ATX specifications call for. If I have 18 AWG or better cables, then these can often supply up to 288 watts of power per cable – which isn’t part of the ATX specifications, but it’s a safe calculation based on the theoretical limits for these higher quality cables. And as luck would have it, many PSU makers will list the cable gauges in the manual or online – and THESE PCIe cables from Corsair ARE actually 18 AWG – so a single one of these could probably supply around 288 watts to my GPU, which is more than enough to cover its TDP.
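For anyone curious where that ~288 watt figure comes from, one common way to arrive at it looks like the sketch below. I’m assuming three 12 volt circuits per 8 pin connector and roughly 8 amps per circuit for 18 AWG wire with decent terminals – those per-circuit numbers are my working assumptions, not something written into the ATX spec:

```python
# Back-of-envelope estimate of what a quality 18 AWG 8 pin PCIe cable can carry.
# Assumptions: three 12 V circuits per 8 pin connector, ~8 A per circuit for 18 AWG wire.

VOLTS = 12
CIRCUITS_PER_8PIN = 3
AMPS_PER_CIRCUIT_18AWG = 8

cable_watts = VOLTS * CIRCUITS_PER_8PIN * AMPS_PER_CIRCUIT_18AWG
print(cable_watts)  # 288 W per cable

# Compare against a 230 W card like the RX 6700 XT mentioned above:
card_tdp = 230
print(cable_watts >= card_tdp)  # True - one quality cable covers the card's TDP on its own
```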
High End AMD And NVIDIA ‘2000’ GPUs
Phew, class is over. The theory side of things is done now, so we can specifically discuss how to power higher end graphics cards from AMD and NVIDIA – like the RX 6900 XT or 7900 XTX that has three PCIe connectors, or many of NVIDIA’s 3000 and 4000 series cards that require new cables entirely, like NVIDIA’s proprietary 12-pin cable or the 16-pin cables that were introduced by Intel in the ATX 3.0 and 3.1 standards.
So let’s tackle older NVIDIA cards and all AMD cards first, so that I can finally stop holding these like a madman. When a card “asks” for 3 separate PCIe connectors, it’s very unlikely that you will actually NEED to run three separate PCIe cables from your PSU. That’s because the TDP of a card like the 6900XT is 300 watts, and even the most budget PCIe cables should be able to supply 150 watts each – meaning that two separate cables will cover these power requirements, not to mention the motherboard PCIe slot that adds a further 75 watts of power supply. Even the 7900XTX only goes up to 355 watts. So in the case of AMD, you should be fine just to run two separate PCIe cables from your PSU, and for the third connector, you can just use the horrible pigtail end to “tick the box”. An electrical engineer might not agree with my wording there, but that’s essentially it.
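To make that arithmetic concrete, here’s a quick sanity check using the conservative 150 watt per cable figure plus the 75 watt motherboard slot (the covered helper and these exact numbers are just my illustration of the reasoning above, not an official calculation):

```python
# Sanity check for the AMD examples above, using worst-case per-cable figures.

SLOT_WATTS = 75    # motherboard PCIe slot
CABLE_WATTS = 150  # conservative 8 pin figure per separate cable

def covered(card_tdp, separate_cables):
    """True if the slot plus the given number of separate cables covers the card's TDP."""
    return SLOT_WATTS + separate_cables * CABLE_WATTS >= card_tdp

print(covered(300, 2))  # True - RX 6900 XT (300 W) on two separate cables
print(covered(355, 2))  # True - RX 7900 XTX (355 W) on two separate cables
```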
In other words, unless AMD bring out a much more power hungry GPU or you try and overclock your 7900XTX card, you’ll be fine with just two separate cables from your PSU. This does have two downsides of course – not every PSU supports this, and also pigtail connectors look UGLY. But I’ll cover those points later in the video, now it’s NVIDIA’s turn.
12vhpwr & 12v-2×6 For NVIDIA Cards
Because graphics cards have been getting so power hungry, and ‘traditional’ PCIe cables are a bit confusing, NVIDIA started rolling out a 12-pin cable for some of their 3000 series of GPUs. This new cable was designed to deliver at least 400 watts of power which is pretty impressive to be honest, and this cable had two ‘standard’ PCIe ends that would plug into the power supply unit. In other words, your PSU would need at least TWO PCIe slots free (plus at least one for the CPU, if your PSU has combined CPU and PCIe slots like my Corsair PSUs). If you have a budget PSU with only one free PCIe slot then you’re probably out of luck but I’ll discuss this point later.
You would then connect the single 12-pin end into your NVIDIA card and because the cable adaptor was included in your GPU box, this was a fairly straightforward process in most cases.
Enter the 12vhpwr connector. (shudders) This 16-pin cable is part of the ATX 3.0 standards from Intel, and it’s designed to supply up to 600 watts… but it has a fundamental flaw. It melts. Well not in every case, but this DID have a lot of bad press because many people DID actually discover that the cable connectors were melting, ruining the cable in the best case and people’s cards in the worst case. But for NVIDIA’s high end 4000 series cards, you have no choice but to use a 16-pin cable.
The way these work is pretty much the same as the 12-pin cable – you have two PCIe ends that plug into your power supply unit, and then you insert the full 16-pin end into your graphics card. BUT make sure that you insert it all the way in, with NO GAPS AT ALL. That’s part of the reason why the 12vhpwr cables were melting in the first place… well, that and also some teething issues with third party cables from CableMod and others. But if you DO have an ATX 3.0 PSU like I have in my Homelab NAS, OR you purchased an official cable from your PSU maker (like buying a Corsair 12 volt cable directly from Corsair, not a third party) then you SHOULD be safe to power your high end NVIDIA card with this.
But if you still have concerns, that’s where the ATX 3.1 standards came in, because they tweaked the design of this connector a bit to have shorter sensing pins and longer conductors, bringing about the 12V-2×6 connector… which isn’t a very snappy name, but “12vhpwr” wasn’t either really. What’s worth noting, though, is that this new design doesn’t require you to go out and purchase new cables. The ‘older’ 12vhpwr cables are still fine – the changes from ATX 3.1 are to the GPU connectors, not the cable design itself. So as long as you have one of THESE cables from an official source (in other words, not a random no-name supplier from AliExpress) then it will work fine – as long as it’s properly inserted into the GPU slot.
What If My PSU Doesn’t Support This?!
Hopefully that makes things a bit clearer – in many cases, using two separate PCIe cables – or the newer 12vhpwr cable – to power your high end GPU should be fine. BUT what should you do if your PSU doesn’t support these? In other words, if you have a budget PSU that only has two PCIe/CPU slots, how can you power your RTX 4090?
Well the answer is: you can’t. If you DO have a card that uses 375 watts of power (for example) then I personally would NEVER try to come up with a solution that works with a single cable. It’s very unlikely that you’ll properly be powering your graphics card. The only exception here is that if your card has a 300W TDP and you have a 16 or 18 AWG PCIe cable, then this should be fine on a single cable, because the cable SHOULD be able to supply 288 watts of power – and the motherboard PCIe slot should “top this up” to meet the card’s power requirements. But there’s a lot of “shoulds” in there. I personally like to err on the side of caution and run multiple cables for any GPU with a TDP over 225 watts, but hopefully through watching this video you can make an informed decision about what’s best for you.
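If you want that rule of thumb in one place, here’s a rough decision sketch. The 225 watt and 300 watt thresholds come straight from the numbers above; the function itself is purely illustrative, not an official guideline:

```python
# Rough decision helper based on the rule of thumb above: a single cable only for
# modest TDPs, or up to ~300 W if the cable gauge is known to be heavy (16/18 AWG).
# Thresholds are taken from the discussion above; this is not an official guideline.

def cable_recommendation(card_tdp_watts, cable_awg):
    heavy_gauge = cable_awg <= 18  # 16 or 18 AWG (lower AWG number = thicker wire)
    single_cable_ok = card_tdp_watts <= 225 or (heavy_gauge and card_tdp_watts <= 300)
    return "single cable is probably fine" if single_cable_ok else "run separate cables"

print(cable_recommendation(230, 18))  # single cable is probably fine (e.g. RX 6700 XT, 18 AWG)
print(cable_recommendation(375, 18))  # run separate cables (the 375 W example above)
```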
How To Make PCIe Cables Look Better
One downside of running multiple cables is that it DOES look a bit rubbish with the standard cables that come with your PSU… especially if you have 8 pin cables because the pigtail connectors hang around your case and look terrible.
There are two main solutions here. You could try zip tying the pigtail end to the rest of the cable, and this will at least hide it a little bit. But you still have the issue of these cables being quite big and bulky. What I PREFER to do is buy individually sleeved cables – and you can get these from reputable companies like CableMod, but if your PSU maker sells them direct then that’s even better. These cables look a LOT nicer, and you can often opt for “single end” cables – in other words, ones that don’t have the horrible devilspawn Y splitter end. This means that you can just plug these in and your case will look a lot better – and you can often buy these for all types of PCIe cables.
There are loads of different PSU cables and if you wanted to learn more about what each one does then I have a full video guide you can check out HERE.
Tristan has been interested in computer hardware and software since he was 10 years old. He has built loads of computers over the years, along with installing, modifying and writing software (he’s a backend software developer ‘by trade’). Tristan also has an academic background in technology (in Math and Computer Science), so he enjoys drilling into the deeper aspects of technology. Tristan is also an avid PC gamer, with FFX and Rocket League being his favorite games.