The Six Ways Of Powering Graphics Cards

Powering our GPUs (graphics cards) used to be EASY. If it was a low or mid-range card, you could just stick it into the motherboard’s PCIe slot and it would be powered solely by the (up to) 75 watts that the slot supplies.

When cards got a bit more powerful, you could then run an extra PCIe cable to them – giving an extra 150 watts (or so) of power.

But GPUs have now got a LOT more powerful – and power hungry. This has led to a range of different PCIe cable options, including pigtails, running multiple cables, and various cables that are (right now) mainly an ‘NVIDIA thing’ – although admittedly some of these cables aren’t solely from NVIDIA.

I discuss this and more in this video.

If you prefer text over video, please read on for the guide/transcript version of this video.

Video Transcript And Guide

Hey everyone, I was recently thinking about building PCs, as you do, and I realised that powering our graphics cards has never been more complicated. That’s because there are now 6 different ways of powering them, although admittedly some methods only apply to certain GPUs – so there are “only” 3 ways to power NVIDIA’s most powerful cards, and 2 or 3 ways of powering AMD cards – and older NVIDIA cards. It’s all a bit confusing though, so I wanted to shoot this video to unpick this completely confusing cable conundrum.

The Easy Method

So if I back up a bit and give a bit more background, there WAS a time when many graphics cards could be powered solely through the motherboard’s PCIe slot. That’s because this slot supplies up to 75 watts of power to the GPU, unless you had one of the dodgy RX 480 models that drew more than this and then fried people’s motherboards – yikes.

I actually had an RX 480 card but thankfully never had that issue, BUT in general, motherboards can supply UP TO 75 watts to the card. That means that if you had a card like the venerable GTX 1050 Ti, or a card like this actually, you wouldn’t need any extra power because this card’s TDP rating – essentially the maximum power that it can draw – is 75 watts or below.
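To make that concrete, here’s a minimal sketch (in Python) of the slot-only check, using the figures above – the 160 watt example card is just a made-up illustration:

```python
# Maximum power the motherboard's PCIe x16 slot can supply (watts).
PCIE_SLOT_WATTS = 75

def can_run_from_slot_alone(tdp_watts: int) -> bool:
    """A card can skip extra power cables only if its TDP fits in the slot budget."""
    return tdp_watts <= PCIE_SLOT_WATTS

print(can_run_from_slot_alone(75))   # GTX 1050 Ti (75 W TDP) -> True
print(can_run_from_slot_alone(160))  # a hypothetical 160 W card -> False
```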

You actually CAN still buy cards like this even today, with the AMD RX 6400 springing to mind, but in general things have moved on from this because progress waits for no PC gamer (that’s totally the quote) and so we all want MORE. More pixels, more textures, more frames, more shadows, and so graphics cards have got increasingly powerful.

Enter the PCIe power cable which you would plug directly into the GPU from the power supply unit, to deliver extra power to it. I mean, this HAS been around for almost two decades now, but my point is that more and more GPUs started requiring these because 75 watts just wasn’t enough anymore. This brought about an interesting shift in GPUs too, because around a decade ago, low end and mid range cards (like the GTX 1050 Ti) would just be powered by the motherboard slot, and only higher end cards needed this separate PCIe cable. But nowadays, nearly all GPUs – even BUDGET ones – need one or even two of these cables. And higher end GPUs might even need THREE of these, or even a special new cable that delivers so much power that it MELTS and breaks everything. Okay that’s a slight exaggeration, but I’ll discuss that more later on. My point is that there was a bit of a paradigm shift in how often these separate cables were required, due to our increasing expectations from PC games – and so our GPUs.
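As a rough reference before we go through each method, here’s how the numbers in this guide stack up per power source – a sketch using spec-level figures (real-world limits vary by PSU and cable quality):

```python
# Typical maximum power per source, in watts (spec-level figures;
# actual delivery varies by PSU and cable quality).
POWER_SOURCES_WATTS = {
    "PCIe x16 motherboard slot": 75,
    "6-pin PCIe cable": 75,
    "8-pin PCIe cable": 150,
    "12VHPWR / 12V-2x6 (16-pin) cable": 600,
}

# Example: a 300 W TDP card fed by the slot plus two separate 8-pin cables.
available = 75 + 150 + 150
print(available >= 300)  # True - enough headroom on paper
```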

Devilspawn

So that brings us onto powering our graphics cards with a PCIe cable, which plugs directly into the PSU. Some of these cables have just 6-pin ends, while some have 8 pins on the end to provide a little extra power to the graphics card. Or what some PSU makers (like Corsair) do is give you 6+2 ends that allow you to insert either just the 6 pins, or the full 8 pins – depending on what your graphics card needs. However this cable can get a little complicated because sometimes it just has a single end on it, while other times it has another end hanging off it – most people call this a Y splitter end, or a pigtail connector. I call it devilspawn, because it lulls PC builders into a false sense of security. For example if you have an AMD RX 6800 XT GPU (which requires two PCIe connectors to be plugged into it), you MIGHT think that you can use the devilspawn connector and plug both ends of the SAME cable into your GPU. But you’d be wrong.

Basically a SINGLE PCIe cable like this can typically deliver around 150 watts in total – whether you use the pigtail connector or not. Actually that’s not 100% accurate (as I discuss in another video) but it’s true enough for the purposes of this video. So if you were to try and power your RX 6800 XT card using a single cable (and the two pigtail ends), you might only supply 225 watts of power to it – 75 watts from the motherboard PCIe slot, and 150 watts from the actual cable. But because the card actually requires 300 watts of power, you would be underpowering it – and so it might crash or freeze up when playing PC games. You might be wondering WHY this horrible devilspawn cable exists, then? Well it’s to power multiple, LOWER POWER PCIe devices. If you had two devices that needed 125 watts each, THIS would be perfect because the PCIe slot delivers 75 watts – then these cables would ‘top up’ with another 50 watts (to each device).
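Here’s the devilspawn math as a quick sketch, using the simplified 150 watt per-cable figure from above:

```python
PCIE_SLOT_WATTS = 75
WATTS_PER_CABLE = 150  # simplified per-cable limit - pigtail or not, it's ONE cable

def pigtail_is_enough(gpu_tdp: int) -> bool:
    """One cable feeding both GPU sockets via the pigtail still only counts once."""
    return PCIE_SLOT_WATTS + WATTS_PER_CABLE >= gpu_tdp

print(pigtail_is_enough(300))  # RX 6800 XT: 225 W available < 300 W TDP -> False
```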

Getting Trickier

But for modern GPUs, this cable often isn’t sufficient, which brings us onto the THIRD way to power our graphics cards: using multiple PCIe cables. This is particularly important if your card’s TDP is over 225 watts, so that you don’t underpower it. In this case you should make sure that you plug separate PCIe cables into your PSU, and then run these to your graphics card. This will then ensure a much higher (and more stable) power delivery to your card, because you can then deliver at least 375 watts with two cables – and at least 525 watts with three cables, if you did have a really high end GPU. This is what I do with my own RX 6700 XT GPU because while the TDP is “borderline” at 230 watts, I didn’t want to risk potentially underpowering it if I was maxing out my card playing certain games. I just preferred to run multiple, separate cables to it.
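Following the same arithmetic, here’s a sketch that works out how many separate cables a card needs:

```python
import math

PCIE_SLOT_WATTS = 75
WATTS_PER_CABLE = 150  # simplified figure used throughout this guide

def separate_cables_needed(gpu_tdp: int) -> int:
    """Minimum number of individual PCIe cables to cover the card's TDP,
    after counting the motherboard slot's contribution."""
    shortfall = max(0, gpu_tdp - PCIE_SLOT_WATTS)
    return math.ceil(shortfall / WATTS_PER_CABLE)

print(separate_cables_needed(230))  # RX 6700 XT -> 2 (borderline, as noted above)
print(separate_cables_needed(450))  # a really high end card -> 3
```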

There are two downsides here though: not every PSU supports multiple PCIe cables, and also running multiple pigtail cables looks UGLY because you have all these horrible Y splitter ends pointlessly hanging around your case.

As a result, some PSU makers will supply “single connector” cables in the PSU box so you can then naturally use these for your GPU and they look neater than running multiple devilspawn cables to your GPU. However if you weren’t supplied with these, you can sometimes buy extra cables directly from your PSU maker (or you can buy them from a reputable company like CableMod – just be sure to avoid buying third party PSU cables from a random no-name brand on Amazon). I actually purchased individually sleeved cables from Corsair which don’t have the extra pigtail end, so they look really nice inside your case.

And that brings me back to the other issue that I mentioned: if your PSU only supports 1 PCIe cable. If your GPU requires two PCIe cables to be plugged in, what should you do? Well there are sadly only two options here: you either need to change your PSU entirely, OR you can explore UNDERVOLTING your card. This is where you use software (such as the tool bundled with your card) to lower the voltage fed to the GPU, often with little or no loss of speed. This can sometimes save a LOT of power, meaning that you can then power it with a single PCIe cable (via the pigtail connector). It’s probably worth trying this before buying a brand new PSU, to be honest – just be sure to watch out for any crashes or freezes when gaming, though.
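If you want a rough feel for what undervolting can buy you: dynamic power scales roughly with the square of voltage, so a first-order estimate looks like the sketch below. The voltages here are made-up illustrations, not values from any real card, and real GPUs also have static power draw and clock behaviour that this ignores:

```python
def estimated_power_after_undervolt(tdp: float, stock_v: float, new_v: float) -> float:
    """Very rough first-order estimate: dynamic power scales ~ with voltage squared.
    Treat this as a ballpark only - real cards behave more messily."""
    return tdp * (new_v / stock_v) ** 2

# Hypothetical example: a 260 W card undervolted from 1.10 V to 1.00 V.
print(round(estimated_power_after_undervolt(260, 1.10, 1.00)))  # ~215 W
# ...which would now fit within the 225 W (slot + single cable) budget.
```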

NVIDIA Go Rogue

Right, so that covers off THREE different ways of powering your GPUs so far – and what I’ve said applies to ALL AMD graphics cards, and many NVIDIA cards – up until the last few years. NVIDIA had some concerns about how GPUs were previously powered, so starting with the 30 series of cards, they introduced a new 12-pin cable. Not all 30-series GPUs required this (just to be extra confusing!), because it was dependent on the board partners too, but SOME of the 30 series required this – in which case that cable (or an adaptor) was often included in the graphics card box.

It Burns!

However this proprietary 12 pin cable was kinda NVIDIA going rogue: it wasn’t part of any standard. So in 2022 a new cable design got bundled into the ATX 3.0 and PCIe 5.0 specs – ready for NVIDIA’s upcoming 40-series launch. Yes I’m talking about the new, special “melting cable”. I mean, that’s not the official name – it’s actually known as the 12VHPWR cable, standing for the 12 volt high power cable. This 16 pin cable is a BEAST, since it can supply up to 600 watts of power. That’s more than many gaming laptops use IN TOTAL, for example. HOWEVER when this first launched, there were a FLOOD of stories because some of the cables ended up melting or burning up. At first NVIDIA claimed that these connector ends were melting due to user error – saying that people weren’t pushing them all the way in. But this claim was kinda disproven, and the melting was more down to design issues with the cable and connector.
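To put that 600 watt figure into perspective, here’s a quick sketch (reusing the simplified 150 watt per-cable figure from earlier) of how many classic 8-pin cables it would take to match one 16-pin cable:

```python
import math

WATTS_8PIN = 150   # simplified per-cable figure used throughout this guide
WATTS_16PIN = 600  # maximum the 12VHPWR / 12V-2x6 connector can supply

# How many 8-pin cables would it take to match one 16-pin cable?
print(math.ceil(WATTS_16PIN / WATTS_8PIN))  # 4
```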

ATX 3.1 Saves The Day?

As a result, ATX 3.1 was rolled out that redesigned this cable a little bit – and this introduced the new 12V-2×6 connector. It’s becoming a bit of a mouthful now, BUT this new cable has slightly shorter sense pins – and longer conductor pins inside. This means that if the connector isn’t fully seated, the sense pins break contact first, so the card shouldn’t try to pull full power through a loose connection – which helps avoid the potential melting issues for your GPU and things like that. We hope. It does seem to be safer in general, but there aren’t that many ATX 3.1 PSUs out there yet – for example I recently purchased a new Corsair PSU and it’s still stuck at ATX 3.0 and comes with the 12VHPWR cable, not the newer 12V-2×6 one.
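To show why those sense pins matter, here’s a sketch of how the 16-pin connector advertises its power limit to the graphics card. The tiers are from my reading of the ATX 3.x / PCIe 5.0 connector spec – the exact pin-to-tier mapping is worth verifying against the spec itself before relying on it:

```python
# (SENSE0 connected, SENSE1 connected) -> advertised sustained power limit (watts).
# Mapping based on my reading of the ATX 3.x / PCIe 5.0 spec - verify before use.
SENSE_PIN_TIERS = {
    (True, True): 600,
    (True, False): 450,
    (False, True): 300,
    (False, False): 150,
}

def allowed_power(sense0: bool, sense1: bool) -> int:
    """With the 12V-2x6's shorter sense pins, a half-seated connector reads as
    'not connected' and the card falls back to the lowest power tier."""
    return SENSE_PIN_TIERS[(sense0, sense1)]

print(allowed_power(True, True))    # fully seated, full-power cable -> 600
print(allowed_power(False, False))  # loose connector -> limited to 150
```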

Hopefully ATX 3.1 PSUs become much more widespread soon though – having a super expensive NVIDIA card melt away your hopes, dreams and cash is never fun, of course.

So it’s kinda interesting really, just how much powering our GPUs has changed in recent years. Even though the 16 pin cable is part of a standard, really NVIDIA have been responsible for turning a simple process into something a bit confusing – and scary. That’s my view, but what do you think? I’d love to hear your thoughts either way down in the comments. I also go into more detail on how to power “standard” 6 or 8 pin GPUs in another video if you wanted to check that out.
