NVIDIA’s Flagship RTX 3090 GPU Will Have A TGP Of 350W, Complete Breakdown Leaked

NVIDIA’s next-generation RTX 3090 flagship GPU will have a significantly increased TGP of up to 350W. The information comes from Igor, who has been an incredibly reliable source of leaks in the past, and we have also independently confirmed the same. In other words, the TGP and its breakdown are not a rumor, although we are still not 100% sure of the nomenclature (NVIDIA may decide to call it something other than the RTX 3090).

NVIDIA’s Flagship GPU RTX 3090 Gets TGP Breakdown, GPU Core TDP Itself Is 230W

NVIDIA has usually played it safe with power consumption on its graphics cards for the sake of power efficiency, but gamers don’t usually care about that. I don’t know of any gamer who would rather have a power-efficient card instead of a more powerful one at the same price. This is why this particular leak is very interesting.

Before we go any further, however, let’s contextualize this leak: the RTX 2080 Ti had a maximum TGP of 260W on some custom variants. The RTX 3090 (or whatever NVIDIA decides to call it) will have a TGP of 350W and a TDP (GPU only) of 230W. This is a massive increase in power consumption – one that will likely be accompanied by some truly phenomenal performance gains.
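To put that jump in concrete terms, here is a quick back-of-the-envelope calculation (my own arithmetic using the two TGP figures above, not part of the leak itself):

```python
# Quick comparison of the two TGP figures cited above (watts).
rtx_2080_ti_tgp = 260  # maximum TGP on some custom RTX 2080 Ti variants
rtx_3090_tgp = 350     # leaked RTX 3090 total graphics power

increase_pct = (rtx_3090_tgp - rtx_2080_ti_tgp) / rtx_2080_ti_tgp * 100
print(f"TGP increase: {increase_pct:.1f}%")  # roughly a 35% jump
```

In other words, we are looking at roughly a 35% increase in total board power over the fastest custom Turing cards.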

This also explains the heatsink we saw in leaks earlier because you would need an absolutely mammoth cooling solution to tame this beast. It would appear that NVIDIA is planning to go out guns blazing with a chip that is designed to run extremely fast and output a ton of heat. Jensen is clearly missing his record quarters and it is showing. The complete breakdown of wattages (approximated to a few watts) is given below:

Estimated Power Consumption / Losses

Total Graphics Power (TGP)                                          350 W
24 GB GDDR6X memory (GA_0180_P075_120X140, 2.5 W per module)        -60 W
MOSFET, inductor, caps NVDD (GPU voltage)                           -26 W
MOSFET, inductor, caps FBVDDQ (framebuffer voltage)                  -6 W
MOSFET, inductor, caps PEXVDD (PCI Express voltage)                  -2 W
Other voltages, input section (AUX)                                  -4 W
Fans, other power                                                    -7 W
PCB losses                                                          -15 W
GPU power (approx.)                                                 230 W

The RTX 3090’s GA102 GPU core itself will draw roughly 230 watts of power, with the memory drawing 60 watts. The VRM circuitry (the NVDD, FBVDDQ and PEXVDD MOSFETs, inductors and caps) will consume a further 34 watts, with the fans coming in at 7 watts. PCB losses are expected to be in the range of 15 watts, and the input section will consume around 4 watts. All of this adds up to a total board power of 350 watts for the RTX 3090, and keep in mind this is the Founders Edition; custom boards are likely to go even higher.
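The arithmetic of the leaked breakdown checks out: subtract every loss in the table from the 350 W TGP and you land exactly on the 230 W GPU-core figure. A minimal sketch of that sanity check (values taken straight from the table above):

```python
# Leaked RTX 3090 power breakdown (watts). The GPU-core figure is what
# remains of the 350 W TGP after subtracting everything else on the board.
TGP = 350

losses = {
    "24 GB GDDR6X memory (2.5 W per module)": 60,
    "VRM, NVDD (GPU voltage)": 26,
    "VRM, FBVDDQ (framebuffer voltage)": 6,
    "VRM, PEXVDD (PCI Express voltage)": 2,
    "Other voltages, input section (AUX)": 4,
    "Fans, other power": 7,
    "PCB losses": 15,
}

gpu_core_power = TGP - sum(losses.values())
print(f"GPU core power: {gpu_core_power} W")  # prints 230 W, matching the leak
```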

So what exactly is happening here? Well, what you are seeing is NVIDIA unleashing all of its fury in one single product. Shifting to the 7nm node will actually lower power consumption – all things held constant – but clearly NVIDIA wants to trade away the power efficiency gains from 7nm and instead roll out an absolute beast of a GPU. We have already heard rumors that they have up-specced the RTX 3080 (and pretty much all their lineup) so users will be looking at double-digit performance gains (think 40-50%) going from the RTX 2000 series to the 3000.

I also have a very strong suspicion [opinion] that NVIDIA will improve performance per dollar across the board as well. The RTX 2000 series received a relatively muted reception compared to the GTX 1000 series, and I think NVIDIA will want to capitalize on the opportunity of the new process by increasing its Total Available Market (TAM).

Since Computex is canceled, NVIDIA will likely be going with a virtual event again, although we expect the timeline to be unaffected (still around September). With the next-gen consoles rocking 10+ TFLOPs of power, NVIDIA will also be feeling a lot of pressure to not only offer higher performance but to do so in a comparable price bracket. All of this is very good for the consumer. That said, I am still fairly certain that the NVIDIA premium will remain higher than that of Radeon counterparts – just not as high as with the RTX Turing series [/opinion].

Which GPU in the NVIDIA Ampere lineup are you most excited for?
