
NVIDIA GeForce Ampere Architecture, Board Design, Gaming Tech & Software

Dear W1zzard, could you also test with smaller PSUs than what is recommended? I currently have a Corsair SF450 Platinum and am afraid it will explode with a 3080; the 3070 is questionable too. Or is undervolting possible?
And could you also list the size? I can't find the thickness on NVIDIA's website, it just says 2-slot card. And a good old 3DMark11 score would be awesome too.
Given NVIDIA's 320 W TDP, I seriously doubt a 450 W PSU will be able to handle it unless you have a low-power CPU. Your PSU will not explode; it will simply shut down when over-current or over-power protection (OCP/OPP) is triggered.

Dimensions are always written and displayed in my reviews. I also have a photo for thickness, but no measurement, which is kinda redundant: the card either needs two slots or three.

No plans for 3DMark11 or other such synthetic tests
 
I'd rather see an RTX 3080 with 320-bit, 20 GB VRAM, i.e. an RTX 3080 Ti-type product that sits between the RTX 3080 (320-bit, 10 GB) and the RTX 3090 (384-bit, 24 GB).
 
With the 20 series, NVIDIA stated that on average there are 35 integer operations for every 100 FP32 operations per clock. So we are looking at roughly a 75/25 ratio for the shaders, or ~50% better IPC, where the 8704 CUDA cores in reality perform like ~6400 FP32 + ~2304 integer units.
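A quick back-of-the-envelope sketch of that split, using the 100:35 FP32-to-INT mix quoted above (the rounding is mine):

```python
# Split the RTX 3080's 8704 CUDA cores by NVIDIA's stated average
# workload mix: ~35 integer ops for every 100 FP32 ops per clock.
fp32_weight, int_weight = 100, 35
total = fp32_weight + int_weight          # 135 "ops" per mix unit

fp32_share = fp32_weight / total          # ~74%
int_share = int_weight / total            # ~26%

cuda_cores = 8704                         # RTX 3080 shader count
effective_fp32 = round(cuda_cores * fp32_share)
effective_int = round(cuda_cores * int_share)

print(f"shares: {fp32_share:.0%} FP32 / {int_share:.0%} INT")
print(f"effective units: {effective_fp32} FP32 + {effective_int} INT")
# → shares: 74% FP32 / 26% INT
# → effective units: 6447 FP32 + 2257 INT
```

This lands close to the 6400 + 2304 figure above; the small difference is just rounding.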

The SF450 is able to provide 450 watts, so a 320 W GPU plus a 95 W CPU load is perfectly fine, at around 90% efficiency. And you can power-limit the GPU to 80% while keeping stock clock speeds at the same time, with the PSU intake fan at the bottom of the case for cool air.
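As a rough sketch of that budget (the 30 W rest-of-system figure and the 90% efficiency point are assumptions, not measurements; note that the 450 W rating is DC output, so efficiency only changes what a wall meter reads, not the load on the PSU):

```python
# Rough DC power budget for an RTX 3080 on a 450 W PSU.
# rest_of_system and efficiency are assumed values, not measurements.
PSU_RATING = 450       # W, DC output rating of the SF450
gpu_power = 320        # W, RTX 3080 board power
cpu_power = 95         # W, CPU under gaming load
rest_of_system = 30    # W, fans, drives, motherboard (assumed)
efficiency = 0.90      # PSU efficiency at this load (assumed)

dc_load = gpu_power + cpu_power + rest_of_system
wall_draw = dc_load / efficiency           # what a wall meter would show
print(f"DC load {dc_load} W, headroom {PSU_RATING - dc_load} W, "
      f"wall draw ~{wall_draw:.0f} W")
# → DC load 445 W, headroom 5 W, wall draw ~494 W

# With the GPU power-limited to 80% as suggested above:
dc_load_limited = round(gpu_power * 0.8) + cpu_power + rest_of_system
print(f"power-limited DC load {dc_load_limited} W, "
      f"headroom {PSU_RATING - dc_load_limited} W")
# → power-limited DC load 381 W, headroom 69 W
```

At stock the headroom is razor-thin under these assumptions; the 80% power limit buys back meaningful margin.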
 

So since the PSU can't explode, I will carefully try the 3080 with my Corsair SF450, with a power meter measuring the draw at the wall. If I read less than 450 W, and I undervolt on top of that, I should be good to go...
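One caveat on reading a wall meter (a sketch, assuming 90% efficiency, which varies with load): the meter shows AC input, while the PSU's 450 W rating is DC output, so the reading overstates the actual PSU load. The meter also averages over time, so it will miss millisecond-scale transient spikes.

```python
# Convert a wall-meter (AC) reading to the DC load the PSU delivers.
# The 0.90 efficiency is an assumption; check the unit's 80 Plus curve.
def dc_load_from_wall(wall_watts, efficiency=0.90):
    """Wall meters read AC input; PSU ratings like 450 W are DC output."""
    return wall_watts * efficiency

print(dc_load_from_wall(450))  # → 405.0, i.e. a 450 W wall reading
                               # is only ~405 W of actual DC load
```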
 
With no headroom, a transient spike or the inrush current will probably trip the PSU quite regularly.
The 3070 is a maybe; the 3080 would be a nightmare on that PSU, IMHO.
 
I will buy a 3090 and will be able to play on it for a long time. I don't buy a new card every year.
 

Yeah, I have doubts too. But this guy is running a 2080 Ti + Ryzen 3950X with a lower-quality 450 W PSU, which gives me hope.
My CPU will just be an overclocked 3300X.
 
I was talking about exactly this in my previous comment. People seem to be misled by the massive number of shaders.
 
According to Hardware Canucks, the NDA on the reference cards expires on Sept 14th, but the NDA on partner cards remains in place until the 17th.

Hardware Canucks
2 days ago
Hey guys. SO sorry about misspeaking. Got the review NDA dates (14th for FE, 17th for board partners) mixed up with the day you can buy it. Release date to retailers is September 17th.

FWIW - I plan on getting a partner 3080 around Xmas. Not bothered about the mythical 20 GB variant, as the 10 GB is already a massive upgrade from my current 1080.
 

They're simply not comparable to past-gen counts, that's really all. We've gotten lazy, with the small changes to shader counts and perf/shader being somewhat samey between generations, but it's not really new. These things change from time to time.

Whatever, bigger number is more faster, not hard :) In the end, only real-life perf matters, and the shader count is just a small part of that.
 
"The GeForce RTX 3080 is the first product to be launched on the new architecture, and for some reason NVIDIA is referring to it as 'the new flagship.'"

Since the original rumors of the 3080 / 3090 started to drop, I've been referring to the 3080 as the consumer / gaming flagship and the 3090 as the prosumer / hybrid card that the Titan was. I don't think the marketing department and the design development groups talk to each other at NVIDIA.

In the end, only real life perf matters

That's true with anything .... cores .... die size .... folks blow out a lotta breath about ....

a) new technology = better
b) more cores is better
c) more VRAM is better
d) more of [just about anything] is better

But, in the end ...

a) a tool is not necessarily better unless it actually performs better .... "this knife is better cause it cuts cans"
b) more cores are not better unless your apps / games actually use them
c) more VRAM is not better when the same card with half the VRAM performs identically at your resolution
d) more of [just about anything] is only better when it shows an actual performance difference in the real-world conditions in which it will be used.
 
I, for one, don't think the 3090 (like the 2080 Ti before it) is worth discussing. Feels kinda like discussing Ferraris and Lamborghinis on a Toyota forum.
 