Tuesday, March 21st 2023

INNO3D RTX 4070 Box Picture Confirms 8-pin PCIe Power Connector

A picture of the INNO3D GeForce RTX 4070 has leaked online, confirming earlier rumors that at least some RTX 4070 graphics cards will not come with the new 16-pin 12VHPWR connector, but rather with the standard 8-pin PCIe one. This is in line with earlier reports that NVIDIA will probably offer two variants: a premium one and one sold closer to the MSRP.

Unfortunately, the box shot of the INNO3D RTX 4070 does not reveal much more information, other than that the card will have a dual-slot, dual-fan cooler with detachable fan blades, and a heatsink with a copper base for the GPU, an aluminium base for the memory, and nickel-plated heatpipes. The RTX 4070, based on the AD104 GPU and packing 12 GB of GDDR6X memory on a 192-bit memory interface, is scheduled to launch on April 13th, with the first reviews showing up on April 12th.
Sources: @9550pro Twitter, via Videocardz.com

24 Comments on INNO3D RTX 4070 Box Picture Confirms 8-pin PCIe Power Connector

#1
Vayra86
Oh my 4070 the value king, it's coming!
Posted on Reply
#2
N3utro
Wait, does this mean this card won't set itself on fire like the others? It's not a real 4xxx series card then.
Posted on Reply
#3
wNotyarD
Vayra86: Oh my 4070 the value king, it's coming!
Yours for only 699 dollars plus AIB markup, seller markup, taxes, shipping and sacrifice of three lambs.
Posted on Reply
#4
Vayra86
wNotyarD: Yours for only 699 dollars plus AIB markup, seller markup, taxes, shipping and sacrifice of three lambs.
The value is found in the fact Nvidia isn't including that free adapter because they don't have to :p
Posted on Reply
#5
N/A
And limited to 200W, 30% slower than 70 Ti.
Posted on Reply
#6
Hxx
They looove this crippled 192-bit memory bus, don't they? It's like an obsession to cripple their mid/higher-tier cards with this damn bus. 256-bit should be the standard on any card over $400-500, and 384-bit on any card $1K or higher. NVIDIA, just listen, ffs: this card has 12 GB on it, why would you cripple that damn bus width?
Posted on Reply
#7
TheDeeGee
N/A: And limited to 200W, 30% slower than 70 Ti.
My 4070 Ti only uses 155 Watt at 55% power limit at the cost of 5% performance.
Posted on Reply
#8
oxrufiioxo
TheDeeGee: My 4070 Ti only uses 155 Watt at 55% power limit at the cost of 5% performance.
Shame it's so meh even at full power...
Posted on Reply
#9
kiakk
TheDeeGee: My 4070 Ti only uses 155 Watt at 55% power limit at the cost of 5% performance.
1. Do you use a wall plug power meter or software? Undervolt or just a power limit?
2. What about the performance loss in a benchmark, like 3DMark Time Spy or Superposition? I am a bit sceptical (sorry, no offense) that a -45% power limit causes only a 5% 3D performance loss. Maybe your 3D applications only partially load the GPU? I am just curious about the clarification.
I did heavy undervolts on all my recent GPUs (RX 470, RX 580, GTX 1060, GTX 1070), and at 50-65% of the power limit the performance loss was usually more than 10%, more like ~12-13%.
But all in all, no doubt the 3D perf./watt efficiency increases greatly, avoiding the aggressive factory boosting that would happily fry the GPU chip with heavy voltage and frequency boosts.
Posted on Reply
#10
Chaitanya
Vayra86: Oh my 4070 the value king, it's coming!
$699 for a 60-series GPU rebadged as a 70-series, now that's progress.
Posted on Reply
#11
RegaeRevaeb
A two-slot design is at least nice, thank Dog. And Inno was also one of the only AIBs to offer that on a 4070 Ti.
Posted on Reply
#12
Lew Zealand
kiakk: 1. Do you use a wall plug power meter or software? Undervolt or just a power limit?
2. What about the performance loss in a benchmark, like 3DMark Time Spy or Superposition? I am a bit sceptical (sorry, no offense) that a -45% power limit causes only a 5% 3D performance loss. Maybe your 3D applications only partially load the GPU? I am just curious about the clarification.
I did heavy undervolts on all my recent GPUs (RX 470, RX 580, GTX 1060, GTX 1070), and at 50-65% of the power limit the performance loss was usually more than 10%, more like ~12-13%.
But all in all, no doubt the 3D perf./watt efficiency increases greatly, avoiding the aggressive factory boosting that would happily fry the GPU chip with heavy voltage and frequency boosts.
I did these tests a few days ago and my numbers are similar to yours: taking my 6800 XT down from top clocks to undervolted and underclocked at 54% power usage, it retained 87% of the frames, down 13% like your numbers.

"Limited" to 200 W, the 4070 will retain the vast majority of its performance vs. a 240 W or 280 W power limit.
Posted on Reply
#13
Vayra86
Hxx: They looove this crippled 192-bit memory bus, don't they? It's like an obsession to cripple their mid/higher-tier cards with this damn bus. 256-bit should be the standard on any card over $400-500, and 384-bit on any card $1K or higher. NVIDIA, just listen, ffs: this card has 12 GB on it, why would you cripple that damn bus width?
The answer:

en.m.wikipedia.org/wiki/Planned_obsolescence
TheDeeGee: My 4070 Ti only uses 155 Watt at 55% power limit at the cost of 5% performance.
Nice token of euhhh... shitty product out of the box.
Posted on Reply
#14
TheDeeGee
oxrufiioxo: Shame it's so meh even at full power...
Better than my old GTX 1070.
Posted on Reply
#15
evernessince
TheDeeGee: My 4070 Ti only uses 155 Watt at 55% power limit at the cost of 5% performance.
That's going to vary a lot based on what game you are playing. On average though you should be seeing a much larger hit.

The 4090 sees a 5% hit at 80% power. At 70% and below the hit starts increasing exponentially and at 50% you are talking 23.5%.

The 4080's performance profile is identical at the same % based power limits and I'd assume that would apply to the 4070 Ti as well.

55% is where you will see maximum efficiency before performance drops off a cliff but it's going to come with a decent hit to performance. Essentially your card will perform between a 3070 Ti and a 3080.
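To put those figures side by side, here is a quick back-of-the-envelope sketch using only the numbers quoted above (nothing newly measured):

# Relative performance and perf-per-watt at the power limits quoted above
# (~5% loss at an 80% power limit, ~23.5% loss at 50%).
points = {100: 0.0, 80: 5.0, 50: 23.5}  # power limit % -> performance loss %

for limit, loss in points.items():
    perf = 100.0 - loss
    efficiency = perf / limit  # perf-per-watt relative to stock (= 1.0)
    print(f"{limit:>3}% power -> {perf:5.1f}% performance, {efficiency:.2f}x perf/W")

So perf-per-watt keeps improving as the limit comes down, but past roughly the 70-80% mark the performance hit stops being small.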
Posted on Reply
#16
oxrufiioxo
TheDeeGee: Better than my old GTX 1070.
Better than a card that is nearly 7 years old and launched at $380. #Progress
Posted on Reply
#17
wolf
Performance Enthusiast
TheDeeGee: Better than my old GTX 1070.
It's a shame salty people feel the need to crap all over your purchase, and this thread. Enjoy the 4070ti.
Posted on Reply
#18
TheDeeGee
wolf: It's a shame salty people feel the need to crap all over your purchase, and this thread. Enjoy the 4070ti.
I guess some people got 2 kilos of salt in their 4090 box instead of a GPU :D
evernessince: That's going to vary a lot based on what game you are playing. On average though you should be seeing a much larger hit.

The 4090 sees a 5% hit at 80% power. At 70% and below the hit starts increasing exponentially and at 50% you are talking 23.5%.

The 4080's performance profile is identical at the same % based power limits and I'd assume that would apply to the 4070 Ti as well.

55% is where you will see maximum efficiency before performance drops off a cliff but it's going to come with a decent hit to performance. Essentially your card will perform between a 3070 Ti and a 3080.
Was tested in Fortnite with everything maxed out, even Lumen and Nanite.
Posted on Reply
#19
oxrufiioxo
wolf: It's a shame salty people feel the need to crap all over your purchase, and this thread. Enjoy the 4070ti.
It's a shame Nvidia thinks gamers are stupid and will buy anything it puts out, regardless of how gimped it is for its $800+ asking price. I didn't really care for the 3080 10G, and technically this is its successor from a pricing perspective, only worse: it came out two years later, offers barely any more performance, and is also only really suited for 1440p/1080p. Its only positive is the power consumption, really. My 3080 Ti, an equally bad product, maybe even worse vs. the 4070 Ti imho, has already been relegated to 1440p, which it's OK at, I guess.

If someone looks at a 4070 Ti and goes "geez, $800 for that is awesome," good for them. I honestly mostly just feel bad for anyone who needs a GPU, because at under $1,000 this is the best they can get, at least from Nvidia anyway. A lot of people waited out the last couple of years stuck on Pascal/Turing just to basically get screwed by Nvidia.

Even though I prefer Nvidia GPUs, I would definitely buy a 7900 XTX over it, even with the slightly lesser feature set and AMD drivers, because at least it offers something new from a rasterized performance perspective, if my budget was under $1K anyway.
TheDeeGee: I guess some people got 2 kilos of salt in their 4090 box instead of a GPU :D
It is pretty salty. They must have packaged it with the tears of all the gamers who now have to spend $800+ on a 3060 Ti/3070 successor in disguise. :laugh:

I'm guessing you did your own research on the product and decided it was what was best for you, the same as I did with the 4090. Neither card is overly appealing given their price, and this is likely the new norm, which is just kinda sad, honestly. Hopefully I'm wrong.

Seeing all the 4070 Ti/4080 cards just sitting on store shelves collecting dust does give me some hope. Same with the equally bad 7900 XT.
Posted on Reply
#20
evernessince
TheDeeGee: Was tested in Fortnite with everything maxed out, even Lumen and Nanite.
That makes sense, Fortnite tends to draw 252 watts on a stock 4090, uncapped and maxed out at 2K.

Considering that you have a 1080p 63 Hz monitor, it makes sense how you got your 5% drop figure based on that game and monitor setup. You are essentially using only a small portion of that GPU's power.
Posted on Reply
#21
TheDeeGee
evernessince: That makes sense, Fortnite tends to draw 252 watts on a stock 4090, uncapped and maxed out at 2K.

Considering that you have a 1080p 63 Hz monitor, it makes sense how you got your 5% drop figure based on that game and monitor setup. You are essentially using only a small portion of that GPU's power.
1200p actually :p

It draws 220 Watt at stock settings.
Posted on Reply
#22
Why_Me
N3utro: Wait, does this mean this card won't set itself on fire like the others? It's not a real 4xxx series card then.
Nvidia meant it as an IQ test. Nvidia sent this to those few customers who were incapable of properly plugging in their cards.

Posted on Reply
#23
RegaeRevaeb
Why_Me: Nvidia meant it as an IQ test. Nvidia sent this to those few customers who were incapable of properly plugging in their cards.

No no, that was the beta version; the one sent in the post had at least one square for a round hole (make of that what you will).
Posted on Reply