
About to order a new PC. How good is my GPU, should I swap to a 4070?

Status
Not open for further replies.
Joined
Sep 17, 2014
Messages
20,777 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Yes, prices have increased over time - what a shocker. Heard of the concept of inflation?
Ah, we had 50% economic inflation between Turing and Ampere? :laugh:
The real context is Ampere gained that xx103 SKU and Nvidia carved out a new market segment 'for creators'.

Either way, I'll concede! You were right, there are more examples closer to 70%. Especially the 980 Ti > 1080 Ti one surprised me, even if it's +67%. Still significant.
So maybe the general advice stands with a caveat: unless you're only looking at the top end of the stack.
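If anyone wants to sanity-check these gen-on-gen numbers, here's a minimal sketch of the arithmetic. The relative-performance scores below are placeholders, not measured data; a 1.67x ratio is just the +67% figure from above expressed as a score pair.

# Percentage uplift from one generation's card to the next.
# Scores are hypothetical placeholders; plug in whatever benchmark aggregate you trust.
def gen_uplift(old_score: float, new_score: float) -> float:
    """Return the generational gain in percent."""
    return (new_score / old_score - 1.0) * 100.0

print(f"980 Ti -> 1080 Ti: +{gen_uplift(100.0, 167.0):.0f}%")  # prints +67%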
 

bug

Joined
May 22, 2015
Messages
13,163 (4.07/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Eh? It's not factually incorrect if you found one exception to a rule, especially one called 'generic advice'. Thát is the whole point.
Exactly. I said in most cases it doesn't make sense to upgrade from one gen to the next. Even if, with some generations, the fastest card is much faster than the fastest card in the previous gen, that still leaves something like 8-10 other SKUs that don't sport the same improvements. So in most cases, it still doesn't make sense to jump on every gen.

Also, he cheated a bit, because the 3090 is the successor of the Titan RTX, not of the 2080 Ti. But Nvidia made such a mess of their lineup, you almost need a PhD to keep up.
 
Joined
Mar 29, 2023
Messages
920 (2.51/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
Ah, we had 50% economic inflation between Turing and Ampere? :laugh:
The real context is Ampere gained that xx103 SKU and Nvidia carved out a new market segment 'for creators'.

Prior to Turing the pricing had been the same for a very long time - they had to increase it eventually. Did they increase it too far? Possibly. But if you compare the price of a 4090 anno 2023 vs an 8800 Ultra anno 2007, adjusted for inflation, they aren't actually that far from each other. The launch price of the 8800 Ultra was $829, which in today's money is about $1,219.
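A minimal sketch of that inflation math, using approximate US CPI-U annual averages (the CPI figures here are my own rough numbers, so treat the result as a ballpark):

# Adjust the 8800 Ultra's 2007 launch price into 2023 dollars via a CPI ratio.
CPI_2007 = 207.3   # approximate US CPI-U annual average for 2007
CPI_2023 = 304.7   # approximate US CPI-U annual average for 2023

launch_price_2007 = 829.0                          # 8800 Ultra MSRP in USD
adjusted = launch_price_2007 * CPI_2023 / CPI_2007
print(f"8800 Ultra in 2023 dollars: ${adjusted:,.0f}")  # roughly $1,219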

As for getting hung up on whether they call it 102 or 103... again, it doesn't really matter. What matters is: is it the big chip of that gen, yes or no?

When they gave the GTX 680 the midrange 104 chip, that was an entirely different matter though... which is why there was a very big gap in performance between the 680 and the Titan / 780 Ti. And it's exactly the same thing they did with the 4080 (unlike the 3080, which did get the big chip), where it got the much wimpier midrange chip (now ironically numbered 103 - possibly to deflect criticism of the 4080 getting a smaller chip?).

Either way I will give you this much - Nvidia has been skimping a lot more on the midrange cards since the 1000 series, in various ways - some cards on power, others on VRAM (mostly VRAM though).
 
Last edited:
Joined
Sep 17, 2014
Messages
20,777 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Prior to Turing the pricing had been the same for a very long time - they had to increase it eventually. Did they increase it too far? Possibly. But if you compare the price of a 4090 anno 2023 vs an 8800 GTX anno 2006, adjusted for inflation, they aren't actually that far from each other.

As for getting hung up on whether they call it 102 or 103... again, it doesn't really matter. What matters is: is it the big chip of that gen, yes or no? Yes, 102 was the number of the big chip prior to Ampere, and yes 103 is the big chip with Ampere, and now with Ada it's numbered 102 again.

When they gave the GTX 680 the midrange 104 chip, that was an entirely different matter though... which is why there was a very big gap in performance between the 680 and the Titan / 780 Ti. And it's exactly the same thing they did with the 4080 (unlike the 3080, which did get the big chip), where it got the much wimpier midrange chip (now ironically numbered 103 - possibly to deflect criticism of the 4080 getting a smaller chip, so they gave it the same number as the big chip of the prior gen, to appease people who get hung up on the chip number?).
This whole exercise in GPU history is exactly because it does matter what SKUs are in a stack. You even point that out in the very paragraph after telling me not to get hung up on SKUs :) You too have to concede that the 680 and 780/Ti are entirely different products due to the SKU underneath.

That's also what bug is saying. Nvidia's stack gen to gen is a mess. So I get what you're saying, in that 'you just look at the top end of each gen' to make your comparison / come to percentages. Absolutely.
But about that 103... that's an EXTRA SKU in the stack. This means there is a bigger gap between GPUs around that SKU. The 104, the 103, and the 102 are all in Ampere. The 103 didn't come on top of the stack, it covers the x80 segment. The stack literally got bigger. This also means the gen-to-gen perf jump is potentially a lot bigger.


But, again... not so much unicorns from your POV, I admit.
 
Joined
Mar 29, 2023
Messages
920 (2.51/day)
Processor Ryzen 7800x3d
Motherboard Asus B650e-F Strix
Cooling Corsair H150i Pro
Memory Gskill 32gb 6000 mhz cl30
Video Card(s) RTX 4090 Gaming OC
Storage Samsung 980 pro 2tb, Samsung 860 evo 500gb, Samsung 850 evo 1tb, Samsung 860 evo 4tb
Display(s) Acer XB321HK
Case Coolermaster Cosmos 2
Audio Device(s) Creative SB X-Fi 5.1 Pro + Logitech Z560
Power Supply Corsair AX1200i
Mouse Logitech G700s
Keyboard Logitech G710+
Software Win10 pro
This whole exercise in GPU history is exactly because it does matter what SKUs are in a stack. You even point that out in the very paragraph after telling me not to get hung up on SKUs :) You too have to concede that the 680 and 780/Ti are entirely different products due to the SKU underneath.

That's also what bug is saying. Nvidia's stack gen to gen is a mess. So I get what you're saying, in that 'you just look at the top end of each gen' to make your comparison / come to percentages. Absolutely.
But about that 103... that's an EXTRA SKU in the stack. This means there is a bigger gap between GPUs around that SKU. The 104, the 103, and the 102 are all in Ampere. The 103 didn't come on top of the stack, it covers the x80 segment. The stack literally got bigger. This also means the gen-to-gen perf jump is potentially a lot bigger.


What I'm saying is that what chip it actually is, is what matters - not the numbering. The numbering is just Nvidia naming shenanigans, which they have been up to since forever.

The chip class that they currently number as 103 was previously just numbered 104, but that didn't change anything with regard to the actual silicon - and what actual silicon you are getting can be seen from the specs, namely the die size. But yes, they did move the numbering of the entire stack, aside from the 102. Meaning the chip class numbered 104 today is a significantly smaller chip than it was prior to the new numbering, as seen with the RTX 2080, which had a 104 chip that was ironically much larger than the 103 chip of the 4080.
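To put rough numbers on that, here's a quick sketch comparing die sizes. The mm² figures are approximate and quoted from memory of TPU's GPU database, so double-check them before leaning on the exact ratios.

# Approximate die sizes in mm^2 (ballpark figures, not official numbers).
die_sizes_mm2 = {
    "TU102 (2080 Ti)":   754,
    "TU104 (2080)":      545,
    "GA102 (3080/3090)": 628,
    "AD102 (4090)":      609,
    "AD103 (4080)":      379,
    "AD104 (4070)":      295,
}

# The 2080's "104" die is far bigger than the 4080's "103" die, which is the
# point: the number tells you little, the silicon underneath tells you a lot.
ratio = die_sizes_mm2["TU104 (2080)"] / die_sizes_mm2["AD103 (4080)"]
print(f"TU104 is {ratio:.2f}x the area of AD103")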
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
3,967 (1.74/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans removed
Cooling Optimus AMD Raw Copper/Plexi, HWLABS Copper 240/40+240/30, D5, 4x Noctua A12x25, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MHz 26-36-36-48, 57ns AIDA, 2050 FLCK, 160 ns TRFC
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front panel with pump/res combo
Audio Device(s) Audeze Maxwell Ultraviolet, Razer Nommo Pro
Power Supply Corsair SF750 Platinum, transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS
Mouse Razer Viper Pro V2 Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU Redux Burgundy w/brass weight, Prismcaps White & Jellykey, lubed/modded
Software Windows 10 IoT Enterprise LTSC 19053.3803
Benchmark Scores Legendary
Might be worth taking it to DMs, as this is off topic and the OP has already left.
 
Joined
Nov 9, 2010
Messages
5,649 (1.16/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
The 4070 has AV1 encoding, better ray tracing performance, and lower power draw. The 6950 XT has better rasterization performance and 4 GB more VRAM. Pick what is more important to you.

And in that decision, somewhere in there you need to factor in whether ray tracing matters to you, because clearly Nvidia's engineers are pushing that far more than AMD's.

That said, Nvidia are also now pursuing Neural Texture Compression, a software means of dealing with the very high VRAM demands of some games, and I'm not sure that will pan out well. It's probably going to come down to which GPU models support it, what they cost, whether the games have to support it, and if so, how many developers jump on board. And THAT'S if NTC even WORKS as well as they say it will.
 
Last edited:
Joined
Dec 25, 2020
Messages
4,440 (3.73/day)
Location
São Paulo, Brazil
System Name Project Kairi Mk. IV "Eternal Thunder"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard MSI MEG Z690 ACE (MS-7D27) BIOS 1F
Cooling Noctua NH-D15S + NF-F12 industrialPPC-3000 w/ Thermalright BCF and NT-H1
Memory G.SKILL Trident Z5 RGB 32GB DDR5-6800 F5-6800J3445G16GX2-TZ5RK @ 6400 MT/s 30-38-38-38-70-2
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 1x WD Black SN750 500 GB NVMe + 4x WD VelociRaptor HLFS 300 GB HDDs
Display(s) 55-inch LG G3 OLED
Case Cooler Master MasterFrame 700
Audio Device(s) EVGA Nu Audio (classic) + Sony MDR-V7 cans
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Logitech G305 Lightspeed K/DA + Logitech G840 XL K/DA
Keyboard Logitech G Pro TKL K/DA
Software Windows 10 Enterprise 22H2
Benchmark Scores "Speed isn't life, it just makes it go faster."
If you already have a 6950 XT, it's a sidegrade. Your card will chug power in comparison, but it's not worth upgrading to anything other than the RTX 4090 if you have a 6950 XT or RTX 3090/3090 Ti.

cheers
 

mathohardo

New Member
Joined
Feb 23, 2023
Messages
5 (0.01/day)

My list, but can I swap to a 4070 or stick with my 6950?
It's NEVER a good idea to replace any computer part one gen later unless you've got money to burn. Hell, Linus, with 12 million subs on YouTube, cheating a bit by not counting the controller and building it himself, couldn't match consoles...

Gen on gen, games get at times 3-20% more performance, and that is supposedly SO bad? How so? Keep in mind the 2013 Xbox One was 1.3 teraflops, while the 2020 one (now the old one) is 12. I promise you the 4070 will be about 20-30% weaker than the cheap $500-600 next-gen consoles, which are entire PC systems with cables and a controller, when they come out. This is all based on the last decade of tech.
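For reference, those console teraflop figures come from a simple formula (FP32 shaders x clock x 2 ops per clock for FMA). The shader counts and clocks below are approximate and quoted from memory, so treat them as assumptions:

# FP32 TFLOPs = shader count x clock (GHz) x 2 ops per clock (FMA), divided by 1000.
def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * clock_ghz * 2 / 1000

# Approximate specs, quoted from memory:
print(f"Xbox One (2013):      {tflops(768, 0.853):.2f} TFLOPs")   # ~1.3
print(f"Xbox Series X (2020): {tflops(3328, 1.825):.2f} TFLOPs")  # ~12.1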
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.61/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Stick to the 6950
 