
RTX 4070 Ti performance unveil.

Joined
Jun 21, 2013
Messages
544 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
you guys are aware there has been like 20-30% devaluation of most currencies around the world in the past 2 years, right?
Funny how that does not apply to any other PC hardware.

It's just another rebranded card, formerly known as the 4080 12 GB. So it is now called the 4070 Ti and is maybe 100 USD cheaper. Well, I guess Nvidia has seen how well the 4080 16 GB sells :roll:
Have to wonder just how screwed the AIBs are, if Nvidia was selling them the chips for 750~800 to begin with.
 
Joined
Jan 8, 2017
Messages
9,098 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Funny how that does not apply to any other PC hardware.

Are you really sure? Take CPUs: the Ryzen 7700X is $100 more than the 5700X was at launch, for example.

For the USA (*2022 estimated):

1.4 + 7 + 7.1 = 15.5
This is not for 2 but for 3 years.

You realize this should be compounded, not added, right? And if you do that, it's about 20% like I said, and this is just the dollar, the main reserve currency of the world; obviously many other currencies fared even worse.
 
Last edited:
Joined
Jan 3, 2015
Messages
2,908 (0.85/day)
System Name The beast and the little runt.
Processor Ryzen 5 5600X - Ryzen 9 5950X
Motherboard ASUS ROG STRIX B550-I GAMING - ASUS ROG Crosshair VIII Dark Hero X570
Cooling Noctua NH-L9x65 SE-AM4a - NH-D15 chromax.black with IPPC Industrial 3000 RPM 120/140 MM fans.
Memory G.SKILL TRIDENT Z ROYAL GOLD/SILVER 32 GB (2 x 16 GB and 4 x 8 GB) 3600 MHz CL14-15-15-35 1.45 volts
Video Card(s) GIGABYTE RTX 4060 OC LOW PROFILE - GIGABYTE RTX 4090 GAMING OC
Storage Samsung 980 PRO 1 TB + 2 TB - Samsung 870 EVO 4 TB - 2 x WD RED PRO 16 GB + WD ULTRASTAR 22 TB
Display(s) Asus 27" TUF VG27AQL1A and a Dell 24" for dual setup
Case Phanteks Enthoo 719/LUXE 2 BLACK
Audio Device(s) Onboard on both boards
Power Supply Phanteks Revolt X 1200W
Mouse Logitech G903 Lightspeed Wireless Gaming Mouse
Keyboard Logitech G910 Orion Spectrum
Software WINDOWS 10 PRO 64 BITS on both systems
Benchmark Scores Se more about my 2 in 1 system here: kortlink.dk/2ca4x
Funny how that does not apply to any other PC hardware.


Have to wonder just how screwed the AIBs are, if Nvidia was selling them the chips for 750~800 to begin with.
According to Nvidia, AIBs would be compensated for the shift in branding, and probably for the price change as well. Well, if not, more AIBs could pull an "EVGA" on Nvidia.

So the question is rather: would Nvidia risk more AIBs doing what EVGA did by not compensating them for their loss, or risk losing even more AIBs? Not compensating them could be a risky thing to do, as the bigger AIBs like Asus, Gigabyte and MSI have other things to produce. So I think they can afford to leave Nvidia if needed, and since one has already left, I think others are now more willing to do the same, especially if Nvidia keeps screwing with its partners. They will eventually be fed up and leave.
 
Joined
May 31, 2017
Messages
877 (0.34/day)
Location
Home
System Name Blackbox
Processor AMD Ryzen 7 3700X
Motherboard Asus TUF B550-Plus WiFi
Cooling Scythe Fuma 2
Memory 2x8GB DDR4 G.Skill FlareX 3200Mhz CL16
Video Card(s) MSI RTX 3060 Ti Gaming Z
Storage Kingston KC3000 1TB + WD SN550 1TB + Samsung 860 QVO 1TB
Display(s) LG 27GP850-B
Case Lian Li O11 Air Mini
Audio Device(s) Logitech Z200
Power Supply Seasonic Focus+ Gold 750W
Mouse Logitech G305
Keyboard MasterKeys Pro S White (MX Brown)
Software Windows 10
Benchmark Scores It plays games.
Are you really sure? Take CPUs: the Ryzen 7700X is $100 more than the 5700X was at launch, for example.
You have to compare the 5800X ($449) and the 7700X ($399) as both were the 8 core CPUs at launch. The 5700X came later.

No other (tech) product market has seen such insane price increases as GPUs since COVID. It's not all inflation; it's mostly greed. Seems to me you are indeed defending billion-dollar companies.
 
Joined
Jun 21, 2013
Messages
544 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
Are you really sure? Take CPUs: the Ryzen 7700X is $100 more than the 5700X was at launch, for example.
Is that a fair comparison? 5700X was released a year after 5800X, 7800X does not even exist, 7700X was launched instead with the future 7700X3D taking the place of 7800X.
 
Joined
Jan 8, 2017
Messages
9,098 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Is that a fair comparison? 5700X was released a year after 5800X
That makes it even worse, doesn't it ? And you can bet that 7700X3D is going to be more expensive than the 5800X was.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.55/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Duopoly (Intel is not enough to create real competition in performance/watt/driver quality) cartel behavior, even if there is no real collusion behind the scenes.
I'd love to see PowerVR/Imagination Technologies make a graphics accelerator card that is 100% compatible with Windows, the way their mobile GPUs are with Android...

Is that a fair comparison? 5700X was released a year after 5800X, 7800X does not even exist, 7700X was launched instead with the future 7700X3D taking the place of 7800X.

The 5700X is a 5800 OEM.
 
Joined
Jun 21, 2013
Messages
544 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
That makes it even worse, doesn't it ? And you can bet that 7700X3D is going to be more expensive than the 5800X was.
Sure, most people were shitting on AMD for releasing the overpriced 5600X and 5800X instead of 5600 and 5700X.
7700X and 7900X were both overpriced as well at launch, but you have to compare the 7700X to 5800X, after all, the 5700X was released in 2022 and was priced the same as 5600X in 2020.
5800X launch price was $449, while 7700X was $399.

By the time 7700X3D or 7800X3D (whatever they name it) is out, the 3D variant should launch at 7700X MSRP or max at $449, which is 5800X and 5800X3D MSRPs.
 
Joined
Jan 8, 2017
Messages
9,098 (3.37/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
the 3D variant should launch at 7700X MSRP or max at $449, which is 5800X and 5800X3D MSRPs.

That's incredibly wishful thinking but whatever, we'll see.
 

zer0day777

New Member
Joined
Jan 4, 2023
Messages
23 (0.05/day)
..."classic performance"?

Is this seriously some kind of bull%$&% marketing drivel nVidia is trying to ram down our throats or something? It's not "classic", it's just plain raster. You don't need to invent new words for it, and frankly, no matter how much they push this, the average gamer just doesn't care about raytracing. The problem is that raster already got really, really good at faking it over the last two decades, coupled with the HairWorks-tier performance issues that nVidia has not overcome even three whole generations deep. It got to the point that they tried shilling an upscaler as "a FeatureTM, not a bug", when we all know nobody would be using upscalers at all, or even talking about them, if the hardware were good enough to run things natively (and I still won't take anything seriously that starts talking about non-native; it's literally saying "I can't actually do it").

What matters now is the race to 4k144hz affordability. That was the dream last generation: playing things at 4k, where even raster, RT-off Cyberpunk 2077 pushed the highest-end hardware, and 4k panels cost too much. Even at release, 4k60 panels were expensive af when the 3080 and 6800XT hit. So it became more about having an affordable 4k panel and getting cheaper, more affordable cards.

Thanks to nuVidia, the cheap-card half of that didn't happen. The cost of 4k monitors did drop to the point where most people can easily afford them; instead of like $600 for a 4k75hz panel, we now have $300 monitors that are fairly decent. You can easily get a 4k144hz monitor now.

A year ago this is what everybody was talking about. Not "4k RT", we already saw that; instead it was "oh wow, I wonder if I can game on a 4k monitor with a 7700XT or 4070!" That was clearly thinking about mostly raster, and maybe super smudgy upscaled RT at best. Which, to be fair, you could already do in lots of older games with a 6800XT; 4k gaming was already a reality with the 1080ti, and iirc the comparable 5700XT and 2070 Super can play Witcher 3 all ultra at 4k60, sure. But we didn't have 4k144 before, and the buzz was that it was about to become the reality.

Instead, the reality is epic disappointment: there's been no real uplift, just a telescoping of the halo range outward, which is why nuVidia dropped prices on its higher end. These are largely just re-brands of their high-end RTX 3000 series, with a halo range pushed out further, with equally higher TDP, and at tremendous cost. That's the literal definition of stagnation right here. So there's nothing new and nothing to be excited about, because it costs at least as much if not more and consumes more power.
True. If they had used TSMC 6N, I wouldn't have minded 15% less performance for a 30% lower price, as an example.
I feel like they'll consider subscription-based performance in the coming years :eek:
You buy an RTX 5050, and for a measly $50/month you can turn it into an RTX 5070 :laugh:
They have already attempted this with bull---- like Stadia etc. The major tech companies have all been trying to move to subscription models for ages now, with even M$ Outlook being entirely online b.s. rather than actually downloading an application or installing Microsoft Works off a disc. This is mainly because it's not just more lucrative but more reliable: you want to hook someone on heroin so they keep coming back, not just sell them a brick. "It won't kill you though"? idk, nVidia's working on killing you in a house fire with 3090s and 4090s :D So this isn't really something new; they've been trying for years to find a way to reach "you'll own nothing and be happy", where you're basically just renting GPU utilization from some server farm. This is why people like Jensen LOVED the mining craze; it just wasn't as reliable as they hoped.

It comes down to economic models, and yeah, I'm f'ing jaded; it's basically turning into every last thing people warned me about with "the commies will do this nightmare to you!", and instead I find it's the capitalists doing it. Like, I don't trust or like Valve simply based on their model and the fact that I don't own my games. I was blessed enough that Poland exists, so at least I can mainly buy games from GOG, but they don't carry every game yet. Meanwhile the sh--ier companies are all trying to move to always-online b.s.; they've been trying to move to subscription-based services for AGES now. Remember that RealID b.s. with Blizzard? Well, everyone has been trying to find a way to get WoW-like online gaming, and it's not just about the lootbox gambling; if you look into it, phones and tablets have had gambling cartels trying to suck people into that too, and really the phone slots aren't a lot different from all the FarmVille crap they put on there anyway. It's because they want steady and reliable returns. Outside of a few devs, nobody wants to make something actually enjoyable; the companies themselves want addicts and a revenue stream, not a series of purchases.

I could go on about this one for ages, but basically, in my lifetime it's gone from the promise of machines being there to free us to them literally being tools to enslave us. I'm not even just talking about the nonstop GPS tracking, social media, all that stuff of the last 10+ years, the worst being phones; I mean the entire "IoT" and locked-down creepy infrastructure being specifically designed to take as much power and ownership out of the user's hands as possible and deliver it squarely into the shareholders'. So, yes, nVidia has literally been trying to find a way of pulling that off, and I'm beginning to wonder if, by artificially pricing GPUs so sky-high since Pascal (they did this every year with Pascal, Turing, and Ampere; this has f---ing NOTHING to do with inflation), they can eventually push the Overton window far enough that "renting GPUs" stops sounding like the terrible thing it is, and monstrously disingenuous tech bloggers hail nVidia as the saviors of gaming because they're now offering a "cheap" GPU rental service. In fact, I could write word for word what these insipid c---s will say: "Oh, but you'll replace your GPU anyway!" "It's so much cheaper on a day-by-day basis than buying one!" "But you'll be saved the depreciation cost, and nVidia absorbs it, isn't that charitable of them!" "It's really cheaper this way, we promise!" "Just look at those GPU prices!"

You can screenshot this: I swear to God I can write, right this very minute, the exact talking points nVidia's shills will come out with, pandering to gamers born in 2019 about how great it is that nobody owns a GPU anymore, and why would you, it costs $3000 anyway.
 
Joined
Sep 26, 2022
Messages
1,619 (2.65/day)
Location
Brazil
System Name G-Station 1.17 FINAL
Processor AMD Ryzen 7 5700X3D
Motherboard Gigabyte X470 Aorus Gaming 7 Wi-Fi
Cooling DeepCool AK620 Digital
Memory Asgard Bragi DDR4-3600CL14 2x16GB
Video Card(s) Sapphire Pulse RX 7900 XTX
Storage 240GB Samsung 840 Evo, 1TB Asgard AN2, 2TB Hiksemi FUTURE-LITE, 320GB+1TB 7200RPM HDD
Display(s) Samsung 34" Odyssey OLED G8
Case Thermaltake Level 20 MT
Audio Device(s) Astro A40 TR + MixAmp
Power Supply Cougar GEX X2 1000W
Mouse Razer Viper Ultimate
Keyboard Razer Huntsman Elite (Red)
Software Windows 11 Pro
I'd love to see PowerVR/Imagination Technologies make a graphics accelerator card that is 100% compatible with Windows, the way their mobile GPUs are with Android...
The worst piece of graphics tech I ever touched was a PowerVR creation with an Intel name. God, how I do NOT miss the time I was forced to use the GMA500...
 

zer0day777

New Member
Joined
Jan 4, 2023
Messages
23 (0.05/day)
Sure, most people were shitting on AMD for releasing the overpriced 5600X and 5800X instead of 5600 and 5700X.
7700X and 7900X were both overpriced as well at launch, but you have to compare the 7700X to 5800X, after all, the 5700X was released in 2022 and was priced the same as 5600X in 2020.
5800X launch price was $449, while 7700X was $399.

By the time 7700X3D or 7800X3D (whatever they name it) is out, the 3D variant should launch at 7700X MSRP or max at $449, which is 5800X and 5800X3D MSRPs.
Nah, they're a soulless corpo screwing us; it's just that Intel and nVidia are slightly more evil and soul-sucking on the average day. That didn't stop AMD from "recommending" in my software, all pandemic long, that I upgrade to a 5950X and 6900XT. It's like, don't use my own software to shill crap I am never going to need (and I was already planning to buy a 5950X used once it fell under $300 anyway).

The real scandal to me was that they jacked prices and then took out the cooler. It was a simple economic decision: the AMD boards were often cheaper, true. The cores were cheaper, sure. But at the end of the day, a 75W or 65W or whatever CPU doesn't need a beefy cooler, and on top of that a stock cooler was included "for free." I really liked my Wraith Prism. Most people did. It looked way better and performed way better than those piece-of-crap RGB coolers they stick in every prebuilt gaming PC. You know, the UFO ones. It ties my PC together like a nice holographic ARGB rug and does the job so well there's no point getting an air tower that would've cost me $50. So, to me, dropping that stock cooler plus jacking the price drove AMD directly into Intel territory, where I'm now paying $100 extra for an equivalent SKU. That total-cost advantage is, in other words, what finally got me to switch to AMD from Intel in the first place; I didn't gaf about those two extra cores all that much, it was just nicer having them because I saved so much money switching from Intel.

But atm, would I go AMD? Eh. They brought stock coolers back, and the IHS is really neat looking; I'm glad it's finally on LGA. Idk, it all boils down to cost to performance, really. I think the huge gain in frequency works and these seem like really good chips, but Zen 3 was not great in price at all. It simply matched Intel.

This logic, btw, is why I trash nVidia so frequently: it's not like I magically print money out of thin air, and the total cost of an upgrade and the total build cost count no matter how much b.s. some marketing team tries to push the numbers around. The TOTAL cost on Intel back then was higher for worse performance. That's why I switched: I got better perf for the same dollar. Same goes for Radeon. The problem with nVidia is that not only was their performance per dollar worse, the performance per watt-dollar went to s*&^ as well. This is a problem, because if someone has a 600W or 750W or whatever PSU, they need to replace that PSU, which means they now need more money for that exact same upgrade. So if AMD can ship a lower TDP, it's not just about your monthly electric bill; it's the upfront build cost spiking too, because now you need a better PSU, and 1kW or 1200W PSUs aren't cheap. If you get the same performance but it's pulling 200W less, you're saving not just on monthly electric bills but directly on the build itself. That's the marvel of what AMD has done: no 350W jank dumped into its socket, just a breezy cool 53W while gaming, no need for AIOs or any pricier coolers, just $300 straight out of the box, while Intel is like $400 for the same CPU + $50 for an air tower + $50 for the board + tip. If you really look at the higher end and start factoring in PSU costs, it strongly disfavors Intel/nVidia vs AMD/Radeon. AMD simply loses the edge if they take out the stock cooler and jack costs. (Yes, I know why they did that; I realize 105W is harder to cool, and that goes back to the original point.)
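To put rough numbers on that total-cost point, here's a quick sketch; every price in it is a made-up placeholder rather than a quote for any real SKU:

```python
# Illustrative platform-cost comparison; all prices are hypothetical placeholders
def total_build_cost(cpu: float, cooler: float, board: float, psu_upgrade: float = 0.0) -> float:
    """Sum the parts a CPU/GPU choice actually forces you to buy."""
    return cpu + cooler + board + psu_upgrade

# Low-TDP part: stock cooler included, cheaper board, existing PSU is fine
efficient_platform = total_build_cost(cpu=300, cooler=0, board=130)

# High-TDP part: tower cooler needed, pricier board, PSU bump needed
hungry_platform = total_build_cost(cpu=400, cooler=50, board=180, psu_upgrade=60)

print(efficient_platform, hungry_platform)  # 430 vs 690: a bigger gap than the sticker prices alone suggest
```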
 
Joined
Nov 26, 2021
Messages
1,372 (1.50/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
You realize this should be compounded not added, right ? And if you do that it's about 20% like I said and this is just the dollar, the main reserve currency of the world, obviously many other currencies faired even worse.
While you're right, the values are small enough and the series short enough that addition is close enough. The product of these numbers is 1.014 × 1.07 × 1.071, or 1.162 (rounded to four significant figures). That is 16.2% inflation.
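For anyone who wants to check the arithmetic, here's a minimal sketch of adding versus compounding the same three yearly US rates quoted above (Python used purely for illustration):

```python
# Adding vs. compounding the quoted yearly US inflation rates
# (2020: 1.4%, 2021: 7%, 2022: 7.1% estimated)
rates = [0.014, 0.07, 0.071]

added = sum(rates)                # naive addition -> 0.155
compounded = 1.0
for r in rates:
    compounded *= 1.0 + r         # multiply the yearly growth factors
compounded -= 1.0                 # -> ~0.162

print(f"added:      {added:.1%}")       # 15.5%
print(f"compounded: {compounded:.1%}")  # 16.2%
```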
 
Joined
Jun 14, 2020
Messages
2,678 (1.85/day)
System Name Mean machine
Processor 13900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I love this board. Looking at the link down below, I'm going to go out on a limb here and say the $800 RTX 4070 Ti 12GB < RTX 3080 Ti 12GB in regards to price. Now the next big question is whether the RTX 4070 Ti 12GB will outperform the RTX 3080 Ti 12GB. I'm going to go out on another limb here and say 'yes, I think it will'.

With way lower power consumption as well, mind you.

You have to compare the 5800X ($449) and the 7700X ($399) as both were the 8 core CPUs at launch. The 5700X came later.

No other (tech) product market has seen such insane price increases as GPUs since COVID. It's not all inflation; it's mostly greed. Seems to me you are indeed defending billion-dollar companies.
Oh really? OK then, let's compare the 5600X (the cheapest 6-core in 2020) to the 3600, or the 2600, or the 1600 (the cheapest AMD 6-cores before 2020). The 5600X was 50 freaking percent more expensive than the previous parts. Fifty freaking percent...
 

zer0day777

New Member
Joined
Jan 4, 2023
Messages
23 (0.05/day)
Do people actually buy used gpu's online?
idk if this is supposed to be some weird flex or something, but yeah, obviously? If you're a budget gamer it's more sensible, and usually you can pretty much tell a good seller from a bad one because the card is going to look well taken care of, include all the manuals and stickers and whatever, and not be clogged with filth. Meanwhile some guy who can't even be arsed to clean the Cheetos from between the keys of his gaming laptop? Yeah, hard pass. The only time those are good buys is when someone lists it as "parts/not working, idk it just don't turn on", because you can probably fix it yourself and maybe flip it for a profit. Maybe. Possibly.

Seriously, I just realized this is an nVidia talking point, lmao. Dude, the used market is nVidia's biggest competitor other than AMD. I'm sure Jensen et al. didn't forget how massively the used GTX 1080ti outsold the brand-new RTX 2080, because why tf would you ever buy a new 2080? Have fun with the space-alien artifacting bugs. It took me a minute to realize you're literally pushing something. Yes, buying used GPUs online is a far superior option to buying a brand-new GPU from nVidia; in fact, what the majority of nVidia GPU owners actually do is basically wait until the next launch, when the market gets flooded with used GPUs. This very forum is a great case in point: you'll notice how many people are saying they're just gonna get a 6800XT or 3080 or 3070 Ti or 3090 or 6900XT or whatever, rather than bother with a Lovelace GPU. It happens especially commonly following a crypto crash, when you can snatch up cheap gaming GPUs because miners are having a fire sale, desperately hoping to recoup what they can before their hardware is totally worthless; to miners it's solely a monetary determination. The 2080 was so gouged (every year nVidia raises the price enough that they basically compete with themselves on the used market, and their own used GPUs often outcompete them), and it launched right after a mining boom, so it wasn't just that the GTX 1080ti was a legendary card; there were strong market/supply factors determining why so many gamers bought a used 1080ti rather than bothering with the RTX 2080.
In games it's much slower than the synthetics suggest, only 30% faster than a 3070 Ti and so obviously at 3080 Ti level, that is, until proper GDDR7 in the 50 series.
oh god, lmao, for some reason it was THIS exact comment that made me realize Lovelace literally *is* Fermi numbering too. That's right, hahaha, RTX 4000 = GTX 400; let's see if they do a GTX 500 series tier next. I was going to make a raytracing joke, but I just realized these will age like DOG S&%$, because the RT performance is so horrible that if anything ever requires RT baked right into the engine, say when games switch to running UE6 or whatever, well, just look at how the raytracing performance of the RTX 2080 holds up. Even the 2080ti was getting 30fps @1080p native, and DLSS 1.0 was legendarily bad.

So basically, what I am saying is that you should buy what you think is worth it now, and not con yourself into believing this is "futureproofing", because I guarantee you all the people who bought the old RTX cards have already traded them in. Like, nobody is still actually using an RTX 2070 for its RT. You would've sold that S, and then you'd be thinking about dropping your 3070 and getting a 4070, because some people on the hamster wheel are already de facto on a subscription model. There is no such thing as futureproofing with RT. Sorry, laughing about Fermi just made me remember all the dumb, boneheaded S I've seen people say online over the last several years, like "just get a 2060 dude, DLSS is making native resolution obsolete! You'll be able to play on a 4k monitor with your 2060 with DLSS!" or "everything will be raytraced someday, so you should just get a 2070, lol, AMD cards are going to be unable to play any games" etc. lol, those aged poorly, as did "oh man, you better sell your 2080ti RIGHT THIS MINUTE FOR $300, IT'S NOT EVEN GOING TO BE WORTH $400 WHEN THE 3070 RELEASES." hahaha, the amount of stupid things I read this pandemic...
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,482 (1.91/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans removed
Cooling Optimus AMD Raw Copper/Plexi, HWLABS Copper 240/40+240/30, D5, 4x Noctua A12x25, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MHz 26-36-36-48, 56.6ns AIDA, 2050 FLCK, 160 ns TRFC
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front panel with pump/res combo
Audio Device(s) Audeze Maxwell Ultraviolet, Razer Nommo Pro
Power Supply SF750 Plat, transparent full custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU Redux Burgundy w/brass weight, Prismcaps White & Jellykey, lubed/modded
Software Windows 10 IoT Enterprise LTSC 19053.3803
Benchmark Scores Legendary
You're essentially getting the performance of a 3090ti at half the RRP, using almost half the power, with support for newer tech, and only really slower in 4K resolution due to the memory config, where you should be using a RTX x80 or RX x9xx series card anyway. Plus it's the most efficient GPU generation in existence.

Still, people will complain.

With a 70% power limit you lose around 10% of performance according to Der8auer, so that makes it feasible even for passive cooling builds.
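Taking that Der8auer figure at face value, the perf-per-watt math is simple; the sketch below just plugs in the 70% limit and the ~10% loss, with the 285 W stock board power being an assumed round number rather than anything measured here (it cancels out anyway):

```python
# Rough perf-per-watt change from a 70% power limit
# (~10% performance loss is the figure attributed to Der8auer above;
#  285 W stock board power is an assumption for illustration only)
stock_power_w = 285.0
stock_perf = 1.00

capped_power_w = stock_power_w * 0.70   # 70% power limit
capped_perf = stock_perf * 0.90         # ~10% performance lost

efficiency_gain = (capped_perf / capped_power_w) / (stock_perf / stock_power_w) - 1
print(f"perf/W improvement: {efficiency_gain:.0%}")  # roughly +29%
```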
 

zer0day777

New Member
Joined
Jan 4, 2023
Messages
23 (0.05/day)
You're essentially getting the performance of a 3090ti at half the RRP, using almost half the power, with support for newer tech, and only really slower in 4K resolution due to the memory config, where you should be using a RTX x80 or RX x9xx series card anyway. Plus it's the most efficient GPU generation in existence.

Still, people will complain.

With a 70% power limit you lose around 10% of performance according to Der8auer, so that makes it feasible even for passive cooling builds.
>support for newer tech
>with support for
Name it. Name one. Single. Thing. That stupid thing has that's "supporting newer tech" over the 3090ti. This is why I don't take people like you seriously at all, and why some people sound like they're being paid off, or overdosing on copium after scamming themselves into buying something so wildly, outlandishly overpriced that they deserve to be beaten by their wife for buying it. Getting a 4080 or 4090 is like Negan buying a $600 jacket while he's unemployed.
>the most efficient gpu generation in existence
lmao
Dude, it's Fermi at outrageously high prices, even by nuVidia standards.

They peaked at Pascal. Everything since then has been a downhill ride: price, performance uplift relative to value and TDP, down to just general reliability. It includes them being such an outrageously dickheaded company that everyone dropped them, now culminating in the long and glorious end of EVGA cards. There will never be an EVGA Classified edition again. It includes all kinds of driver problems and driver bloat; I frequently see people on Steam forums complaining about their vaunted 3070 or 3080 or whatever having severe performance problems, sometimes from CPU bottlenecking, sometimes for other reasons. It's not just that the 3090 burst into flames on New World, or that the 4090 burst into flames from its connector, or the way you can't even fit many of these cards into normal cases. It's not even just the general scumbag way nVidia treated tech reviewers, partners, and the general public, with only their own buyers being simps enough to keep handing their money over. It's a broad, repeated pattern of behavior that was only loosely tolerated in the past because they offered great performance at sane enough prices that people just sorta put up with it, since AMD used to be unreliable.

People are complaining for a reason, and those reasons are very, very valid. It's a super inefficient card that hogs almost as much TDP as it does money, and it doesn't offer one single new thing to anybody. That's why this is going to be a pretty poor generation and why people are just going to get AMD cards instead, because what everyone was waiting for was affordable 4k144hz anyway.
 

Attachments: 1605739249906.jpg, Lovelace tldr.png, the absolute state of nuvidia 3.png

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
40,435 (6.55/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
The worst piece of graphics tech I ever touched was a PowerVR creation with an Intel name. God, how I do NOT miss the time I was forced to use the GMA500...
Dreamcast
 
Joined
Feb 3, 2012
Messages
201 (0.04/day)
Location
Tottenham ON
System Name Current
Processor i7 12700k
Motherboard Asus Prime Z690-A
Cooling Noctua NHD15s
Memory 32GB G.Skill
Video Card(s) GTX 1070Ti
Storage WD SN-850 2TB
Display(s) LG Ultragear 27GL850-B
Case Fractal Meshify 2 Compact
Audio Device(s) Onboard
Power Supply Seasonic 1000W Titanium
Joined
Jun 1, 2011
Messages
3,963 (0.84/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
I hope the entire GPU sector crashes; all these cards are overpriced.
 
Joined
Sep 1, 2020
Messages
2,065 (1.51/day)
Location
Bulgaria
Why do people wrongly claim that DLSS and RT are things we wanted? These are things that Nvidia forced upon us. Of course, we have the conditional freedom not to use them if we don't want to, but we unconditionally pay their price, which is included in the purchase.
 

zer0day777

New Member
Joined
Jan 4, 2023
Messages
23 (0.05/day)
Oh, oops, I didn't even post the right one. Or did I already? idk, doesn't matter, I forget.
I hope the entire GPU sector crashes; all these cards are overpriced.
That's the thing! It already did! I just don't get it. I think that MSRP for the 4080 got fantasized into existence during the mining boom, and they must've then told their shareholders this or something, the usual capitalistic short-term-gain, long-term-pain kind of thought process, like an alcoholic or a gambler; they got themselves locked into it and had to tell people they were selling at so-and-so price. I don't even blame Jensen necessarily; I think he and his board pretty much got attached to it for some stupid rationale. I can't imagine these people being insane or evil enough to think an 80-class card going up to $1600 is reasonable. But likewise, I'm speechless at how many people I've seen trying to rationalize it. It's like, you're not voting for your favorite political party here. You're arguing why the Umbrella Corp is perfectly justified in charging 300% more than the GTX 980's MSRP, after the great crypto crash and GPU selloff.

There's no reason to buy a new card at all. I definitely made the right choice in abandoning my plans to upgrade before the launch.

Why do people wrongly claim that DLSS and RT are things we wanted? These are things that Nvidia forced upon us. Of course, we have the conditional freedom not to use them if we don't want to, but we unconditionally pay their price, which is included in the purchase.
Because they had more market share, they were able to ram things down everyone's throats in a monopolistic way, and they absolutely did this by force. Even Linus and Jay, both basically nVidia fans, got super pissed about the way nVidia was treating guys like HWUB, which in turn led to a lot of bad blood in the tech community over their efforts at bullying and silencing smaller channels for not putting RT front and center. Yeah, it was definitely forced. I think a lot of us (myself included) really missed when Duke Nukem-type games had "mirrors" using different tricks, and then we got to Dead Space 2 and Deus Ex: Mankind Divided or whatever and all the mirrors are frosted over because they couldn't use the same tricks. I think RT, unlike plenty of other gimmicks, has a future; it's just that they spent so long working on raster tricks that raster became very, very good at making itself lifelike, to the point that games like Control are basically tech demos masquerading as games, since it's mainly in the reflections that you see a real difference.

The problem is that RT is so intensive that just one bounce smashes framerates, and since nVidia was planning to jack prices, they needed something better than https://forums.anandtech.com/thread...e-2080ti-provides-around-30fps-1080p.2552891/ 30fps, lol. So DLSS was their jerry-rigged, duct-tape solution to the problem, and then they decided to sell it as "a featureTM, not a bug." I saw some people (to this day idk if some were marketers or dumb kids) trying to claim DLSS would make native irrelevant. Obviously that's a few layers of idiotic, but it basically encapsulates nuVidia's marketing strategy: focus all possible attention on the most proprietary thing possible, while blithely ignoring the real TDP/price/performance metrics that actually matter to us. Really, it's their bullying that left such a foul mark;
even Linus, who famously despises AMD and has said so (his whole business is making videos, so I'm sure old-school AMD issues there were frustrating, lol), laid into them.
The reason nVidia was acting this way is that they "went all in on raytracing" and hoped they could get all the tech bloggers in line to push the raytracing narrative, which clearly would give them an undisputed advantage but which gamers just didn't care enough about. I still think lots of games, like the more atmospheric or survival-horror titles, benefit from it, but it became a chicken-and-egg problem: you needed the tech to work for game makers to use it, but you needed game devs to start implementing it to sell the hardware, since only like two games that used it existed when Turing launched.

I think the broader problem you hit on, though, is just their general attitude, where DLSStm was this proprietary thing shackled to the expensive new card that would never work on a 1060, whereas AMD, in a power move, released open-source FSR that could be used even by a GTX 970. nVidia just feels super controlling and toxic, basically. Like they hate any concept of freedom. It's tempting for every tech company to try and start their own ecosystem, and it often fails, because why would you buy the $160 set of Corsair fans when you can't even use them with your other fans without spending an additional $160? I mean, the 2 wires and the 3-pin connector suck. Very true. But at the end of the day, technology wants to be freely usable and interchangeable; it's why USB 2.0 and 3.0 exist and became so prolific. So I don't even think DLSS will last much longer. I'd bet they drop it after 4.0 on the RTX 5000 series, if it even gets implemented then. Literally, it has no purpose beyond making RT playable at reasonable fps.
 

Attachments: the absolute state ...6.png

zer0day777

New Member
Joined
Jan 4, 2023
Messages
23 (0.05/day)
1440P is what this card is geared for.

Laughable. I know I, personally, was stoked for the 4070 vs 7700XT matchup because it meant 4k gaming was finally here for those of us who didn't want to spend over three grand on a system. To me, it meant that 4k144hz was finally a real possibility. Not the Ti; the regular 4070.

7800XT or 4080, I imagined, would be more than sufficient to power the vast majority of my games, achieving 144fps in Witcher 3 for example. I mean, a 5700XT can do like 83fps all ultra at 1440p in that game. I was imagining all but the most incredibly demanding titles, like Cyberpunk 2077, being easily 4k60 minimum on a 4070. I'm not even sure what point you're trying to make with that; CP2077 is unoptimized jank and everyone knows it. For the vast majority of titles, the midrange this gen should be fine for 4k. The problem is that affordable 4k gaming is what the majority of us were stoked to see.
That's completely not true. Most positions aren't paying 30% or 300% more. These are 2080ti prices for substantially inferior hardware. No, it doesn't matter that it's "still more powerful than the previous gen": if it gives 21% more performance and costs 30% more money, that's just shifting the names around (pulling numbers out of thin air to make a point, obv). If I get 50% more uplift and it costs 50% more too, then it's just extending the halo range deeper. The 3090ti is still at halo-tier prices; it's just that the thing they decided to call a "4070ti" is also a halo card now. No, it doesn't matter that it's more powerful than a 3080, since it costs way more than a 3080's MSRP ever did, and no, it does not matter that it's more powerful than a 3080 two years later, because that's the entire point of technological innovation in computer hardware: you get more performance for the same amount of money or less.
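To put a number on the "more performance but even more money" complaint, here's a quick perf-per-dollar check using the same illustrative figures pulled out of thin air above; the $800 baseline is just a placeholder:

```python
# Perf-per-dollar check with the illustrative numbers from the paragraph above
old_perf, old_price = 1.00, 800.0            # hypothetical last-gen baseline
new_perf = old_perf * 1.21                   # "21% more performance"
new_price = old_price * 1.30                 # "costs 30% more money"

value_change = (new_perf / new_price) / (old_perf / old_price) - 1
print(f"perf per dollar: {value_change:+.1%}")  # about -6.9%: you pay more per frame
```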

literally stuff like this made me stop believing in Capitalism

If nVidia wants to call a 5060ti a 5080 and charge 5090ti prices for it, it doesn't matter; it still costs that much. Which is another thing: I really don't care if the AMD card costs less and is positioned a tier lower. Performance matters, so in game recommendations it doesn't matter that an RTX 3050 costs $400 and you compare it to a 6600XT if the game runs at 6500XT settings. It's just moving names and numbers around to make nVidia's bad deals look less like the bad deals they are, because you're paying more and receiving less.
 