Monday, August 14th 2023

NVIDIA Blackwell Graphics Architecture GPU Codenames Revealed, AD104 Has No Successor

The next-generation GeForce RTX 50-series graphics cards will be powered by the Blackwell graphics architecture, named after American mathematician David Blackwell. kopite7kimi, a reliable source for NVIDIA leaks, revealed what the lineup of GPUs behind the series could look like. It will reportedly be led by the GB202, followed by the GB203, then the GB205 and GB206, with the GB207 at the entry level. What's surprising here is the lack of a "GB204" succeeding the AD104, GA104, TU104, and a long line of successful performance-segment GPUs from NVIDIA.

The GeForce Blackwell ASIC series begins with "GB" (GeForce Blackwell) followed by a 200-series number. The last time NVIDIA used a 200-series ASIC number for GeForce GPUs was with "Maxwell," as those GPUs ended up being built on a more advanced node, and with a few more advanced features, than what the architecture was originally conceived for. For "Blackwell," the GB202 logically succeeds the AD102, GA102, TU102, and a long line of "big chips" that have powered the company's flagship client graphics cards. The GB203 succeeds the AD103 as a high SIMD-count GPU with a narrower memory bus than the GB202, powering the #2 and #3 SKUs in the series. Curiously, there is no "GB104."
NVIDIA's xx04 ASICs have powered a long line of successful performance-through-high-end SKUs, such as the TU104 powering the RTX 2080, and the GP104 powering the immensely popular GTX 1080 and GTX 1070 series. This class of chip has been missing the mark for the past two generations, though. The "Ampere"-based GA104 powering the RTX 3070 may have sold in volume, but its maxed-out variant, the RTX 3070 Ti, hasn't quite sold in numbers, and missed the mark against the similarly priced Radeon RX 6800. Even with Ada, while the AD104 powering the RTX 4070 may be selling in numbers, the maxed-out chip powering the RTX 4070 Ti misses the mark against the similarly priced RX 7900 XT. This has caused NVIDIA to introduce the AD103 in the desktop segment (a high CUDA core-count silicon with a mainstream 256-bit memory bus) to justify high-end pricing, a strategy that will continue in the GeForce Blackwell generation with the GB203.

As with the AD103, NVIDIA will leverage the high SIMD power of the GB203 for high-end mobile SKUs. The introduction of the GB205 ASIC could be an indication that NVIDIA's performance-segment GPU will come with a feature set that avoids the kind of controversy NVIDIA faced when trying to carve out the original "RTX 4080 12 GB" using the AD104 and its narrow 192-bit memory interface.

Given NVIDIA's 2-year cadence for new client graphics architectures, one can expect Blackwell to debut around Q4 2024, aligning with mass-production availability of the 3 nm foundry node.
Source: VideoCardz

71 Comments on NVIDIA Blackwell Graphics Architecture GPU Codenames Revealed, AD104 Has No Successor

#51
ViperXZ
mb194dcThere's almost no market for the very high end? That's my guess as to why there's no 4090 Ti or AMD high end next gen.

Most users are still stuck on 1080p, which cards from two gens back can handle.

Something like a 6800 XT can handle 4K 60 and be found for circa 500 these days.

After the leaps ahead of that gen, looks like we're in for stagnation for a while?
Yea, you only need more than that if you want RT on in every game, or 4K Ultra at more than 60 fps (for me, 80 minimum feels best). The performance uplift we got at the end of 2020 was huge and changed a lot of things.
#52
Assimilator
ViperXZWhat a terrible take, that has unfortunately not much to do with reality. Poorly managed, at the end, yes, not in general. Otherwise you're spot-on-wrong.
Poorly managed from the start. Refusing to include 2D rendering hardware and writing their own API instead of using DirectX or OpenGL were decisions made in the early days, and ultimately strongly contributed to their demise.
#53
AusWolf
AssimilatorPoorly managed from the start. Refusing to include 2D rendering hardware and writing their own API instead of using DirectX or OpenGL were decisions made in the early days, and ultimately strongly contributed to their demise.
Yep. S3 also had their own API, and look where they ended up. At least they had 2D and basic DirectX support, though. Universally supported APIs were the way forward, ATi and Nvidia saw that, 3DFX didn't. The modern-day equivalent is owning a graphics card that only does Vulkan (no DirectX or OpenGL), and has to be paired up with something like a GT 710 to put the Windows desktop on screen.
#54
ViperXZ
AssimilatorPoorly managed from the start. Refusing to include 2D rendering hardware and writing their own API instead of using DirectX or OpenGL were decisions made in the early days, and ultimately strongly contributed to their demise.
Doesn't make much sense. If 3DFX had been poorly managed from the start, the company would never have had any success. But they had big success for some time. Again, history doesn't support what you're saying.
#55
Prima.Vera
AssimilatorChiplets are nothing like 3dfx's approach. You are comparing apples to pigs.
No need to be rude.
I wasn't comparing anything here, you totally misunderstood. I was talking about an idea: instead of using one big monolithic chip, you can use multiple smaller chips or chiplets, where, in theory, you can double the performance of a video card just by adding twice as many of them, therefore making high-end or enthusiast cards quite "easy" to manufacture.
#56
ViperXZ
Prima.VeraNo need to be rude.
I wasn't comparing anything here, you totally misunderstood. I was talking about an idea: instead of using one big monolithic chip, you can use multiple smaller chips or chiplets, where, in theory, you can double the performance of a video card just by adding twice as many of them, therefore making high-end or enthusiast cards quite "easy" to manufacture.
Unfortunately, nobody picked up the idea 3DFX had of designing a GPU from the ground up to be used in tandem with multiple others (up to 8!). Though it could also be a technical limitation of more modern GPUs; my money is on that.
#57
Assimilator
AusWolfYep. S3 also had their own API, and look where they ended up. At least they had 2D and basic DirectX support, though. Universally supported APIs were the way forward, ATi and Nvidia saw that, 3DFX didn't. The modern-day equivalent is owning a graphics card that only does Vulkan (no DirectX or OpenGL), and has to be paired up with something like a GT 710 to put the Windows desktop on screen.
And S3 lasted a lot longer than 3dfx.
ViperXZDoesn't make much sense. If 3DFX had been poorly managed from the start, the company would never have had any success.
That's not how anything works. History is littered with flash-in-the-pan companies that were the talk of the town for a handful of years and then faded away - the DeLorean Motor Company is another good example. 3dfx had a good shot at succeeding in the long term, but later poor decisions, compounded by the early, fundamental bad ones, left them no room to correct course.
Prima.VeraI wasn't comparing anything here, you totally misunderstood. I was talking about an idea: instead of using one big monolithic chip, you can use multiple smaller chips or chiplets, where, in theory, you can double the performance of a video card just by adding twice as many of them, therefore making high-end or enthusiast cards quite "easy" to manufacture.
But you can't, because that's not how technology works. You don't just add more hardware and expect more performance, otherwise SLI and CrossFire would have succeeded. Chip design is hard, things have to be balanced, and drivers have to understand how to utilise what's there.
ViperXZUnfortunately, nobody picked up the idea 3DFX had of designing a GPU from the ground up to be used in tandem with multiple others (up to 8!). Though it could also be a technical limitation of more modern GPUs; my money is on that.
Ugh, this old chestnut again.

3dfx weren't magical visionaries decades ahead of everyone else; the simple reason they developed SLI is that their GPUs were no longer competitive, and SLI was a way to make up the performance gap without a fundamental redesign. SLI worked for 3dfx because GPUs were a lot simpler back then, so a multi-GPU implementation could also be simple.

It was an interesting stopgap at the right time, but ultimately it was another bad decision that killed the company - because 3dfx ended up focusing on it as the magic bullet to overcome the fundamental limitations of their architecture, instead of making the necessary design changes to regain competitiveness. Thus the doomed and company-dooming Voodoo 5, because it turns out that when your graphics card needs 4 GPUs to be competitive with a single GPU from NVIDIA or ATI, it makes that graphics card really freaking expensive for the same level of performance - and nobody's going to pay a lot more for the same performance.
#58
Prima.Vera
AssimilatorBut you can't, because that's not how technology works.
Now you are just trolling and refusing to accept anything except your own "truth".
Everything worked and works, including SLI (both 3dfx's and NVIDIA's), NVLink, chiplet designs, data-center GPU farms, etc. What are you even talking about??
Hell, even inside a big GPU there are smaller units or cores that are interconnected. The key is to use different packaging to stack them without adding latency, reducing efficiency, etc. The latest take is the chiplet design AMD is going to use, and there is even a brand-new article about it linked on this news forum.

Btw, you know what's funny? For example, NVIDIA's RTX 4090 has the power of 16,384 Voodoo2 cards. Considering that a Voodoo2 GPU had ~4 million transistors, when you do the math it comes out to almost exactly the transistor count of a modern monolithic GPU...
Wonders of miniaturization.
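A quick sanity check of that math (a minimal sketch in Python; the ~4 million transistors per Voodoo2 chip and the 16,384-card equivalence are the commenter's figures, while the ~76.3 billion for AD102 is the publicly listed count):

```python
# Back-of-the-envelope check of the Voodoo2-to-RTX-4090 transistor comparison.
voodoo2_transistors = 4_000_000       # ~4M per Voodoo2 chip (commenter's figure)
equivalent_cards = 16_384             # claimed Voodoo2-cards-per-RTX-4090 ratio
ad102_transistors = 76_300_000_000    # AD102 (RTX 4090) die, ~76.3B (public spec)

total = voodoo2_transistors * equivalent_cards
print(f"16,384 Voodoo2 chips: {total / 1e9:.1f}B transistors")   # 65.5B
print(f"One AD102:            {ad102_transistors / 1e9:.1f}B")   # 76.3B
print(f"Ratio:                {total / ad102_transistors:.2f}")  # ~0.86
```

So "almost the exact amount" holds to within about 15%, at least on these assumptions.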
#59
ViperXZ
AssimilatorThat's not how anything works. History is littered with flash-in-the-pan companies that were the talk of the town for a handful of years and then faded away - DeLorean Motor Company is another good example.
Comparing 3DFX with DeLorean is just laughable. You're grasping at straws trying to save your lost argument. No, no, 3DFX had huge success for a while and then was mismanaged LATER; this is well known by everyone who really knows what happened back then.

a) 3DFX had huge success initially, for years - not exactly a "flash in the pan". Especially as the IT sector can't be compared to the car sector, which again is laughable. 3DFX was a pioneer of 3D graphics, well respected among people who understood a thing or two about PCs.

b) they decided to buy a factory and produce cards themselves, also denying anyone else the ability to produce cards for them (for the Voodoo 4 and 5 line, which makes it obvious how late this was)

c) as the Voodoo 4 and 5 cards weren't huge successes and they overspent on that endeavor as well, they basically lost the company to bankruptcy
AssimilatorUgh, this old chestnut again.
If you aren't interested in the discussion, then you don't have to partake. But spare us the bored attitude, please. That's aside from the many mistakes you make while being so sure of yourself.
Assimilator3dfx weren't magical visionaries decades ahead of everyone else,
Nobody ever said that. Are you sleeping, or reading this discussion?
Assimilatorthe simple reason they developed SLI is that their GPUs were no longer competitive
Sorry, but that's utter nonsense. The fact is that SLI was their philosophy of doing tech back then, as simple as that, and not whatever you just made up here for the sake of having an argument that you can't win (with history and facts still not agreeing with you). SLI scaled so far that no other company could compete with it; there were arcade machines that used 8 of these chips in SLI together. Calling this "inferior tech" is just ridiculous, aside from the fact that they used this tech for many years, and it had nothing to do with their tech being in distress.
AssimilatorIt was an interesting stopgap at the right time, but ultimately it was another bad decision that killed the company
Not really, but I have already explained what their ultimate failure was.

I will add some more explanation:

- marketing: Nvidia's marketing was strong, same as it is now (even more so - their marketing was very toxic as well), and they convinced everyone that 3DFX cards were inferior because they lacked a few features that mattered zero back then. All while touting their "T&L" horn (among other things), which also wasn't very relevant; a handful of games (if that) supported it.
AssimilatorThus the doomed and company-dooming Voodoo 5, because it turns out that when your graphics card needs 4 GPUs to be competitive with a single GPU from NVIDIA or ATI, it makes that graphics card really freaking expensive for the same level of performance
Too bad that this is utterly wrong as well.

- The Voodoo 5 6000 was never released, so it never played any role in anything; you're basically talking up fantasies that never happened

- The Voodoo 5 6000 was proven to be WAY FASTER than any other card, had it been released, by reviewers who much later happened to get a few of these cards in working shape and review them.

- If a Voodoo 5 6000 had been released with way higher performance than the competition, the top dollar it would've cost would've been worth it. So you're wrong with that assertion as well.

- The Voodoo 5 5500 was fast enough to compete with the competition at the time. 3DFX lost the company due to other reasons, already explained in this post.

Ultimately, if 3DFX hadn't invested in buying up a company to produce their own cards, they wouldn't have gone bankrupt. God knows what would've happened later, as 3DFX was already leaving the SLI route for next-gen and moving back to single-GPU cards with the successor to the "VSA-100" chip and architecture.
#60
Assimilator
ViperXZNo, no, 3DFX had huge success for a while and then was mismanaged LATER; this is well known by everyone who really knows what happened back then.
Assimilator3dfx had a good shot at succeeding in the long term, but later poor decisions, compounded by the early, fundamental bad ones, left them no room to correct course.
ViperXZa) 3DFX had huge success initially, for years - not exactly a "flash in the pan". Especially as the IT sector can't be compared to the car sector, which again is laughable. 3DFX was a pioneer of 3D graphics, well respected among people who understood a thing or two about PCs.

b) they decided to buy a factory and produce cards themselves, also denying anyone else the ability to produce cards for them (for the Voodoo 4 and 5 line, which makes it obvious how late this was)

c) as the Voodoo 4 and 5 cards weren't huge successes and they overspent on that endeavor as well, they basically lost the company to bankruptcy
Do you know how long 3dfx existed for? Less than a decade. And their success came mostly in the first half of said decade. If that isn't a flash in the pan I don't know what is.

And that initial success was solely due to the fact that they had no competition. As soon as NVIDIA and ATI stepped up, 3dfx began to falter.
ViperXZSorry but utter nonsense. The fact is that SLI was their philosophy of doing tech back then, as simple as this, and not whatever you just made up here, for the sake of having an argument that you can't win (due to history and facts not agreeing with you, still). SLI scaled so far, that no other company could compete with it, there were arcade machines that used 8 of these chips in SLI, together. Calling this "inferior tech", is just ridiculous, aside from the fact that they used this tech for many years and has nothing to do with going "SOS" on tech.
No other company competed with it because they didn't need to: those other companies had simpler technology that could accomplish the same performance. It was inferior because it was unnecessary.
ViperXZ- The Voodoo 5 6000 was never released, so never played any role in anything, you're basically talking up fantasies that never happened
It was never released because it doomed the company and a company that no longer exists can't release products.
ViperXZ- The Voodoo 5 6000 was proven to be WAY FASTER than any other card, IF it had been released, by reviewers much later that happened to get a few of these cards in working shape and review them.
Its speed was not commensurate with its price point and it lacked hardware T&L, which despite your earlier claim was incredibly important.
ViperXZ- If a Voodoo 5 6000 would have been released with way higher performance than the competition, the top dollar it would've cost, would've been worth it. So you're wrong with that assertion as well.
See previous point.
ViperXZUltimately, if 3DFX hadn't invested to buy up a company to produce their own cards, they would've not gone bankrupt.
The STB purchase was a manifestly stupid idea in multiple aspects, but ultimately 3dfx simply lacked a strategy to compete with NVIDIA and ATI. That would've doomed them regardless of how much or how little capital they had.
ViperXZGod knows what would've happened later as 3DFX was already leaving the SLI-route for next-gen and inventing single GPU cards again, the successor to the "VSA-100"-chip and architecture.
What happened was that NVIDIA put the 3dfx engineers to work on GeForce, and the result was the GeForce FX series, widely considered to be one of, if not the, worst series of GPUs ever released by the company.
#61
uuee
AssimilatorChiplets are nothing like 3dfx's approach. You are comparing apples to pigs.
In a way, the Voodoo 2 was just as much of a chiplet design as Navi 31, with its 3-chip design (FBI + 2 TMUs).
AusWolfThe modern-day equivalent is owning a graphics card that only does Vulkan
I'd be happy if nV (as the market leader) turned away from DirectX completely, forcing the gaming industry to abandon it too. DX is the reason MS can get away with anything...

well, at least if the chosen API were Vulkan and not some proprietary one like Glide :P
#62
ViperXZ
AssimilatorDo you know how long 3dfx existed for? Less than a decade. And their success came mostly in the first half of said decade. If that isn't a flash in the pan I don't know what is.

And that initial success was solely due to the fact that they had no competition. As soon as NVIDIA and ATI stepped up, 3dfx began to falter.


No other company competed with it because they didn't need to: those other companies had simpler technology that could accomplish the same performance. It was inferior because it was unnecessary.


It was never released because it doomed the company and a company that no longer exists can't release products.


Its speed was not commensurate with its price point and it lacked hardware T&L, which despite your earlier claim was incredibly important.


See previous point.


The STB purchase was a manifestly stupid idea in multiple aspects, but ultimately 3dfx simply lacked a strategy to compete with NVIDIA and ATI. That would've doomed them regardless of how much or how little capital they had.


What happened was that NVIDIA put the 3dfx engineers to work on GeForce, and the result was the GeForce FX series, widely considered to be one of, if not the, worst series of GPUs ever released by the company.
No need to drag this discussion out any longer, as facts are facts, history is history, and there's no need to discuss hyperbole and made-up statements. You can maintain your opinion, but it's only that: an opinion. I'm done with this discussion. Everything important has already been explained and said.
#64
AusWolf
ARFAnd now nvidia has the perfect opportunity to kill AMD, once and for good...
RTX 5090 specs were leaked; rumoured performance uplift is 70% :eek:

videocardz.com/newz/nvidia-geforce-rtx-5090-rumors-2-9-ghz-boost-clock-1-5-tb-s-bandwidth-and-128mb-of-l2-cache

If this is real, and AMD abandons the high end altogether, then it's safe to assume that nvidia has a perfect chance to make the whole AMD lineup obsolete and dead as is...
Are you seriously drawing a conclusion regarding a whole company and a whole lineup of products based only on rumours of a single top-end GPU that 95% of gamers will never care about? :kookoo:

1. If not making the single fastest GPU in the world could kill a company, then AMD would be a name long forgotten by now.

2. Even if AMD decided to stop selling GPUs altogether (which they won't as long as Sony and Microsoft are a thing), that would help you how exactly?
#65
ARF
AusWolfAre you seriously drawing a conclusion regarding a whole company and a whole lineup of products based only on rumours of a single top-end GPU that 95% of gamers will never care about? :kookoo:

1. If not making the single fastest GPU in the world could kill a company, then AMD would be a name long forgotten by now.

2. Even if AMD decided to stop selling GPUs altogether (which they won't as long as Sony and Microsoft are a thing), that would help you how exactly?
Are you seriously not reading what I wrote, and that heavily pro-AMD biased? You must throw away your rose-tinted glasses and see the whole picture! :D :kookoo:
OK, I will break it down for your convenience.
A 70% uplift at the highest end means that more or less the whole RTX 5000 generation will move up by the same ~70%.
When AMD's fastest is still that pathetic garbage RX 7900 XTX, it will line up somewhere closer to the RTX 5050 than the RTX 5090.
Game over for AMD :D
AusWolfAre you seriously drawing a conclusion regarding a whole company and a whole lineup of products based only on rumours of a single top-end GPU that 95% of gamers will never care about?
I'm beginning to think that you are a troll.
The RTX 5090 will have a strong influence on the overall company reputation and will help sell the other parts.
And even if "only" 5% of gamers buy the RTX 5090, it still means that large profit margins and revenue will flow towards nvidia, not towards AMD.
#66
AusWolf
ARFAre you seriously not reading what I wrote, and that heavily pro-AMD biased? You must throw away your rose-tinted glasses and see the whole picture! :D :kookoo:
OK, I will break it down for your convenience.
A 70% uplift at the highest end means that more or less the whole RTX 5000 generation will move up by the same ~70%.
When AMD's fastest is still that pathetic garbage RX 7900 XTX, it will line up somewhere closer to the RTX 5050 than the RTX 5090.
Game over for AMD :D
I'm trying to talk pure business, and you accuse me of bias and trolling. Lol! :roll:

If the 7900 XTX is pure garbage, then so is the RTX 4080 and everything below. If we follow that logic, 0.76% of gamers have a proper video card (4090) according to the Steam survey, and what the rest of us use is garbage. The way you think the one single flagship GPU determines a whole company's success or failure is mind-boggling!

Like I said, it's not over for AMD as long as Sony and Microsoft keep buying their GPUs for their consoles, and as long as they price their cards right.

You also haven't answered the question: how exactly would AMD's demise benefit you? Do you want to pay $3,000 for a 5090 and maybe $4,000 for a 6090?
ARFlarge profit margins and revenue will flow towards nvidia, not towards AMD.
Like it's been the case for decades. And?

Don't forget that AMD is a much smaller company than Nvidia, with a lot less spending. They don't need as much cash to make a profit.

By the way, I'm not talking for AMD. I'm talking against your nonsense.
#67
ARF
AusWolfBy the way, I'm not talking for AMD. I'm talking against your nonsense.
You can't sustain a business with wrong policies, mind-boggling decisions and strategies which make the whole existence of the company disputable and unsustainable.
That bubble will explode sooner or later. Nothing on Earth is eternal, not even businesses :D

I think you have to reread everything and stop for a moment to rethink and reconsider the whole situation.
The RTX 4090 is already 24% faster than the RX 7900 XTX, and this is before we take RT games into account, which makes the gap even wider - 50% faster than the RX 7900 XTX.



Do you know what will happen when you add 70% on top of those 50%?
I will let your imagination work out the answer...
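For what it's worth, those percentages compound multiplicatively rather than add; a minimal sketch using only the figures claimed in this thread (the 24%, 50% and 70% numbers are all rumours or commenter claims, not confirmed specs):

```python
# Compound the claimed RTX 4090 vs RX 7900 XTX gaps with the rumoured
# RTX 5090 uplift. All three inputs are unconfirmed claims from this thread.
raster_gap = 1.24   # 4090 vs 7900 XTX, raster (claimed above)
rt_gap     = 1.50   # same comparison with RT enabled (claimed above)
uplift     = 1.70   # rumoured 5090 gain over the 4090

print(f"Raster: {raster_gap * uplift:.2f}x (~{(raster_gap * uplift - 1) * 100:.0f}% faster)")
print(f"RT:     {rt_gap * uplift:.2f}x (~{(rt_gap * uplift - 1) * 100:.0f}% faster)")
# Raster: 2.11x (~111% faster); RT: 2.55x (~155% faster)
```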
#68
AusWolf
ARFYou can't sustain a business with wrong policies, mind-boggling decisions and strategies which make the whole existence of the company disputable and unsustainable.
That bubble will explode sooner or later. Nothing on Earth is eternal, not even businesses :D
What are you talking about?
ARFI think you have to reread everything and stop for a moment to rethink and reconsider the whole situation.
The RTX 4090 is already 24% faster than the RX 7900 XTX, and this is before we take RT games into account, which makes the gap even wider - 50% faster than the RX 7900 XTX.
So what? The 7900 XTX is nearly £700 cheaper at the moment. These two cards are in entirely different leagues!
ARFDo you know what will happen when you add 70% on top of those 50%?
Yes. Both the 7900 XTX and the 4090 will be the same cards that they are today. Not everyone is obsessed with having the latest and greatest, you know, especially not when it means paying nearly £1,600 for it.
#69
ARF
AusWolfThe 7900 XTX is nearly £700 cheaper at the moment.
Of course it is. All AMD cards are cheaper, and considerably so, because the company's reputation is very weak. You simply can't convince that many people to buy Radeon.
And because Radeon is weak for mining and RT, lacks DLSS, lacks other valued features...
Because GeForce is the famous luxury brand, while Radeon is more like the low-quality, cheap alternative in the eyes of buyers.

The bad thing is that so many years have passed and AMD has done nothing to change this - its marketing must be dismissed as useless.
AusWolfThese two cards are in entirely different leagues!
They are not. They are direct competitors. But since AMD has serious problems with the RDNA 3 architecture, that card is much slower.

Speculation time:

RTX 4060 - 100% for $300
RTX 4060 Ti - +20% for $400-500
RTX 4070 - +30% for $600
RTX 4070 Ti - +25% for $800
RTX 4080 - +30% for $1,200
RTX 4090 - +30% for $1,600

So... if:

RTX 5090 = RTX 4090 +70%
RTX 5080 = RTX 4080 +70%
RTX 5070 Ti = RTX 4070 Ti +70%
RTX 5070 = RTX 4070 +70%
RTX 5060 Ti = RTX 4060 Ti +70%
RTX 5060 = RTX 4060 +70%

then:

RTX 5060 for $300 ~ RX 7800 XT ~ RX 6900 XT
RTX 5060 Ti for $450 ~ RX 7900 XT
RTX 5070 for $600 ~ RX 7900 XTX OCed
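Compounding that ladder shows where the speculation actually lands; a minimal sketch using only the commenter's own hypothetical uplifts (none of these are confirmed figures):

```python
# Build a relative-performance ladder from the speculated generational steps.
# Baseline: RTX 4060 = 100. Every number here is the commenter's speculation.
ladder = [
    ("RTX 4060",    1.00),  # baseline
    ("RTX 4060 Ti", 1.20),  # +20% over the 4060
    ("RTX 4070",    1.30),  # +30% over the 4060 Ti
    ("RTX 4070 Ti", 1.25),  # +25% over the 4070
    ("RTX 4080",    1.30),  # +30% over the 4070 Ti
    ("RTX 4090",    1.30),  # +30% over the 4080
]

score = 100.0
for sku, step in ladder:
    score *= step
    successor = sku.replace("RTX 4", "RTX 5")  # hypothetical 50-series SKU
    print(f"{sku:<12}{score:7.1f}   {successor:<12}{score * 1.7:7.1f}")
```

On these numbers a hypothetical "RTX 5070" lands at ~265, which is roughly where the earlier 24%-slower-than-the-4090 claim puts the RX 7900 XTX (329.6 / 1.24 ≈ 266) - hence the pairing in the last line above.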
#70
AusWolf
ARFOf course it is. All AMD cards are cheaper, and considerably so, because the company's reputation is very weak. You simply can't convince that many people to buy Radeon.
And because Radeon is weak for mining and RT, lacks DLSS, lacks other valued features...
Because GeForce is the famous luxury brand, while Radeon is more like the low-quality, cheap alternative in the eyes of buyers.
Cheap alternative? Yes. Low quality? Lol! :laugh:

Enjoy looking smug in your $35k BMW 5-series and all of its "extras and luxuries" cruising down the highway while my $15k Fiesta ST takes me to my destination just as quickly with just as much fun. ;)

The Radeon brand is weaker than GeForce only because of Nvidia's strong marketing campaign around its "luxuries". People believe that you need RT and DLSS and DL-whatnot to play games because that's what they're told by the media, when in fact, you don't.
ARFThey are not. They are direct competitors. But since AMD has serious problems with the RDNA 3 architecture, that card is much slower.
A $900 GPU directly competing with a $1600 one? What drugs have you been taking? Or is the 7600 directly competing with the 4070 Ti now? And the 6400 with the 3060 Ti? :roll: