
NVIDIA GeForce RTX 4060 Possibly Maxes Out AD107, NVIDIA's Smallest Ada Silicon

Possibly 3070 performance using 1/3rd the power.
 
4060 le super ti extreme for 600e incoming, brace ya selfs :D
 
Degusting nGreedia.
Childish name calling aside, what exactly are they carefully tasting to appreciate fully? Given the name calling, I'ma go with profits? :roll:
you have a card that's guaranteed to be limited.
Anything not on a fully enabled AD102 was always going to be limited, certainly something this low in the stack. Beyond architectural deep dives and academics, I don't really care that much about how big the bus is and so on, what matters is the resulting performance of given workloads. I'll gladly crucify cards where the price is out of line with that performance, but I won't crucify innovative and efficient design work.
 
I don't know what's worse, that a 1440p graphics card is $800 in 2023 or that you have to spend $1,200 to get one for 4K.

But does the $1,200 give you a 4K-capable card? In a "game of the year" you don't even achieve a 60 FPS average with RT OFF!


[chart: 4K (3840×2160) average FPS results]


The 1% low is 45 FPS! And if you turn on RT, it's a slideshow: 25 FPS, with 17 FPS 1% lows!

I know, DLSS, frame doubling to the rescue... But still, ouch.
 
Anything not on a fully enabled GA102
Uh, GA102 is Ampere.

Not quite sure what Ada's codename is, but that can't be it, because they used it.

But does the $1,200 give you a 4K-capable card? In a "game of the year" you don't even achieve a 60 FPS average with RT OFF!


[chart: 4K (3840×2160) average FPS results]


The 1% low is 45 FPS! And if you turn on RT, it's a slideshow: 25 FPS, with 17 FPS 1% lows!

I know, DLSS, frame doubling to the rescue... But still, ouch.
Yes, nothing runs Hogwarts Legacy well. Game performance sucks on PC, period.
 
But does the $1,200 give you a 4K-capable card? In a "game of the year" you don't even achieve a 60 FPS average with RT OFF!


[chart: 4K (3840×2160) average FPS results]


The 1% low is 45 FPS! And if you turn on RT, it's a slideshow: 25 FPS, with 17 FPS 1% lows!

I know, DLSS, frame doubling to the rescue... But still, ouch.
Is that on Nvidia or Avalanche Software?
 
Uh, GA102 is Ampere.

Not quite sure what Ada's codename is, but that can't be it, because they used it.
Indeed my oversight, it's AD102 and I corrected my post.
 
Indeed my oversight, it's AD102 and I corrected my post.
Darn, and here I was thinking I had the only unlimited card here...

Oh wait I do. Too bad it still performs like a 4070ti lol.
 
Does it matter? I have friends who are upgrading their old PCs so they can play this game at 1080p with RT off.
Not exactly sure how a crappily coded game fits into this thread.
 
That being said, $300, the best I can do.
 
Not exactly sure how a crappily coded game fits into this thread.

It is way off topic. It was just a debate on how $1,200 does not guarantee 4K gaming now.

And surely graphics cards are bought just to play games (right now, when cryptomining is far from profitable), and a badly coded game is right now flying off the shelves like we haven't seen since... the previous badly coded game, Cyberpunk 2077. :p
 
I bet Nvidia will force board makers to use the 12VHPWR connector even on these sub-200W cards.
In that case, they lost me on this segment too.
 
Does it matter? I have friends who are upgrading their old PCs so they can play this game at 1080p with RT off.
Nvidia will release new drivers, it will work better after that, as usual. It always does. Makes you wonder if they really optimize everything or just decrease some details ;)
 
In that case, they lost me on this segment too.
Really? Because of a connector?

What if AMD migrates to it in the future? You're gonna buy Intel, I guess.
 
Really? Because of a connector?

What if AMD migrates to it in the future? You're gonna buy Intel, I guess.
Keyword: the future. Where I'm not forced into silly adapters or buying a new PSU for a GPU that doesn't even remotely require it.

There is no reason whatsoever to put a 115W card on anything other than a PCIe 6-pin right now.
These transitions are entirely pointless for me. They only cost me money and add hassle. Why pay for it? I have a perfectly capable quality PSU.
 
Possibly 3070 performance using 1/3rd the power.
And given how greedy Nvidia is, probably at the 3070's price too. It really sucks how stagnant the GPU market has become for anything even remotely affordable.
It is way off topic. It was just a debate on how $1,200 does not guarantee 4K gaming now.

And surely graphics cards are bought just to play games (right now, when cryptomining is far from profitable), and a badly coded game is right now flying off the shelves like we haven't seen since... the previous badly coded game, Cyberpunk 2077. :p
Even if the game were optimized, the Nvidia fans would just complain it's an "AMD optimized" title. IMO, the 4070 Ti being a 192-bit card at $800 for 1440p gaming is unacceptable.
 
Keyword: the future. Where I'm not forced into silly adapters or buying a new PSU for a GPU that doesn't even remotely require it.

There is no reason whatsoever to put a 115W card on anything other than a PCIe 6-pin right now.
These transitions are entirely pointless for me. They only cost me money and add hassle. Why pay for it? I have a perfectly capable quality PSU.
You're like my dad who refuses to learn anything, and still uses a Nokia 3310.
 
You're like my dad who refuses to learn anything, and still uses a Nokia 3310.
No, I'm weighing pros and cons and not seeing any need or reason to upgrade into something that's an offensively bad deal all around.

Other than that, I work in IT, learning is a daily exercise. As is filtering what's required and what's not. I'm pretty good at it too.

I mean really... upgrading now, for what? Some RT effects that kill performance, diving deep into Nvidia's proprietary hellhole at a staggering premium, having to look at a spiderweb adapter in my case, and overall a whole bunch of meagre triple-A nonsense titles that fill exactly no niche whatsoever for me; it's all more of the same. And whatever is not more of the same and/or is truly a decent, finished game still runs damn fine on a 2016 card - which is obvious, because the majority of the market is still on that level of performance and devs cater to it.

At the same time, I'm seeing new titles with terrible performance from lacking optimization coincide with the latest GPU releases. I've been here before, and the story's gotten old for me. I'm done chasing that so-called cutting-edge commercial clusterfuck :) It has nothing on offer except frustration, engineered to create new demand so you can keep buying.

Yet I still game 3-5 hours per day.
 
And given how greedy Nvidia is, probably at the 3070's price too. It really sucks how stagnant the GPU market has become for anything even remotely affordable.

Even if the game were optimized, the Nvidia fans would just complain it's an "AMD optimized" title. IMO, the 4070 Ti being a 192-bit card at $800 for 1440p gaming is unacceptable.
What sub $800 card would you recommend for gaming at 1440P?
 
Really? Because of a connector?

What if AMD migrates to it in the future? You're gonna buy Intel, I guess.

That connector really is kind of crap. Maybe they can improve it with iteration, but it should have had higher safety tolerances to begin with.
 
An xx107-class GPU is basically bottom of the barrel. Going back to the 1000 series, the 1060 had 1/3rd of the shaders of the top-end GP102; this 4060 would now have 1/6th of the shaders of a fully enabled AD102. An insane downgrade in the product stack over the years.
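To put rough numbers on those ratios (CUDA core counts taken from public spec listings; the RTX 4060 figure is an assumption based on the rumored full AD107, not confirmed):

```python
# Shader-count comparison: flagship die vs. the x60-class card of the
# same generation. Core counts are assumed from public spec listings;
# the RTX 4060 / full-AD107 figure was only a rumor at the time.
full_dies = {
    "GP102 (Pascal)": 3840,   # fully enabled GP102
    "AD102 (Ada)": 18432,     # fully enabled AD102
}
x60_cards = {
    "GTX 1060": 1280,            # GP106-based
    "RTX 4060 (rumored)": 3072,  # full AD107, rumored
}

for (die, die_cores), (card, card_cores) in zip(full_dies.items(), x60_cards.items()):
    # e.g. "GTX 1060: 1280/3840 = 1/3 of GP102 (Pascal)"
    print(f"{card}: {card_cores}/{die_cores} = 1/{die_cores // card_cores} of {die}")
```

So by shader share alone, the x60 tier would have slipped from one third of the flagship die to one sixth, if the rumored counts hold.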

Possibly 3070 performance using 1/3rd the power.
This would categorically be slower than a 3070 while using maybe 60-70% of the power.

What if AMD migrates to it in the future?
Then I wouldn't have a choice, but that doesn't mean the change isn't idiotic.
 
The RTX 4060's 115W mobile version has apparently been tested, and performance lands between the desktop 3060 and 3060 Ti. Unless there's some driver optimization surprise when the 4060 is out, it doesn't look good at all.

Well, unless NVIDIA prices it accordingly, but we all know NVIDIA is NVIDIA.
 