Wednesday, June 21st 2023

Geekbench Leak Suggests NVIDIA GeForce RTX 4060 Nearly 20% Faster than RTX 3060

NVIDIA is launching its lower-end GeForce RTX 4060 graphics card series next week, but has kept schtum about the smaller Ada Lovelace AD107 GPU's performance level. This more budget-friendly offering (MSRP $299) is rumored to have 3,072 CUDA cores, 24 RT cores, 96 Tensor cores, 96 TMUs, and 32 ROPs. It will likely sport 8 GB of GDDR6 memory across a 128-bit wide memory bus. BenchLeaks has discovered the first set of test results via a database leak, and posted these details on social media earlier today. Two Geekbench 6 runs were conducted on a test system comprising an Intel Core i5-13600K CPU, an ASUS Z790 ROG APEX motherboard, DDR5-6000 memory, and the aforementioned GeForce card.
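For context on that 128-bit bus, peak memory bandwidth follows directly from bus width and data rate. Below is a minimal back-of-the-envelope sketch in Python; the 17 Gbps effective GDDR6 data rate is an assumption taken from the rumor mill, not something confirmed by the leak.

# Peak memory bandwidth (GB/s) = bus width in bytes * effective data rate (Gbps)
bus_width_bits = 128   # rumored RTX 4060 memory bus width
data_rate_gbps = 17    # assumed effective GDDR6 rate; not part of the leak
bandwidth_gb_s = (bus_width_bits / 8) * data_rate_gbps
print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 272 GB/s under these assumptions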

The GPU Compute test utilizing the Vulkan API resulted in a score of 99,419, and another using OpenCL achieved 105,630. We are looking at a single sample here, so expect variations when other units get tested in Geekbench prior to the June 29 launch. The RTX 4060 is about 12% faster (in Vulkan) than its direct predecessor, the RTX 3060. The gap widens in OpenCL, where it offers an almost 20% jump over the older card. The RTX 3060 Ti is around 3-5% faster than the RTX 4060. We hope to see actual in-game benchmarking carried out soon.
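As a quick sanity check on those percentages, here is a small Python sketch; the RTX 3060 baseline scores (roughly 88,800 in Vulkan and 88,000 in OpenCL) are assumed figures chosen for illustration, not numbers from this leak.

# Relative uplift = (new_score / old_score - 1) * 100
rtx_4060 = {"Vulkan": 99419, "OpenCL": 105630}   # leaked Geekbench 6 scores
rtx_3060 = {"Vulkan": 88800, "OpenCL": 88000}    # assumed RTX 3060 baselines (illustrative)
for api, new_score in rtx_4060.items():
    old_score = rtx_3060[api]
    uplift = (new_score / old_score - 1) * 100
    print(f"{api}: {uplift:.1f}% faster")   # ~12% Vulkan, ~20% OpenCL under these assumed baselines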
Sources: Tom's Hardware, VideoCardz, BenchLeaks Tweet

30 Comments on Geekbench Leak Suggests NVIDIA GeForce RTX 4060 Nearly 20% Faster than RTX 3060

#1
Vayra86
That's... hilariously bad
Posted on Reply
#2
Dirt Chip
The 4060x is the greatest side-grade of all time. yet.
Posted on Reply
#3
oxrufiioxo
Vayra86: That's... hilariously bad
The whole generation has been bad (both AMD/NVIDIA) from a naming/pricing perspective. In a vacuum this card is probably fine; the problem is that anyone who doesn't care about upscaling is better off with a 6700 XT.

Performance isn't the worst part; NVIDIA's decision to reduce the VRAM amount is just stupid. This same card with 12GB, while not overly exciting, would have been a much better buy.
Posted on Reply
#4
fancucker
Despite the usual reservations, this effectively negates the RX 7600; it will handily trounce it in RT and provide the superior upscaling implementation, DLSS, to boot. You also need to take the price of TSMC's N5 process into consideration.
Posted on Reply
#5
Denver
5% faster than the RX 7600, which is already at around $250!
It would be nice if those mid-range GPUs under $300 had at least 10GB of VRAM...
Posted on Reply
#6
fancucker
oxrufiioxo: The whole generation has been bad (both AMD/NVIDIA) from a naming/pricing perspective. In a vacuum this card is probably fine; the problem is that anyone who doesn't care about upscaling is better off with a 6700 XT.

Performance isn't the worst part; NVIDIA's decision to reduce the VRAM amount is just stupid. This same card with 12GB, while not overly exciting, would have been a much better buy.
For the millionth time, VRAM allocation != VRAM usage; several games allocate the full complement of memory without using it. And memory is at a premium in terms of manufacturing cost. The only plausible scenario in which it would be an issue is extreme 1440p/high 4K use, which would be idiotic in itself because this isn't the designated card for that. Like Huang says, the VRAM is effectively a cache, and AMD's desperate attempts to compensate for its languishing Infinity Cache do little to sway consumers.
Posted on Reply
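One reason the allocation-versus-usage argument never resolves is that driver-level tools only report how much VRAM has been reserved, not how much of it a game actively touches. A minimal sketch of what can be observed from outside a game, assuming the pynvml package is installed (the numbers it returns reflect allocations, not the true working set):

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # driver-reported memory figures
print(f"Total: {mem.total / 2**30:.1f} GiB, "
      f"allocated: {mem.used / 2**30:.1f} GiB, "
      f"free: {mem.free / 2**30:.1f} GiB")
pynvml.nvmlShutdown()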
#7
GhostRyder
fancucker: Despite the usual reservations, this effectively negates the RX 7600; it will handily trounce it in RT and provide the superior upscaling implementation, DLSS, to boot. You also need to take the price of TSMC's N5 process into consideration.
At this performance tier, are you really turning on ray tracing? I mean seriously, ray tracing is still only really usable on the higher-end NVIDIA offerings unless you enjoy sub-30 FPS gaming. Yes, DLSS is probably a nice helper and probably significantly more important on this card; however, even that has its drawbacks and should only be considered a bonus, not something to be relied on.

Besides, until we see actual game tests, Geekbench is to be taken with a grain of salt on the performance side. It will be interesting to see how this thing runs in the wild at the estimated price point (if it's true) of $300.
Posted on Reply
#8
R0H1T
Dirt Chip: The 4060x is the greatest side-grade of all time. yet.
Wait till you see 5060 Super Ti Se uber fans suckers limited edition :nutkick:
Posted on Reply
#9
N/A
50% faster than a 3060 8GB 128 bit.
Posted on Reply
#10
TheinsanegamerN
fancucker: For the millionth time, VRAM allocation != VRAM usage; several games allocate the full complement of memory without using it. And memory is at a premium in terms of manufacturing cost. The only plausible scenario in which it would be an issue is extreme 1440p/high 4K use, which would be idiotic in itself because this isn't the designated card for that. Like Huang says, the VRAM is effectively a cache, and AMD's desperate attempts to compensate for its languishing Infinity Cache do little to sway consumers.
For the millionth time, we have demonstrable proof of games today suffering far lower 1% lows, suffering from texture load issues, and using demonstrably lower settings on 8GB cards.

You can scream until you are blue in the face about utilization and how 8GB is still plenty, but once you are done clutching that pearl you will realize the day of the 8GB GPU is long over, much like 4GB cards before them, and no $150+ GPU should have that kind of frame buffer. FFS the 580 had a standard 8GB framebuffer in 2017. 6 years ago.

Any company that pushes 8 gb in 2023 is greedy beyond belief. Period.
Posted on Reply
#11
oxrufiioxo
TheinsanegamerN: For the millionth time, we have demonstrable proof of games today suffering far lower 1% lows, suffering from texture load issues, and using demonstrably lower settings on 8GB cards.

You can scream until you are blue in the face about utilization and how 8GB is still plenty, but once you are done clutching that pearl you will realize the day of the 8GB GPU is long over, much like 4GB cards before them, and no $150+ GPU should have that kind of frame buffer. FFS the 580 had a standard 8GB framebuffer in 2017. 6 years ago.

Any company that pushes 8 gb in 2023 is greedy beyond belief. Period.
Honestly, I have no issue with sub-$300 cards having 8GB; my issue is the very meager generational improvement and the fact that someone can get a similarly priced 12GB GPU with better rasterization performance. My argument that it should have come with 12GB is simply that it would be more appealing. This seems like an overall step back from its predecessor, which not only offered 20% more performance at launch vs the 2060 but also included 100% more VRAM while technically being cheaper, launch MSRP vs launch MSRP.

While I think people should not be buying 8GB cards in 2023, if someone loves running games at 1080p medium and just cares about FPS, then 8GB is fine.
Posted on Reply
#12
sethmatrix7
fancucker: Despite the usual reservations, this effectively negates the RX 7600; it will handily trounce it in RT and provide the superior upscaling implementation, DLSS, to boot. You also need to take the price of TSMC's N5 process into consideration.
The 7600 is nearly $50 cheaper and there are definitely people that don’t care about RT in a handful of games or fake frames.
Posted on Reply
#13
dlgh7
fancucker: Despite the usual reservations, this effectively negates the RX 7600; it will handily trounce it in RT and provide the superior upscaling implementation, DLSS, to boot. You also need to take the price of TSMC's N5 process into consideration.
DLSS is only really beneficial at high frame rates. If a game has an issue and doesn't process a graphic right, like a texture not loading (which happened a lot in 4060 Ti testing), then even if DLSS gives you more fake frames, it can't fix the textures not loading, so it still looks like garbage. Also, who cares about RT on a card like this? It is basically useless.

The memory bandwidth is such an issue, and the fact that card manufacturers, both last gen and this one, seem to be removing features valuable in Linux has not been helpful at all. In fact, some of the devs for different emulators have come out and said the last-gen cards trounce this gen's cards for emulation, because the memory bandwidth is that bad unless you go up to the 4080 or higher.

And to the guy that said 'under $300 card': this is a $300 card. $299 is $300; don't care what anyone says. 99 percent of people pay sales tax, so it is going to be over $300 regardless. $279 would be under $300, but in some states tax would still push it over.
Posted on Reply
#15
Guwapo77
oxrufiioxo: Honestly, I have no issue with sub-$300 cards having 8GB; my issue is the very meager generational improvement and the fact that someone can get a similarly priced 12GB GPU with better rasterization performance. My argument that it should have come with 12GB is simply that it would be more appealing. This seems like an overall step back from its predecessor, which not only offered 20% more performance at launch vs the 2060 but also included 100% more VRAM while technically being cheaper, launch MSRP vs launch MSRP.

While I think people should not be buying 8GB cards in 2023, if someone loves running games at 1080p medium and just cares about FPS, then 8GB is fine.
You have no issue with 8GB, but at the same time nobody should buy 8GB GPUs in 2023... Then why make them? 12GB should be the minimum for any gaming GPU. If it says GeForce, it should be at least 12GB; same goes for Radeon.
GhostRyder: At this performance tier, are you really turning on ray tracing? I mean seriously, ray tracing is still only really usable on the higher-end NVIDIA offerings unless you enjoy sub-30 FPS gaming. Yes, DLSS is probably a nice helper and probably significantly more important on this card; however, even that has its drawbacks and should only be considered a bonus, not something to be relied on.

Besides, until we see actual game tests, Geekbench is to be taken with a grain of salt on the performance side. It will be interesting to see how this thing runs in the wild at the estimated price point (if it's true) of $300.
It does support DLSS 3. I believe RT should be a viable option at 1080p.
Posted on Reply
#16
kapone32
Guwapo77: It does support DLSS 3. I believe RT should be a viable option at 1080p.
RT at 1080P is purportedly not that good.
Posted on Reply
#17
kiakk
At least it's a VGA with a max 120 W TDP.
In my view, it's not just the naming/pricing that is hilarious, but even more so the TDP creep of recent years. This 4060 steps back to the 1060's 120 W TDP level. The only problem is that it should more likely be a 4050/4050 Ti rather than a 4060. I am still not satisfied, and overall disappointed with the firms' hypocrisy of sacrificing efficiency on the altar of performance.
Posted on Reply
#18
Broken Processor
Meh, 4050 silicon at 4060 prices, and less bandwidth, so less benefit for the 16 GB variety when it turns up.
DLSS can't save this garbage; last gen at this money is the way to go unless you blindly believe the marketing bollox.
Posted on Reply
#19
Vayra86
TheinsanegamerN: For the millionth time, we have demonstrable proof of games today suffering far lower 1% lows, suffering from texture load issues, and using demonstrably lower settings on 8GB cards.

You can scream until you are blue in the face about utilization and how 8GB is still plenty, but once you are done clutching that pearl you will realize the day of the 8GB GPU is long over, much like 4GB cards before them, and no $150+ GPU should have that kind of frame buffer. FFS the 580 had a standard 8GB framebuffer in 2017. 6 years ago.

Any company that pushes 8 gb in 2023 is greedy beyond belief. Period.
sethmatrix7: The 7600 is nearly $50 cheaper and there are definitely people that don’t care about RT in a handful of games or fake frames.
dlgh7: DLSS is only really beneficial at high frame rates. If a game has an issue and doesn't process a graphic right, like a texture not loading (which happened a lot in 4060 Ti testing), then even if DLSS gives you more fake frames, it can't fix the textures not loading, so it still looks like garbage. Also, who cares about RT on a card like this? It is basically useless.

The memory bandwidth is such an issue, and the fact that card manufacturers, both last gen and this one, seem to be removing features valuable in Linux has not been helpful at all. In fact, some of the devs for different emulators have come out and said the last-gen cards trounce this gen's cards for emulation, because the memory bandwidth is that bad unless you go up to the 4080 or higher.

And to the guy that said 'under $300 card': this is a $300 card. $299 is $300; don't care what anyone says. 99 percent of people pay sales tax, so it is going to be over $300 regardless. $279 would be under $300, but in some states tax would still push it over.
You could also just not feed the troll.

There is an ignore button.
Posted on Reply
#20
Bomby569
It's hilarious that it doesn't even surpass the 3060 Ti, but it would also be hilarious if it did. I guess it makes sense.
Posted on Reply
#21
GhostRyder
Guwapo77: You have no issue with 8GB, but at the same time nobody should buy 8GB GPUs in 2023... Then why make them? 12GB should be the minimum for any gaming GPU. If it says GeForce, it should be at least 12GB; same goes for Radeon.

It does support DLSS 3. I believe RT should be a viable option at 1080p.
Even if it does, the performance hit is still going to be way too high to justify. Plus, having to mix in everything just to get it to be viable is not a winning endorsement. The overwhelming majority who are going to buy this card care about keeping everything at 60+ FPS, and turning on RT, even with DLSS, is going to make that very difficult even at 1080p.

I am just saying we need to be real when we talk about these cards. Ray tracing is cool, but it's still only the 4070 Ti and higher that can make it somewhat usable in the few games that truly support it. We should not be trying to sell it as a feature that pushes the lower-end cards over the edge, as the performance degradation even with DLSS is just too high, and the visual difference is not noticeable enough at 1080p.
Posted on Reply
#22
oxrufiioxo
Guwapo77: You have no issue with 8GB, but at the same time nobody should buy 8GB GPUs in 2023... Then why make them? 12GB should be the minimum for any gaming GPU. If it says GeForce, it should be at least 12GB; same goes for Radeon.
Again, everyone has different use cases, so even though I don't think buying an 8GB card in 2023 is a smart decision, that doesn't mean someone who only plays games at competitive settings cares. I'm sure there are plenty of gamers who just run the lowest settings in esports-style games, don't care about more than 8GB, and only care about the cost. Also, anyone who doesn't play AAA games and is more into light indie gaming likely doesn't care either.

And while I think 8GB cards should be in the sub-$200 range, the reality is that both AMD and NVIDIA think $250-300 is the actual market for them (well, with NVIDIA going up to $400, lol). The 4060 Ti really should be DOA; hopefully nobody buys that shite.
Posted on Reply
#23
dlgh7
The most pathetic thing is that NVIDIA is trying to nickel-and-dime everyone, and AMD a bit as well: 16GB versions, Ti versions, AMD's XT and XTX, though AMD hasn't really revealed their midrange yet. I remember buying my GTX 970 for like $350, but it was a more expensive ASUS Strix model that I purchased because it only needed a single power connector instead of two like most 970s. Now we have what is basically a 4050 being sold as a 4060, and the jump to a 4070 isn't $100 or $200, but $300, literally doubling the price of the card. That is insane.

My belief is that NVIDIA, at least, was so enamored with mining sales that they developed the 4000 series for miners and not gamers. They figured they would barely improve performance, but do it in a way that drew less power and was more energy efficient, so people would double dip and upgrade based on power draw rather than any actual increase in performance. Ignoring the gamers while trying to spin DLSS and all this software garbage that doesn't matter to over 90 percent of PC gamers.
Posted on Reply
#24
SSGBryan
This makes my decision to purchase an A770 look even better than when I ordered it.
Posted on Reply
#25
tfdsaf
Where did you get the 20% faster? In the actual Vulkan bench, which more closely resembles a gaming load, it is only 12% faster, and we know Geekbench is a shitty benchmark anyway!

My informed guess is that it's probably 10-14% faster on average at 1080p and actually loses to the 3060 12GB at 1440p and 4K.
Posted on Reply