
NVIDIA Announces Financial Results for First Quarter Fiscal 2026

I feel a lot of this revenue is derived from panic buying ahead of tariffs and the tightening of export controls.
Here's a breakdown of NVIDIA's gaming revenue over the last six quarters:
Quarter Ending | Gaming Revenue (USD) | Quarter-over-Quarter Change
Jan 2025 | $2.544 billion | -22.4%
Oct 2024 | $3.279 billion | +13.8%
Jul 2024 | $2.88 billion | +8.8%
Apr 2024 | $2.647 billion | -7.6%
Jan 2024 | $2.865 billion | +0.3%
Oct 2023 | $2.856 billion | +14.9%
These figures indicate fluctuations in NVIDIA's gaming revenue, with notable growth in mid-2024 followed by a decline in early 2025. The dip in Q1 2025 aligns with broader industry trends and potential market saturation. Despite these fluctuations, NVIDIA's gaming segment remains a significant contributor to its overall revenue.
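For reference, the quarter-over-quarter change is simply each quarter's revenue divided by the previous quarter's, minus one. Here's a minimal sketch in Python that recomputes the changes from the rounded figures above (the Oct 2023 change can't be recomputed because the prior quarter isn't listed, and small rounding differences are expected):

Code:
# Recompute quarter-over-quarter change from the gaming revenue figures above (USD billions).
revenue = [
    ("Oct 2023", 2.856),
    ("Jan 2024", 2.865),
    ("Apr 2024", 2.647),
    ("Jul 2024", 2.880),
    ("Oct 2024", 3.279),
    ("Jan 2025", 2.544),
]

for (prev_quarter, prev_rev), (quarter, rev) in zip(revenue, revenue[1:]):
    change = (rev / prev_rev - 1) * 100
    print(f"{quarter}: {change:+.1f}% vs {prev_quarter}")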
I suspect you are right.

For those saying Nvidia will stop making gaming GPUs: that won't happen. They got where they are by getting GeForce GPUs into the hands of most people and letting those people jump into CUDA. The GeForce lineup is pretty much a gateway drug: people get into GPGPU with CUDA on it, and when they land a high-level position, they work with what they're used to.
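To illustrate how low that barrier to entry is, here's a minimal sketch of the kind of GPGPU "hello world" any CUDA-capable GeForce card can run. It assumes a PyTorch build with CUDA support, which is just one of several possible entry points:

Code:
# Minimal GPGPU sketch: the same CUDA stack used on datacenter GPUs runs on any CUDA-capable GeForce.
# Assumes PyTorch was installed with CUDA support; other entry points (Numba, CuPy, raw CUDA C) work too.
import torch

if torch.cuda.is_available():
    device = torch.device("cuda")
    print("Running on:", torch.cuda.get_device_name(device))
    a = torch.randn(4096, 4096, device=device)
    b = torch.randn(4096, 4096, device=device)
    c = a @ b  # matrix multiply dispatched to CUDA kernels on the GPU
    print("Checksum:", c.sum().item())
else:
    print("No CUDA-capable GPU found.")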


Instinct is still far behind Nvidia in most cases, sadly. And so is the 7900 XTX, albeit the latter can now match a 3090 in many more scenarios.


Those Instinct cards are more cost-effective if you only account for the raw acquisition cost, but this easily turns around once you factor in the engineering time required to get things working on their platform.
For really large-scale deployments, once you've managed to make it work properly, the engineering cost gets diluted over time. But then comes the issue that Instinct products don't scale that well in multi-node configs.
The MI300 series is good at inference but not so great at training; this changes significantly with the MI355X, which is why Oracle has committed to buying large numbers of them.
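To illustrate the amortization point above, here's a minimal sketch with purely hypothetical numbers (not real pricing for either vendor): the one-off engineering effort dominates small deployments and nearly disappears per GPU at large scale.

Code:
# How a fixed engineering/bring-up cost dilutes with fleet size.
# All figures are hypothetical placeholders, not actual pricing for AMD or NVIDIA hardware.
def effective_cost_per_gpu(gpu_price, engineering_cost, num_gpus):
    return gpu_price + engineering_cost / num_gpus

GPU_PRICE = 20_000            # per-accelerator acquisition cost (hypothetical)
ENGINEERING_COST = 2_000_000  # one-off platform bring-up effort (hypothetical)

for fleet_size in (64, 1_024, 16_384):
    per_gpu = effective_cost_per_gpu(GPU_PRICE, ENGINEERING_COST, fleet_size)
    print(f"{fleet_size:>6} GPUs -> ${per_gpu:,.0f} effective cost per GPU")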
 
Should I dismiss any criticism you make of AMD, even if it's valid, because in this conversation you've shown no good faith in your arguments and some were arguably false?
Yes, if you feel it's not in good faith, throw it in the garbage.
 
Does the gaming segment include the Switch 2 Tegra chips that Nintendo bought?
 
Yes, if you feel it's not in good faith, throw it in the garbage.
Are you really telling me that what matters is intention and not veracity?

Just because someone has other motives behind their comment doesn't make it less true.

Can't you see that kind of mentality helps perpetuate the same thing you criticised?
 
Are you really telling me that what matters is intention and not veracity?
No, I'm telling you veracity cannot be determined without knowing the intent. For example: how would I know whether the users complaining about a 5060 not being able to do RT actually care about RT, or whether they're just complaining because it's Ngreedia? If I see that those same users have been buying RDNA 2 and 3 cards that do RT even worse, then obviously the 5060 not being able to do RT isn't a serious issue; they're just complaining because it's an Nvidia GPU.

How would I know whether a card only having 8 GB is actually an issue, or whether people are again just complaining because it's Nvidia? Because, see, while Nvidia released 12 GB GPUs at the xx60 tier five years ago, the same people crying about VRAM nowadays recommended the 8 GB 6600 XT over the 3060. So they don't really care about VRAM and it's not an important issue to them; they're just complaining because it's Nvidia.

So yeah, intent definitely matters.
 
Can the 5060 do RT properly when we have RT exclusive games?

During the Ampere gen, wasn't the street price of the 6600 XT lower than the 3060's, with better average performance?

Did the 8gb of the 6600xt hit any major bottlenecks during its gen?

All these are valid observations for me.

Are you really buying a 5060 with RT in mind?

Are the extra productivity and the future-proofing of your VRAM not worth the worse gaming value at current prices?

All these are valid observations for me.

You can have valid arguments on both sides; you need to see whether they apply to your personal use case.

The 5060 Ti is great value.
The 9060s aren't released yet, so you should just buy Nvidia even if you're not in urgent need of a card.
These aren't valid in my opinion: the first is false and the second is the opposite of good advice.

Not saying this is what you meant, but someone looking at your post could interpret it as such and be misled.
 
Can the 5060 do RT properly when we have RT exclusive games?

During the Ampere gen, wasn't the street price of the 6600 XT lower than the 3060's, with better average performance?

Did the 8gb of the 6600xt hit any major bottlenecks during its gen?

All these are valid observations for me.

1. ...yes, it can. With the right settings and a healthy enough dose of DLSS, you can run even games like Metro Exodus on a laptop RTX 3050 with 4 GB of VRAM without significant visual degradation (a la Forspoken or Horizon Forbidden West, games known to crash out on low-VRAM cards and look like PS2 games).
2. The 6600 XT was the intended competitor to the RTX 3060 Ti and was slower than it in almost every benchmark. This has remained true with the 6650 XT refresh; nowadays, the 3060 Ti tends to edge out the RX 7600 XT, according to the latest review data (W1zzard's RTX 5060 review as a reference). It's noteworthy that the RX 7600 XT is a 16 GB GPU, so the situation in point 1 may or may not apply, depending on the game.
3. No.
 
Can the 5060 do RT properly when we have RT exclusive games?

During the Ampere gen, wasn't the street price of the 6600 XT lower than the 3060's, with better average performance?

Did the 8gb of the 6600xt hit any major bottlenecks during its gen?

All these are valid observations for me.

Are you really buying a 5060 with RT in mind?

Are the extra productivity and the future-proofing of your VRAM not worth the worse gaming value at current prices?

All these are valid observations for me.

You can have valid arguments on both sides; you need to see whether they apply to your personal use case.

The 5060 Ti is great value.
The 9060s aren't released yet, so you should just buy Nvidia even if you're not in urgent need of a card.
These aren't valid in my opinion: the first is false and the second is the opposite of good advice.

Not saying this is what you meant, but someone looking at your post could interpret it as such and be misled.
Every RDNA 2 GPU was hitting major bottlenecks. In RT. Even $1,200 RDNA 2 GPUs couldn't do it. Same with RDNA 3. But it didn't matter because RT was gimmicky. Now it matters because an 8 GB Nvidia GPU runs out of VRAM.
 
RX 9070 XT is very much high end.

I guess if your definition of high end is “it doesn’t match the competitor's 2.5-year-old last-generation high-end card, let alone the flagship”, then yes, you can call it high end.

If you bothered to look at the OP and put 30 seconds of thought into it you wouldn't need to make this post.

1) Almost 80% gross margin is the official GAAP number from Nvidia for Q1 '25.

2) This gen, Nvidia doesn't have the best price/perf in raster compared to this gen's AMD cards in the same segment; the 9060s haven't released yet, so the 5060 might be the best value only until then (and the B580 has better price/perf in certain cases).

3) Nvidia has a monopoly in practice, so it can be almost as greedy as it wants, and considering that the pseudo-competition is more than happy to be just as bad if not worse, you end up with ridiculous product margins and what amounts to almost complete stagnation in price/performance.
Ah, the raster argument.

Since we are comparing using obsolete metrics, which card is faster at 2D sprites? How about a hardware cursor?
 
If you bothered to look at the OP and put 30 seconds of thought into it you wouldn't need to make this post.

1) Almost 80% gross margin is the official GAAP number from Nvidia for Q1 '25.

2) This gen, Nvidia doesn't have the best price/perf in raster compared to this gen's AMD cards in the same segment; the 9060s haven't released yet, so the 5060 might be the best value only until then (and the B580 has better price/perf in certain cases).

3) Nvidia has a monopoly in practice, so it can be almost as greedy as it wants, and considering that the pseudo-competition is more than happy to be just as bad if not worse, you end up with ridiculous product margins and what amounts to almost complete stagnation in price/performance.
The Ngreedia "argument" always amuses me to no end as it's usually made by people who'd otherwise defend capitalism to the hilt.

This is also followed by the fact that while NVidia is no doubt greedy - this is written into the very definition of a "corporation" - it is impossible to establish how much of the current pricing is in fact due to greed and how much to external factors, of which there are plenty in 2025. I mean, maybe it would be possible if somebody bothered to investigate it properly instead of just screaming the N-word over and over again. I do recall TechSpot had a stab at it, but since they seem to have been following the sensationalist path recently, I take everything they write with a pinch of salt.

Then there's also the good ole "but raster!" thing, which is just so tunnel-visioned and tiresome. E.g., I do understand it is important for some people to have a bit more fps in that kind of scenario, but why is it conversely impossible for those who proclaim it to understand that, for a vast chunk of consumers, it's DLSS and all that comes with it that matters more?
 
I guess if your definition of high end is “it doesn’t match the competitor's 2.5-year-old last-generation high-end card, let alone the flagship”, then yes, you can call it high end.


Ah, the raster argument.

Since we are comparing using obsolete metrics, which card is faster at 2D sprites? How about a hardware cursor?
Because it totally cannot be called "high end" within its own product lineup? No one ever called the RX 480 that, even though it was technically the highest-end card in its own lineup. The RX 9070 XT can deservedly carry that title, and it's still high end even outside of it. It's not hard to make top-of-the-line products when everyone buys your shit even when it's literal shit, and you can just do whatever the fuck you want because you're pissing cash out your rear. AMD on the other hand, considering their low market share and struggle to capture more, really pulled off a great product.

Remember, the first generation of Ryzen also didn't beat Intel at everything. And look at the situation now. I'm not saying that means NVIDIA is doomed, but it shows that being the top dog means shit if you're greedy and out of touch. Guess what Intel was: exactly that, greedy and out of touch. NVIDIA is treading that realm now. They are greedy, entirely out of touch with consumers, and totally absorbed in the $$$ of AI bullshittery.
 
Because it totally cannot be called "high end" within its own product lineup? No one ever called the RX 480 that, even though it was technically the highest-end card in its own lineup. The RX 9070 XT can deservedly carry that title, and it's still high end even outside of it. It's not hard to make top-of-the-line products when everyone buys your shit even when it's literal shit, and you can just do whatever the fuck you want because you're pissing cash out your rear. AMD on the other hand, considering their low market share and struggle to capture more, really pulled off a great product.

Remember, the first generation of Ryzen also didn't beat Intel at everything. And look at the situation now. I'm not saying that means NVIDIA is doomed, but it shows that being the top dog means shit if you're greedy and out of touch. Guess what Intel was: exactly that, greedy and out of touch. NVIDIA is treading that realm now. They are greedy, entirely out of touch with consumers, and totally absorbed in the $$$ of AI bullshittery.

Polaris never fit the flagship role; that was the R9 Fury X at the time. The full core used on the RX 480/580 traded blows with 2014's GTX 980 performance-wise; it was a GPU that sold on value alone. Anyone who purchased a Polaris GPU based on a performance argument is a laughable fool. Value was all it had going for it. Value... then volume. Though most of them have met a grisly death in the digital mines, and some were recycled into unlicensed, unreliable aftermarket cards for gullible idiots "value-conscious gamers" :laugh: to buy after ETH went proof-of-stake.

I agree the 9070 XT is a great product. But it's not a high end product, either. It never claimed to be one. And it's definitely not priced as one, despite the fact that MSRPs are not honored and most of the time, you should be ready to spend around $900 USD, or roughly 33% above MSRP, to get one.

I do not think the Intel situation will repeat here. Intel's problem is their foundry, and AMD/NV use effectively the same fabrication technology from TSMC. Besides, Nvidia isn't even remotely challenged here; the RTX 5090's roughly triple-digit percentage advantage over the 9070 duo is proof of that. You exclude it on its pricing alone, which is valid, but if you get the chance to try one out, spend some time with it. You'll see how brutal it is.
 
Polaris never fit the flagship role; that was the R9 Fury X at the time. The full core used on the RX 480/580 traded blows with 2014's GTX 980 performance-wise; it was a GPU that sold on value alone. Anyone who purchased a Polaris GPU based on a performance argument is a laughable fool. Value was all it had going for it. Value... then volume. Though most of them have met a grisly death in the digital mines, and some were recycled into unlicensed, unreliable aftermarket cards for gullible idiots "value-conscious gamers" :laugh: to buy after ETH went proof-of-stake.

I agree the 9070 XT is a great product. But it's not a high end product, either. It never claimed to be one. And it's definitely not priced as one, despite the fact that MSRPs are not honored and most of the time, you should be ready to spend around $900 USD, or roughly 33% above MSRP, to get one.

I do not think the Intel situation will repeat here. Intel's problem is their foundry, and AMD/NV use effectively the same fabrication technology from TSMC.
Fab nodes mean absolutely nothing, and Intel's foundries are just one part of their business. I see the same fallacy people constantly make with mobile devices: this or that chip used a Samsung node so it must be bad, while TSMC gets glorified as if it's the sole reason for a chip's capabilities. In reality, nodes play a small part. The RTX 3000 series was made by Samsung and it was excellent, because the core logic of the chip was good. Would it be slightly more efficient on a TSMC node? Sure. But that's not the defining factor. What is, is how the chip is designed, what logic it has going on, and, with graphics cards in recent years, what the software backend can do with it (upscalers, frame generation, RT logic and how it's done, etc.).

As for the MSRP thing, that's not exactly AMD's fault when the entire market is now based on totally arbitrary BS that has no connection with reality. And it's NVIDIA that basically started it, with their draconian policies and their Founders Edition cards that pushed all AIBs into a corner, at the mercy of whatever is pushed on them. Most vendors who do both AMD and NVIDIA then just applied the same BS to AMD, and the rest just followed.

Eventually this shit will collapse, because it's not sustainable when the graphics card alone suddenly costs half the price of the entire system, and rising. It's just not sustainable and it'll crash at some point. In the past it was 1/4 or 1/3; it'll soon become 3/4 of the cost. It's just idiotic and can't function in the long run.
 
Fab nodes mean absolutely nothing, and Intel's foundries are just one part of their business. I see the same fallacy people constantly make with mobile devices: this or that chip used a Samsung node so it must be bad, while TSMC gets glorified as if it's the sole reason for a chip's capabilities. In reality, nodes play a small part. The RTX 3000 series was made by Samsung and it was excellent, because the core logic of the chip was good. Would it be slightly more efficient on a TSMC node? Sure. But that's not the defining factor. What is, is how the chip is designed, what logic it has going on, and, with graphics cards in recent years, what the software backend can do with it (upscalers, frame generation, RT logic and how it's done, etc.).

As for the MSRP thing, that's not exactly AMD's fault when the entire market is now based on totally arbitrary BS that has no connection with reality. And it's NVIDIA that basically started it, with their draconian policies and their Founders Edition cards that pushed all AIBs into a corner, at the mercy of whatever is pushed on them. Most vendors who do both AMD and NVIDIA then just applied the same BS to AMD, and the rest just followed.

Eventually this shit will collapse, because it's not sustainable when the graphics card alone suddenly costs half the price of the entire system, and rising. It's just not sustainable and it'll crash at some point. In the past it was 1/4 or 1/3; it'll soon become 3/4 of the cost. It's just idiotic and can't function in the long run.

Oh, it does matter, because Intel's products have historically been designed with their own fabs in mind. The first time they made something at TSMC: low volume, high costs, rancid performance; that's Arrow Lake for you. The mobile argument is different: the Samsung node itself is not the problem, it's that Exynos is bad IP compared to Snapdragon, and Qcom doesn't use Samsung fabs to build Snapdragon chips (for good reason, too).

I think there is some degree of fault with AMD. They launched the product at an intentionally low MSRP they knew that they and their partners had no means to honor. I don't think there's anything draconian going on either; Nvidia has a lead and they maintain that lead, in their same old FOSS-unfriendly, Graphics Mafia style. Halo products never made much sense for most people to begin with, and that's not about to change now, I suppose. And even if it does collapse, I doubt their shareholders will care by now.
 
1. ...yes, it can. With the right settings and a healthy enough dose of DLSS, you can run even games like Metro Exodus on a laptop RTX 3050 with 4 GB of VRAM without significant visual degradation (a la Forspoken or Horizon Forbidden West, games known to crash out on low-VRAM cards and look like PS2 games).
2. The 6600 XT was the intended competitor to the RTX 3060 Ti and was slower than it in almost every benchmark. This has remained true with the 6650 XT refresh; nowadays, the 3060 Ti tends to edge out the RX 7600 XT, according to the latest review data (W1zzard's RTX 5060 review as a reference). It's noteworthy that the RX 7600 XT is a 16 GB GPU, so the situation in point 1 may or may not apply, depending on the game.
3. No.
Let me repeat myself: read the post before commenting.

1) "Properly" is the key word. If you had read the second part, you would understand that if you account for RT in your use case, it's a positive at this price point; at this performance tier, RT will not be a priority for most.

2) No one was talking about the 3060 Ti; the subject was the 6600 XT vs. the 3060 12 GB. During the crypto craze, the street price, at least where I'm from, was €50+ higher for the 3060 12 GB, and if I'm not mistaken it was more expensive globally as well.

Every RDNA 2 GPU was hitting major bottlenecks. In RT. Even $1,200 RDNA 2 GPUs couldn't do it. Same with RDNA 3. But it didn't matter because RT was gimmicky. Now it matters because an 8 GB Nvidia GPU runs out of VRAM.
Hindsight is 20/20. At the time, considering street prices and that RT was arguably a gimmick, RDNA 2 was a viable option; if cards had been at MSRP, there would have been almost no reason to get a Radeon card.

At the time, RT was just an optional gimmick for most; now, if you want to play some games, you need it. Different realities, so different weights on the feature.

You are clearly biased towards Nvidia, and you are defending the company from criticism; for me that's an oxymoron. If people are critical of its shortcomings, you might get better products; being apologetic about products that are already gimped at release is ultimately bad for you.

You ultimately care more for the brand than their products.

PS: the things I'm not addressing in your post I think are valid.

Ah, the raster argument.

Since we are comparing using obsolete metrics, which card is faster at 2D sprites? How about a hardware cursor?
So according to you, half the Nvidia lineup is obsolete, or am I missing something?

A lot of games don't have RT even today, and in very few is it mandatory; for a lot of consumers it's irrelevant. Considering raster obsolete is the most asinine thing I've read on TPU's forums.

Nvidia fanboys are rabid today
 
You are clearly biased towards Nvidia, and you are defending the company from criticism; for me that's an oxymoron. If people are critical of its shortcomings, you might get better products; being apologetic about products that are already gimped at release is ultimately bad for you.

You ultimately care more for the brand than their products.

PS: the things I'm not addressing in your post I think are valid.
Not at all, I'm biased towards reality. Nvidia GPUs have been bad to terrible, excluding a few models here and there. But you won't see me criticizing Nvidia because... well, everyone else's products are just as bad or worse. I don't see the point of being overly focused on that specific brand when, although they aren't releasing good products, they are still better than the competition.

I have a lot more expectations from AMD since they are bleeding market share, and well, they said themselves they want to gain market share by focusing on the midrange. I have zero expectations from Nvidia; they are already dominating. Considering that they are facing no competition, I'm actually surprised by what they are offering (even though it's crap, it's less crap than everyone else's crap).
 
The Ngreedia "argument" always amuses me to no end as it's usually made by people who'd otherwise defend capitalism to the hilt.

This is also followed by the fact that while NVidia is no doubt greedy - this is written into the very definition of a "corporation" - it is impossible to establish how much of the current pricing is in fact due to greed and how much to external factors, of which there are plenty in 2025. I mean, maybe it would be possible if somebody bothered to investigate it properly instead of just screaming the N-word over and over again. I do recall TechSpot had a stab at it, but since they seem to have been following the sensationalist path recently, I take everything they write with a pinch of salt.

Then there's also the good ole "but raster!" thing, which is just so tunnel-visioned and tiresome. E.g., I do understand it is important for some people to have a bit more fps in that kind of scenario, but why is it conversely impossible for those who proclaim it to understand that, for a vast chunk of consumers, it's DLSS and all that comes with it that matters more?
I agree with you on almost everything, except on two points.

The OP is about official Nvidia numbers and projections; the operating expenses are there, the gross margins and profits too, so external factors are already accounted for.
As a corporation it wants to maximize profits; it's up to consumers to be critical if they want a shift in the business model, as brand and money are what matter to these companies.

RT is certainly a big decision factor nowadays, likely even more so in the future, and a definite pro on Nvidia's side. But realistically, most of the games most people play aren't RT titles, so raster is still the main metric for a lot of consumers: raster always matters, RT not always.
The polarization about RT vs. raster is absurd; how important either technology is for your personal use case will differ from person to person, so for one person 20 bucks is too much, for another it's a no-brainer.
 
Curious if you run into any issues with the 5090 - update us if you have issues or not. Have fun with your new toy.

Everything is peachy so far. Not a single black screen (yet?) :D .

Haven't had too much time to properly test it yet but several runs of 3D Mark Speed Way and Steel Nomad as well as the Speed Way Stress Test (standard 20 loops) finished with zero issues.

I also played ~1 hour of DOOM The Dark Ages and COE33 each without a single issue. The max temp in DOOM TDA was 60°C but usually it stayed at 58°C (4K, everything ultra, DLSS Quality, 130+ fps). The liquid SOC is awesome. Great cooling and near zero noise.

My only "problem" so far was that the card's RGB was pulsing white by default all the time. My case is under my desk but it was still a bit irritating to see it flashing/pulsing out of the corner of my eyes.
My previous RTX 4090 always showed a solid white light by default with no pulsing when no software was installed (I don't want to run MSI Center and/or Mystic Light because MSI software, contrary to their hardware, is garbage).

With the air-cooled MSI RTX 4090 Suprim X, it was not possible to change or turn off the RGB via software and have that lighting setting stick (without the software installed), but with the RTX 5090 Suprim Liquid this is finally possible!

So, I reluctantly installed MSI Center/Mystic Light one time, only to disable the card's RGB completely, and then uninstalled it again (I had to manually remove a lot of leftovers because MSI software sucks). The RGB-off setting is now persistent, which is perfect for me. I could have lived with solid lighting again like with the RTX 4090, but "off" is even better, of course.
 
RT is certainly a big decision factor nowadays, likely even more so in the future, and a definite pro on Nvidia's side. But realistically, most of the games most people play aren't RT titles, so raster is still the main metric for a lot of consumers: raster always matters, RT not always.
The polarization about RT vs. raster is absurd; how important either technology is for your personal use case will differ from person to person, so for one person 20 bucks is too much, for another it's a no-brainer.
But I did not say anything about "RT". While to some extent this is another factor important to some people, my point was about DLSS in general. Upscaling is something most people use on a daily basis, and the Nvidia implementation is not only superior but, more importantly, widely available, whereas FSR...
 