Thursday, August 24th 2023

NVIDIA Announces Record Financial Results for Q2 of 2023

NVIDIA (NASDAQ: NVDA) today reported revenue for the second quarter ended July 30, 2023, of $13.51 billion, up 101% from a year ago and up 88% from the previous quarter. GAAP earnings per diluted share for the quarter were $2.48, up 854% from a year ago and up 202% from the previous quarter. Non-GAAP earnings per diluted share were $2.70, up 429% from a year ago and up 148% from the previous quarter.
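
As a quick sanity check, these growth rates are consistent with the prior-period figures from NVIDIA's earlier reports (roughly $6.70 billion of revenue in Q2 FY2023 and $7.19 billion in Q1 FY2024, with GAAP diluted EPS of $0.26 and $0.82, respectively). The short Python sketch below recomputes them, treating those prior-period values as assumptions pulled from the earlier releases, not figures from this one:

```python
# Sanity check of the growth rates in this release. Prior-period figures
# are assumptions taken from NVIDIA's earlier reports.
prior = {
    "revenue_yoy": 6_704,   # Q2 FY23 revenue, $M (assumed)
    "revenue_qoq": 7_192,   # Q1 FY24 revenue, $M (assumed)
    "gaap_eps_yoy": 0.26,   # Q2 FY23 GAAP diluted EPS, $ (assumed)
    "gaap_eps_qoq": 0.82,   # Q1 FY24 GAAP diluted EPS, $ (assumed)
}
current = {"revenue": 13_507, "gaap_eps": 2.48}  # $M and $, from this release

def growth_pct(now: float, then: float) -> float:
    """Percentage change from `then` to `now`."""
    return (now / then - 1.0) * 100.0

print(f"Revenue YoY:  {growth_pct(current['revenue'], prior['revenue_yoy']):.0f}%")   # ~101%
print(f"Revenue QoQ:  {growth_pct(current['revenue'], prior['revenue_qoq']):.0f}%")   # ~88%
print(f"GAAP EPS YoY: {growth_pct(current['gaap_eps'], prior['gaap_eps_yoy']):.0f}%") # ~854%
print(f"GAAP EPS QoQ: {growth_pct(current['gaap_eps'], prior['gaap_eps_qoq']):.0f}%") # ~202%
```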

"A new computing era has begun. Companies worldwide are transitioning from general-purpose to accelerated computing and generative AI," said Jensen Huang, founder and CEO of NVIDIA. "NVIDIA GPUs connected by our Mellanox networking and switch technologies and running our CUDA AI software stack make up the computing infrastructure of generative AI.
"During the quarter, major cloud service providers announced massive NVIDIA H100 AI infrastructures. Leading enterprise IT system and software providers announced partnerships to bring NVIDIA AI to every industry. The race is on to adopt generative AI," he said.

During the second quarter of fiscal 2024, NVIDIA returned $3.38 billion to shareholders in the form of 7.5 million shares repurchased for $3.28 billion, and cash dividends. As of the end of the second quarter, the company had $3.95 billion remaining under its share repurchase authorization. On August 21, 2023, the Board of Directors approved an additional $25.00 billion in share repurchases, without expiration. NVIDIA plans to continue share repurchases this fiscal year.
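
For context, those repurchase figures imply an average buyback price of roughly $437 per share, with the remaining ~$0.10 billion of the $3.38 billion being the quarter's dividends. A minimal sketch of the arithmetic (the dividend figure is implied by the difference, not separately disclosed here):

```python
# Back-of-the-envelope on the capital-return figures above. All inputs
# come from this release; the dividend split is implied, not disclosed.
total_returned_b = 3.38      # $B returned to shareholders in Q2 FY24
buyback_spend_b = 3.28       # $B spent repurchasing shares
shares_repurchased = 7.5e6   # shares bought back

avg_price = buyback_spend_b * 1e9 / shares_repurchased    # ~$437 per share
implied_dividends_b = total_returned_b - buyback_spend_b  # ~$0.10B in dividends

print(f"Average repurchase price: ${avg_price:,.0f}")
print(f"Implied dividends paid:   ~${implied_dividends_b:.2f}B")
```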

NVIDIA will pay its next quarterly cash dividend of $0.04 per share on September 28, 2023, to all shareholders of record on September 7, 2023.

Outlook
NVIDIA's outlook for the third quarter of fiscal 2024 is as follows:
  • Revenue is expected to be $16.00 billion, plus or minus 2%.
  • GAAP and non-GAAP gross margins are expected to be 71.5% and 72.5%, respectively, plus or minus 50 basis points.
  • GAAP and non-GAAP operating expenses are expected to be approximately $2.95 billion and $2.00 billion, respectively.
  • GAAP and non-GAAP other income and expense are expected to be an income of approximately $100 million, excluding gains and losses from non-affiliated investments.
  • GAAP and non-GAAP tax rates are expected to be 14.5%, plus or minus 1%, excluding any discrete items.
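
Taken at the midpoints, that guidance implies roughly $9.6 billion of non-GAAP operating income and about $8.3 billion of non-GAAP net income. The sketch below is a reader's back-of-the-envelope from the figures above, not company guidance:

```python
# Rough midpoint math implied by the Q3 FY24 non-GAAP outlook above.
# This is a reader's back-of-the-envelope, not company guidance.
revenue_b = 16.00        # $B, midpoint
gross_margin = 0.725     # non-GAAP gross margin, midpoint
opex_b = 2.00            # non-GAAP operating expenses, $B
other_income_b = 0.10    # other income and expense, $B
tax_rate = 0.145         # midpoint, excluding discrete items

gross_profit_b = revenue_b * gross_margin             # 11.6
operating_income_b = gross_profit_b - opex_b          # 9.6
pretax_income_b = operating_income_b + other_income_b # 9.7
net_income_b = pretax_income_b * (1 - tax_rate)       # ~8.29

print(f"Implied non-GAAP operating income: ${operating_income_b:.1f}B")
print(f"Implied non-GAAP net income:       ${net_income_b:.2f}B")
```
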
NVIDIA achieved progress since its previous earnings announcement in these areas:
Data Center
  • Second-quarter revenue was a record $10.32 billion, up 141% from the previous quarter and up 171% from a year ago.
  • Announced that the NVIDIA GH200 Grace Hopper Superchip for complex AI and HPC workloads is shipping this quarter, with a second-generation version with HBM3e memory expected to ship in Q2 of calendar 2024.
  • Announced the NVIDIA L40S GPU — a universal data center processor designed to accelerate the most compute-intensive applications — available from leading server makers in a broad range of platforms, including NVIDIA OVX and NVIDIA AI-ready servers with NVIDIA BlueField DPUs, beginning this quarter.
  • Unveiled NVIDIA MGX, a server reference design available this quarter that lets system makers quickly and cost-effectively build more than 100 server variations for AI, HPC and NVIDIA Omniverse applications.
  • Announced NVIDIA Spectrum-X, an accelerated networking platform designed to improve the performance and efficiency of Ethernet-based AI clouds, which is shipping this quarter.
  • Joined with global system makers to announce new NVIDIA RTX workstations with up to four new NVIDIA RTX 6000 Ada GPUs, as well as NVIDIA AI Enterprise and NVIDIA Omniverse Enterprise software, expected to ship this quarter.
  • Launched general availability of cloud instances based on NVIDIA H100 Tensor Core GPUs with Amazon Web Services, Microsoft Azure and regional cloud service providers.
  • Partnered with a range of companies on AI initiatives, including:
      ◦ ServiceNow and Accenture to develop AI Lighthouse, a first-of-its-kind program to fast-track the development and adoption of enterprise generative AI capabilities.
      ◦ VMware to extend the companies' strategic partnership to ready enterprises running VMware's cloud infrastructure for the era of generative AI with VMware Private AI Foundation with NVIDIA.
      ◦ Snowflake to provide businesses with an accelerated path to create customized generative AI applications using their own proprietary data.
      ◦ WPP to develop a generative AI-enabled content engine that lets creative teams produce high-quality commercial content faster, more efficiently and at scale while staying fully aligned with a client's brand.
      ◦ SoftBank to create a platform for generative AI and 5G/6G applications based on the GH200, which SoftBank plans to roll out at new, distributed AI data centers across Japan.
      ◦ Hugging Face to give developers access to NVIDIA DGX Cloud AI supercomputing within the Hugging Face platform to train and tune advanced AI models.
  • Announced NVIDIA AI Workbench, an easy-to-use toolkit allowing developers to quickly create, test and customize pretrained generative AI models on a PC or workstation and then scale them, as well as NVIDIA AI Enterprise 4.0, the latest version of its enterprise software.
  • Set records in the latest MLPerf training benchmarks with H100 GPUs, excelling in a new measure for generative AI.
Gaming
  • Second-quarter revenue was $2.49 billion, up 11% from the previous quarter and up 22% from a year ago.
  • Began shipping the GeForce RTX 4060 family of GPUs, bringing to gamers NVIDIA Ada Lovelace architecture and DLSS, starting at $299.
  • Announced NVIDIA Avatar Cloud Engine, or ACE, for Games, a custom AI model foundry service using AI-powered natural language interactions to transform games by bringing intelligence to non-playable characters.
  • Added 35 DLSS games, including Diablo IV, Ratchet & Clank: Rift Apart, Baldur's Gate 3 and F1 23, as well as Portal: Prelude RTX, a path-traced game made by the community using NVIDIA's RTX Remix creator tool.
Professional Visualization
  • Second-quarter revenue was $379 million, up 28% from the previous quarter and down 24% from a year ago.
  • Announced three new desktop workstation RTX GPUs based on the Ada Lovelace architecture — NVIDIA RTX 5000, RTX 4500 and RTX 4000 — to deliver the latest AI, graphics and real-time rendering, which are shipping this quarter.
  • Announced a major release of the NVIDIA Omniverse platform, with new foundation applications and services for developers and industrial enterprises to optimize and enhance their 3D pipelines with OpenUSD and generative AI.
  • Joined with Pixar, Adobe, Apple and Autodesk to form the Alliance for OpenUSD to promote the standardization, development, evolution and growth of Universal Scene Description technology.
Automotive
  • Second-quarter revenue was $253 million, down 15% from the previous quarter and up 15% from a year ago.
  • Announced that NVIDIA DRIVE Orin is powering the new XPENG G6 Coupe SUV's intelligent advanced driver assistance system.
  • Partnered with MediaTek, which will develop mainstream automotive systems-on-chips for global OEMs that integrate new NVIDIA GPU chiplet IP for AI and graphics.
CFO Commentary
Commentary on the quarter by Colette Kress, NVIDIA's executive vice president and chief financial officer, is available at https://investor.nvidia.com.

26 Comments on NVIDIA Announces Record Financial Results for Q2 of 2023

#1
john_
Nvidia can buy TSMC to build its GPUs if it continues like this. :p
#2
Dirt Chip
So much for 'nobody buying NV GPU anymore'
#3
mama
I think if you dig into Nvidia's gaming segment results, you'll see that gaming revenue is down this year. Check beyond the press release.
#4
pressing on
john_: Nvidia can buy TSMC to build its GPUs if it continues like this. :p
Apple and Nvidia between them are already dominating TSMC's output; the ultimate brake on generative AI may be the lack of foundry capacity.
#5
N/A
A year ago was roughly the time j2c advocated not waiting and buying a card right away, and nobody in their right mind was buying.
Imagine how overpriced things must be now compared to the heavily overpriced 3080 Ti and 3090 Ti before the discounts took place, and it's up 22% now.
#6
john_
70% Gross margin.
Nvidia thinks we are all fools. More like 170%.....
#7
Chomiq
Dirt Chip: So much for 'nobody buying NV GPU anymore'
Most of the gaming revenue probably comes from 4090s that are being bought for AI work.
#8
Bwaze
RIP gaming, again.

I can only imagine what will happen if, or rather when, crypto rises again and we see a spending competition between cryptominers and AI acceleration users...
#9
renz496
mama: I think if you dig into Nvidia's gaming segment results, you'll see that gaming revenue is down this year. Check beyond the press release.
It is down vs. a quarter where gaming GPU sales were boosted by crypto; Q2 last year vs. Q2 this year, it is up something like 20%. Also, in the past, the only time Nvidia's gaming revenue was able to break the $2 billion wall was during a mining craze.
#10
TheDeeGee
Dirt Chip: So much for 'nobody buying NV GPU anymore'
Maybe AMD should stop being garbage at everything but rasterized.
#11
tfdsaf
Their gaming revenue is very low, but AI computing is saving Nvidia; this means we will likely be getting more garbage gaming products and overly expensive turds in the foreseeable future!

Worse still, they are selling a decent amount of GPUs even though their stack is extremely bad and utter garbage value!
#12
R0H1T
Dirt Chip: So much for 'nobody buying NV GPU anymore'
Saw on some other site that a lot of demand is probably from China stockpiling GPUs, much like the last year or two. If more sanctions hit & Nvidia has to stop shipping these high-end products there, the revenues & profits will drop like a rock!

This is simply unsustainable much like Evergrande or Country Garden.
#14
defaultluser
Chomiq: Most of the gaming revenue probably comes from 4090s that are being bought for AI work.
So is the new cache-powered, Maxwell-style reintroduction of the 128-bit 4060! In most new games it matches the old 3060 Ti, for $30 off the 3060's price; the bonus being that the cache + 128-bit bus means the card doesn't appeal for AI.

but yeah, every other card besides the 4090 is overpriced :D
#15
Tomorrow
TheDeeGee: Maybe AMD should stop being garbage at everything but rasterized.
I didn't realize that Ampere-level RT perf on the 7900 is suddenly garbage.
#16
john_
AnotherReader: Nobody has pointed out that Nvidia's revenue has exceeded that of Intel for the first time.
You are reading it too superficially. That's not the most important thing from their report; it's the fact that their server income is about as much (haven't checked the numbers, but I believe so if I remember correctly) as AMD's and Intel's combined.

Huang said: "The big mega theme is that the world's computer data centers are transitioning to a new model, from general purpose computing to accelerated computing."

That means less money spent on CPUs and much more money spent on GPUs. AMD is falling off a cliff today in the stock market because investors don't expect the MI300 to succeed in a world where everything is Nvidia (think Windows vs Linux), and they also expect AMD (and possibly Intel) to have bad news in the coming quarters from their data center sales. And AMD is in a very bad situation right now, with RDNA3 a failure in gaming and AM5 still not selling well, with the exception maybe of the X3D chips.
#17
AnotherReader
john_: You are reading it too superficially. [...]
I highlighted the significance of exceeding Intel, because Intel, not AMD, has been the benchmark for the semiconductor design houses for a very long time. You're right that the data center spend is shifting from CPUs to GPUs and that's very worrisome for everyone except Nvidia and Google. Now that AMD and Intel have realized the significance of the market they missed, we'll see if they manage to catch up.

With this much money being thrown around, MI300 or Intel may catch a few scraps as Nvidia doesn't seem to have enough GPUs to satisfy demand. A lot of this growth is due to Chinese panic buying and may stop if export restrictions tighten.
#18
cvaldes
Once again there are TPU forum participants who don't understand that there is evolution in computing.

Data center computing is not a static technology. It's not just Xeon or EPYC cores that need to have higher clock rates every year. Data center is an evolving landscape that increasingly benefits from thoughtful implementation of differentiated silicon. It has already been on this path for a while.

Nvidia calls the three pillars of computing the CPU, GPU, and DPU. Guess what? Nvidia is strong in GPUs and now has a foothold in DPUs with their BlueField silicon. As data center technology puts more emphasis on the latter two, Nvidia is already there to eat Intel and AMD's lunch.

This is no different than the graphics business, where the industry has moved from pure rasterization to differentiated silicon for ray tracing, video encoding/decoding, machine learning, etc.

Even the CPU industry has evolved its cores over the years. Forty years ago, floating point calculations were done on undifferentiated silicon. By the late Eighties and early Nineties there were floating point co-processors, not just in fancy UNIX servers but also in systems like Macs.

Those who think that you can just throw more CPUs into a data center simply don't get it. However, the people who build and run data centers (corporate IT directors, computer scientists, engineers, etc.) do. They are the ones sending in purchase orders for Nvidia AI accelerators; they used to send in more purchase orders for EPYC and Xeon chips 5-10 years ago.

For Intel, the main point of XeSS/ARC GPUs isn't gaming. It's for enterprise level AI acceleration.

Someday there will be a shift from what we are witnessing today to a different technology, probably something that is still sitting in someone's R&D lab. I don't know when, I don't know what, I don't know why but it's inevitable. Because we haven't solved all of the universe's computational challenges yet.

Computing is all about change. It's not about Pentium IIs, North and South Bridge chips, ISA connectors, NuBus slots, parallel ports, Voodoo Graphics cards, etc. anymore.
#19
AnotherReader
cvaldes: Once again there are TPU forum participants who don't understand that there is evolution in computing. [...]
I agree with your assessment of this change in computing, but I think you're being rather pessimistic about every silicon design house that isn't Nvidia. Both AMD and Intel have DPUs and GPUs so it's rather early to count them out of the game on that account.
#20
cvaldes
AnotherReader: Both AMD and Intel have DPUs too so it's rather early to count them out of the game on that account.
Of course.

But today, looking at Nvidia's earnings announcement, they are currently chowing down, leaving AMD and Intel to fight over crumbs. Market share percentage can and will change. But not tomorrow, not next week, not next month.

Nvidia could stumble and give up market share. Or either AMD and/or Intel could stumble and give up more market share.

How many smartphones does Microsoft sell today? How many smartphones did Apple sell in 2006?

Apple exited the tablet market after Newton but eventually came back. What percentage of tablet profits do you think Apple takes home today?

Again, corporate orders have a long lead time. Companies aren't going to flip flop between the three companies every quarter. Deployment takes time, full integration and optimization takes time, getting engineers up to speed takes time. And corporations really want to get their money's worth past when the equipment is fully depreciated. It's not like they are going to buy Nvidia's latest and greatest AI accelerator tomorrow and jump to AMD at Christmas even if AMD had a breakthrough in the R&D labs next week.

The point is Nvidia had a great quarter and destroyed AMD and Intel in data center revenue and market share. And this trend will likely continue for the next few quarters.

This is a fluid competition; anything can happen. I didn't say that AMD and Intel are doomed. I just said that Nvidia is eating their lunch RIGHT NOW in the data center market.
#21
AnotherReader
cvaldes: Of course. But today, looking at Nvidia's earnings announcement, they are currently chowing down, leaving AMD and Intel to fight over crumbs. [...]
That's an accurate assessment. However, there's interest in alternatives to Nvidia as the dependence on them has led to shortages when everyone's scrambling to buy the H100 or even the A100. This does provide their competitors an opportunity, but that doesn't mean that they will actually capitalize on it.
#22
cvaldes
AnotherReader: That's an accurate assessment. [...]
Yes, unfortunately Apple and Nvidia are going to soak up the vast majority of TSMC's capacity at the cutting edge nodes.

AMD: "Hi, we'd like some chips on your MegaAwesome Node from your new SuperDuper Fab."
TSMC: "We'd be happy to do business with you. Just a moment please." (steps into back office and emerges a few minutes later)
TSMC: "I found a few wafers but they won't be ready until April 2024. The fab is running 24x7 and is fully booked until July 2024. If we find some free time, we'll move it up. Would you like to place an order?"
AMD: "Um, yeah, okay. Fine."

Nvidia will sell every single piece of silicon they can get their hands on. They will price the RTX 5090 at a premium to lessen demand, because they can basically slap a 5-10x price tag on the same silicon for enterprise customers.

Apple? TSMC can't kick Apple to the curb. Apple has prepaid for years and years. Basically, Apple's iPhone orders saved TSMC from bankruptcy. Apple is basically guaranteed money.

Competitors can't just wave a magic wand and make competitive solutions appear. The semiconductor industry doesn't work like that unfortunately.

And performance-per-watt is one of the primary decision drivers of data center hardware purchases. Putting cutting edge circuits on old process nodes isn't a workable approach.

It goes far beyond chip availability anyhow. Nvidia has built a massive developer ecosystem to support their data center technologies, the most visible being the Omniverse suite. It's not like they're selling AI accelerator cards and saying, "There you are, go nuts. Good luck, we hope you like it."

Nvidia is beating AMD and Intel on multiple levels in the data center arena, not just ML core architecture. Building a successful data center business is far more involved than having the top score on some synthetic benchmark.

I know there are people here who point at some YouTube video and say "Look! This AMD card runs Benchmark X faster than Nvidia's card." Wonderful for AMD. That's not enough to build a successful business though.

This whole press release and Nvidia's data center performance show how little something like a synthetic benchmark is worth. I'm sorry, no one is building data centers to run 3DMark (or the data center equivalent).
#23
ZoneDymo
TheDeeGee: Maybe AMD should stop being garbage at everything but rasterized.
Maybe people should stop having this mindset....

It's like what Gordon Mah Ung said to GN: gamers complain about AMD because they want them to drop prices so Nvidia drops prices... so they can buy Nvidia for cheaper.

And that, in the end, is the problem: that 80% Steam share for one of the two companies. The consumers made it this way.
#24
cvaldes
ZoneDymo: Maybe people should stop having this mindset.... [...]
The flaw in this logic is the idea that graphics card pricing is influenced only by gamers.

Gamers complaining about prices didn't make Ampere cards any cheaper during the crypto mining craze. The enterprise AI bump will move more wafers into AI accelerator chip production.

Even today, GeForce 40-series cards see smaller discounts than RDNA 3 Radeon 70-series cards.

Should Joe Gamer just wordlessly pay what Nvidia, AMD, and Intel ask? I'm sure whoever runs those companies' social media accounts would love to log in and not see a barrage of "make it cheaper" posts.

Maybe the government should set the pricing on computer components. That would really encourage innovation, wouldn't it?

Great ideas, keep them coming.
#25
john_
AnotherReader: Now that AMD and Intel have realized the significance of the market they missed, we'll see if they manage to catch up.
AMD does have an advantage over Intel in this race, but they have to start thinking beyond CPUs. AMD was having so much fun beating Intel in the CPU game that it almost forgot GPUs. Intel realized the change a few years back, when it brought in Raja to perform some miracles in the GPU market. People who have been saying these last months that Intel will quit after Arc failed to be a success in the market might eventually realize that Intel's mind wasn't in gaming.
AnotherReader: With this much money being thrown around, MI300 or Intel may catch a few scraps as Nvidia doesn't seem to have enough GPUs to satisfy demand.
The 'Linux vs Windows' example I threw out above wasn't random. Nvidia is the Windows platform of AI today, and even if AMD or Intel produce something as good as, or even better than, Nvidia's ecosystem, many will be very reluctant to go with their solutions, the same way many ignore Linux because 'everyone uses Windows, everyone is comfortable with Windows, all apps and features are available on Windows'. Of course, this is hardware, not software, and being unable to satisfy demand could push many to alternatives. Then again, we are talking about billions of dollars of investments, so I wouldn't be surprised if a multi-billion-dollar entity decided to wait 6 months for Nvidia hardware instead of investing in AMD or Intel hardware that would be immediately available.
ZoneDymo: It's like what Gordon Mah Ung said to GN: gamers complain about AMD because they want them to drop prices so Nvidia and Intel drop prices... so they can buy Nvidia and Intel for cheaper.
Fixed that for you. This has been happening for the last 15 years, if not the last 25 (in the CPU market); it's not something new. People and the tech press spent ages worshipping the shiniest logo, Nvidia's or Intel's, and attacking AMD, because AMD was the brand for 'the peasants', 'the poor'. Tech sites were downplaying anything wrong about Nvidia or Intel while attacking AMD, to promote the image of 'a tech site that is independent and doesn't fear multi-billion-dollar companies'. Well, they didn't fear AMD, because they knew that AMD, being the underdog, couldn't react in a negative way; they were losing their sh!t (sorry for the expression) when they had to write something unfavorable about Intel or Nvidia, so they were always sugarcoating it. Individuals were also playing Intel's and Nvidia's marketing tunes: AMD is bad, AMD's drivers crash, AMD's GPU dies crack, AMD's hardware will explode and kill you in the process. Well, here we are today: Nvidia controlling the market, AMD unable to do anything, Intel still advertising that its newer drivers are less beta than the previous version, and AI demand now making sure that Nvidia's current GPUs might become even more expensive in the future, and that future GPUs will either be more expensive or stay VRAM-limited so they can't be used for AI.
Congrats to all those Intel and Nvidia fanboys and girls for bringing us back to the middle ages of a monopolistic market. At least I hope those fanatic Intel fanboys and girls will open their eyes and scale back their anti-AMD crusade efforts, to avoid paying $600 in the future for a 22-core Intel CPU (2 Performance cores, 16 E-cores, and 4 extra Low Power E-cores).
cvaldes: The flaw in this logic is the idea that graphics card pricing is influenced only by gamers.
When gamers rush to buy the lower-performing, more expensive card because of the logo on it, that is a result of how consumers, in this case gamers, use their money. This started happening way before crypto was even a thing. 10+ years ago, people were rushing to pay more for Nvidia's lower-end graphics cards because of the 'CUDA, PhysX, better drivers, premium brand' narrative. And in the gaming market, what people promote is extremely important. So the narrative was always to 'avoid AMD hardware at all costs for reasons A, B, C, D, E', even reasons that were not real, and even in cases where AMD was offering a faster product, or a cheaper but equally good one, than the Intel or Nvidia alternative. A narrative that only intensified in the years that followed and brought us to where we are today, where even Jayztwocents said in one of his latest videos that channels like his will be negatively affected if only one GPU manufacturer prevails. Well, too late for that, Jay. We are already there.