Tuesday, November 15th 2022

AMD Confirms Radeon RX 7900 Series Clocks, Direct Competition with RTX 4080

AMD in its technical presentation confirmed the reference clock speeds of the Radeon RX 7900 XTX and RX 7900 XT RDNA3 graphics cards. The company also made its first reference to a GeForce RTX 40-series "Ada" product, the RTX 4080 (16 GB), which launches later today. The RX 7900 XTX maxes out the "Navi 31" silicon, featuring all 96 RDNA3 compute units, or 6,144 stream processors, while the RX 7900 XT is configured with 84 compute units, or 5,376 stream processors. The two cards also differ in memory configuration: the RX 7900 XTX gets 24 GB of 20 Gbps GDDR6 across a 384-bit memory interface (960 GB/s), while the RX 7900 XT gets 20 GB of 20 Gbps GDDR6 across a 320-bit interface (800 GB/s).
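As a quick sanity check, the shader counts and bandwidth figures above follow from simple arithmetic. The sketch below (Python, for illustration only) assumes the usual 64 stream processors per RDNA3 compute unit and the standard data rate x bus width / 8 bandwidth formula.

```python
# Back-of-the-envelope check of the spec-sheet numbers above.
# Assumes 64 stream processors per RDNA3 compute unit and
# bandwidth = data rate (Gbps per pin) * bus width (bits) / 8.

def stream_processors(compute_units: int, sp_per_cu: int = 64) -> int:
    """Stream processor count from the compute unit count."""
    return compute_units * sp_per_cu

def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

print(stream_processors(96))          # 6144 (RX 7900 XTX)
print(stream_processors(84))          # 5376 (RX 7900 XT)
print(memory_bandwidth_gbs(20, 384))  # 960.0 GB/s (384-bit)
print(memory_bandwidth_gbs(20, 320))  # 800.0 GB/s (320-bit)
```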

The RX 7900 XTX comes with a 2300 MHz Game Clock and a 2500 MHz boost clock, whereas the RX 7900 XT comes with a 2000 MHz Game Clock and a 2400 MHz boost clock. Of the two figures, the Game Clock is the more relevant, as it represents the typical frequency under gaming workloads. AMD achieves 20 GB of memory on the RX 7900 XT by using ten 16 Gbit GDDR6 memory chips across a 320-bit wide memory bus, created by disabling one of the six 64-bit MCDs. This also subtracts 16 MB from the GPU's 96 MB of Infinity Cache, leaving the RX 7900 XT with 80 MB. The slide describing the specs of the two cards compares them to the GeForce RTX 4080, which is the card the two are positioned to compete against, especially given their pricing: the RX 7900 XTX is 16% cheaper than the RTX 4080, and the RX 7900 XT is 25% cheaper.
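The cut-down arithmetic and the price gap can likewise be reproduced in a few lines. This is a rough sketch that assumes each Navi 31 MCD carries a 64-bit GDDR6 controller and 16 MB of Infinity Cache, and plugs in the announced MSRPs ($999 and $899 for the two Radeons, $1,199 for the RTX 4080), figures taken from the respective launch announcements rather than from the slide itself.

```python
# Sketch of the RX 7900 XT cut-down math and the price comparison above.
# Assumption: each MCD = 64-bit GDDR6 controller + 16 MB Infinity Cache;
# MSRPs come from the launch announcements, not this slide.

MCDS_TOTAL, MCDS_ACTIVE = 6, 5        # Navi 31 has six MCDs; one is disabled on the XT
BITS_PER_MCD, CACHE_PER_MCD_MB = 64, 16
CHIP_DENSITY_GBIT = 16                # 16 Gbit (2 GB) GDDR6 chips

bus_width_bits = MCDS_ACTIVE * BITS_PER_MCD           # 5 * 64 = 320
num_chips = bus_width_bits // 32                      # one 32-bit chip per channel -> 10
vram_gb = num_chips * CHIP_DENSITY_GBIT // 8          # 10 * 16 Gbit / 8 = 20 GB
infinity_cache_mb = MCDS_ACTIVE * CACHE_PER_MCD_MB    # 5 * 16 = 80 MB

def discount_pct(price: float, reference: float) -> float:
    """How much cheaper 'price' is than 'reference', in percent."""
    return round((1 - price / reference) * 100, 1)

print(bus_width_bits, vram_gb, infinity_cache_mb)     # 320 20 80
print(discount_pct(999, 1199), discount_pct(899, 1199))
# ~16.7% and ~25.0% cheaper than the RTX 4080 (the article rounds the first to 16%)
```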

166 Comments on AMD Confirms Radeon RX 7900 Series Clocks, Direct Competition with RTX 4080

#76
AusWolf
Valantar: I would say that a Mustang manages to do that exact thing just fine, but then that might just be my personal preferences :laugh:
All the more reason not to buy a Ferrari. Ever. :laugh:
spnidel: We get it, lots of you care about RT, you've only been shouting that from the rooftops since Turing's debut, believe me, we hear you loud and clear.

Would y'all please be mindful that there is a subset of enthusiasts who don't want RT in their future purchases.

If it's for you, awesome, but I tire of hearing eVeRyBoDy cArEs ABouT rAy TrACinG when clearly people don't, so please, stop (silly to even ask I know, this will probably make the vocal among you double down and write me an essay on why RT is not a gimmick or that 'most' people do care, good on you!).

Personally, I'd love to be able to very strongly consider NVIDIA GPUs, but a prerequisite of that is for them to take RT less seriously and lessen the power draw. They can certainly swing even more buyers their way if they deliver on that, so I eagerly await seeing whether the top product has made significant strides.
People who care about RT think that everybody cares. People who don't care about RT think that nobody does. Which one is it, then?
#77
Towelie00
AusWolf: Ah, so your 4090 isn't artificially limited by having 2,048 of its AD102's shaders disabled? ;)

I don't care about having the best, but I do care about having the best in the price and performance range I consider sensible for my needs.


Then why do you care about a locked voltage/frequency curve?
Hmm, the problem is different with the 6900 XT (XTX) and the 6900 XT (XTXH) :cool:. I bought my 6900 XT in January 2021; four months later, the same card with a 1.2 V max voltage and better overall performance was available for the same price (OK, with the GPU shortage I got my 6900 XT at a way lower price... but).
AMD trolls us, and now I don't know if I have to wait for a real 7900 XTX in April with better overclocking stability and performance, or if they'll wait for the 7950 XT. I'm brainfucked; I don't want to wait 6 months.
And the leak said the first version of the 7900 XT/XTX has an issue with max frequency due to a production mistake... hmm... interesting... I'm disgusted.
#78
zorb
4090 RT performance isn't enough at 4K.
I don't know who the target market is for RT, but I find it odd to imagine buying a $2K GPU to play games at 1080p with RT. If you care about eye candy, why would you use a small or low-res display?
Then factor in that you'll be using DLSS to make it playable, and now you don't need a high refresh rate either.
And then consider that most implementations of RT aren't much better looking than bloom lighting (things have been shiny in games for a long time; did you ever think the light source felt unrealistic?), and the whole RT argument starts to feel really pathetic.

Nvidia is resting on its laurels by pricing the 4080 so high.
#79
AusWolf
zorb: If you care about eye candy, why would you use a small or low-res display?
Because gaming on a high-res display needs a high-end graphics card, and those have drifted way above my affordability zone. Besides, 1080p up to 24" still looks good enough, imo. My 2 cents. :)
#80
ARF
Today it became clear that Radeon RX 7900 XT 20 GB and Radeon RX 7900 XTX 24 GB won't compete with RTX 4080 16 GB.

The performance difference between the RX 6950 XT and the RTX 4080 is a mere 26%.



AMD Radeon RX 6950 XT Specs | TechPowerUp GPU Database

The Radeon RX 7900 series graphics cards will have very wide headroom to make the RTX 4080 DOA, as it deserves to be.
AusWolf: Because gaming on a high-res display needs a high-end graphics card, and those have drifted way above my affordability zone.
That's false. You can game at 4K with a Radeon RX 6700 XT 12 GB for $400 just fine.


ASUS Radeon RX 6750 XT STRIX OC Review - Battlefield V | TechPowerUp
#81
AusWolf
ARF: That's false. You can game at 4K with a Radeon RX 6700 XT 12 GB for $400 just fine.
But I can game at 1080p with the 6700 XT even better (or longer). :cool:
#82
ARF
AusWolf: But I can game at 1080p with the 6700 XT even better (or longer). :cool:
And at 720p and 480p even longer :kookoo:
#83
AusWolf
ARF: And at 720p and 480p even longer :kookoo:
The pixels might start to hurt my eyes at those resolutions, but technically, yes. :D
#84
Wasteland
AusWolf: People who care about RT think that everybody cares. People who don't care about RT think that nobody does. Which one is it, then?
Most people probably don't even know what RT is. They just want their games to run pretty and smooth.

So far, the best thing to come out of ray tracing is that its absurd performance cost forced Nvidia to come up with DLSS, which in turn prompted AMD to develop FSR (and Intel to develop XeSS). These features are genuinely useful to people on low-to-mid-range hardware, which is almost everyone in the PC/console gaming space. AMD's implementation of the poorly named "DLSS 3" may continue that trend of RT-performance-compensators-that-prove-more-generally-attractive-than-RT-itself, albeit to a lesser extent.

(I single out AMD's implementation because Nvidia's version is limited to the 40-series, which is ludicrously expensive now and may not produce an affordable model at all, if Ampere stocks remain as high as I've been led to believe. RDNA3 may not have any affordable offerings either, but AMD's features are generally backwards compatible, if not totally open-source).

For people at the extreme high end, I get why RT is a major talking point. Look, if I were spending four figures on GPUs every two years, I'd want to know that I had the best of every feature too. It's a perfectly reasonable desire. And in the few current examples where RT is fully/properly implemented, RT does look better than raster, but the difference is subtle. You really have to look for it; you most likely wouldn't notice in the course of normal game play. Even if you do have among the best available RT hardware, it still isn't self-evident that enabling RT is worth the enormous trade off in performance.

RT's problem, in short, is that raster is damn pretty too. Developers have had a very long time to hone their methods of "faking" natural lighting in normal rasterized scenes, and they will continue to use and refine these techniques for many years to come because again, most of their audience isn't using RT-capable hardware.

I'm on a PC hardware enthusiast forum, so of course I care about RT, in the same way that I'd care about any shiny new innovation. I think that RT will one day become "the standard," but that day is still far in the future, effectively an eternity in the context of current hardware. If you aren't at the bleeding edge of the GPU-upgrade cycle, you shouldn't consider RT perf to be a deal-breaking factor in your purchasing decisions for quite some time to come, IMO. Decent RT perf simply costs too much, in a market where GPUs are already overpriced, generally.
#85
Dirt Chip
Valantar: That is literally exactly what I said. That's the distinction between being the top-tier SKU and a top-tier SKU.

No. All purchases are a judgement of value for money, and when buying a product you cannot escape such judgements even if you explicitly don't care about value - it's there in literally every aspect of the thing. If you're paying extra for a top-tier product - which you inherently are by buying a flagship GPU - then you're putting your money and trust into a relationship with another party based on their messaging, i.e. marketing. If that company then abuses that trust by subsequently changing the basis for the agreement, then that's on them, not on the buyer.

What? No. A while is not a fixed amount of time. It is an inherently variable and flexible amount of time. That's the entire point. There's nothing naive about this, it is explicitly not naive, but has the expectation of its end built into it. The question is how and why that end comes about - whether it's reasonable, i.e. by a new generation arriving or significant refresh occurring, or whether it's unreasonable, i.e. through the manufacturer creating minuscule, arbitrary binnings in order to launch ever-higher priced premium SKUs.

No it isn't. The explicit promise of a flagship GPU is that it's the best GPU, either from that chipmaker or outright. Not that it will stay that way forever, not that it will stay that way for any given, fixed amount of time, but that it is so at the time and will stay that way until either the next generation or a significant mid-gen refresh.

Yes, of course, all criticism of shady marketing practices is bias, of course. Unbiased criticism is impossible! Such logic, much wow! It's not as if these companies have massive marketing budgets and spend hundreds of millions of dollars yearly to convince people to give them their hard-earned money, no, of course not :rolleyes: Seriously, if there's anyone arguing a woefully naïve stance here it's you with how you're implicitly pretending that humans are entirely rational actors and that how we are affected by our surroundings is optional. 'Cause if you're not arguing that, then the core of your argument here falls apart.



No, not everything is a shitty move, that's true. But you're giving a lot of benefit of the doubt here. An unreasonable amount, IMO. Even if initial yields of fully enabled dice are bad - say 70%, which is borderline unacceptable for the chip industry - and a further 50% of undamaged dice don't meet the top spec bin, what is stopping them from still making a retail SKU from the remaining 35% of chips?

The problem is, you're drawing up an unreasonable scenario. Nobody is saying Nvidia has to choose between either launching a fully enabled chip, or a cut down one. They could easily do both - supplies of either would just be slightly more limited. Instead they're choosing to only sell the cut-down part - which initially must include a lot of chips that could have been the top-end SKU, unless their yields are absolute garbage. Look at reports of Intel's fab woes. What yield rates are considered not economically viable? Even 70% is presented as bad. And 70% yields doesn't mean 70% usable chips, it means 70% fault-free chips.

A napkin math example: AD102 is a 608mm² almost-square die. As I couldn't find the specifics online, let's say it's 23x26.4mm (that's 607.2mm², close enough, but obviously not accurate). Let's plug that into Caly Technologies' die-per-wafer calculator (sadly only on the Wayback Machine these days). On a 300mm wafer, assuming TSMC's long-reported 0.09 defects/cm² rate (which should be roughly applicable for N4, as N4 is a variant of N5, and N5 is said to match N7 defect rates, which were 0.09 several years ago), that results in 87 total dice per wafer, of which ~35 would have defects, and 52 would be defect-free. Given how GPUs are massive arrays of identical hardware, it's likely that all dice with defects are usable in a cut-down form. Let's then assume that half of defect-free dice meet the binning requirements for a fully enabled SKU. That would leave Nvidia with three choices:

- Launch a cut-down flagship consumer SKU at a binning and active block level that lets them use all chips that don't meet binning criteria for a fully enabled chip, and sell all fully enabled chips in higher margin markets (enterprise/workstation etc.) - but also launch a fully enabled consumer SKU later
- Launch a fully enabled consumer SKU and a cut-down SKU at the same time, with the fully enabled SKU being somewhat limited in quantity and taking some supply away from the aforementioned higher margin markets
- Only ever launch a cut-down consumer SKU, leaving fully enabled chips only to other markets

Nvidia consistently picks the first option among these - the option that goes hard for maximizing profits above all else, while also necessarily including the iffy move of promising "this is the flagship" just to supersede it 6-12 months later. And that? That's a shitty move, IMO. Is it horrible? Of course not. But it's explicitly exploitative and cash-grabby at the expense of customers, which makes it shitty.
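(For anyone who wants to check the napkin math above: a minimal sketch using a standard gross-die approximation and Murphy's yield model - the linked calculator may use different formulas, so treat the exact counts as approximate - lands in the same ballpark as the 87 / ~52 / ~35 split.)

```python
import math

# Rough reproduction of the napkin math: ~608 mm^2 die, 300 mm wafer,
# 0.09 defects/cm^2. Gross-die formula ignores scribe lines and edge
# exclusion; Murphy's model estimates the defect-free fraction.

die_w_mm, die_h_mm = 23.0, 26.4     # assumed dimensions (607.2 mm^2)
wafer_d_mm = 300.0
defect_density = 0.09               # defects per cm^2

die_area_mm2 = die_w_mm * die_h_mm
die_area_cm2 = die_area_mm2 / 100.0

# Gross dice per wafer (common approximation)
gross = (math.pi * (wafer_d_mm / 2) ** 2) / die_area_mm2 \
        - (math.pi * wafer_d_mm) / math.sqrt(2 * die_area_mm2)

# Murphy's yield model: fraction of dice with zero killer defects
ad = defect_density * die_area_cm2
yield_fraction = ((1 - math.exp(-ad)) / ad) ** 2

defect_free = gross * yield_fraction
print(round(gross), round(defect_free), round(gross - defect_free))
# ~89 gross, ~53 defect-free, ~36 with defects - close to the 87/52/35 quoted above
```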


That depends on die size and actual yields. As I showed above, with published yields for the process nodes used here, there are still lots of chips that would meet the criteria for fully enabled SKUs.


Also remember that that chart for some reason only assumes MSRP or lower rather than the expected and actual reality of prices being MSRP or higher.
You beat me to the words, and I humbly lay down my arms.
Till next time :toast:
#86
skates
Wasteland: Most people probably don't even know what RT is. They just want their games to run pretty and smooth. [...]
Can you recommend a game using RT which ramps up the grittiness of the game? Last time I used it, objects became shiny and new-looking, and I was expecting objects that were dirty to look 'dirtier', if that makes sense?
#87
BSim500
AusWolf: People who care about RT think that everybody cares. People who don't care about RT think that nobody does. Which one is it, then?
They both have their points. There's nothing wrong with the option of RTX on high-end cards. The problem is that RT cores aren't "free" in terms of die space, and for those who don't want them it's a tax rather than a feature when it gets forced onto lower-end GPUs the way Nvidia is doing. Perfect example - RTX 3050 vs GTX 1660S = "pay +50% more for exactly the same 3-year-old performance..." Rip-off GPU pricing aside, part of the reason the RTX 3050 isn't cheaper is that the die size hasn't shrunk (GTX 1660 = 284mm² @ 12nm vs RTX 3050 = 276mm² @ 8nm) - the area has just been wasted on RT hardware for a GPU that's too slow to use it anyway (and for DLSS, gamers can still use FSR). Could that 276mm² have been nearer 200mm² and resulted in a much cheaper GTX 3050? Quite possibly. Instead, the entire "budget" GPU has failed at both ends, as enthusiasts buy faster cards whilst budget gamers buy AMD / last gen. It's a "white elephant" partly because of the RTX cores. Those who don't want them aren't saying "no one should be allowed to use them"; rather, they are hoping that AMD has a bit more common sense than Nvidia does for low-to-mid-range GPUs.
#88
AusWolf
BSim500: They both have their points. There's nothing wrong with the option of RTX on high-end cards. The problem is that RT cores aren't "free" in terms of die space... [...]
That's fair. I think the same way about Tensor cores, to be honest. All they ever get used for is DLSS, which has become dispensable ever since FSR has been around - not to mention that I prefer raw horsepower at native resolution. Upscaling should always be a last resort for any gamer at any price point, just in case you really need to squeeze a few more FPS out of your system. Yet, it's a selling point for some unknown reason. I'm not saying that one shouldn't be allowed to like or dislike RT, DLSS, or any other die-space-consuming modern feature, but just because one has an opinion doesn't mean that everybody else shares it.
#89
Mysteoa
Richards: The ray tracing performance looks horrid... the 4080 will land the knockout punch
If the RX 7000 series' RT, which you haven't seen, is horrid, then what about the RT of the RTX 3000 series? Is that also horrid?
#90
erocker
Performance looks to be really good for the 4080 and the RX 7000 series, but the prices are not. What was a $500-700 segment 3-4 years ago is now $1,000-1,200+. That's well beyond the rate of inflation, and a horrible value.
#91
wolf
Performance Enthusiast
spnidel: We get it
Doesn't seem like it; in fact, it's obvious you don't get it. Nobody is saying everybody cares, but there are people saying nobody cares.

And here I was thinking that obvious trolling shouldn't be supported. Perhaps try harder next time to actually make a point; that post was beyond childish. Reported.
AusWolf: People who care about RT think that everybody cares. People who don't care about RT think that nobody does. Which one is it, then?
Incorrect, or at least not what I was asserting, nor what I've ever seen anyone else assert.

The no-RT crowd often says "nobody cares" or "who cares"; the pro-RT crowd isn't saying everybody cares, they're saying they care, as a direct response to "nobody cares", which is factually incorrect.
Valantar: That depends
Naturally, there is no one single factor that leads to this practice; it's a combination of yields, volume considerations, deliberate segmentation for professional/golden-binned product lines, and likely even more factors unknown to consumers.

My point is I don't think it's a shitty move; it's just business - the same business in which nobody is a saint, they have shareholders to please, and so on.
#92
Minus Infinity
RT performance with FSR 2.1 enabled will be plenty good enough, and there's FSR 3.0 to come. The 4080 is quite impressive, but the price is simply outrageous, so to me it's DOA. The best thing about the 4080 is its power efficiency.
#93
wolf
Performance Enthusiast
skates: but being salty about folks who don't want RT is weird to me.
I'm not salty at people who don't want it; power to you. I'd like people to stop generalising that nobody wants it, i.e. me responding to the person who wrote in all caps about 'who cares' - people do, so stop using a flawed argument.

Again, it's that I'd love to be able to strongly consider high-end AMD GPUs, just like when they make compelling CPUs. I have no aversion or political reasoning behind whose products I'll buy (they all suck one way or another, so I just stick to the products themselves), but if those products can't or won't cater to what I want from a video card, it seriously narrows the choices. Hell, even Intel's first stab at it took it more seriously than a compatibility checkbox.
#94
ratirt
AusWolf: People who care about RT think that everybody cares. People who don't care about RT think that nobody does. Which one is it, then?
Oh the answer is so obvious here. I DON'T CARE. Sorry I had to :D
Towelie00: AMD trolls us, and now I don't know if I have to wait for a real 7900 XTX in April with better overclocking stability and performance, or if they'll wait for the 7950 XT. I'm brainfucked; I don't want to wait 6 months.
And the leak said the first version of the 7900 XT/XTX has an issue with max frequency due to a production mistake... hmm... interesting... I'm disgusted.
What is this, really? A real 7900 XTX? The 7900 XTX release is just around the corner, alongside the 7900 XT; both cards launch at the same time. The leak said the first 7900 XTX has an issue? It is not out yet. Pull yourself together and stop listening to trolls in the media trying to get some subscribers and likes. Also, try not to be a troll yourself. Wait for reviews; those will tell you the real picture, not wannabe trolls in the media spewing crap.
AusWolf: Because gaming on a high-res display needs a high-end graphics card, and those have drifted way above my affordability zone. Besides, 1080p up to 24" still looks good enough, imo. My 2 cents. :)
How about this: I remember not long ago companies advertised cards for 4K gaming. Well, that's now gone, but how about RT and 1080p gaming? Obviously the emphasis is on RT, which is so damn cool and makes your game a hundred times better, right? They are preparing for the 4K gaming advertisements again; they'll just have RT in there. It all repeats itself, with the minor change of RT in the mix.
#95
AsRock
TPU addict
AusWolf: If AMD can sell fully enabled chips even in their highest-end products, then so can Nvidia.
But why shoot themselves in the foot by doing so?
#96
Dirt Chip
ratirt: Oh the answer is so obvious here. I DON'T CARE. Sorry I had to :D [...]
[not addressing this specifically to you, @ratirt, as we are in agreement - just riding on the wave]
Some people are so possessive about product naming that it disrupts their grasp of reality, to the point of imagining a "real" future product as if the one at hand doesn't exist.
By linking X9xx/Xx9x to "the best", you are on a collision course with reality, and reality is stronger.
The reality: it is just a name, and it doesn't promise anything about the product's performance or about future products, no matter how hard you try to read that into it.
Specs backed up by benchmarks promise you performance.
Nothing will promise you anything about a future product, the next day or the next year, and hopelessly trying to seize control over it will only result in frustration and/or anger.
Nothing wrong with that (to a point), but at least know what you're getting into.
#97
ratirt
Dirt Chip: Some people are so possessive about product naming that it disrupts their grasp of reality, to the point of imagining a "real" future product as if the one at hand doesn't exist. [...]
I literally have no idea what you want to say. Something about your reality does not budge?
#98
Valantar
wolf: Naturally, there is no one single factor that leads to this practice; it's a combination of yields, volume considerations, deliberate segmentation for professional/golden-binned product lines, and likely even more factors unknown to consumers.

My point is I don't think it's a shitty move; it's just business - the same business in which nobody is a saint, they have shareholders to please, and so on.
And that's precisely where the core of our disagreement lies: I don't accept "pleasing shareholders" or "being profitable" as the main operating principle for any business. When that thinking takes over - which it largely has in the US and globally, yay late stage capitalism! - that is a mode of thinking that is so fundamentally flawed that it can only lead to two long term outcomes: economic collapse or corporate fascism. Either they collapse because this mode of business is fundamentally unsustainable, or they gather enough power to keep said power through force.

Put simply: any business has two core functions: to provide useful goods and/or services, and to provide gainful employment to people. Everything else is secondary to that, and any business that prioritizes anything other than that has lost the plot. Obviously I'm not so naive as to think that this is in any way a dominant mode of thinking today, but there are still degrees of difference - and that's what we're seeing here, and where I'm arguing that Nvidia is making shitty moves. They're exploitative, they're profiteering, they're laser-focused on profits and margins above all else, and they're making decisions that actively harm consumers to maximise profits.

My problem with your line of reasoning isn't so much that you don't hold to the same ideals as me, but that you're pretending that these actions are value neutral. They aren't. There are other possible choices, which would still have been profitable for Nvidia, that they don't make as they are less profitable, even if these would serve their customers better. This is in other words Nvidia actively choosing to maximize profits at the cost of the consumer.

(Obviously I'm also not saying other corporations are much better. AMD is pretty close to the same patterns these days, though not quite and not with the same history. Intel has historically been far worse. Etc.)
#99
bug
Valantar: And that's precisely where the core of our disagreement lies: I don't accept "pleasing shareholders" or "being profitable" as the main operating principle for any business. When that thinking takes over - which it largely has in the US and globally, yay late stage capitalism! - that is a mode of thinking that is so fundamentally flawed that it can only lead to two long term outcomes: economic collapse or corporate fascism. Either they collapse because this mode of business is fundamentally unsustainable, or they gather enough power to keep said power through force.

Put simply: any business has two core functions: to provide useful goods and/or services, and to provide gainful employment to people.
The piece that you're missing is that the measure of you providing useful goods and/or services is exactly "being profitable" and, by extension, "pleasing shareholders".
#100
spnidel
wolf: Doesn't seem like it; in fact, it's obvious you don't get it. Nobody is saying everybody cares, but there are people saying nobody cares.

And here I was thinking that obvious trolling shouldn't be supported. Perhaps try harder next time to actually make a point; that post was beyond childish. Reported.
bro you take online discussions way too seriously lmao