Tuesday, May 18th 2021

Fancy Your Hardware? TV Pricing Sees 30% Increase, Could Escalate Further

We've been beating the dead horse of consumer hardware shortages for a while now, across most product areas that require semiconductors to operate - and that covers almost anything, really. Whether it's CPU shortages from the AMD camp, GPU shortages from both AMD and NVIDIA, storage prices rising on the back of a new cryptocurrency boom, scalpers left and right on the latest PC and console hardware, or semiconductor shortages hitting car manufacturers and technology companies like Bosch, it's a wild ride in the semiconductor world right now. And if you were looking to upgrade your media-consumption living room with a fancy new TV, you will also have to cope with increased pricing now, and perhaps further price climbs and shortages down the line.


Market research company NPD has said as much in its most recent market analysis, concluding that smart TV prices have already increased 30% compared to the first months of 2020. The price increases are expected to hit anything with a screen - smartphones, TVs, laptops, or any other product that claims a portion of the world's panel output. This is the market correcting the supply/demand equation: increased post-COVID-19 demand and global supply chain issues have set up a series of network effects that have led manufacturers to raise product pricing in line with demand, passing additional supply costs on to the customer while simultaneously extracting the highest possible profit from the existing (and insufficient) supply. It's highly unlikely that this semiconductor supply shortage will see a turnaround throughout 2021.
Source: CNBC

53 Comments on Fancy Your Hardware? TV Pricing Sees 30% Increase, Could Escalate Further

#26
AusWolf
randompeepUh, someone else is calling out the consumers...
I'm currently running on a 50$ 960 2GB too and idgaf about the newer cards unless I get at least the same performance per watt + performance per buck combo

If you buy the fastest car...or a 'supercar' in general (RTX 3080/90 + 6800XT/6900XT), the price hike may be justified! But throw that shit into low-end/mid-tier (best selling products), and you'll be facing backlash from the peers who used to consider brand new GPUs in that $$$ range.

Cash in, cash out...but sooner or later they'd be burned out
Agreed.

I used to own a 5700 XT, but sold it when I saw prices hike. I used the difference of what I paid and what I got for it to buy a 1650. I'm just as happy as I was with the 5700 XT.
Posted on Reply
#27
AsRock
TPU addict
BSim500I'd love to see more "dumb" TV's. Why? Streaming "dongles" like the Amazon Fire stick tend to be supported (on the software side) for longer, whereas some TV brands I've owned in the past have had barely a couple of years of software upgrades and then quietly abandoned to 'encourage' you to buy a new set, when literally nothing about the display technology has improved much between them and it's just a glorified "app update" that's what's "new" for this year...
Buy a monitor haha, smart TVs are like spyware.
Posted on Reply
#28
rutra80
ChomiqYes, remote control is the thing that should dictate your TV purchase...
Yes, the actual mining is done on the remote; the TV is only used for the internet connection.
Posted on Reply
#29
80-watt Hamster
NaterThis.

What the F' do you guys expect when the Fed just keeps printing money?
That's a contributing factor, but not the only one, and probably not one of the most significant.
Posted on Reply
#30
yotano211
All prices are going up. This month has had the highest inflation in 12 years.
Posted on Reply
#31
R-T-B
rutra80Yeah my TV mines like crazy, I already ordered 10 more.
Mining has no relation to the global silicon supply problem.
mechtechDon't watch TV..................
My LG B9 55" has never seen a day of TV in its life. Has done a lot of 120hz 4k gaming though...
PhilaphlousBlamed on a chip shortage but actually is inflation...
NaterThis.

What the F' do you guys expect when the Fed just keeps printing money?
yotano211All prices are going up. This month has had the highest inflation in 12 years.
80-watt HamsterThat's a contributing factor, but not the only one, and probably not one of the most significant.
It's only able to account for like 6% at best. That's hardly low but it isn't the core issue.

We are finally feeling the full impact of being shut down for nearly 2 years, plus Taiwan droughts.
AsRockBuy a monitor haha, smart TV's are like spyware.
Find me a decent OLED monitor and I'm game.
Posted on Reply
#32
AsRock
TPU addict
R-T-BMining has no relation to the global silicon supply problem.

My LG B9 55" has never seen a day of TV in its life. Has done a lot of 120hz 4k gaming though...

It's only able to account for like 6% at best. That's hardly low but it isn't the core issue.

We are finally feeling the full impact of being shut down for nearly 2 years, plus Taiwan droughts.

Find me a decent OLED monitor and I'm game.
Some people are just more picky than others.
Posted on Reply
#33
R-T-B
AsRockSome people are just more picky than others.
Definition of luxury goods I suppose.

But I meant more any OLED monitor at all. It'd be a start... wish the industry would get on that.
Posted on Reply
#34
watzupken
Prime2515102Here's an idea: Everybody stop over-paying for everything and not buy anything for one month. The prices will instantly plummet.
Completely agree. Companies are trying to spur people into frantic buying by constantly telling them that there is a shortage and that prices are increasing. When people start buying parts to hoard, they create more "shortage," since it's not "normal" demand - i.e., if you usually only need 2 SSDs and now buy another 1 or 2 extra, you're artificially increasing demand. Manufacturers then take the opportunity to increase prices again, because they know they can.
Posted on Reply
#35
las
AusWolfThe funny thing is, you don't really need anything more than a 960 for 1080p gaming. It's just that people want 300,000 fps instead of 200,000, and super uber high resolutions and DLFXXXSSAA enabled, and it seems they're stupid enough willing to pay thousands to get it.
Yes, yes you do. Unless you play 5 year old games on medium maybe.
Posted on Reply
#36
AusWolf
lasYes, yes you do. Unless you play 5 year old games on medium maybe.
No, you don't. All you need is a compromise between graphics settings and performance.

Edit: Here is an example of a 2 GB 960 running Cyberpunk 2077 at 1080p (with minor scaling) with above 30 fps average. Sure, there are a couple of dips, but this was recorded with a dual-core Pentium that's basically pegged at 100% usage.
Posted on Reply
#37
las
AusWolfNo, you don't. All you need is a compromise between graphics settings and performance.

Edit: Here is an example of a 2 GB 960 running Cyberpunk 2077 at 1080p (with minor scaling) with above 30 fps average. Sure, there are a couple of dips, but this was recorded with a dual-core Pentium that's basically pegged at 100% usage.
30 fps you kidding me? with dips to 10 fps too? Yeah, no thanks

The game runs like pure garbage using a 960 even on lowest settings

Just like all newer games do
Posted on Reply
#38
TheinsanegamerN
80-watt HamsterThat's a contributing factor, but not the only one, and probably not one of the most significant.
HALF of all cash in circulation in the US economy was "printed" in the last 15 months. Basic economics 101: hyperprinting leads to hyperinflation. People will try to find any excuse as to why this is not the case, but numbers don't lie, and everything, regardless of whether it is in short supply or not, is going up in price, often very noticeably.

You simply cannot pump 6+ trillion into the economy in a few months and expect everything to remain stable.
Posted on Reply
#41
R-T-B
Basically, as I said earlier, this printing will produce inflation, but nothing like what a lot of pundits are trying to tell you.

The actual economists are pegging its inflationary impact at around 6%
Posted on Reply
#42
fb020997
lasPeople are willing to pay more so they are doing it. Thats supply vs demand. Can't be stopped. Will normalize over time.



I own C9 and G1 and there is not much difference, mostly in HDR (peak brightness is higher on G1). I went 12" up and got OLED Evo panel. Wallmounting looks insane on G series, zero gap. Which is mostly why I went with G series this time. Huge gap on C series when wallmounted. Slightly better motion on G1 too, but Sony is the king of motion (interpolation in 24-30 fps content, not gaming - LG wins hands down on gaming; lower latency, more features).

C9 vs CX vs C1, barely any difference here. Only G series got the new OLED Evo panel.

C9 had everything you wanted back in 2019; HDMI 2.1, 120 Hz, VRR 40-120, Gsync, ALLM etc. LG was 1-2 years ahead of everyone else this year.

C series is now considered mid-range, like B series. A is entry level. G and up (OLED Evo panel, with +20% peak brightness) = high end. C series is still great and pretty much destroys all other LCD TVs, but it's not part of LG's high-end line anymore.

You are paying a huge premium for Panasonic, the design is pretty bad overall, and they are much worse for gaming (input lag, features, and issues here). Their RMA sucks badly in Europe (probably worse in the US, a market they vanished from) and software issues are sometimes never fixed. They outsourced their LCD line and will probably soon outsource their OLED line as well. It's not a huge market for them, hence the price. They can't compete, so they price their TVs very high. Very few buy them as a result. Good TVs for movies and series, bad overall for gaming especially. You buy a TV based on how the remote is? It's not like Panasonic's remote is better than Sony's.

I'd get LG G1 or Sony A90J. Best in Class OLEDs overall today. C1 and A80J good too but many OLEDs are good then.
Panasonic and LG (and mostly everyone else, except Sony) 2021 TVs don't support DTS (SHAME!!!), so most of my Blu-ray rips would be silent. I rip them instead of playing them with a player because of the MFing slowness of the horrible copy protection system they use (a f*****g Java virtual machine and other sh!t like that), plus the unskippable intros. Sorry for the bad words, but I don't have anything better to say about that.
I currently have a Panasonic plasma (50" VT50 series), and if the TV market stays like this, my next OLED (or MicroLED) TV will be a Sony. I know I'm in a niche, since streaming (which I really don't like from an audio and video quality standpoint versus a very low-compression BD rip) is used by nearly everyone with a smart TV, and streaming audio is at most base Dolby for bandwidth reasons.
Posted on Reply
#43
las
fb020997Panasonic and LG (and mostly everyone else, except Sony) 2021 TVs don't support DTS (SHAME!!!), so most of my Blu-ray rips would be silent. I rip them instead of playing them with a player because of the MFing slowness of the horrible copy protection system they use (a f*****g Java virtual machine and other sh!t like that), plus the unskippable intros. Sorry for the bad words, but I don't have anything better to say about that.
I currently have a Panasonic plasma (50" VT50 series), and if the TV market stays like this, my next OLED (or MicroLED) TV will be a Sony. I know I'm in a niche, since streaming (which I really don't like from an audio and video quality standpoint versus a very low-compression BD rip) is used by nearly everyone with a smart TV, and streaming audio is at most base Dolby for bandwidth reasons.
Well, the Sony A90J is pretty much the best OLED right now, especially for movies

I still went with the LG G1 though; considered the Sony a lot, but wallmounting on the G1 is next level - zero gap and so thin
Posted on Reply
#44
AusWolf
las30 fps you kidding me? with dips to 10 fps too? Yeah, no thanks

The game runs like pure garbage using a 960 even on lowest settings

Just like all newer games do
Like I said, the drops in the video I linked are due to the dual core Pentium processor used for the test, not the 960. Note how low the GPU usage is during those dips while the CPU is pegged at 100%.

As for your videos, 1080p 30 fps in Cyberpunk is something PS4 users could only dream about. Are you saying the PS4 is garbage, and should never be used for gaming?

Not to mention that there is option for resolution scaling too, which when cleverly used, doesn't take away much from the image quality, but gives you better framerates. With any 4-core 8-thread CPU and a tiny bit of resolution scaling, the 960 is perfectly capable of 40 fps average which I find absolutely fine. Or maybe it's just me, having grown up with a 300 MHz Celeron MMX and an honestly rubbish S3 ViRGE 4 MB.
Posted on Reply
#45
las
AusWolfLike I said, the drops in the video I linked are due to the dual core Pentium processor used for the test, not the 960. Note how low the GPU usage is during those dips while the CPU is pegged at 100%.

As for your videos, 1080p 30 fps in Cyberpunk is something PS4 users could only dream about. Are you saying the PS4 is garbage, and should never be used for gaming?

Not to mention that there is option for resolution scaling too, which when cleverly used, doesn't take away much from the image quality, but gives you better framerates. With any 4-core 8-thread CPU and a tiny bit of resolution scaling, the 960 is perfectly capable of 40 fps average which I find absolutely fine. Or maybe it's just me, having grown up with a 300 MHz Celeron MMX and an honestly rubbish S3 ViRGE 4 MB.
Yeah PS4 is garbage for playing new games and Cyberpunk should never have been released for last gen consoles. It was removed from PS Store for a reason.

Sub 100% resolution scaling and 30 fps is not something I would ever enjoy or accept in a PC game, so... I guess we are too far apart. I'd never accept 30-40 fps just like I would never accept 1080p gaming.
Posted on Reply
#46
AusWolf
lasYeah PS4 is garbage for playing new games and Cyberpunk should never have been released for last gen consoles. It was removed from PS Store for a reason.

Sub 100% resolution scaling and 30 fps is not something I would ever enjoy or accept in a PC game, so... I guess we are too far apart. I'd never accept 30-40 fps just like I would never accept 1080p gaming.
Well, some people would never accept not having the latest Ferrari in the garage. As for me, I'd never accept being unemployed and homeless.

What I'm saying is, there is a difference between wants and needs, and there's nothing wrong with it. If you can afford 4K gaming and/or super high framerates, go for it. But it's clearly not the standard, especially with the current state of the GPU market. ;)
Posted on Reply
#47
R-T-B
lasWell Sony A90J is pretty much the best OLED right now, especially for movies

I still went with LG G1 tho, considered the Sony alot but wallmounting on the G1 is next level, zero gap and so thin
I do believe the gaming support on LGs is better too (adaptive sync and such).
Posted on Reply
#48
las
R-T-BI do believe the gaming support on LGs is better too (adaptive sync and such).
Yeah, Sony is mostly best in terms of motion and HDR peak nits (although the LG G1 is close on peak nits because of the OLED Evo panel, which the A, B, and C series don't get this year).
LG is still considered the best ALL-ROUND OLED. Sony and Panasonic have some tricks for movies but lose in gaming and general feature support.
AusWolfWell, some people would never accept not having the latest Ferrari in the garage. As for me, I'd never accept being unemployed and homeless.

What I'm saying is, there is a difference between wants and needs, and there's nothing wrong with it. If you can afford 4K gaming and/or super high framerates, go for it. But it's clearly not the standard, especially with the current state of the GPU market. ;)
Well yeah, but 1080p at low preset with 30 fps is something you can get from a PC found at the landfill pretty much. I will probably never understand, I would rather NOT BE GAMING than play games at 1080p with 30 fps using LOWEST POSSIBLE SETTINGS. Games will look and play like pure trash. Immersion completely gone. Experience ruined.
Posted on Reply
#49
R-T-B
lasYeah Sony is mostly best in terms of motion and HDR peak nits (altho LG G1 is close on the peak nits, because of OLED Evo panel which
Oh you got one of those...

You make me green with frog envy. I have a B9 55".
Posted on Reply
#50
AusWolf
lasWell yeah, but 1080p at low preset with 30 fps is something you can get from a PC found at the landfill pretty much.
That's what I mean. Compared to the late '90s and early 2000s, when you had to build a completely new PC basically every year just to be able to play the newest games at any fps, this isn't such a bad situation.
lasI will probably never understand, I would rather NOT BE GAMING than play games at 1080p with 30 fps using LOWEST POSSIBLE SETTINGS. Games will look and play like pure trash. Immersion completely gone. Experience ruined.
I've heard people say that any car with an engine below 300 horsepower drives like pure trash, experience ruined, etc. Some people would never game below 120 fps, whereas I would take a rock solid 40 over an unstable 60 any day. I mostly play games for story and feeling, not to flex my reflexes. Again, needs and wants. :)

A little JayzTwoCents that I totally agree with (although it's a 980 this time around):

Posted on Reply