
Hardware Elitism - a tech scourge that needs to die.

Status
Not open for further replies.
Hey! Don't knock multi-monitors until you've learned to live with them for a while. Chances are, you will wonder how you ever lived with just one. And "trend"? Ummm, no. I've had a multi-monitor setup since W95 was a newborn. If there's any trend, it's only due to some finally catching on.

You can have my full sized keyboard and mouse, and 2 x 24" monitors when you can pry them out of my cold, dead hands.
I've got 2x32" and see no genuine use for two monitors outside of specific 2D workloads.

I don't do 2D workloads; I'm a gamer or web browser... and two monitors don't work well for that. Okay, yes, I can fire up YouTube or Netflix on it... but I can only look at one screen at a time.
 
Those who are staying on old hardware, only buying what they can afford/need, are sensible.

I have been known more than once to buy some PC hardware, then realise months later I had bought something I really didn't need and could have saved money. I have an addiction to it, I suppose.

But at the same time, I don't begrudge those with top-end setups either; I just feel we should all be respectful to each other and respect each other's situation. I think it is a great thing that the PC market supports so many different levels of products, and it is a shame that in the last couple of generations the budget areas have been forgotten by the vendors.
 
Okay, yes, I can fire up YouTube or Netflix on it... but I can only look at one screen at a time.
"Look at" one at a time, yes. But "using" both at once is their huge advantage. It is much easier to cut and paste from one document to another (or even several others) when all the documents are open and totally visible. Or copying files from one drive to another. Or streaming CNN or Pandorian in the corner of the 2nd monitor while working on the first. Or (and/or) having your email client on the second - all WITHOUT overlaying or burying one window over another, or over several other windows.

Once you get used to having multiple monitors, it can and does make you more productive.

Just because you don't find something useful, that does not make it silly. I don't find RGB lighting useful at all. It does nothing for performance. It consumes some energy while not adding anything to performance. And it generates some heat - again without providing any gains in performance. And last, I prefer to pay attention to what's on my "multiple" monitors and not be distracted by my case, keyboard or mouse. So RGB adds no value in terms of performance. It adds no value to productivity, and in fact, because it can be distracting, it can hinder productivity.

So is RGB lighting silly? I would never say that - well, at least not out loud. ;)

The main point is, and with keeping in mind the topic and context of this thread, calling multi-monitor use "silly stuff" could easily be construed as an elitist point of view! Or if not elitist, then "sour grapes"!
 
I was left behind way back when the graphics slot was AGP 8x and it switched to PCI Express. I could not afford a new mobo, CPU, memory and graphics card at the same time, so it was Xbox to the rescue. So I sure ain't going to make fun of someone with less; I'd rather help them get the most out of what they have. If I succumb to my fears of being left behind again and get a 3090 or a premium mobo, that's not elitism; it's fear that this may be my last build... plus I give all my old builds away. I've seen people here being shunned for console gaming. Hell, I've been laughed at for using a controller instead of a mouse/keyboard while playing a first-person shooter... my hands hurt when I use a mouse/keyboard combo for gaming. The best policy is to be kind, for you don't know what tomorrow may bring.
 
"Look at" one at a time, yes. But "using" both at once is their huge advantage. It is much easier to cut and paste from one document to another (or even several others) when all the documents are open and totally visible. Or copying files from one drive to another. Or streaming CNN or Pandorian in the corner of the 2nd monitor while working on the first. Or (and/or) having your email client on the second - all WITHOUT overlaying or burying one window over another, or over several other windows.

Once you get used to having multiple monitors, it can and does make you more productive.

Just because you don't find something useful, that does not make it silly. I don't find RGB lighting useful at all. It does nothing for performance. It consumes some energy while not adding anything to performance. And it generates some heat - again without providing any gains in performance. And last, I prefer to pay attention to what's on my "multiple" monitors and not be distracted by my case, keyboard or mouse. So RGB adds no value in terms of performance. It adds no value to productivity, and in fact, because it can be distracting, it can hinder productivity.

So is RGB lighting silly? I would never say that - well, at least not out loud. ;)

The main point is, and with keeping in mind the topic and context of this thread, calling multi-monitor use "silly stuff" could easily be construed as an elitist point of view! Or if not elitist, then "sour grapes"!
I did say it's useful for 2D workloads - but not many people out there are doing what you just talked about. These are gamers getting two or three monitors just for the looks and to be elite; they aren't watching CNN and compiling emails from Word documents.
 
These are gamers getting two or three monitors just for the looks and to be elite
Some are. But flight and car simulator programs, for example, are much more entertaining and realistic when 3 monitors provide a full frontal and wrap-around view.

I don't think it is being an elitist if one buys fancy hardware to actually use and take advantage of what it has to offer.

Being an elitist (again, in the context of this thread) would be if that multi-monitor user put down or belittled others for only having one monitor.
 
Some are. But flight and car simulator programs, for example, are much more entertaining and realistic when 3 monitors provide a full frontal and wrap-around view.

I don't think it is being an elitist if one buys fancy hardware to actually use and take advantage of what it has to offer.

Being an elitist (again, in the context of this thread) would be if that multi-monitor user put down or belittled others for only having one monitor.
Those are good examples of games that do use it that I hadn't thought of, since I don't play those genres.
 
The best policy is to be kind, for you don't know what tomorrow may bring.
And we don’t know what someone else’s story is. Everyone has a story that is only theirs.
 
@the54thvoid pfft, that's nothing! If you wanna see elitism, just check out my mate's cat's reaction to me whenever I visit. A curious-and-a-half greeting when I come through the door: with just one sniff of my hand, Madame recognizes me. She then nonchalantly walks off, completely uninterested, and any hint of a greeting disappears. If I then try to pay her any attention, she'll hiss, bite and scratch at me. This is because she's clearly superior to me and we both know it. :laugh:

But seriously, I couldn't agree more. Elitism is snobbism and has no place here, or anywhere. No one is "better" than anyone else, no matter what they own, or how much money they have. You give out those infractions if you need to.

One place it's rife is in the high end Hi-Fi market: "Oh, so you don't have a £10000 twin monoblock valve amplifier, matching speakers, a 100 inch 8K TV and a huge bank balance? You're simply not good enough for us. Go away, peasant."

Looking at the other side of the argument, though, you get snowflakes who take offense at every little thing that isn't glowing praise. Constructive criticism and good advice that one doesn't necessarily wanna hear are all valid and should not be censored. There always needs to be a sensible balance.
 
Truth is, the moment you do something with technology, it's outdated the next day.
 
I sure enjoyed overclocking my E0 Q9550 a fair 30% (while undervolted) and considered it a major part of the hobby at the time.
But overclocking no longer gives a significant performance increase or more longevity to an already-good PC.
Nowadays, the biggest difference manual overclocking makes is in max power consumption/heat output.

And while water is excellent at absorbing heat from short load spikes, in continuous cooling-per-noise terms the average liquid cooler sold on fashion just isn't better than a high-end heatpipe cooler.

I got ridiculed on Anandtech for pairing an H81 ITX board with a 4790K when I cited similar reasons back in 2014. Why should I pay triple for a Z97 board for a pointless 10% OC which isn't even guaranteed at the end of the day?
 
I don't know if anyone has talked about the psychology of all this. But if you are putting others down for wanting to learn, being excited about the mediocre, or making mistakes, then you have your own set of issues. True strength is shown through kindness. Someone who mocks or makes fun of others is weak and lacks self-respect. And they will argue about it until you just agree, to shut them up.
 
I don't know if anyone has talked about the psychology of all this. But if you are putting others down for wanting to learn, being excited about the mediocre, or making mistakes, then you have your own set of issues. True strength is shown through kindness. Someone who mocks or makes fun of others is weak and lacks self-respect. And they will argue about it until you just agree, to shut them up.
I couldn't agree more. I despise people like that, but unfortunately, there are a whole lot of them out there.
 
Real talk: if you don't have 6 31-inch ultrawide 8K monitors, you are a pleb and should go back to your 386 and monochrome 13-inch CRT.
 
Real talk: if you don't have 6 31-inch ultrawide 8K monitors, you are a pleb and should go back to your 386 and monochrome 13-inch CRT.
I remember buying my first 15in flat screen. I caught hell from the CRT elitists.
 
I remember buying my first 15in flat screen. I caught hell from the CRT elitists.
Man, those were the days - with response times that felt slower than dial-up, more smearing than the skiddies in my undies, and HOLY CRAP, 15" LCDS WERE SO EASY TO CARRY
 
I remember buying my first 15in flat screen. I caught hell from the CRT elitists.

Ah yeah, those were the days... You still got motion blur on those early flat screens even when you had motion blur disabled.
 
I was big into the LAN party scene - on public transport buses, a 45-minute bus ride from my little town to the slightly bigger town a mountain over...

And I'd take my PC with me. LCDs changed my world. (We had 28.8k dial-up at the time.)
 
I sure enjoyed overclocking my E0 Q9550 a fair 30% (while undervolted) and considered it a major part of the hobby at the time. But overclocking no longer gives a significant performance increase or more longevity to an already-good PC. Nowadays, the biggest difference manual overclocking makes is in max power consumption/heat output.
+1. To me, overclocking peaked before it became an "industry" in itself. Take a couple of Celeron 366s, throw them onto an Abit BP6 (dual socket), OC both +50% to 550MHz, and enjoy a dual-core plus a 50% 1T gain long before dual-cores existed, for the sake of 2x £8 CPU coolers - certainly a noticeable boost worth it. Fast forward to today: when people blow a £150 premium on Z boards, K chips and cooling, then ridicule lower-end B boards for being "locked", they overlook the fact that you can often just buy the next locked chip up for that B board for the same overclocking premium (eg, £200 i5-10600K + £150-£200 Z490 + £50-£80 liquid cooling vs £230 i7-10700F + £100 B460 + £25 212 EVO) and the gain ends up... quite a lot less impressive than it used to be...
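For anyone skim-reading, here's a quick sketch of the totals in that comparison, using the post's own ballpark UK prices and taking midpoints where a range was given:

```python
# Unlocked "overclockable" build vs. next locked chip up on a cheaper board.
# Figures are the post's ballpark prices; midpoints used for quoted ranges.
unlocked_build = {
    "i5-10600K": 200,
    "Z490 board": 175,    # midpoint of the quoted £150-£200
    "liquid cooler": 65,  # midpoint of the quoted £50-£80
}
locked_build = {
    "i7-10700F": 230,
    "B460 board": 100,
    "212 EVO cooler": 25,
}

total_unlocked = sum(unlocked_build.values())
total_locked = sum(locked_build.values())
print(f"£{total_unlocked} vs £{total_locked}")  # prints £440 vs £355
```

So the "overclockable" build costs roughly £85 more before you've gained a single MHz, which is the point being made.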
 
I was big into the LAN party scene - on public transport buses, a 45-minute bus ride from my little town to the slightly bigger town a mountain over...

And I'd take my PC with me. LCDs changed my world. (We had 28.8k dial-up at the time.)
Cases were made of 1.5mm-thick SECC back then, too. I think case weight became inversely proportional to heatsink/radiator weight over time. :laugh:
My gosh, if you showed up at a LAN party back then with anything other than a Chieftec Dragon, CoolerMaster Stacker or a Thermaltake Xaser (to name a few), with all the UV to potentially give you skin cancer and overclocked to voltage levels that make us blush now, you would be looked at sideways by everyone, so much so that the Earth spun faster and you actually managed to time travel to the next day.
But then, the kid(s) that had modded their pop's 1995 eMachines/Presario/Vectra into a sleeper and would just frag or rush anyone in that party wouldn't be smug. He/she would just collect the wins/prizes/screenshots and head home.
 
Cases were made of 1.5mm-thick SECC back then, too. I think case weight became inversely proportional to heatsink/radiator weight over time. :laugh:
My gosh, if you showed up at a LAN party back then with anything other than a Chieftec Dragon, CoolerMaster Stacker or a Thermaltake Xaser (to name a few), with all the UV to potentially give you skin cancer and overclocked to voltage levels that make us blush now, you would be looked at sideways by everyone, so much so that the Earth spun faster and you actually managed to time travel to the next day.
But then, the kid(s) that had modded their pop's 1995 eMachines/Presario/Vectra into a sleeper and would just frag or rush anyone in that party wouldn't be smug. He/she would just collect the wins/prizes/screenshots and head home.
Antecs with neon-reactive DFI parts and UV CCFLs - LEDs weren't common yet.
 
+1. To me, overclocking peaked before it became an "industry" in itself. Take a couple of Celeron 366s, throw them onto an Abit BP6 (dual socket), OC both +50% to 550MHz, and enjoy a dual-core plus a 50% 1T gain long before dual-cores existed, for the sake of 2x £8 CPU coolers - certainly a noticeable boost worth it. Fast forward to today: when people blow a £150 premium on Z boards, K chips and cooling, then ridicule lower-end B boards for being "locked", they overlook the fact that you can often just buy the next locked chip up for that B board for the same overclocking premium (eg, £200 i5-10600K + £150-£200 Z490 + £50-£80 liquid cooling vs £230 i7-10700F + £100 B460 + £25 212 EVO) and the gain ends up... quite a lot less impressive than it used to be...

Besides, the games of that era, like Half-Life, were very CPU-limited, and CPUs were mostly priced by MHz rather than by core count.

Whereas the real-world difference in games between a 10400F and a 5900X is minuscule at best.
 
I more than agree, buddy. I've always been a gen or two away, but that's by choice; I don't believe in paying through the nose for the latest and greatest. I take great pleasure in biding my time and saving money. But I have come across folks ("not many") who think they're a chip above the rest. I belong to another forum for astronomers, and to say it's bad on there is an understatement - everybody's in cliques.
 
Antecs with neon-reactive DFI parts and UV CCFLs - LEDs weren't common yet.
True. If your board had a 7-segment display, that would actually be it for extra LEDs.
I think the major "smug"/elitism inflection point was actually the post-2009-2011 years, when hardware became its cheapest and anybody could get that 2500K/2600K + GTX 580/HD 7970 combo. After that, actually getting high-end meant the R9 390X and the 980 Ti, which started being more expensive with the first wave of the mining boom.
I'm not forgetting the SR-7 days, when actually running dual procs and quad-SLI meant you were top dog and e-peen envy. Even then, owners of those would actually share most of the knowledge on those platforms and overclocking tips.

 
It's like rabid sports fanboys, except people are making up their own teams as they go, so no one ever agrees on anything.

The sheer amount of misinformation out there doesn't help.
 