
Let's discuss: there's always something better on the horizon vs smart upgrading

I'm trying to get a discussion going on this topic.

I am of the opinion that there is such a thing as a best time to build a new rig. I am opposing the idea that there's always something better on the horizon, and you'll end up waiting forever. I'm not including the case of upgrading because your current build died.

So here's my argument. You do research, you take notes on incoming tech releases for CPUs, GPUs, etc., you make a timetable, and you compare the benefits of waiting for them instead of building now. As an example, I've got a 3570K at stock and a 1070 (which is probably not running at its best with this ancient CPU). My monitor died (1080p, 120 Hz, 24-inch), so obviously I'm going to want something better. So I upgraded to 1440p, 144 Hz, 27-inch.

I look at some benchmarks, and I see that some of the newish games don't really run at 60 fps at my new resolution with the 1070 (never mind 144 fps), and the CPU ain't helping either. Modded Skyrim and The Witcher wreck my PC, StarCraft 2 starts to choke when there are a lot of units, Cyberpunk 2077 will probably need medium settings for stable fps, etc. Sacrifices have to be made.

I could buy something today that would put my current build to shame (and I can salvage the SSDs, PSU, and case, so some money saved), or wait till the end of the year for Zen 3, Intel 10 nm, and new GPUs from both sides.

OK, but what comes after? Zen 4 probably won't be on AM4, and DDR5 is on the horizon, as well as USB4.

So let's do some math. DDR4 came in 2014 and DDR5 will come around 2021-2023; that's a lifespan of roughly 7-9 years as top dog for DDR4, and maybe 12 before it gets discontinued (or at least stops being mass market). USB4 will bring the bandwidth of Thunderbolt 3. And with these two platform changes, you also get Rocket/Meteor Lake on 5-7 nm for Intel, Zen 5/6 on 3-7 nm for AMD, and new NVIDIA GPUs plus RDNA 3/4 from AMD. The PS5 and XSX will also have been out on the market for 2-3 years, giving developers time to finally build their games for high-performance SSDs from the ground up and forget about HDDs. Hopefully the other bottlenecks between storage, RAM, and CPU (see Mark Cerny's PS5 technical presentation) will get resolved for the PC platform too.
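To make that timetable comparison concrete, here's a minimal sketch of the math. The DDR5 date follows the window above (taken at its midpoint), while the build costs and useful lifetimes are purely hypothetical numbers for illustration:

```python
# Back-of-the-envelope version of the "make a timetable" comparison.
DDR4_INTRO = 2014
DDR5_INTRO = 2022  # assumed midpoint of the 2021-2023 window
print(f"DDR4 as top dog: ~{DDR5_INTRO - DDR4_INTRO} years")  # ~8 years

# Crude cost-per-year comparison; costs and lifetimes are made up.
builds = {
    "buy now": {"cost": 1200, "useful_years": 5},
    "wait":    {"cost": 1400, "useful_years": 8},
}
for name, b in builds.items():
    print(f"{name}: ${b['cost'] / b['useful_years']:.0f} per year of use")
```

Whether waiting wins obviously depends entirely on the numbers you plug in, which is exactly why the research step matters.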

And why do I think that I'm making a good choice by waiting for 2021-2023, hence my stance in this debate? Well, the tech will last me a good 6-10 years, and honestly, what comes after probably won't be that impressive. CPUs/GPUs on 1 nm vs 3/5/7? Can they even build something at 1 nm? Silicon will probably be replaced before I have to build a new computer. DDR6, how long will that be before it comes out, 8-12 years? New monitors? Pfff, it'll be IPS/TN/VA until MicroLED becomes affordable at small/big sizes. Advances in audio? Not likely. Something better than VR/AR? Maybe, but not for the masses.

So, what say you? Do you side with build when you want or smart upgrading?
 
I generally build when the itch becomes too much to ignore. I'm at a stage in life where I don't worry too much about what's around the corner. If it fits my criteria for a potentially fun build, I'll try it out. This is my only real hobby, so I indulge myself when the urge strikes. My upgrade patterns are pretty erratic, though.
I jumped on a 2600K/Z68 rig as soon as it came out and ran it hard for 5 or 6 years. I wish I hadn't sold it; the SOB was bulletproof. I upgraded to X99, admittedly not my greatest idea. Got bored with the 6850K and a max OC of DDR4-3200 real quick; the board won't do one MHz beyond 3200. So I went 8086K/Z370 to shoot for 5 GHz. Ran that for a year or so but didn't/don't much like the board. So I went 9900KF/Z390, caught a couple saaaweet deals on motherboards (both are excellent CPU AND memory overclockers), and they have been a lot of fun to play with. Picked up a couple sets or three of memory to OC during the lockdown, and that's where I'm at currently.
Looking forward to seeing what AMD does next in the CPU world, but I'm an OC junkie, so I likely won't build anything AMD until there's something to OC! Hopefully sooner rather than later, but I'm not holding my breath.
The new round of GPUs should be exciting, so I'm looking forward to what AMD does (you can do it!) while dreading NVIDIA's awful pricing.
 
So let's do some math. DDR4 came in 2014 and DDR5 will come around 2021-2023 ...
DDR4 for the first 2 years was overpriced and too slow. You could get DDR4-2133 for a lot more than DDR3-2133. New tech is usually much more expensive and just as fast or slightly slower than the old, as you start at the bottom again with efficiency. For example, 14 nm is beating 10 nm right now because Intel has squeezed 14 nm to its limits while 10 nm is in its infancy. Buying 10 nm when it first comes out next year would be foolish in that respect, even if it's new tech. I think the same will happen with DDR5. It'll be about as fast as mid-range/high-end DDR4 while costing a lot more.
 
I am opposing the idea that there's always something better on the horizon, and you'll end up waiting forever.
But that is opposing a simple fact! There "IS" always something better on the horizon or just around the corner. The problem is, computers consist of hundreds if not thousands of different technologies, and they all progress and advance on their own timetables.

And that mismatch of timetables is what makes your argument below (as written) invalid.
So here's my argument. You do research, you take notes on incoming tech releases for CPUs, GPUs, etc., you make a timetable, and you compare the benefits of waiting for them instead of building now.
The next generation CPU is NOT going to be released at the same time as the next gen GPU. Neither will the next gen SSD, SATA, USB, PCIe, RAM, HDMI, DisplayPort, or [fill in the blank]. They all have their own timetables.
So, what say you? Do you side with build when you want or smart upgrading?
Yes.

First priority for my new build (or upgrade) is based on "need". If I "need" a new computer or upgrade "now", I will research what is out there "now" and buy "now".

If I don't need it now, I will take the time to research deeper, see what is out there now, build up my budget, then pull the trigger. I don't buy cutting edge technologies because inevitably it will be more expensive, have fewer available options and is likely to still have a few bugs in it.

Then, after I have my new build up and running, I will stop looking at what is just around the corner, for I know there will ALWAYS be something that will supersede what I just bought, regardless of how close to the cutting edge my upgrade was. :(

So IMO, it is "smart" upgrading to build when you want - just don't buy unproven, cutting edge technologies that are still bleeding all over the place.
 
I feel this thread needs to be refined in terms of what tech you are talking about. Today the traditional update paths have been reversed in terms of GPUs vs CPUs. With AMD's scheduled launches, and with Intel now offering Hyper-Threading support across its entire lineup, there is a plethora of upgrade choices that would be tangible even if someone did their build as late as 2017. GPUs are faster overall than they have ever been, but they have also been priced into the stratosphere, making upgrading a less frequent proposition. There are probably a ton of Vega users waiting for Big Navi. Having said all of that, though, there have been compelling releases. DDR4 was slower than DDR3 when it launched, but DDR4 is now as high as 5200 MHz. The other thing about DDR4: do a test with 3000 MHz vs 3600 MHz on AM4 and you will smile. Storage has definitely come a long way since the 3570K, too.
I am of the opinion that there is such a thing as a best time to build a new rig. ...
If everyone subscribed to that train of thought, there would be no PC market. The innovations you are looking for in 2021 would not be there without the instant-gratification buyer, and the PC is the best example of a consumer product that is almost eternally gratifying. What you are talking about is not wrong, though, as there are plenty of enthusiasts who subscribe to your train of thought. I would say, though, that there is so much choice in today's market it can be confusing. The 3300X is in some ways a much better CPU than the 1700 (my own experience) but $200 cheaper too. I know for me, I bought a 3300X for my AM4 system because I am saving my marbles for TRX40, as even X670 will not have the I/O that that chipset provides, so I also subscribe somewhat to your train of thought.
 
I upgrade when I can afford it, or when there's a good deal or something new that I really want, assuming I can afford it.
Sometimes it's just one part, sometimes it's most of the system. I have a bad habit of being an early adopter, but I hope I can stop myself from being that when it comes to PCs in the future, as this time around, the first three months weren't a great experience. Yes, everything worked, but it was not working as well as it could, nor did it do what it said on the box.

Obviously, as with some others here, this is my main hobby, and having worked as a tech journalist, I still follow the industry and obviously still work in tech, although not really doing PC-related stuff anymore. That way I think I have managed to avoid some "upgrade traps", as I have an idea of when new things are going to be out, and as such, I don't end up buying something at the end of its life cycle, unless it's something that's still very much a decent piece of kit. For example, I picked up my RTX 2080 for the same money RTX 2070 Super cards were going for, which made the upgrade itch a bit too much to resist, even though I wasn't going to buy a new card last year...

I don't think there's a point to always waiting for the next thing, since it can be three months or a year away. I mean, look at Intel, how many people are still waiting for their first 10nm consumer CPU? Sometimes there's something promising on the horizon that ends up failing miserably in the end. I would say AMD's graphics cards with HBM were sort of in that territory. Plenty of memory bandwidth, but not enough RAM and too expensive for the benefit it brought to the table in the consumer space. The latest, greatest technology isn't always what makes sense.
 
Although I always want new kit, where I live it's never really affordable, so I always buy my major upgrades on trips to the UK, where it's half of what I'd pay here.
FX-8320 to Ryzen 2600X was my last major upgrade. Most of it is driven by gaming, without question, and I always buy what has been on the market for at least a year.
I want to add a 1 TB NVMe drive and perhaps upgrade the RX 580, but it does make sense to wait a while for new tech to come down in price as newer tech emerges. That's my view anyway, but if I'm honest, most of it is driven by the size of my pocket.
 
My rule is: if a CPU sucks at 1080p, it's gonna suck at 1440p. 4c/4t is dead, while 4c/8t at the moment is actually good value; look at the 3300X. It's closer to the 3800X (8%) than the 3800X is to the 9900K/10700K (16%).
That is exceptional value. The same would go for the new i3s if Intel's platform/chipset strategy didn't suck.
(attached: relative gaming performance charts, 1280x720)
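For anyone who wants to sanity-check those percentages, here's a small sketch; the index scores are made up to be consistent with the gaps quoted above, not real benchmark numbers:

```python
# Relative-performance gaps from hypothetical index scores (3300X = 100).
scores = {"3300X": 100, "3800X": 108, "9900K": 125}  # 108 * 1.16 ~= 125

def gap_percent(slower: str, faster: str) -> float:
    """Percent advantage of the faster CPU over the slower one."""
    return (scores[faster] / scores[slower] - 1) * 100

print(f"3800X over 3300X: {gap_percent('3300X', '3800X'):.0f}%")  # ~8%
print(f"9900K over 3800X: {gap_percent('3800X', '9900K'):.0f}%")  # ~16%
```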


So, what say you? Do you side with build when you want or smart upgrading?
They're the same thing.
PC components don't lose value that quickly, except maybe 1st/2nd gen Ryzen/Threadripper.
If you build a good system now, it's gonna last no matter what's coming soon.

You want a smart purchase? Don't buy something you'll wanna sell next year for half the price.
 
I'm driven to upgrade based on need. The longer one waits, the more bang for the buck you'll get (generally).
 
I'm on the last gen when building for myself because it's a big money saver, and to tell the truth, it's not long before the new/latest is replaced. Plus I get my son-in-law's hand-me-downs ("he's just changed to a 3950X, so I'm hoping his soon-to-be-mine 9900K is on its way here :)").
 
Today the traditional update paths have been reversed in terms of GPUs vs CPUs.
DDR4 was slower than DDR3 when it launched, but DDR4 is now as high as 5200 MHz.
Huh? Who defined what the "traditional" update path was? I don't think there ever was a path. It all depended on what the computer started with. IMO, if there was a path, it was always to look at upgrading RAM first as that "traditionally" gives the most bang for your money (again - depending on starting point). Beyond that, some folks looked at bigger graphics cards (or replacing integrated with a card) and some looked at faster and/or more cores CPUs - again depending on their starting points. Today, many look first at replacing spinners with SSDs.

None of those upgrade paths are wrong - again depending on the starting point.

DDR4 was not slower. It started out at 800 MHz while DDR3 started at 400 MHz (DDR4-1600 vs DDR3-800), and right from the start it offered higher data transfer rates that got even higher. Plus DDR4 supported higher densities and lower voltages from the start.
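A lot of the disagreement here is MHz vs MT/s: DDR transfers data on both clock edges, so the marketing number is double the actual clock. A quick sketch of the conversion:

```python
def mts(clock_mhz: float) -> float:
    """Effective transfer rate: DDR moves data on both clock edges."""
    return 2 * clock_mhz

for name, clock in [("DDR3-800", 400), ("DDR4-1600", 800), ("DDR4-3200", 1600)]:
    print(f"{name}: {clock} MHz clock -> {mts(clock):.0f} MT/s")
```

So "DDR4 at 800 MHz" and "DDR4-1600" describe the same modules, which is why one poster's "2400 vs 2133" and another's "800 vs 400" can both be right.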
 
Huh? Who defined what the "traditional" update path was? ...
Going by my own experience and that of my friends, it made common sense to get a new GPU every 2 years (as the mid-range was $200) rather than replacing the CPU. Unless you were doing CPU-heavy tasks, there was no need to replace the CPU as often. Things like RAM count today, but I remember when you could get 16 GB of DDR3 for $35 (with heat spreaders). The starting point is important, and you are not wrong in the HDD replacement argument either. The thing is, now everything half-decent costs at least $100, and that goes from PSUs to cases to RAM.

When talking about DDR3 vs DDR4, I was referencing the OP's comment; he/she was looking at DDR4 at its entrance to the consumer market, so DDR4 was at 2400 max, I think, and DDR3 was at 2133, but that is years ago now.
 
Well, 2400 is still faster than 2133. And again, DDR4 went up from there.

As far as not needing to replace the CPU "as often", then again, that depends on your starting point. For you, that may be true. But for others, maybe their CPU budget was not so big during the initial build. Or maybe their computing habits changed and the tasks they do today are more CPU intensive than GPU intensive. All users are unique, just as each and every Windows computer is unique.

it made common sense to get a new GPU every 2 years

2 is an arbitrary number and in this case, simply represents your personal "anecdotal" experiences - not "common sense" and certainly not "traditional" practice across the industry. And that's fine - but what seemed like common sense to you should not be suggested as the norm for everyone or even most users.

IMO, it is NOT "common sense" to get a new GPU every 2 years. That's a waste of money and tells me you are not buying the GPU you really need in the first place. That may be due to budget constraints (and no shame there!), or just poor planning, but not common sense. Or, as I said above, computing habits/tasks may change. Not everyone on this site is a gamer, not all games are GPU intensive, and not all gamers play the same games.

Unless you "need" a new graphics card "yesterday", the "smart" policy is to take the time to (1) research, research and research some more, then (2) take the time to plan ahead and build up the budget so you can buy a card (or CPU, or PSU, or RAM or drive or whatever) that will last you many years into the future - not a mere 2 years.

Regardless, IMO, the smart policy is to build your own computer (or at least have it custom built for you) using ATX standard components. Don't buy a factory built computer and expect it can "evolve" over the next several years with upgrades.
 
I usually only upgrade when my PC is simply not up to my standards anymore (they ain't much) and is not capable of reasonably running what I need it for.

My previous PC, which had an i3-4160/GTX 950 in it, lasted me 3+ years, since I did not play anything too demanding at the time.

Then I just built a new system completely when I started playing new games; I rarely do single-part upgrades if possible (if I do, then it's my GPU).

As for the waiting part, nope, I'm not a fan of it and almost never care about it, unless the new mid-range/budget hardware is maybe 1-2 months away; then I might wait. (I have to wait more anyway, cause prices in my country are not funny, especially when something is new.)

On a side note, the upgrade itch is pure evil. Luckily I can ignore it most of the time, but once I couldn't, and I upgraded my GPU only to go back to playing the same game/s my previous card could already handle just fine.:kookoo::laugh:
 
If you look at the current offerings from both AMD and Intel, their CPUs are great for gaming and multi-tasking alike.
AMD will be easier on the wallet, and saving money is always important, as you can put that bit towards a new GPU/SSD and so on.
There is a refresh of AMD Zen 2 CPUs coming before Zen 3 arrives, but there isn't a date yet.

As for DDR4 to DDR5, there won't be any noticeable difference straight away. You are most likely to find a performance gap after 2 years or so, when DDR5 reaches faster speeds than DDR4 can. So no need to worry about RAM.

The GPU is where I would wait till you see the next gen from both AMD and NVIDIA, which is about 3-4 months away, because of the price increase on current GPUs. You might want to pay close attention to which GPU will most likely suit you, because they are rather expensive now, and most likely this trend will continue.

If you need to buy now, I would get the CPU/motherboard/RAM combo from either AMD or Intel. Six cores is where you want to be for a smooth gaming experience for the next 3 years, and if you want further future-proofing, eight cores.
 
I feel this thread needs to be refined in terms of what tech you are talking about. ...

If everyone subscribed to that train of thought, there would be no PC market. ...

OK, I feel like some people are comparing the users that will spend hundreds of dollars more for 5000 MHz DDR4 RAM for a 2% increase in performance vs those of us who buy a sensible, balanced system. I for one know that for Zen 2, and possibly Zen 3, the sweet spot for RAM is 3600 MHz. Same for DDR5: whatever the sweet spot ends up being, it will be found out 1-2 years after DDR5 has been on the market, once the RAM has come down in price and is more reliable. I don't care for, nor will I wait for, 7000 MHz DDR5 RAM before upgrading, because that would be foolish for me, a waste of time and money.
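For what it's worth, the usual explanation for that 3600 sweet spot is Zen 2's Infinity Fabric: memory runs 1:1 with FCLK, and most Zen 2 chips top out around 1800 MHz FCLK. A rough sketch of the relationship:

```python
def fclk_for(ddr_rate_mts: int) -> float:
    """FCLK needed to run 1:1 with memory: two transfers per fabric clock."""
    return ddr_rate_mts / 2

for rate in (3000, 3600, 4000):
    f = fclk_for(rate)
    note = "fine at 1:1" if f <= 1800 else "usually forces 2:1 mode"
    print(f"DDR4-{rate}: needs {f:.0f} MHz FCLK -> {note}")
```

Beyond 3600, the fabric typically drops to a 2:1 ratio and latency suffers, which is why the extra money buys so little.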

But that is opposing a simple fact! There "IS" always something better on the horizon or just around the corner. ...

I believe there are times when most timetables match, and so offer the opportunity for the best upgrade. Taking my example (DDR5 + USB4 + HDMI 2.1 + 3/7 nm CPUs and GPUs + PCIe 4.0 + NVMe SSDs), how can you say that this is not an opportunity worth exploiting? Nothing from that list will be outdated for years to come, except the GPU perhaps, but we've seen how good the 1080 Ti is even today, 3 years after launch. Still, the GPU is the most easily upgradable piece, provided the CPU is strong enough for it (most likely it will be, since 1440p is more GPU bound). As for PCIe 5.0, 4.0 won't be saturated by a single GPU by then.
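On the PCIe point, a rough bandwidth check (per direction, x16 slot, 128b/130b encoding) shows why a single GPU is unlikely to saturate 4.0 any time soon:

```python
def pcie_gb_s(gt_per_s: float, lanes: int = 16) -> float:
    """Usable bandwidth per direction with 128b/130b encoding, in GB/s."""
    return gt_per_s * lanes * (128 / 130) / 8

for gen, rate in (("3.0", 8.0), ("4.0", 16.0), ("5.0", 32.0)):
    print(f"PCIe {gen} x16: ~{pcie_gb_s(rate):.1f} GB/s per direction")
```

Even PCIe 3.0 x16 (roughly 15.8 GB/s) rarely bottlenecks a single card today, and 4.0 doubles that.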
 
I believe there are times when most timetables match
Oh? When?

I've been in this business for decades and have NEVER EVER seen new CPU, GPU, USB, RAM, PCIe, SATA, HDMI, Bluetooth, and you-name-it versions/standards all released at the same time! I mean, not even two at once! So "most"? Not even! Two at the same time? I don't think so.

It sure would be nice if they coordinated such releases, but it's not going to happen. Too many players would have to agree to hold off until everyone was ready. Do you really think AMD and Intel, AMD and NVIDIA, and ASUS and Gigabyte, etc., will agree to wait for each other? USB is the perfect example. You might be able to find a new Gigabyte motherboard with the latest and greatest USB standard. But what about a Fractal Design case? Will that ASUS motherboard support PCIe 5.0? If it does, will your next MSI graphics card? Will the PSU have the necessary power connector?
 