newtekie1
Semi-Retired Folder
Networks are everywhere in our lives at this point. The average home in the US has 5.7 internet-connected devices, and that number is only increasing. There are cell phones, media streaming devices like Chromecast or Amazon Fire devices, and of course computers. All of these devices connect to your home network through some type of connection. Today I set out to test some of the popular networking methods available and see how they might affect the tasks we perform. For this test I'll be looking at various ways to connect a desktop computer to your network. Some of this will apply to laptops too. However, because mobility is usually key for laptops, and you'll almost always be using the internal card provided with the laptop, you'll probably just use the best wireless version you can.
Before we begin I'd like to address something everyone should know when we talk about networking: the difference between Mbps and MBps. Capitalizing that one letter makes a big difference. Mbps stands for megabits per second; MBps stands for megabytes per second. There are 8 bits in a byte, so an Mbps figure will be 8 times the equivalent MBps figure. To convert MBps into Mbps, you multiply the MBps number by 8. Also, Mbps and Mb/s are the same thing; the slash stands for "per" just like the p. Finally, the M should always be capitalized; writing mbps instead of Mbps doesn't change the meaning when talking about networking, but Mbps is the correct form. I'm going to stay as consistent as I can through this and use Mbps as much as possible. I may use MBps every once in a while during the write-up, but all the results will use Mbps.
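Since this conversion trips people up constantly, here is the arithmetic as a trivial Python sketch (the example values are just illustrations, not test results):

```python
def mbps_to_MBps(mbps: float) -> float:
    """Convert megabits per second to megabytes per second (8 bits per byte)."""
    return mbps / 8

def MBps_to_mbps(MBps: float) -> float:
    """Convert megabytes per second to megabits per second."""
    return MBps * 8

# A Gigabit Ethernet link: 1000 Mbps is 125 MBps.
print(mbps_to_MBps(1000))  # 125.0
# A RAID array doing 150 MBps would need a 1200 Mbps link to avoid being the bottleneck.
print(MBps_to_mbps(150))   # 1200
```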
Let's look at the contenders as well as my testing setup before we get into the numbers.
First, my testing setup.
The file server used in the testing, as well as the wireless router, is located in the basement of my home, in the northeast corner. The server is wired to the router with Gigabit Ethernet. The computer I'm using to run the network tests is located on the ground floor of my home, in the southwest corner. The server has a set of four WD Red 3TB drives in RAID5 on a HighPoint RAID controller. The RAID5 array is capable of over 150MBps sustained reads and writes, which is more than enough to saturate a Gigabit Ethernet connection. The computer I'm using has a 1TB 7200RPM Seagate SSHD. Before anyone says it: yes, I know the single hard drive may be a bottleneck in some of the tests, but it is all I had available. Plus, you'll see it is fast enough to get the point across.
The router is an ASUS RT-AC66U running Shibby TomatoUSB firmware. It is a simultaneous dual-band Wireless AC1750 rated router with Gigabit Ethernet ports. The router is connected to my Comcast internet connection, which is rated for 105Mbps download and 25Mbps upload. My internet connection is very consistent, delivering over 100Mbps download and over 30Mbps upload almost all the time. The tests were performed in non-peak hours just to be sure.
Let's look at the contenders.
Wireless N 2.4GHz
Pretty much the standard networking we find now. In the last few years Wireless N 2.4GHz has come to replace Wireless G 2.4GHz, which was the standard for far too many years. Because Wireless G was the standard for so long, there are still a lot of devices around that only work with it. Wireless N 2.4GHz has some benefits, the main one being that it is backwards compatible with Wireless G, so Wireless G devices can still connect to Wireless N 2.4GHz networks. And no, when a Wireless G device connects to a Wireless N network it does not force the entire network to run at Wireless G speeds. The downside to Wireless N 2.4GHz is that the 2.4GHz band can be very congested. In my area alone there are roughly 25 wireless networks using the 2.4GHz band that I can see from my house; there is not a single channel that doesn't have a wireless network running on it. This means that in my house there are 25 sources of interference on the 2.4GHz band. Plus Bluetooth runs on 2.4GHz, a lot of wireless mice use 2.4GHz, microwaves put out 2.4GHz interference, etc. Wireless N 2.4GHz is limited to a theoretical maximum of 600Mbps. To achieve this, however, there would need to be absolutely no interference, which would allow the network to run with 40MHz-wide channels, and you'd need a router and network card that both have 4 antennas. Each antenna gives 150Mbps when running at 40MHz channel width, for a total of 600Mbps. This is almost never the case. Most consumer wireless cards only have 2 antennas, and network congestion forces 20MHz channel width. Each antenna running at 20MHz channel width gives 72Mbps. This means in most cases Wireless N 2.4GHz will only give a theoretical maximum of 144Mbps. For this test I'll be using an Intel Mini-PCIe card that has two external antennas.
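The antenna math works out to a simple multiplication; note that 2 × 72 gives 144Mbps for the common case. The 72Mbps and 150Mbps per-antenna figures are the commonly quoted approximations, not exact values from the 802.11n spec:

```python
# Approximate 802.11n link rate per antenna (spatial stream), keyed by channel width.
RATE_PER_STREAM_MBPS = {20: 72, 40: 150}

def wifi_n_max_rate(antennas: int, channel_mhz: int) -> int:
    """Theoretical maximum 802.11n link rate in Mbps."""
    return antennas * RATE_PER_STREAM_MBPS[channel_mhz]

print(wifi_n_max_rate(4, 40))  # 600 -> best-case 2.4GHz with 4 antennas
print(wifi_n_max_rate(2, 20))  # 144 -> the typical congested 2.4GHz case
print(wifi_n_max_rate(2, 40))  # 300 -> the typical 2-antenna 5.0GHz case
```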
Wireless N 5.0GHz
This is Wireless N running on the 5.0GHz band, obviously. It is not backwards compatible with Wireless G, but it does offer other benefits. A tidbit of info: most people believe Wireless N was the first form of WiFi to use 5.0GHz. That isn't the case; the Wireless A standard uses the 5.0GHz band as well, and has been around forever. Wireless A is actually older than Wireless G. However, it is so rarely used it isn't really worth considering. The main benefit is that the 5.0GHz band is relatively unused compared to the 2.4GHz band. Using my area again, my network is the only one I can see on the 5.0GHz band. This lack of congestion means Wireless N 5.0GHz has a much better chance of running at 40MHz channel width. That gives Wireless N 5.0GHz with only 2 antennas a theoretical maximum of 300Mbps in most use cases. For this test I'll again be using an Intel Mini-PCIe card that has two external antennas.
Wireless AC 5.0GHz
Wireless AC is the newcomer to the party in terms of wireless. It was released in 2014, and it only uses the 5.0GHz band. The standard allows for up to 80MHz channel widths with up to 8 antennas. The 80MHz channel width allows for up to 433Mbps per antenna. Most consumer Wireless AC routers have either 3 or 4 antennas. My ASUS RT-AC66U has 3, which gives it a theoretical maximum speed of 1300Mbps. Most consumer AC wireless cards have either 2 or 3 antennas, giving a theoretical maximum of 866Mbps for 2 antennas and 1300Mbps for 3 antennas. I'm using the PCIe x1 TP-Link Archer AC1750 wireless card for my testing, which has 3 antennas.
Powerline 500Mbps
Powerline networking has been around for over 10 years. In fact, the first standard for home use was released way back in 2001, though the maximum speed was very low: the first kits had theoretical maximum speeds of only 12Mbps. At the time that wasn't all that terrible. The standard home network was 100Mbps, and it wasn't uncommon to find 10Mbps switches...errr, hubs...in use in homes. With the internet connections of the time, these were more than sufficient to bring an internet connection to a computer. The biggest problem with the powerline adapters of this era was that there wasn't a real standard that allowed adapters from different manufacturers to work together. This changed in 2005 when the first HomePlug AV standard was released. The theoretical maximum speed increased at the same time, to 200Mbps, which again was fine for the time, as a lot of home networks were still 100Mbps. Most adapters were 85Mbps, though; the 200Mbps adapters were rare and very expensive. In 2010 the standard speed was raised to 500Mbps, though some manufacturers advertise 600Mbps for some kits. This is probably when powerline kits started to become more widely used in the mainstream market. The manufacturers had all pretty much adopted the standard, so adapters from different manufacturers could work together without hassle, and the consumer market had forgotten the time before the standards were adopted. Now, the interesting thing about a lot of 500Mbps adapters is that they only have 100Mbps Ethernet ports. A lot of people say this is the manufacturers trying to rip you off, or that it's because they know you'll never get 500Mbps. And yes, it's true, you won't get 500Mbps between two adapters. However, one large disadvantage of powerline networking is that the bandwidth is shared between all the adapters.

So while two adapters might not get close to 500Mbps, when you add a 3rd or 4th adapter the 500Mbps limit means the extra adapters don't affect the first two adapters' connections nearly as much. In my experience with 4 adapters, adding the 3rd and 4th had no real effect on the speeds I was getting between just two of them, even though 3 adapters were basically feeding off the one plugged into my router, which provides the connection to my internet and my server. For this test I'm using two TRENDnet 500Mbps nano powerline adapters.
Powerline 1200Mbps
This is a new standard, released very recently, and it borrows some technology from wireless. The main technology borrowed is MIMO, which stands for Multiple-Input Multiple-Output and is what allows wireless to use multiple antennas. In this case MIMO allows the powerline adapters to use more than one of the power lines in your house. With the old standard, all the adapters in the home communicated over only one of the wires in your electrical system. These new 1200Mbps adapters use, I believe, two of the wires in your electrical system. This is supposed to provide better speeds. We'll see about that. For this test I'll be using two Netgear 1200Mbps powerline adapters.
Gigabit Ethernet
The good old directly wired 1000Mbps Ethernet connection. It has been in the consumer market for a good 10+ years, albeit rather expensive 10+ years ago. How expensive? My first 8-port gigabit switch was a Linksys I bought in 2003; it cost over $200. But now Gigabit Ethernet is cheap, and cables are cheap. The most expensive part can be the cost of running the wires through the walls to get them where they need to go. Because I don't have an Ethernet jack run to the room the test computer is in, for this test I ran a 100ft Cat6 cable directly from my router in the basement, halfway across the basement, up the stairs, and halfway across my first floor to the computer. (The wife found this amusing...)
Now for the part we've all been waiting for!
The Test Results.
Internet Connection Speeds
For this test I ran an internet speed test using www.testmy.net. I ran 3 tests on each connection and took the average of the 3 for the results.
Local File Transfers
For this test I transferred about 1GB of data off of my server, then transferred it back. There are 342 files in the transfer, ranging in size from 9KB up to 512MB. I used a script that does the transfer and, at the end, reports the exact time taken and the speed of the transfer.
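I'm not posting my exact script, but a minimal sketch of the idea in Python would look something like this (the commented-out paths at the bottom are placeholders, not my actual server share):

```python
import shutil
import time
from pathlib import Path

def timed_copy(src: Path, dst: Path) -> float:
    """Copy a directory tree, report elapsed time and average speed, return Mbps."""
    total_bytes = sum(f.stat().st_size for f in src.rglob("*") if f.is_file())
    start = time.perf_counter()
    shutil.copytree(src, dst, dirs_exist_ok=True)
    elapsed = time.perf_counter() - start
    mbps = (total_bytes * 8) / (elapsed * 1_000_000)  # megabits per second
    print(f"{total_bytes} bytes in {elapsed:.2f}s = {mbps:.1f} Mbps")
    return mbps

# Hypothetical paths -- substitute your own server share and local folder:
# timed_copy(Path(r"\\server\share\testdata"), Path(r"C:\temp\testdata"))
```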
Ping
For this test I pinged the router 100 times and took the average of all the pings as well as the maximum ping. The reason I'm pinging the router and not a website is that we are only concerned with the latency added by the method used to connect to the router; the latency beyond the router, out to the rest of the world, is something you have no control over. The maximum ping is an important figure as well, especially for online gamers. Ping spikes are what cause perceivable lag. We've probably all experienced playing on a server where everything seems fine, pings are fine, but suddenly the ping jumps, everything stutters, and you end up dead. Ping spikes suck.
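The stats themselves are simple; here's a sketch in Python, assuming you've already collected the round-trip times (e.g. from 100 runs of the system ping command). The sample numbers are made up to illustrate the spike problem:

```python
def ping_stats(rtts_ms: list[float]) -> tuple[float, float]:
    """Return (average, maximum) round-trip time from a list of samples in ms."""
    return sum(rtts_ms) / len(rtts_ms), max(rtts_ms)

# A link with a low average but one big spike -- the spike is what you feel in a game.
samples = [1.2, 1.1, 1.3, 1.2, 48.0, 1.1]
avg, worst = ping_stats(samples)
print(f"average {avg:.1f}ms, maximum {worst:.1f}ms")  # average 9.0ms, maximum 48.0ms
```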
Conclusion
Well, this was interesting, that is for sure. Before I get into the main conclusion I want to say this hasn't been sponsored by anyone; I paid for everything used in this article out of my own pocket.
Now, my thoughts. Obviously old trusty Gigabit Ethernet is the way to go. But if that isn't available, what you go with depends on what is important to you. If you are going for maximum bandwidth, like if you are doing a lot of local file transfers or big downloads/uploads from the internet, then Wireless AC is clearly the way to go. The downsides here are cost and the high maximum ping. For online gaming that high max ping will likely cause problems. As for cost: the router I have costs $150, and the entry cost for a decent AC1750 router starts at $100. Also, don't even bother with a router that doesn't have external antennas, like the Linksys EA6500; the poor signal quality caused by the internal antennas means you're likely never going to get a decent connection speed, even with Wireless AC. Then the network card is another $70. So a decent Wireless AC setup can easily cost $200 or more.
So, let's say that cost is beyond what you are willing to pay; what next? I'm going to say it right now: the Powerline 1200 is completely not worth it. The very slight file transfer speed increase does not warrant more than double the price. If you live in any kind of urban environment, Wireless N 2.4GHz is out of the question as well. I'm almost positive those super low file transfer and internet speeds are because of the interference caused by all the other 2.4GHz networks in my area. So you're looking at Wireless N 5.0GHz or Powerline 500Mbps.
Well, again, the first thing I look at is price. A decent Wireless N router with two external antennas can be had for $40, and a dual-antenna PCIe Wireless N 5.0GHz network card can be had for $25, though a lot of motherboards now come with Wireless N network cards included. The Powerline 500Mbps kit costs $40. There is a good chance you'll be buying a router anyway, and the minimum you'll want to spend on one is $40, so the Wireless N 5.0GHz solution may be cheaper. If you already have a router, though, then Powerline 500Mbps would be cheaper.
However, cost might not be the deciding factor if you plan to play online games. Again, high maximum pings plague Wireless N 5.0GHz and could cause problems with online games. So you may want to go with Powerline 500Mbps no matter what.
In the end, considering all the factors, Powerline 500Mbps comes out the winner here if Gigabit Ethernet isn't possible. Cost is low, and performance is pretty much equal to Wireless N 5.0GHz. However, one place it excels is ping. I hear people say that Wireless AC makes powerline obsolete, and in part that is true for file transfers, if cost isn't a concern. However, even Wireless AC has ping issues. Latency has always been an issue with wireless, and Wireless AC hasn't improved any here.