I have run a few cables through the house to get the HTPC on a wired connection instead of wireless. This is the setup: one port on the gigabit router connects into a wall outlet via a 2m CAT6 cable. Through the wall, a 10m CAT6 cable runs to the back of another wall outlet. That second outlet is a double, so two 2m CAT6 cables connect it to the gigabit switch; one carries the link from the router and the other goes downstairs. The outlet that goes downstairs has a 30m CAT6e cable running through the wall.

Anyway, on to the point. When the HTPC is running downstairs, it only negotiates a 100Mbit connection, not 1.0Gbit. I found this odd, as previously on Vista I am 90% sure it got 1.0Gbit; now that it is on 7 it only runs at 100Mbit. I took the HTPC upstairs and connected it straight to the switch (via the 2m cable that normally runs from the downstairs-facing wall outlet to the switch), and sure enough, it gets 1.0Gbit.

Now this is the part I find weird. When downstairs (connected at 100Mbit), it transfers a particular file from my computer at 10-11MB/s, which is almost maxing out the 100Mbit connection. However, when I transfer the same file from the same computer to the same location with the HTPC upstairs at 1.0Gbit, it only sends at 5-5.5MB/s. The ONLY difference between upstairs and downstairs is the 30m cable in the path; everything else is exactly the SAME. So how on earth is it transferring faster with a 30m cable in the way, especially when the link is reported as slower?

I also assume the 30m cable is the reason it comes up as 100Mbit... which is odd, as it did say 1Gbit on Vista (and I have changed drivers on 7, which shouldn't make a difference anyway, since upstairs it says 1Gbit). Would the cable be the cause?
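To put rough numbers on why this looks backwards to me: a 100Mbit link should top out at around 12MB/s once Ethernet/TCP overhead is taken off, so 10-11MB/s is essentially wire speed, while 5-5.5MB/s on a 1.0Gbit link is nowhere near its ceiling. A quick sketch of that arithmetic (the ~95% efficiency figure is just my assumption for overhead; the exact number varies):

```python
def max_transfer_mb_per_s(link_mbit, efficiency=0.95):
    """Approximate best-case file-transfer rate in MB/s for a given link speed.

    Divides by 8 to convert megabits to megabytes, then scales by an
    assumed protocol-overhead efficiency factor.
    """
    return link_mbit / 8 * efficiency

print(max_transfer_mb_per_s(100))   # ~11.9 MB/s: 10-11 MB/s is close to maxed out
print(max_transfer_mb_per_s(1000))  # ~118.8 MB/s: 5-5.5 MB/s is a fraction of this
```

So the downstairs transfer is link-limited, while the upstairs transfer must be bottlenecked by something other than the network link.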