I agree that it's pointless.
Consoles, set-top boxes and PCs use practically the same digital video signal.
Just connect the consoles directly to the monitor, through an HDMI-to-DVI adapter.
As for the Wii, you can just use this cable.
There's really no reason to have the PC pass a video signal through just to play console games on a monitor.
You need a Wii component cable to go with that; the one included with the console is composite only.
The Wii component cable is $30, and this component-to-VGA converter is $40. That's $70 I'd have to spend just for the connection. But wait: here the Wii cable probably costs €25 and the converter another €35, which works out to even more.
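Just to sanity-check the totals above, here's a rough comparison of the two buying options. The exchange rate is my assumption (roughly what USD/EUR was around that time), not a figure from the thread:

```python
# Rough cost comparison: buying from the US site vs. buying locally.
# USD_PER_EUR = 1.30 is an assumed exchange rate, not from the thread.
USD_PER_EUR = 1.30

us_total_usd = 30 + 40        # component cable ($30) + component-to-VGA ($40)
local_total_eur = 25 + 35     # same parts bought locally, in euros
local_total_usd = local_total_eur * USD_PER_EUR

print(us_total_usd)                   # 70
print(round(local_total_usd, 2))      # 78.0 -> local option really is pricier
```

So at that assumed rate, €60 locally comes out to more than the $70 from the US site.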
And I can't buy from the site you provided, because I don't live anywhere near there, and I can't pay by card because I don't have an international credit card.
I can get the Wii component cable here, but I'm not sure about the component-to-VGA converter.
I'm sure ATi could bring back the AiW series, or something with multiple capture inputs, if they REALLY wanted to, but it would be so much more expensive now.
Back then everything was analog and SD resolution; basically everything was S-Video in. But now you would have to support HDMI and HD capture, and HDMI capture isn't cheap. There are technical reasons and, IIRC, royalty reasons too. So they could at least put a single HDMI input on the card, but it would add a lot to the price of the hardware. Just look at HDMI capture boxes: they cost more than almost any video card, just for that!
But then, oh wait, just like you said, the Wii doesn't have HDMI output. Only Composite, S-Video, Component and RGB/VGA (I think), which are of course all analog.
So it would be crazy awesome to have a capture device built into modern video cards with all the inputs: HDMI, Component/RGB and Composite/S-Video. I'm sure ATi/AMD of all people could do it if they really put their minds to it. But now the price is really jumping up: extra upconverter hardware would be needed to handle SD and HD resolutions, plus analog-to-digital converters, and the card would get bigger. Remember, AiW cards already had lower clock speeds than the normal versions of the same cards because of the extra hardware on them; the whole card needed more power and had less room for cooling.
The other reason AiW went away right around the time Windows Vista came out was that the cards straight up didn't work in Vista.
I think it has something to do with the fact that an AiW card is seen as two devices: one being the video card, and the other being the TV tuner/capture device. Vista didn't support two devices on one slot (or IRQ, or something like that). I don't know for sure, but I remember that being a problem, and ATi saying it wasn't something they could work around with software/driver changes.
Like I said, I'm sure AMD could do it, but it would be very expensive. There are some good TV tuners out there if you look. Most of the current Hauppauge PCIe cards work fine in Windows 7, and there are a few HDMI capture boxes out there too. But like I said, they are expensive.
Now, I see this more as something for recording purposes. As many have already pointed out, if you just want to play games on an LCD monitor, then all of this is a HUGE waste of time, money and energy!
Most monitors have DVI and/or HDMI inputs. So either get a simple $5 DVI-to-HDMI adapter or an HDMI cable, and plug your PS3 or 360 right into your monitor. If your monitor has both DVI and HDMI inputs, you might even be able to switch between the two (PC and game) right on the monitor! Simple, straightforward, and no converting required. If DVI and HDMI are there... use them!
As for the Wii: well, I'm not sure, but getting yourself a VGA box for it might just be the best option. Once again, converting analog component from the Wii to digital for HDMI/DVI isn't going to be cheap. The Wii should have had HDMI output in the first place, but hey, it should have supported HD resolutions too.
How would I run a TV tuner card alongside the Radeon at the same time?
Well, if that's the problem, then they should add HDMI and DVI inputs too. What's the big deal? I don't get it. Just take the signal and send it to the monitor; it only passes through a few wires on the GPU. There wouldn't even be a need for a third-party converter if it were implemented and processed by the GPU core itself. There's no need for capturing or anything: everything would happen inside the GPU, and Windows wouldn't know a thing about it. This means ATI would have to develop the tech on their own, but since their GPUs sell well and will keep selling, there's nothing to lose. It's not like people would stop buying the cards because of this.
What you can get, at a premium, is exactly that: the third parties exist only to provide the alternatives, and they charge for it. That's why all the cables are so expensive; they charge because it WILL sell to practically everyone. It's like food and drink: people can't buy the thing if it doesn't exist. The tech is ridiculous; one adapter can do what a VGA box can, and I've seen such an adapter on Amazon. VGA boxes are just a selling point that costs so much because, wow, you can record console video. I'm not paying €50 for that, only for the picture to suck anyway...