
Problem with HDTV as computer screen!

Weq

New Member
Joined
Jul 18, 2012
Messages
10 (0.00/day)
Hey guys! First post in these boards btw, woohoo! :)

Anyways, on to the topic. I'm about to lose it, I think that's the best way to put it.
I bought a UE32ES5505 HDTV, which I intended to use as a PC screen. Boy, I was in for a rude awakening. Never have I had so much trouble with a screen.

So, first off, my PC (which has a GeForce GTX 260 graphics card, if that has any relevance) can detect the screen just fine, and can actually extend the desktop between the HDTV and my old Samsung SyncMaster PC screen. The entire purpose I bought it for was to use it as a main screen and use my other as a secondary. I'm connecting the screen with a DVI to HDMI cable, because my GPU doesn't support HDMI. I sat by myself (and with a friend) and tried everything: naming the ports on my HDTV different things (PC, DVI PC and so on), and literally everything I could imagine being wrong, such as graphics drivers, ports, and cables (tried two different ones).

Now here comes the weird part. When I was about to give up, I thought I'd just try and change the port one more time: BOOM, my screen came on. It actually worked. Next day, when I turned on my PC, it still worked.

Now today, while playing DayZ, the screen suddenly went out. I'm back to square one with no idea what I did last time to get it to work, so I would really like some help. I've searched Google for every phrase I could imagine the problem would consist of, but found no help!

Sorry if my post's a bit confusing; English is not my first language and so on.

Thanks in advance!

-Weq
 
Does the TV have a VGA PC input? Try hooking your PC to the TV there and set that as the primary monitor in the NV control panel.
 
It doesn't have a VGA input. I do however have an older Prosonic TV, where when I use my DVI to HDMI cable, it's instantly detected.
 
Maybe the cable or the adapter is going bad.
Also make sure it is connected firmly at all ends and that the GPU is seated fully in the PCIe slot.
gl ^^
 
It's entirely new cables.
Also, it works instantly when connected to other devices
 
The weird thing is it worked for a short time, and then stopped.
 
This is going to sound stupid, but have you checked the source input?

Modern TVs have a bad habit of switching sources without being told to.
 
Yeah, I've checked it. I've literally switched ports like 20 times now, just to see if I can get any response somehow. It just won't detect it, even though my PC detects the HDTV fine.
 
Try connecting the TV to the port nearest the mobo and nothing to the other port and see how it goes. That should give you a reference point. It should work reliably like that.

Before you do that however, connect the Samsung monitor to that port and configure the driver/Windows to work with only one monitor. Also, do you have the latest graphics driver installed? There might be a configuration file (.inf) for your TV that might just help, as a long shot.

Finally, it's also possible that you've got a faulty TV. Try connecting another computer or console to it and see if it does the same thing.
 
It's most definitely not a cable issue, as it worked before, and I've also tried another cable. I can connect my PS3 to my TV fine. I just don't think it's a faulty TV when it worked for like 10 hours.
 
When you are connecting things up, are the tv and pc off?

Try this.

Only have the TV connected to the PC (no other monitors) (remember, both devices OFF).

Then switch the TV into standby mode, then power on the PC.

In theory the TV should wake up when this happens and use the PC as its input.
 
also, double check the refresh rate that the signal is being outputted to.
 
> Only have the TV connected to the PC (both devices off), switch the TV into standby, then power on the PC.

Did this twice; it didn't work the first time. After that I tried to change the DVI port my screen was connected to, and it got detected, but the TV didn't automatically recognize it as input. It does however work now, so thank you!

Now I'm just concerned it will stop working again :S
 
> It does however work now, so thank you! Now I'm just concerned it will stop working again :S

If it does it again just repeat the steps you know work and are recorded on this forum :toast:
 
all the best Weq!
 
> Only have the TV connected to the PC (both devices off), switch the TV into standby, then power on the PC.

And I have a problem with my TV now. It has worked on and off since I made this thread; I just can't get it to detect anymore. This method doesn't work for me anymore :(
 
Urban street story is that the HDTV "searches" for a DRM (HDCP) flag in the connected device and sometimes doesn't find it.

Update your Nvidia drivers, all to default.
Update your HDTV firmware (yes yes!), all to default.

Connect, select the correct source on the TV, and open a bottle of wine or champagne...
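For context on "detection" in general (not specific to this thread): over DVI/HDMI, the PC identifies the display by reading an EDID block over the DDC lines, and a flaky cable, adapter, or port can corrupt that handshake so the TV is never "seen". A rough sketch of how a 128-byte EDID block is validated, using a fabricated sample block (the header and checksum rules are standard; the sample bytes are made up):

```python
# How a PC-side driver sanity-checks the EDID block a display returns
# over DDC. If these bytes arrive corrupted, the display isn't detected.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_is_valid(edid: bytes) -> bool:
    """A base EDID block is 128 bytes, starts with a fixed 8-byte
    header, and all 128 bytes must sum to 0 modulo 256 (checksum)."""
    if len(edid) != 128:
        return False
    if edid[:8] != EDID_HEADER:
        return False
    return sum(edid) % 256 == 0

# Fabricated example: correct header, zero padding, and a final
# checksum byte chosen so the whole block sums to a multiple of 256.
sample = bytearray(128)
sample[:8] = EDID_HEADER
sample[127] = (256 - sum(sample) % 256) % 256

print(edid_is_valid(bytes(sample)))  # True for this well-formed sample
```

A single flipped bit anywhere in the block fails the checksum, which is one reason marginal cables cause intermittent "no signal / not detected" symptoms rather than a degraded picture.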
 
> Update your Nvidia drivers and your HDTV firmware, all to default, then select the correct source on the TV.

Yeah, that doesn't work. Everything's up to date.
 
Uh, Samsung customer service?
 
Cable issue.

Adapters do not play well with HDMI. Time for a new GPU if you want dual support for DVI and HDMI. It will keep having problems being detected.
Also, if you're going to run extended, both have to be the same resolution. The HDMI issue is because the signal goes both ways with 1.4, and when you use an adapter it stops this.
 
> Adapters do not play well with HDMI. Time for a new GPU if you want dual support for DVI and HDMI. It will keep having problems being detected.

I'm pretty sure that's false. I've been using a DVI to HDMI adapter for over 7 years now on 3 different GPUs (both ATI and Nvidia) and two different TVs, and never have I had a problem with the PC recognizing the brand and model of TV.

And as far as the two-way communication added with HDMI v1.4, I'm pretty sure that's audio only, regarding ARC. The only real caveat of using a DVI to HDMI adapter is that it's restricted to single-link DVI.
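As a side note on that single-link restriction: a quick back-of-the-envelope check with standard 1080p60 timing numbers shows it isn't a problem for this TV, since the required pixel clock sits comfortably under the single-link DVI limit of 165 MHz:

```python
# Sanity check: does 1920x1080 @ 60 Hz fit within single-link DVI?
# CEA-861 timing for 1080p60 uses a total raster of 2200 x 1125
# pixels (active area plus blanking intervals).
H_TOTAL, V_TOTAL, REFRESH_HZ = 2200, 1125, 60
SINGLE_LINK_DVI_LIMIT_MHZ = 165  # max TMDS pixel clock for single link

pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH_HZ / 1e6
print(pixel_clock_mhz)                                  # 148.5
print(pixel_clock_mhz <= SINGLE_LINK_DVI_LIMIT_MHZ)     # True
```

So a DVI-to-HDMI adapter can drive this TV at its native 1920x1080/60 without hitting the bandwidth ceiling; the single-link limit only bites at higher resolutions or refresh rates.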

Weq, did you follow this procedure as per your TV manual?

Using the TV with Your PC Using Your TV as a Computer (PC) Display

Entering the Video Settings (Based on Windows 7)

For your TV to work properly as a computer display, you must enter the correct video settings after you have connected the TV to your PC. Depending on your version of Windows and your video card, the procedure on your PC will probably differ slightly from the procedure presented here. However, the same basic information will apply in most cases. (If not, contact your computer manufacturer or Samsung Dealer.)

1. Click “Control Panel” on the Windows start menu.
2. Click "Appearance and Themes" in the "Control Panel" window. A display dialog box appears.
3. Click “Display”. Another display dialog box appears.
4. Click the “Settings” tab on the display dialog box.

–On the Settings tab, set the correct resolution (screen size). The optimal resolution for this TV is 1920 x 1080.
– If a vertical-frequency option exists on your display settings dialog box, select “60” or “60 Hz”. Otherwise, just click “OK” and exit the dialog box.


Source: http://www.elgigantenbusiness.se/pdf/E16C2435-1A5C-4FCC-9A23-BF241766461E.pdf

What OS are you using btw? W7 can be a bit more finicky on various resolutions precisely fitting the display, but XP can be more finicky on detection.
 
> Weq, did you follow this procedure as per your TV manual? What OS are you using btw?

Yeah, I've done all that, and resolution is not a problem at all. As soon as it's recognized, it will go 1920 x 1080. Now, I actually got it to work yesterday, in a really weird way. I had another HDMI to DVI cable, and I connected both. Suddenly it could detect one of them, even though I had checked all ports multiple times. I then disconnected the cable that it didn't detect, and now it's working fine... again. Still, I haven't found out what the problem is, and I've been in contact with Samsung. They're no help; they're just telling me the usual "have you updated, have you restarted" etc. etc.
 
Sounds like it is a bad plug. I purchased an HDMI cable a while back really cheap, but it kept dropping signal. I would have to go and reconnect the plug each time, and sometimes it just wouldn't work outright. Sometimes with cheap prices you get what you pay for. Just something to consider.
 