
Need EDID override fix

Fordified

http://forums.techpowerup.com/showthread.php?t=124843

It is a .NET Framework 2.0 project, so if you install .NET Framework 3.5 you should be good to go. You can get it via Windows Update, in the Software Updates section.


What operating system are you running?

I fixed this myself last night by modifying the monitor driver to include EDID override lines using the EDID my display was giving out. I then talked to FordGT90 last night about the problem; we worked together and he knocked up a nifty little program to do what I did automatically.

This program will fix incorrect EDID readings from HDTVs and HD displays, which result in a grainy picture when using NVIDIA cards with DVI/HDMI outputs.

http://downloads.solarisutilitydvd.com/Misc/EDID Override Tool.exe
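For anyone curious what the tool is working with: the display's native resolution lives in the EDID's first detailed timing descriptor (offset 54 in an EDID 1.3 block). This is a minimal sketch of parsing those bytes in Python, not FordGT90's actual tool; the synthetic EDID below is made up for illustration.

```python
def native_resolution(edid: bytes):
    """Read the first detailed timing descriptor (EDID 1.3, offset 54)
    to recover the panel's native resolution."""
    if len(edid) < 72 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not a valid EDID block")
    d = edid[54:72]                    # first 18-byte descriptor
    if d[0] == 0 and d[1] == 0:       # pixel clock of 0 means no timing here
        raise ValueError("first descriptor carries no timing")
    h = ((d[4] >> 4) << 8) | d[2]     # horizontal active pixels
    v = ((d[7] >> 4) << 8) | d[5]     # vertical active lines
    return h, v

# Synthetic EDID block carrying a standard 1920x1080 timing
edid = bytearray(128)
edid[0:8] = b"\x00\xff\xff\xff\xff\xff\xff\x00"
edid[54:62] = bytes([0x02, 0x3A, 0x80, 0x18, 0x71, 0x38, 0x2D, 0x40])
print(native_resolution(bytes(edid)))  # (1920, 1080)
```

If these bytes are wrong or misread over the cable, the driver recommends the wrong mode, which is exactly what the override INF corrects.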

After the program creates the .inf, go to Device Manager.

Expand Monitors.

Right-click your display and click "Update Driver Software".

Click "Browse my computer for driver software".

Find and select the modified INF.

Install it. When Windows warns that the driver isn't WHQL-signed, select "Install this driver software anyway".

After installation, reboot your PC.

Sorry I'm resurrecting an old thread. You can blame Solaris. I'm at work, otherwise I'd try it right now. I just saw this thanks to Solaris on Tom's Hardware and thought there might be a chance I can use my TV at 1080p and not have to settle for a smaller monitor just to hit that resolution.

I'm ignorant when it comes to this, but I am having the same problem on my 32" Sony Bravia TV. It is a 1080p TV and the input info on the TV screen displays 1920 x 1080, but the PC recommends 1366 x 768, and if I force it to 1920 x 1080 with NVIDIA's software it looks grainy and bad.

I am running Windows 7 64-bit and I have a GTX 560 Ti Superclocked card, which has two DVI outputs and a mini HDMI output (I use the mini HDMI with a converter). Will this .NET Framework app work for my purposes?

Any help you can provide would be greatly appreciated.
 

That happens simply because the TV has the capacity to display 1080p content, but the physical pixels it has are 1366 x 768. It's nothing to do with the software; you lose picture quality because the picture is scaled to fit the pixels.

I have a 32" Bravia (possibly the same) and the same thing happens. If you look at the specification in the manual, you'll see the physical resolution is 1366 x 768. That's not to say it's the same with all Bravias; there are quite a lot of 1080p ones, but the card usually recommends the physical resolution of the screen.
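The graininess follows from the arithmetic: neither 1920/1366 nor 1080/768 is an integer, so the scaler has to blend several source pixels into each panel pixel. A quick check:

```python
# Non-integer scaling ratios: a 1920x1080 signal shown on a 1366x768 panel
# can't map pixels one-to-one, so the scaler interpolates and softens detail.
for signal, panel in [(1920, 1366), (1080, 768)]:
    ratio = signal / panel
    print(f"{signal} -> {panel}: ratio {ratio:.3f} (integer: {ratio.is_integer()})")
```

An exact 2:1 or 1:1 ratio would scale cleanly; anything fractional like these ~1.4 ratios cannot.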
 

Well that's no good. I was hoping to squeeze that out of it. I guess I'll have to start researching a new monitor.
 

What's the exact model number of your TV?
 
Someone else posted with a similar problem not long ago; it was also a Sony:
http://www.techpowerup.com/forums/showthread.php?t=149747&highlight=1080p

He didn't post back after switching cables so I have to assume that fixed it. How long is the cable between the TV and computer?


EDID override addresses a very specific issue in NVIDIA cards. This is likely a TV/cable issue, not a graphics card issue.
 

The model number is KDL-32L504.


The cable between the TV and computer is 15 ft. I bought an HDMI cable rated for that length.
 