Black bars on TV with ATI/AMD video

2011/12/05

Categories: GeekStuff

Okay, this one took me a while to figure out, and it turns out there’s no solution I can find. (Before you rush to link me to the articles on this: yes, I know about the overscan setting; I messed with it, and it did not do what I need.)

Problem: Hooking up a laptop with an ATI video chipset to a flat panel TV, I get black borders around the display, and everything looks like crap.

Solution: None, really.

Background: In days of long ago, our ancestors used analog devices. For instance, a display did not actually have a fixed number of well-defined pixels; rather, it had an area of phosphors which were hit by an electron gun, and you could smoothly adjust the size of the area the gun worked on. The corners of the display were usually unreliable, so what people did in general was arrange to send slightly more signal than you could physically see, so the picture would go all the way to the edges of the screen. Of course, this meant that some of the graphical signal was invisible to the user. So on things like the old Amigas, you’d have software controls for the actual size of the image to send, and knobs on the monitor to adjust how it displayed things, and you’d mess with these to get more pixels and still be able to see them all.

Modern displays, such as LCD TVs (and “LED” TVs are still LCD displays; they’re just backlit by LEDs instead of fluorescent tubes), do not work like this. The “panel” has a fixed resolution, the entire panel is visible, and the corners are not particularly special.

When working with old-style TVs, “overscan” controls were extremely useful. With new-style TVs, they’re still reasonably useful when using analog inputs (because the conversion hardware is going to be making guesses about which parts of the signal to translate to the screen), but they’re mostly optional. With new-style TVs and digital inputs, they are basically worthless.
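
To make “worthless” concrete: on a fixed-resolution panel fed a digital signal, every pixel of the signal already corresponds to exactly one pixel of glass, so any amount of over- or underscan forces a rescale. Here’s a rough sketch of what even a small rescale does; the 2% figure and the nearest-neighbor sampling are just illustrative assumptions, since real scalers interpolate (which is why you get blur instead of missing stripes):

```python
# Rough illustration, not any specific TV's scaler: shrink a 1920-pixel-wide
# image by 2% with nearest-neighbor sampling and count how many source columns
# never get drawn at all. Real scalers blend instead of dropping, which is why
# the result is blur rather than missing stripes.
WIDTH = 1920
SCALE = 0.98                        # 2% underscan on the horizontal axis

panel_cols = round(WIDTH * SCALE)   # columns the shrunken image actually covers
sampled = {round(x / SCALE) for x in range(panel_cols)}  # source columns that survive

print(f"image drawn in {panel_cols} of {WIDTH} panel columns")
print(f"source columns that never reach the screen: {WIDTH - len(sampled)}")
```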

Which brings me to the weird part: the Catalyst Control Center (the ATI/AMD driver) does not appear to let you disable this behavior. It starts out set to significant underscan by default, and while there is a control for it, every new graphics mode defaults right back to significant underscan. Worse, the underscan/overscan control reports a range of “-15” to “0”, but at least on the display I have access to, 0% is actually overscan. Which is to say, if you size the display up to “0%”, the outside edges of the picture are clipped.
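
For what it’s worth, here’s my guess at what those slider numbers mean in pixels. This assumes the percentage is applied linearly to each axis of a 1920x1080 mode, which AMD doesn’t actually document, so treat it as a sketch rather than gospel:

```python
# A sketch of what the CCC slider values might mean in pixels, assuming the
# percentage scales each axis of a 1920x1080 mode linearly (my assumption;
# the driver doesn't document the exact math).
NATIVE_W, NATIVE_H = 1920, 1080

def underscanned_size(percent):
    """Image size the driver would send for a given underscan percentage."""
    factor = 1 + percent / 100.0    # percent is negative, e.g. -15
    return round(NATIVE_W * factor), round(NATIVE_H * factor)

for pct in (0, -5, -10, -15):
    w, h = underscanned_size(pct)
    print(f"{pct:>4}% -> {w}x{h} image centered on the {NATIVE_W}x{NATIVE_H} panel")
```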

This could just be the TV (a Dynex 42E250A12), but I tried it with a MacBook Air (nVidia 320M) and with an Acer TimelineX 3830 (nVidia 540M), and both of them work perfectly well at 1920x1080, with every pixel lined up.

With the ATI chipset, things are worse. The overscan/underscan control is a slider with about 15 positions, and none of them corresponds to an exact 1:1 display. About 2/3 of the way across the slider, there’s a pair of positions such that at one, there are a few black pixels below the image, and at the next, a few pixels on the left side of the screen are cut off. Which is to say… none of them does what the nVidia hardware does, which is to just trust the display’s claimed resolution and stop trying to outsmart it.
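
And “trust the display’s claimed resolution” isn’t magic, either: the TV hands its native mode to the video card in its EDID block. Here’s a minimal sketch of reading it back, assuming a Linux box where the kernel exposes the EDID in sysfs; the connector name is just an example for illustration:

```python
# A minimal sketch, assuming a Linux machine that exposes the TV's EDID via
# sysfs; the connector name below is just an example, adjust it for your setup.
# The first 18-byte detailed timing descriptor (offset 54) is the panel's
# preferred/native mode.
EDID_PATH = "/sys/class/drm/card0-HDMI-A-1/edid"

with open(EDID_PATH, "rb") as f:
    edid = f.read()

dtd = edid[54:72]                            # first detailed timing descriptor
h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)   # horizontal active pixels
v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)   # vertical active lines
print(f"display claims a native mode of {h_active}x{v_active}")
```

That preferred mode is, as far as I can tell, exactly what the nVidia driver honors and the Catalyst driver second-guesses.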

Research reveals that this is a very common complaint, and that for some people with some displays, “0% underscan” appears to produce the expected behavior. I can’t say why it isn’t working for me, but I am not too inclined to blame the TV, simply because it’s working with the other devices I’ve tried it with.

Interestingly, this doesn’t happen with any monitor I’ve ever tried, only with a TV. Obviously, ATI’s hardware is capable of displaying cleanly on a digital panel over a digital connection; they just chose not to do it for TVs.

Why is a bit of a mystery, but here’s my guess: There exists, somewhere, a set of driver requirements, and one of them is that for TVs, the display shall have overscan/underscan controls because TVs need that, and no one has updated this list of requirements since the 1990s.

Comments [archived]


From: Dave Leppik
Date: 2011-12-05 10:44:05 -0600

My mother in law has the same problem with her Mac laptop connected (via HDMI) to her TV. It’s really frustrating to me, since I know I should be able to find a resolution that works perfectly, but I can’t. This is a stock MacBook (not sure which one), and Apple provides a certain number of resolutions particular to the display, except that none of them are particularly good.


From: Sijmister
Date: 2011-12-31 04:16:00 -0600

I have an nVidia card, so I might not have had the same issue as you, but what I did was leave the overscan settings on my graphics card at their defaults and change them on my television instead. I have a Sony Bravia, and to do that you change it from 1:1 to Full Pixel, which makes it display every pixel that is sent instead of trying to correct the image. If your TV doesn’t have scanning control built in, then I hope you find some other way to resolve this issue.