NVIDIA, HDTV, and Overscanning

First off, an apology for not writing in a long time, but since I’ve switched to Arch Linux I really haven’t had any brain-racking problems to write about. But you know, that just means the next problem I run across is going to be just that much bigger, right?

I had a couple of issues with my HDTV. First, it was a pain in the ass to switch between the two inputs I might be using at any one time, because there is a plethora of connection options, no way to disable inputs, and no way to jump directly to an input: I have to cycle through TV → AV → S-Video → YPbPr1 → YPbPr2 → VGA → HDMI → DTV. Most of the time, the two inputs in question were VGA (computer) and HDMI (Xbox 360). Second, while I could set a custom resolution in both Windows and Linux to match the exact resolution of my HDTV (1360/1366/1368×768), the connection suffered the fate of most VGA connections to digital displays: the phase drifts out of sync and text starts getting fuzzy. To make matters worse, you really need text on the screen for the automatic adjustment to work its magic, or else the picture actually gets worse.

So I finally broke down and bought a DVI-to-HDMI cable and an HDMI switch box. However, this only made matters worse. The switch box worked fine, but the TV absolutely rejected the BIOS boot screen resolution (640×480), the GRUB menu resolution (720×350 text), the Windows boot screen resolution (1024×768), and the Arch Linux boot process resolution (800×600) over HDMI. Worse, on first boot, the Windows login screen was using a resolution the HDMI connection didn’t like either. My initial workaround was to bring out an LCD monitor and use two screens. My final solution was to hook the GPU’s DVI1 up to the HDTV’s VGA, hook the GPU’s DVI2 up to the HDTV’s HDMI, and set Windows to use only the second display. This way I can tell what the heck is going on during boot if need be and switch over to the digital connection when ready. Not ideal, but it’ll work.

(As a side note, I did NOT have this problem when setting up my brother’s computer. However, he has a different HDTV and a GPU with an HDMI port, so either his TV can handle this stuff or using the HDMI port automatically triggers hardware scaling until the drivers are loaded.)

Now here comes the really fun part!

Without a doubt, NVIDIA seriously needs to reconsider how it presents HDMI resolutions to the user. In fact, to save a lot of confusion, here’s a proposal I whipped up in HTML. Of course, this also means NVIDIA has to get rid of some lame restrictions on which features are enabled when, but more on that later.

While it might be a little overwhelming at first, I think it’s laid out well enough that users would be able to pick it up and really tweak their resolutions to their liking.

Instead, NVIDIA gives us this:

  • 1080i, 1920×1080
  • 1080i, 1768×992
  • 1080i, 1680×1050
  • 1080i, 1600×1024
  • 1080i, 1600×900
  • 1080i, 1366×768
  • 1080i, 1360×768
  • 1080i, 1280×1024
  • 1080i, 1280×960
  • 1080i, 1280×800
  • 1080i, 1280×768
  • 720p, 1280×720
  • 720p, 1176×664
  • 1080i, 1152×864
  • 1080i, 1024×768
  • 720p, 800×600
  • 720p, 720×576

Instead of having things broken up into categories, all possible resolutions are just lumped in with each other. If that looks dizzying and more confusing than my suggestion, you’re not alone. It’s also dishonest: 1080i always sends a 1920×1080 signal and 720p always sends a 1280×720 signal. Here’s what it should say:

  • 1080i, 1920×1080
  • 1080i, 1920×1080 use 1768×992 (over-scan crop)
  • 1080i, 1680×1050 → 1920×1080 (keep 16:10 aspect in 16:9 signal)
  • 1080i, 1600×1024 → 1920×1080 (keep 5:4 aspect in 16:9 signal)
  • 1080i, 1600×900 → 1920×1080 (native 16:9 scaling)
  • 1080i, 1366×768 → 1920×1080 (native 16:9 scaling)
  • 1080i, 1360×768 → 1920×1080 (native 16:9 scaling)
  • 1080i, 1280×1024 → 1920×1080 (keep 5:4 aspect in 16:9 signal)
  • 1080i, 1280×960 → 1920×1080 (keep 4:3 aspect in 16:9 signal)
  • 1080i, 1280×800 → 1920×1080 (keep 16:10 aspect in 16:9 signal)
  • 1080i, 1280×768 → 1920×1080 (native 16:9 scaling)
  • 720p, 1280×720
  • 720p, 1280×720 use 1176×664 (over-scan crop)
  • 1080i, 1152×864 → 1920×1080 (keep 4:3 aspect in 16:9 signal)
  • 1080i, 1024×768 → 1920×1080 (keep 4:3 aspect in 16:9 signal)
  • 720p, 800×600 → 1280×720 (keep 4:3 aspect in 16:9 signal)
  • 720p, 720×576 → 1280×720 (keep 4:3 aspect in 16:9 signal)
Over-scan crop
Uses an inner subset of the signal’s pixels to compensate for over-scan, and tells Windows only about the pixels you want used. If you use these resolutions on a display that does not over-scan, you’ll notice a black border around the screen. (The sketch after these definitions shows the arithmetic.)
Keep 16:10 aspect in 16:9 signal
A signal at the LCD panel’s native resolution yields the best picture, but that isn’t really an option for 16:10 because the official HDMI spec only uses 16:9 resolutions. The next best thing, in theory, is to render at the panel’s native resolution, up-scale to 1920×1080 for HDMI compliance, and let the panel scale it back down. A 16:9 display (and a 16:10 display that blindly stretches the image to fill the screen) will show black borders on the left and right; this is only useful if the monitor enlarges the 16:9 signal to 16:10 and then ignores the extra pixels on each side.
Keep 4:3 / 5:4 aspect in 16:9 signal
In theory this could be used for LCD displays that aren’t wide-screen, but I’ve never seen a non-wide-screen LCD display with HDMI. It’s more useful for old full-screen apps/games that handle widescreen badly or not at all.

Native 16:9 scaling
Basically the same picture-quality theory as keeping 16:10 aspect in a 16:9 signal. Not all wide-screen displays are natively 1920×1080 or 1280×720; in fact, most 16:9 panels are natively 1366×768. (Chiefly the lower-end HDTVs, but in the name of profit, leftover panels get stripped down and sold as low-end computer monitors too.)
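
Since none of this is spelled out anywhere, here’s a quick sketch of the arithmetic behind these labels. To be clear, this is my own illustration in Python, not anything NVIDIA ships: it computes the borders you get when a desktop resolution is scaled into a 16:9 signal while keeping its aspect, plus the inner resolution left after an over-scan crop.

    # Illustration only: the geometry implied by the mode labels above.
    def fit_into_signal(desk_w, desk_h, sig_w=1920, sig_h=1080):
        """Scale desk_w x desk_h into a sig_w x sig_h signal, keeping aspect.
        Returns the scaled image size and the border on each side."""
        scale = min(sig_w / desk_w, sig_h / desk_h)
        img_w, img_h = round(desk_w * scale), round(desk_h * scale)
        return img_w, img_h, (sig_w - img_w) // 2, (sig_h - img_h) // 2

    def overscan_crop(sig_w, sig_h, keep):
        """Inner resolution that survives over-scan, keeping `keep` of the signal."""
        return round(sig_w * keep), round(sig_h * keep)

    # A 16:10 desktop pillar-boxed in a 1080i signal: 96 px bars left and right.
    print(fit_into_signal(1680, 1050))      # -> (1728, 1080, 96, 0)
    # A 5:4 desktop gets much fatter bars.
    print(fit_into_signal(1280, 1024))      # -> (1350, 1080, 285, 0)
    # NVIDIA's stock crop modes look like roughly 92% of the full signal:
    print(overscan_crop(1920, 1080, 0.92))  # -> (1766, 994), near the 1768x992 above
    print(overscan_crop(1280, 720, 0.92))   # -> (1178, 662), near the 1176x664 above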

If all that wasn’t confusing enough, NVIDIA also throws in Desktop Resizing. You may have noticed that NVIDIA already includes two resolutions that do something similar. The Desktop Resizing tool gives you on-screen guidance for creating a percentage by which to either pre-scale or crop the signal. However, while the percentage part gets applied to all resolutions, the cropping part only works for the native 1080i and 720p resolutions.

Confused? I was at first, especially since the on-screen tool only gives you two slider bars to putz around with (one for height and one for width), and the only thing it tells you is how big the resulting box is in pixels. It doesn’t tell you that it’s going to calculate a percentage to pre-scale all the previous resolutions by, nor that it will create two new cropping resolutions and throw them in with all the resolutions you saw before.
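
Jumping ahead a bit to the numbers it generated for me (they show up in the list below), my best guess (and it is only a guess; NVIDIA documents none of this) is that the tool converts your box into a keep-this-fraction percentage of the full signal and derives the new modes from that:

    # Inferred behavior of Desktop Resizing, based purely on the numbers
    # it produced for me; NVIDIA does not document any of this.
    SIGNALS = [(1920, 1080), (1280, 720)]  # 1080i and 720p

    def crop_modes(keep_w, keep_h):
        """One new cropped mode per signal, given the kept fractions."""
        return [(round(w * keep_w), round(h * keep_h)) for w, h in SIGNALS]

    # The 1824x1026 box I dragged out is exactly 95% of 1920x1080 both ways:
    print(1824 / 1920, 1026 / 1080)  # -> 0.95 0.95
    print(crop_modes(0.95, 0.95))    # -> [(1824, 1026), (1216, 684)]
    # The driver actually generated 1218x684 for 720p, so it evidently does
    # some extra rounding of its own on the width.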

After I used the tool, this is what my resolution list looked like. Note that I’ve marked the two new resolutions so we can get on with the rest of our lives; the NVIDIA Control Panel itself doesn’t distinguish them at all.

  • 1080i, 1920×1080
  • 1080i, 1824×1026 (new)
  • 1080i, 1768×992
  • 1080i, 1680×1050
  • 1080i, 1600×1024
  • 1080i, 1600×900
  • 1080i, 1366×768
  • 1080i, 1360×768
  • 1080i, 1280×1024
  • 1080i, 1280×960
  • 1080i, 1280×800
  • 1080i, 1280×768
  • 720p, 1280×720
  • 720p, 1218×684 (new)
  • 720p, 1176×664
  • 1080i, 1152×864
  • 1080i, 1024×768
  • 720p, 800×600
  • 720p, 720×576

Here’s what’s really confusing: all these modes have an asterisk beside them, but that asterisk leads to a footnote underneath the selection list that only reads (and I kid you not) “Modes with a resized desktop.” Really. Freaking. Useful.

All this time I’ve been talking about official HDMI resolutions. As we all know, just because the standard doesn’t specify something doesn’t mean it can’t be used another way. Sure enough, you can send standard PC resolutions over HDMI.

So why didn’t I just do that? I did! But I ran into one HUGE stumbling block: my HDTV ALWAYS over-scans HDMI signals, whether they’re standard high-definition signals or PC-specific ones. There’s an option to enable or disable audio over HDMI, since some displays toggle over-scan based on that (assuming an audio-less HDMI signal comes from a PC and one with audio comes from a dedicated video-playing device). The real kicker is that Desktop Resizing is disabled when using PC resolutions!

In the end, I was hosed out of using the native resolution I’d been using for years, but I’ve actually come to like 1080i cropped to 1824×1026 @ 192 (200%) DPI more than native 1366×768 @ 144 (150%) DPI. The text only shrinks by about 0.2%, but the desktop opens up a lot. The picture looks like a picture now, even up close, instead of over-blown pixels best viewed from a distance.
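
For the curious, here’s the back-of-the-envelope arithmetic behind that figure, assuming text size on the same physical panel scales with the DPI setting divided by the vertical resolution it’s rendered at:

    # Sanity check on the text-size claim above.
    old = 144 / 768    # 1366x768 desktop at 144 DPI (150%)
    new = 192 / 1026   # 1824x1026 desktop at 192 DPI (200%)
    print(f"text shrinks by {(1 - new / old) * 100:.2f}%")  # -> about 0.19%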
