Less Is More: Why 24fps is Important in Home Theater PCs

Posted by Shashank Krishna Tuesday, July 29, 2008



When I play games, I want my PC to generate as many frames per second as possible; frame rate is the measure of CPU and GPU strength that gaming benchmarks use to show which processor and/or graphics card is better. My PC can beat up your PC if I can generate eye candy at a whopping 130fps and you can only manage a mere 100.

Why then are home theater PC folks so obsessed with playing back at a paltry 24 frames per second when graphics solutions (including integrated graphics) are easily able to send out 60?
Judder-bug

Okay, so it's not an actual bug, it's just a bad pun. But the key is the word "judder". To understand why it matters, we must understand how films are made.

99% of Hollywood motion pictures are recorded at 24 frames per second, and each frame is strobed twice before the projector indexes to the next one. In effect, film in theaters is displayed at 48Hz, though the actual content is 24Hz.

Historically, TVs run at 60Hz (to match the AC frequency of the wall socket; in countries with a different mains frequency, the TV rate is different too). I won't get into the specifics or the full history here (plenty of sites cover why this was so and how it was compensated for), but the key point to carry forward is this: to fit 24 frames of content into something that refreshes 60 times per second, you have to repeat frames in an irregular way.

In short, the first film frame is shown 3 times, the second 2 times, the third 3 times, the fourth 2 times, and so on. This process is known as "3:2 pulldown", and the math works out: 12 frames shown 3 times each plus 12 frames shown 2 times each gives 36 + 24 = 60.
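To make the cadence concrete, here's a minimal sketch in Python (purely illustrative, not anything a real player or driver actually runs) of one second of 24fps film being expanded into 60 frames by alternating 3 and 2 repeats:

```python
# Minimal sketch of 3:2 pulldown: expand one second of 24fps film
# (frames 0..23) into the 60 frames a 60Hz progressive display shows.
def pulldown_3_2(film_frames):
    output = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2   # 3, 2, 3, 2, ...
        output.extend([frame] * repeats)
    return output

one_second = list(range(24))   # 24 film frames
tv_frames = pulldown_3_2(one_second)
print(len(tv_frames))          # 60 = 12*3 + 12*2
```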

So all of the picture content makes it onto the television screen and everything's great, right? Alas, the human eye is only partially fooled: in scenes with a lot of motion, or where the camera pans across a landscape, the phenomenon known as "judder" appears. Motion that should be smooth looks slightly jerky, because alternating film frames stay on screen for different lengths of time (three sixtieths of a second versus two).
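Using the same 3:2 cadence as the sketch above, here's a rough illustration of that uneven timing: alternate film frames sit on a 60Hz screen for about 50ms and 33ms instead of a constant ~41.7ms.

```python
# How long each film frame actually sits on a 60Hz screen under 3:2
# pulldown, versus the constant 1/24s it "should" get.
ideal_ms = 1000 / 24                        # ~41.7 ms per film frame
for i in range(4):                          # first few frames of the cadence
    repeats = 3 if i % 2 == 0 else 2
    actual_ms = repeats * 1000 / 60         # 50.0 ms or ~33.3 ms
    print(f"frame {i}: {actual_ms:.1f} ms on screen (ideal {ideal_ms:.1f} ms)")
```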

This isn't a new problem: film has been 24fps for a long time, and TV has been 60Hz for almost as long. When TVs were interlaced the issue was masked somewhat, because at the transition between a 3-repeat frame and a 2-repeat frame the middle image was essentially a blend of the two (again, I'm not going into the specifics of interlacing history). Nowadays TVs are built with non-interlaced (aka "progressive") components, and the judder artifacts are easier to see.

There are ways around this; people with computer monitors have a bit more flexibility in their refresh rates than TV users traditionally do. Setting the graphics to output 72Hz produces a nice image (since 72 is an exact multiple of 24), but the vast majority of television sets won't accept a 72Hz signal. And while there are some computer monitors you could reasonably call "home theater worthy", most are a bit small and limited.
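As a quick back-of-the-envelope check (again just a sketch), here's which common refresh rates divide evenly by 24; an exact multiple means every film frame can be shown the same number of times, with no irregular cadence left to judder:

```python
# Which common refresh rates are exact multiples of 24fps?  An exact
# multiple means a perfectly even cadence; anything else needs an
# irregular pulldown and can judder.
for hz in (24, 48, 60, 72, 120):
    even = hz % 24 == 0
    print(f"{hz}Hz: {'even cadence' if even else 'needs uneven pulldown'}")
```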

As I mentioned, this is not a new issue, but until now most sources were interlaced (even DVDs are generally interlaced, though they carry flags so progressive player hardware can reconstruct the original frames), so the issue kept a low profile. With new technology like Blu-ray, the video is stored as 24 actual pictures per second, and the only limiter left is the display.

Of course, when you're a consumer electronics manufacturer, any nifty gadget you can add to your TV to distinguish yourself from the competition is a good one, and someone got the bright idea to include inputs that would accept and display 24fps. Everyone else soon followed, and now a lot of mid- to high-end equipment can take in images at 24fps and display them at some multiple of that (the technology inside a generic TV set has changed enough to decouple it from the original 60Hz number).

Early drivers for the 945G, G965, G33, and G35 were all capable of 24Hz output. You sometimes had to play tricks on them to convince them they could do a resolution/refresh they didn't know about by default, but once the drivers accepted it, the chipset graphics would deliver 24fps goodness. After all, in terms of sheer processing power it should theoretically be easier than the normal 60fps; any integrated graphics chip should be able to do it.
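For reference, the mode you're coaxing out of the driver is just the industry-standard 1080p/24 timing. The arithmetic below uses the standard CEA-861 numbers (74.25 MHz pixel clock over 2750 x 1125 total pixels), nothing Intel-specific, to show where the 24Hz comes from:

```python
# The standard CEA-861 1080p/24 timing: a 74.25 MHz pixel clock over
# 2750 total pixels per line and 1125 total lines (1920 and 1080
# active plus blanking) works out to exactly 24Hz.
pixel_clock = 74.25e6      # Hz
h_total = 2750             # 1920 active + horizontal blanking
v_total = 1125             # 1080 active + vertical blanking
print(f"{pixel_clock / (h_total * v_total):.3f} Hz")   # 24.000
```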

In reality there were some difficulties, and they remain today. Desktop work at 24fps was fine, but watching media at 24fps exhibited very jerky behavior every several seconds or so... then returned to silky smooth motion once more. This happened across a variety of software players, both free and commercial, and across multiple motherboards and chipsets. It didn't manifest on either ATI or Nvidia graphics... suggesting, once more, that this was a problem with the drivers. (It's vaguely possible it's a hardware issue with the chipsets, whose design focus would logically have been "ensure high refresh rates".) There were signs it might be modulated by the renderer used (EVR vs. VMR) as well as by the presence of extra frames in the source files, but as of the 15.7 drivers a new problem appeared: all support for 24fps is gone.

Whether or not your EDID advertises the 24fps capability, and even if you use the trick I mentioned above, the resolution won't show up as selectable in the Intel drivers. 48Hz still seems to work fine... but since only a handful of TVs accept 48Hz signalling, that's small consolation.

It could be that someone on the driver team recognized that 24fps was broken and removed it, or there could be something even more mysterious going on. It could be entirely an oversight. But the fact is, even if you want jerky 24fps on Intel graphics, you now have to use older drivers, keeping yourself in the past and giving up fixes you might want for other bugs.

I haven't gotten my hands on a G45 system yet, so I don't know whether this issue is fixed on the newer graphics chipsets, but I sure hope so.

If you ask around in the home theater PC community, there are three issues which prevent Intel graphics from being an all-around winning solution for HTPCs:

1) The HDCP repeater bug

2) The 24fps problems

3) The Denon HDMI EDID issue

You'll note that these three issues just happen to be the ones I've covered in my blog. This is not coincidence.

I've seen internal presentations, and if our drivers live up to the hardware, the G45 is going to be a great chipset from a home theater standpoint. But it's important that we address the enthusiast in this space (because the enthusiast is going to tell his friends what to buy), and the enthusiast sees all three of those issues as important to solve in a quality system.

So far I haven't noticed a driver release that fixes any one of them. I don't know whether any of our Graphics (or Audio) folks read this blog, but I sure hope so.

P.S. I won't even discuss the fact that it should really be 23.976fps instead of exactly 24fps. You'll have to figure that one out on your own. :)
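(If you do want to work it out, here's a hint at the scale involved, assuming the usual NTSC-derived rate of 24000/1001 fps: on a display clocked at a true 24.000Hz, the mismatch forces a repeated or dropped frame roughly every 42 seconds.)

```python
# Film on video usually runs at 24000/1001 fps (~23.976), not 24.000.
# On a display clocked at a true 24.000Hz, the mismatch forces a frame
# to be repeated or dropped roughly every 42 seconds.
content_fps = 24000 / 1001             # ~23.976 fps
display_hz = 24.0
drift = display_hz - content_fps       # frames of drift per second
print(f"one hiccup every {1 / drift:.1f} s")   # ~41.7 s
```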
