interlacing basics 

Newsgroups:  gmane.comp.video.transcode.user
Date:        Thu, 05 Jan 2006 12:43:59 +0100
> I would like to get a basic understanding about interlacing. Especially
> I have the following questions.
>
> 1.) Is it correct that the comb effect in fast motion is only visible on
> a PC screen and not on TV? Why?

To fully understand this (and why interlacing should be abandoned asap) you need a little insight into the technology behind it, which I will have a go at explaining.

> 2.) When I create a DVD that I only intend to use with TV I would not
> need to de-interlace with transcode?

No, you would not need to; never de-interlace in that case (as the others already mentioned).

> 3.) Does it hurt to de-interlace for videos intended to be played on TV?

See #2 ;-)

> 4.) When I watch a normal movie DVD on PC I don't notice the comb effect
> (or at least not very much) but after ripping and transcoding the VOB
> into something else like AVI I see it very strongly, why?

Besides the other explanations, there is also the possibility that one field (half-frame) was lost, so the pairing of the fields into frames is no longer valid: frame #1 ends up with only field #1a, frame #2 gets field #1b and field #2a, frame #3 gets field #2b and field #3a, and so on, so each displayed frame weaves together fields from two different source frames.
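
To make the shift visible, here is a little illustrative sketch (Python; the field labels and the toy "weave" function are mine, not anything from transcode itself):

    # Illustrative only: how losing a single field shifts every later pairing.
    fields = ["1a", "1b", "2a", "2b", "3a", "3b", "4a", "4b"]

    def weave(fields):
        """Pair consecutive fields into frames (top field + bottom field)."""
        return [(fields[i], fields[i + 1]) for i in range(0, len(fields) - 1, 2)]

    print(weave(fields))    # correct pairing: [('1a', '1b'), ('2a', '2b'), ...]

    broken = list(fields)
    broken.remove("1b")     # suppose field #1b gets lost somewhere in the chain
    print(weave(broken))    # every frame now mixes two source frames:
                            # [('1a', '2a'), ('2b', '3a'), ('3b', '4a')]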

Rationale about interlacing: back in the ol' days (~1940-1950), when the first television sets were produced, it was decided that a refresh rate of 50 Hz / 60 Hz (depending on country) was necessary to get fluent motion. That would mean the resulting video signal would be > 15 MHz, which apparently was not believed to be acceptable. It also apparently was too expensive to make television sets that could handle this. This is where the concept of interlacing was introduced: the even and odd numbered lines were to be broadcast separately, sequentially. The result is the required 50/60 Hz refresh rate at the full resolution, BUT only when there is no motion; when there is motion, the actual resolution is only half of that. This was easy (= cheap!) to implement by choosing tube phosphors that keep emitting light for roughly the period during which the complete field is to remain visible on the tube, after being "hit" by the electron beam.
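
A rough back-of-envelope calculation (my own illustrative numbers for a 625-line / 50 Hz system, not figures from the post above) shows why interlacing roughly halves the required bandwidth compared to sending full frames at the same refresh rate:

    # Illustrative line-rate arithmetic for a 625-line / 50 Hz system.
    lines_per_frame = 625
    interlaced_line_rate = lines_per_frame * 25    # 25 full frames/s = 15625 lines/s
    progressive_line_rate = lines_per_frame * 50   # 50 full frames/s = 31250 lines/s
    print(progressive_line_rate / interlaced_line_rate)   # 2.0: twice the line rate,
                                                          # so roughly twice the bandwidth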

So, this means you actually can only view interlaced material on a system that has exactly these characteristics. Computer monitors and TFT/plasma screens do not have these characteristics: all their pixels are visible all of the time, they do not fade. Also, their refresh rate is typically 60 Hz (exactly, not the 60000/1001 Hz used in ITU-R System M TV broadcasting). In the case of a computer monitor, there is also a VGA card between the computer and the screen, which may use its own, different refresh rate (although this is not recommended).

This means that the two fields (= odd lines / even lines) are always sent as a complete frame (= all lines) to the VGA card (in a computer) and to the screen (computer monitor/TFT/plasma). There is no sense in separating them, because a digital screen won't be able to show them separately (as a tube does) anyway: all pixels are always visible AND the internal pixel update frequency is never coupled to the input signal.

The result is that on a digital screen (computer monitor/TFT/plasma) you always see both fields at the same time, which really is not the intention, because the fields are recorded half a frame time apart. This is where the comb effects come in: what you actually see is the same object at two different points in time, half of it from time x, the other half from time x+1/2.
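
A minimal sketch of the effect (Python, with made-up "image" data): weave two fields of a moving vertical bar into one frame and the bar splits into two offset combs.

    # Illustrative only: weaving two fields captured at different times.
    # A vertical bar has moved from column 3 to column 7 between the two fields.
    width = 12

    def field_line(column):
        """One line of a field: a bar ('#') at the given column."""
        return "".join("#" if x == column else "." for x in range(width))

    top = field_line(3)      # odd lines, captured at time x
    bottom = field_line(7)   # even lines, captured at time x + 1/2

    for y in range(6):       # weave: alternate lines from the two fields
        print(top if y % 2 == 0 else bottom)
    # The printout shows the single bar torn into two interleaved "combs".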

You can never get this completely right. There are several algorithms that try to minimise the loss of resolution and motion data, but the result is always more or less disappointing: the more resolution you try to preserve, the more motion data is lost, and vice versa.
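
Two of the simplest strategies already show the trade-off. A sketch (Python, toy greyscale data; real deinterlacing filters, such as the ones shipped with transcode, are considerably smarter than this):

    # Illustrative only: two naive deinterlacing strategies.

    def bob(field_lines):
        """Line-double one field: no combing, but only half the vertical detail."""
        out = []
        for line in field_lines:
            out += [line, line]          # duplicate each line to fill the frame
        return out

    def blend(top_lines, bottom_lines):
        """Average the two fields: keeps more detail, but motion gets ghosted."""
        return [[(a + b) // 2 for a, b in zip(t, btm)]
                for t, btm in zip(top_lines, bottom_lines)]

    # toy 2-line fields of 4-pixel greyscale values
    top = [[10, 10, 10, 10], [10, 10, 10, 10]]
    bottom = [[90, 90, 90, 90], [90, 90, 90, 90]]

    print(bob(top))            # full motion fidelity, but halved vertical resolution
    print(blend(top, bottom))  # [[50, 50, 50, 50], ...]: both moments smeared together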

So how come interlacing is still used? I honestly don't know. Especially with digital TV there is no reason at all to use interlacing, if only because digital screens cannot even display it properly! When I see that interlacing is supported for mpeg4 and even mpeg4-avc (aka mpeg4-part10 aka h264, imho the future of digital TV), I get very, very sad.

I consider myself lucky in that I live in "25 Hz land", so my DVDs are, although interlaced, in fact actually progressive (because both fields of every frame come from the same point in time).

Erik Slagter

interlacing basics 

On Thu, 2006-01-05 at 09:30 -0600, Kenny Gow wrote:

> > I would like to get a basic understanding about interlacing. Especially
> > I have the following questions.
>
> I found a nice website with many examples about interlacing here:
>
> http://www.100fps.com/

Nice, informational pictures, but I do miss some basic, real technical background to explain the whole picture. The story about the electron beam in a legacy tube is not mentioned, while it is fundamental to the interlacing concept.

Erik Slagter

interlacing basics 

>I would like to get a basic understanding about interlacing. Especially
>I have the following questions.

There's a fairly good description (written by yours truly, so IMHO ;) ) at http://www.transcoding.org/cgi-bin/transcode?Interlacing that covers most of these points; see it for more details.

>1.) Is it correct that the comb effect in fast motion is only visible on
>PC screen and not on TV ? Why?

Yes, that's correct; CRT-style TVs are interlaced, and newer LCD or plasma TVs have hardware that filters out the "comb" effect (essentially deinterlacing the video on the fly).

>2.) When I create a DVD that I only intend to use with TV I would not
>need to de-interlace with transcode?

That's correct. Use the --encode_fields option to transcode to indicate that the source video is interlaced, and the output video will be interlaced as well.

>3.) Does it hurt to de-interlace for videos intended to be played on TV?

Absolutely yes. By deinterlacing, you lose half of the original video data, which can make movement look visibly jerkier and still images look blurrier.

>4.) When I watch a normal movie DVD on PC I don't notice the comb effect
>(or at least not very much) but after ripping and transcoding the VOB
>into something else like AVI I see it very strongly, why?

Assuming you're talking about commercial movies, such movies are originally recorded at 24 fps (frames per second) in progressive (non-interlaced) mode. In PAL countries, the video and audio are sped up slightly to match the 25-fps TV standard; in this case, there's no need to modify the video data, and it shows up on both TV and PC in its original progressive form. In NTSC countries, a process called "telecining" is used to convert the movie to the 29.97fps TV standard, which results in some frames being interlaced, but PC players are generally smart enough to reverse this. However, when you re-encode the video to another format, the video data is modified and the telecining process can no longer be detected, so you occasionally see interlaced frames show up on your PC.
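
For the curious, the classic 2:3 pulldown pattern behind telecining can be sketched in a few lines (Python; the frame/field labels are illustrative only, this is not transcode code):

    # Illustrative only: 2:3 pulldown ("telecining"): 4 film frames -> 5 video frames,
    # i.e. 24 fps -> 30 fps (in practice the film is slowed by 1000/1001, giving 29.97 fps).
    film_frames = ["A", "B", "C", "D"]
    field_counts = [2, 3, 2, 3]        # the "2:3" pattern: A -> 2 fields, B -> 3, ...

    fields = []
    for frame, count in zip(film_frames, field_counts):
        for _ in range(count):
            parity = "t" if len(fields) % 2 == 0 else "b"   # alternate top/bottom fields
            fields.append(frame + parity)

    video_frames = [(fields[i], fields[i + 1]) for i in range(0, len(fields), 2)]
    print(video_frames)
    # [('At', 'Ab'), ('Bt', 'Bb'), ('Bt', 'Cb'), ('Ct', 'Db'), ('Dt', 'Db')]
    # Video frames 3 and 4 mix fields from two different film frames: those are the
    # "combed" ones that an inverse-telecine pass (or a smart player) has to detect and undo.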

Andrew Church

interlacing basics 

>Why does interlacing not result in half the brightness?

That's sort of like asking "why does only half a liter of water fit in this half-liter bottle?" ;) Interlaced TV sets are designed to produce the "proper" brightness given a certain input video.

Andrew Church

interlacing basics 

On Fri, Jan 06, 2006 at 12:13:41PM +0100, Erik Slagter wrote:

> > I have a question, is there any tool that can tell whether the source is
> > interlaced or not?
>
> > all my readings say that I need to look...
>
> Someone did a nice job on a transcode filter. I can't tell you more
> because I never used it. The author may very well be on the list.

It's called fieldanalysis.

http://www.transcoding.org/cgi-bin/transcode?Filter_Plugins/Filter_Fieldanalysis

There has been discussion of its usage on the list before … maybe you can find it in the archives.

Jacob Meuser