Misconceptions about interlaced video footage.
  • This is a discussion about interlaced vs progressive video footage.
  • @Ian_T
    How does having half the spatial resolution result in only a 20% lower resolution?
  • Interlaced = bad. Progressive = good. :)

    1080i/60 is actually 60 fields per second of 1920x540. The deinterlacer must interpolate the missing 540 lines to reconstruct 1080p @ 60 FPS from it, so your results are completely dependent on how well your deinterlacer does that interpolation. In practice the results will be very similar to upscaling 720p to 1080p, because the GH2 only resolves about 900 lines of horizontal resolution and 850 lines of vertical resolution in 1080p mode. The 1920-pixel horizontal spec means little because the camera doesn't even achieve half of that in resolvable resolution.

    With the GH2, maximizing vertical resolution is critical because vertical resolution is the limiting factor. If you decrease the vertical resolution of the frame too much, the sensor and glass can out-resolve the video container's resolution. You will never out-resolve the horizontal resolution of the container, even at 720p.
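    The interpolation step described above can be sketched as a simple "bob" deinterlacer. This is an illustrative toy only (real deinterlacers use motion-adaptive or motion-compensated logic); the function name and the plain line-averaging strategy are my own assumptions, not anything from the thread:

    ```python
    import numpy as np

    def bob_deinterlace(field, parity):
        """Linearly interpolate a 540-line field up to a 1080-line frame.

        field:  (540, W) array holding one interlaced field.
        parity: 0 if the field carries the even (top) lines, 1 for odd.
        The other 540 lines are guessed by averaging their neighbours,
        which is why a deinterlaced frame resolves less vertical detail
        than a true progressive frame.
        """
        h, w = field.shape
        frame = np.zeros((h * 2, w), dtype=np.float64)
        frame[parity::2] = field                  # copy the real lines
        for y in range(1 - parity, h * 2, 2):     # fill the missing lines
            above = frame[y - 1] if y > 0 else frame[y + 1]
            below = frame[y + 1] if y + 1 < h * 2 else frame[y - 1]
            frame[y] = (above + below) / 2.0      # plain neighbour average
        return frame
    ```

    Each output frame is half measured data and half guesswork, which matches the point above: the final resolvable detail depends entirely on how clever that guesswork is.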
  • ...what he said..^^^^ :)
  • Yeah, what he said is that it is a more than 50% reduction in resolution, not 20%.
    My question was not a real question; it was meant to prompt you to explain your statement.
  • Actually, the vertical resolution of progressive video is 10%-25% lower than that of interlaced video, but once that interlaced image is deinterlaced you end up around 20% lower than its progressive equivalent (as I mentioned in the other thread). So if you do the math, basically what mpgxsvcd said above is very true. Now, that all depends on how good or bad a deinterlacer you use, so it can vary. But in the end you'll always lose resolution by deinterlacing (compared to a true progressive image).

    Hope that helps.
  • 1080p60 = 60 frames of 1920x1080.
    1080i60 = 60 fields of 1920x540.
    1080p60 / 2 = 1080i60.
    No?
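    The 50% figure implied by that arithmetic can be checked directly. This is just raw pixel throughput per second and says nothing about what a deinterlacer recovers or what a sensor actually resolves:

    ```python
    # Pixels captured per second, ignoring resolvable-detail losses:
    p60 = 1920 * 1080 * 60   # 1080p60: 60 full-height frames
    i60 = 1920 * 540 * 60    # 1080i60: 60 half-height fields

    # The interlaced stream carries exactly half the pixels of the
    # progressive one, which is the "1080p60 / 2 = 1080i60" claim.
    print(p60, i60, i60 / p60)
    ```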
  • I'm referring to actual resolvable detail. Most cameras do not resolve that kind of detail. But if you really want to break it down, 1080/60i is really only 30 full frames per second.

    Even if cameras could resolve that kind of detail, if you use mpgxsvcd's example above where he says "Your results will be very similar to upscaling 720p to 1080p" (this is in regards to deinterlacing), then according to the math you will see that 720p is only 33% smaller than 1080p in line count. But because cameras really do not resolve that kind of detail (1920x1080), AND if you employ a decent deinterlacer, you will more than likely be closer to a 20% reduction instead of the 50% that you are suggesting.

    At least this is the way I've always understood it.

    Note: I would think resolving 60 fields of 1920x540 is easy for most cameras. You are basically resolving two "fields" (or what I call "half frames") at different times. Or maybe I'm looking at this all wrong.
  • @Ian_T

    Actually, 1080i is more about 60 FPS than it is "Full HD". It does contain 1920 distinct horizontal pixels for every 1/60th of a second. However, it only contains 540 lines of vertical resolution for every 1/60th of a second. That is the issue. You can easily extract 60 FPS material out of 1080i. You can't extract true 1920x1080 out of 1080i unless the subject, camera, and scene are absolutely stationary, which never happens in reality.