CODECS: Bit rates comparing MPEG2 vs H.264
  • I'm curious to validate some information I found about the effective multiplier factor for comparing the "broadcast" bit rates of various codecs. From what I understand, most TV broadcast bit rates are based on MPEG2 encoding -- which is much less efficient than the H.264 used within AVCHD. According to this article, the author uses a ratio of 2.3 to compare the rates of the two codecs. Is this a reasonable number, and do my fellow PV'ers agree with his analysis?

    As an example, he says that the Canon 5D Mk II, which has a bit rate of 48 Mbps, actually has a broadcast-effective bit rate of 110 Mbps when compared to similar MPEG2 material (the arithmetic is sketched below). What do you think?

    http://www.rgbfilter.com/?p=10018

    Edit: I looked in the FAQ but couldn't find anything directly on this topic.
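
    For concreteness, here is the multiplier arithmetic as a minimal Python sketch. The 2.3 factor is the article's estimate rather than a measured constant, and the function name is only illustrative:

        # Rough "MPEG2-equivalent" bit rate under a fixed efficiency
        # multiplier (2.3 is the article's estimate, not a constant).
        def mpeg2_equivalent_mbps(h264_mbps, multiplier=2.3):
            return h264_mbps * multiplier

        print(mpeg2_equivalent_mbps(48))  # 48 Mbps H.264 -> 110.4 "MPEG2 Mbps"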

  • 2:1 is a rough approximation of the efficiency of H.264 compared to MPEG-2, but it depends on a lot of factors, including the efficiency of each encoder and the content. The greater the noise or motion, the smaller the difference between H.264 and MPEG-2. The higher the bit rates, the smaller the difference.

    Differences between individual encoders can be very large, so it's not all that useful to generalize -- it's easy enough to test with your own footage, as sketched below.
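
    One way to check this yourself is to encode the same clip at the same bit rate with both codecs and compare the results. A minimal sketch, assuming ffmpeg is on the PATH with libx264 and mpeg2video support, and a hypothetical input file name:

        import subprocess

        SOURCE = "clip.mov"  # hypothetical input file

        # Encode the same source at the same bit rate with both codecs,
        # then compare by eye (or with an objective metric).
        for codec, out in [("libx264", "test_h264.mp4"),
                           ("mpeg2video", "test_mpeg2.mpg")]:
            subprocess.run(["ffmpeg", "-y", "-i", SOURCE,
                            "-c:v", codec, "-b:v", "1M", out],
                           check=True)

    Running it once at 1 Mbit and again at a much higher rate makes the rate-dependence described above visible.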

  • I'm not qualified to give an answer about the efficiency of different compression schemes, but I can attest that the effective bit rate value doesn't reveal the real video quality.

    Last year I did the whole audio production for a release in the fairly new CD+ format, which also included a short video trailer. As an example, the company sent me, just for demonstration, a similar product whose trailer had superb-looking video quality at a width of only 800 pixels and a bit rate never higher than 600 kbyte/sec, coded as MPEG4 / AVC / Main@L2.1 / CABAC: no.

    I was happy to be able to shoot the video trailer for this release myself. My edited trailer, made from simply superb-looking GH1 files in AVCHD 1920x1080, looked so good. But after converting it into the format listed above, it looked like poor shit. Conversion in different software packages brought very, very different results.

    In the end, it is not so important how that problem got solved -- the point is that that one video, with a total size of 38MB for nearly 4 minutes, looked so darn good. Can somebody who understands more about codecs and compression explain that to me?
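
    For reference, the raw numbers in that post work out as follows. A minimal Python sketch; only the 38MB size, ~4 minute duration and 800 px width come from the post, while the 450 px height and 25 fps are assumptions:

        size_bits = 38 * 8 * 10**6   # 38 MB, decimal megabytes -> bits
        duration_s = 4 * 60          # ~4 minutes

        avg_bps = size_bits / duration_s
        print(f"average bit rate: {avg_bps / 1e6:.2f} Mbit/s")  # ~1.27 Mbit/s

        pixels_per_s = 800 * 450 * 25  # assumed 800x450 @ 25 fps
        print(f"bits per pixel: {avg_bps / pixels_per_s:.3f}")  # ~0.14

    So the encoder has roughly 1.3 Mbit/s to spend on a fairly small frame, which a good AVC encoder can handle gracefully -- consistent with the observation above that the choice of conversion software mattered far more than the raw bit rate.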

  • I figure we also need to factor in the complete production workflow, from acquisition codec through transcoding, NLE output, grading and final rendering. Ideally, each stage should maintain the maximum available bit rate, or even exceed that of the stage before, to ensure no information is lost.

    One thing I certainly agree with -- a direct comparison of bit rates (e.g., 19 Mbps versus 80 Mbps) is really not helpful without also comparing the codec, the use of I-frames or B-frames, quantization (8-bit being typical for AVCHD) and even interlace, PsF and progressive scan details. It all starts to get very technical, and possibly less than helpful when both sides in a discussion aren't at the same level of understanding.

    I'm no broadcast engineer, but it would be great if we had an objective way of describing overall image quality that covered bit rate, quantization and even dynamic range (one partial answer is sketched below). I guess it's more of an art than a science, but with plenty of science too.
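
    Objective metrics do exist for parts of this -- PSNR is the classic one, with SSIM and VMAF tracking perception better. A minimal PSNR sketch in Python with NumPy, assuming two same-sized 8-bit frames as arrays:

        import numpy as np

        def psnr(reference, encoded, max_val=255.0):
            """Peak signal-to-noise ratio between two 8-bit frames, in dB."""
            mse = np.mean((reference.astype(np.float64) -
                           encoded.astype(np.float64)) ** 2)
            if mse == 0:
                return float("inf")  # identical frames
            return 10.0 * np.log10(max_val ** 2 / mse)

    None of these metrics fully capture dynamic range or temporal artifacts, which is part of why it stays "more of an art than a science".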

  • The reason why MPEG2 is still used by the majority: broadcast companies and the industry are very slow to move and to change hardware, including encoders.

  • The other issue is that while MPEG-4 is amazingly better than MPEG-2 at low bit rates, the difference is really much smaller at very high bit rates. Sensibly encoded 1 Mbit MPEG-4 vs 1 Mbit MPEG-2 is no contest. It requires much more than 2x the bit rate to get similar quality with MPEG-2. OTOH, 50 Mbit MPEG-2 really doesn't look that bad. MPEG-4 may be measurably better, but not hugely noticeably so. As an intermediate codec, I would care, but as a broadcast codec, it just needs to look "Good Enough."

  • I can confirm @driftwood's words where audio is concerned, and I assume it is similar on the TV side. Big radio corporations usually skip a few steps of technological improvement, partly because of the very high investments involved. They also buy equipment of a different class, out of reach of many private low-budget producers.

    For instance, the audio mixing consoles still used today in many radio corporations around where I live date from the years when 48kHz was the commonly used audio sample rate. Although the other hardware and software they use today can record PCM at up to 192kHz/24bit (32bit internally in software), the mixing consoles just cannot handle more than 48kHz, so that's the upper limit.

    It is easy to understand that such a thing, which cost over 100k, should be used for some years before it gets thrown away (with a construction crane, though, given the size)...

  • It all depends on the country, etc.

    Most new cable channels use only DVB-C and H.264.

  • @subco Yes, I was employed there for a number of years, and it was not a small one.