GH2 Cake v2.3: reliability and spanning in 720p, HBR, 24p, and VMM at 2-2.5x stock bit rates
  • Sorry if this is a dumb question, but how are you guys shooting HBR 24p? HBR only gives me 30p and I can't figure out a way to change it... unless 24H is supposed to be the optimal setting here?

  • @balazer Great stuff, will try out Cake 2.3 tomorrow morning, thanks a lot balazer. I agree, time to stop testing new settings/patches and get out shooting. Your settings mean this to me: stable, spanning settings for all shooting modes at middle-to-high bit rates - that's it. Cheers

  • @albertdros Yeah, that's what puzzled me. When I look at the original media (H.264), it shows 44 Mbps. When transcoded to high quality (ProRes 422), it shows 146 Mbps. When I export as ProRes 422 (HQ), it shows 209 Mbps.

    I'm guessing some of this must be interpolation, since the original files are only 44 Mbps.
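
    If it helps, one way to check a clip's real average bit rate (rather than trusting what the editing app reports) is ffprobe from the ffmpeg suite. A minimal sketch in Python; it reads the container-level figure, so it includes audio and muxing overhead, and 'clip.MTS' is just a placeholder name:

        import json
        import subprocess

        # Query the container's overall bit rate with ffprobe (ffmpeg suite).
        def container_bitrate_mbps(path):
            out = subprocess.run(
                ["ffprobe", "-v", "error", "-show_entries", "format=bit_rate",
                 "-of", "json", path],
                capture_output=True, text=True, check=True).stdout
            return int(json.loads(out)["format"]["bit_rate"]) / 1e6

        print(f"{container_bitrate_mbps('clip.MTS'):.0f} Mbps")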

  • Cake 2.3 beta 1:

    • Improved rate control in 24H

    Cake 2.3 timelapser 1:

    • HBR 30p (NTSC) mode optimized for low bit rates with slow shutter speeds

    • 1/2.5 s shutter recommended

    • Approximately 6 Mbps

    • 30i is unusable; other modes are the same as Cake 2.3b1

    In Cake 2.3 beta 1 I've tried to keep the 24H average bit rate as close as I could to how it was set before, while keeping the max below my safe level and correcting the problem of the encoder going into fallback mode on some high detail, low motion scenes. Please test. If there are no problems, this version will become 2.3 final. With any luck, this will be my last update to Cake. We should all spend more time shooting and less time testing.

    I'm sure there are scenes in nature that can still cause the encoder with these settings to go into fallback mode, but they won't be very common. I could create settings that never go into fallback mode, but that would force me to be more conservative about how I set the bit rate. My goal was always to get the bit rate as high as I could while still having near 100% reliability and spanning. If a rare detailed scene raises the bit rate enough to force the encoder into fallback mode, that's a good thing: it means you won't get a recording error or a spanning failure.

    Cake 2.3 timelapser 1 is a tiny experiment for people who want super long recording times. The average user will be better off using regular Cake in VMM 300% mode, which is stable and gives higher quality. Timelapser 1 uses one half to one third the bit rate of VMM 300% 24H. I did not test spanning in timelapser 1. At this bit rate, you'll get about 90 minutes of recording before you reach 4 GB. If Duartix or anyone else wants to take over this line of development, be my guest.
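
    For reference, the ~90-minute figure is simple arithmetic. A rough check in Python, ignoring audio and container overhead:

        # Recording time until the 4 GB file limit at ~6 Mbps.
        bitrate_bps = 6e6                   # ~6 Mbps average video bit rate
        file_limit_bytes = 4 * 1024**3      # 4 GiB FAT32 file-size limit
        seconds = file_limit_bytes * 8 / bitrate_bps
        print(f"{seconds / 60:.0f} minutes per file")  # ~95 minutes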

    Edit: please see 2.3 beta 2, here: http://www.personal-view.com/talks/discussion/2123/gh2-cake-v2.2-reliability-and-spanning-in-720p-hbr-24p-and-vmm-at-2-2.5x-stock-bit-rates#Item_380

    Attachment: Cake 2.3b1.zip (857 B)
    Attachment: Cake 2.3 timelapser 1.zip (836 B)
  • 205 Mbps? Uh, what? The bit rate of Cake is like 40-60 Mbps, right? By the way, just for your info: I'm using a Transcend 32 GB Class 10 with Cake (HBR) without any problems.

  • @mrttt Yes, and that's a puzzle to me. I'll have to check if I threw in any post filters when I did the edit... I might have done some color correction. What's the timecode where you see this?

    What really puzzled me is that Cake 2.0 delivered 205 Mbps... I never expected that, although I know that water motion really increases the complexity of the visual field.

  • @ahbleza

    Re: your test with the boats in the river... on the second shot of the ship passing from left to right, the high-bit-rate one looks much darker to me. There's a large difference in contrast. Have you noticed this?

  • Since there seems to be some demand for settings that combine Timebuster 2 + Cake 2.x, I'm willing to tailor a branch that is the best merge of both. For this I need to relax some of Timebuster's main goals, so I need those who are interested to answer a few questions:

    • Are you ready to sacrifice absolute control over IQ (either lowering it to reach 24h, or raising it for better IQ) and settle for the default quantizer parameter of 20?

    • Must you use a 360° shutter (and therefore HBR), or do you not mind about it, preferring to sacrifice the 360° shutter and keep 24p instead?

    VBR patches like Cake are already a good approach to time-lapsing, and all the better if they have a long-GOP nature, because the temporal redundancy raises their efficiency to a whole new level. What a specifically tailored settings definition can achieve over pure Cake is ~2.5x the recording length when very slow (1/2 s - 1/2.5 s) shutter speeds are used, and this gain is mainly due to adjusting the GOP length.
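
    To illustrate the GOP-length effect, here is a back-of-the-envelope sketch in Python. The frame sizes and the P/B-to-I ratio are invented assumptions for a nearly static, slow-shutter scene, not measurements:

        # With almost no frame-to-frame change, P/B frames compress to a small
        # fraction of an I frame, so longer GOPs amortize the expensive I frames.
        i_frame_bits = 1.2e6     # assumed I-frame size
        pb_fraction = 0.08       # assumed P/B size relative to I, static scene
        fps = 30

        def avg_bitrate(gop_len):
            bits_per_gop = i_frame_bits * (1 + pb_fraction * (gop_len - 1))
            return bits_per_gop / gop_len * fps

        for gop in (3, 12, 30):
            print(f"GOP {gop:2d}: ~{avg_bitrate(gop) / 1e6:.1f} Mbps")
        # GOP 3 -> ~13.9 Mbps vs GOP 12 -> ~5.6 Mbps: roughly the ~2.5x gain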

  • @Wigginjs - love your sample clip, especially the hose part.

    I feel dumb, but is VMM = Variable Movie Mode, the one where you can go as slow as 80% and up to 300%?

    I'm going to be using your Cake codec on a GH2 I'm using to record our wedding. Very happy (for the wedding and the codec!)

  • @feha Samsung 32 GB "white" cards are Class 10, but their performance is poor.

  • In general, Class 10 should be the minimum, so why go below Class 10 anyway?

  • @Rammstein

    Cake 2.x ALREADY requires SanDisk Extreme Class 10 cards for maximum reliability.

    If we all start asking @balazer to tweak Cake for our specific cards (Samsung, Transcend, Delkin, Kingston, Ultima, etc.), we will drive him mad :-)

    Think about the time he has spent, and still spends, on Cake... Isn't that worth more than $40 for a new card? We are lucky to have such a deal.

  • No, I wasn't going to change the reliability or memory card requirements. The way the settings are working, a high detail, low motion scene will generate sustained bit rates of around 67 Mbps. But if you pick the camera up and move it around, the bit rate drops to around 40 Mbps, even though the encoder could still be using less quantization (for higher quality and higher bit rates). I wanted to find a way to boost the average bit rate while keeping the max bit rate the same, but I couldn't find a way to do it without resorting to frame limits or fallback mode. So instead I will keep things much the same as they've been, with the same max bit rate and the same card compatibility, except correcting the problem of 24H sometimes going into fallback mode on some high detail, low motion scenes. I will post settings tomorrow after one more round of tests.
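
    For context, that worst case translates into an easy-to-compute sustained write requirement (rough arithmetic, ignoring audio and filesystem overhead):

        # Sustained card write speed implied by a ~67 Mbps peak.
        peak_mbps = 67
        required_mbytes_per_s = peak_mbps / 8
        print(f"~{required_mbytes_per_s:.1f} MB/s of sustained writes")  # ~8.4 MB/s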

  • @balazer - "Now I'm trying to boost the average bit rate and get it closer to the max bit rate" - this means i have to change my memory card? Remember "reliability and spanning".My bad samsung memory card is not even class 10 . Class 10 is at least 10 MB/s write speed . "Up to 24 MB/s read speed and 13 MB/s write speed" say in advertising but in reality is 18 MB/s read speed and 9.5MB/s write speed .And like me there are many. http://www.amazon.com/Samsung-Flash-Memory-Brushed-Metal/dp/B005TUQU00/ref=sr_1_1?s=electronics&ie=UTF8&qid=1334719200&sr=1-1

    Attachment: Capture.JPG (480 × 611, 81K)
  • An embarrassing mistake is 2+2=3; the one you made is just a human mistake...

  • Chris was right. The encoder was going into fallback mode in 24H in some highly detailed scenes. Other modes are fine. I had simply not set the bit rate correctly in 24H. It's an embarrassing mistake, especially since I've done this hundreds of times across the different modes. I'll post an update tomorrow after I get my sun back.

    Edit: I'm still working on this. Getting the settings to not go into fallback mode was easy enough. Now I'm trying to boost the average bit rate and get it closer to the max bit rate.

  • Shot with Cake 2.1 using the 300% VMM and 720p60 (SH).

  • @balazer I have a documentary shoot the day after tomorrow, and I shoot in 24p. Would that be a problem? Sorry, I just don't get what the problem is.

  • Thanks for the info, Chris. It seems I have some work to do to fix this.

  • @balazer @44M

    The "cadence" phenomenon nearly always happens as a consequence of the codec using fallback quantization scaling matrices. With P and B frames there are basically three things going on: There are "delta" macroblocks where the current macroblock is represented as a change (or delta) from another frame's (I or P frame) macroblock in the same location. There are "motion vectors" where the current macroblock is a delta based on a macroblock from another frame's macroblock shifted left/right/up/down. Finally, there are "stand alone" macroblocks coded in the same way as I frame macroblocks. In the case of motion vectors there is also data that represents small changes that need to be applied to the macroblock to maintain fidelity - so motion vectors are also combined with delta values.

    This means several things. When fallback matrices are used (i.e. when frame size falls to a very small value), changes between current macroblocks and their reference macroblocks are crudely (very crudely) encoded. With static scenes you would probably not see the difference because the subject isn't changing. Even with dynamic scenes where lighting etc. doesn't change, it might be hard to see a difference if things are moving but not otherwise changing (like with a slow pan across a static scene). However, if a scene is moving, and lighting qualities are also changing dynamically, the results could be very bad indeed.

    The bottom line is that when these cadence issues come up the codec's ability to faithfully reproduce changes from frame to frame is seriously compromised. Now, with certain scenes that might not be so visible; but don't kid yourself - the codec is significantly crippled. It seems to me that the purpose of hacks is to get the best codec performance possible. When it operates in a mode that is known to seriously undermine fidelity, that seems contrary to the purpose of hacking in the first place.
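
    To make the fallback effect concrete, here is a toy Python sketch. The block size and quantization steps are invented numbers, not the codec's actual matrices; it just shows how coarse quantization discards most of the small frame-to-frame changes:

        import numpy as np

        # A motion-compensated residual: small frame-to-frame changes.
        rng = np.random.default_rng(0)
        residual = rng.normal(0, 4, size=(16, 16))

        def quantize(block, step):
            return np.round(block / step) * step

        fine = quantize(residual, step=2)     # normal matrices: detail survives
        coarse = quantize(residual, step=16)  # fallback-like: deltas round to 0

        print("fine error  :", np.abs(residual - fine).mean())
        print("coarse error:", np.abs(residual - coarse).mean())
        print("coarse zeros:", (coarse == 0).mean())  # changes discarded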

    It's easy to detect this by running the JM-SP decoder in StreamParser. It places an asterisk next to each frame (QST-High column) that has been encoded with fallback matrices. Another thing that can happen is that lots of macroblocks may be skipped (also shown by the JM-SP decoder) - which is also bad because that means that no changes whatsoever are being encoded. Remember, though, that you have to create an elementary stream file (under "Tools") before you can run the JM-SP decoder (also under "Tools").
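
    If you want to automate that check, something like the Python sketch below could tally the flagged frames from an exported frame table. Note that the CSV layout, the "QST-High" column name, and the '*' marker are assumptions based on the description above; the real StreamParser export may differ:

        import csv

        # Count frames flagged as using fallback quantization matrices.
        def count_fallback_frames(path):
            total = fallback = 0
            with open(path, newline="") as f:
                for row in csv.DictReader(f):
                    total += 1
                    if "*" in row.get("QST-High", ""):
                        fallback += 1
            return fallback, total

        fb, n = count_fallback_frames("stream_frames.csv")
        print(f"{fb} of {n} frames used fallback matrices")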

    Chris

  • @balazer Cake 2.2 also spans on the GF2 in FSH with a SanDisk 32 GB Extreme - great patch for that little camera!

  • I'm not good at pixel-peeping, but I haven't noticed any significant decrease in quality.
    I just reported it thinking it might interest you. Good to know you were already aware of it and that, usually, it's not an issue :-)

  • I can't say anything more about what you've shot without a more detailed analysis. You are welcome to raise the quantizer setting, to tune the settings more to that kind of scene.

  • @balazer Hi, thanks for your effort and diligence. I'm not sure I completely agree that IQ isn't affected when the cadence happens. I've watched a couple of those sequences carefully, and it looks like fine detail gets softened when it happens. It seems to degrade the IQ slightly, to my eyes anyway.

    Attachment: cake21-24H_cadence.PNG (746 × 412, 145K)
    It is normal to see ups and downs in the frame sizes. That is the encoder's rate control adjusting the quantizer setting to keep the bit rate in range. Whether or not an uneven pattern of frame sizes is really a problem depends on what quantizer settings the encoder is using for the frames. As long as the small frames aren't too low in quality, it is fine. Have you looked at the video? The video can appear perfectly normal even with this kind of unevenness in the frame sizes. Occasionally, on a very high detail, low motion scene, the frames get too big, and the encoder is forced to lower the size of subsequent frames more than you'd like. It is a consequence of having short GOPs, a high maximum bit rate setting, and a low minimum quantizer setting. But I'm not going to change those things just to get a slightly more even bit rate in those kinds of scenes, because that would reduce the quality of more common scenes. If you want to tune the settings to better handle that particular kind of scene, raise the quantizer setting in PTool to 20 or 22.

    Attached is an example of uneven frame sizes in 24H. That unevenness looks bad in Stream Parser, but there's not a thing wrong with the video. The quantization parameters are not too high, and the video looks great. What matters here is that the small frames are good quality, and they are. Don't let the relatively large size of the large frames throw you off. Those frames are huge. If anything, the problem is that the big frames are too big, and hitting the frame limit. This would be fixed by raising the minimum quantizer setting, but again, at the expense of quality in more common scenes.
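
    To picture what the rate control is doing, here is a toy simulation in Python. The frame-size model and every number in it are invented for illustration; this is not the GH2 encoder's actual algorithm:

        # The controller raises the quantizer (QP) after the buffer fills with
        # oversized frames and relaxes it when there is headroom, producing the
        # up/down frame-size pattern seen in Stream Parser.
        target_bits = 1_000_000              # per-frame budget (invented)
        qp, buffer = 18, 0.0                 # 18 = assumed minimum quantizer

        def frame_bits(complexity, qp):
            # Assumed model: frame size falls geometrically as QP rises.
            return complexity * 0.87 ** (qp - 18)

        for complexity in [3e6, 3e6, 1e6, 1e6, 3e6, 3e6]:  # detail per frame
            bits = frame_bits(complexity, qp)
            buffer += bits - target_bits
            print(f"QP {qp:2d}  frame {bits / 1e6:.2f} Mb  "
                  f"buffer {buffer / 1e6:+.2f} Mb")
            if buffer > target_bits:
                qp += 2                      # frames too big: quantize harder
            elif buffer < -target_bits:
                qp = max(18, qp - 2)         # headroom: back toward minimum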

    Thanks for the feedback, gameb. Glad it's working well for you.

    Attachment: uneven_frame_sizes.png (1152 × 698, 26K)