BlackMagic Pocket Cinema Camera 4K
  • 484 Replies
  • WTF! Blackmagic didn't invite me to their camera party! I'm buried just a few km away! - Stanley Kubrick #rollinginthegraveMAD!!!!!

  • Voldemort is apparently not going to buy one of these now because Black Magic won't give him free stuff and because @johnbrawley didn't kiss his butt. I guess we'll have to miss out on a bunch of bad pseudoscientific review comments about how "cinematic" and/or "filmic" it is. :(

  • @DrDave

    You can't have a fanless design with the powerful FPGA they use, as it is around 10-30 times less energy efficient than the custom LSIs big companies use.

  • It's probably comparable to the IBIS noise on a lot of modern cameras. My G85 buzzes away the whole time you're using it and it hasn't caused me any problems yet.

  • Fan noise? Hmmmmmmmm.

  • @johnbrawley Thanks for taking time to respond to my post & I look forward to seeing you shoot the "8K URSA MAJOR" they must have under wraps sometime soon- I hope they let you keep it permanently... ;-)

  • @NickBen

    Sorry for the delay.

    It supports HDR in that it captures a high dynamic range image. How you map that is up to you. But there's nothing formal that I know of that does that as yet.

    They have done a new version of what used to be called "Video" which was their REC 709ish LUT. It's now called "extended video" and it's a real improvement when used with the new gen 4 colour science.

    The screen is the same screen as used on the Video Assist 5" and the Ursa Mini 4.6K. A good screen, but not a high nit screen for any kind of serious monitoring. I think of it as a touch / menu / interface screen.

    There is fan noise, though it's not especially loud, on a par with the Ursa Mini. It's a constant fan, like all BMD cameras: they run the fan continuously to maintain a steady temperature rather than cycling it on only when the camera gets hot. Weirdly it vents out the bottom instead of the top of the camera. I find my hand gets warm when holding it underneath. Not uncomfortable, but you notice it.

    I didn't try it on a gimbal. I kind of hate them :-) (I own a Ronin 2 and a Ronin-S)

    I didn't use the SSD feature.

    The rear display uses almost no power, and just like on all BMD cameras, it's always on because it's your status screen.

    I've been using the existing screens in other iterations in all the weather situations you mention and it's worked fine for me.

    I didn't do any stills, and yes it can do timelapse but I didn't use it for that. I literally had a very short amount of time with it. When I shot with it I believe it was the only "functional" version in the world. It was hand carried and flown to me to use and then taken away :-)

    I imagine it will do great green screen, but I didn't use it in this scenario.

    It's not technically late yet, whereas RED...


  • @Vitaliy_Kiselev can you please split off all posts about colour "science" into a separate topic and clear this one up to be only about the new BMPCC4K?


  • @GeoffreyKenner Have you ever produced an HDR video? Have you ever seen one on a 4K HDR TV, even a $299 one? Or even on an HDR-capable smartphone? If you have not, then I understand why you don't get it. If you have, then you must be blind. More colors are better than fewer, 1000 nits is better than 250 nits, and 12 stops of DR is better than 5-6. The difference between SDR and HDR is stunning on any HDR-capable device. Most of this is the DR advantage rather than the color, but still, arguing against reproducing more colors is silly. Is the argument that if it's not perfect, it's not worth doing?

    The Shogun Inferno can display 1500 nits, so you can even monitor in HDR in the field, albeit with less than the full color gamut.

    I agree that not many displays can show the full REC2020 gamut (one m, btw). So therefore, only shoot in REC709? What exactly are you recommending? And no, going from REC709 to REC2020 produces false colors. You cannot recreate colors in the world that are not recorded, by any math transform. Look at the 2D gamut graphs: REC709 misses more of the colors we see than REC2020, S-Gamut, or V-Gamut do.

  • Considering that even top-of-the-line $30,000 high-end color grading displays don't have the capability to properly display Rec2020, nor to reproduce the peak brightness necessary for HDR, I don't even know why the heck so many people actually want it in camera. Right now, unless you own two 4K laser Christie projectors, your only option is to generate a profile LUT that adjusts for your screen's maximum capability so it shows roughly what the HDR output will look like, but you'll never be able to work with the full gamut.

    And yes, Vitaliy_Kiselev is right: you do a color space transformation even from Rec709 to Rec2020, it all depends on your screen's capabilities. The latest trend I've seen that messes up a lot of final output is people rushing to color grade in P3 on their monitors (Eizo and BenQ, to cite the most affordable). It's a huge mistake, considering that such a monitor doesn't display 100% of the gamut in the first place and has a gamma response that differs from the actual screen signal. It's far better to grade in RGB Rec709 and do a color space transform to DCI-XYZ; the result will be much more faithful if the display is well calibrated to Rec709.

    But if it's a concern, shoot Raw and do your own color management. The footage looks fantastic, btw. I'm waiting, though, to see some reports on bugs/crashes and what compromises that cheap price brings. But just the fact that you might get Resolve Studio with it is a very good thing.
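    For anyone curious what a colour space transform actually does numerically, here is a minimal sketch of the linear-light RGB-to-XYZ step using the published Rec.709 matrix (D65 white point). This is only the 3x3 matrix part; a real transform (e.g. in Resolve) also handles the transfer function (gamma), which this deliberately skips.

```python
# Minimal linear-light Rec.709 RGB -> CIE XYZ conversion (D65 white point).
# Only the 3x3 matrix step of a colour space transform; transfer
# functions (gamma) are deliberately left out for clarity.

REC709_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def rec709_to_xyz(r, g, b):
    """Convert linear Rec.709 RGB (0..1) to CIE XYZ."""
    return tuple(
        row[0] * r + row[1] * g + row[2] * b
        for row in REC709_TO_XYZ
    )

# Reference white (1, 1, 1) should land on D65: roughly (0.9505, 1.0, 1.089).
x, y, z = rec709_to_xyz(1.0, 1.0, 1.0)
```

    Going the other way (XYZ to a wider gamut like Rec.2020) is just a different matrix, which is why a transform can remap but never invent colours that were not recorded.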

  • The previous BMPCC wasn’t known for its low noise. It produced a fantastic picture, but it wasn’t exactly a low-light performer. Fast forward to the BMPCC4K and things have changed, a lot. The new camera has dual native ISOs of 400 and 3200; however, the camera will switch automatically to the upper gain circuit at ISO 1250. Using ISO settings above or below those two native settings will have an effect on dynamic range, but only a minimal one.

  • @bannedindv specs provided by the shooter of the Bubblegum video:

    • RAW 3:1 4K DCI
    • 24fps Project / 60fps Off-Speed
    • ISO 400, 1250, 3200, 5000
    • Samsung T5 SSD through USB-C

    • DJI Ronin-S

    • Metabones EF to MFT T Speedbooster XL 0.64x
    • Shot with Contax Zeiss MM primes: 35/f2.8, 50/f1.4, 85/f1.4, 135/f2.8 + Hoya Pro NDs
    • 95% Ronin-S / 5% handheld.
  • I find the Lee filter swatches very useful. There is a great app for Android and iPhone.

    The x and y are the coordinates in the CIE 1931 colorspace, which is the theoretical reference for every color we can see and reproduce (and then some, presumably).
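    To make x and y concrete: they are just the first two normalised coordinates of a CIE XYZ triple, with overall luminance divided out. A quick sketch (the D65 white point is used as the example value, nothing camera-specific):

```python
# CIE 1931 chromaticity: project an XYZ tristimulus value onto the
# xy plane by normalising out overall luminance.

def xy_chromaticity(X, Y, Z):
    total = X + Y + Z
    return X / total, Y / total

# D65 white point in XYZ; its chromaticity is the familiar (0.3127, 0.3290).
x, y = xy_chromaticity(0.9505, 1.0, 1.089)
```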

    @Lohmatij I agree about the absolutely subjective dimension of colour; there is no way to determine whether the two of us perceive a given wavelength in the same way. Any comparison is done through language. Wittgenstein's Remarks on Colour is a fascinating, though pretty demanding, read.

    Hope I am not hijacking this thread.

  • @eatstoomuchjam

    I wasn’t correct when I mentioned a purple “spectrum”; the proper way would be to say “a group of purple colors.” I was talking about the XYZ color model, which can’t describe those colors accurately, and because the RGB color model and all existing RGB color spaces are based on it, we can say that they will have trouble rendering those colors too.

    Keep in mind that while some colors, like violet, can be described as a single wavelength, some colors have to be described as a combination of different wavelengths. Purple doesn’t have any corresponding wavelength; it’s a combination of pure blue and pure red, two colors at opposite ends of the spectrum. Another simple example is the color white, which consists of an equal combination of all visible wavelengths.
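    In RGB terms the point is easy to see: a purple/magenta is produced by driving the red and blue primaries together, with no single "purple" wavelength involved anywhere. A toy illustration with 8-bit channel values (just for the idea, not tied to any particular color space):

```python
# Purple has no spectral wavelength: on an RGB display it is simply
# red and blue light mixed additively, shown here with 8-bit values.

def mix(a, b):
    """Additive mix of two 8-bit RGB colours, clipped to 255 per channel."""
    return tuple(min(c1 + c2, 255) for c1, c2 in zip(a, b))

RED = (255, 0, 0)
BLUE = (0, 0, 255)

magenta = mix(RED, BLUE)                # (255, 0, 255): full red + full blue
white_ish = mix(magenta, (0, 255, 0))   # adding green gives white
```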

  • The last shot in the Bubblegum video is of the operator holding what looks like a Ronin S, FWIW.

  • @Lohmatij "for example there is a big chunk of purple spectrum which just can’t be described in math."

    This statement seems ridiculous. What we perceive as colors are portions of the visible spectrum of wavelengths. Purples are the wavelengths at the short end of the visible spectrum (wavelength 450–400 nm, frequency 670–750 THz) and consequently can be easily "described in math."

    If you want to say that no existing color space can accurately render every possible shade of purple, on the other hand, that may very well be true. I'm not familiar enough with color spaces to say.
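    The wavelength and frequency figures quoted above are easy to check with plain physics, no colour science library needed:

```python
# Convert a light wavelength in nanometres to frequency in terahertz:
# f = c / wavelength, with c the speed of light in vacuum.

C = 299_792_458  # speed of light, m/s

def freq_thz(wavelength_nm):
    return C / (wavelength_nm * 1e-9) / 1e12

# 400 nm is about 750 THz and 450 nm about 666 THz, roughly
# matching the 670-750 THz range quoted above.
f_400 = freq_thz(400)
f_450 = freq_thz(450)
```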

  • Somewhat unrelated question: I've been studying camera movement lately. Were all the slow-motion shots mostly handheld? Thanks.

  • Oh my god. What a babe.

    And the footage looks damn good too. But maybe I’m blinded by the model’s beauty.

    She’s thrown off my gamut response.

  • @libertas

    Color is very subjective; there is no way to tell how different people perceive color, you can only test how different people distinguish colors. Studies show that around 8% of men can’t perceive some colors, i.e. they have some form of color blindness.

    The only thing you can measure is how a color on screen corresponds to a color in real life. The problem here is that there is no technology (real, futuristic, or imaginable) which can fully simulate all the light characteristics of a real subject, so you always have to make a simplified model for the sake of comparison. Even the XYZ color space (the base of all color models we use these days) can’t describe all colors accurately: for example there is a big chunk of purple spectrum which just can’t be described in math.

    There is also no proper way to compare eye “parameters” to monitors and cameras. Everything you do is subjective: something can look good enough now and really bad a few decades later. The eye is a really unique instrument; I don’t think there will be a day anytime soon when we will be able to completely “simulate” it. Just a simple example: in the Soviet Union there were experiments which tried to find out the minimum amount of light the eye can register. It was found that after proper adaptation (staying in a dark room for a long time) a person could register a single photon (!!!) of light.

  • @Vitaliy_Kiselev This is incorrect. Color science actually started from such experiments and has advanced enormously.

    Do you mean psychological experiments that try to determine the nature of perception, or experiments that measure the physical characteristics of light? I would like to know more if there is some Soviet-era experimentation that is not well known.

    As for gamut and DR, I think these diagrams can clear things up a bit. Note that the DR our eyes can see is adaptive, meaning that night vision and seeing in strong sunlight are two different functions.



  • @markr041

    I’d recommend you download the DaVinci Resolve manual and read it thoroughly. You’ll get answers to most of your questions and a better understanding.

    For a proper gamut transform I’d recommend using ACES or “DaVinci Color Science”; it’s not gonna clip your DR or color gamut (LUTs will). You can use it for the BMPCC as well.

  • Blackmagic Film Log has its own gamut; it's not Rec709. (You can easily test this with the Color Space Transform tool in Resolve.)

    The difference between Raw and ProRes on all their cameras, btw, is that Raw is 12-bit and ProRes is 10-bit.
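    The practical difference those two extra bits make is in the number of code values per channel: each extra bit doubles them, so 12-bit Raw has four times the tonal steps of 10-bit ProRes. A quick sketch:

```python
# Code values per channel at a given bit depth: 2 ** bits.

def code_values(bits):
    return 2 ** bits

prores_levels = code_values(10)        # 1024 levels per channel
raw_levels = code_values(12)           # 4096 levels per channel
ratio = raw_levels // prores_levels    # 12-bit has 4x the tonal steps
```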