Short video about film and modern sensors
  • 20 Replies
  • Dang....I never knew that high altitudes and gamma rays affect our cameras in this way. It makes a whole lotta sense though. Cool video.
  • Really interesting stuff - particularly about wide lens angles and CA (chromatic aberration) and as Ian_T says, the effects of gamma rays on sensors. Educational!
  • I'm not so sure how accurate the high-altitude segment is. I had never heard of this issue before the video was released, and most of the consensus I've heard is that camera sensors are fine on airplanes.
  • Bayer, gamma correction, subsampling, color space, algorithms, 4:2:2 4:2:0 etc.

    Are we sure we are ready to move to 3D?
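For a concrete feel for what 4:2:2 vs 4:2:0 actually buys you, here's a minimal sketch (illustrative numbers only; real codecs compress far beyond this) of the uncompressed per-frame sizes each subsampling scheme implies:

```python
# Rough per-frame sizes (8-bit) for common chroma-subsampling schemes.
# The key is how many Y+Cb+Cr samples each group of 4 pixels carries.

def frame_bytes(width, height, scheme):
    """Uncompressed 8-bit YCbCr frame size in bytes for a subsampling scheme."""
    samples_per_4px = {"4:4:4": 12, "4:2:2": 8, "4:2:0": 6}
    return width * height * samples_per_4px[scheme] // 4

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    mb = frame_bytes(1920, 1080, scheme) / 1e6
    print(f"{scheme}: {mb:.1f} MB per 1080p frame")
```

So 4:2:0 carries half the data of 4:4:4 before any compression even starts, which is why it dominates consumer formats.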
  • no. I've been working with 3D at my internship - lots of footage from a 5D MkII side-by-side rig, using Cineform. 3D will be a cinema novelty for a long time to come. Here are my reasons for this:

    1. Narrative film is a language (think Hollywood style as one "language"). Viewers have learned this language well. They understand cuts and scenes and jumps in time, music/sound cues, etc. Filmmaking is a "shorthand" language. It jumps and abbreviates, almost ridiculously so. But with 3D, this doesn't work so well. Jumping from shot to shot requires the viewer to literally refocus on something different with their eyes, and what's more, shallow DOF simply looks strange because it's as if stuff is there, but you just can't see it. Until a new "3D friendly" editing cadence is learned by viewers, 3D will feel awkward.

    2. Two words: SCREEN SIZE. Guess what? 3D requires two cameras to be placed in two different viewing positions, and the position changes based on various factors (closest object, farthest object, etc.) One of these factors (SURPRISE!) is the size of the screen it is intended to be played on. So you know The Hobbit? Peter Jackson is shooting that on paired RED cameras in 3D . . . and for every shot, the cameras are being set to either a specific separation or a specific toe-in, and no doubt this setting is being optimized for a whopping huge cinema screen. Guess what? If he wanted it to look good on your plasma when you buy it on Blu-Ray, the cameras would need to be set differently. So what? Don't buy a 3D TV just to watch 3D movies.

    My guess is that sports and TV in general are more likely to move to 3D and work out okay. Screen sizes for TV are all reasonably similar, and TV involves less cutting from shot to shot.

    P.S. I got the pieces of my self-designed mirror rig back from the anodizer today, and it was like Christmas! (Too bad it belongs to the boss at my internship, not me, or I'd get a 2nd GH2 and shoot HDR stuff.)
  • I think for 3D to REALLY work they have to treat it like Cinerama- i.e., every shot is super wide angle and with deep focus. As B3Guy points out, having shallow DOF with 3D just seems weird.
  • Plus Cinerama didn't cut nearly as much as "traditional" narrative cinema. Often shots and scenes played out in masters (see HOW THE WEST WAS WON for examples).
  • yep, a whole new language essentially needs to be "written" for 3D, and that just takes time. Look at how long "hollywood style" language took!

    And the importance of knowing the end screen size when setting up the cameras really is critical. My boss asked me today to convert their demo reel (shot with their 50" Sony in mind) for viewing on an iPod. It looked horribly flat, because the footage was shot for a bigger screen. If the same footage were projected on a cinema screen, the separation would be too great, and things would appear to be "behind your head" (even though you could still focus on them). This is what causes the headaches from watching 3D when the cameras were sloppily set up in the first place: your eyes have to diverge in this situation to fuse the two images, but the human eye is used to only having to converge (toe in).
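The screen-size dependence described above can be sketched with a toy calculation. Every number here (the screen widths, the 2% background-parallax figure, the ~65 mm interocular distance) is an illustrative assumption, not a production value - the point is only that parallax scales with screen width:

```python
# Toy model: on-screen horizontal parallax is a fraction of frame width,
# so its physical size grows with the screen. If it exceeds the human
# interocular distance (~65 mm), the eyes must diverge to fuse the two
# images - the headache case described above.

INTEROCULAR_MM = 65.0

def physical_parallax_mm(parallax_frac, screen_width_mm):
    """Parallax expressed as a fraction of frame width, in millimetres."""
    return parallax_frac * screen_width_mm

# Footage set up for a 50" TV (~1100 mm wide) with 2% background parallax:
frac = 0.02
for name, width_mm in [("iPod", 74), ("50-inch TV", 1100), ("cinema screen", 12000)]:
    p = physical_parallax_mm(frac, width_mm)
    print(f"{name}: {p:.0f} mm parallax, eyes must diverge: {p > INTEROCULAR_MM}")
```

On the iPod the parallax collapses to a millimetre or two (the "horribly flat" look), while on a cinema screen the same fraction blows past the interocular distance and forces divergence.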
  • At 82, my dear photographer friend, who'd always used film, said to me, "With these digital cameras I see a line around everything which nobody wants to talk about." I explained what I thought was chromatic aberration - from the sensor. He was happy to have found someone who'd seen it, too.
    Now, Robert, we have someone who's explained it.
    [Robert's Eulogy:]
  • (Doesn't 3D deserve its own Topic?)
  • yes, apologies :-) but you got me a-talking!
  • Can anyone find anything that confirms Hummel's information on gamma rays and sensors? I could not find anything that supported it; just people saying CMOS sensors are not affected by airport scanners. It seems a bit unlikely; there are low levels of background gamma radiation all around us anyway.

    I hope it's not true; my GH2 has been on two flights now!
  • @Ptchaw
    >Can anyone find anything that confirms Hummel's information on gamma rays and sensors?

    You are right to seek better data. Hummel was possibly talking so fast he didn't cite his sources - as he should have. This is science, not sales. Of course Hummel did declare his Kodak interest and is therefore ethically in the clear.

    Here's what I found:

    Specifically, (3MB PDF):

    [Since this gamma-ray damage story could so easily have been an internet hoax:] A recent ABC Science Show journalist interviewed laypersons about a current scientific issue on which public views differ from those of scientists. One woman interviewed said (to the interviewer's disbelief) that scientists' findings should carry no more weight than popular opinion - that the issue should be debated in some kind of trial-by-Facebook!

    In this video, I find the sensor/film capture information fascinating on its own. On the other hand, could we maybe start a different thread about the issue of camera protection?
    I once wrote a course on camera care; statistically, the biggest damage was caused by dropping the camera - so we worked a lot on preventing those occurrences.
  • @Roberto

    I read a few of the papers you pointed to. Most of them talk about space applications. The difference in gamma rays between sea level and, say, 10,000 m isn't all that big. The difference in space, however, is.

    One interesting thing was mentioned in some of the papers, though: it seems that CMOS sensors are considerably less sensitive to this than CCDs. I've taken lots of CMOS-sensor cameras on airplanes multiple times, and I've never gotten any new dead pixels.

  • @cbrandin
    My URLs really only pointed out the enormous resources available via Google Scholar. Ptchaw wrote,
    "I could not find anything that supported it; just people saying CMOS sensors are not affected by airport scanners."
    so, rather than just reading the anecdotes we get on photography forums, try Scholar. If I add the search terms:
    ["gamma rays" ccd cmos sensor damage aircraft]
    the results returned do start to include low-level aircraft.
    At first glance I see, from
    .."the trapped radiation is found at lower altitudes. This is called the South Atlantic or Brazilian Anomaly (SAA) and dominates the radiation received by low earth orbits".

    I see we do need someone besides Kodak to express this again in layperson's terms.
    NASA does a good job at

    However, one thing spokesman Hummel does clarify quite well is the way in which CCD output is re-mapped to compensate for inactive pixel rows, accounting for the way we users do not see them. (This is different from the separate phenomenon we call "dead pixels".)

    And, once again, rather than this side issue of gamma radiation, which we've just got to live with, I'm personally more interested in the video's content showing how sensors work! The video seemed to indicate there's great room for improvement. Maybe sometime soon we'll have far higher resolution, at the very least.
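    The re-mapping of inactive pixel rows mentioned above can be sketched roughly like this. This is a simplified, hypothetical version - real cameras use per-pixel factory defect maps and more sophisticated interpolation - but it shows the idea of why users never see the flagged rows:

```python
# Hedged sketch of defective-row concealment: rows flagged bad at the
# factory are replaced in the output by averaging the rows above and
# below, so they never appear in the image the user sees.

def conceal_rows(frame, bad_rows):
    """Return a copy of frame (a list of rows of ints) with each bad row
    replaced by the average of its neighbouring rows."""
    out = [row[:] for row in frame]
    for r in bad_rows:
        above = out[r - 1] if r > 0 else out[r + 1]
        below = out[r + 1] if r + 1 < len(out) else out[r - 1]
        out[r] = [(a + b) // 2 for a, b in zip(above, below)]
    return out

frame = [[10, 10], [99, 99], [30, 30]]    # middle row is defective
print(conceal_rows(frame, bad_rows=[1]))  # -> [[10, 10], [20, 20], [30, 30]]
```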
  • I liked the video too. I ran into the angle issue when using shift and tilt lenses - depending on the angle the light hits the sensor, you get color shifts across the frame. You can correct this - but it's a real pain in the ass.

    One difference between "digital" lenses and older lenses is that the newer ones are designed so that light hits the sensor at a right-angle. Because film is essentially two dimensional, this was not a consideration when designing older lenses. That means you can get more CA and color shifting with some older designs. In fact, some lenses famous for being extremely sharp with film are rather soft around the edges with digital sensors.

    Interesting article - I guess radiation increases with altitude at a higher rate than I thought. I live at 2,200 m; I guess that means I'll be seeing dead pixels before long.

  • @cbrandin

    That figures. All my film lenses are softer than my Pentax 6mm TV lens. But the softest is my ENG Fujinon meant for 3-CCD cameras. Funny, I'd always assumed manufacturers would try to have light rays travelling parallel just at the back, before the film plane or sensor.
  • Well, if you think about it, video lenses are only designed to be sharp at video resolutions - about 2 megapixels for HD and 1/3 megapixel for SD. I wouldn't expect too much from any ENG video lens in the sharpness department. Film lenses, on the other hand, are designed to be sharp at much higher resolutions - 20 megapixels and above. If a film lens is soft with video, that's a big degradation in performance!
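The megapixel figures quoted above check out with some quick arithmetic. The last line is an illustrative assumption - roughly the pixel count of a ~21 MP full-frame still sensor, standing in for "film-scan territory":

```python
# Back-of-envelope pixel counts behind the comparison above.
formats = {
    "SD (NTSC, 720x480)": (720, 480),
    "HD (1920x1080)": (1920, 1080),
    "Film-scan territory (5616x3744)": (5616, 3744),  # assumed example sensor
}

megapixels = {name: w * h / 1e6 for name, (w, h) in formats.items()}
for name, mp in megapixels.items():
    print(f"{name}: {mp:.2f} megapixels")
```

SD comes out to about 0.35 MP and HD to about 2.07 MP, matching the "1/3 megapixel" and "2 megapixels" figures, an order of magnitude below film-resolution territory.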

  • Well, I must say I've only machined two of the C-mount lenses (the sharp TV ones). Maybe I'll take the plunge, destroy some cinema history and also grind away at some of my relic lenses which are still too soft (and without infinity focus).