Differential compression methods
  • zip compresses each file first and then archives the results; tar+gzip concatenates the files and then compresses the whole stream. I am looking for a compressor that takes similar files in a directory, for example raw image files, computes a delta of each file against the first one, then concatenates the deltas, then compresses (a sketch of the idea follows this post).

    The delta (filtering) stage is there to ensure a reduction in entropy before the final compression step.
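
    A minimal Python sketch of that pipeline, assuming a plain byte-wise delta against the first file and xz (lzma) as the final compressor; the function name, glob pattern, and delta scheme are all illustrative:

        import lzma
        from pathlib import Path

        def delta_concat_compress(directory, pattern="*.raw", out_path="images.xz"):
            """Delta each file against the first, concatenate, then compress."""
            files = sorted(Path(directory).glob(pattern))
            base = files[0].read_bytes()
            with lzma.open(out_path, "wb") as out:
                out.write(base)  # the reference file is stored verbatim
                for f in files[1:]:
                    data = f.read_bytes()
                    n = min(len(base), len(data))
                    # Byte-wise difference modulo 256: similar images yield
                    # long runs of near-zero values, which compress far better.
                    delta = bytes((data[i] - base[i]) & 0xFF for i in range(n))
                    out.write(delta + data[n:])  # trailing bytes kept as-is

    A real container would also have to record each file's name and length so the deltas can be inverted on extraction.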

  • 2 Replies
  • If you tar + xz a directory of files, the large memory window (larger than one image) that xz searches for repeating byte sequences does roughly what you ask for. But repetitions of precisely the same bytes are rarely found in raw images, so don't expect much from that (a sketch with an enlarged xz dictionary follows this post).

    No general-purpose lossless compressor (zip, gzip, xz) will compress images as well as algorithms designed specifically for that kind of media file.

    You could, in theory, encode all your images into one H.264 file in lossless mode and decompress by extracting individual frames from the encoded file (also sketched below).

    But I'm not sure that's worth the hassle compared to, e.g., just using WebP lossless compression on each image independently.
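
    To make the first suggestion concrete: xz only finds repeats that fit inside its dictionary, so the dictionary should be larger than one image. A sketch using Python's tarfile and lzma modules; the 256 MiB dict_size and directory name are assumptions (plain xz -9 caps the dictionary at 64 MiB):

        import lzma
        import tarfile

        # LZMA2 with an enlarged dictionary so the match window can span
        # several raw images at once.
        filters = [{"id": lzma.FILTER_LZMA2, "preset": 9,
                    "dict_size": 256 * 1024 * 1024}]

        with lzma.open("images.tar.xz", "wb", filters=filters) as xz:
            with tarfile.open(fileobj=xz, mode="w|") as tar:
                tar.add("raw_images/")  # hypothetical directory of raw files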
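
    The lossless-video idea could look like the following, sketched as Python driving the ffmpeg CLI; it assumes ffmpeg with libx264 on PATH and images already converted to something x264 can ingest (raw sensor data would need demosaicing first):

        import subprocess

        # Encode a numbered image sequence losslessly (-qp 0) into one stream.
        # For RGB sources, libx264rgb avoids the lossy RGB-to-YUV conversion.
        subprocess.run([
            "ffmpeg", "-framerate", "1", "-i", "img%04d.png",
            "-c:v", "libx264rgb", "-qp", "0", "archive.mkv",
        ], check=True)

        # "Decompress" image 42 by extracting that single frame again.
        subprocess.run([
            "ffmpeg", "-i", "archive.mkv",
            "-vf", "select=eq(n\\,42)", "-vframes", "1", "img0042_out.png",
        ], check=True)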

  • WebP seems to be usable only for RGB images; I didn't see any examples of raw compression.
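
    For reference, per-image lossless WebP as suggested above is a one-liner with Pillow, but it indeed only applies after demosaicing to RGB; the filenames are assumptions:

        from PIL import Image

        # Lossless WebP of an already-demosaiced RGB image. This does not
        # apply to raw Bayer sensor data, which WebP cannot represent.
        Image.open("frame.png").save("frame.webp", lossless=True)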