This is a lossy image compression program that uses wavelets.
It was written for the final project in a Fourier series and wavelets course.
The program can compress images to as little as 20% of their original size with acceptable loss of detail.
The scheme compresses smooth, low-frequency regions more aggressively than high-frequency ones. This way, fine details are preserved even at high compression rates.
Here's how the compression algorithm works:

- The original image is decomposed using the Haar wavelet to generate wavelet coefficients (the wavelet image pyramid)
- Wavelet coefficients are culled based on a threshold value (this is what makes the compression lossy)
- The remaining wavelet coefficients are encoded into a tree structure (the zerotree)
- The zerotree is compressed using Huffman coding

Decompression applies the same steps in reverse.

One of the benefits of wavelet-based compression is that the process can be treated as a pipeline. We can swap in different wavelets (Haar, Daubechies, and so on) without needing to adjust any other part of the pipeline, or use an adaptive arithmetic coder instead of a Huffman coder.
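The decompose-and-cull steps can be sketched in Python. This is a minimal illustration, not the actual program: the function names are made up for this sketch, it performs only a single decomposition level, and the zerotree and Huffman stages are omitted.

```python
import numpy as np

def haar2d(img):
    """One level of the 2D Haar decomposition: pairwise averages and
    differences along rows, then along columns (even dimensions assumed)."""
    def step(x):
        a = (x[..., ::2] + x[..., 1::2]) / 2.0  # approximation (averages)
        d = (x[..., ::2] - x[..., 1::2]) / 2.0  # detail (differences)
        return np.concatenate([a, d], axis=-1)
    rows = step(img)
    return step(rows.T).T

def ihaar2d(coeffs):
    """Invert one level of haar2d."""
    def istep(y):
        n = y.shape[-1] // 2
        a, d = y[..., :n], y[..., n:]
        x = np.empty_like(y)
        x[..., ::2] = a + d
        x[..., 1::2] = a - d
        return x
    rows = istep(coeffs.T).T  # undo the column pass first
    return istep(rows)

def compress(img, threshold):
    """Decompose, then cull small coefficients. Culling is the lossy step;
    zerotree encoding and Huffman coding of the survivors are omitted here."""
    c = haar2d(img.astype(float))
    c[np.abs(c) < threshold] = 0.0
    return c
```

Culling is the only lossy stage; the transform itself round-trips exactly, which is part of what makes the stages easy to swap out independently.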
The Haar wavelet is a good starting point for working with wavelets.
It's a lot simpler to work with than higher-order wavelets and doesn't require special consideration for nasty edge effects.
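To show just how simple it is, here is a sketch of a full 1D Haar decomposition in plain Python (an illustrative helper, not code from the program). For power-of-two signal lengths, every pass pairs up neighbors cleanly, so no boundary padding is ever needed:

```python
def haar1d(signal):
    """Full 1D Haar decomposition of a power-of-two-length signal.
    Each pass replaces the signal with pairwise averages and prepends
    the pairwise differences (details) to the output."""
    x = list(map(float, signal))
    details = []
    while len(x) > 1:
        a = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]
        d = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x), 2)]
        details = d + details  # coarsest details end up first
        x = a
    return x + details  # [overall average, details coarse -> fine]
```

For example, `haar1d([4, 6, 10, 12])` yields `[8.0, -3.0, -1.0, -1.0]`: the overall average followed by the detail coefficients from coarsest to finest.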
Click here for a downloadable executable and an article with more information about the compression algorithm.
And check out my textures page to download the textures used to generate the images above.