Second compression for pixelated images under edge-based compression algorithms: JPEG-LS as an example
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IOS Press, 2021 |
| Subjects: | |
| Online Access: | http://irep.iium.edu.my/93344/2/93344_Second%20compression%20for%20pixelated%20images_SCOPUS.pdf http://irep.iium.edu.my/93344/3/93344_Second%20compression%20for%20pixelated%20images_WoS.pdf http://irep.iium.edu.my/93344/11/93344_Second%20compression%20for%20pixelated%20images.pdf http://irep.iium.edu.my/93344/ https://www.iospress.com/catalog/journals/journal-of-intelligent-fuzzy-systems |
| Summary: | This paper examines a particular case of data compression that occurs when edge-based compression algorithms compress (previously compressed) pixelated images: the compression itself creates new redundancy, which can be removed by another round of compression. This work used JPEG-LS as an example of an edge-based compression algorithm for compressing pixelated images. The output of this process was subjected to a second round of compression using a more robust but slower compressor (PAQ8f). The compression ratio of the second compression was, on average, 18%, which is high for data that should appear random. The results of the two successive compressions were superior to lossy JPEG: with the data set used, lossy JPEG had to sacrifice 10% on average to approach the near-lossless compression ratios of the two-stage scheme. To generalize the results, fast general-purpose compression algorithms (7z, bz2, and Gzip) were also tested. |
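The two-stage pipeline the summary describes can be sketched with standard-library codecs. JPEG-LS and PAQ8f are not available in Python's standard library, so this illustration substitutes zlib for the edge-based first stage and bz2/lzma for the stronger second stage; the synthetic image and the resulting numbers are assumptions for demonstration only, not the paper's data or results.

```python
# Illustrative sketch of two-stage compression of a pixelated image.
# NOTE: zlib stands in for JPEG-LS and bz2/lzma for PAQ8f, since the
# actual codecs used in the paper are not in the standard library.
import bz2
import lzma
import random
import zlib

random.seed(0)

# Synthetic "pixelated" image: an 8x8 grid of random grey levels
# upscaled 8x by nearest neighbour, giving a 64x64 image made of
# large flat blocks separated by sharp edges.
grid = [[random.randrange(256) for _ in range(8)] for _ in range(8)]
pixels = bytes(grid[y // 8][x // 8] for y in range(64) for x in range(64))

first = zlib.compress(pixels, 9)  # first-stage compression
for name, codec in (("bz2", bz2), ("lzma", lzma)):
    second = codec.compress(first)  # second-stage compression
    saved = 1 - len(second) / len(first)  # space saved by round two
    print(f"{name}: first={len(first)}B second={len(second)}B saved={saved:.1%}")
```

On an input this tiny, the second round's container overhead can outweigh any gain, so `saved` may even be negative; the 18% second-stage ratio reported in the paper was measured on real pixelated images with a much stronger second-stage compressor.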
