- Title
Distributed compression and decompression for big image data: JPEG and CCITT Group-3.
- Authors
Barman, Hillol; Kishor, Netalkar Rohan; Kothuri, Satya Sai Karthik; Kukudala, Mounika; Raju, U. S. N.
- Abstract
In today's era, digital data is created and transmitted mainly in the form of images and videos. Storing and transmitting such a huge number of images requires substantial computing resources; compressing the image data before storage saves much of this cost. Image compression is the act of removing as much redundant data as possible from an image while retaining only the non-redundant data. In this paper, to compress and decompress such big image data, a distributed environment with the MapReduce paradigm is used, built on the Hadoop Distributed File System (HDFS) and Spark. In addition, the Microsoft Azure cloud environment is also used. Various setups, namely a single system and 1 + 4, 1 + 15, and 1 + 18 node clusters, are used to compare execution times on a self-created large image dataset. On these four self-made clusters, more than 200 million (219,340,800) images are compressed and decompressed, and the execution times are compared for two traditional image compression methods: JPEG and CCITT Group-3. The efficiency of these two compression methods is evaluated using Compression Ratio, Root Mean Square Error (RMSE), and Peak Signal-to-Noise Ratio (PSNR).
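The evaluation metrics named in the abstract (Compression Ratio, RMSE, PSNR) have standard definitions; the sketch below is an illustrative implementation of those standard formulas using NumPy, not code from the paper itself. Function and parameter names here are assumptions for illustration.

```python
import numpy as np

def rmse(original, reconstructed):
    # Root Mean Square Error between two images of identical shape,
    # computed in float to avoid unsigned-integer wraparound.
    diff = original.astype(np.float64) - reconstructed.astype(np.float64)
    return np.sqrt(np.mean(diff ** 2))

def psnr(original, reconstructed, max_value=255.0):
    # Peak Signal-to-Noise Ratio in dB, with max_value the peak pixel
    # intensity (255 for 8-bit images); identical images give infinity.
    err = rmse(original, reconstructed)
    if err == 0:
        return float("inf")
    return 20.0 * np.log10(max_value / err)

def compression_ratio(original_bytes, compressed_bytes):
    # Uncompressed size divided by compressed size; larger is better.
    return original_bytes / compressed_bytes
```

For example, two 8-bit images differing uniformly by 10 intensity levels give an RMSE of 10 and a PSNR of 20·log10(255/10) ≈ 28.13 dB.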
- Subjects
IMAGE compression; JPEG (Image coding standard); STANDARD deviations; SIGNAL-to-noise ratio; MICROSOFT Azure (Computing platform)
- Publication
Multimedia Tools & Applications, 2024, Vol 83, Issue 17, p50783
- ISSN
1380-7501
- Publication type
Article
- DOI
10.1007/s11042-023-17266-w