Search Results: 1 - 10 of 100 matches
All listed articles are free for downloading (OA Articles)
Page 1 /100
Performance Analysis of Multimedia Compression Algorithms  [PDF]
Eman Abdelfattah, Asif Mohiuddin
International Journal of Computer Science & Information Technology, 2010
Abstract: In this paper, we are evaluating the performance of Huffman and Run Length Encoding compression algorithms with multimedia data. We have used different types of multimedia formats such as images and text. Extensive experimentation with different file sizes was used to compare both algorithms, evaluating the compression ratio and compression time. Huffman algorithm showed consistent performance compared to Run Length encoding.
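The comparison above can be illustrated with a minimal sketch (our own, not the paper's code): computing the compressed size of a byte string under a naive (value, count) run-length encoding and under an optimal Huffman code. All helper names are ours.

```python
# Illustrative sketch (not from the paper): compare compressed sizes under
# run-length encoding (RLE) and an optimal Huffman code. Helper names are ours.
import heapq
from collections import Counter

def huffman_bits(data: bytes) -> int:
    """Total bits needed to encode `data` with an optimal Huffman code
    (weighted path length of the Huffman tree, via repeated merging)."""
    freq = Counter(data)
    if len(freq) <= 1:
        return len(data)            # degenerate alphabet: 1 bit per symbol
    heap = list(freq.values())
    heapq.heapify(heap)
    total = 0
    while len(heap) > 1:
        a, b = heapq.heappop(heap), heapq.heappop(heap)
        total += a + b              # each merge contributes its weight once
        heapq.heappush(heap, a + b)
    return total

def rle_bytes(data: bytes) -> int:
    """Size in bytes of a naive (value, count) RLE; assumes runs <= 255."""
    if not data:
        return 0
    runs = 1 + sum(1 for p, c in zip(data, data[1:]) if p != c)
    return runs * 2                 # one value byte + one count byte per run

sample = b"aaaabbbccd" * 100        # 1000 bytes, skewed symbol frequencies
print(huffman_bits(sample) // 8)    # Huffman payload: 237 bytes
print(rle_bytes(sample))            # RLE: 800 bytes
```

Consistent with the paper's finding: Huffman adapts to symbol statistics and compresses even run-free data, while this RLE only wins on long runs. A real codec would also have to store the Huffman code table, which this sketch ignores.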
Comparative Analysis of Various Cryptographic Algorithms  [cached]
Satish Kumar
International Journal of Computer and Distributed System, 2012
Abstract: Does increased security provide comfort to fearful people? Or does security provide some very basic protections that we are too inexperienced to realize we need? At a time when the Internet provides essential communication between millions of people and is increasingly used as a tool for commerce, trading, research and banking, security becomes a tremendously important issue to deal with. There are many aspects to security and many applications, ranging from secure commerce and payments to private communications and protecting passwords. This paper has two major purposes. The first is to define some of the terms and concepts behind basic cryptographic methods, and the second is to offer a way to compare the myriad cryptographic algorithms in use today.
Dr. V. Radha
Journal of Global Research in Computer Science, 2010
Abstract: Image compression addresses the problem of reducing the amount of data required to represent a digital image with acceptable image quality. The underlying basis of the reduction process is the removal of redundant data. Medical image compression plays a key role as the healthcare industry moves towards filmless imaging and goes completely digital. Medical image compression is a continuing research field, and most proposed research concentrates either on developing a new technique or enhancing existing techniques. The medical community has been reluctant to adopt lossy methods for image compression: the main goal has been to produce an exact replica of the original image, at the cost of large file sizes. Only recently has the use of lossy image compression, which maximizes compression while maintaining clinically relevant data, been explored. Four solutions to this problem were selected, namely Block Truncation Coding (BTC), Discrete Cosine Transformation (DCT), Discrete Wavelet Transformation (DWT) and Singular Value Decomposition (SVD), because of their predominant place in the general image processing field. Various experiments were conducted to analyze the performance of the four image compression models on medical image compression.
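Of the four methods, Block Truncation Coding is simple enough to sketch. The fragment below is our own illustration of an AMBTC-style variant (a bitmap plus two level means per block), not the paper's implementation:

```python
# Illustrative AMBTC-style Block Truncation Coding for one flat pixel block.
# Our own sketch, not the paper's implementation.

def btc_encode(block):
    """Encode one block as (bitmap, low_level, high_level)."""
    m = sum(block) / len(block)
    lo = [p for p in block if p < m]
    hi = [p for p in block if p >= m]
    a = sum(lo) / len(lo) if lo else m   # mean of pixels below the block mean
    b = sum(hi) / len(hi) if hi else m   # mean of pixels at/above the mean
    bitmap = [1 if p >= m else 0 for p in block]
    return bitmap, a, b

def btc_decode(bitmap, a, b):
    """Reconstruct the block: each pixel becomes one of the two levels."""
    return [b if bit else a for bit in bitmap]

block = [10, 10, 20, 20, 12, 18, 11, 19]   # one 8-pixel block
bitmap, a, b = btc_encode(block)
print(btc_decode(bitmap, a, b))
```

Each 8-pixel block compresses to 8 bits plus the two levels; the reconstruction is lossy but preserves the block mean, which is why BTC is attractive as a fast, low-complexity baseline.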
Analysis of Lossless Reversible Transformation Algorithms to Enhance Data Compression  [cached]
Jeyanthi Perumal, Anuratha Vinodkumar
Journal of Global Research in Computer Science, 2012
Abstract: In this paper we analyze and present the benefits offered in lossless compression by applying a choice of preprocessing methods that exploit the redundancy of the source file. Textual data holds a number of properties that can be taken into account in order to improve compression. Preprocessing copes with these properties by applying a number of transformations that make the redundancy “more visible” to the compressor. Many preprocessing algorithms for text files have been developed; they complement each other and are performed prior to actual compression. Here our focus is on the Length-Index Preserving Transform (LIPT), its derivatives ILPT, NIT & LIT, and the StarNT transformation algorithm. The algorithms are briefly presented before calling attention to their analysis.
A Complexity Analysis and Entropy for Different Data Compression Algorithms on Text Files  [PDF]
Mohammad Hjouj Btoush, Ziad E. Dawahdeh
Journal of Computer and Communications (JCC), 2018, DOI: 10.4236/jcc.2018.61029
Abstract: In this paper, we analyze the complexity and entropy of different data compression algorithms: LZW, Huffman, Fixed-length code (FLC), and Huffman after using Fixed-length code (HFLC). We test those algorithms on files of different sizes and conclude that LZW is the best one on all compression scales that we tested, especially on large files, followed by Huffman, HFLC, and FLC, respectively. Data compression remains an important research topic with many applications. Therefore, we suggest continuing research in this field, trying to combine two techniques to reach a better one, or using another source mapping (Hamming), like embedding a linear array into a hypercube, together with other good techniques like Huffman, to reach good results.
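The entropy figure such studies report is typically the empirical (zeroth-order) Shannon entropy of the byte stream, which lower-bounds the average code length of any symbol-by-symbol code like Huffman. A minimal sketch (our own, not the paper's code):

```python
# Empirical zeroth-order Shannon entropy of a byte stream, in bits per byte.
# Our own sketch; a lower bound for symbol-by-symbol codes such as Huffman.
import math
from collections import Counter

def entropy_bits_per_byte(data: bytes) -> float:
    """H = sum_i -p_i * log2(p_i) over the observed byte frequencies."""
    n = len(data)
    probs = (c / n for c in Counter(data).values())
    return sum(-p * math.log2(p) for p in probs)

print(entropy_bits_per_byte(b"aaaa"))        # 0.0  (perfectly predictable)
print(entropy_bits_per_byte(b"abcdabcd"))    # 2.0  (4 equiprobable symbols)
```

A file whose empirical entropy is close to 8 bits/byte (e.g. already-compressed data) leaves little room for any of the four codes compared above.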
Analysis of cipher text size produced by various Encryption Algorithms
International Journal of Engineering Science and Technology, 2011
Abstract: In the digital world, it is the need of the hour to secure data communication from unauthorized access. Over time, a number of techniques and algorithms have come into use for its security. This research paper is a qualitative study emphasizing the need for securing data as well as fast transmission of the encrypted text. It concerns the analysis of selected symmetric block cipher encryption algorithms from the point of view of cipher text size.
Analysis of the various key management algorithms and new proposal in the secure multicast communications  [PDF]
Joe Prathap P M., V. Vasudevan
Computer Science, 2009
Abstract: With the evolution of the Internet, multicast communications seem particularly well adapted for large-scale commercial distribution applications, for example, pay TV channels and secure videoconferencing. Key management for multicast remains an open topic in secure communications today. Key management mainly has to do with the distribution and update of keying material during the group's life. Several key-tree-based approaches have been proposed by various authors to create and distribute the multicast group key in an effective manner. There are different key management algorithms that facilitate efficient distribution and rekeying of the group key. These protocols normally add communication overhead as well as computation overhead at the group key controller and at the group members. This paper explores the various algorithms along with their performance and derives an improved method.
Analysis of SPIHT Algorithm for Satellite Image Compression  [PDF]
K Nagamani, AG Ananth
Advanced Computing: an International Journal, 2011
Abstract: Wavelets offer an elegant technique for representing the levels of detail present in an image. When an image is decomposed using wavelets, the high-pass components carry less information, and vice versa. The possibility of eliminating the high-pass components gives a higher compression ratio in the case of wavelet-based image compression. To achieve higher compression ratios, various coding schemes have been used. Some of the well-known coding algorithms are EZW (Embedded Zero-tree Wavelet), SPIHT (Set Partitioning in Hierarchical Trees) and EBCOT (Embedded Block Coding with Optimal Truncation). SPIHT has been one of the popular schemes used for image compression. In this paper the performance of the SPIHT compression technique for satellite images is studied. Satellite rural and urban images have been used for the present analysis. The standard Lena image is used for the purpose of comparison. For a given compression ratio, the PSNR (peak signal-to-noise ratio) values are computed to evaluate the quality of the reconstructed image. The analysis carried out clearly suggests that the PSNR values increase with the level of decomposition. For the satellite images, the achievable PSNR values are lower than those for the standard Lena image, and the SPIHT algorithm is better suited for compression of satellite urban images.
Harish Jindal
Journal of Global Research in Computer Science, 2012
Abstract: Image compression is a widely explored area of research and application. Many compression techniques have been developed, including a few mixed methods. The Discrete Cosine Transform (DCT) is one of the widely used compression methods. The Discrete Wavelet Transform (DWT) also provides substantial improvements in picture quality due to its multi-resolution nature. Applying both methods together for image compression can provide a sustained Peak Signal-to-Noise Ratio (PSNR) along with a better overall compression ratio. This paper aims at the analysis of color image compression using DCT and DWT for better PSNR and compression ratio. A comparative study of the performance of DCT and different discrete wavelets is made in terms of PSNR, Mean Square Error (MSE) and overall compression ratio to illustrate the effectiveness of this method in image compression. Extensive analysis has been carried out before arriving at the conclusion.
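The two quality metrics this abstract (and the SPIHT study above) rely on, MSE and PSNR, are standard and can be sketched directly. The helper names are our own; `max_val=255` assumes 8-bit pixels:

```python
# Standard image-quality metrics used above: Mean Square Error and
# Peak Signal-to-Noise Ratio. Our own helper names; 8-bit pixels assumed.
import math

def mse(original, reconstructed):
    """Mean of squared per-pixel differences between two flat pixel lists."""
    return sum((o - r) ** 2 for o, r in zip(original, reconstructed)) / len(original)

def psnr(original, reconstructed, max_val=255):
    """PSNR in dB; infinite when the images are identical."""
    m = mse(original, reconstructed)
    return float("inf") if m == 0 else 10 * math.log10(max_val ** 2 / m)

print(round(psnr([0] * 10, [16] * 10), 1))   # 24.0 dB for a uniform error of 16
```

Higher PSNR means better reconstruction; values above roughly 30 dB are usually considered acceptable for lossy image compression.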
Compression Performance Analysis of a Sort of Fast Binary Image Encryption Algorithms
Zhou Qing, Liao Xiao-feng, Hu Yue
电子与信息学报 (Journal of Electronics & Information Technology), 2009
Abstract: Multimedia encryption algorithms have been widely studied in recent years. However, the performance of those algorithms is usually checked by qualitative analysis and simulations. In this paper, the compression performance of a sort of encryption algorithm for binary images is analyzed quantitatively using information theory. The analysis shows that those algorithms perform well in terms of security, speed, compression ratio, robustness and format compliance, and are therefore suitable for practical use.
Copyright © 2008-2017 Open Access Library. All rights reserved.