
Shannon's Random-Cipher Result and the Generalized R-Norm Entropy of Type β

DOI: 10.1155/2013/768384


Abstract:

Using the Fano inequality for the generalized R-norm entropy and the Bayes probability of error, a generalized random-cipher result is proved by taking into account the generalized R-norm entropy of type β.

1. Introduction

It is known that a good cryptosystem can be built provided that the key rate is greater than the message redundancy [1]. Shannon obtained this result by considering the equivocation of the key over a random cipher. By counting the average number of spurious decipherments over a restricted class of random ciphers, Hellman [2] obtained the same result. A similar result was proved by Lu [3], who used the average probability of correct decryption of a message digit as the measure of performance, together with the Fano inequality, for a class of cryptosystems. Lu's analysis is exact, whereas approximations are used in [1]. All of these results are based on the Shannon entropy. Sahoo [4] generalized the results of Lu by considering Rényi's entropy and the Bayes probability of error. The literature of information theory, however, contains various other generalizations of Shannon's entropy. One of these is the R-norm information, which was introduced by Arimoto [5] and studied extensively by Boekee and Van der Lubbe [6]. The objective of this paper is to generalize the results of Lu by considering the generalized R-norm entropy of type β and the Bayes probability of error.

2. Generalization of Shannon's Random-Cipher Result

Consider a discrete random variable X taking values x_1, x_2, ..., x_n with the complete probability distribution P = (p_1, p_2, ..., p_n). Also consider the set of positive real numbers not equal to 1; that is, R⁺ = {R : R > 0, R ≠ 1}. Then the R-norm information [5] is defined as

    H_R(P) = (R / (R − 1)) [1 − (∑_{i=1}^{n} p_i^R)^{1/R}],  R ∈ R⁺.    (1)

This measure is different from the entropies of Shannon [1], Rényi [7], Havrda and Charvát [8], and Daróczy [9]. The most interesting property of this measure is that, as R → 1, it approaches Shannon's [1] entropy, and, as R → ∞, it tends to 1 − max_i p_i.
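The two limiting properties of (1) can be checked numerically. The sketch below assumes the Boekee–Van der Lubbe form of the R-norm information as reconstructed above; the helper names `r_norm_entropy` and `shannon_entropy` are illustrative, not from the paper.

```python
import math

def r_norm_entropy(p, R):
    """R-norm information (1): H_R(P) = R/(R-1) * (1 - (sum p_i^R)^(1/R))."""
    assert R > 0 and R != 1
    return R / (R - 1) * (1 - sum(pi ** R for pi in p) ** (1 / R))

def shannon_entropy(p):
    """Shannon entropy in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.3, 0.2]

# As R -> 1, H_R(P) approaches Shannon's entropy H(P).
print(r_norm_entropy(p, 1.0001), shannon_entropy(p))

# As R -> infinity, H_R(P) tends to 1 - max_i p_i.
print(r_norm_entropy(p, 200.0), 1 - max(p))
```

Note that the R → 1 limit gives the Shannon entropy in nats, since expanding (∑ p_i^R)^{1/R} around R = 1 yields 1 + (R − 1) ∑ p_i ln p_i to first order.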
The measure (1) can be generalized in many ways; Hooda and Ram [10] studied the following parametric generalization:

    H_R^β(P) = (R / (R − β)) [1 − (∑_{i=1}^{n} p_i^{R/β})^{β/R}],    (2)

where R > 0, β > 0, and R ≠ β. The measure (2) may be called the generalized R-norm entropy of type β; it reduces to (1) when β = 1. In case R = 1, (2) reduces to

    H^β(P) = (1 / (1 − β)) [1 − (∑_{i=1}^{n} p_i^{1/β})^β].    (3)

Setting β = 1/R in (3), we get

    H_R(P) = (R / (R − 1)) [1 − (∑_{i=1}^{n} p_i^R)^{1/R}].    (4)

The information measure (4) has also been mentioned by Arimoto [5] as an example of a generalized class of information measures. Although (4) and (1) have the same form, they differ because the ranges of R and β are different. However, (2) is a joint representation of (1) and (4), so it is of interest to study the applications of the generalized R-norm entropy of type β. Let us consider now
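The reductions of (2) claimed above can be verified numerically under the reconstructed form of the measure. This is a minimal sketch, assuming the form of (2) given above; `gen_r_norm_entropy` and `r_norm_entropy` are illustrative helper names.

```python
def gen_r_norm_entropy(p, R, beta):
    """Generalized R-norm entropy of type beta, reconstructed form of (2):
    H_R^beta(P) = R/(R - beta) * (1 - (sum p_i^(R/beta))^(beta/R))."""
    assert R > 0 and beta > 0 and R != beta
    return R / (R - beta) * (1 - sum(pi ** (R / beta) for pi in p) ** (beta / R))

def r_norm_entropy(p, R):
    """R-norm information (1): H_R(P) = R/(R-1) * (1 - (sum p_i^R)^(1/R))."""
    return R / (R - 1) * (1 - sum(pi ** R for pi in p) ** (1 / R))

p = [0.4, 0.35, 0.25]

# beta = 1 recovers the R-norm information (1) exactly.
print(gen_r_norm_entropy(p, 2.5, 1.0), r_norm_entropy(p, 2.5))

# As R -> 1, (2) tends to the type-beta measure (3):
# (1/(1-beta)) * (1 - (sum p_i^(1/beta))^beta).
beta = 0.6
h3 = 1 / (1 - beta) * (1 - sum(pi ** (1 / beta) for pi in p) ** beta)
print(gen_r_norm_entropy(p, 1.0001, beta), h3)
```

The same substitution pattern shows why (4) and (1) share a form: putting β = 1/R into (3) turns the inner exponent 1/β into R and the outer exponent β into 1/R.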

References

[1]  C. E. Shannon, “Communication theory of secrecy systems,” The Bell System Technical Journal, vol. 28, pp. 656–715, 1949.
[2]  M. E. Hellman, “An extension of the Shannon theory approach to cryptography,” IEEE Transactions on Information Theory, vol. 23, no. 3, pp. 289–294, 1977.
[3]  S. C. Lu, “The existence of good cryptosystems for key rates greater than the message redundancy,” IEEE Transactions on Information Theory, vol. 25, no. 4, pp. 475–479, 1979.
[4]  P. K. Sahoo, “Rényi's entropy of order α and Shannon's random cipher result,” Journal of Combinatorics, Information & System Sciences, vol. 8, no. 4, pp. 263–270, 1983.
[5]  S. Arimoto, “Information-theoretical considerations on estimation problems,” Information and Computation, vol. 19, pp. 181–194, 1971.
[6]  D. E. Boekee and J. C. A. Van der Lubbe, “The R-norm information measure,” Information and Control, vol. 45, no. 2, pp. 136–155, 1980.
[7]  A. Rényi, “On measures of entropy and information,” in Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, pp. 547–561, Berkeley, Calif, USA, June 1961.
[8]  J. Havrda and F. Charvát, “Quantification method of classification processes. Concept of structural α-entropy,” Kybernetika, vol. 3, pp. 30–35, 1967.
[9]  Z. Daróczy, “Generalized information functions,” Information and Computation, vol. 16, pp. 36–51, 1970.
[10]  D. S. Hooda and A. Ram, “Characterization of the generalized R-norm entropy,” Caribbean Journal of Mathematical and Computer Science, vol. 8, no. 1, 2, pp. 18–31, 1998.
[11]  E. F. Beckenbach and R. Bellman, Inequalities, Springer, New York, NY, USA, 1971.
