Entropy and Higher Moments of Information
Authors: Jürgensen Helmut, Matthews David E.
Source: Journal of Universal Computer Science - 2010 - Volume: 16 - Issue: 5 - Pages: 749-794
Abstract: The entropy of a finite probability space or, equivalently, of a memoryless source is the average information content of an event. The fact that entropy is an expectation suggests that, in certain applications, it could be quite important to take into account higher moments of information and parameters derived from them, such as the variance or skewness. In this paper we initiate a study of the higher moments of information for sources without memory and sources with memory. We derive properties of these moments for information defined in the sense of Shannon and indicate how these considerations can be extended to include the concepts of information in the sense of Aczél or Rényi. For memoryless sources, these concepts are immediately supported by the usual definitions of moments; for general stationary sources, let alone general sources, no such applicable framework seems to exist; on the other hand, the special properties of stationary Markov sources suggest definitions that are both well-motivated and mathematically meaningful.
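As an illustrative sketch of the framing in the abstract (the notation here is assumed, not taken from the paper): for a memoryless source with probabilities p_i, entropy is the first moment of the self-information, and the variance of information is the corresponding second central moment,
\[
I(x_i) = -\log_2 p_i, \qquad
H(X) = \mathbb{E}\bigl[I(X)\bigr] = -\sum_i p_i \log_2 p_i, \qquad
\operatorname{Var}\bigl[I(X)\bigr] = \sum_i p_i \bigl(\log_2 p_i\bigr)^2 - H(X)^2 .
\]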
Keywords: information, entropy, moments of information, variance of information
Affiliations: The University of Western Ontario, Department of Computer Science, Canada; University of Waterloo, Department of Statistics and Actuarial Science, Canada