
Entropy or Amount of Information

Article
 
This column should only be modified by the corresponding editor.
- For discussion about any aspect of this article, please use the comments section at page bottom.
- Any document or link considered of interest for this article is welcome.
 
 Editor
J.M. Díaz-Nafría e-mail
 Incorporated contributions
J.M. Díaz-Nafría (25/8/2016)
 Usage domain
Mathematical Theory of Communication 
 Type
Concept
 French
entropie/quantité d’information
 German
Entropie/Informationsgehalt
 
The entropy or amount of information of a discrete information source, characterised by the probability p_j of sending each of its symbols j, is the statistical average:

H = -\sum_{j=1}^{N} p_j \log_2 p_j

being bounded within the limits:

0 \le H \le \log_2 N

where N is the number of symbols.
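As a numeric sketch of the definition and its bounds (the 4-symbol alphabets and their probabilities below are purely illustrative):

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum_j p_j * log2(p_j) of a discrete source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative 4-symbol sources (N = 4, so 0 <= H <= log2(4) = 2 bits)
uniform = [0.25, 0.25, 0.25, 0.25]   # all symbols equally likely
skewed  = [0.7, 0.1, 0.1, 0.1]       # one symbol dominates

print(entropy(uniform))  # 2.0 bits, the upper bound log2(N)
print(entropy(skewed))   # lower: the skewed source is less uncertain
```

The uniform source attains the upper bound log2(N); any bias toward some symbols reduces H, reaching 0 only when a single symbol has probability 1.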

In case the source may adopt various states i, with P_i the state probability and p_i(j) the probability of sending symbol j when the source is in state i, the entropy is defined as the average of the entropies H_i of each state:

H = \sum_i P_i H_i = -\sum_i P_i \sum_j p_i(j) \log_2 p_i(j)
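The state-averaged definition can be sketched as follows, assuming a hypothetical two-state source with made-up state and symbol probabilities:

```python
import math

def entropy(probs):
    """Shannon entropy of one probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical two-state source: P[i] is the probability of state i,
# cond[i] the symbol distribution p_i(j) while in state i.
P = [0.6, 0.4]
cond = [[0.5, 0.5],   # state 0: fair binary choice (H_0 = 1 bit)
        [0.9, 0.1]]   # state 1: strongly biased (H_1 < 1 bit)

# H = sum_i P_i * H_i : the weighted average of per-state entropies
H = sum(Pi * entropy(pij) for Pi, pij in zip(P, cond))
print(H)
```

Each state contributes its own entropy H_i, weighted by how often the source occupies that state.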


According to Floridi (2005), the entropy H might designate three equivalent quantities in the ideal case of a noiseless channel:
1) “the average amount of information per symbol produced by the informer”; 2) the “average amount of data deficit (Shannon's uncertainty) that the informee has before inspection of the output of the informer”; 3) “informational potentiality”.

Since the first two interpretations assume that a defined uncertainty corresponds to each symbol (whether in emission or reception), they imply a certain tacit agreement regarding the →alphabet or the informational game in which the agents are immersed. In both cases, the information can be quantified under the condition that the probability distribution can be specified.

Concerning the third interpretation, entropy might be understood as a physical magnitude related to the amount of disorder in processes or systems conveying energy or information. The larger the entropy, the higher the number of physical states in which the system can be found and, consequently, the more information it can refer to; in other words, specifying the state of a given system requires more information as its entropy increases. Numerically, this is equivalent to the amount of information or data that has to be given in order to specify the state.
 
References
Entries
New entry. To make a new entry: (1) the user must be identified as an authorized user (to this end, the "sign in" link at the page bottom left can be followed). (2) After being identified, press the "edit page" button at the upper right corner. (3) In edit mode, substitute -under this blue paragraph- "name" by the authors' names, "date" by the date on which the text is entered, and the following line by the proposed text. At the bottom of the entry, the references used in the proposed text must be given in the normalized format. (4) To finish, press the "save" button at the upper right corner.
The entry will be reviewed by the editor and -at least- one other peer, and subsequently integrated into the article if selected.

Name (date)
 
[Entry text]



Incorporated entries

Whenever an entry is integrated in the article (left column) the corresponding entry is reflected in this section.

Díaz-Nafría, J.M. (25/8/2016)
 
The entropy or amount of information of a discrete information source, characterised by the probability p_j of sending each of its symbols j, is the statistical average:

H = -\sum_{j=1}^{N} p_j \log_2 p_j

being bounded within the limits:

0 \le H \le \log_2 N

where N is the number of symbols.

In case the source may adopt various states i, with P_i the state probability and p_i(j) the probability of sending symbol j when the source is in state i, the entropy is defined as the average of the entropies H_i of each state:

H = \sum_i P_i H_i = -\sum_i P_i \sum_j p_i(j) \log_2 p_i(j)


According to Floridi (2005), the entropy H might designate three equivalent quantities in the ideal case of a noiseless channel:
1) “the average amount of information per symbol produced by the informer”; 2) the “average amount of data deficit (Shannon's uncertainty) that the informee has before inspection of the output of the informer”; 3) “informational potentiality”.

Since the first two interpretations assume that a defined uncertainty corresponds to each symbol (whether in emission or reception), they imply a certain tacit agreement regarding the →alphabet or the informational game in which the agents are immersed. In both cases, the information can be quantified under the condition that the probability distribution can be specified.

Concerning the third interpretation, entropy might be understood as a physical magnitude related to the amount of disorder in processes or systems conveying energy or information. The larger the entropy, the higher the number of physical states in which the system can be found and, consequently, the more information it can refer to; in other words, specifying the state of a given system requires more information as its entropy increases. Numerically, this is equivalent to the amount of information or data that has to be given in order to specify the state.

References
  
Comments