Information is a very abstract concept. People often say there is a lot of information or only a little, but it is hard to say exactly how much there is. For example, how much information does a Chinese book of 500,000 words contain?
It was not until 1948 that Shannon proposed the concept of "information entropy", which solved the problem of quantitatively measuring information. C. E. Shannon borrowed the term from thermodynamics, where thermal entropy is a physical quantity expressing the degree of disorder of a molecular state. Shannon used information entropy to describe the uncertainty of an information source.
Claude Elwood Shannon, the father of information theory, was the first to use mathematical language to clarify the relationship between probability and information redundancy.
In his 1948 paper "A Mathematical Theory of Communication", Shannon pointed out that every message contains redundancy, and that the amount of redundancy depends on the probability of occurrence, or the uncertainty, of each symbol (digit, letter, or word) in the message.
Drawing on the thermodynamic concept, Shannon called the average amount of information that remains after redundancy is excluded "information entropy", and gave a mathematical expression for calculating it.
The meaning of information
Modern definition
Information is the marker of matter, energy, information, and their attributes. [Inverse Wiener definition of information]
Information is an increase in certainty. [Inverse Shannon definition of information]
Information is the set of things and their attribute identifiers. [2002]
Initial definition
Claude E. Shannon, one of the founders of information theory, defined information (entropy) in terms of the probability of occurrence of discrete random events.
Information entropy itself is a rather abstract mathematical concept; here we can loosely understand it as the probability that a certain kind of information occurs. Information entropy and thermodynamic entropy are closely related. According to Charles H. Bennett's reinterpretation of Maxwell's demon, destroying information is an irreversible process, so the destruction of information is consistent with the second law of thermodynamics, while generating information introduces negative (thermodynamic) entropy into the system. Therefore, the sign of information entropy should be opposite to that of thermodynamic entropy.
Generally speaking, when a piece of information has a higher probability of appearing, it has been spread more widely, or cited to a greater extent. From the perspective of information dissemination, we can therefore regard information entropy as representing the value of information. This gives us a standard for measuring that value and lets us reason further about questions of knowledge circulation.
Calculation formula
H(X) = E[I(x_i)] = E\left[\log_2 \frac{1}{P(x_i)}\right] = -\sum_{i=1}^{n} P(x_i)\log_2 P(x_i)
where X denotes a random variable whose set of possible outputs is the symbol set {x_i}, and P(x_i) is the probability of the output x_i. The greater the uncertainty of the variable, the greater its entropy, and the greater the amount of information required to determine its value.
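As a concrete illustration, here is a minimal sketch in Python that evaluates the formula above for a discrete distribution (the function name and the example distributions are illustrative and not from the original article). A fair coin gives exactly 1 bit of entropy, while a heavily biased coin gives far less, matching the intuition that greater uncertainty means greater entropy.

```python
import math

def shannon_entropy(probabilities):
    """Compute H(X) = -sum(P(x_i) * log2(P(x_i))) over a discrete distribution."""
    # Outcomes with zero probability contribute nothing to the sum.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin: maximum uncertainty for two outcomes, H = 1 bit.
print(shannon_entropy([0.5, 0.5]))       # 1.0

# A heavily biased coin: much less uncertainty, H ≈ 0.081 bits.
print(shannon_entropy([0.99, 0.01]))     # ~0.0808

# A fair six-sided die: H = log2(6) ≈ 2.585 bits.
print(shannon_entropy([1/6] * 6))        # ~2.585
```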