Entropy was first introduced as a function in classical thermodynamics, where it provides a quantitative basis for studying spontaneous processes that proceed in a preferred direction. In statistical mechanics, entropy measures the number of microscopic states a system can occupy. In modern information theory, entropy is a measure of information. The measure of information should depend on the randomness of the events a message represents, that is, on the probability of each event. Suppose n messages correspond one-to-one to n possible events, whose probabilities of occurrence are P_1, P_2, …, P_n. It can then be shown that the average amount of information carried per message is:

H = -(P_1 log P_1 + P_2 log P_2 + … + P_n log P_n) = -Σ P_i log P_i
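The average information per message described above (the Shannon entropy) can be sketched in a few lines of Python. This example is not part of the original text; the function name `entropy` and the choice of base-2 logarithm (so the result is in bits) are illustrative assumptions:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log p) of a discrete distribution.

    Terms with p == 0 are skipped, since p * log p -> 0 as p -> 0.
    With base=2 the result is measured in bits per message.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin (two equally likely events) carries 1 bit per message;
# a biased coin carries less, and a certain event carries none.
print(entropy([0.5, 0.5]))   # → 1.0
print(entropy([0.9, 0.1]))
print(entropy([1.0]))        # → 0.0
```

Note that entropy is maximized when all n events are equally likely (P_i = 1/n), which matches the intuition that a message is most informative when the outcome is least predictable.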