Abstract: In this paper, we develop the dynamic statistical information theory established by the author. Starting from the idea that the state variable evolution equations of stochastic dynamic systems, classical and quantum nonequilibrium statistical physical systems, and special electromagnetic field systems can be regarded as their information symbol evolution equations, together with the definitions of dynamic information and dynamic entropy, we derive the evolution equations of dynamic information and dynamic entropy that describe their evolution laws. These four kinds of evolution equations are of the same mathematical type. They show in unison that, when information is transmitted in coordinate space outside the systems, the time rate of change of the dynamic information densities originates from their drift, diffusion and dissipation in state variable space inside the systems and in coordinate space during the transmission processes, and that the time rate of change of the dynamic entropy densities is caused by their drift, diffusion and production in state variable space inside the systems and in coordinate space during the transmission processes. When space noise can be neglected, an information wave appears. If we consider only the information change inside the systems, the dynamic information evolution equations reduce to information equations corresponding to the dynamic equations that describe the evolution laws of the above dynamic systems. This reveals that the evolution laws of the respective dynamic systems can be described by information equations in a unified fashion; hence, the evolution processes of these dynamic systems can be abstracted as evolution processes of information. Furthermore, we present formulas for the information flow, the information dissipation rate, and the entropy production rate. We prove that information production can emerge in a dynamic system with internal attractive interactions between its elements, and derive a formula for this information production rate. Thereby, we obtain an expression for the …
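The drift/diffusion/dissipation decomposition for information density and the drift/diffusion/production decomposition for entropy density described above can be sketched as a pair of transport-type equations. The symbols below (information density i, entropy density s, drift velocity v, diffusion coefficient D, local dissipation rate R_i, local production rate σ_s) are illustrative choices made here, not the paper's own notation:

```latex
% Schematic transport form of the two density equations (illustrative symbols,
% not the paper's notation): i = dynamic information density, s = dynamic
% entropy density, v = drift velocity, D = diffusion coefficient,
% R_i = local information dissipation rate, sigma_s = local entropy production rate.
\frac{\partial i}{\partial t}
  = -\nabla \cdot (\mathbf{v}\, i) + \nabla \cdot (D\, \nabla i) - R_i ,
\qquad
\frac{\partial s}{\partial t}
  = -\nabla \cdot (\mathbf{v}\, s) + \nabla \cdot (D\, \nabla s) + \sigma_s .
```

In this schematic reading, the first term on each right-hand side is the drift contribution, the second the diffusion contribution, and the last the dissipation (for information) or production (for entropy) inside the system.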
Abstract: In recent years we extended Shannon static statistical information theory to dynamic processes and established a Shannon dynamic statistical information theory, whose core is the evolution law of dynamic entropy and dynamic information. We also proposed a corresponding Boltzmann dynamic statistical information theory. Based on the fact that the state variable evolution equations of the respective dynamic systems, i.e. the Fokker-Planck equation and the Liouville diffusion equation, can be regarded as their information symbol evolution equations, we derived the nonlinear evolution equations of Shannon dynamic entropy density and dynamic information density, and the nonlinear evolution equations of Boltzmann dynamic entropy density and dynamic information density, which describe, respectively, the evolution laws of dynamic entropy and dynamic information. The evolution equations of these two kinds of dynamic entropy and dynamic information show in unison that the time rate of change of the dynamic entropy densities is caused by their drift, diffusion and production in state variable space inside the systems and in coordinate space during the transmission processes, and that the time rate of change of the dynamic information densities originates from their drift, diffusion and dissipation in state variable space inside the systems and in coordinate space during the transmission processes. Entropy and information are thereby combined with the state of the systems and its law of motion. Furthermore, we presented formulas for the two kinds of entropy production rate and information dissipation rate, and expressions for the two kinds of drift information flow and diffusion information flow. We proved that the two kinds of information dissipation rate (i.e., the decrease rates of the total information) are equal to the corresponding entropy production rates (i.e., the increase rates of the total entropy) in the same dynamic system. We obtained formulas for the two kinds of dynamic mutual information and dynamic channel capacity reflecting the dynam…
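As a purely illustrative numerical check of the stated relation between information decrease and entropy increase, the sketch below evolves a one-dimensional Fokker-Planck (Ornstein-Uhlenbeck) density and tracks its Shannon entropy over time. Identifying the dynamic information with a fixed reference entropy minus S(t) is an assumption made here for illustration only; it is not the paper's definition, and the parameters k, D and the grid are arbitrary choices.

```python
import numpy as np

# Illustrative 1-D Fokker-Planck solver (explicit finite differences).
# Drift toward the origin with constant diffusion: an Ornstein-Uhlenbeck process.
# The identification I(t) = S_ref - S(t) below is an illustrative assumption,
# not the paper's definition of dynamic information.

L, N = 10.0, 400                 # domain half-width and number of grid points
x = np.linspace(-L, L, N)
dx = x[1] - x[0]
dt = 1e-4
k, D = 1.0, 0.5                  # drift strength and diffusion coefficient

# Initial density: a narrow Gaussian displaced from the attractor at x = 0.
p = np.exp(-(x - 3.0) ** 2 / 0.5)
p /= np.trapz(p, x)

def shannon_entropy(p, x):
    """Differential Shannon entropy S = -integral p ln p dx (natural log)."""
    q = np.clip(p, 1e-300, None)
    return -np.trapz(q * np.log(q), x)

S_start = shannon_entropy(p, x)
for step in range(20000):
    drift_flux = -k * x * p                     # drift part of the current
    flux = drift_flux - D * np.gradient(p, dx)  # total flux J = v p - D dp/dx
    p = p - dt * np.gradient(flux, dx)          # dp/dt = -dJ/dx
    p = np.clip(p, 0.0, None)
    p /= np.trapz(p, x)                         # renormalize discretization error

S_end = shannon_entropy(p, x)
# With I(t) = S_ref - S(t) for a fixed reference S_ref, the net decrease of
# information equals the net increase of entropy over the same interval.
print(f"entropy change     dS = {S_end - S_start:+.4f}")
print(f"information change dI = {-(S_end - S_start):+.4f}  (equal and opposite)")
```

Under these illustrative assumptions the entropy grows toward its equilibrium value and the information quantity decreases by the same amount, which is the qualitative behaviour the abstract attributes to a single dynamic system with no entropy exchange with its surroundings.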