Two equations for the calculation of the surface tension of pure liquids are derived from statistical thermodynamics by expressing the intermolecular potential as the Lennard-Jones pair potential. The method is simple, accurate, and easy to use. Calculation results for 22 pure liquids show that these equations are superior to the empirical formula and to other theoretical equations; the average relative deviations for the two equations are within 5% and 1.5%, respectively.
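The abstract does not reproduce the derived equations, but the Lennard-Jones pair potential on which they are built is standard. A minimal sketch (the function name and reduced units are illustrative, not from the paper):

```python
def lj_potential(r, epsilon, sigma):
    """Lennard-Jones pair potential U(r) = 4*eps*[(sigma/r)**12 - (sigma/r)**6].

    epsilon: well depth; sigma: distance at which U = 0.
    """
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

# U(sigma) = 0, and the minimum U = -epsilon sits at r = 2**(1/6) * sigma.
```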
Does non-transitivity in information theory have an analog in thermodynamics? A non-transitive game, “Swap”, is used as a toy thermodynamic model to explore concepts such as temperature, heat flow, equilibrium, and entropy. These concepts, found to be inadequate for non-transitive thermodynamics, need to be generalized. Two kinds of temperature, statistical and kinetic, are distinguished. Statistical temperature is a parameter in statistical distributions. Kinetic temperature is proportional to the expected kinetic energy based on its distribution. Identical for Maxwell-Boltzmann statistics, these temperatures differ in non-Maxwellian statistics when a force is present. Fourier’s law of conduction and entropy should be expressed using statistical temperature, not kinetic temperature. Kinetic temperature is always scalar, but statistical temperature and statistical entropy in non-transitive systems have circulation, thereby allowing continuous and circular heat flow. Entropy is relative to the underlying statistics, in analogy to the Kullback-Leibler divergence in information theory. The H-theorem, limited by assumptions of homogeneity and indistinguishability, covers only statistically homogeneous systems. The theorem does not cover non-transitive, statistically heterogeneous systems combining different distributions such as Maxwell-Boltzmann, biased half-Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein. The second law can be preserved if it is generalized by expressing it in terms of statistical temperature and statistical entropy.
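The distinction between the two temperatures can be illustrated numerically: for Maxwell-Boltzmann velocities the kinetic temperature estimated from equipartition matches the statistical temperature (the Gaussian parameter), while a drift imposed on the distribution (as by a force) raises the kinetic temperature above it. This sketch uses reduced units with m = kB = 1; all names are illustrative, not from the paper:

```python
import random

def kinetic_temperature(velocities, m=1.0, kB=1.0):
    # Equipartition in 3D: <KE> = (3/2) * kB * T_kin per particle,
    # so T_kin = m * <v^2> / (3 * kB).
    mean_v2 = sum(vx * vx + vy * vy + vz * vz for vx, vy, vz in velocities)
    mean_v2 /= len(velocities)
    return m * mean_v2 / (3.0 * kB)

random.seed(0)
T_stat = 2.0                 # statistical temperature: the distribution parameter
sigma = T_stat ** 0.5        # per-component std dev, sqrt(kB*T/m) with m = kB = 1
maxwellian = [(random.gauss(0, sigma), random.gauss(0, sigma), random.gauss(0, sigma))
              for _ in range(100_000)]
# Same ensemble with a bulk drift along x, standing in for a force bias:
drifted = [(vx + 1.0, vy, vz) for vx, vy, vz in maxwellian]

# kinetic_temperature(maxwellian) is close to T_stat = 2.0;
# kinetic_temperature(drifted) exceeds it (analytically by drift^2 / 3).
```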