Summary of statistical physics lecture notes:

Critical phenomena & power laws, phase transitions, SOC, HOT/COLD theory.

Entropy
Microstate r: a specific configuration of the system.
Macrostate R: an aggregate (observable) value of the system.
Ω(R): number of microstates compatible with a macrostate.
Microcanonical entropy: S(R) = k log Ω(R).
More general form: a macrostate R is described by {p_i}, the probability of finding the system in microstate i, i.e. a distribution over microstates.
Gibbs entropy: S(R) = -k ∑_i p_i log p_i.
Maximum entropy ⇒ the most probable distribution of microstates.
Without any constraint on p_i (other than normalization), S is maximized by p_i = 1/N.
Ω({n_i}) = M!/(n_1! n_2! … n_N!), with p_i = n_i/M.

With constraints on p_i: the partition function Z
Observable quantity E (the Hamiltonian).
Ergodic hypothesis: time average = ensemble average.
We know from experiment E', and E_i for all microstates r_i, with E' = ⟨E⟩ = ∑_i p_i E_i and ∑_i p_i = 1.
We want the most probable distribution of microstates.
Maximizing S = -k ∑_i p_i log p_i under these constraints gives p_i = e^(-βE_i)/Z, with Z = ∑_i e^(-βE_i) and β = (kT)^(-1).
So {p_i} and β are determined by {E_i} and E.
Knowing β (or T) and {E_i}, we can determine the most probable distribution {p_i} and Z: β ↔ T ↔ E ↔ Z.
At lower T the distribution is less symmetrical (the weight concentrates on low-energy microstates).

Toy example
Three microstates: E_1 = 0, E_2 = 2, E_3 = 3.
The constraints are p_1 E_1 + p_2 E_2 + p_3 E_3 = E, i.e. 2p_2 + 3p_3 = E, together with p_1 + p_2 + p_3 = 1.
Three temperatures are compared, in decreasing order of T.
(Table of E, β, Z, p_1, p_2, p_3 for the three temperatures.)

Important concepts
Partition function: Z(T, E) = ∑_r e^(-E(r)/kT). Knowing this, we can do a lot of things: the variance of E, and so on.
Free energy: …
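The maximum-entropy step above is only quoted, not derived; a short sketch of the standard Lagrange-multiplier argument (written in LaTeX for reference, not taken from the original notes) recovers the Boltzmann form:

```latex
% Maximize S = -k \sum_i p_i \ln p_i subject to \sum_i p_i = 1 and \sum_i p_i E_i = E.
% Introduce Lagrange multipliers \alpha and \beta for the two constraints:
\begin{align*}
  \mathcal{L} &= -k \sum_i p_i \ln p_i
                 - \alpha \Big( \sum_i p_i - 1 \Big)
                 - k\beta \Big( \sum_i p_i E_i - E \Big), \\
  \frac{\partial \mathcal{L}}{\partial p_i}
              &= -k(\ln p_i + 1) - \alpha - k\beta E_i = 0
  \;\Longrightarrow\;
  p_i = \frac{e^{-\beta E_i}}{Z},
  \qquad Z = \sum_i e^{-\beta E_i}.
\end{align*}
% \beta is identified with 1/(kT) by comparison with thermodynamics.
```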
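For the toy example, a minimal Python sketch (assuming k = 1 and three illustrative temperatures chosen here, not the numerical values in the original table) shows how Z, the p_i, ⟨E⟩, and Var(E) follow once the energies are known:

```python
import math

def boltzmann(energies, T, k=1.0):
    """Return (Z, probabilities) for the canonical distribution at temperature T."""
    beta = 1.0 / (k * T)
    weights = [math.exp(-beta * E) for E in energies]
    Z = sum(weights)
    return Z, [w / Z for w in weights]

# Toy example from the notes: three microstates with E1=0, E2=2, E3=3 (k = 1).
energies = [0.0, 2.0, 3.0]
for T in (10.0, 2.0, 0.5):  # illustrative temperatures, in decreasing order
    Z, p = boltzmann(energies, T)
    mean_E = sum(pi * Ei for pi, Ei in zip(p, energies))
    var_E = sum(pi * Ei**2 for pi, Ei in zip(p, energies)) - mean_E**2
    print(f"T={T:5}  Z={Z:.3f}  p={[round(x, 3) for x in p]}  "
          f"<E>={mean_E:.3f}  Var(E)={var_E:.3f}")
```

As T decreases, Z approaches 1 and the weight concentrates on the ground state E_1 = 0, which is the "less symmetrical" distribution mentioned above.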
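The claim that knowing Z "lets us do a lot of things" can be made concrete with the standard canonical-ensemble identities (textbook results quoted here for reference, not derived in the notes):

```latex
\begin{align*}
  \langle E \rangle &= -\frac{\partial \ln Z}{\partial \beta}, \\
  \operatorname{Var}(E) &= \langle E^2 \rangle - \langle E \rangle^2
                         = \frac{\partial^2 \ln Z}{\partial \beta^2}
                         = k T^2 C_V, \\
  F &= -kT \ln Z .
\end{align*}
```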