In classical statistical mechanics, the H-theorem, introduced by Ludwig Boltzmann in 1872, describes the tendency of the quantity H (defined below) to decrease in a nearly-ideal gas of molecules. As this quantity H was meant to represent the entropy of thermodynamics, the H-theorem was an early demonstration of the power of statistical mechanics, since it claimed to derive the second law of thermodynamics (a statement about fundamentally irreversible processes) from reversible microscopic mechanics. It is thought to prove the second law of thermodynamics, albeit under the assumption of low-entropy initial conditions.
The H-theorem is a natural consequence of the kinetic equation derived by Boltzmann that has come to be known as Boltzmann's equation. The H-theorem has led to considerable discussion about its actual implications. Boltzmann in his original publication writes the symbol E (as in entropy) for its statistical function. Years later, Samuel Hawksley Burbury, one of
{\displaystyle {\begin{aligned}N&=\int \limits _{\mathrm {momenta} }d^{3}\mathbf {p} \int \limits _{\mathrm {positions} }d^{3}\mathbf {r} \,f(\mathbf {r} ,\mathbf {p} ,t)\\[5pt]&=\iiint \limits _{\mathrm {momenta} }\quad \iiint \limits _{\mathrm {positions} }f(x,y,z,p_{x},p_{y},p_{z},t)\,dx\,dy\,dz\,dp_{x}\,dp_{y}\,dp_{z}\end{aligned}}} which
a basis for many other specific fields of scientific understanding and engineering application. The hypothesis, foundational to most introductory textbooks treating quantum statistical mechanics, assumes that systems go to thermal equilibrium (thermalisation). The process of thermalisation erases local memory of the initial conditions. The eigenstate thermalisation hypothesis
a central role in a recent controversy called the black hole information paradox. Richard C. Tolman's 1938 book The Principles of Statistical Mechanics dedicates a whole chapter to the study of Boltzmann's H-theorem, and its extension in the generalized classical statistical mechanics of Gibbs. A further chapter is devoted to the quantum mechanical version of the H-theorem. We let q_i and p_i be our generalized canonical coordinates for
a fluid consisting of only one kind of particle, the number density n is given by {\displaystyle n=\int f\,d^{3}\mathbf {p} .} The average value of any function A is {\displaystyle \langle A\rangle ={\frac {1}{n}}\int Af\,d^{3}\mathbf {p} .} Since
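These two moments can be checked numerically. The sketch below is illustrative only and not from the source: it evaluates n = ∫ f d³p and ⟨E⟩ = (1/n) ∫ E f d³p on a momentum grid for an assumed equilibrium Maxwell–Boltzmann distribution, with m = kT = 1 chosen purely for convenience.

```python
import numpy as np

# Sketch: evaluate n = ∫ f d^3p and <A> = (1/n) ∫ A f d^3p on a momentum grid
# for a Maxwell-Boltzmann distribution. Units with m = kT = 1 and the density
# n_true = 2.5 are assumptions made for this illustration.
m, kT, n_true = 1.0, 1.0, 2.5

p = np.linspace(-8, 8, 101)                      # one momentum axis
px, py, pz = np.meshgrid(p, p, p, indexing="ij")
dp3 = (p[1] - p[0]) ** 3                         # volume element d^3p

# Equilibrium distribution f(p) = n (2 pi m kT)^(-3/2) exp(-|p|^2 / 2 m kT)
f = n_true * (2 * np.pi * m * kT) ** -1.5 * np.exp(-(px**2 + py**2 + pz**2) / (2 * m * kT))

n = f.sum() * dp3                                # number density, ∫ f d^3p
E = (px**2 + py**2 + pz**2) / (2 * m)            # kinetic energy per grid cell
avg_E = (E * f).sum() * dp3 / n                  # <E> = (1/n) ∫ E f d^3p

print(round(n, 4))       # 2.5
print(round(avg_E, 4))   # 1.5, i.e. (3/2) kT, the equipartition value
```

The recovered ⟨E⟩ = (3/2)kT is the standard equipartition result, which makes the moment machinery easy to sanity-check.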
a force F instantly acts on each particle, then at time t + Δt their position will be {\displaystyle \mathbf {r} +\Delta \mathbf {r} =\mathbf {r} +{\frac {\mathbf {p} }{m}}\,\Delta t} and momentum p + Δp = p + F Δt. Then, in the absence of collisions, f must satisfy {\displaystyle f\left(\mathbf {r} +{\frac {\mathbf {p} }{m}}\,\Delta t,\mathbf {p} +\mathbf {F} \,\Delta t,t+\Delta t\right)\,d^{3}\mathbf {r} \,d^{3}\mathbf {p} =f(\mathbf {r} ,\mathbf {p} ,t)\,d^{3}\mathbf {r} \,d^{3}\mathbf {p} } Note that we have used
a set of {\displaystyle r} particles. Then we consider a function {\displaystyle f} that returns the probability density of particles over the states in phase space. Note how this can be multiplied by a small region in phase space, denoted by {\displaystyle \delta q_{1}...\delta p_{r}}, to yield
a subtle sense, and therefore begs the question. Once the particles are allowed to collide, their velocity directions and positions in fact do become correlated (however, these correlations are encoded in an extremely complex manner). This shows that an (ongoing) assumption of independence is not consistent with the underlying particle model. Boltzmann's reply to Loschmidt was to concede the possibility of these states, but he noted that such states were so rare and unusual as to be impossible in practice. Boltzmann would go on to sharpen this notion of
a time-symmetric formalism. If H decreases over time in one state, then there must be a matching reversed state where H increases over time (Loschmidt's paradox). The explanation is that Boltzmann's equation is based on the assumption of "molecular chaos", i.e., that it follows from, or at least is consistent with, the underlying kinetic model that the particles be considered independent and uncorrelated. It turns out that this assumption breaks time reversal symmetry in
a very small region of momentum space {\displaystyle d^{3}\mathbf {p} }), at an instant of time. The Boltzmann equation can be used to determine how physical quantities change, such as heat energy and momentum, when a fluid is in transport. One may also derive other properties characteristic to fluids such as viscosity, thermal conductivity, and electrical conductivity (by treating
is a 6-fold integral. While f is associated with a number of particles, the phase space is for one particle (not all of them, which is usually the case with deterministic many-body systems), since only one r and p is in question. It is not part of the analysis to use r_1, p_1 for particle 1, r_2, p_2 for particle 2, etc. up to r_N, p_N for particle N. It
is a hypothesis about when quantum states will undergo thermalisation and why. Not all quantum states undergo thermalisation: some states have been discovered which do not (see below), and their reasons for not reaching thermal equilibrium are unclear as of March 2019. The process of equilibration can be described using the H-theorem or the relaxation theorem; see also entropy production. Broadly speaking, classical systems with non-chaotic behavior will not thermalise. Systems with many interacting constituents are generally expected to be chaotic, but this assumption sometimes fails. A notable counterexample
is a shorthand for the momentum analogue of ∇, and ê_x, ê_y, ê_z are Cartesian unit vectors. Dividing (3) by dt and substituting into (2) gives: {\displaystyle {\frac {\partial f}{\partial t}}+{\frac {\mathbf {p} }{m}}\cdot \nabla f+\mathbf {F} \cdot {\frac {\partial f}{\partial \mathbf {p} }}=\left({\frac {\partial f}{\partial t}}\right)_{\mathrm {coll} }} In this context, F(r, t)
is assumed the particles in the system are identical (so each has an identical mass m). For a mixture of more than one chemical species, one distribution is needed for each; see below. The general equation can then be written as {\displaystyle {\frac {df}{dt}}=\left({\frac {\partial f}{\partial t}}\right)_{\text{force}}+\left({\frac {\partial f}{\partial t}}\right)_{\text{diff}}+\left({\frac {\partial f}{\partial t}}\right)_{\text{coll}},} where
is called the phase space of the system; in other words, a set of three coordinates for each position coordinate x, y, z, and three more for each momentum component p_x, p_y, p_z. The entire space is 6-dimensional: a point in this space is (r, p) = (x, y, z, p_x, p_y, p_z), and each coordinate is parameterized by time t. The small volume ("differential volume element")
is in question, at the heart of the equation is a quantity f which gives this probability per unit phase-space volume, or probability per unit length cubed per unit momentum cubed, at an instant of time t. This is a probability density function: f(r, p, t), defined so that {\displaystyle dN=f(\mathbf {r} ,\mathbf {p} ,t)\,d^{3}\mathbf {r} \,d^{3}\mathbf {p} }
is not conserved, then like any other such variable (pressure, etc.) it will show thermal fluctuations. This means that H regularly shows spontaneous increases from the minimum value. Technically this is not an exception to the H-theorem, since the H-theorem was only intended to apply for a gas with a very large number of particles. These fluctuations are only perceptible when the system is small and
is often used in a more general sense, referring to any kinetic equation that describes the change of a macroscopic quantity in a thermodynamic system, such as energy, charge or particle number. The equation arises not by analyzing the individual positions and momenta of each particle in the fluid, but rather by considering a probability distribution for the position and momentum of a typical particle, that is,
is the Fermi–Pasta–Ulam–Tsingou problem, which displays unexpected recurrence and will only thermalise over very long time scales. Non-chaotic systems which are perturbed by weak non-linearities will not thermalise for a set of initial conditions with non-zero volume in phase space, as stated by the KAM theorem, although the size of this set decreases exponentially with the number of degrees of freedom. Many-body integrable systems, which have an extensive number of conserved quantities, will not thermalise in
is the force field acting on the particles in the fluid, and m is the mass of the particles. The term on the right-hand side is added to describe the effect of collisions between particles; if it is zero, then the particles do not collide. The collisionless Boltzmann equation, where individual collisions are replaced with long-range aggregated interactions, e.g. Coulomb interactions, is often called
is the gradient operator, · is the dot product, and {\displaystyle {\frac {\partial f}{\partial \mathbf {p} }}=\mathbf {\hat {e}} _{x}{\frac {\partial f}{\partial p_{x}}}+\mathbf {\hat {e}} _{y}{\frac {\partial f}{\partial p_{y}}}+\mathbf {\hat {e}} _{z}{\frac {\partial f}{\partial p_{z}}}=\nabla _{\mathbf {p} }f}
is the total change in f. Dividing (1) by {\displaystyle d^{3}\mathbf {r} \,d^{3}\mathbf {p} \,\Delta t} and taking the limits Δt → 0 and Δf → 0, we have {\displaystyle {\frac {df}{dt}}=\left({\frac {\partial f}{\partial t}}\right)_{\mathrm {coll} }} The total differential of f is: {\displaystyle {\begin{aligned}df&={\frac {\partial f}{\partial t}}\,dt+\left({\frac {\partial f}{\partial x}}\,dx+{\frac {\partial f}{\partial y}}\,dy+{\frac {\partial f}{\partial z}}\,dz\right)+\left({\frac {\partial f}{\partial p_{x}}}\,dp_{x}+{\frac {\partial f}{\partial p_{y}}}\,dp_{y}+{\frac {\partial f}{\partial p_{z}}}\,dp_{z}\right)\\[5pt]&={\frac {\partial f}{\partial t}}dt+\nabla f\cdot d\mathbf {r} +{\frac {\partial f}{\partial \mathbf {p} }}\cdot d\mathbf {p} \\[5pt]&={\frac {\partial f}{\partial t}}dt+\nabla f\cdot {\frac {\mathbf {p} }{m}}dt+{\frac {\partial f}{\partial \mathbf {p} }}\cdot \mathbf {F} \,dt\end{aligned}}} where ∇
is the differential cross-section, as before, between particles i and j. The integration is over the momentum components in the integrand (which are labelled i and j). The sum of integrals describes the entry and exit of particles of species i into or out of the phase-space element. The Boltzmann equation can be used to derive the fluid dynamic conservation laws for mass, charge, momentum, and energy. For
is the energy distribution function of molecules at time t. The value f(E, t) dE is the number of molecules that have kinetic energy between E and E + dE. H itself is defined as For an isolated ideal gas (with fixed total energy and fixed total number of particles), the function H is at a minimum when the particles have a Maxwell–Boltzmann distribution; if the molecules of
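The minimizing property of the Maxwell–Boltzmann distribution can be illustrated with a small numerical check. The sketch below is not from the source: it works in one velocity dimension and compares two distributions with the same normalization and the same mean-square velocity (i.e. the same kinetic energy), showing that the Maxwellian (Gaussian) gives the lower value of ∫ P ln P.

```python
import numpy as np

# Sketch: among velocity distributions with equal normalization and equal
# mean kinetic energy, the Maxwellian minimizes H = ∫ P ln P dv.
# Illustrated in 1-D with unit variance (an arbitrary choice for the example).
v = np.linspace(-10, 10, 4001)
dv = v[1] - v[0]

maxwellian = np.exp(-v**2 / 2) / np.sqrt(2 * np.pi)                      # variance 1
uniform = np.where(np.abs(v) <= np.sqrt(3), 1 / (2 * np.sqrt(3)), 0.0)   # variance 1

def H(P):
    """H = ∫ P ln P dv, with the convention 0 ln 0 = 0."""
    safe = np.where(P > 0, P, 1.0)              # avoid log(0); those cells give 0
    return (np.where(P > 0, P * np.log(safe), 0.0)).sum() * dv

# Analytic values: H(maxwellian) = -0.5 ln(2 pi e) ≈ -1.419,
#                  H(uniform)    = -ln(2 sqrt(3)) ≈ -1.242
print(H(maxwellian) < H(uniform))   # True: the Maxwellian has the lower H
```

This is the maximum-entropy property of the Gaussian at fixed variance, restated in terms of H (which carries the opposite sign).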
is the magnitude of the relative momenta (see relative velocity for more on this concept), and I(g, Ω) is the differential cross section of the collision, in which the relative momenta of the colliding particles turn through an angle θ into the element of the solid angle dΩ, due to the collision. Since much of the challenge in solving the Boltzmann equation originates with the complex collision term, attempts have been made to "model" and simplify
is the mass density, and {\displaystyle V_{i}=\langle v_{i}\rangle } is the average fluid velocity. Letting {\displaystyle A=m(v_{i})^{1}=p_{i}}, the momentum of the particle, the integrated Boltzmann equation becomes
is the molecular collision frequency, and {\displaystyle f_{0}} is the local Maxwellian distribution function given the gas temperature at this point in space. This is also called the "relaxation time approximation". For a mixture of chemical species labelled by indices i = 1, 2, 3, ..., n, the equation for species i is {\displaystyle {\frac {\partial f_{i}}{\partial t}}+{\frac {\mathbf {p} _{i}}{m_{i}}}\cdot \nabla f_{i}+\mathbf {F} \cdot {\frac {\partial f_{i}}{\partial \mathbf {p} _{i}}}=\left({\frac {\partial f_{i}}{\partial t}}\right)_{\text{coll}},} where f_i = f_i(r, p_i, t), and
is the number of molecules which all have positions lying within a volume element {\displaystyle d^{3}\mathbf {r} } about r and momenta lying within a momentum space element {\displaystyle d^{3}\mathbf {p} } about p, at time t. Integrating over a region of position space and momentum space gives
is the particle velocity vector. Define {\displaystyle A(p_{i})} as some function of momentum {\displaystyle p_{i}} only, whose total value is conserved in a collision. Assume also that the force {\displaystyle F_{i}} is a function of position only, and that f
is the pressure tensor (the viscous stress tensor plus the hydrostatic pressure). Letting {\displaystyle A={\frac {m(v_{i})^{2}}{2}}={\frac {p_{i}p_{i}}{2m}}}, the kinetic energy of the particle, the integrated Boltzmann equation becomes
is the probability distribution. Using the Boltzmann equation, one can prove that H can only decrease. For a system of N statistically independent particles, H is related to the thermodynamic entropy S through: So, according to the H-theorem, S can only increase. In quantum statistical mechanics (which is the quantum version of classical statistical mechanics), the H-function is
is therefore modified to the BGK form: {\displaystyle {\frac {\partial f}{\partial t}}+{\frac {\mathbf {p} }{m}}\cdot \nabla f+\mathbf {F} \cdot {\frac {\partial f}{\partial \mathbf {p} }}=\nu (f_{0}-f),} where {\displaystyle \nu }
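The BGK form is simple enough to integrate directly. The sketch below is illustrative only (the velocity grid, the value of ν, and the top-hat initial state are arbitrary choices, not from the source): for a spatially uniform, force-free gas the equation reduces to df/dt = ν(f₀ − f), so any initial distribution relaxes exponentially onto the local Maxwellian f₀.

```python
import numpy as np

# Sketch of the BGK relaxation-time approximation for a spatially uniform,
# force-free gas: df/dt = nu * (f0 - f), so the deviation from the local
# Maxwellian f0 decays like exp(-nu * t). All parameters are illustrative.
v = np.linspace(-5, 5, 201)

f0 = np.exp(-v**2 / 2) / np.sqrt(2 * np.pi)     # local Maxwellian (kT/m = 1)
f = np.where(np.abs(v) < 1.0, 0.5, 0.0)         # non-equilibrium "top hat" start
f *= f0.sum() / f.sum()                         # match the number density of f0

nu, dt, steps = 2.0, 1e-3, 5000                 # collision frequency, time step
for _ in range(steps):                          # forward-Euler time integration
    f = f + dt * nu * (f0 - f)                  # df/dt = nu (f0 - f)

# After t = 5 the deviation has decayed by roughly exp(-nu*t) = exp(-10).
print(np.max(np.abs(f - f0)) < 1e-3)   # True
```

The exponential decay rate ν is exactly the "proportional to the molecular collision frequency" statement of the BGK model.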
is indistinguishable from the capital version of the Latin letter h (H). Discussions have been raised on how the symbol should be understood, but it remains unclear due to the lack of written sources from the time of the theorem. Studies of the typography and the work of J.W. Gibbs seem to favour the interpretation of H as Eta. The H value is determined from the function f(E, t) dE, which
is written {\displaystyle d^{3}\mathbf {r} \,d^{3}\mathbf {p} =dx\,dy\,dz\,dp_{x}\,dp_{y}\,dp_{z}.} Since the probability of N molecules, which all have r and p within {\displaystyle d^{3}\mathbf {r} \,d^{3}\mathbf {p} },
is zero for {\displaystyle p_{i}\to \pm \infty }. Multiplying the Boltzmann equation by A and integrating over momentum yields four terms, which, using integration by parts, can be expressed as {\displaystyle \int A{\frac {\partial f}{\partial t}}\,d^{3}\mathbf {p} ={\frac {\partial }{\partial t}}(n\langle A\rangle ),} {\displaystyle \int {\frac {p_{j}A}{m}}{\frac {\partial f}{\partial x_{j}}}\,d^{3}\mathbf {p} ={\frac {1}{m}}{\frac {\partial }{\partial x_{j}}}(n\langle Ap_{j}\rangle ),} {\displaystyle \int AF_{j}{\frac {\partial f}{\partial p_{j}}}\,d^{3}\mathbf {p} =-nF_{j}\left\langle {\frac {\partial A}{\partial p_{j}}}\right\rangle ,} {\displaystyle \int A\left({\frac {\partial f}{\partial t}}\right)_{\text{coll}}\,d^{3}\mathbf {p} =0,} where
the Vlasov equation. This equation is more useful than the principal one above, yet still incomplete, since f cannot be solved unless the collision term in f is known. This term cannot be found as easily or generally as the others – it is a statistical term representing the particle collisions, and requires knowledge of the statistics the particles obey, like the Maxwell–Boltzmann, Fermi–Dirac or Bose–Einstein distributions. A key insight applied by Boltzmann
the mass of the particle, the integrated Boltzmann equation becomes the conservation of mass equation: {\displaystyle {\frac {\partial }{\partial t}}\rho +{\frac {\partial }{\partial x_{j}}}(\rho V_{j})=0,} where {\displaystyle \rho =mn}
the probability that the particle occupies a given very small region of space (mathematically the volume element {\displaystyle d^{3}\mathbf {r} }) centered at the position {\displaystyle \mathbf {r} }, and has momentum nearly equal to a given momentum vector {\displaystyle \mathbf {p} } (thus occupying
the "force" term corresponds to the forces exerted on the particles by an external influence (not by the particles themselves), the "diff" term represents the diffusion of particles, and "coll" is the collision term, accounting for the forces acting between particles in collisions. Expressions for each term on the right side are provided below. Note that some authors use the particle velocity v instead of momentum p; they are related in
the "rarity" of states, resulting in his entropy formula of 1877. As a demonstration of Loschmidt's paradox, a modern counterexample (not to Boltzmann's original gas-related H-theorem, but to a closely related analogue) is the phenomenon of spin echo. In the spin echo effect, it is physically possible to induce time reversal in an interacting system of spins. An analogue to Boltzmann's H for
the (average) expected number of particles in that region. Tolman offers the following equations for the definition of the quantity H in Boltzmann's original H-theorem. Here we sum over the regions into which phase space is divided, indexed by {\displaystyle i}. And in the limit for an infinitesimal phase space volume {\displaystyle \delta q_{i}\rightarrow 0,\delta p_{i}\rightarrow 0\;\forall \,i}, we can write
the H-theorem or the relaxation theorem. There are several notable reasons described below why the H-theorem, at least in its original 1872 form, is not completely rigorous. As Boltzmann would eventually go on to admit, the arrow of time in the H-theorem is not in fact purely mechanical, but really a consequence of assumptions about initial conditions. Soon after Boltzmann published his H-theorem, Johann Josef Loschmidt objected that it should not be possible to deduce an irreversible process from time-symmetric dynamics and
the charge carriers in a material as a gas). See also convection–diffusion equation. The equation is a nonlinear integro-differential equation, and the unknown function in the equation is a probability density function in the six-dimensional space of particle position and momentum. The problem of existence and uniqueness of solutions is still not fully resolved, but some recent results are quite promising. The set of all possible positions r and momenta p
the collision term is {\displaystyle \left({\frac {\partial f_{i}}{\partial t}}\right)_{\mathrm {coll} }=\sum _{j=1}^{n}\iint g_{ij}I_{ij}(g_{ij},\Omega )[f'_{i}f'_{j}-f_{i}f_{j}]\,d\Omega \,d^{3}\mathbf {p'} ,} where f′ = f′(p′_i, t),
the collision term. The best-known model equation is due to Bhatnagar, Gross and Krook. The assumption in the BGK approximation is that the effect of molecular collisions is to force a non-equilibrium distribution function at a point in physical space back to a Maxwellian equilibrium distribution function, and that the rate at which this occurs is proportional to the molecular collision frequency. The Boltzmann equation
the conservation equations involve tensors, the Einstein summation convention will be used, where repeated indices in a product indicate summation over those indices. Thus {\displaystyle \mathbf {x} \mapsto x_{i}} and {\displaystyle \mathbf {p} \mapsto p_{i}=mv_{i}}, where {\displaystyle v_{i}}
the conservation of energy equation:

Thermalisation

In physics, thermalisation (or thermalization) is the process of physical bodies reaching thermal equilibrium through mutual interaction. In general, the natural tendency of a system is towards a state of equipartition of energy and uniform temperature that maximizes the system's entropy. Thermalisation, thermal equilibrium, and temperature are therefore important fundamental concepts within statistical physics, statistical mechanics, and thermodynamics; all of which are
the conservation of momentum equation: {\displaystyle {\frac {\partial }{\partial t}}(\rho V_{i})+{\frac {\partial }{\partial x_{j}}}(\rho V_{i}V_{j}+P_{ij})-nF_{i}=0,} where {\displaystyle P_{ij}=\rho \langle (v_{i}-V_{i})(v_{j}-V_{j})\rangle }
the container. The gas will quickly attain its equilibrium value of entropy, but given enough time, this same situation will happen again. For practical systems, e.g. a gas in a 1-liter container at room temperature and atmospheric pressure, this time is truly enormous, many multiples of the age of the universe, and, practically speaking, one can ignore the possibility. Since H is a mechanically defined variable that
the critics of the theorem, wrote the function with the symbol H, a notation that was subsequently adopted by Boltzmann when referring to his "H-theorem". The notation has led to some confusion regarding the name of the theorem. Even though the statement is usually referred to as the "Aitch theorem", sometimes it is instead called the "Eta theorem", as the capital Greek letter Eta (Η)
the definition of momentum by p = mv. Consider particles described by f, each experiencing an external force F not due to other particles (see the collision term for the latter treatment). Suppose at time t some number of particles all have position r within element {\displaystyle d^{3}\mathbf {r} } and momentum p within {\displaystyle d^{3}\mathbf {p} }. If
the discrete counterpart of the quantity H, known as the information entropy or information uncertainty (with a minus sign). By extending the discrete information entropy to the continuous information entropy, also called differential entropy, one obtains the expression in the equation from the section above, Definition and Meaning of Boltzmann's H, and thus a better feel for the meaning of H. The H-theorem's connection between information and entropy plays
the entropy of an isolated system always increases to a maximum equilibrium value. This is strictly true only in the thermodynamic limit of an infinite number of particles. For a finite number of particles, there will always be entropy fluctuations. For example, in the fixed volume of the isolated system, the maximum entropy is obtained when half the particles are in one half of the volume and half in
the fact that the phase space volume element {\displaystyle d^{3}\mathbf {r} \,d^{3}\mathbf {p} } is constant, which can be shown using Hamilton's equations (see the discussion under Liouville's theorem). However, since collisions do occur, the particle density in the phase-space volume {\displaystyle d^{3}\mathbf {r} \,d^{3}\mathbf {p} } changes, so {\displaystyle {\begin{aligned}dN_{\mathrm {coll} }&=\left({\frac {\partial f}{\partial t}}\right)_{\mathrm {coll} }\Delta t\,d^{3}\mathbf {r} \,d^{3}\mathbf {p} \\[5pt]&=f\left(\mathbf {r} +{\frac {\mathbf {p} }{m}}\Delta t,\mathbf {p} +\mathbf {F} \Delta t,t+\Delta t\right)d^{3}\mathbf {r} \,d^{3}\mathbf {p} -f(\mathbf {r} ,\mathbf {p} ,t)\,d^{3}\mathbf {r} \,d^{3}\mathbf {p} \\[5pt]&=\Delta f\,d^{3}\mathbf {r} \,d^{3}\mathbf {p} \end{aligned}}} where Δf
the function: where summation runs over all possible distinct states of the system, and p_i is the probability that the system could be found in the i-th state. This is closely related to the entropy formula of Gibbs, and we shall (following e.g., Waldram (1985), p. 39) proceed using S rather than H. First, differentiating with respect to time gives (using the fact that Σ dp_i/dt = 0, since Σ p_i = 1, so
the ideal gas are distributed in some other way (say, all having the same kinetic energy), then the value of H will be higher. Boltzmann's H-theorem, described in the next section, shows that when collisions between molecules are allowed, such distributions are unstable and tend irreversibly towards the minimum value of H (towards the Maxwell–Boltzmann distribution). (Note on notation: Boltzmann originally used
the integral over velocity space: {\displaystyle \displaystyle H\ {\stackrel {\mathrm {def} }{=}}\ \int {P({\ln P})\,d^{3}v}=\left\langle \ln P\right\rangle } where P(v)
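The identity H = ⟨ln P⟩ suggests a direct Monte Carlo estimate: sample velocities from P and average ln P over the samples. The sketch below is an illustration under stated assumptions (not from the source): P is taken to be a 3-D Maxwellian with unit variance per component, for which the exact value is H = −(3/2) ln(2πe).

```python
import numpy as np

# Sketch: estimate H = <ln P> by sampling v from P itself and averaging ln P.
# Assumed P: a 3-D Maxwellian with unit variance per velocity component,
# chosen so the exact answer H = -(3/2) ln(2*pi*e) ≈ -4.2568 is known.
rng = np.random.default_rng(0)
v = rng.standard_normal((200_000, 3))          # samples drawn from P

def log_P(v):
    """ln of the 3-D unit-variance Maxwellian density at each sample."""
    return -0.5 * (v**2).sum(axis=1) - 1.5 * np.log(2 * np.pi)

H_mc = log_P(v).mean()                          # Monte Carlo estimate of <ln P>
H_exact = -1.5 * np.log(2 * np.pi * np.e)
print(abs(H_mc - H_exact) < 0.02)   # True, up to ~1/sqrt(N) statistical error
```

The estimator converges like 1/√N, so the agreement here is statistical rather than exact.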
the introduction of generalized statistical ensembles. The kinetic equation, and in particular Boltzmann's molecular chaos assumption, inspired a whole family of Boltzmann equations that are still used today to model the motions of particles, such as the electrons in a semiconductor. In many cases the molecular chaos assumption is highly accurate, and the ability to discard complex correlations between particles makes calculations much simpler. The process of thermalisation can be described using
the jumps will make contributions where the reversibility of the dynamics ensures that the same transition constant ν_αβ appears in both expressions. So The two difference terms in the summation always have the same sign. For example: then so overall the two negative signs will cancel. Therefore,

Boltzmann's equation

The Boltzmann equation or Boltzmann transport equation (BTE) describes
the last term is zero, since A is conserved in a collision. The values of A correspond to moments of velocity {\displaystyle v_{i}} (and momentum {\displaystyle p_{i}}, as they are linearly dependent). Letting {\displaystyle A=m(v_{i})^{0}=m},
the letter E for quantity H; most of the literature after Boltzmann uses the letter H as here. Boltzmann also used the symbol x to refer to the kinetic energy of a particle.) Boltzmann considered what happens during the collision between two particles. It is a basic fact of mechanics that in the elastic collision between two particles (such as hard spheres), the energy transferred between
the magnitude of the relative momenta is {\displaystyle g_{ij}=|\mathbf {p} _{i}-\mathbf {p} _{j}|=|\mathbf {p'} _{i}-\mathbf {p'} _{j}|,} and I_ij
the mechanics of energy transfer, the energies of the particles after the collision will obey a certain new random distribution that can be computed. Considering repeated uncorrelated collisions between any and all of the molecules in the gas, Boltzmann constructed his kinetic equation (Boltzmann's equation). From this kinetic equation, a natural outcome is that the continual process of collision causes
the momenta of any two particles (labeled as A and B for convenience) before a collision, p′_A and p′_B are the momenta after the collision, {\displaystyle g=|\mathbf {p} _{B}-\mathbf {p} _{A}|=|\mathbf {p'} _{B}-\mathbf {p'} _{A}|}
the other, but sometimes there will temporarily be a few more particles on one side than the other, and this will constitute a very small reduction in entropy. These entropy fluctuations are such that the longer one waits, the larger an entropy fluctuation one will probably see during that time, and the time one must wait for a given entropy fluctuation is always finite, even for a fluctuation to its minimum possible value. For example, one might have an extremely low entropy condition of all particles being in one half of
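The rarity of such fluctuations can be made quantitative with a toy estimate (illustrative assumptions, not from the source: N independent particles, each equally likely to sit in either half of the container at a given instant):

```python
import math

# Toy estimate: with N independent particles, each in either half of the
# container with probability 1/2, the chance that ALL of them occupy one
# chosen half at a given instant is 2**(-N).
def log10_prob_all_in_one_half(N):
    """log10 of the probability that all N particles occupy one half."""
    return -N * math.log10(2)

print(round(log10_prob_all_in_one_half(100), 1))   # -30.1: never seen in practice
print(log10_prob_all_in_one_half(6e23))            # ~ -1.8e23 for a mole of gas
```

Even for only 100 particles the probability is of order 10⁻³⁰, which is why the corresponding waiting times dwarf the age of the universe.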
the particles varies depending on initial conditions (angle of collision, etc.). Boltzmann made a key assumption known as the Stosszahlansatz (molecular chaos assumption): that during any collision event in the gas, the two particles participating in the collision have 1) independently chosen kinetic energies from the distribution, 2) independent velocity directions, and 3) independent starting points. Under these assumptions, and given
the quantity H to decrease until it has reached a minimum. Although Boltzmann's H-theorem turned out not to be the absolute proof of the second law of thermodynamics as originally claimed (see Criticisms below), the H-theorem led Boltzmann in the last years of the 19th century to more and more probabilistic arguments about the nature of thermodynamics. The probabilistic view of thermodynamics culminated in 1902 with Josiah Willard Gibbs's statistical mechanics for fully general systems (not just gases), and
the second term vanishes. We will see later that it will be useful to break this into two sums.) Now Fermi's golden rule gives a master equation for the average rate of quantum jumps from state α to β, and from state β to α. (Of course, Fermi's golden rule itself makes certain approximations, and the introduction of this rule is what introduces irreversibility. It is essentially the quantum version of Boltzmann's Stosszahlansatz.) For an isolated system
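The sign argument above can be watched numerically. The sketch below is a minimal illustration (the 5-state system and the random symmetric rate matrix ν_αβ are arbitrary choices, not from the source): integrating a master equation dp_α/dt = Σ_β ν_αβ (p_β − p_α) with symmetric rates, the Gibbs entropy S = −Σ p_α ln p_α rises monotonically toward its maximum ln n.

```python
import numpy as np

# Sketch of the quantum H-theorem argument: evolve a probability vector p
# under dp_a/dt = sum_b nu_ab (p_b - p_a) with a SYMMETRIC rate matrix nu
# (the symmetry guaranteed by Fermi's golden rule), and check that the Gibbs
# entropy S = -sum p ln p never decreases. The 5-state system is illustrative.
rng = np.random.default_rng(1)
n = 5
nu = rng.random((n, n))
nu = (nu + nu.T) / 2                             # enforce nu_ab = nu_ba
np.fill_diagonal(nu, 0.0)

p = np.array([0.9, 0.05, 0.03, 0.015, 0.005])    # far-from-equilibrium start
dt = 1e-3

def entropy(p):
    return -(p * np.log(p)).sum()

entropies = [entropy(p)]
for _ in range(20_000):
    p = p + dt * (nu @ p - nu.sum(axis=1) * p)   # dp_a/dt = sum_b nu_ab (p_b - p_a)
    entropies.append(entropy(p))

print(np.all(np.diff(entropies) >= -1e-12))          # True: S never decreases
print(abs(entropies[-1] - np.log(n)) < 1e-3)         # True: S -> ln 5 (uniform p)
```

The symmetric rates are exactly what makes each paired difference term in dS/dt non-negative, so the monotone rise is the discrete analogue of the cancellation argued above.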
the spin system can be defined in terms of the distribution of spin states in the system. In the experiment, the spin system is initially perturbed into a non-equilibrium state (high H), and, as predicted by the H-theorem, the quantity H soon decreases to the equilibrium value. At some point, a carefully constructed electromagnetic pulse is applied that reverses the motions of all the spins. The spins then undo
the statistical behaviour of a thermodynamic system not in a state of equilibrium; it was devised by Ludwig Boltzmann in 1872. The classic example of such a system is a fluid with temperature gradients in space causing heat to flow from hotter regions to colder ones, by the random but biased transport of the particles making up that fluid. In the modern literature the term Boltzmann equation
the sum as an integral. H can also be written in terms of the number of molecules present in each of the cells. An additional way to calculate the quantity H is: where P is the probability of finding a system chosen at random from the specified microcanonical ensemble. It can finally be written as: where G is the number of classical states. The quantity H can also be defined as
the system's H is at any time not a minimum, then by Poincaré recurrence, the non-minimal H must recur (though after some extremely long time). Boltzmann admitted that these recurring rises in H technically would occur, but pointed out that, over long times, the system spends only a tiny fraction of its time in one of these recurring states. The second law of thermodynamics states that
the time evolution from before the pulse, and after some time the H actually increases away from equilibrium (once the evolution has completely unwound, the H decreases once again to the minimum value). In some sense, the time-reversed states noted by Loschmidt turned out to be not completely impractical. In 1896, Ernst Zermelo noted a further problem with the H-theorem, which was that if
the time interval over which it is observed is not enormously large. If H is interpreted as entropy, as Boltzmann intended, then this can be seen as a manifestation of the fluctuation theorem. H is a forerunner of Shannon's information entropy. Claude Shannon denoted his measure of information entropy H after the H-theorem. The article on Shannon's information entropy contains an explanation of
the total number of particles which have positions and momenta in that region:
was to determine the collision term resulting solely from two-body collisions between particles that are assumed to be uncorrelated prior to the collision. This assumption was referred to by Boltzmann as the "Stosszahlansatz" and is also known as the "molecular chaos assumption". Under this assumption, the collision term can be written as a momentum-space integral over the product of one-particle distribution functions: {\displaystyle \left({\frac {\partial f}{\partial t}}\right)_{\text{coll}}=\iint gI(g,\Omega )[f(\mathbf {r} ,\mathbf {p'} _{A},t)f(\mathbf {r} ,\mathbf {p'} _{B},t)-f(\mathbf {r} ,\mathbf {p} _{A},t)f(\mathbf {r} ,\mathbf {p} _{B},t)]\,d\Omega \,d^{3}\mathbf {p} _{B},} where p_A and p_B are