Non-Equilibrium Statistical Mechanics: Partition Functions and Steepest Entropy Increase
To cite this article: Sergio Bordel, J. Stat. Mech. (2011) P05013
Online at stacks.iop.org/JSTAT/2011/P05013
doi:10.1088/1742-5468/2011/05/P05013
© 2011 IOP Publishing Ltd and SISSA
Contents

1. Introduction
2. The non-equilibrium partition function
3. The steepest entropy ascent and Onsager’s relations are equivalent
4. The Fisher–Rao metric is the natural metric
5. Conclusions
1. Introduction
increase, a metric in the space of density operators has to be introduced. Beretta chose a
Fisher–Rao metric [10, 11].
In a recent work [12] we argued, using information-theoretical concepts, that the
most probable direction of evolution of a system is the direction of maximal entropy
ascent and showed that Onsager’s reciprocity relations appear as a consequence of this
assumption. However, we did not pay attention to how the choice of metric for the space
of probability distributions would affect these conclusions. In section 3 of this work
we make a stronger case for the steepest entropy ascent and show that independently
of the choice of metric, the maximal entropy ascent is a condition both necessary and
sufficient for the satisfaction of Onsager’s reciprocity relations. This result shows a clear
2. The non-equilibrium partition function

In this section we adopt again most of the arguments developed in a previous work [12] in
order to derive a probability distribution for states far from equilibrium. We consider an
isolated system far away from equilibrium and analyse the time evolution of the system’s
thermodynamic entropy towards its maximum, taking into account the fact that in an
isolated system, energy and matter are conserved.
Given the microscopic definition of entropy [6], the rate of entropy increase in an isolated system can be defined as follows:

$$\frac{dS}{dt} = -k_B \sum_i \ln(p_i)\,\frac{dp_i}{dt} - k_B \sum_i \frac{dp_i}{dt}. \qquad (1)$$
The sum of the probabilities of all the accessible microstates must remain constant
and equal to 1; therefore its time derivative is equal to 0:
$$\sum_i \frac{dp_i}{dt} = 0. \qquad (2)$$
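A quick numerical check of equations (1) and (2) (a sketch of ours, in units where $k_B = 1$, with randomly generated $p$ and $\dot{p}$, so all names are illustrative): once normalization is conserved, the second sum in (1) vanishes and only the $-k_B \sum_i \ln(p_i)\,\dot{p}_i$ term contributes.

```python
import numpy as np

rng = np.random.default_rng(0)
kB = 1.0  # work in units where Boltzmann's constant is 1

# a normalized probability distribution over n microstates
n = 6
p = rng.random(n)
p /= p.sum()

# an arbitrary rate-of-change vector, shifted so that sum(pdot) = 0
# (equation (2): normalization is conserved)
pdot = rng.standard_normal(n)
pdot -= pdot.mean()

# full expression (1) and its reduced form
dS_full = -kB * np.sum(np.log(p) * pdot) - kB * np.sum(pdot)
dS_reduced = -kB * np.sum(np.log(p) * pdot)
```

The two values agree to machine precision, confirming that the second term of (1) carries no information once (2) holds.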
Note that thermodynamic equilibrium implies that πk is equal to zero for every k, or
in other words, that the vector of components ln(pi ) is perpendicular to the kernel of the
matrix C and therefore belongs to the vector space generated by its rows. This condition
implies the following relation:
$$\ln(p_i) = \lambda + \sum_r \lambda_{C_r}\, C_{ir}. \qquad (10)$$
The previous expression is identical to the grand canonical distribution obtained from
Gibbs’ formalism.
As we argued in a previous work [12], the thermodynamic fluxes can be defined as
the rates of change of non-conserved quantities:
$$J_f = \sum_i \phi_{fi}\,\dot{p}_i.$$
Expressing the rates of change of the probabilities according to equation (6) we obtain
the following expression:
$$J_f = \sum_k \varepsilon_k \sum_i e_{ki}\,\phi_{fi} = \sum_k \varepsilon_k\, a_{kf}. \qquad (12)$$
Comparing equations (8) and (13) and considering that when all the thermodynamic
forces become zero (in equilibrium), all the elements πk should also be zero, the following
relation between thermodynamic forces and the elements πk arises:
$$\pi_k = -\sum_f a_{kf}\, X_f. \qquad (14)$$
By combining equations (9) and (14) we obtain the following relation between the
thermodynamic forces and the non-equilibrium probability distribution:
$$k_B \sum_i e_{ki} \ln(p_i) = -\sum_f a_{kf}\, X_f \quad \forall k. \qquad (15)$$

Writing $a_{kf}$ out in terms of the basis vectors and the quantities $\phi_{fi}$,

$$k_B \sum_i e_{ki} \ln(p_i) = -\sum_i \sum_f e_{ki}\,\phi_{fi}\, X_f \quad \forall k. \qquad (16)$$
If we rewrite the previous equation using matrix formulation, with $E$ being the matrix formed by the row vectors $\{\vec{e}_k\}$, we obtain the following relation:

$$k_B\, E\, \overrightarrow{\ln(p)} = -E \sum_f X_f\, \vec{\phi}_f. \qquad (17)$$
The vector ψ can be any vector orthogonal to the subspace generated by {ek };
therefore it can be generated as a linear combination of the rows of the matrix C:
$$k_B \ln(p_i) = \lambda + \sum_r \lambda_{C_r}\, C_{ir} - \sum_f X_f\,\phi_{fi}. \qquad (19)$$
The average values of the non-conserved variables can be obtained from the partial
derivatives of the partition function with respect to their associated thermodynamic forces:
$$\langle \phi_f \rangle = -k_B\, \frac{\partial \ln Z}{\partial X_f} \qquad (22)$$
and the average values of the conserved quantities (energy or chemical species):
$$\langle C_r \rangle = k_B\, \frac{\partial \ln Z}{\partial \lambda_{C_r}}. \qquad (23)$$
However, the Lagrange multipliers corresponding to the conserved quantities can no
longer be directly identified with the inverse of the temperature (for the energy) or with the
negative chemical potentials divided by the temperature (for conserved chemical species),
as these variables are difficult to define in a system far away from equilibrium that could
be inhomogeneous.
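Equations (22) and (23) can be checked numerically (an illustrative sketch of ours: $k_B = 1$ and the microstate values $C_i$, $\phi_i$ are randomly generated, not taken from the paper) by comparing direct averages under the distribution (19) with finite-difference derivatives of $\ln Z$.

```python
import numpy as np

kB = 1.0
rng = np.random.default_rng(1)
n = 8
C = rng.standard_normal(n)     # a conserved quantity per microstate
phi = rng.standard_normal(n)   # a non-conserved quantity per microstate
lamC, X = 0.7, 0.3             # Lagrange multiplier and thermodynamic force

def lnZ(lamC, X):
    # log of the non-equilibrium partition function built from (19)
    return np.log(np.sum(np.exp((lamC * C - X * phi) / kB)))

# direct averages under the distribution of equation (19)
w = np.exp((lamC * C - X * phi) / kB)
p = w / w.sum()
phi_avg = np.sum(p * phi)
C_avg = np.sum(p * C)

# equations (22) and (23) via central finite differences
h = 1e-6
phi_from_Z = -kB * (lnZ(lamC, X + h) - lnZ(lamC, X - h)) / (2 * h)
C_from_Z = kB * (lnZ(lamC + h, X) - lnZ(lamC - h, X)) / (2 * h)
```

Both routes give the same averages, as the generating-function structure of $Z$ requires.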
A similar non-equilibrium partition function was already derived using different
methods by Zubarev and Kalashnikov [2], who also identified the parameters equivalent
to Xf in their expression as thermodynamic forces. The most interesting feature of the
non-equilibrium probability distribution obtained here is that it corresponds to the same
distribution as would be obtained by maximizing the entropy of the system with the values
φf constrained to be constant. However, this condition was not used in our derivation
and the expression arises only on defining the rate of entropy increase as the sum of the
products of thermodynamic fluxes with their respective forces.
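The stationarity behind this equivalence can be made concrete (our sketch, with $k_B = 1$ and randomly generated microstate data): since $\ln(p_i)$ from (19) lies in the span of $\{1, C_i, \phi_i\}$, the first-order entropy change vanishes for any perturbation $dp$ compatible with the constraints, which is exactly the maximum-entropy condition.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10
C = rng.standard_normal(n)
phi = rng.standard_normal(n)
w = np.exp(0.5 * C - 0.2 * phi)   # a distribution of the form (19), kB = 1
p = w / w.sum()

# project a random perturbation onto the null space of the constraints:
# sum(dp) = 0, sum(dp * C) = 0, sum(dp * phi) = 0
B = np.stack([np.ones(n), C, phi])                   # constraint rows
dp = rng.standard_normal(n)
dp -= B.T @ np.linalg.lstsq(B.T, dp, rcond=None)[0]  # remove span(B) part

# first-order entropy change under the admissible perturbation
dS1 = -np.sum((np.log(p) + 1.0) * dp)
```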
3. The steepest entropy ascent and Onsager’s relations are equivalent

In a previous work [12] we argued from geometric considerations that the most probable
path of evolution of a system towards equilibrium is the direction of steepest entropy ascent
and showed that Onsager’s reciprocity relations can be deduced from this consideration
and hold also far from equilibrium. However, in order to define a gradient and a direction
of steepest entropy ascent, it is necessary to define a distance in the space of vectors p.
In our previous work we implicitly assumed the distance to be defined as follows:
$$d\ell^2 = \sum_i dp_i^2. \qquad (24)$$
This decision appears to be arbitrary. Beretta [10, 11] argued that the natural metric
of a space of discrete probability distributions is the Fisher–Rao metric. The same metric
was used by Crooks [18] to define what he called the thermodynamic distance between
two states, although he only applied it to equilibrium states:
$$d\ell^2 = \sum_i \frac{dp_i^2}{p_i}. \qquad (25)$$
In this section we aim to prove that independently of the choice of metric, Onsager’s
relations can be deduced from the steepest entropy ascent and hold even far away from
equilibrium. Given an arbitrary metric for our space of probability distributions,

$$d\ell^2 = \sum_{ij} g_{ij}(\vec{p}\,)\, dp_i\, dp_j. \qquad (26)$$
The bilinear form with components gij is positive definite, and locally dependent on
the probability distribution.
By choosing the basis {ek } in such a way that it is orthonormal with respect to
the scalar product defined by the bilinear form gij , the metric with respect to this basis
becomes
$$d\ell^2 = \sum_k d\varepsilon_k^2. \qquad (27)$$
Looking at equation (8) we can see that the direction of maximal entropy ascent corresponds to

$$\vec{\varepsilon} = -\frac{1}{\tau}\,\vec{\pi}. \qquad (28)$$
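A small numerical illustration (ours; it assumes the bilinear form $dS/dt = -\sum_k \varepsilon_k \pi_k$ used in passing from equation (8) to (28), and a randomly generated $\vec{\pi}$): among all vectors $\vec{\varepsilon}$ of fixed norm, the entropy rate is maximized when $\vec{\varepsilon}$ is antiparallel to $\vec{\pi}$.

```python
import numpy as np

rng = np.random.default_rng(3)
k = 5
pi = rng.standard_normal(k)          # the vector of components pi_k

def dS_dt(eps):
    # entropy rate in the orthonormal basis {e_k}
    return -np.dot(eps, pi)

# steepest-ascent choice: eps antiparallel to pi, with unit norm
eps_star = -pi / np.linalg.norm(pi)

# compare against many random directions of the same norm
best_random = max(
    dS_dt(v / np.linalg.norm(v)) for v in rng.standard_normal((1000, k))
)
```

No random unit direction beats the antiparallel choice, whose rate equals $|\vec{\pi}|$.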
The parameter τ has units of time and has been chosen (instead of β = 1/τ in our
previous work [12]) in order to follow Beretta’s notation [10, 11]. By substituting the
last expression in equation (12) we obtain the following equation for the thermodynamic
fluxes:
$$J_f = -\frac{1}{\tau} \sum_k \pi_k\, a_{kf}(\vec{p}\,). \qquad (29)$$
In this case the basis $\{\vec{e}_k\}$ is chosen to be orthonormal with respect to the bilinear form $g_{ij}(\vec{p}\,)$; therefore, its components are functions of the probability distribution and so are the components $a_{kf}(\vec{p}\,)$:

$$a_{kf}(\vec{p}\,) = \sum_i e_{ki}(\vec{p}\,)\,\phi_{fi}. \qquad (30)$$
If we combine equations (14) and (29) we arrive at the following relationship between
the fluxes and forces:
$$J_f = \frac{1}{\tau} \sum_g \sum_k a_{kg}(\vec{p}\,)\, a_{kf}(\vec{p}\,)\, X_g. \qquad (31)$$
This expression has the form of Onsager’s linear relations, $J_f = \sum_g l_{fg}(\vec{p}\,)\, X_g$, with coefficients $l_{fg}(\vec{p}\,) = (1/\tau)\sum_k a_{kf}(\vec{p}\,)\, a_{kg}(\vec{p}\,)$ that are manifestly symmetric and, by the Cauchy–Schwarz inequality, satisfy

$$l_{fg}(\vec{p}\,)^2 \leq l_{ff}(\vec{p}\,)\, l_{gg}(\vec{p}\,). \qquad (34)$$
We can conclude that Onsager’s relations can be obtained from the condition of
maximal entropy ascent independently of the metric chosen for the space of probability
distributions and hold also far away from equilibrium. Beretta also obtained the same
result using the Fisher–Rao metric [10, 11].
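The symmetry of the Onsager matrix in (31), together with the bound (34), holds for any coefficients $a_{kf}$; a numerical sketch (ours, with a randomly generated $a_{kf}$ matrix and an arbitrary $\tau$):

```python
import numpy as np

rng = np.random.default_rng(4)
k, f = 6, 3
tau = 2.0
A = rng.standard_normal((k, f))      # a_{kf} at an arbitrary state p

# equation (31): J = L X with L_{fg} = (1/tau) * sum_k a_{kf} a_{kg}
L = A.T @ A / tau

symmetric = np.allclose(L, L.T)      # Onsager reciprocity
# equation (34): Cauchy-Schwarz bound on the cross coefficients
bounds_ok = all(
    L[i, j] ** 2 <= L[i, i] * L[j, j] + 1e-12
    for i in range(f) for j in range(f)
)
```

Because $L = A^{t}A/\tau$ is a Gram matrix, both properties are automatic, whatever the state $\vec{p}$ at which $A$ is evaluated.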
Now we will show that the condition of maximal entropy ascent is not only consistent with Onsager’s relations but also necessary. Let us assume that the vectors $\vec{\varepsilon}$ and $\vec{\pi}$ are not anti-parallel but are related by the following equation:

$$\vec{\varepsilon} = -\frac{1}{\tau}\, U\vec{\pi}. \qquad (35)$$
The matrix $U$ is unitary and different from the identity matrix (and therefore, in general, not symmetric). Equations (29) and (31) get transformed in the following way:

$$J_f = -\frac{1}{\tau} \sum_q \sum_k u_{kq}\, \pi_q\, a_{kf}(\vec{p}\,) \qquad (36)$$

$$J_f = \frac{1}{\tau} \sum_g \sum_q \sum_k u_{kq}\, a_{qg}(\vec{p}\,)\, a_{kf}(\vec{p}\,)\, X_g. \qquad (37)$$
The reciprocity relations are satisfied only if the unitary matrix U becomes the identity
matrix; therefore Onsager’s reciprocity relations are satisfied only if the system evolves
in the direction of steepest entropy increase for any particular metric of the space of
probability distributions.
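The necessity argument can be illustrated numerically (our sketch; the specific rotation chosen for $U$ and the random $a_{kf}$ are arbitrary): with $U$ equal to the identity, the Onsager matrix of (31) is symmetric, while a generic unitary $U$ in (37) destroys the symmetry.

```python
import numpy as np

rng = np.random.default_rng(5)
k, f = 4, 2
tau = 1.0
A = rng.standard_normal((k, f))

# a unitary (here real orthogonal) matrix different from the identity:
# a rotation in the plane of the first two basis vectors
theta = 0.5
U = np.eye(k)
U[:2, :2] = [[np.cos(theta), -np.sin(theta)],
             [np.sin(theta), np.cos(theta)]]

L_steepest = A.T @ A / tau          # equation (31): U = identity
L_tilted = A.T @ U @ A / tau        # equation (37): generic unitary U
```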
4. The Fisher–Rao metric is the natural metric

For the vector $\overrightarrow{\ln(p)}$ to remain in the vector subspace generated by $\{\vec{1}, \vec{C}_r, \vec{\phi}_f\}$, its time derivatives must also belong to this subspace. Rewriting equation (6) in matrix notation we obtain

$$\dot{\vec{p}} = E^{t}\, \vec{\varepsilon}. \qquad (41)$$
Now we write the time derivatives of the probabilities as a function of the time derivatives of their logarithms:

$$P\, \frac{d}{dt}\overrightarrow{\ln(p)} = E^{t}\, \vec{\varepsilon}. \qquad (42)$$
The matrix $P$ is a diagonal matrix that contains the values of the probabilities and $E$ is the matrix formed by the row vectors $\{\vec{e}_k\}$.
$$A = E\,\Phi \qquad (46)$$
$$E\, G\, P\, \frac{d}{dt}\overrightarrow{\ln(p)} = -\frac{1}{\tau}\, E\,\Phi\, \vec{X}. \qquad (47)$$
The matrix $\Phi$ contains the vectors $\{\vec{\phi}_f\}$ in its columns; therefore the vector $\Phi\vec{X}$ belongs to the subspace generated by $\{\vec{\phi}_f\}$.
Following the same argument as in equation (17) we can see that

$$G\, P\, \frac{d}{dt}\overrightarrow{\ln(p)} = -\frac{1}{\tau}\,\Phi\vec{X} + \vec{\psi}. \qquad (48)$$
The vector $\vec{\psi}$ belongs to the subspace generated by $\{\vec{1}, \vec{C}_r\}$. Therefore, for an arbitrary metric defined by $G$ we have concluded that the vector $G\,P\,\frac{d}{dt}\overrightarrow{\ln(p)}$ belongs to the subspace generated by $\{\vec{1}, \vec{C}_r, \vec{\phi}_f\}$. Now, if we want to ensure that $\frac{d}{dt}\overrightarrow{\ln(p)}$ also belongs to the same subspace, we need the following condition to be satisfied:

$$G \propto P^{-1}. \qquad (49)$$
This means that for a system to be constantly describable, the metric of the space
of probability distributions has to be of the Fisher–Rao type, exactly as Beretta had
proposed.
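As an illustration of steepest entropy ascent under a Fisher–Rao metric (our sketch: only the normalization constraint is retained, $k_B = 1$, and the step size and iteration count are arbitrary), the natural-gradient update $dp_i = \eta\, p_i(-\ln p_i - S)$, obtained by taking $G = P^{-1}$, conserves $\sum_i p_i$ exactly and increases the entropy monotonically towards the uniform distribution.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 5
p = rng.random(n)
p /= p.sum()

eta = 0.1
entropies = []
for _ in range(300):
    S = -np.sum(p * np.log(p))
    entropies.append(S)
    # Fisher-Rao (natural-gradient) steepest ascent of the entropy,
    # projected so that normalization is conserved exactly:
    # sum of p_i * (-ln p_i - S) over i is S - S = 0
    p = p + eta * p * (-np.log(p) - S)
```

The first-order entropy gain per step is $\eta\,\mathrm{Var}(-\ln p)$, which is non-negative and vanishes only at the uniform (maximum-entropy) state.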
An interesting characteristic of Beretta’s solution of the problem of thermodynamic
irreversibility (which makes it different from other approaches to solving the same
problem) is that the density operator is defined both for positive and negative times [10].
Therefore, previous states of the system under study can be deduced from the present
state. A system being constantly describable implies that the same set of macroscopic
variables can describe the state of the system at all the points of its trajectory towards
thermodynamic equilibrium. There may be an epistemological relationship between these
5. Conclusions