
Statistical Mechanical Ensembles†

1. Microscopic Origin of Entropy


A common definition of entropy from a macroscopic thermodynamic viewpoint is ΔS = Q_rev/T, where Q_rev is the amount of heat exchanged at temperature T between a system and its environment. But what is the origin of entropy in the microscopic, molecular world? Qualitatively, we expect that irreversible processes (which result in an increase in total entropy) also increase the degree of "disorder" in a system. It turns out that a quantitative measure of disorder, specifically the number of microscopic states Ω available to a system at a given internal energy U and for specified number of molecules (or moles) N and volume V, can be quantitatively linked to the entropy. A distinct microscopic state (microstate) is defined by all microscopic degrees of freedom – e.g. positions and velocities of molecules in a gas. A set of microstates with specified common properties (e.g. number of particles N, volume V and energy U) defines an ensemble. The expression of S in terms of microstates is provided by the famous 1872 entropy formula of Ludwig Boltzmann,

S = k_B \ln \Omega(N, V, U)    (1)
 
Boltzmann's entropy formula is the foundation of statistical mechanics, connecting macroscopic and microscopic points of view. It allows calculations of macroscopic thermodynamic properties by determining properties of microscopic configurations. The constant k_B is called Boltzmann's constant; it is the ratio of the ideal-gas constant to Avogadro's number, or equivalently the gas constant on a per-molecule basis: k_B = R/N_A = 1.38065×10⁻²³ J/K.

The concept of microstates arises naturally in quantum mechanics, but it can also be introduced in classical systems, if positions and velocities are grouped so that values differing by less than some selected (but arbitrary) small amount are considered equal. This quantization effect also arises naturally in computations, which are performed with a finite numerical accuracy, so that two quantities cannot differ by less than the machine precision.

                                                                                                                         
† Draft material from "Statistical Thermodynamics" © 2014, A. Z. Panagiotopoulos

 
Figure 1  A system of 10 spheres with 3 energy levels.

To illustrate the concept of counting microstates, we will use the simple system shown in Fig. 1. In the general case, it consists of N slots, each containing a ball that can be at energy levels 0, +1, +2, +3, …, measured in units of k_B T_0, where T_0 is a reference temperature. The specific case of N = 10 and 3 energy levels is shown in the figure. The concept of discrete energy levels arises very naturally in quantum mechanics. For this simple system, we can count states by using the combinatorial formula giving the number of ways we can pick M specific distinguishable objects out of N total objects:

\binom{N}{M} = \frac{N!}{M!\,(N-M)!} = \frac{N(N-1)\cdots(N-M+1)}{1\cdot 2\cdots M}    (2)
 
There  is  only  1  state  with  internal  energy  U  =  0.  States  with  U  =  1  have  one  ball  
at  level  +1  and  the  others  at  level  0,  so  for  N  =  10,  there  are  10  such  states.  States  
with  energy  U  =  2  may  have  1  ball  at  +2  and  the  others  at  0,  or  2  balls  at  +1,  so  there  
are:  
\binom{10}{2} + 10 = 55
such states. We can similarly obtain Ω(3) = 220, Ω(4) = 715, Ω(5) = 2002, and so on. Note that the number of microstates increases rapidly with the total energy of the system. This is generally the case for most systems.
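These counts are easy to verify by brute force. The short Python sketch below is illustrative only (not part of the original text); it assumes the unbounded integer energy levels described above for the general case, enumerates all level assignments for N = 10 distinguishable slots, and compares the result with the closed-form "stars and bars" count C(U+N−1, U). Function and variable names are arbitrary.

```python
from itertools import product
from math import comb

N = 10  # number of slots (distinguishable balls)

def omega_brute(U):
    """Count assignments of integer levels 0, 1, 2, ... to the N slots
    whose total energy is exactly U (levels above U cannot contribute)."""
    return sum(1 for levels in product(range(U + 1), repeat=N)
               if sum(levels) == U)

for U in range(4):                       # brute force is cheap only for small U
    stars_and_bars = comb(U + N - 1, U)  # U quanta distributed over N slots
    print(U, omega_brute(U), stars_and_bars)
# expected output: 1, 10, 55, 220 for U = 0, 1, 2, 3 -- matching the text
```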
Now  we  are  in  a  position  to  show  that  S  defined  microscopically  from  Eq.  1  has  
the  two  key  properties  associated  with  the  entropy  of  classical  thermodynamics:  
1. S  is  extensive:    For  two  independent  subsystems,  A  and  B,  

S_{A+B} = k_B \ln(\Omega_{A+B}) = k_B \ln(\Omega_A \cdot \Omega_B) = k_B \ln\Omega_A + k_B \ln\Omega_B
The   reason   is   that   each   microstate   of   system   A   can   be   combined   with   a  
microstate  of  system  B  to  give  a  microstate  of  the  combined  system.  This  is  
clearly true for the simple system of Fig. 1. However,
when   mixing   two   gases   or   liquids,   we   only   get   the   above   expression   if   we  
assume  that  the  particles  in  the  systems  are  indistinguishable.  If  particles  are  
distinguishable,   additional   states   are   available   to   the   combined   system  
resulting   from   the   possibility   of   exchanging   the   “labels”   of   particles.  
Although the indistinguishability of particles is really of quantum mechanical origin, it was introduced ad hoc by Gibbs before the development of quantum mechanics, in order to make entropy an extensive property.
2. S  is  maximized  at  equilibrium:    For  a  system  with  internal  constraints  (e.g.  
internal   rigid   walls   or   barriers   to   energy   transfer),   the   number   of   possible  
microstates   is   always   smaller   than   the   number   of   microstates   after   the  
constraints  are  removed.    
  S  (N,V,  U)    >    S  (N,V,  U;  internal  constraints)  

To demonstrate this second property, consider the box with particles of the example above, and think of any constraint on the system at a given total energy (say U = +2). An example of a "constraint" would be to require that the first five slots together contain exactly 1 unit of energy (so the remaining unit must reside in the last five slots). The number of microstates in this case is 5×5 = 25, less than the 55 states available to the unconstrained system.
One clear distinction between macroscopic and microscopic definitions of entropy is that the former is physically meaningful only as a difference of entropy between specified states, while the latter appears to provide a measure of absolute entropy. This apparent discrepancy results from the inability of classical physics to define uniquely when two nearby states (e.g. positions of a particle in free space differing by a fraction of a nm) are sufficiently different to justify distinguishing them from each other. Quantum mechanical methods, on the other hand, provide precise ways to count states.
At low temperatures, the number of microstates available to any physical system decreases rapidly. At the limit of absolute zero temperature, T → 0, most systems adopt a unique "ground state" for which Ω = 1 ⇒ S = k_B lnΩ = 0. This is the basis of the "Third Law of thermodynamics" postulated by Nernst in the early 1900s. The NIST Chemistry WebBook lists absolute entropies for pure components and chemical elements in the thermochemistry data section. However, entropy values calculated with respect to an arbitrary reference state give the same results as absolute entropies for heat and work amounts.

Example  1  –  Entropy  of  a  lattice  chain    


A common model for polymers is the Flory lattice model, which represents chains as "self-avoiding random walks" on a lattice (grid). Self-avoiding means that two beads cannot occupy the same position on the lattice. When two non-bonded beads occupy adjacent positions, they have an energy of interaction equal to –k_B T_0, where T_0 is a reference temperature. Obtain the number of microstates Ω for a two-dimensional square-lattice chain of 5 beads, as a function of the energy U of the chain.
   
Fig. 2 shows the number of configurations for a square-lattice chain of 5 beads. Without loss of generality, we have fixed the configuration of the first two beads of the chain to be in the horizontal direction, with the second bead to the right of the first. This reduces the total number of configurations by a factor of 4; such a multiplicative factor simply shifts the value of S obtained from Eq. 1 by a constant, akin to the reference state for the entropy. The last bond is shown in multiple configurations (arrows), along with their number and energy: 2×(−1) for the top left image means there are 2 configurations, each of energy −1.

Overall, counting configurations of the same energy:

Ω(U=0) = 3+2+2+3+2+2+3 = 17 ;  Ω(U=−1) = 2+1+1+1+1+2 = 8

The number of microscopic configurations and energy levels increases rapidly with chain length. Theoretical and Monte Carlo computer simulation techniques are used for determining the properties of models of this type for longer chains.
 
 
 
 
 
 
 
 
 
 
 
 
 
   
Figure 2  Configurations for a two-dimensional chain of 5 beads.
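The counts above can also be checked by direct enumeration. The Python sketch below is illustrative only; it assumes the conventions of this example (first bond fixed along +x, energy −1 for every non-bonded pair of beads on adjacent lattice sites) and tallies all 5-bead self-avoiding walks by energy.

```python
from itertools import product
from collections import Counter

STEPS = [(1, 0), (-1, 0), (0, 1), (0, -1)]   # square-lattice moves
omega = Counter()

for moves in product(STEPS, repeat=3):       # three bonds after the fixed one
    chain = [(0, 0), (1, 0)]                 # first two beads fixed along +x
    for dx, dy in moves:
        nxt = (chain[-1][0] + dx, chain[-1][1] + dy)
        if nxt in chain:                     # self-avoidance: reject overlaps
            break
        chain.append(nxt)
    else:                                    # completed 5-bead walk
        energy = 0
        for i in range(5):
            for j in range(i + 2, 5):        # non-bonded pairs only
                if abs(chain[i][0] - chain[j][0]) + abs(chain[i][1] - chain[j][1]) == 1:
                    energy -= 1              # favourable contact, -1 each
        omega[energy] += 1

print(dict(omega))   # expected: {0: 17, -1: 8}
```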
 

A key issue in statistical mechanics is the frequency of occurrence of different microstates when the overall constraints of a thermodynamic system are specified. A basic postulate, comparable in importance to the postulate about the existence of equilibrium states in classical thermodynamics, is that all microstates of a system at a given U, N, and V are equally probable.

The basic postulate of statistical mechanics implies that the probability of any microstate ν, P_ν, is the same as that of any other microstate in the constant N, V, U ensemble:
P_\nu = \frac{1}{\Omega}    at constant N, V, and U    (3)
From this postulate, we can now simply derive another famous expression, the Gibbs entropy formula, by substituting Eq. 3 into Eq. 1:

S = -k_B \sum_{\text{all microstates }\nu} P_\nu \ln P_\nu    (4)

The Gibbs entropy formula can be shown to be valid even for systems not at constant energy U, volume V, and number of particles N. This is in contrast to Eq. 1, which is only valid for microstates at constant U, V, and N. For example, Eq. 4 is proved below for systems at constant N, V, and T.

2. Phase Space and Statistical Mechanical Ensembles


Boltzmann’s   entropy   formula   links   a   macroscopic   thermodynamic   quantity,   the  
entropy,   to   microscopic   attributes   of   a   system   at   equilibrium.   In   this   section,   we  
introduce  many  similar  relationships  for  systems  under  constraints  different  than  
(N,V,U),   in   a   manner   quite   analogous   to   the   introduction   of   fundamental   equations  
in   different   variables   developed   in   the   previous   chapter   through   the   Legendre  
transform  formalism.    
The branch of physical science that aims to connect macroscopic properties and microscopic information about a system is called Statistical Mechanics. The central question in Statistical Mechanics can be phrased as follows: If particles (atoms, molecules, electrons, nuclei, or even living cells) obey certain microscopic laws with specified interparticle interactions, what are the observable properties of a macroscopic system containing a large number of such particles? Unlike classical thermodynamics, statistical mechanics does require input information on the microscopic constitution of the matter of interest, as well as the interactions active among the microscopic building blocks. The advantage of the approach is that quantitative predictions of macroscopic properties can then be obtained, rather than simply relationships linking different properties to each other. Another important difference between statistical mechanics and macroscopic thermodynamics is that fluctuations, which are absent by definition in classical thermodynamics, can be quantified and analyzed through the tools of statistical mechanics. Fluctuations are temporary deviations of quantities such as the pressure or energy of a system from their mean values, and are important in small systems – e.g. those studied by computer simulations, or present in modern nanoscale electronic devices and biological organelles.
 

Postulate I states that macroscopic systems at equilibrium can be fully characterized by n+2 independent thermodynamic variables. For a 1-component isolated system, these variables can always be selected to be the total mass N, total volume V, and total energy U. However, at the microscopic level, molecules are in constant motion. Adopting temporarily a classical (rather than quantum mechanical) point of view, we can describe this motion through the instantaneous positions and velocities of the molecules. Examples of microscopic and macroscopic variables are given below for N molecules of a one-component monoatomic gas obeying classical mechanics.
 

  Microscopic variables (in 3 dimensions)       Macroscopic variables
  3N position coordinates (x, y, z)             3 independent thermodynamic
  3N velocity components (u_x, u_y, u_z)        variables, e.g., N, V, and U
 
Given that N is of the order of Avogadro's number [N_A = 6.0221×10²³ mol⁻¹] for macroscopic samples, there is a huge reduction in the number of variables required for a full description of a system when moving from the microscopic to the macroscopic variables. Moreover, the microscopic variables are constantly changing with time, whereas for a system at equilibrium, all macroscopic thermodynamic quantities are constant. The multidimensional space defined by the microscopic variables of a system is called the phase space. This is somewhat confusing, given that the word "phase" has a different meaning in classical thermodynamics – the term was introduced by J. Willard Gibbs, no stranger to the concept of macroscopic phases.
In  general,  for  a  system  with  N  molecules  in  3  dimensions,  phase  space  has  6N  
independent  variables  

(\mathbf{r}^N, \mathbf{p}^N) \equiv (\mathbf{r}_1, \mathbf{r}_2, \ldots, \mathbf{r}_N, \mathbf{p}_1, \mathbf{p}_2, \ldots, \mathbf{p}_N)    (5)
where bold symbols indicate vectors, r_i is the position and p_i the momentum of the i-th molecule (p_i = m_i u_i). The evolution of such a system in time is described by Newton's equations of motion:

\frac{d\mathbf{r}_i}{dt} = \mathbf{u}_i\,, \qquad m_i\frac{d\mathbf{u}_i}{dt} = -\frac{\partial\,\mathcal{U}(\mathbf{r}_1, \mathbf{r}_2, \ldots, \mathbf{r}_N)}{\partial \mathbf{r}_i}    (6)

where \mathcal{U} is the total potential energy, which includes interactions among molecules and any external fields acting on the system.


For a much simpler system, the one-dimensional harmonic oscillator shown in Fig. 3, phase space is two-dimensional, with coordinates the position and the momentum variables. In the absence of friction, the oscillator undergoes harmonic motion, which can be represented as a circle in phase space if position and momentum are scaled appropriately. The diameter of the circle depends on the total energy (a constant of the motion), and the oscillator moves around the circle at a constant velocity.

Figure 3  A one-dimensional harmonic oscillator (top), and the corresponding phase space (bottom).

A statistical mechanical ensemble is a collection of all microstates of a system, consistent with the constraints with which we characterize a system macroscopically. For example, a collection of all possible states of N molecules of gas in the container of volume V with a given total energy U is a statistical mechanical ensemble. For the frictionless one-dimensional harmonic oscillator, the ensemble of states of constant energy is the circular trajectory in position and momentum space shown in Fig. 3, bottom panel.
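A quick numerical illustration of this picture (a sketch, not part of the original text): integrate Newton's equations for a frictionless oscillator with unit mass and spring constant and verify that the trajectory stays on a circle of constant energy in the (x, p) plane. The integrator (velocity Verlet) and parameter values are arbitrary choices.

```python
from math import sqrt

# Frictionless 1D harmonic oscillator with m = k = 1, integrated with
# velocity Verlet; in these scaled units the phase-space point (x, p)
# should remain on the circle x**2 + p**2 = 2*E.
dt = 0.01
x, p = 1.0, 0.0                       # initial condition fixes the energy
E0 = 0.5 * (p * p + x * x)

radii_sq = []
for _ in range(10000):
    p += 0.5 * dt * (-x)              # half kick  (force = -k*x = -x)
    x += dt * p                       # drift      (m = 1)
    p += 0.5 * dt * (-x)              # half kick
    radii_sq.append(x * x + p * p)

print(E0, 0.5 * min(radii_sq), 0.5 * max(radii_sq))  # all nearly equal
```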

3. Molecular Chaos and Ergodic Hypothesis

What causes the huge reduction from 3N time-dependent coordinates needed to fully characterize a system at the molecular level to just a handful of time-independent thermodynamic variables at the macroscopic level? The answer turns out to be related to the chaotic behavior of systems with many coupled degrees of freedom.
Figure 4  A conceptual experiment in an isolated system.

To illustrate this concept, one can perform a thought experiment on the system shown in Fig. 4. In this system, a number of molecules of a gas are given identical velocities along the horizontal coordinate, and are placed in an insulated box with perfectly reflecting walls. Such a system would seem to violate the classical thermodynamics postulate of eventually approaching an equilibrium state. Since there is no initial momentum in the vertical direction, Newton's equations of motion would suggest that the molecules will never hit the top wall, which will thus experience zero pressure at all times, even though there is a finite density of gas in the system. In statistical mechanical terms, microstates with non-zero vertical momenta will never be populated in such a system. Of course, even tiny interactions between the molecules, or minor imperfections of the walls, will eventually result in chaotic motion in all directions; thermodynamic equilibrium will be established over times significantly greater than the interval between molecular collisions.
Most molecular systems are able to "forget" their initial conditions and evolve towards equilibrium states, by sampling all available phase space. Only non-equilibrium (for example, glassy) systems violate this condition and have properties that depend on their history – such systems cannot be analyzed with the methods of equilibrium thermodynamics or statistical mechanics.
At a molecular level, the statement equivalent to Postulate I of classical thermodynamics is the ergodic hypothesis. The statement is as follows.

Ergodic hypothesis: For sufficiently long times, systems evolve through all microscopic states consistent with the external and internal constraints imposed on them.

Experimental measurements on any macroscopic system are performed by observing it for a finite period of time, during which the system samples a very large number of possible microstates. The ergodic hypothesis suggests that for "long enough" times, the entire ensemble of microstates consistent with the microscopic constraints on the system will be sampled. A schematic illustration of trajectories of ergodic and non-ergodic systems in phase space is shown in Fig. 5 – a two-dimensional representation of phase space is given, whereas we know that for realistic systems phase space has a very large number of dimensions. The interior of the shaded region is the phase space of the corresponding system. For the system on the left, there are two connected regions of phase space, so the trajectory eventually passes from one to the other and samples the whole of phase space. By contrast, for the system on the right, the two regions of phase space are disconnected, so that the system cannot pass from one to the other. The system is non-ergodic.

Figure 5  Schematic trajectories of ergodic (left) and non-ergodic (right) systems in phase space.


There is a strong link between time scales and constraints; a system observed over a short time can appear to be isolated, while over longer times it may exchange energy and mass with its surroundings. Also, the "possible microstates" depend on the time scales of interest, and on internal constraints – for example, chemical reactions open up additional microstates, but can only occur over long time scales, or in the presence of a catalyst.

For systems satisfying the ergodic hypothesis, experimental measurements (performed by time averages) and statistical mechanical ensemble averages are equivalent. Of course, we have not specified anything up to this point about the relative probabilities of specific states; the ergodic hypothesis just states that all states will eventually be observed. A general property F can be formally written as:

F_{\text{observed}} = \sum_\nu P_\nu \times F_\nu = \langle F\rangle    (7)

where F_observed denotes the time average, P_ν the probability of finding the system in microstate ν, F_ν the value of property F in microstate ν, and ⟨F⟩ the ensemble average.

The objective of the next few sections will be to determine the relative probabilities, P_ν, of finding systems in given microstates of ensembles under varying constraints. This will allow the prediction of properties by performing ensemble averages, denoted by the angle brackets on the rightmost side of Eq. 7.
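A minimal numerical illustration of Eq. 7 (an assumption-laden sketch, not from the text) uses the frictionless oscillator of Fig. 3 as a stand-in ergodic system: the time average of the kinetic energy along the trajectory agrees with the average over microstates sampled uniformly on the constant-energy circle, since the oscillator covers that circle at constant speed.

```python
import numpy as np

# Frictionless oscillator with m = k = 1 and E = 1/2: x(t) = cos t, p(t) = -sin t.
# Time average of the kinetic energy p^2/2 along the trajectory:
t = np.linspace(0.0, 500.0, 500001)
time_avg = np.mean(0.5 * np.sin(t) ** 2)

# "Ensemble" average over the constant-energy circle, sampled uniformly in angle:
theta = np.random.default_rng(0).uniform(0.0, 2.0 * np.pi, 100000)
ens_avg = np.mean(0.5 * np.sin(theta) ** 2)

print(time_avg, ens_avg)   # both close to E/2 = 0.25
```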

4. Microcanonical Ensemble: Constant U, V, and N

The simplest set of macroscopic constraints that can be imposed on a system are those corresponding to isolation in a rigid, insulated container of fixed volume V. No energy can be exchanged through the boundaries of such a system, so Newton's equations of motion ensure that the total energy U of the system is constant. For historical reasons, conditions of constant energy, volume, and number of particles (U, V, N) are defined as the microcanonical ensemble. In the present section (and the one that follows), we will treat one-component systems for simplicity; generalization of the relationships to multicomponent systems is straightforward.
Figure 6  Two microstates in an isolated system.

How can we obtain the probabilities of microstates in a system under constant U, V, and N? Consider for a moment two microstates with the same total energy, depicted schematically in Fig. 6. One may be tempted to say that the microstate on the left, with all molecules having the same velocity and being at the same horizontal position, is a lot less "random" than the microstate on the right, and thus less likely to occur. This is, however, a misconception, akin to saying that the number "111111" is less likely to occur than the number "845192" in a random sequence of 6-digit numbers. In a random (uniformly distributed) sample, all numbers are equally likely to occur, by definition. For molecular systems, it is not hard to argue that any specific set of positions and velocities of N particles in a volume V that has a given total energy should be equally probable as any other specific set. There are, of course, a lot more states that "look like" the right-hand side of Fig. 6 relative to the left-hand side, the same way that there are a lot more 6-digit numbers with non-identical digits than there are with all digits equal. The statement that all microstates of a given energy are equally probable cannot be proved for the general case of interacting systems. Thus, we adopt it as the basic Postulate of statistical mechanics:

Basic Postulate of Statistical Mechanics: For an isolated system at constant U, V, and N, all microscopic states of a system are equally likely at thermodynamic equilibrium.

Just  as  was  the  case  for  the  postulates  of  classical  thermodynamics,  the  justification  
for  this  statement  is  that  predictions  using  the  tools  of  statistical  mechanics  that  rely  
on   this   postulate   are   in   excellent   agreement   with   experimental   observations   for  
many  diverse  systems,  provided  that  the  systems  are  ergodic.    
As already suggested, given that Ω(U,V,N) is the number of microstates with energy U, the probability of microstate ν in the microcanonical ensemble, according to the postulate above, is:

P_\nu = \frac{1}{\Omega(U,V,N)}    (8)
The function Ω(U,V,N) is called the density of states, and is directly linked to the entropy via Boltzmann's entropy formula, S = k_B \ln\Omega. It is also related to the fundamental equation in the entropy representation, with natural variables (U,V,N). Writing the differential form of the fundamental equation for a one-component system in terms of Ω, we obtain:

\frac{dS}{k_B} = d\ln\Omega = \beta\,dU + \beta P\,dV - \beta\mu\,dN
Here we have introduced for the first time the shorthand notation β ≡ 1/(k_B T). This combination appears frequently in statistical mechanics, and is usually called the "inverse temperature," even though strictly speaking it has units of inverse energy [J⁻¹]. Differentiation of the fundamental equation provides expressions for the inverse temperature, pressure, and chemical potential, in terms of derivatives of the logarithm of the number of microstates with respect to appropriate variables:
\left(\frac{\partial\ln\Omega}{\partial U}\right)_{V,N} = \beta    (9)

\left(\frac{\partial\ln\Omega}{\partial V}\right)_{U,N} = \beta P    (10)

\left(\frac{\partial\ln\Omega}{\partial N}\right)_{U,V} = -\beta\mu    (11)
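These derivative relations can be checked numerically for the ball-and-levels system of Fig. 1, for which Ω(U) = C(U+N−1, U) when the levels are unbounded (as in the enumeration sketch earlier). The fragment below is illustrative only: it estimates β = ∂lnΩ/∂U by a centred finite difference, with energies in units of k_B T_0, so β comes out in units of 1/(k_B T_0).

```python
from math import comb, log

N = 10
def ln_omega(U):
    # density of states of the Fig. 1 system with unbounded levels (stars and bars)
    return log(comb(U + N - 1, U))

for U in range(1, 6):
    beta = 0.5 * (ln_omega(U + 1) - ln_omega(U - 1))   # centred difference, Eq. 9
    print(U, round(beta, 3), round(1.0 / beta, 3))     # U, beta (in 1/(kB*T0)), T/T0
# beta decreases (temperature rises) as energy is added, since Omega grows rapidly with U
```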

Example 2 – A system with two states and negative temperatures

Consider a system of N distinguishable particles at fixed positions, each of which can exist either in a ground state of energy 0, or in an excited state of energy ε. The system is similar to that depicted in Fig. 1, except it has only 2 (rather than 3) energy levels. Assuming that there are no interactions between particles, derive expressions for the density of states and the temperature as a function of the energy, at the thermodynamic limit, N → ∞.
 
For a given total energy U = Mε, the number of possible states is given by the number of ways one can pick M objects out of N total particles:

\Omega(U) = \binom{N}{M} = \frac{N!}{M!\,(N-M)!}
At the limit of large N, we can use Stirling's approximation, \ln(N!) \approx N\ln N - N:

\ln\Omega(U) \approx N\ln N - N - M\ln M + M - (N-M)\ln(N-M) + N - M

\Rightarrow\quad \ln\Omega(U) \;\xrightarrow{\;N\to\infty\;}\; N\ln N - M\ln M - (N-M)\ln(N-M)
The temperature as a function of U is obtained from Eq. 9, taking into account that the volume V is not a relevant variable for this system:

\beta = \left(\frac{\partial\ln\Omega}{\partial U}\right)_N = \left(\frac{\partial\ln\Omega}{\partial(M\varepsilon)}\right)_N = \frac{1}{\varepsilon}\frac{\partial}{\partial M}\bigl(N\ln N - M\ln M - (N-M)\ln(N-M)\bigr) = \frac{1}{\varepsilon}\bigl(-\ln M - 1 + \ln(N-M) + 1\bigr)

\Rightarrow\quad \beta\varepsilon = \ln\frac{N-M}{M} = \ln\!\left(\frac{N}{M}-1\right) \quad\Rightarrow\quad \frac{k_B T}{\varepsilon} = \frac{1}{\ln\!\left(\dfrac{N}{M}-1\right)}
The possible values of the normalized energy ratio, U/U_max = M/N, range from 0 (every particle is in the ground state) to 1 (every particle is in the excited state). The relationship between M/N and the temperature is shown in Fig. 7. Low values of M/N correspond to low temperatures. Remarkably, the temperature approaches +∞ as M/N → ½ from below, and then returns from negative infinity to just below zero as M/N → 1.
 
 
 
 
 
 
 
 
 
 
   
Figure 7  Temperature versus normalized energy.
   
 
Do these results make sense? Can negative temperatures exist in nature? It turns out that the existence of negative temperatures is entirely consistent with thermodynamics. The system of this example has a density of states that is a decreasing function of the total energy U when more than half the particles are in the excited state. Most physical systems (e.g. molecules free to translate in space) have a monotonic increase in the number of states at higher energies, and thus cannot exist at negative temperatures; however, some spin systems‡ can closely approximate the system of this example. One needs to realize that negative temperatures are effectively higher than all positive temperatures, as energy will flow in the negative → positive direction on contact between two systems on opposite sides of the dashed line M/N = ½, corresponding to β = 0, or T = ±∞.

                                                                                                                         
‡ For a recent example of such a system, see Braun, S. B., et al., "Negative Absolute Temperature for Motional Degrees of Freedom," Science, 339:52–55 (2013).
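The closed-form result is easy to tabulate against the exact finite-N density of states. The sketch below is illustrative (N = 1000 and the values of M are arbitrary choices): it evaluates k_B T/ε = 1/ln(N/M − 1) and a centred finite difference of lnΩ(M) = ln C(N, M), showing the sign change of the temperature at M/N = ½.

```python
from math import comb, log

def kT_limit(x):                       # thermodynamic-limit result, x = M/N
    return 1.0 / log(1.0 / x - 1.0)

N = 1000
for M in (100, 300, 499, 501, 700, 900):
    # exact finite-N beta*eps from a centred difference of ln C(N, M)
    beta_eps = 0.5 * (log(comb(N, M + 1)) - log(comb(N, M - 1)))
    print(M / N, round(1.0 / beta_eps, 3), round(kT_limit(M / N), 3))
# kB*T/eps is large and positive just below M/N = 1/2, large and negative just above
```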


5. Canonical Ensemble: Constant N, V, and T

Just as in macroscopic thermodynamics, in statistical mechanics we are interested in developing relationships for systems subject to constraints other than constant U, V, and N. In classical thermodynamics, a change of variables is performed through the Legendre transform formalism, which will turn out to be highly relevant here as well. The first example to consider will be a system of fixed volume V and number of particles N, in thermal contact with a much larger reservoir, as shown in Figure 8. Because energy can be transferred between the small system and the reservoir without a significant change in the reservoir's properties, the small system is effectively at constant temperature, that of the reservoir. The set of microstates compatible with constant-NVT conditions is called the canonical ensemble. Going from the microcanonical (UVN) to the canonical (NVT) ensemble is akin to taking the first Legendre transformation of the fundamental equation in the entropy representation. Note that the order of variables (UVN → TVN) is important when performing Legendre transformations; however, convention dictates that the canonical ensemble is referred to as the "NVT" ensemble – the ordering of variables is unimportant once a given transformation has been performed.
Figure 8  A small system in contact with a large reservoir.

How do we derive the relative probabilities of microstates for the constant-temperature small system? The total system (small system + reservoir) is under constant-UVN conditions. In the previous sections of this chapter, we suggested that all Ω microstates of the total system, which is at constant energy, volume, and number of particles, are equally probable. However, a given microstate ν of the small system with energy U_ν is consistent with many possible microstates of the reservoir – the only constraint is that the energy of the reservoir is U_R = U − U_ν. The number of such microstates for the reservoir is

\Omega_R(U_R) = \Omega_R(U - U_\nu)

The probability of finding the small system in state ν is proportional to the number of such microstates,

P_\nu \propto \Omega_R(U - U_\nu) = \exp\!\left(\ln\!\left[\Omega_R(U - U_\nu)\right]\right)    (12)

We can Taylor-expand \ln\Omega_R around U, given that U_\nu is much smaller than U:

\ln\!\left[\Omega_R(U - U_\nu)\right] = \ln\!\left[\Omega_R(U)\right] - \frac{\partial\ln\Omega_R}{\partial U}\,U_\nu + \ldots    (13)
Substituting Eq. 13 back into Eq. 12 and using Eq. 9, we can incorporate the term involving \Omega_R(U) (which does not depend on the microstate ν) into the constant of proportionality for P_\nu:

P_\nu \propto \exp(-\beta U_\nu)    at constant N, V, T    (14)

This is a very important result. The probability of each microstate in the canonical ensemble (constant NVT) decreases exponentially for higher energies. The probability distribution of Eq. 14 is known as the Boltzmann distribution.

In order to find the absolute probability of each microstate, we need to normalize the probabilities so that their sum is 1. The normalization constant is called the canonical partition function, Q, and is obtained from a summation over all microstates,

Q(N,V,T) = \sum_{\text{all microstates }\nu} \exp(-\beta U_\nu)    (15)

The probability of each microstate can now be written explicitly as an equality:

P_\nu = \frac{\exp(-\beta U_\nu)}{Q}    at constant N, V, T    (16)

The probability of all microstates with a given energy U is a sum of Ω(U) equal terms, each at the volume V and number of molecules N of the system:

P(U) = \frac{\Omega(U)\exp(-\beta U)}{Q}    at constant N, V, T    (17)
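Eqs. 16–17 can be made concrete with the ball-and-levels system of §1, whose density of states is Ω(U) = C(U+N−1, U) (assuming unbounded levels, as in the earlier sketch). The fragment below, with an arbitrary choice of temperature, computes P(U) ∝ Ω(U)e^{−βU}; the distribution first rises with U (because Ω grows) and then decays (because the Boltzmann factor wins).

```python
from math import comb, exp

N, kT = 10, 1.0                        # temperature in units of T0
def omega(U):
    return comb(U + N - 1, U)          # density of states of the Fig. 1 system

U_max = 60                             # truncate once terms are negligible
Q = sum(omega(U) * exp(-U / kT) for U in range(U_max + 1))     # Eq. 15
for U in range(8):
    print(U, omega(U) * exp(-U / kT) / Q)                      # P(U), Eq. 17
```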
An important implication of these derivations is that the energy of a system at constant temperature is not strictly fixed; instead, it fluctuates as the system samples different microstates. This is in direct contrast with the postulate of classical thermodynamics that three independent variables (N, V, and T in this case) fully characterize the state of a system, including its energy. As will be analyzed in detail in the following chapter, fluctuations of quantities such as the energy are present in all finite systems, but their relative magnitude decreases with increasing system size. For macroscopic systems fluctuations are negligible for all practical purposes, except near critical points. In any statistical mechanical ensemble, however, we need to make a clear distinction between quantities that are strictly constant (constraints, or independent variables in the language of Legendre transformations), and those that fluctuate (derivatives).
Any thermodynamic property B can be obtained from a summation over microstates of the value of the property at a given microstate times the probability of observing the microstate:

\langle B\rangle = \sum_{\text{all microstates }\nu} B_\nu\,P_\nu        for any ensemble    (18)

For example, the ensemble average energy ⟨U⟩ in the canonical ensemble is given by:

\langle U\rangle = \frac{1}{Q}\sum_\nu U_\nu \exp(-\beta U_\nu)    at constant N, V, T    (19)
Let us calculate the temperature derivative of the canonical partition function Q. We have:

\frac{\partial\ln Q}{\partial\beta} = \frac{1}{Q}\sum_\nu \frac{\partial\exp(-\beta U_\nu)}{\partial\beta} = -\sum_\nu\frac{U_\nu\exp(-\beta U_\nu)}{Q} = -\langle U\rangle    (20)

The fundamental equation for S/k_B has U, V, and N as its variables and β, βP, and −βµ as its derivatives. Its first Legendre transform with respect to U is:

\frac{S}{k_B} - \beta U = \frac{S}{k_B} - \frac{U}{k_B T} = -\frac{U - TS}{k_B T} = -\beta A    (21)
This is a function of β, V, and N, with:

\left(\frac{\partial(-\beta A)}{\partial\beta}\right)_{V,N} = -U    (22)
Comparing Eqs. 20 and 22, we see that the former (obtained from statistical mechanics) gives the ensemble average energy, recognizing that the energy fluctuates under constant-NVT conditions. The latter expression, obtained from thermodynamics using Legendre transformations, does not involve averages of fluctuating quantities. At the thermodynamic limit, N → ∞, we can set the two expressions equal, and obtain a direct connection between the canonical partition function and the first Legendre transform of S/k_B:

-\beta A = \ln Q    (23)

Eq. 23 relates a thermodynamic quantity, the Helmholtz energy A, to a microscopic one, the partition function Q. This also allows us to confirm the Gibbs entropy formula for the case of a system at constant N, V, T, in the thermodynamic limit N → ∞:

-\sum_\nu P_\nu\ln P_\nu \;\overset{\text{Eq. 16}}{=}\; -\sum_\nu P_\nu\left(-\ln Q - \beta U_\nu\right) = \ln Q\sum_\nu P_\nu + \beta\sum_\nu P_\nu U_\nu = \ln Q + \beta U = \frac{-A + U}{k_B T} = \frac{S}{k_B}    (24)
Given that \ln Q is the first Legendre transformation of \ln\Omega, we can now express all the first derivatives of the canonical partition function Q, analogous to Eqs. 9–11 for the derivatives of the microcanonical partition function Ω:

\left(\frac{\partial\ln Q}{\partial\beta}\right)_{V,N} = -U    (25)

\left(\frac{\partial\ln Q}{\partial V}\right)_{T,N} = \beta P    (26)

\left(\frac{\partial\ln Q}{\partial N}\right)_{T,V} = -\beta\mu    (27)
These expressions are strictly true only in the thermodynamic limit N → ∞; for finite systems, for which we need to preserve the distinction between fluctuating and strictly constant quantities, the proper expressions involve ensemble averages; for example, the correct version of Eq. 25 is:

\left(\frac{\partial\ln Q}{\partial\beta}\right)_{V,N} = -\langle U\rangle    (28)
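Eq. 28 can be verified numerically for the same toy density of states used earlier: evaluate Q by direct summation, then compare −∂lnQ/∂β (estimated by a centred finite difference) with ⟨U⟩ computed from the Boltzmann weights. The truncation of the sum and the parameter values below are arbitrary; this is an illustrative sketch, not part of the original text.

```python
from math import comb, exp, log

N, U_max, beta = 10, 80, 1.0           # energies in units of kB*T0

def omega(U):
    return comb(U + N - 1, U)          # same toy density of states as before

def lnQ(b):
    return log(sum(omega(U) * exp(-b * U) for U in range(U_max + 1)))

weights = [omega(U) * exp(-beta * U) for U in range(U_max + 1)]
U_avg = sum(U * w for U, w in enumerate(weights)) / sum(weights)   # Eq. 19

h = 1e-4                               # centred finite difference for Eq. 28
print(U_avg, -(lnQ(beta + h) - lnQ(beta - h)) / (2 * h))  # should agree closely
```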
 

Example 3 – A system with two states in the NVT ensemble

Consider the system of N distinguishable particles at fixed positions, each of which can exist either in a ground state of energy 0, or in an excited state of energy ε, introduced in Example 2. Determine the mean energy ⟨U⟩ as a function of temperature in the canonical ensemble, and compare the result to the microcanonical ensemble calculation of Example 2.
 
We denote the state of each particle i = 1, 2, …, N by a variable l_i, which can take the values 0 or 1, denoting the ground or the excited state. The total energy is

U = \varepsilon\sum_{i=1}^{N} l_i

The partition function in the canonical ensemble is:

\ln Q = \ln\sum_\nu e^{-\beta U_\nu} \;\Rightarrow\; Q = \sum_{l_1, l_2, \ldots, l_N = 0,1}\exp\!\left(-\beta\varepsilon\sum_{i=1}^{N} l_i\right) = \sum_{l_1, l_2, \ldots, l_N = 0,1}\;\prod_{i=1}^{N}\exp(-\beta\varepsilon l_i)
Now we can use a mathematical identity that will turn out to be useful in all cases in which a partition function has contributions from many independent particles. The sum contains 2^N terms, each consisting of a product of N exponentials. We can regroup the terms in a different way, in effect switching the order of the summation and multiplication:

\sum_{l_1, l_2, \ldots, l_N = 0,1}\;\prod_{i=1}^{N}\exp(-\beta\varepsilon l_i) = \prod_{i=1}^{N}\sum_{l_i = 0,1} e^{-\beta\varepsilon l_i} = \left(1 + e^{-\beta\varepsilon}\right)^N

You can easily confirm that the "switched" product contains the same 2^N terms as before. The final result for the partition function is:

\ln Q = N\ln\left(1 + e^{-\beta\varepsilon}\right)

The ensemble average energy ⟨U⟩ is

\langle U\rangle = \left(\frac{\partial\ln Q}{\partial(-\beta)}\right)_{N,V} = N\,\frac{\varepsilon e^{-\beta\varepsilon}}{1 + e^{-\beta\varepsilon}} = \frac{N\varepsilon}{1 + e^{\beta\varepsilon}}
The microcanonical ensemble result can be written as:

\ln\!\left(\frac{N}{M}-1\right) = \frac{\varepsilon}{k_B T} = \beta\varepsilon \;\Rightarrow\; \frac{N}{M}-1 = e^{\beta\varepsilon} \;\Rightarrow\; M = \frac{N}{1 + e^{\beta\varepsilon}} \;\Rightarrow\; M\varepsilon = U = \frac{N\varepsilon}{1 + e^{\beta\varepsilon}}
The only difference is that in the canonical ensemble the energy is a fluctuating (rather than a fixed) quantity. Note also that the canonical ensemble derivation does not entail any approximations; the same result is valid for any N. By contrast, the microcanonical energy was obtained through the use of Stirling's approximation, valid as N → ∞. Small differences between the two ensembles are present for finite N.
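For a small N, the closed form can also be confirmed by brute-force summation over all 2^N microstates, a useful sanity check on the regrouping identity used above. The sketch below is illustrative; the values of N, ε, and β are arbitrary.

```python
from itertools import product
from math import exp

N, eps, beta = 6, 1.0, 0.7             # small N so that 2**N states are enumerable
states = list(product((0, 1), repeat=N))
energies = [eps * sum(s) for s in states]
weights = [exp(-beta * u) for u in energies]

Q_brute = sum(weights)
U_brute = sum(u * w for u, w in zip(energies, weights)) / Q_brute

Q_closed = (1.0 + exp(-beta * eps)) ** N            # lnQ = N ln(1 + e^{-beta eps})
U_closed = N * eps / (1.0 + exp(beta * eps))        # <U> = N eps / (1 + e^{beta eps})
print(Q_brute, Q_closed)
print(U_brute, U_closed)
```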

6. Generalized Ensembles and Legendre Transforms

The previous two sections illustrated how, starting from the fundamental equation in the entropy representation and Boltzmann's entropy formula, one can obtain relationships between macroscopic and microscopic quantities, first in the UVN (microcanonical) ensemble, with key quantity the density of states, Ω. A first Legendre transformation of U to T resulted in the canonical ensemble, with partition function Q. This process can be readily generalized to obtain relationships between microscopic and macroscopic quantities and partition functions for a system under arbitrary constraints.

 

In our derivation, we start from the multicomponent version of the fundamental equation in the entropy representation,

\frac{dS}{k_B} = d\ln\Omega = \beta\,dU + \beta P\,dV - \sum_{i=1}^{n}\beta\mu_i\,dN_i    (29)

We have already seen the first Legendre transformation:

y^{(1)} = \frac{S}{k_B} - \beta U = \beta(TS - U) = -\beta A    (30)
The relationships between the original function and the first transform are depicted in the table below.

  y^(0) = S/k_B = ln Ω                     y^(1) = −A/(k_B T) = ln Q
  Variable    Derivative                   Variable        Derivative
  U           1/(k_B T) = β                1/(k_B T) = β   −U
  V           P/(k_B T) = βP               V               P/(k_B T) = βP
  N_i         −µ_i/(k_B T) = −βµ_i         N_i             −µ_i/(k_B T) = −βµ_i
 
These relationships link microscopic to macroscopic properties and are strictly valid at the thermodynamic limit, N → ∞. One should keep in mind that in each ensemble, the variables of the corresponding transform are strictly held constant, defining the external constraints on a system, while the derivatives fluctuate – they take different values, in principle, for each microstate of the ensemble.

Probabilities of microstates in two statistical ensembles have already been derived. Microstate probabilities are all equal in the microcanonical ensemble, from the basic postulate of statistical mechanics; they are equal to the Boltzmann factor, exp(−βU), normalized by the partition function, for the canonical ensemble (Eq. 16). Note that the factor −βU that appears in the exponential for microstate probabilities in the canonical ensemble is exactly equal to the difference between the basis function and its first Legendre transform, -x_1\xi_1.
One can continue this with Legendre transforms of higher order. The probabilities of microstates in the corresponding ensembles can be derived in a way completely analogous to the derivation for the canonical ensemble, involving a subsystem and a bath of constant temperature, pressure, chemical potential, etc. In general, the kth transform of the basis function is:

y^{(0)} = \frac{S}{k_B}\,;\qquad y^{(k)} = \frac{S}{k_B} - \xi_1 x_1 - \xi_2 x_2 - \ldots - \xi_k x_k    (31)
where \xi_i is the derivative of y^{(0)} with respect to variable x_i. The probability of a microstate in the ensemble corresponding to the kth transform is given by

P_\nu \propto \exp(-\xi_1 x_1 - \xi_2 x_2 - \ldots - \xi_k x_k)    (32)

where the variables x_i and derivatives \xi_i refer to the original function y^{(0)}. The normalization factor (partition function) of the ensemble corresponding to the transformed function y^{(k)} is:

\Xi = \sum_{\text{all microstates }\nu}\exp(-\xi_1 x_1 - \xi_2 x_2 - \ldots - \xi_k x_k)    (33)

Using the partition function Ξ, the probability P_\nu can be written as an equality:

P_\nu = \frac{\exp(-\xi_1 x_1 - \xi_2 x_2 - \ldots - \xi_k x_k)}{\Xi}    (34)

As was the case for the canonical ensemble, the partition function Ξ is simply related to the transformed function y^{(k)}:

\ln\Xi = y^{(k)}    (35)
Example 4 – Gibbs Entropy Formula

The Gibbs entropy formula,

S = -k_B\sum_\nu P_\nu\ln P_\nu

was derived earlier for the microcanonical ensemble. Show that this relationship is valid for all statistical ensembles.
 
We use the expression for the probability of microstates, P_\nu, in a generalized ensemble, Eq. 34:

\sum_\nu P_\nu\ln P_\nu = \sum_\nu P_\nu\ln\frac{\exp(-\xi_1 x_1 - \xi_2 x_2 - \ldots - \xi_k x_k)}{\Xi} = \sum_\nu P_\nu\left(-\xi_1 x_1 - \xi_2 x_2 - \ldots - \xi_k x_k - \ln\Xi\right)
 
Now recall that the variables for the k-th transform are \xi_1, \xi_2, \ldots, \xi_k, x_{k+1}, \ldots, x_{n+2}, which are strictly constant in the corresponding ensemble, while the derivatives x_1, x_2, \ldots, x_k, \xi_{k+1}, \ldots, \xi_{n+2} fluctuate. We can rewrite the equation above taking this into account:

 

∑Pν ( −ξ1x1 − ξ2x2 −…ξkxk − lnΞ) =


ν
= − ξ1 ∑Pνx1 − ξ2 ∑Pνx2 −− ξ k ∑Pνxk − lnΞ =  
ν ν ν
= − ξ1 < x1 > −ξ2 < x2 > −ξ k < xk > −lnΞ
!
From Eqs. 31 and 35,

\ln\Xi = y^{(k)} = \frac{S}{k_B} - \xi_1 x_1 - \xi_2 x_2 - \ldots - \xi_k x_k
At the thermodynamic limit, N → ∞, there is no distinction between ensemble averages and thermodynamic properties, \langle x_i\rangle \equiv x_i. Replacing \ln\Xi and simplifying,

\sum_\nu P_\nu\ln P_\nu = -\frac{S}{k_B}\,,\quad QED
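As a numerical companion to this proof, the canonical ensemble of Example 3 provides a quick check (illustrative parameter values, not part of the text): −Σ_ν P_ν ln P_ν computed directly over all microstates should equal lnQ + β⟨U⟩ = S/k_B, as in Eq. 24.

```python
from itertools import product
from math import exp, log

N, eps, beta = 5, 1.0, 0.9
states = list(product((0, 1), repeat=N))            # all 2**N microstates
weights = [exp(-beta * eps * sum(s)) for s in states]
Q = sum(weights)
P = [w / Q for w in weights]

gibbs = -sum(p * log(p) for p in P)                 # -sum P ln P
U_avg = sum(p * eps * sum(s) for p, s in zip(P, states))
print(gibbs, log(Q) + beta * U_avg)                 # both equal S/kB (Eq. 24)
```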

Example 5 – Grand Canonical (µVT) Ensemble

The grand canonical (constant-µVT) ensemble is frequently used in computer simulations. Derive the partition function, probability of microstates, and derivative relationships in this ensemble for a 1-component system.
 
Starting from the fundamental equation in the entropy representation with ordering of variables y^{(0)} = S(U, N, V)/k_B = \ln\Omega, the grand canonical ensemble partition function corresponds to:

\ln\Xi = y^{(2)} = \frac{S}{k_B} - \beta U + \beta\mu N = \frac{TS - U + \mu N}{k_B T}

The microstates possible in this ensemble include all possible particle numbers from 0 to ∞, and all possible energy levels.
The Euler-integrated form of the fundamental equation is U = TS - PV + \mu N, so that the partition function of the grand canonical ensemble can be linked to the following thermodynamic property combination:

\ln\Xi = \frac{PV}{k_B T} = \beta PV
 
 


  y^(0) = S/k_B = ln Ω          y^(2) = βPV = ln Ξ
  Variable    Derivative        Variable    Derivative
  U           β                 β           −U
  N           −βµ               −βµ         −N
  V           βP                V           βP
For example, the average number of molecules in the system is given by:

\left(\frac{\partial\ln\Xi}{\partial(-\beta\mu)}\right)_{\beta,V} = -\langle N\rangle \;\Rightarrow\; k_B T\left(\frac{\partial\ln\Xi}{\partial\mu}\right)_{\beta,V} = \langle N\rangle
The probability of microstates in this ensemble is:

P_\nu = \frac{\exp(-\beta U_\nu + \beta\mu N_\nu)}{\Xi}\,,

where \Xi = \sum_\nu\exp(-\beta U_\nu + \beta\mu N_\nu) = \sum_{N=0}^{\infty} Q(N, V, T)\exp(\beta\mu N)
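As an illustration (this model is not part of the text), consider a toy adsorption system of M independent sites, each either empty (energy 0) or holding one particle with energy −ε, so that Q(N, V, T) = C(M, N)e^{βεN}. The sketch below builds Ξ by the summation over N written above, compares it with the closed form (1 + e^{β(µ+ε)})^M, and checks ⟨N⟩ = k_B T ∂lnΞ/∂µ numerically; all parameter values are arbitrary.

```python
from math import comb, exp, log

M, eps, beta, mu = 20, 1.0, 1.0, -0.5   # toy adsorption model (illustrative values)

def ln_Xi(mu_):
    # Xi = sum_N Q(N,V,T) exp(beta*mu*N), with Q(N) = C(M,N) exp(beta*eps*N)
    return log(sum(comb(M, n) * exp(beta * (eps + mu_) * n) for n in range(M + 1)))

# closed form for this model: ln Xi = M ln(1 + exp(beta*(mu + eps)))
print(ln_Xi(mu), M * log(1.0 + exp(beta * (mu + eps))))

h = 1e-5                                # <N> = kB*T (d ln Xi / d mu), checked numerically
N_avg = (ln_Xi(mu + h) - ln_Xi(mu - h)) / (2.0 * h) / beta
print(N_avg, M / (1.0 + exp(-beta * (mu + eps))))
```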

Example 6 – Constant-pressure (NPT) Ensemble

The constant-pressure (NPT) ensemble is also frequently used in computer simulations. Derive the partition function, probability of microstates, and derivative relationships in this ensemble for a 1-component system.
 
We start from the fundamental equation in the entropy representation with ordering of variables y^{(0)} = S(U, V, N)/k_B, and obtain the second transform:

\ln\Xi = y^{(2)} = \frac{S}{k_B} - \beta U - \beta PV = \frac{TS - U - PV}{k_B T} = -\beta\mu N = -\beta G

The microstates possible in this ensemble include all possible volumes from 0 to ∞, and all possible energy levels. The derivative table is shown below.
For example, the average volume is given by:

\left(\frac{\partial\ln\Xi}{\partial(\beta P)}\right)_{\beta,N} = -\langle V\rangle \;\Rightarrow\; k_B T\left(\frac{\partial\ln\Xi}{\partial P}\right)_{\beta,N} = -\langle V\rangle
 

 

 
  y^(0) = S/k_B = ln Ω          y^(2) = −βµN = ln Ξ
  Variable    Derivative        Variable    Derivative
  U           β                 β           −U
  V           βP                βP          −V
  N           −βµ               N           −βµ
The probability of microstates in this ensemble is:

P_\nu = \frac{\exp(-\beta U_\nu - \beta P V_\nu)}{\Xi}\,,\quad\text{where}\quad \Xi = \sum_\nu\exp(-\beta U_\nu - \beta P V_\nu)

