The main provisions and postulates of statistical thermodynamics. Statistical physics and thermodynamics

Statistical thermodynamics is the branch of statistical physics that formulates the laws connecting the molecular properties of substances with the thermodynamic quantities measured in experiment.

Statistical thermodynamics is devoted to justifying the laws of thermodynamics for equilibrium systems and to calculating thermodynamic functions from molecular constants. Its foundation is a set of hypotheses and postulates.

Unlike mechanics, it deals with the average values of coordinates and momenta and with the probabilities of their occurrence. The thermodynamic properties of a macroscopic system are treated as mean values of random variables or as characteristics of a probability density.

There is classical statistical thermodynamics (Maxwell, Boltzmann) and quantum statistical thermodynamics (Fermi, Dirac, Bose, Einstein).

The main hypothesis of statistical thermodynamics: there is an unambiguous connection between the molecular properties of the particles that make up the system and the macroscopic properties of the system.

An ensemble is a large, practically infinite number of identical thermodynamic systems in different microstates. In an ensemble with constant energy all microstates are equally probable. The time average of a physically observed quantity over a long period equals its ensemble average.

§ 1. Microstates and macrostates. Thermodynamic probability (statistical weight) and entropy. The Boltzmann formula. Statistical nature of the second law of thermodynamics

To describe a macrostate, a small number of variables is indicated (often two). To describe a microstate, the individual particles are specified, and six variables are introduced for each of them (three coordinates and three momentum components).

For a graphical representation of a microstate it is convenient to use phase space. One distinguishes the phase space of a single molecule (μ-space) and the phase space of the gas as a whole (Γ-space).

To count the number of microstates, Boltzmann used the cell method: the phase volume is divided into cells, each cell being large enough to hold several particles yet small compared with the whole volume.

If we assume that one cell corresponds to one microstate, then dividing the whole phase volume by the volume of a cell gives the number of microstates.

Suppose the phase volume is divided into three cells and the total number of particles in the system is nine. Let the first macrostate be 7 + 1 + 1, the second 5 + 2 + 2 and the third 3 + 3 + 3. Let us calculate the number of microstates by which each macrostate can be realized; this is the number of ways of distributing the particles over the cells, W = N!/(N_1! N_2! N_3!). In Boltzmann statistics the particles are considered distinguishable, i.e. exchanging particles between cells gives a new microstate while the macrostate remains the same.

The largest number of microstates belongs to the state in which the particles are distributed uniformly over the volume. The least probable state corresponds to the accumulation of the particles in one part of the system.
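A quick way to check these counts is to evaluate the multinomial coefficient directly; the short Python sketch below (the function name W is purely illustrative) reproduces the numbers for the three macrostates mentioned above.

```python
# Thermodynamic probability W = 9!/(n1! n2! n3!) of each macrostate of
# 9 distinguishable particles distributed over 3 cells.
from math import factorial

def W(*occupations):
    """Number of microstates realizing a given macrostate."""
    N = sum(occupations)
    result = factorial(N)
    for n in occupations:
        result //= factorial(n)
    return result

for macrostate in [(7, 1, 1), (5, 2, 2), (3, 3, 3)]:
    print(macrostate, "->", W(*macrostate), "microstates")
# (7, 1, 1) -> 72, (5, 2, 2) -> 756, (3, 3, 3) -> 1680: the uniform distribution wins
```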


Let us count the number of microstates when an Avogadro number of particles N_A is distributed over two cells: W = N_A! / [(N_A/2)! (N_A/2)!].

Apply the Stirling formula, ln N! ≈ N ln N − N; then ln W ≈ N_A ln 2.

If one particle jumps into the other cell, the resulting change in W is negligibly small.

Now take a system in which x particles have crossed over. For the change to become noticeable, the calculation shows that x ≈ 10¹².
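The estimate can be illustrated numerically. The sketch below assumes two cells and uses the exact ln N! (via lgamma) for a modest N to show that shifting x particles away from the even split lowers ln W by roughly 2x²/N, and then extrapolates this estimate to an Avogadro number of particles.

```python
# ln W for N particles split over two cells; the even split maximizes W.
from math import lgamma, log

def ln_W(N, n1):
    """ln of the number of microstates with n1 particles in cell 1 and N - n1 in cell 2."""
    return lgamma(N + 1) - lgamma(n1 + 1) - lgamma(N - n1 + 1)

# Exact check on a modest N: shifting x particles lowers ln W by about 2*x^2/N.
N = 10**6
for x in (1, 10, 1000):
    exact = ln_W(N, N // 2) - ln_W(N, N // 2 - x)
    print(f"x = {x:>5}: exact drop = {exact:.3e},  estimate 2x^2/N = {2 * x**2 / N:.3e}")

# Extrapolation to an Avogadro number of particles.
N_A, k_B = 6.022e23, 1.381e-23
print("ln W at the even split ~ N_A ln 2 =", N_A * log(2))
x = 1e12
print("drop in ln W for x = 1e12 shifted particles ~", 2 * x**2 / N_A)
print("corresponding entropy change ~", k_B * 2 * x**2 / N_A, "J/K (negligible)")
```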

As the system passes to the equilibrium state, the thermodynamic probability grows enormously, and the entropy grows as well. Consequently, S = f(W).

Let us find the form of this function. Take a system of two cells: in the first state all N_A particles are in one cell (N_A + 0), in the second they are divided equally (0.5 N_A + 0.5 N_A). The temperature is constant, so the transition from the first state to the second is an isothermal expansion of the gas, for which ΔS = R ln(V_2/V_1) = R ln 2.

On the statistical side, ln W_2 − ln W_1 = N_A ln 2 (since W_1 = 1 and, by the Stirling formula, ln W_2 ≈ N_A ln 2).

Assuming S is proportional to ln W, we have R ln 2 = const · N_A ln 2, whence the proportionality coefficient is const = R/N_A = k, the Boltzmann constant. Now let us obtain the Boltzmann formula in general form.

Take two systems with entropies S_1, S_2 and thermodynamic probabilities W_1, W_2.

From the two systems we form a third; since entropy is additive, the entropy of the new system will be S_3 = S_1 + S_2.

The probability of two independent systems is multiplicative: W_3 = W_1 · W_2.

The only function that turns a product into a sum is the logarithm: S ∝ ln W.

But entropy is a dimensional quantity, so a proportionality coefficient is needed, and this is the Boltzmann constant: S = k ln W.

Here a subtle point arises: the conclusion that the entropy is maximal at equilibrium is not an absolute law but a statistical one. As can be seen, the smaller the number of particles, the less strictly the second law of thermodynamics is obeyed.

§ 2. Distribution of molecules over energies. The Boltzmann law

Consider a system of N particles with fixed total energy E. How are the molecules distributed over the energies? What number of molecules N_i has energy ε_i?

In the equilibrium state the entropy has its maximum value:

S = k ln W, where W = N! / (N_1! N_2! …). (1)

Now let us find ln W. By the Stirling formula, ln W = N ln N − Σ N_i ln N_i.

Find the differentials; at the maximum

d ln W = −Σ ln N_i dN_i = 0, (2)

subject to the conditions Σ N_i = N (3) and Σ ε_i N_i = E (4), i.e. Σ dN_i = 0 and Σ ε_i dN_i = 0.

In equation (2) not all the dN_i are independent: they are connected by conditions (3) and (4).

In order to get rid of the dependent variables, we use the method of undetermined Lagrange multipliers: the conditions Σ dN_i = 0 and Σ ε_i dN_i = 0 are multiplied by α and β respectively and added to (2), giving

Σ (ln N_i + α + β ε_i) dN_i = 0. (5)

The multipliers are chosen so that the coefficients of the dependent variables become equal to zero.

Then the remaining terms of the sum contain only independent dN_i, and each coefficient must vanish separately. Finally it turns out that

ln N_i = −α − β ε_i.

Potentiating this equation: N_i = e^(−α) e^(−β ε_i). (6)

Summing over all levels: Σ N_i = e^(−α) Σ e^(−β ε_i).

Substituting into (3): e^(−α) = N / Σ e^(−β ε_i). (7)

It remains to get rid of the other multiplier, β. Taking the logarithm of equation (6), multiplying by dN_i and summing, we obtain d ln W = −Σ ln N_i dN_i = β Σ ε_i dN_i = β dU (the term with α vanishes because Σ dN_i = 0). Comparison with dS = dU/T and S = k ln W gives β = 1/kT.

The undetermined Lagrange multiplier has thus become definite.

Finally, the Boltzmann law is written as:

N_i = N e^(−ε_i/kT) / Σ_j e^(−ε_j/kT). (8)

Substituting into (8) the notation q = Σ_j e^(−ε_j/kT) for the denominator, we get N_i = (N/q) e^(−ε_i/kT).

The quantity e^(−ε_i/kT) is the Boltzmann factor.

Sometimes the Boltzmann distribution is also written as the ratio of the populations of two levels: N_i/N_j = e^(−(ε_i − ε_j)/kT).

Accordingly, at a temperature close to absolute zero the exponential factors of all excited levels vanish, i.e. there are no molecules on the excited levels. At a temperature tending to infinity the distribution over all levels becomes uniform.

q = Σ_i e^(−ε_i/kT) is the sum over states (partition function) of the molecule.
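As a numerical illustration of these limits, the sketch below evaluates the populations N_i/N for a hypothetical molecule with three equally spaced levels (the level energies are assumed purely for illustration).

```python
import numpy as np

k = 1.380649e-23                                 # J/K
levels = np.array([0.0, 1.0, 2.0]) * 1e-21       # J, assumed level energies

def populations(T):
    boltzmann_factors = np.exp(-levels / (k * T))
    q = boltzmann_factors.sum()                  # molecular sum over states
    return boltzmann_factors / q                 # N_i / N

for T in (1.0, 100.0, 1e6):
    print(f"T = {T:>9.0f} K  ->  N_i/N = {np.round(populations(T), 3)}")
# T -> 0: everything collects on the ground level; T -> infinity: all levels equally populated.
```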


§ 3. The sum over states of a molecule and its connection with thermodynamic properties

Let us find out what properties the molecular sum over states possesses. First, it is a dimensionless quantity, and its value is determined by the temperature, the number of particles and the volume of the system. It also depends on the mass of the molecule and on its forms of motion.

Further, the sum over states is not an absolute quantity: it is defined only up to a constant factor, because its value depends on the choice of the zero of energy. For this zero one usually takes the absolute zero of temperature and the state of the molecule with the minimal quantum numbers.

The sum over states is a monotonically increasing function of temperature:

q = Σ_i e^(−ε_i/kT).

The greater the energy of thermal motion (the higher the temperature), the larger the sum over states.

The molecular sum over states has the property of multiplicativity. The energy of a molecule can be represented as the sum of the translational and the intramolecular energy, ε = ε_tr + ε_int. Then the sum over states factorizes:

q = q_tr · q_int.

One can go further and write q_int = q_rot · q_vib · q_el · q_nucl.

The excitation of electronic levels requires a high temperature; at not too high temperatures the contribution of electronic excitation is close to zero, and the electronic factor reduces to that of the ground electronic state.

The ground electronic state is taken as the zero level of energy.

All of this relies on the Born–Oppenheimer approximation (the separability of electronic and nuclear motion).

Suppose that several quantum states have the same energy; then the corresponding terms of the sum can be grouped together.

If the remaining states also coincide in energy within groups, the sum over states becomes a sum over levels:

q = Σ_i g_i e^(−ε_i/kT),

where g_i is the degeneracy of level i.

This form of writing is called the sum over the energy levels of the molecule.

The sum over states is connected with the thermodynamic properties of the system.

Taking the temperature derivative of ln Q (the sum over states of the system) gives the internal energy:

U − U_0 = kT² (∂ ln Q/∂T)_V.

The expression obtained for the entropy is

S = (U − U_0)/T + k ln Q = kT (∂ ln Q/∂T)_V + k ln Q.

The Helmholtz energy:

A − U_0 = (U − U_0) − TS = −kT ln Q.

Find the pressure:

p = −(∂A/∂V)_T = kT (∂ ln Q/∂V)_T.

The enthalpy and the Gibbs energy:

H − U_0 = kT² (∂ ln Q/∂T)_V + kTV (∂ ln Q/∂V)_T,
G − U_0 = −kT ln Q + kTV (∂ ln Q/∂V)_T.

The heat capacity remains:

C_V = (∂U/∂T)_V = ∂[kT² (∂ ln Q/∂T)_V]/∂T.

Two remarks: first, all the quantities are counted from the zero-point energy U_0; second, these equations are written for systems in which the particles can be considered distinguishable. In an ideal gas the molecules are indistinguishable, which must be taken into account when constructing the sum over states of the system (see § 5).
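A small sketch of how these formulas are used in practice: for a single vibrational degree of freedom in the harmonic approximation (an assumed characteristic temperature θ = 1000 K, energies counted from the zero-point level), ln q is differentiated numerically to give U, S and C_V per molecule, and U is compared with the analytic result.

```python
import numpy as np

k = 1.380649e-23      # J/K
theta = 1000.0        # K, assumed vibrational characteristic temperature

def ln_q(T):
    return -np.log(1.0 - np.exp(-theta / T))     # ln of the vibrational sum over states

def U(T, h=1e-3):                                # U - U0 = k T^2 d(ln q)/dT
    dlnq_dT = (ln_q(T + h) - ln_q(T - h)) / (2 * h)
    return k * T**2 * dlnq_dT

T = 300.0
S = k * ln_q(T) + U(T) / T                       # S = k ln q + (U - U0)/T
Cv = U(T + 0.5) - U(T - 0.5)                     # C_V = dU/dT (finite difference, step 1 K)
print("U - U0 =", U(T), "J;   analytic:", k * theta / (np.exp(theta / T) - 1.0))
print("S      =", S, "J/K")
print("C_V    =", Cv, "J/K")
```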

§ 4. Canonical distribution of Gibbs

Gibbs proposed the method of statistical, or thermodynamic, ensembles. An ensemble is a large (tending to infinity) number of identical thermodynamic systems in various microstates. The microcanonical ensemble is characterized by constant E, V, N; the canonical ensemble by constant T, V, N. The Boltzmann distribution was derived for a microcanonical ensemble; we now turn to the canonical one.

What is the probability of a particular microstate of a system placed in a thermostat?

Imagine a large thermostat and place in it an ensemble of identical systems in various microstates. Let M be the number of systems in the ensemble, and let M_i of them be in state i.

In the canonical ensemble states with different energies can be realized, so the probabilities should be expected to depend on the energy of the state to which they belong.

Consider a macrostate in which the energy of the system equals E and its entropy equals S. To this macrostate there correspond W = e^(S/k) microstates.

The Helmholtz energy of the system in the thermostat, A = U − TS, is constant.

If the internal energy is identified with the energy E of the given state, then S = (E − A)/T and W = e^((E − A)/kT).

Then the probability of one microstate is p = 1/W = e^((A − E)/kT).

Thus the probabilities referring to different energies depend on the energy of the system and may be different.

p_i = e^((A − E_i)/kT) = e^(−E_i/kT) / Z is the canonical Gibbs distribution, where Z = Σ_i e^(−E_i/kT) = e^(−A/kT) is the sum over states of the system.

P_i = W_i e^((A − E_i)/kT) is the probability of a macrostate with energy E_i realized by W_i microstates; the macrostate for which this product is largest is the most probable.
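The following short sketch evaluates the canonical probabilities for a system with a handful of assumed energy levels (given here in units of kT for simplicity) and checks that they are normalized.

```python
import numpy as np

E_over_kT = np.array([0.0, 0.5, 1.0, 2.0, 4.0])   # assumed reduced energies E_i / kT
weights = np.exp(-E_over_kT)
Z = weights.sum()                                  # sum over states of the system
p = weights / Z                                    # canonical Gibbs distribution

print("Z =", Z)
print("p_i =", np.round(p, 4), " sum =", p.sum())  # probabilities sum to 1
print("<E>/kT =", (p * E_over_kT).sum())           # average energy from the distribution
```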

§ 5. The sum over states of the system and its connection with thermodynamic functions

The sum over states of the system:

Z = Σ_i e^(−E_i/kT),

where the summation runs over all microstates of the system.

The sum over states of the system has the property of multiplicativity. If the energy of the system can be represented as the sum of the energies of the individual molecules, E = ε_1 + ε_2 + … + ε_N, then Z = q^N.

It turns out that this relation is valid for a system of localized (distinguishable) particles. The number of microstates for delocalized (indistinguishable) particles is much smaller, by a factor of N!. Then Z = q^N / N!.

Using this property together with the Stirling formula, we get ln Z = N ln q − N ln N + N.
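The difference between the two cases can be made concrete with a short sketch: for an assumed molecular sum over states q of translational magnitude and N = N_A molecules, ln Z is computed for localized particles, for an ideal gas with the exact ln N!, and with the Stirling approximation.

```python
from math import lgamma, log

q = 1.0e30          # assumed molecular sum over states (typical translational magnitude)
N = 6.022e23        # number of molecules

lnZ_localized = N * log(q)                               # Z = q^N
lnZ_gas_exact = N * log(q) - lgamma(N + 1)               # Z = q^N / N!
lnZ_gas_stirling = N * log(q) - (N * log(N) - N)         # ln N! ~ N ln N - N

print("ln Z (localized)           =", lnZ_localized)
print("ln Z (ideal gas, exact)    =", lnZ_gas_exact)
print("ln Z (ideal gas, Stirling) =", lnZ_gas_stirling)
```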

§ 6. The translational sum over states.
Thermodynamic properties of a monatomic ideal gas

Consider a monatomic ideal gas. The molecule is treated as a point mass able to move in space. The energy of such a particle is purely translational:

ε = (p_x² + p_y² + p_z²) / 2m.

Such motion has three degrees of freedom, so we represent this energy as the sum of three components, ε = ε_x + ε_y + ε_z, and consider the motion along the coordinate x.

From quantum mechanics, for a particle in a one-dimensional box of length a:

ε_n = n² h² / (8 m a²), n = 1, 2, 3, …

Substituting these levels into the sum over states and replacing the summation by integration (the levels lie extremely close together) gives q_x = (2πmkT)^(1/2) a / h and, for the three-dimensional motion, q_tr = (2πmkT)^(3/2) V / h³.
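A sketch of the size of this quantity: for argon at 298 K in a volume of 1 dm³ (conditions assumed purely for illustration) the translational sum over states is evaluated below.

```python
import math

h = 6.626e-34        # J s
k = 1.381e-23        # J/K
N_A = 6.022e23       # 1/mol

m = 39.95e-3 / N_A   # kg, mass of one Ar atom
T = 298.0            # K
V = 1.0e-3           # m^3 (1 litre)

q_tr = (2 * math.pi * m * k * T) ** 1.5 * V / h ** 3
print(f"q_tr(Ar, 298 K, 1 L) = {q_tr:.3e}")
# An enormous number (~10^29), which is why the translational levels may be treated as a continuum.
```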

Basic concepts

Basic knowledge.

Statistical interpretation of the concepts of internal energy, work of a subsystem and quantity of heat; justification of the first law of thermodynamics with the help of the canonical Gibbs distribution; statistical substantiation of the third law of thermodynamics; properties of macrosystems as T → 0; the physical meaning of entropy; conditions of stability of a thermodynamic system.

Basic skills.

To work independently with the recommended literature; to define the concepts of item 1; to be able, using the mathematical apparatus, to justify logically the elements of knowledge from item 2; from a known statistical sum (statistical integral), to determine the internal energy of the system, the Helmholtz free energy, the Gibbs free energy, the entropy, the equation of state, etc.; to determine the direction of evolution of an open system when the corresponding pairs of thermodynamic variables are held constant.

Internal energy of a macroscopic system.

The basis of statistical thermodynamics is the following statement: the internal energy of a macroscopic body is identical to its average energy calculated by the laws of statistical physics:

U = ⟨E⟩ = Σ_i E_i w_i. (2.2.1)

Substituting the canonical Gibbs distribution w_i = e^(−βE_i)/Z (with β = 1/kT), we get:

U = (1/Z) Σ_i E_i e^(−βE_i). (2.2.2)

The numerator of the right-hand side of equality (2.2.2) is, up to sign, the derivative of Z with respect to β:

Σ_i E_i e^(−βE_i) = −∂Z/∂β.

Therefore, expression (2.2.2) can be rewritten in a more compact form:

U = −∂ ln Z/∂β = kT² (∂ ln Z/∂T)_V. (2.2.3)

Thus, to find the internal energy of the system it is enough to know its statistical sum Z.
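A numerical check (a sketch with an assumed level spacing) of formula (2.2.3) for the simplest case, a two-level system with energies 0 and ε, for which Z = 1 + e^(−βε) and the analytic result is U = ε / (e^(βε) + 1):

```python
import numpy as np

k = 1.380649e-23     # J/K
eps = 2.0e-21        # J, assumed energy of the excited level
T = 300.0
beta = 1.0 / (k * T)

def ln_Z(b):
    return np.log(1.0 + np.exp(-b * eps))

db = beta * 1e-6
U_numeric = -(ln_Z(beta + db) - ln_Z(beta - db)) / (2 * db)   # U = -d ln Z / d beta
U_analytic = eps / (np.exp(beta * eps) + 1.0)
print("U (numeric)  =", U_numeric, "J")
print("U (analytic) =", U_analytic, "J")
```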

The second law of thermodynamics and the "arrow of time".

Entropy of an isolated system in a non-equilibrium state.

If the system is in an equilibrium state or participates in a quasistatic process, its entropy, from the molecular point of view, is determined by the number of microstates corresponding to the macrostate of the system with the average value of the energy:

S = k ln W(⟨E⟩).

The entropy of an isolated system in a non-equilibrium state is determined by the number of microstates corresponding to the given non-equilibrium macrostate:

S_noneq = k ln W_noneq, and since W_noneq < W_eq, we have S_noneq < S_eq.

The third law of thermodynamics.

The third law of thermodynamics characterizes the properties of a thermodynamic system at very low temperatures (T → 0). Let the smallest possible energy of the system be E_0 and the energies of the excited states be E_1, E_2, … . At a very low temperature the average energy of thermal motion kT ≪ E_1 − E_0; therefore the energy of thermal motion is not sufficient to transfer the system to an excited state. The entropy is S = k ln W_0, where W_0 is the number of states of the system with energy E_0 (that is, in the ground state). Hence W_0 equals one for a non-degenerate ground state and, in the presence of degeneracy, is a small number. Consequently the entropy of the system S = k ln W_0 ≈ 0, and in either case it can be considered equal to zero (k ln W_0 is a very small number). Since entropy is determined only up to an arbitrary constant, this statement is sometimes formulated as: S → const as T → 0. The value of this constant does not depend on the pressure, the volume or the other parameters characterizing the state of the system.
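The low-temperature limit can be illustrated with a sketch: for a model system whose ground level has an assumed degeneracy g₀ and whose excited level lies ε above it, the entropy S = k ln Z + U/T tends to k ln g₀ as T → 0 and begins to grow only when kT becomes comparable with ε.

```python
import numpy as np

k = 1.380649e-23     # J/K
g0, g1 = 2, 1        # assumed degeneracies of the ground and excited levels
eps = 1.0e-21        # J, assumed excitation energy

def entropy(T):
    w = np.array([g0, g1 * np.exp(-eps / (k * T))])   # Boltzmann weights
    Z = w.sum()
    U = (w * np.array([0.0, eps])).sum() / Z          # average energy above the ground level
    return k * np.log(Z) + U / T

for T in (0.1, 1.0, 10.0, 100.0, 1000.0):
    print(f"T = {T:>7} K   S = {entropy(T):.3e} J/K")
print("k ln g0 =", k * np.log(g0), "J/K  (low-temperature limit)")
```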

Questions for self-test.

1. Formulate the postulates of phenomenological thermodynamics.

2. Formulate the second law of thermodynamics.

3. What is Narlykar's thought experiment?

4. Prove that the entropy of an isolated system increases in non-equilibrium processes.

5. The concept of internal energy.

6. Under what conditions (in what cases) can the state of a system be considered an equilibrium state?

7. What processes are called reversible and irreversible?

8. What is a thermodynamic potential?

9. Write down the thermodynamic functions.

10. Explain how low temperatures are obtained by adiabatic demagnetization.

11. The concept of negative temperature.

12. Express the thermodynamic parameters in terms of the sum over states.

13. Write down the fundamental thermodynamic equality for a system with a variable number of particles.

14. Explain the physical meaning of the chemical potential.


Tasks.

1. Prove the fundamental thermodynamic equality.

2. Find the expression for the thermodynamic potential of free energy F in terms of the integral over states Z of the system.

3. Find the expression for the entropy S in terms of the integral over states Z of the system.

4. Find the dependence of the entropy S of an ideal monatomic gas of N particles on the energy E and the volume V.

5. Derive the fundamental thermodynamic equality for a system with a variable number of particles.

6. Derive the grand canonical distribution.

7. Calculate the free energy of a monatomic ideal gas.

II. Statistical thermodynamics.

Basic concepts

Quasistatic process; the zeroth postulate of phenomenological thermodynamics; the first postulate of phenomenological thermodynamics; the second postulate of phenomenological thermodynamics; the third postulate of phenomenological thermodynamics; the concept of internal energy; state function; process function; the fundamental thermodynamic equality; the concept of entropy for an isolated non-equilibrium system; the concept of local instability of phase trajectories (particle trajectories); mixing systems; reversible process; irreversible process; thermodynamic potential; Helmholtz free energy; Gibbs free energy; the Maxwell relations; generalized coordinates and generalized forces; the extremum principles in thermodynamics; the Le Chatelier–Braun principle.

As a result of studying the material of Chapter 9, the student must: know the main postulates of statistical thermodynamics; be able to calculate sums over states and know their properties; use the terms and definitions given in the chapter;

possess the special terminology and the skills of calculating the thermodynamic functions of ideal gases by statistical methods.

The main postulates of statistical thermodynamics

The thermodynamic method is not applicable to systems consisting of a small number of molecules, since in such systems the difference between heat and work disappears. At the same time the unique direction of a process disappears:

for a very small number of molecules both directions of a process become equivalent. For an isolated system the increment of entropy is either equal to the reduced heat (for equilibrium, reversible processes) or greater than it (for non-equilibrium ones). Such a dual character of entropy can be explained in terms of the order or disorder of the motion or state of the particles making up the system; consequently, entropy can qualitatively be regarded as a measure of the disorder of the molecular state of the system. These qualitative ideas are developed quantitatively by statistical thermodynamics. Statistical thermodynamics is part of a more general branch of science, statistical mechanics.

The basic principles of statistical mechanics were developed at the end of the nineteenth century in the works of L. Boltzmann and J. Gibbs.

When describing systems consisting of a large number of particles, two approaches can be used: microscopic and macroscopic. The macroscopic approach is used by classical thermodynamics, where the state of a system containing a single pure substance is specified, in general, by three independent variables: T (temperature), V (volume) and N (number of particles). From the microscopic point of view, however, a system containing 1 mol of substance includes 6.02·10²³ molecules. In the first approach the microstate of the system is characterized in detail,

for example by the coordinates and momenta of each particle at each instant of time. A microscopic description requires solving the classical or quantum equations of motion for a huge number of variables. Thus, each microstate of an ideal gas in classical mechanics is described by 6N variables (N is the number of particles): 3N coordinates and 3N momentum projections.

If the system is in an equilibrium state, its macroscopic parameters are constant, whereas the microscopic parameters change with time. This means that each macrostate corresponds to many (in fact, infinitely many) microstates (Fig. 9.1).

Fig. 9.1.

Statistical thermodynamics establishes a link between these two approaches. The basic idea is as follows: if many microstates correspond to each macrostate, then each of them contributes to the macrostate. The properties of the macrostate can then be calculated as the average over all microstates, i.e. by summing their contributions with allowance for their statistical weights.

Averaging over microstates is carried out with the help of the concept of a statistical ensemble. An ensemble is an infinite set of identical systems in all possible microstates corresponding to one macrostate. Each system of the ensemble is one microstate. The whole ensemble is described by a distribution function over the coordinates and momenta ρ(p, q, t), which is defined as follows: ρ(p, q, t) dp dq is the probability that a system of the ensemble is located in the volume element dp dq near the point (p, q) at the instant of time t.

The meaning of the distribution function is that it determines the statistical weight of each microstate in the macrostate.

From this definition follow the elementary properties of the distribution function: normalization,

∫ ρ(p, q, t) dp dq = 1, (9.1)

and positive definiteness,

ρ(p, q, t) ≥ 0. (9.2)

Many macroscopic properties of the system can be defined as the mean values of functions of the coordinates and momenta f(p, q) over the ensemble:

⟨f⟩ = ∫ f(p, q) ρ(p, q, t) dp dq. (9.3)

For example, the internal energy is the mean value of the Hamiltonian function H(p, q):

U = ⟨H(p, q)⟩. (9.4)

The existence of the distribution function constitutes the essence of the main postulate of classical statistical mechanics: the macroscopic state of the system is completely specified by some distribution function that satisfies conditions (9.1) and (9.2).

For equilibrium systems and equilibrium ensembles the distribution function does not depend explicitly on time: ρ = ρ(p, q). The explicit form of the distribution function depends on the type of ensemble. There are three main types of ensembles:

1) the microcanonical ensemble describes isolated systems and is characterized by the variables E (energy), V (volume), N (number of particles); in an isolated system all microstates are equally probable (the postulate of equal a priori probabilities):

ρ = const for H(p, q) = E; (9.5)

2) the canonical ensemble describes systems that are in thermal equilibrium with their environment; thermal equilibrium is characterized by the temperature T, so the distribution function also depends on the temperature:

ρ(p, q) = const · exp[−H(p, q)/kT], (9.6)

where k = 1.38·10⁻²³ J/K is the Boltzmann constant. The value of the constant in expression (9.6) is determined by the normalization condition.

A particular case of the canonical distribution (9.6) is the Maxwell distribution over the speed v, which is valid for gases:

ρ(v) = 4π (m/2πkT)^(3/2) v² exp(−mv²/2kT), (9.7)

where m is the mass of a gas molecule. The expression ρ(v) dv describes the probability that a molecule has an absolute value of its speed in the range from v to v + dv. The maximum of function (9.7) gives the most probable speed of the molecules, and the integral

⟨v⟩ = ∫₀^∞ v ρ(v) dv (9.8)

gives the mean speed of the molecules.
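As a sketch of how (9.7) is used, the most probable speed v_p = (2kT/m)^(1/2) and the mean speed ⟨v⟩ = (8kT/πm)^(1/2) that follow from the Maxwell distribution are evaluated below for oxygen at 298 K (gas and temperature assumed for illustration).

```python
import math

k = 1.381e-23        # J/K
N_A = 6.022e23       # 1/mol
M = 32.0e-3          # kg/mol, molar mass of O2
m = M / N_A          # kg, mass of one molecule
T = 298.0            # K

v_probable = math.sqrt(2 * k * T / m)
v_mean = math.sqrt(8 * k * T / (math.pi * m))
print(f"most probable speed = {v_probable:.0f} m/s")
print(f"mean speed          = {v_mean:.0f} m/s")
```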

If the system has discrete energy levels and is described quantum mechanically, then instead of the Hamiltonian function H(p, q) one uses the Hamiltonian operator Ĥ, and instead of the distribution function the density-matrix operator ρ̂:

ρ̂ = const · exp(−Ĥ/kT). (9.9)

The diagonal elements of the density matrix give the probability that the system is in the i-th energy state and has energy E_i:

ρ_i = const · exp(−E_i/kT). (9.10)

The value of the constant is determined by the normalization condition Σ_i ρ_i = 1:

const = 1 / Σ_i exp(−E_i/kT). (9.11)

The denominator of this expression is called the sum over states. It plays a key role in the statistical evaluation of the thermodynamic properties of the system. From expressions (9.10) and (9.11) one can find the number of particles N_i having energy E_i:

N_i = N · exp(−E_i/kT) / Σ_j exp(−E_j/kT), (9.12)

where N is the total number of particles. The distribution of particles over energy levels (9.12) is called the Boltzmann distribution, and the numerator of this distribution is the Boltzmann factor (multiplier). Sometimes this distribution is written in another form: if there are several levels with the same energy E_i, they are combined into one group by summing the Boltzmann factors:

N_i = N · g_i exp(−E_i/kT) / Σ_j g_j exp(−E_j/kT), (9.13)

where g_i is the number of levels with energy E_i, or the statistical weight.

Many macroscopic parameters of a thermodynamic system can be calculated using the Boltzmann distribution. For example, the average energy is defined as the mean over the energy levels, taking into account their statistical weights:

⟨E⟩ = Σ_i g_i E_i exp(−E_i/kT) / Σ_i g_i exp(−E_i/kT); (9.14)

3) the grand canonical ensemble describes open systems that are in thermal equilibrium and can exchange matter with the environment. Thermal equilibrium is characterized by the temperature T, and equilibrium with respect to the number of particles by the chemical potential μ. Therefore the distribution function depends on the temperature and on the chemical potential. We shall not use an explicit expression for the distribution function of the grand canonical ensemble here.

In statistical theory it is proved that for systems with a large number of particles (~10²³) all three types of ensembles are equivalent to one another. The use of any ensemble leads to the same thermodynamic properties, so the choice of a particular ensemble for describing a thermodynamic system is dictated only by the convenience of the mathematical treatment of the distribution functions.

Thermodynamic system, the collective and its state. The ensemble method. Entropy and probability. The canonical Gibbs ensemble. The canonical distribution. The Gibbs factor. Probability, free energy and the statistical sum.

System and subsystems. General properties of statistical sums. The statistical sum of a single particle and of the collective.

Ideal gas. The Boltzmann distribution. The Boltzmann factor. Quantum states and discrete levels of simple molecular motions. Statistical weight (degeneracy). Sums over levels and sums over states.

Localized and delocalized systems. The translational sum over states, the indistinguishability of particles, the standard volume. The rotational sum over levels of a diatomic molecule, orientational indistinguishability and the symmetry number. Statistical sums for one and several rotational degrees of freedom. The vibrational statistical sum in the harmonic approximation. Corrections to the statistical sums of simple motions. The zero level of vibrations, the molecular energy scale and the molecular sum over states.

The free energy A and the statistical formulas for the thermodynamic functions: entropy S, pressure p, internal energy U, enthalpy H, Gibbs energy G, chemical potential μ. Chemical reaction and the equilibrium constant Kp in a system of ideal gases.

1. Introduction. A brief reminder of the basic information from thermodynamics.

... It is convenient to present the thermodynamic arguments and the state functions defined with them as a single array of interrelated variables. This method was proposed by Gibbs. Thus, for example, entropy, which by definition is a function of state, passes into the category of one of the two natural caloric variables, complementing temperature in this capacity. And whereas in caloric processes the temperature acts as an intensive (force) variable, the entropy acquires the status of an extensive variable, the thermal coordinate.

This array can always be supplemented with new state functions or, as the need arises, with equations of state connecting the arguments with one another. The minimal number of arguments necessary for an exhaustive thermodynamic description of the system is called the number of degrees of freedom. It is determined from fundamental thermodynamic considerations and can be reduced owing to the various coupling equations.

Within such a single array, arguments and state functions can exchange roles. This device is widely used in mathematics when constructing inverse and implicit functions. The purpose of such (rather subtle) logical and mathematical techniques is the same: to achieve the maximum compactness and coherence of the theoretical scheme.

2. Characteristic functions. The Massieu differential equations.

It is convenient to supplement the array of variables p, V, T with the state function S. There are two coupling equations between them. One of them is expressed as the postulated interdependence of the variables F(p, V, T) = 0. When one speaks of the "equation of state", it is most often this dependence that is meant. However, every state function gives rise to a further equation of state. Entropy is by definition a state function, i.e. S = S(p, V, T). Hence there are two couplings between the four variables, and only two of them can be singled out as independent thermodynamic arguments; i.e., for an exhaustive thermodynamic description of the system two degrees of freedom are sufficient. If this array of variables is supplemented with a new state function, then along with the new variable another coupling equation appears, and consequently the number of degrees of freedom does not increase.

Historically the first of the state functions was the internal energy. Therefore the initial array of variables can be formed with its participation:

(p, V, T, S, U).

The array of coupling equations in this case contains functions of the form

1) F(p, V, T) = 0, 2) U = U(p, V, T), 3) S = S(p, V, T).

These quantities can exchange roles, or new state functions can be formed from them, but in any case the essence does not change: two independent variables remain. The theoretical scheme will not go beyond two degrees of freedom until it becomes necessary to take into account new physical effects and the energy transformations associated with them, which cannot be characterized without enlarging the circle of arguments and the number of state functions. Then the number of degrees of freedom may change.

dU = T dS − p dV. (2.1)

3. Free energy (Helmholtz energy) and its role.

The state of an isothermal system with constant volume is appropriately described by the free energy (the Helmholtz function) A = U − TS. Under these conditions it is a characteristic function and the isochoric-isothermal potential of the system.

By partial differentiation of it one can further extract the other necessary thermodynamic characteristics, namely:

S = −(∂A/∂T)_V, p = −(∂A/∂V)_T, U = A + TS = A − T(∂A/∂T)_V. (3.1)

An explicit form of the free-energy function for some comparatively simple systems can be constructed by the methods of statistical thermodynamics.

4. About equilibrium.

In any naturally occurring (spontaneous, or free) process the free energy of the system decreases. When thermodynamic equilibrium is reached, the free energy of the system attains a minimum and thereafter, at equilibrium, keeps a constant value. The system can be driven out of equilibrium by external forces, which increase its free energy. Such a process can no longer be free; it is a forced one.

The microscopic motions of the particles do not cease even at equilibrium, and in a system consisting of a huge number of particles and subsystems of any nature there are many different particular arrangements and combinations of its individual parts and within them, yet none of them takes the system out of equilibrium.

Thermodynamic equilibrium in a macrosystem does not at all mean that all kinds of motion disappear in its microscopic fragments. On the contrary, the equilibrium is maintained by the dynamics of these microscopic motions, which carry out a continuous levelling, smoothing out the observed macroscopic features and properties and preventing their outbursts and excessive fluctuations.

5. On the statistical method.

The main purpose of the statistical method is to establish a quantitative connection between the characteristics of the mechanical motions of the individual particles that make up an equilibrium statistical collective and the averaged properties of this collective, which are accessible to thermodynamic measurement by macroscopic methods.

The goal, then, is to derive quantitative laws for the thermodynamic parameters of the system on the basis of the mechanical characteristics of the motion of the individual microelements of the equilibrium collective.

6. Equilibrium and fluctuations. Microstates.

According to the Gibbs method, a thermodynamic system is a collective: a set of a very large number of elements, subsystems of the same type.

Each subsystem may in turn consist of a very large number of other, even smaller subsystems, and may itself play the role of a completely independent system.

Natural fluctuations inside an equilibrium system do not violate the equilibrium; they are compatible with the stable macroscopic state of the huge collective of particles. They merely redistribute the attributes of the individual elements of the collective. Different microstates arise, and all of them are in essence versions of one and the same observed macrostate.

Each individual combination of the states of the elements of the collective generates just one of the tremendous multitude of possible microstates of the macrosystem. All of them are physically equivalent: each leads to the same set of measurable physical parameters of the system, and they differ only in certain details of the distribution of states among the elements ...

All microstates are compatible with macroscopic (thermodynamic) equilibrium, and a numerical scatter of the individual components of the free energy (its energy and entropy parts) is quite usual. It should be understood that this scatter arises because of the continuous exchange of energy between the particles, the elements of the collective: for some elements it decreases, for others it increases.

If the system is in a thermostat, energy is also exchanged continuously with the environment. A natural energy mixing of the collective takes place, owing to the continuous exchange between its microparticles. Equilibrium is constantly maintained through thermal contact with the external thermostat; this is what the environment is most often called in statistics.

7. Gibbs method. Statistical ensemble and its elements.

In creating the universal scheme of statistical mechanics, Gibbs used an amazingly simple device.

Any real macroscopic system is a collective of a huge set of elements, subsystems. The subsystems may themselves be of macroscopic size, or they may be microscopic, down to atoms and molecules. Everything depends on the problem under consideration and on the level of study.

At different instants of time and at different points of the real system, in different spatial regions of the macroscopic collective, the instantaneous characteristics of its small elements may differ. The "inhomogeneities" in the collective constantly migrate.

Atoms and molecules can be in different quantum states. The collective is huge, and various combinations of the states of physically identical particles are represented in it. At the atomic-molecular level there is always an exchange of states, their continuous mixing. Owing to this, the properties of the various fragments of the macroscopic system are levelled out, and the physically observable macroscopic state of the thermodynamic system looks unchanged ...

10. Basic postulates of statistical thermodynamics

When describing systems consisting of a large number of particles, two approaches can be used: microscopic and macroscopic. The first approach, based on classical or quantum mechanics, characterizes the microstate of the system in detail, for example by the coordinates and momenta of each particle at every instant of time. A microscopic description requires solving the classical or quantum equations of motion for a huge number of variables. Thus, each microstate of an ideal gas in classical mechanics is described by 6N variables (N is the number of particles): 3N coordinates and 3N momentum projections.

The macroscopic approach, which is used by classical thermodynamics, characterizes only the macrostate of the system and uses a small number of variables, for example three: the temperature, the volume and the number of particles. If the system is in an equilibrium state, its macroscopic parameters are constant, whereas the microscopic parameters change with time. This means that each macrostate corresponds to many (in fact, infinitely many) microstates.

Statistical thermodynamics establishes a link between these two approaches. The main idea is as follows: if many microstates correspond to each macrostate, then each of them contributes to the macrostate. The properties of the macrostate can then be calculated as the average over all microstates, i.e. by summing their contributions with allowance for their statistical weights.

Averaging over microstates is carried out with the help of the concept of a statistical ensemble. An ensemble is an infinite set of identical systems in all possible microstates corresponding to one macrostate. Each system of the ensemble is one microstate. The whole ensemble is described by a distribution function over the coordinates and momenta ρ(p, q, t), which is defined as follows:

ρ(p, q, t) dp dq is the probability that a system of the ensemble is located in the volume element dp dq near the point (p, q) at the instant of time t.

The meaning of the distribution function is that it determines the statistical weight of each microstate in the macrostate.

From this definition follow the elementary properties of the distribution function:

1. Normalization:

∫ ρ(p, q, t) dp dq = 1. (10.1)

2. Positive definiteness:

ρ(p, q, t) ≥ 0. (10.2)

Many macroscopic properties of the system can be defined as the mean values of functions of the coordinates and momenta f(p, q) over the ensemble:

⟨f⟩ = ∫ f(p, q) ρ(p, q, t) dp dq. (10.3)

For example, the internal energy is the mean value of the Hamiltonian function H(p, q):

U = ⟨H(p, q)⟩. (10.4)

The existence of the distribution function constitutes the essence of the main postulate of classical statistical mechanics:

the macroscopic state of the system is completely specified by some distribution function that satisfies conditions (10.1) and (10.2).

For equilibrium systems and equilibrium ensembles the distribution function does not depend explicitly on time: ρ = ρ(p, q). The explicit form of the distribution function depends on the type of ensemble. Three main types of ensembles are distinguished:

1) the microcanonical ensemble describes isolated systems and is characterized by the variables E (energy), V (volume), N (number of particles). In an isolated system all microstates are equally probable (the postulate of equal a priori probabilities):

ρ = const for H(p, q) = E. (10.5)

2) the canonical ensemble describes systems that are in thermal equilibrium with their environment. Thermal equilibrium is characterized by the temperature T. Therefore the distribution function also depends on the temperature:

ρ(p, q) = const · exp[−H(p, q)/kT] (10.6)

(k = 1.38·10⁻²³ J/K is the Boltzmann constant). The value of the constant in (10.6) is determined by the normalization condition (see (11.2)).

A particular case of the canonical distribution (10.6) is the Maxwell distribution over the speed v, which is valid for gases:

ρ(v) = 4π (m/2πkT)^(3/2) v² exp(−mv²/2kT) (10.7)

(m is the mass of a gas molecule). The expression ρ(v) dv describes the probability that a molecule has an absolute value of its speed in the range from v to v + dv. The maximum of function (10.7) gives the most probable speed of the molecules, and the integral

⟨v⟩ = ∫₀^∞ v ρ(v) dv (10.8)

gives the mean speed of the molecules.

If the system has discrete energy levels and is described quantum mechanically, then instead of the Hamiltonian function H(p, q) one uses the Hamiltonian operator Ĥ, and instead of the distribution function the density-matrix operator ρ̂:

ρ̂ = const · exp(−Ĥ/kT). (10.9)

The diagonal elements of the density matrix give the probability that the system is in the i-th energy state and has energy E_i:

ρ_i = const · exp(−E_i/kT). (10.10)

The value of the constant is determined by the normalization condition Σ_i ρ_i = 1:

const = 1 / Σ_i exp(−E_i/kT). (10.11)

The denominator of this expression is called the sum over states (see Ch. 11). It plays a key role in the statistical evaluation of the thermodynamic properties of the system. From (10.10) and (10.11) one can find the number of particles N_i having energy E_i:

N_i = N · exp(−E_i/kT) / Σ_j exp(−E_j/kT). (10.12)

(N is the total number of particles.) The distribution of particles over energy levels (10.12) is called the Boltzmann distribution, and the numerator of this distribution is the Boltzmann factor (multiplier). Sometimes this distribution is written in another form: if there are several levels with the same energy E_i, they are combined into one group by summing the Boltzmann factors:

N_i = N · g_i exp(−E_i/kT) / Σ_j g_j exp(−E_j/kT), (10.13)

where g_i is the number of levels with energy E_i, or the statistical weight.

Many macroscopic parameters of a thermodynamic system can be calculated using the Boltzmann distribution. For example, the average energy is defined as the mean over the energy levels, taking into account their statistical weights:

⟨E⟩ = Σ_i g_i E_i exp(−E_i/kT) / Σ_i g_i exp(−E_i/kT); (10.14)

3) the grand canonical ensemble describes open systems that are in thermal equilibrium and can exchange matter with the environment. Thermal equilibrium is characterized by the temperature T, and equilibrium with respect to the number of particles by the chemical potential μ. Therefore the distribution function depends on the temperature and on the chemical potential. We shall not use an explicit expression for the distribution function of the grand canonical ensemble here.

In statistical theory it is proved that for systems with a large number of particles (~10²³) all three types of ensembles are equivalent to one another. The use of any ensemble leads to the same thermodynamic properties, so the choice of a particular ensemble for describing a thermodynamic system is dictated only by the convenience of the mathematical treatment of the distribution functions.
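A short sketch of formulas (10.13) and (10.14): for an assumed set of levels with degeneracies g_i and energies E_i (given in cm⁻¹), the populations and the average energy are computed at 300 K.

```python
import numpy as np

k = 1.380649e-23                         # J/K
hc = 6.626e-34 * 2.998e10                # J*cm, converts cm^-1 to J
E = np.array([0.0, 200.0, 500.0]) * hc   # assumed level energies
g = np.array([1, 3, 5])                  # assumed degeneracies
T = 300.0

factors = g * np.exp(-E / (k * T))       # Boltzmann factors with statistical weights
N_fractions = factors / factors.sum()    # N_i / N, eq. (10.13)
E_mean = (N_fractions * E).sum()         # <E>, eq. (10.14)

print("N_i/N =", np.round(N_fractions, 3))
print("<E>   =", E_mean, "J  =", E_mean / hc, "cm^-1")
```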

Examples

Example 10-1. A molecule can be on two levels with energies 0 and 300 cm⁻¹. What is the probability that the molecule will be on the upper level at 250 °C?

Solution. One must apply the Boltzmann distribution and use the factor hc to convert the spectroscopic energy unit cm⁻¹ into joules (h = 6.63·10⁻³⁴ J·s, c = 3·10¹⁰ cm/s): E = 300 cm⁻¹ = 300 · 6.63·10⁻³⁴ · 3·10¹⁰ = 5.97·10⁻²¹ J. Then the fraction on the upper level is N_1/N = exp(−E/kT) / [1 + exp(−E/kT)] = 0.304 at T = 523 K.

Answer. 0.304.
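The same answer can be checked numerically; the sketch below simply re-evaluates the Boltzmann fraction for the upper level with the constants used in the solution.

```python
import math

h, c, k = 6.63e-34, 3.0e10, 1.38e-23   # J*s, cm/s, J/K (values used in the solution)
E = 300.0 * h * c                      # J
T = 250.0 + 273.0                      # K

p_upper = math.exp(-E / (k * T)) / (1.0 + math.exp(-E / (k * T)))
print(round(p_upper, 3))               # ~0.304
```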

Example 10-2. A molecule can be on a level with energy 0 or on one of three levels with energy E. At what temperature will a) all the molecules be on the lower level, b) the number of molecules on the lower level be equal to the number of molecules on the upper levels, c) the number of molecules on the lower level be three times smaller than the number of molecules on the upper levels?

Solution. We use the Boltzmann distribution (10.13); the fraction of molecules on the lower level is N_0/N = 1 / [1 + 3 exp(−E/kT)].

a) N_0/N = 1; exp(−E/kT) = 0; T = 0. As the temperature is lowered, the molecules accumulate on the lowest level.

b) N_0/N = 1/2; exp(−E/kT) = 1/3; T = E / [k ln 3].

c) N_0/N = 1/4; exp(−E/kT) = 1; T = ∞. At high temperatures the molecules are distributed uniformly over the energy levels, because all the Boltzmann factors are almost the same and equal to 1.

Answer. a) T = 0; b) T = E / [k ln 3]; c) T = ∞.

Example 10-3. When any thermodynamic system is heated, the populations of some levels increase while those of others decrease. Using the Boltzmann distribution law, determine what the energy of a level must be in order for its population to grow with increasing temperature.

Solution. The population is the fraction of molecules on a given energy level, N_i/N = g_i exp(−E_i/kT) / Σ_j g_j exp(−E_j/kT). By the condition of the problem, its derivative with respect to temperature must be positive:

d(N_i/N)/dT = (N_i/N) · (E_i − ⟨E⟩) / (kT²) > 0.

In the second step we used the definition of the average energy (10.14). Thus the population grows with increasing temperature for all levels whose energy exceeds the average energy of the system.

Answer. E_i > ⟨E⟩.

TASKS

10-1. A molecule can be on two levels with energies 0 and 100 cm⁻¹. What is the probability that the molecule will be on the lower level at 25 °C?

10-2. A molecule can be on two levels with energies 0 and 600 cm⁻¹. At what temperature will there be half as many molecules on the upper level as on the lower one?

10-3. A molecule can be on a level with energy 0 or on one of three levels with energy E. Find the average energy of the molecules: a) at very low temperatures, b) at very high temperatures.

10-4. When any thermodynamic system is cooled, the populations of some levels increase while those of others decrease. Using the Boltzmann distribution law, determine what the energy of a level must be in order for its population to grow with decreasing temperature.

10-5. Calculate the most probable speed of carbon dioxide molecules at a temperature of 300 K.

10-6. Calculate the average speed of helium atoms under normal conditions.

10-7. Calculate the most probable speed of ozone molecules at a temperature of −30 °C.

10-8. At what temperature is the average speed of oxygen molecules equal to 500 m/s?

10-9. Under certain conditions the average speed of oxygen molecules is 400 m/s. What is the average speed of hydrogen molecules under the same conditions?

10-10. What fraction of molecules of mass m has a speed above the average at temperature T? Does this fraction depend on the mass of the molecules and on the temperature?

10-11. Using the Maxwell distribution, calculate the average kinetic energy of molecules of mass m at temperature T. Is this energy equal to the kinetic energy at the average speed?
