Thermodynamics

What exactly does this statement mean: "entropy not only refers to the distribution of particles but also to the ways of distributing the energy of the systems in all of the available energy levels"?

Original post by All rounder
What exactly does this statement mean: "entropy not only refers to the distribution of particles


I'm not sure what this first part is supposed to mean (it sounds like waffle), but this ...

but also to the ways of distributing the energy of the systems in all of the available energy levels"?


.. is essentially the definition of entropy in statistical physics. You can define entropy as:

$S \propto \ln \Omega$

where $\Omega$ is the number of ways of distributing $n$ units of energy among $m$ particles, each of which can exist in $k$ states of energy. Each different way of distributing the energy is called a microstate.
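
To make this concrete, here is a minimal Python sketch (my own toy example, not from the thread; the numbers n=4, m=3, k=5 are arbitrary) that counts $\Omega$ by brute force and evaluates $\ln \Omega$:

```python
import math
from itertools import product

def count_microstates(n, m, k):
    """Count the microstates: the ways of distributing n units of energy
    among m distinguishable particles, where each particle may occupy one
    of the energy levels 0, 1, ..., k-1 (one unit apart)."""
    return sum(1 for levels in product(range(k), repeat=m)
               if sum(levels) == n)

omega = count_microstates(n=4, m=3, k=5)  # toy numbers, chosen arbitrarily
print(omega, math.log(omega))             # 15 microstates, S = ln(15) ≈ 2.71
```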

You can count the number of particles in each energy level for each possible microstate. The set of these counts is called a macrostate (i.e. a gross description of the system which ignores the precise details of which particle has what energy). You will find that each macrostate can arise from many different microstates. Some macrostates arise from many, many microstates; others from very few.
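
Continuing the same toy sketch (same arbitrary numbers), you can group the microstates by macrostate and see directly that some macrostates have far more microstates behind them than others:

```python
from collections import Counter
from itertools import product

def macrostate_multiplicities(n, m, k):
    """Group the microstates by macrostate, where a macrostate is the tuple
    of occupation numbers: how many particles sit in each energy level."""
    tally = Counter()
    for levels in product(range(k), repeat=m):   # every assignment of levels
        if sum(levels) == n:                     # keep the right total energy
            occupation = tuple(levels.count(e) for e in range(k))
            tally[occupation] += 1
    return tally

for macro, count in macrostate_multiplicities(4, 3, 5).most_common():
    print(macro, count)
# (1, 1, 0, 1, 0) 6  <- one particle each at energies 0, 1 and 3
# (2, 0, 0, 0, 1) 3  <- two particles at 0, one at 4
# (1, 0, 2, 0, 0) 3
# (0, 2, 1, 0, 0) 3
```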

Physical systems such as these are not static; the particles are constantly exchanging energy with each other, and consequently the current microstate of the system changes randomly over time. It is an assumption of physics that, given enough time, the system cycles through all of its microstates, spending the same amount of time in each - i.e. that each microstate occurs with the same probability. This is the so-called ergodic hypothesis.

As a result of the ergodic hypothesis, we can say that the most likely macrostate of a system is the one with the greatest number of microstates, as the system spends most of its time there. This leads to the conclusion that a system prepared in a macrostate corresponding to few microstates won't stay there long: random exchange of energy will carry the system into the macrostate corresponding to the greatest number of microstates. Since the number of microstates is exactly what our definition of entropy above measures, we can see that the system will tend to move from a state of low entropy to a state of high entropy.
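
A toy simulation makes this vivid (again my own sketch; the single-unit-swap dynamics is a deliberately simple stand-in for real collisions): start with all the energy on one particle, let random pairwise exchanges run, and record the fraction of time spent in each macrostate. The fractions come out proportional to the microstate counts found above:

```python
import random
from collections import Counter

def time_in_macrostates(m=3, n=4, steps=300_000, seed=0):
    """Toy dynamics: repeatedly pick two particles at random and move one
    unit of energy between them when possible. Each allowed move and its
    reverse are proposed with equal probability, so in the long run every
    microstate is visited equally often (a stand-in for the ergodic
    hypothesis)."""
    rng = random.Random(seed)
    energies = [n] + [0] * (m - 1)   # all energy on one particle: low entropy
    visits = Counter()
    for _ in range(steps):
        i, j = rng.sample(range(m), 2)
        if energies[i] > 0:          # move one unit from particle i to j
            energies[i] -= 1
            energies[j] += 1
        # sorted energies label the macrostate (equivalent to occupation numbers)
        visits[tuple(sorted(energies))] += 1
    for macro, count in visits.most_common():
        print(macro, round(count / steps, 3))

time_in_macrostates()
# (0, 1, 3) ~0.4   <- 6 of the 15 microstates
# (0, 0, 4) ~0.2, (0, 2, 2) ~0.2, (1, 1, 2) ~0.2   <- 3 microstates each
```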

To give a more definite example: suppose you prepare a gas in a container somehow so that all the molecules have the same energy. This is a low-entropy state. After a shortish while, due to the exchange of energy by inter-molecular collisions, you will find that the energies of the molecules follow a Boltzmann distribution - this is a high-entropy state. Because there are many (many, many, ...) more microstates for this distribution than for the equal-energy distribution, the system will remain in it once it has got there: it is very unlikely to reach a microstate whose energy distribution differs greatly from the Boltzmann one, simply because there are comparatively few such microstates. So it is in an "equilibrium" state, and these are the states where the system has reached its greatest entropy. (Or to put it less mysteriously: the system has entered a macrostate corresponding to such an overwhelmingly huge number of microstates that it is unlikely ever to escape from it - and if it does, not for long - even though the ergodic hypothesis tells us it will eventually cycle through all the possible microstates.)
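
The gas example can be sketched the same way (my own illustration; the particle count and number of steps are arbitrary): give every particle the same energy and let random pairwise exchanges run. The energy histogram relaxes to a geometric distribution, the discrete analogue of the Boltzmann distribution, and stays there:

```python
import random
from collections import Counter

def boltzmann_demo(particles=5_000, units_each=5, steps=1_000_000, seed=1):
    """Give every particle the same energy (a low-entropy state), then let
    randomly chosen pairs exchange single units of energy. The histogram of
    energies relaxes towards the discrete Boltzmann (geometric) distribution."""
    rng = random.Random(seed)
    energies = [units_each] * particles
    for _ in range(steps):
        i = rng.randrange(particles)
        j = rng.randrange(particles)
        if i != j and energies[i] > 0:
            energies[i] -= 1         # one unit of energy hops from i to j
            energies[j] += 1
    hist = Counter(energies)
    for e in sorted(hist)[:6]:
        print(e, round(hist[e] / particles, 3))

boltzmann_demo()
# Expected shape: p(E) ≈ (1/6) * (5/6)**E for a mean energy of 5 units,
# i.e. roughly 0.167, 0.139, 0.116, 0.096, 0.080, 0.067, ...
```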
Reply 3
Thank you so much, it means a lot to me :smile: you explained it really well
