What exactly does this statement mean: "entropy not only refers to the distribution of particles but also to the ways of distributing the energy of the system in all of the available energy levels"?
- Thread Starter
- 01-04-2016 13:47
- Official Rep
- 03-04-2016 15:03
Sorry you've not had any responses about this. Are you sure you've posted in the right place? Here's a link to our subject forum which should help get you more responses if you post there.
You can also find the Exam Thread list for A-levels here and GCSE here.
Just quoting in Puddles the Monkey so she can move the thread if needed
(Original post by Puddles the Monkey)
(Original post by All rounder)
- 05-04-2016 00:18
What exactly does this statement mean: "entropy not only refers to the distribution of particles
but also to the ways of distributing the energy of the system in all of the available energy levels"?
Statistically, entropy is defined as S = k ln W, where W is the number of ways of distributing the units of energy of the system among its particles, each of which can exist in one of the available energy levels. Each different way of distributing the energy is called a microstate.
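To make the counting concrete, here is a small Python sketch that enumerates every microstate directly. The three-particle, three-quanta system is a toy model chosen for illustration (it is not from the post), with distinguishable particles and indivisible quanta:

```python
from itertools import product
from math import comb, log

def microstates(n_particles, n_quanta):
    """All ways of assigning n_quanta indivisible energy quanta to
    n_particles distinguishable particles; each assignment is one
    microstate (an Einstein-solid-style toy model)."""
    return [s for s in product(range(n_quanta + 1), repeat=n_particles)
            if sum(s) == n_quanta]

states = microstates(3, 3)
W = len(states)
print(W)                        # 10 microstates
print(comb(3 + 3 - 1, 3 - 1))   # stars-and-bars formula agrees: 10
print(log(W))                   # S / k = ln W, the entropy in units of k
```

Brute-force enumeration like this only works for tiny systems, but it makes the meaning of W unambiguous: it is literally a count of distinct energy assignments.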
You can count the number of particles in each energy level for each possible microstate. That list of occupation numbers is called a macrostate (i.e. a gross description of the system which ignores the precise details of which particle has what energy). You will find that each macrostate can result from many different microstates. Some macrostates arise from many, many microstates; others arise from very few.
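Grouping the microstates of the same toy system (three distinguishable particles sharing three quanta — an assumed example, not from the post) by their macrostate shows how uneven this spread is:

```python
from collections import Counter
from itertools import product

def microstates(n_particles, n_quanta):
    # Every assignment of quanta to distinguishable particles.
    return [s for s in product(range(n_quanta + 1), repeat=n_particles)
            if sum(s) == n_quanta]

# A macrostate here is the sorted tuple of particle energies: it records
# how many particles sit in each level, but not which particle is which.
macro = Counter(tuple(sorted(s)) for s in microstates(3, 3))
for occupation, count in sorted(macro.items()):
    print(occupation, count)
```

Of the 10 microstates, 6 belong to the (0, 1, 2) macrostate, 3 to (0, 0, 3), and only 1 to the equal-energy macrostate (1, 1, 1).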
Physical systems such as these are not static; the particles are constantly exchanging energy with each other, so the current microstate of the system changes randomly over time. It is an assumption of physics that, given enough time, the system cycles through all of its microstates, spending the same amount of time in each - i.e. that each microstate occurs with the same probability. This is the so-called ergodic hypothesis.
As a result of the ergodic hypothesis, we can say that the most likely macrostate of a system is the one with the greatest number of microstates, since that is where the system spends most of its time. It follows that a system prepared in a macrostate corresponding to few microstates won't stay there long - random exchange of energy will carry it into the macrostate with the greatest number of microstates. Since that count is exactly what the definition of entropy above measures, the system will tend to move from a state of low entropy to a state of high entropy.
To give a more definite example: suppose you somehow prepare a gas in a container so that all the molecules have the same energy. This is a low-entropy state. After a short while, due to the exchange of energy by inter-molecular collisions, you will find that the energies of the molecules follow a Boltzmann distribution - this is a high-entropy state. Because there are many (many, many, ...) more microstates for this distribution than for the equal-energy one, the system will remain in it once it gets there: it is very unlikely to reach a microstate whose energy distribution differs greatly from the Boltzmann distribution, simply because, comparatively speaking, there aren't many such microstates. So it is in an "equilibrium" state, and these are the states of greatest entropy - or, to put it less mysteriously, the system has entered a macrostate corresponding to such an overwhelmingly huge number of microstates that it is unlikely ever to escape from it, and if it does, not for long, even though the ergodic hypothesis tells us it will be cycling through all the possible microstates.
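That relaxation can be watched in a quick simulation. The system size and the one-quantum exchange rule below are assumptions made for the sketch, not the post's model: particles start with one quantum each (the low-entropy, all-equal state) and repeatedly hand single quanta to randomly chosen partners:

```python
import random
from collections import Counter

random.seed(0)
N = 1000
energies = [1] * N   # low-entropy start: every particle has exactly 1 quantum

# Random pairwise exchange: a particle that has energy donates one
# quantum to another randomly chosen particle. Total energy is conserved.
for _ in range(200_000):
    i = random.randrange(N)
    if energies[i] > 0:
        energies[i] -= 1
        energies[random.randrange(N)] += 1

hist = Counter(energies)
for level in sorted(hist):
    print(level, hist[level])
```

With these numbers the occupation of each level falls off by roughly a factor of two per quantum - the discrete analogue of a Boltzmann distribution at a mean energy of one quantum - and the all-equal starting distribution is never observed again in practice.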
- Thread Starter
- 06-04-2016 20:07
Thank u soo much, it means a lot to me, u explained really well.
Last edited by All rounder; 06-04-2016 at 20:10.