Tuesday, November 3, 2009

What Is Entropy About? By Rudolph Draaisma


Some say entropy is a measure of chaos (disorder), others that it is a measure of the dispersion of energy. The term was coined by Rudolf Clausius in the 1850s. There are several ways to formulate the Second Law, and though they differ greatly in wording, they are all considered equivalent - if one is wrong, all the others are wrong as well. A popular, but wrong, formulation says that heat cannot flow spontaneously from a colder to a warmer region.


However, when you are in the tropics, where the air temperature can rise above body temperature, your sweating skin cools your body, so heat flows spontaneously from your cooler body to the warmer environment. In 'technology' this effect has been known and practiced for thousands of years, by keeping water cool in jars of porous material. Some of the water exudes (sweats) through the pores of that material and carries its heat off into the warmer surrounding air.
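
As a rough check on the mechanism, here is a minimal sketch (the latent-heat figure is a common textbook value I am assuming, not a number from the article):

```python
# Rough sketch: heat carried away by evaporating sweat or jar water.
# Assumed: latent heat of vaporization of water near skin temperature,
# about 2.4 MJ/kg (textbook value, not from the article).
LATENT_HEAT_J_PER_KG = 2.4e6

def evaporative_cooling_power(evaporation_kg_per_hour: float) -> float:
    """Cooling power in watts for a given evaporation rate."""
    return evaporation_kg_per_hour / 3600.0 * LATENT_HEAT_J_PER_KG

# Evaporating one litre (~1 kg) of sweat per hour sheds roughly 670 W,
# which is how heat can leave a cooler body for warmer air.
print(f"{evaporative_cooling_power(1.0):.0f} W")
```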


This is not in conflict with the Second Law of Thermodynamics, because Clausius' statement did not include the term 'spontaneously'. His formulation was: A process whose only final result is to transfer thermal energy from a cooler object to a warmer one is impossible.


Now, the jar loses water, and if it is not replenished it will run empty; thus the transfer of heat from the jar to the surrounding air is not the only result of the process. Likewise, if you don't drink water, your sweating body will dry out and eventually die. Here too the transfer of heat is not the only result, so there is no conflict with the conventional definition of the Second Law. Nevertheless, as long as the process lasted, heat did indeed flow spontaneously from a colder to a warmer region. Strangely, this simple fact has never been implemented in modern energy technology - I would like to do it, if I get the chance!


So what is entropy? One thing we can all agree upon is that energy disperses if it is not hindered from doing so - perfect insulation does not exist. This can also be seen as increasing disorder, because as energy disperses, the molecules involved move in more chaotic patterns. However, it is true that shuffled cards, or a broken glass on the floor, are rather more chaotic conditions than a dispersion of energy. The confusing point is that one has to do work to restore the original order - not so much work to order the shuffled cards, but essentially infinite work to restore the broken glass to its original condition (without using new materials). This work, ΔQ, disperses into the environment and decays to heat at that environment's temperature T. The corresponding change of entropy is ΔS = ΔQ / T.
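
As a minimal numeric illustration of that relation (the example numbers are mine):

```python
# Minimal sketch of the Clausius relation dS = dQ / T for work that
# disperses as heat into surroundings at a fixed temperature.
def entropy_change(heat_joules: float, temp_kelvin: float) -> float:
    """Entropy change in J/K when heat_joules decays at temp_kelvin."""
    return heat_joules / temp_kelvin

# Example (assumed numbers): 1 kJ of restoring work dissipated
# into a room at 293 K.
print(f"{entropy_change(1000.0, 293.0):.2f} J/K")  # ~3.41 J/K
```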


We are thus talking about closed-loop cycle processes here, and if these processes are irreversible (they must be driven by an external source), the applied energy will disperse into the surroundings. The sweating jar and the human body constitute irreversible processes, because the evaporated water will not return by itself as liquid to the jar or the body - they are open processes. To make them cycle processes, work has to be done, and then energy disperses again. Hence, if one sees entropy as a measure of disorder, one actually refers to the work done to restore the original order in a cycle process. If no such restoration is done, the shuffled cards and the broken glass on the floor indeed have nothing to do with entropy. But then, you can do things the easy way or the difficult way, so the effort needed to restore the original condition is not a given quantity. Therefore entropy cannot be a measure of disorder.


Is it a measure of the dispersion of energy? If so, the change of entropy should be independent of whether ideal or real gases are concerned. On the contrary, real gases behave differently from ideal gases. Unlike an ideal gas, a real gas does not keep a constant temperature when it expands freely - its temperature changes, which is known as the Joule-Thomson effect. Most real gases cool as they expand, but some warm up. Now, can somebody tell me how to calculate the change of entropy for this? It is not in my physics books, and I have found it nowhere on the web.


Anyway, in the same test set-up, with the same initial temperature, the result of the expansion differs depending on whether the expanding gas is an ideal or a real one. Because the end temperatures are not the same, the change of entropy cannot be the same, and this rules out entropy as a measure of the dispersion of energy!
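
For what it is worth, here is a minimal sketch of that comparison, assuming a van der Waals model for the real gas with textbook constants for CO2 (the model and all numbers are my assumptions, not the article's): the same free doubling of volume gives different end temperatures and hence different entropy changes.

```python
import math

# Free (Joule) expansion of 1 mol from V1 to V2 = 2*V1, starting at 300 K.
# Ideal gas: temperature stays constant and dS = R * ln(V2/V1).
# Van der Waals gas: per mole U = Cv*T - a/V, so at constant U the gas
# cools; per mole S = Cv*ln(T) + R*ln(V - b) + const.
R = 8.314        # J/(mol*K)
CV = 28.5        # J/(mol*K), assumed molar heat capacity (roughly CO2)
A = 0.364        # J*m^3/mol^2, van der Waals 'a' for CO2 (assumed)
B = 4.27e-5      # m^3/mol, van der Waals 'b' for CO2 (assumed)

T1, V1 = 300.0, 1.0e-3   # K, m^3 (example numbers)
V2 = 2.0 * V1

ds_ideal = R * math.log(V2 / V1)                  # ideal gas: T2 = T1

T2 = T1 + (A / CV) * (1.0 / V2 - 1.0 / V1)        # energy conservation
ds_real = CV * math.log(T2 / T1) + R * math.log((V2 - B) / (V1 - B))

print(f"ideal: T2 = {T1:.1f} K, dS = {ds_ideal:.2f} J/K")  # ~5.76 J/K
print(f"vdW:   T2 = {T2:.1f} K, dS = {ds_real:.2f} J/K")   # ~5.33 J/K
```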


For me this is the end of entropy as a thermodynamic property of matter AND as a measure of the dispersion of energy, let alone of seeing entropy as an absolute physical dimension of anything - so what options are left?


Around 100 years ago, one of the greatest scientific geniuses ever, Ludwig Boltzmann, gave the answer. According to him, entropy is the probability for a given number of micro-states to occur spontaneously, written as S = k•ln(W) and ΔS = k•ln(W1/W2), where W stands for probability ('Wahrscheinlichkeit' in German) and k is Boltzmann's constant.
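
To make the counting concrete, here is a hedged toy example (the model and numbers are mine, not Boltzmann's): for N gas molecules in a box, the evenly-spread state has vastly more arrangements than the all-on-one-side state, and ΔS = k•ln(W1/W2) quantifies the gap.

```python
import math

K_B = 1.380649e-23  # J/K, Boltzmann's constant

def ln_arrangements(n: int, k: int) -> float:
    """ln of the number of ways to put k of n molecules in the left half."""
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

N = 10**6                           # toy number of molecules (assumed)
ln_w1 = ln_arrangements(N, N // 2)  # evenly spread over both halves
ln_w2 = 0.0                         # ln(1): all molecules in one half

delta_s = K_B * (ln_w1 - ln_w2)     # dS = k * ln(W1/W2)
print(f"dS = {delta_s:.3e} J/K")    # the spread state is overwhelmingly likelier
```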


My simpler formulation is: entropy is the probability of predicting that a certain object will be in a certain place at a certain time. In fact this means DISORDER again, but now we can accept it, because entropy is about probability and not about a physical property of matter. In Boltzmann's formula the terms energy and temperature do not occur, but mind my argument above - it takes work to restore order, only that work is not of a given amount; you can do it the easy way or the difficult way (for example, filling a bottle with water using a cup, or a funnel).


Also mind that Boltzmann's formula is valid only under the condition of spontaneity, in contrast to Clausius' formulation, where spontaneity is not a condition for entropy (his 'luck', because otherwise the sweating body would invalidate the Second Law). There are no spontaneous reversible processes in the real world (an indefinitely oscillating spring would be an ideal one).


Open systems always involve irreversible processes, by which energy disperses; that only means the quality of this energy (its density) decreases. The lowest possible quality is reached when the energy can no longer be recovered, as when thermal energy decays to heat at ambient temperature. If the source quality is low, we cannot lower it much further, and thus we cannot make much use of it. The larger the difference between the quality of the source and that of the drain or sink (the surroundings, or environment), the more useful work we can get out of it, meaning a higher efficiency of the process under consideration.
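
That source/sink gap can be put into numbers with the textbook Carnot limit (a standard result; the example temperatures are my assumptions):

```python
def carnot_efficiency(t_source_k: float, t_sink_k: float) -> float:
    """Ideal (Carnot) efficiency of a heat engine between two temperatures."""
    return 1.0 - t_sink_k / t_source_k

# Example (assumed): a 293 K environment as the sink.
for t_source in (323.0, 573.0, 1273.0):  # warm water, steam, combustion gas
    eta = carnot_efficiency(t_source, 293.0)
    print(f"source {t_source:.0f} K -> max efficiency {eta:.0%}")
```

The hotter (higher-quality) the source relative to the sink, the larger the fraction of its heat that can become useful work.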


With entropy as a measure of probability, we can now introduce the notions of high- and low-entropy sources. A high-entropy source gives a low probability of efficient usage, and a low-entropy source gives a high one. In this context we can see entropy as a measure of the quality of energy, though it is NOT a physical property of energy. Energy is simply energy, regardless of its quality, just as a banana remains a banana whether it is the only one on your table or one of many on the tree.


With entropy as a measure of the quality of energy, we have a useful tool for judging the viability of certain projects. Take wind energy, for example: it has a low quality (low density), close to that of the environment, so the efficiency of converting it to high-density energy (electrical power) becomes very low - it is a high-entropy source. Likewise solar energy, which is spread widely over the environment and thus has a low quality. Likewise energy from biomass, whose source is widely spread vegetation: we have to do a lot of work to bring it to the place of usage and to prepare it into a usable form (increase its quality), so the overall efficiency becomes very low. Fuels, on the other hand, have a high energy density, stored in chemical or nuclear form - they are low-entropy sources. This is why we can make highly efficient use of them, and that gives the economic viability.
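
A rough density comparison along those lines, using common textbook figures that I am assuming rather than quoting from the article:

```python
AIR_DENSITY = 1.2        # kg/m^3, near sea level (assumed)
DIESEL_MJ_PER_KG = 45.0  # MJ/kg, typical lower heating value (assumed)

def wind_power_density(wind_speed_m_s: float) -> float:
    """Kinetic power carried per square metre of swept rotor area, W/m^2."""
    return 0.5 * AIR_DENSITY * wind_speed_m_s ** 3

# At 8 m/s the flow carries ~307 W/m^2 before any conversion losses,
# while one kilogram of diesel holds 45 MJ in a litre-scale volume.
print(f"wind @ 8 m/s: {wind_power_density(8.0):.0f} W/m^2")
print(f"diesel:       {DIESEL_MJ_PER_KG:.0f} MJ/kg")
```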


The only natural low-entropy source that solar energy provides is hydro-electric power. Solar energy collects water from wide areas (like the oceans), through rain, into a high reservoir of limited size (an increase of energy density), thus providing a low-entropy source that we can make efficient use of. Does the water in that reservoir 'have' a low entropy? One could say that, but it has nothing to do with the physical properties of the water. Once it has fallen down, passed through the turbine(s) and flowed away from there, one could in the same manner say that it now has a high entropy, but it is still the same water.
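
The concentration step is easy to quantify with the standard hydro-power formula (the plant figures are example assumptions of mine):

```python
G = 9.81                # m/s^2, gravitational acceleration
WATER_DENSITY = 1000.0  # kg/m^3

def hydro_power_w(flow_m3_s: float, head_m: float, efficiency: float = 0.9) -> float:
    """Electrical power from water falling through a given head:
    P = rho * g * Q * h * eta."""
    return WATER_DENSITY * G * flow_m3_s * head_m * efficiency

# Example (assumed): 50 m^3/s through a 100 m head at 90% efficiency
# gives ~44 MW - rain has concentrated very dilute solar input into a
# dense, low-entropy store.
print(f"{hydro_power_w(50.0, 100.0) / 1e6:.1f} MW")
```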


Boltzmann's formula simply says that the probability of finding a certain water molecule in a certain place at a certain moment was higher in the reservoir than it is in the stream that flows out of the turbine(s). The same could be said of a tiny fish in that water. Indeed, increased disorder, but that is not the essence of entropy, just an effect of it.


It is different with the potential energy that was converted to heat and mechanical energy in the turbine(s). This was solar energy that evaporated the original water, mainly from the oceans, and let it rain into the reservoir. Had it not done that, the energy would have been just as spread out in Nature as it becomes in and after the turbine(s) - the mechanical energy, too, will finally decay to heat at ambient temperature.


Thus in terms of dispersion of energy, the total change of entropy was zero - nothing has changed for planet Earth as a whole. In terms of probability, the entropy has increased, but this too has no meaning for planet Earth. It had meaning for us, though, as we needed that mechanical energy to make electricity at a reasonable price.


This is what entropy is all about, nothing more!


Finally, being a thermo-engineer myself, I can tell you that no engineer has to know a thing about entropy to be able to design a thermal machine. Fortunately so, because most engineers have only a vague idea of what entropy is, and so they can ignore the subject in their work, as James Watt did, for example (think!).


Resource: http://www.isnare.com/?aid=169861&ca=Education
