Monday, May 11, 2009

The Second Law of Thermodynamics


The Second Law of Thermodynamics is perhaps the most important governing principle of which we are aware in the physical universe. Strangely, of all of the physical laws that we use to describe physical systems, it is the only one that is not -- strictly speaking -- a law.

Let's take Newton's Third Law of Motion (for every force, there is an equal force in the opposite direction) as an example of a classically true one. No matter what you do -- if you push on a wall or throw a baseball or kick a dog (note: don't kick dogs) -- the object will exert the same force on you that you exert on it. Don't believe me? Go punch a wall as hard as you can and see what happens (note: don't do this either). It always happens. Always.

By contrast, the Second Law of Thermodynamics is really just a statement about statistics. Simply put, it says that entropy tends to increase over time. Let's define entropy with a little example.

Imagine that you have a coin and you flip it. What is the probability that it will land heads-up? Right. 50-50. Now flip ten coins. What is the probability that five of them will land heads-up and five of them heads-down? You might be surprised to find out that it's about 25%.
I've included a little table to prove this to you. A microstate is one specific arrangement of the coins; the "Microstates" column counts how many arrangements show the corresponding number of heads. In other words, there is only one way to show no heads (all tails), there are ten ways to show 1 heads-up coin (coin 1 heads up and the rest tails, coin 2 up and the rest tails, etc.), and so on. Beneath the "Microstates" column is a sum of all of the possible states. Divide the number of microstates for a certain number of heads by the total number possible and you get the probability of that many heads showing.

Heads    Microstates    Probability
 0            1            0.1%
 1           10            1.0%
 2           45            4.4%
 3          120           11.7%
 4          210           20.5%
 5          252           24.6%
 6          210           20.5%
 7          120           11.7%
 8           45            4.4%
 9           10            1.0%
10            1            0.1%
Total      1024          100.0%
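
If you'd rather not count coin arrangements by hand, the table is easy to reproduce. Here's a quick Python sketch (just one way to do it, using the built-in binomial function) that counts the microstates for each number of heads and divides by the 1,024 total outcomes:

    from math import comb

    coins = 10
    total_outcomes = 2 ** coins           # 1,024 equally likely flip sequences

    for heads in range(coins + 1):
        microstates = comb(coins, heads)  # ways to choose which coins land heads
        probability = microstates / total_outcomes
        print(f"{heads:2d} heads: {microstates:4d} microstates, {probability:6.2%}")

The 5-heads row comes out to 252 out of 1,024, or about 24.6%.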

So what does that have to do with entropy? Well, what if you flipped all ten coins a billion times? You would expect that the most likely combination (5H and 5T) would show up most often. And that's what happens. Well, that's entropy. If you take the natural logarithm of the numbers in the "Microstates" column, you'll get the entropy of that particular combination of heads and tails (physicists multiply by Boltzmann's constant, but that just rescales the numbers). And since the natural logarithm of x gets larger as x increases, the most likely combination has the highest entropy. So when we say that entropy tends to increase, all we mean is that over time, the most likely arrangement will tend to show up. In other words, the Second Law of Thermodynamics can be quite simply stated as "Things that are likely to happen are likely to happen."
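
To put a number on it, here's the same loop again, this time printing ln(W) for each row (leaving out Boltzmann's constant, which only scales everything):

    from math import comb, log

    coins = 10
    for heads in range(coins + 1):
        W = comb(coins, heads)            # microstates for this combination
        print(f"{heads:2d} heads: W = {W:4d}, entropy ln(W) = {log(W):.3f}")

The entropy peaks at 5 heads (ln 252 is about 5.53) and drops to zero at 0 or 10 heads, exactly mirroring the probabilities.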

But you'll notice that this is a statistical certainty, not an absolute one. If you flipped ten coins only ten times, you might well see 6H and 4T come up more often than 5H and 5T. We can only be practically certain of the outcome with really, really large numbers. Fortunately, the things we try to describe with this law (heat transfer between two objects, for example) have big numbers in them (objects that transfer heat regularly have billions of trillions of molecules). To illustrate, let's flip 100 coins. The odds of landing exactly 50 heads are relatively small, about 8%. But that's roughly seven times more likely than flipping 60 heads, a couple hundred million times more likely than flipping 80 heads, and several million billion times more likely than flipping 90 heads. As the number of coins increases, the likelihood of ever seeing anything far from a 50-50 split drops toward zero.
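
Here's a quick check of those ratios with the same binomial counting as before (the exact figures aren't the point; the runaway growth is):

    from math import comb

    coins = 100
    total_outcomes = 2 ** coins
    ways_50 = comb(coins, 50)

    print(f"P(exactly 50 heads) = {ways_50 / total_outcomes:.1%}")   # about 8%
    for heads in (60, 80, 90):
        ratio = ways_50 / comb(coins, heads)
        print(f"50 heads is about {ratio:.2g} times more likely than {heads} heads")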

To illustrate, take 10^23 coins (there are about that many atoms in just a few grams of carbon). You could flip them all once a second, every second, for the entire age of the universe (about 10^18 seconds) and still never find a result further away from 50-50 than 1%. When you get to the size of things we actually care about, there is effectively no chance of ever seeing an arrangement measurably different from the most likely one. So, while the Second Law doesn't absolutely determine the outcome of any single statistical event, at these scales its predictions are as good as certain.
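
You don't even need to count microstates to see why. For huge numbers of coins, the binomial distribution is well approximated by a normal (bell) curve whose width, as a fraction of the total, shrinks like one over the square root of the number of coins. Here's a rough back-of-the-envelope sketch, assuming "1% away from 50-50" means a heads fraction outside 49%-51% and using the standard Gaussian tail bound:

    from math import sqrt, log, log10

    N = 1e23        # coins per flip (about the atoms in a few grams of carbon)
    trials = 1e18   # one flip per second for roughly the age of the universe

    sigma = sqrt(N) / 2             # std. deviation of the heads count
    deviation = 0.01 * N            # heads fraction drifting outside 49%-51%
    z = deviation / sigma           # that drift, measured in standard deviations

    # Gaussian tail bound: P(|heads - N/2| > deviation) < exp(-z**2 / 2).
    log10_p_single = -(z ** 2) / (2 * log(10))
    log10_p_any = log10_p_single + log10(trials)   # union bound over all trials

    print(f"z = {z:.2g} standard deviations")
    print(f"log10 P(one flip strays past 1%)  < {log10_p_single:.2g}")
    print(f"log10 P(any of the 1e18 flips do) < {log10_p_any:.2g}")

The exponent works out to around minus nine billion billion, and repeating the flip 10^18 times barely nudges it. That is what "effectively no chance" means here.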

Applications:

So who cares about that? Didn't we already know that likely things happen? Isn't that why they are likely? Why does knowing this mathematically help anyone? To name a few things, this law provides the foundation for the heat-transfer models we use to heat homes, businesses, and cars. It helps determine the size of the fan your computer or your car needs to keep from overheating. It tells us how big a heat pump your fridge or freezer will need, how solutions mix at a given temperature and when they will freeze or melt (which is how antifreeze, solder, and plastic were invented), and at what temperature magnets will cease to be magnetized (more important than you might think). It plays a part in determining the conductivity of materials at a given temperature (like the wires in your house or computer), the formation of minerals and rocks in the earth's crust, and the chemical distribution of our atmosphere. In short, the Second Law of Thermodynamics helped us develop or manufacture almost every household object that we consider common. That is no exaggeration.

To recap, the Second Law is more than just a "description of chaos," as many people choose to describe it. Rather, it is the statistical certainty that likely things will occur over time (which tends toward basic forms of energy, like heat, that could be described as chaotic). As I consider its implications, I am amazed that such a fundamental law can be so simple while describing so much.
