From order to chaos and back again

Making sense of entropy and the concept of time

Last Monday, I sat in a lecture hall in Otto Maass and wrote a thermodynamics midterm. In PHYS 232, the concept of entropy is represented by a specific equation, and as I reached the fourth question, I realized that somewhere between my class notes and reading week, I had forgotten it. I was sure I’d lost that piece of knowledge – what had once been imprinted so neatly on my conscious brain had become dislodged, shoved to some unknown corner.

The second law of thermodynamics says that every process increases the entropy of the universe; every process takes the system and its surroundings from a state of lower entropy to one of higher entropy. That is: things get messier as time goes on.
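In symbols, that claim is usually written as an inequality over the whole universe (a standard textbook form of the second law, not necessarily the specific equation from PHYS 232):

\[
\Delta S_{\text{universe}} = \Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \geq 0
\]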

Entropy, unlike momentum, unlike mass, and unlike energy, is not conserved. Burn a cord of wood in a campfire, and all of the pieces go somewhere: into the brittle charcoal remains left at the end of the evening, into the little flecks of almost-dust carried away by the wind, or into vapour, as the last drops of moisture hiding in the corners of the tree's cells boil away.

If you could collect it all, every little piece dispersed by the wind or left scattered in the fire pit, every molecule dislodged by a reaction, it would weigh just as much as the wood did before it burned: this is conservation of mass, and you cannot destroy mass. But it would be hard to find every little piece – impossible, in fact. And this is entropy. A pine tree goes from a neat tall tree made of cell upon cell, forming ring upon ring, to cords of firewood bundled for purchase, to the ashes in a fire pit and plumes of dust scattered and stirred by the wind. Things get less orderly.

Last June, on day two of the Second Annual World Science Festival in New York City, I sat in the 92YTribeca over cocktails and under dim lighting, and listened to Caltech physicist Sean Carroll lecture on his own views about entropy. Carroll explained that the laws of physics do not care about past and future, the way that people mark their calendars to go bowling next Friday instead of last night. But then where does this “arrow of time” come from? Carroll explained that entropy can act as this arrow, since the entropy of the universe only ever increases. “You can mix things together very easily. You cannot un-mix them.”
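One standard way to make “you cannot un-mix them” quantitative (a textbook argument, not a quote from Carroll's talk) is Boltzmann's definition of entropy, which counts the number of microscopic arrangements \(W\) consistent with what you see:

\[
S = k_B \ln W
\]

Put 100 gas molecules in a box and track only which half each one occupies: there is exactly one arrangement with all of them crammed into the left half, but roughly \(\binom{100}{50} \approx 10^{29}\) arrangements with them split evenly. A randomly jostled system overwhelmingly finds itself in the mixed, high-entropy arrangements, and almost never stumbles back into the un-mixed one.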

Just as a messy room can be cleaned up at the expense of an afternoon and a few hundred calories, the entropy of a part of the universe can be reduced; from chaos can come order. On a small scale, at least: systems can be made orderly at the expense of the larger whole becoming messier. Carroll pointed out that the universe started out in a low-entropy state: a neat and tiny package of forces and matter not yet formed. If entropy only ever increases, why did the universe begin in such an orderly, low-entropy state? Carroll offered this as an argument for a multiverse: if our universe is part of a larger system, if it is just one corner in one room, it could have started out organized – at the expense of the larger multiverse’s own order.
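In the same notation as the inequality above, the room-cleaning loophole is simple bookkeeping: a local decrease in entropy is allowed so long as the surroundings pick up the tab.

\[
\Delta S_{\text{system}} < 0 \quad \text{is permitted, provided} \quad \Delta S_{\text{surroundings}} \geq \lvert \Delta S_{\text{system}} \rvert
\]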

For almost ten whole minutes I looked from the blank space on the exam booklet page to the provided sheet of equations that had nothing to do with the exam material; I checked under all four laws of thermodynamics that I had memorized, bit my fingernails, and upturned every set of variables I had so neatly stored. And at the expense of all that, I was able to write the three variables in the right order, neatly at the top of the page.

Shannon Palus writes every other week. Expend some time to order your thoughts and send her an email: plusorminussigma@mcgilldaily.com.