# What is Entropy

## Entropy

All thermodynamic systems generate waste heat: moving parts produce it through friction, and radiative heat inevitably leaks from the system. This waste results in an increase in entropy, which for a closed system is "a quantitative measure of the amount of thermal energy not available to do work," according to the American Heritage Dictionary. Entropy in any closed system always increases; it never decreases.

In other words:

Entropy is the measure of a system's thermal energy per unit temperature that is unavailable for doing useful work. Because work is obtained from ordered molecular motion, the amount of entropy is also a measure of the molecular disorder, or randomness, of a system. The concept of entropy provides deep insight into the direction of spontaneous change for many everyday phenomena. Its introduction by the German physicist Rudolf Clausius in 1850 is a highlight of 19th-century physics.
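Clausius's definition can be made concrete with a small worked example. For heat transferred reversibly at a constant temperature, the entropy change is the heat divided by the absolute temperature, ΔS = Q/T. The sketch below assumes illustrative values (1000 J at 300 K) that are not from the text:

```python
# Entropy change for reversible heat transfer at constant temperature,
# per Clausius: delta_S = Q / T.

def entropy_change(heat_joules: float, temperature_kelvin: float) -> float:
    """Entropy change (J/K) for heat added reversibly at temperature T."""
    return heat_joules / temperature_kelvin

# Illustrative: 1000 J of heat absorbed by a reservoir at 300 K.
delta_s = entropy_change(1000.0, 300.0)
print(f"{delta_s:.2f} J/K")  # 3.33 J/K
```

Note the units: joules per kelvin, which is why entropy is described as thermal energy *per unit temperature*.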

This makes so-called perpetual motion machines impossible. Siabal Mitra, a professor of physics at Missouri State University, explains, "You cannot build an engine that is 100 percent efficient, which means you cannot build a perpetual motion machine. However, there are a lot of folks out there who still don't believe it, and there are people who are still trying to build perpetual motion machines."
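The efficiency ceiling Mitra describes is quantified by the Carnot limit: no heat engine operating between a hot and a cold reservoir can exceed η = 1 − Tc/Th, and η reaches 100% only if the cold reservoir sits at absolute zero, which is unattainable. A minimal sketch with illustrative temperatures:

```python
# Carnot efficiency: the theoretical maximum fraction of heat that any
# engine can convert to work between reservoirs at Th and Tc (in kelvin).
# Efficiency hits 100% only at Tc = 0 K, which cannot be reached.

def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of input heat convertible to work: 1 - Tc/Th."""
    return 1.0 - t_cold_k / t_hot_k

# Illustrative: steam at 500 K rejecting heat to surroundings at 300 K.
eta = carnot_efficiency(500.0, 300.0)
print(f"{eta:.0%}")  # 40%
```

Even this idealized, frictionless engine wastes 60% of its input heat, so a real machine, with friction and radiative losses on top, does worse still.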

Entropy is also defined as "a measure of the disorder or randomness in a closed system," a quantity that likewise inexorably increases. You can mix hot and cold water, but because a large cup of warm water is more disordered than two smaller cups containing hot and cold water, you can never separate it back into hot and cold without adding energy to the system. Put another way, you can't unscramble an egg or remove cream from your coffee. While some processes appear to be completely reversible, in practice, none actually are. Entropy, therefore, provides us with an arrow of time: forward is the direction of increasing entropy.
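The water-mixing argument can be checked numerically. For equal masses with the same heat capacity, the final temperature is the average of the two, and the total entropy change is ΔS = mc·ln(Tf/Th) + mc·ln(Tf/Tc): the hot water loses entropy, the cold water gains more, and the sum is always positive. The mass, specific heat, and temperatures below are illustrative assumptions, not values from the text:

```python
import math

# Entropy of mixing equal masses of hot and cold water. The total entropy
# change is positive, which is why the process cannot run in reverse
# without adding energy from outside.

C_WATER = 4186.0  # specific heat of liquid water, J/(kg*K)

def mixing_entropy(mass_kg: float, t_hot_k: float, t_cold_k: float) -> float:
    """Total entropy change (J/K) when equal masses of water mix."""
    t_final = (t_hot_k + t_cold_k) / 2.0  # equal masses, equal heat capacity
    ds_hot = mass_kg * C_WATER * math.log(t_final / t_hot_k)    # negative
    ds_cold = mass_kg * C_WATER * math.log(t_final / t_cold_k)  # positive, larger
    return ds_hot + ds_cold

# Illustrative: 1 kg at 353 K (80 C) mixed with 1 kg at 283 K (10 C).
delta_s = mixing_entropy(1.0, 353.0, 283.0)
print(f"{delta_s:.1f} J/K")  # positive: the mixing is irreversible
```

The cold water's entropy gain always outweighs the hot water's loss (a consequence of ln being concave), so the net change is positive for any pair of unequal temperatures, in line with the second law.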