Physics
The second law of thermodynamics in its modern formulation states that the total entropy of an isolated system never decreases.
If an isolated system that is not already in a state of maximum entropy (the equilibrium condition) is left alone, it will evolve toward that state and remain there once reached; this tendency is called the thermodynamic gradient.
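In symbols, a standard way to write this for an isolated system (with $S$ the total entropy and $t$ time; notation here is conventional, not from the notes):

```latex
% Second law for an isolated system: total entropy never decreases;
% equality holds only once equilibrium (maximum entropy) is reached.
\frac{dS}{dt} \ge 0
```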
Statistics / Information Theory
Entropy: measures the "randomness" of a random variable, where entropy is $H = -\sum_i p_i \log p_i$ and $p_i$ is the proportion of times you have value $i$; it ranges from $0$ (fully predictable) to $\log n$ for $n$ possible values (see the sketch after this list).
- Low entropy means the variable is very predictable, whereas high entropy means it is very unpredictable (roughly, how spread out the distribution is)
- Among continuous distributions with a given mean and variance, the normal distribution has the highest (differential) entropy; among discrete distributions over $n$ values, the uniform distribution has the highest, $\log n$
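A minimal sketch of the discrete entropy formula above in Python; the function name and the example distributions are illustrative, not from the notes:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits.

    Terms with p_i == 0 contribute nothing (p * log p -> 0 as p -> 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Low entropy: one outcome dominates, so the variable is very predictable.
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits

# High entropy: the uniform distribution over n outcomes maximizes
# discrete entropy at log2(n) bits (here log2(4) = 2 bits).
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
```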