**Entropy**

Entropy S is a measure of the disorder of a system.

When a system's energy is divided into "useful" energy (energy available to do external work) and "useless" energy (energy that cannot be used to do external work), entropy may be visualized as this "stray" or "lost" energy, whose magnitude relative to the total energy of the system grows in proportion to the absolute temperature.

Entropy is one of the factors that determines the free energy of a system (for example, the Helmholtz free energy F = U − TS).

The thermodynamic definition of entropy is only valid for a system in equilibrium (because temperature is defined only for a system in equilibrium).

The statistical definition of entropy applies to any system.

Entropy increase has often been described as a change to a more disordered state at the molecular level.

In recent years, entropy has also been interpreted in terms of the "dispersal" of energy.

Entropy is an extensive state function that accounts for the effects of irreversibility in thermodynamic systems.

In terms of statistical mechanics, the entropy describes the number of the possible microscopic configurations of the system. The statistical definition of entropy is the more fundamental definition, from which all other definitions and all properties of entropy follow.
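As a minimal illustration of "number of possible microscopic configurations" (not from the source; the spin-system example is an assumption), one can enumerate the microstates of a few two-state spins and group them by macrostate. The macrostate with the most microstates is the most probable:

```python
# Sketch: enumerate microstates of N two-state ("u"/"d") spins and group
# them by macrostate, here taken to be the number of "u" spins.
from itertools import product
from collections import Counter

N = 4
micro = list(product("ud", repeat=N))         # all 2**N microstates
macro = Counter(s.count("u") for s in micro)  # multiplicity of each macrostate

print(len(micro))             # 16 microstates in total
print(sorted(macro.items()))  # [(0, 1), (1, 4), (2, 6), (3, 4), (4, 1)]
```

The macrostate with two spins up has the largest multiplicity (6 of the 16 microstates), so it is the most probable.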

[Kittel, Thermal physics]

Entropy measures the number of quantum states accessible to the system.

Quantum states are either accessible or inaccessible to the system, and each accessible quantum state is equally probable.

Given g accessible states, the entropy is defined as σ ≡ log(g).

σ = σ(U, N, V): entropy is a function of the energy U, the number of particles N, and the volume V.

(The logarithm is chosen for mathematical convenience: it makes entropy additive.)
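A quick numerical sketch of why the logarithm is convenient (the spin-system multiplicities below are an assumed example, not from the source): multiplicities of independent systems multiply, so σ = log(g) makes entropy additive.

```python
# Sketch: sigma = log(g) is additive because independent multiplicities multiply.
from math import comb, log

def multiplicity(N, n_up):
    """Multiplicity of N two-state spins with n_up spins up (binomial coefficient)."""
    return comb(N, n_up)

g1 = multiplicity(10, 5)  # system 1: g1 = 252
g2 = multiplicity(20, 8)  # system 2

sigma_combined = log(g1 * g2)       # combined system has g = g1 * g2
sigma_sum = log(g1) + log(g2)       # sum of individual entropies

print(abs(sigma_combined - sigma_sum) < 1e-12)  # True: entropy is additive
```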

The fundamental temperature τ is defined by 1/τ ≡ (∂σ/∂U)_{N,V}, and it is related to the conventional temperature by τ = k_B T.

The conventional entropy S is given by S = k_B σ.
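The definitions above can be checked numerically. The following sketch uses an assumed model (an Einstein solid of N oscillators sharing q energy quanta, with multiplicity g(N, q) = C(q + N − 1, q), not taken from the source) and approximates 1/τ = (∂σ/∂U)_{N,V} with a finite difference:

```python
# Sketch, assuming an Einstein-solid model: N oscillators sharing q quanta,
# g(N, q) = C(q + N - 1, q). Energy units are chosen so that U = q.
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def sigma(N, q):
    """Dimensionless entropy sigma = log(g) of the Einstein solid."""
    return log(comb(q + N - 1, q))

N, q = 50, 100
# Central finite difference approximates (d sigma / dU)_{N,V} with U = q.
inv_tau = (sigma(N, q + 1) - sigma(N, q - 1)) / 2
tau = 1 / inv_tau        # fundamental temperature (units of energy)
S = k_B * sigma(N, q)    # conventional entropy: S = k_B * sigma
T = tau / k_B            # conventional temperature: tau = k_B * T
```

Adding energy increases the number of accessible states, so σ grows with U and the computed temperature is positive.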

__Sources:__

- Entropy: http://en.wikipedia.org/wiki/Entropy
- Kittel, Thermal Physics