Entropy
Gerrit-Jan Linker
26.03.09 at 08:18:58

Entropy S is a measure of the disorder of a system.
 
When a system's energy is divided into its "useful" energy (energy available to do external work) and its "useless" energy (energy that cannot be used to do external work), entropy may be visualized as this "stray" or "lost" energy; its share of the total energy of the system is directly proportional to the absolute temperature of the system.
 
Entropy is one of the factors that determine the free energy of a system.
 
The thermodynamic definition of entropy is only valid for a system in equilibrium (because temperature is defined only for a system in equilibrium).
 
The statistical definition of entropy applies to any system.
 
Entropy increase has often been defined as a change to a more disordered state at a molecular level.  
 
In recent years, entropy has been interpreted in terms of the "dispersal" of energy.
 
Entropy is an extensive state function that accounts for the effects of irreversibility in thermodynamic systems.
 
In terms of statistical mechanics, the entropy describes the number of the possible microscopic configurations of the system. The statistical definition of entropy is the more fundamental definition, from which all other definitions and all properties of entropy follow.
 
[Kittel, Thermal Physics]
Entropy measures the number of quantum states accessible to the system.
 
Quantum states are either accessible or inaccessible to the system and each accessible quantum state is equally probable.
 
Given g accessible states, the entropy is defined as σ ≡ log(g).
σ = σ(U, N, V): entropy is a function of the energy U, the number of particles N and the volume V.
(The logarithm, here the natural logarithm, is used for mathematical convenience.)
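
As a small illustration of σ ≡ log(g) (my own sketch, not from Kittel's text), the Python snippet below counts the accessible states of N independent two-state spins with a fixed spin excess 2s; the values of N and s are arbitrary choices for the example:

import math

N = 10     # number of spins (assumed for illustration)
s = 2      # spin excess / 2, so N_up - N_down = 4

g = math.comb(N, N // 2 + s)   # multiplicity: ways to choose the "up" spins
sigma = math.log(g)            # dimensionless entropy, natural logarithm

print(g, sigma)                # 120 accessible states, sigma ≈ 4.787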
 
The fundamental temperature τ is defined by 1/τ ≡ (∂σ/∂U)_{N,V}, the partial derivative of σ with respect to the energy U at constant N and V.
τ = k_B T, where k_B is the Boltzmann constant and T the conventional temperature.
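
As an illustrative check of 1/τ ≡ (∂σ/∂U)_{N,V} (again my own sketch), the snippet below estimates τ for the two-state spin model, now with a larger N, by a finite difference of σ with respect to U, and compares it with the closed-form result τ = μB / atanh(2s/N) that this particular model gives; N and μB are assumed values in arbitrary energy units:

import math

N = 1000        # number of spins (assumed)
muB = 1.0       # magnetic moment times field, arbitrary energy units (assumed)

def sigma(s):
    # dimensionless entropy: ln of the multiplicity C(N, N/2 + s)
    return math.lgamma(N + 1) - math.lgamma(N // 2 + s + 1) - math.lgamma(N // 2 - s + 1)

def U(s):
    # energy of a configuration with spin excess 2s
    return -2 * s * muB

s = 100
# central finite difference: 1/tau = d(sigma)/dU at constant N (V plays no role in this model)
inv_tau = (sigma(s + 1) - sigma(s - 1)) / (U(s + 1) - U(s - 1))
print(1.0 / inv_tau)                    # ≈ 4.94 (numerical estimate of tau)
print(muB / math.atanh(2 * s / N))      # ≈ 4.93 (closed form for this model)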
 
The conventional entropy S is given by S = k_B σ.
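
A quick numeric sketch of the conversions S = k_B σ and τ = k_B T, using the CODATA value of the Boltzmann constant; the σ value is taken from the 10-spin example above and the τ value is simply assumed:

k_B = 1.380649e-23           # Boltzmann constant, J/K

sigma = 4.787                # dimensionless entropy from the 10-spin example
S = k_B * sigma              # conventional entropy in J/K
print(S)                     # ≈ 6.61e-23 J/K

tau = 4.141947e-21           # a fundamental temperature in joules (assumed value)
T = tau / k_B                # conventional temperature in kelvin
print(T)                     # ≈ 300 K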
 
Sources:
Entropy: http://en.wikipedia.org/wiki/Entropy
Kittel, Thermal Physics