entropy
IPA: ˈɛntrəpi
noun
- A measure of the disorder present in a system.
- (Boltzmann definition) A measure of disorder, directly proportional to the natural logarithm of the number of microstates that yield an equivalent thermodynamic macrostate.
- (information theory) Shannon entropy
- (thermodynamics, countable) A measure of the amount of energy in a physical system that cannot be used to do work.
- The capacity factor for thermal energy that is hidden with respect to temperature.
- The dispersal of energy; how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature.
- (statistics, information theory, countable) A measure of the amount of information and noise present in a signal.
- (uncountable) The tendency of a system that is left to itself to descend into chaos.
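The statistical senses above can be illustrated numerically. Below is a minimal sketch (not part of the original entry) of the two standard formulas: Shannon entropy H = -Σ p·log2(p) for the information-theory sense, and Boltzmann entropy S = k·ln(W) for the microstate sense; the function names are illustrative, not from any particular library.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def boltzmann_entropy(microstates):
    """Boltzmann entropy S = k * ln(W), in joules per kelvin."""
    k = 1.380649e-23  # Boltzmann constant, J/K
    return k * math.log(microstates)

# A fair coin is maximally unpredictable for two outcomes: 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))
# More microstates compatible with the same macrostate means higher entropy.
print(boltzmann_entropy(10**6) > boltzmann_entropy(10**3))
```

The guard `if p > 0` reflects the convention that 0·log(0) is taken to be 0, so impossible outcomes contribute nothing.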