Definitions from Wiktionary
▸ noun: (Boltzmann definition) A measure of disorder, directly proportional to the natural logarithm of the number of microstates yielding an equivalent thermodynamic macrostate.
▸ noun: (information theory) Shannon entropy
▸ noun: (thermodynamics, countable) A measure of the amount of energy in a physical system that cannot be used to do work.
▸ noun: The capacity factor for thermal energy that is hidden with respect to temperature.
▸ noun: The dispersal of energy; how much energy is spread out in a process, or how widely spread out it becomes, at a specific temperature.
▸ noun: (statistics, information theory, countable) A measure of the amount of information and noise present in a signal.
▸ noun: (uncountable) The tendency of a system that is left to itself to descend into chaos.
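The Boltzmann and Shannon senses above both reduce to a logarithm-based formula, which a short sketch can make concrete. This is an illustrative snippet, not part of the dictionary entry; the function names are mine.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def boltzmann_entropy(microstates: int) -> float:
    """Boltzmann sense: S = k_B * ln(W), where W is the number of
    equally likely microstates realizing the macrostate."""
    return K_B * math.log(microstates)

def shannon_entropy(probs) -> float:
    """Shannon sense: H = -sum(p * log2(p)), the average information
    content (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly one bit of entropy.
print(shannon_entropy([0.5, 0.5]))  # → 1.0
# A two-microstate macrostate has entropy k_B * ln 2 ≈ 9.57e-24 J/K.
print(boltzmann_entropy(2))
```

The two formulas differ only in the logarithm base and a physical constant, which is why the same word names both concepts.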
Similar:
information,
selective information,
anentropy,
chaoticity,
disentropy,
disorderliness,
disorderedness,
randomness,
chaoticness,
unorderliness,
Adjectives:
maximum,
negative,
total,
low,
constant,
specific,
configurational,
relative,
high,
thermodynamic,
conditional