
entropy noun

  /ˈɛntɹəpi/
  • (Boltzmann definition) A measure of disorder, directly proportional to the natural logarithm of the number of microstates that yield an equivalent thermodynamic macrostate (see the formula after these definitions).
  • (thermodynamics, countable) A measure of the amount of energy in a physical system that cannot be used to do work.
  • (statistics, information theory, countable) A measure of the amount of information and noise present in a signal (see the Shannon entropy sketch after these definitions).
  • (uncountable) The tendency of a system that is left to itself to descend into chaos.
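
The Boltzmann sense above corresponds to a standard formula relating entropy to the microstate count; a minimal LaTeX sketch using the conventional symbols S for entropy, k_B for the Boltzmann constant, and W for the number of microstates (these symbols are the usual convention, not part of the entry itself):

```latex
% Boltzmann entropy: S grows with the natural logarithm of W,
% the number of microstates realizing the same macrostate.
\[
  S = k_{\mathrm{B}} \ln W
\]
```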
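The information-theory sense is commonly formalized as Shannon entropy, H = -Σ pᵢ log₂ pᵢ; a minimal Python sketch under that assumption (the function name and example distributions are illustrative, not taken from the entry):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero p."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin has maximal entropy for two outcomes: 1 bit.
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))  # ~0.47
```
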
Greek: εντροπία
Source: Wiktionary