entropy

English

/ˈɛntɹəpi/

noun
Definitions
  • (thermodynamics) A measure of the energy of a system that is unavailable for doing useful work; it increases in any spontaneous process of an isolated system.
  • (statistics, information theory) A measure of the amount of information and noise present in a signal (see the formulas sketched below this list).
  • (uncountable) The tendency of a system that is left to itself to descend into chaos.
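
As a sketch of the two technical senses above (these standard textbook formulas are not part of the original entry): Clausius's thermodynamic definition relates an infinitesimal entropy change to reversibly exchanged heat, and Shannon's information-theoretic definition gives the average information content of a discrete signal.

\[
dS = \frac{\delta Q_{\mathrm{rev}}}{T}
\qquad\qquad
H(X) = -\sum_{x} p(x)\,\log_2 p(x)
\]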

Etymology

Derived from German Entropie, itself derived from Ancient Greek τροπή (tropḗ, "transformation, turning, turn, a turn, solstice, trope").

Origin

Ancient Greek

τροπή

Gloss

transformation, turning, turn, a turn, solstice, trope

Concept
Semantic Field

Motion

Ontological Category

Action/Process

Emoji
🙃

Timeline

Distribution of cognates by language

Geographic distribution of cognates

Cognates and derived terms