Entropy

Information Theory

A quantitative measure of the unpredictability of a dynamical phenomenon and, equivalently, of our uncertainty about its behavior over time (Carhart-Harris2019a).

Shannon used the term ‘uncertainty’ interchangeably with entropy; entropy thus provides a bridge across the physical and experiential divide that demands no special extrapolation (Carhart-Harris2019a; Friston2010; Ben-Naim2012).

Derived from Claude Shannon’s Information Theory.
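
Formally, for a discrete random variable $X$ with outcome probabilities $p(x)$, Shannon entropy is

$$ H(X) = -\sum_{x} p(x)\,\log_2 p(x) $$

measured in bits when the logarithm is base 2 (a base-$e$ logarithm gives nats).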

“When one meets the concept of entropy in communication theory, he has a right to be rather excited — a right to suspect that one has hold of something that may turn out to be basic and important.”

  • Warren Weaver, in reference to Claude Shannon (Shannon1949)

Entropy is reflected in the shape of a probability distribution: a flat (uniform) distribution has maximal entropy, while a sharply peaked one has low entropy (Ben-Naim2012).
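
A minimal sketch of this in code (assuming NumPy; the example distributions are illustrative, not taken from the cited sources):

```python
import numpy as np

def shannon_entropy(p, base=2):
    """Shannon entropy -sum p(x) log p(x) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()        # normalize to a valid probability distribution
    p = p[p > 0]           # by convention, 0 * log(0) = 0
    return -np.sum(p * np.log(p)) / np.log(base)

flat   = [0.25, 0.25, 0.25, 0.25]   # uniform: maximal uncertainty
peaked = [0.97, 0.01, 0.01, 0.01]   # concentrated: low uncertainty

print(shannon_entropy(flat))     # 2.0 bits, the maximum for 4 outcomes
print(shannon_entropy(peaked))   # ~0.24 bits
```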

Methods in Neuroscience & Neuroimaging

Entropy findings in psychedelic neuroimaging are most often reported under the single label “brain entropy”, but the methods used correspond to entirely different metrics (McCulloch2022; Shinozuka2023).
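
To make the distinction concrete, here is a hypothetical comparison of two such metrics on the same time series: the Shannon entropy of the amplitude histogram (order-blind) versus a simple LZ78-style Lempel-Ziv phrase count (order-sensitive, one common proxy for signal complexity). The helper names and parameters are illustrative, not the pipelines of the cited papers:

```python
import numpy as np

rng = np.random.default_rng(0)

def histogram_entropy_bits(x, bins=16):
    """Shannon entropy of the amplitude histogram; ignores temporal order."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def lz_phrase_count(bits):
    """Distinct-phrase count from a simple LZ78-style parse; order-sensitive."""
    phrases, phrase = set(), ""
    for b in bits:
        phrase += b
        if phrase not in phrases:
            phrases.add(phrase)
            phrase = ""
    return len(phrases) + (1 if phrase else 0)

t = np.linspace(0, 20 * np.pi, 2000)
sine = np.sin(t)                      # highly regular signal
shuffled = rng.permutation(sine)      # same values, temporal order destroyed

for name, x in [("sine", sine), ("shuffled", shuffled)]:
    bits = "".join("1" if v > np.median(x) else "0" for v in x)
    print(name, histogram_entropy_bits(x), lz_phrase_count(bits))
# The histogram entropy is identical for both series (same value distribution),
# while the Lempel-Ziv count is far higher for the shuffled one.
```

Two “brain entropy” metrics can therefore rank the same data very differently; this also illustrates the notes below that distribution entropy is invariant to reshuffling, whereas rate-like measures of predictability are not.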

Applications


Relevant Notes

entropy is invariant to reshuffling
entropy rate is challenging to estimate
entropy models quantify information dynamics
entropy measures the variability of information
entropy rate quantifies the predictability of a signal
entropy takes into account the relative frequency of values
Information transfer
complexity via state-space entropy rate
Psychedelics increase brain entropy


Appendix

References

Q&A

What is entropy in information theory?

{{c1::Quantitative measure of the unpredictability of a dynamical phenomenon}}