Entropy takes into account the relative frequency of values
202306161812
Status:
Tags: entropy
Shannon entropy depends only on the relative frequencies of values, not on the order in which they appear.
E.g. take signal A = 01010101 and signal B = 01110010. A naive plug-in estimator yields the same entropy for both, since the two sequences contain the same relative frequencies of 0s and 1s.
However, given the past trajectory of each sequence, it is clearly easier to predict the next value of signal A than of signal B. The entropy rate, by contrast, conditions on the past of the signal and therefore does quantify its predictability.
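A minimal sketch of the contrast, using the two example signals: the plug-in estimate from symbol frequencies is identical for A and B, while an order-1 entropy-rate estimate (the conditional entropy of each symbol given the previous one, a simplifying first-order assumption rather than the full entropy rate) is zero for the perfectly alternating A and positive for B.

```python
from collections import Counter
from math import log2

def plugin_entropy(seq):
    """Plug-in Shannon entropy from symbol frequencies (bits/symbol)."""
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * log2(c / n) for c in counts.values())

def order1_entropy_rate(seq):
    """Order-1 estimate of the entropy rate: H(X_t | X_{t-1}) in bits/symbol."""
    pair_counts = Counter(zip(seq, seq[1:]))   # counts of consecutive pairs
    prev_counts = Counter(seq[:-1])            # counts of conditioning symbols
    n = len(seq) - 1
    h = 0.0
    for (a, b), c in pair_counts.items():
        p_pair = c / n                         # joint P(X_{t-1}=a, X_t=b)
        p_cond = c / prev_counts[a]            # conditional P(X_t=b | X_{t-1}=a)
        h -= p_pair * log2(p_cond)
    return h

A = "01010101"
B = "01110010"
print(plugin_entropy(A), plugin_entropy(B))          # equal: 1.0 and 1.0
print(order1_entropy_rate(A), order1_entropy_rate(B))  # A is 0.0, B is positive
```

Both signals have four 0s and four 1s, so the plug-in estimate is 1 bit/symbol for each; only the conditional estimate separates the deterministic alternation in A from the irregular B.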