Entropy, Covariance, and Mutual Information
The concept of covariance of entropies can be understood as a way of quantifying how uncertainties in different signals vary together. Rather than monitoring each metric independently, the focus shifts to the relationships between sources of uncertainty: when signals that typically exhibit aligned behavior diverge, that divergence can serve as an early indicator of system anomalies. While this terminology is uncommon, the underlying idea overlaps strongly with an established construct in information theory, mutual information. Mutual information measures the reduction in uncertainty about one random variable given knowledge of another; in entropy terms, I(X; Y) = H(X) + H(Y) - H(X, Y), so it directly quantifies how much of two signals' uncertainty is shared. It has been widely applied to anomaly detection and monitoring tasks. ...
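To make the idea concrete, the sketch below estimates per-window entropies of two signals, computes the covariance of those entropy series, and compares it with a histogram-based mutual information estimate. This is a minimal sketch under illustrative assumptions, not a reference implementation: the bin count, window length, step size, and the simulated "decoupling" anomaly are all invented for the example, and histogram estimators are the simplest (and most biased) choice of entropy estimator.

    import numpy as np

    def entropy(x, bins=16):
        """Shannon entropy (bits) of a 1-D signal, from a histogram estimate."""
        counts, _ = np.histogram(x, bins=bins)
        p = counts / counts.sum()
        p = p[p > 0]                      # drop empty bins: 0 * log(0) := 0
        return -np.sum(p * np.log2(p))

    def mutual_information(x, y, bins=16):
        """I(X; Y) = H(X) + H(Y) - H(X, Y), from a joint histogram."""
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        p = joint / joint.sum()
        p = p[p > 0]
        h_xy = -np.sum(p * np.log2(p))    # joint entropy H(X, Y)
        return entropy(x, bins) + entropy(y, bins) - h_xy

    def rolling_entropy(x, window=256, step=64, bins=16):
        """Entropy of each sliding window: a per-window uncertainty series."""
        return np.array([entropy(x[i:i + window], bins)
                         for i in range(0, len(x) - window + 1, step)])

    # Two signals whose uncertainties normally move together (shared driver),
    # until one decouples partway through -- a simulated anomaly.
    rng = np.random.default_rng(0)
    shared = rng.normal(size=4096).cumsum()
    a = shared + rng.normal(scale=0.5, size=4096)
    b = shared + rng.normal(scale=0.5, size=4096)
    b[3000:] = rng.normal(size=1096)      # b stops following the shared driver

    h_a, h_b = rolling_entropy(a), rolling_entropy(b)
    print("covariance of rolling entropies:", np.cov(h_a, h_b)[0, 1])
    print("mutual information I(A; B):", mutual_information(a, b))

In a monitoring setting, one would establish a baseline for the rolling entropy covariance (or rolling mutual information) during normal operation, then flag windows where signals that usually co-vary stop doing so.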