# Mutual information

The mutual information measures the total correlation between two subsystems. For subsystems $A$ and $B$, the mutual information $I(A:B)$ between them is defined as
$I(A:B)_{\rho}=S\left(\rho_{A B} \| \rho_{A} \otimes \rho_{B}\right)=S(A)_{\rho}+S(B)_{\rho}-S(A B)_{\rho}.$

## Properties

- non-negative (it is a relative entropy), and vanishes iff $\rho_{AB}=\rho_{A}\otimes\rho_{B}$
- when the global state is pure, non-zero mutual information between two regions implies they are entangled; for mixed states this fails, since purely classical correlations already give $I(A:B)>0$ (see the classical-bit example below)

## Applications

The mutual information bounds the connected two-point function of operators supported on the two regions:
$I(A:B) \geqslant \frac{\left(\left\langle\mathcal{O}_{L} \mathcal{O}_{R}\right\rangle-\left\langle\mathcal{O}_{L}\right\rangle\left\langle\mathcal{O}_{R}\right\rangle\right)^{2}}{2\left\langle\mathcal{O}_{L}^{2}\right\rangle\left\langle\mathcal{O}_{R}^{2}\right\rangle}.$
See e.g. [[2018#Jahnke (Review)]] and [[WolfVerstraeteHastingsCirac2007]][](https://arxiv.org/abs/0704.3906).

## Refs

- parent: [[0290 Quantum information measures]]
- generalisations
    - [[2024#Glorioso, Qi, Yang]]: subsystems separated in time

## Examples

- between two maximally entangled qubits: $S(A)=S(B)=\log 2$ and $S(AB)=0$, so $I(A:B)=2\log 2$
- between two perfectly correlated classical bits: $S(A)=S(B)=S(AB)=\log 2$, so $I(A:B)=\log 2$

(Both values are checked numerically in the sketch at the end of this note.)

## Analysis

- shape dependence (in the OPE limit): [[ChenWang2022]][](https://arxiv.org/pdf/2207.05268.pdf)
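
## Numerical check

A minimal NumPy sketch verifying the two example values above and the correlator bound. Entropies use the natural log, and the printed mutual informations are divided by $\log 2$ so they read directly as multiples of $\log 2$. The helper names and the choice $\mathcal{O}_L=\mathcal{O}_R=\sigma_z$ for the bound check are illustrative assumptions, not taken from the cited references.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr[rho log rho], natural log."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]              # drop zeros: 0 log 0 = 0
    return float(-np.sum(evals * np.log(evals)))

def mutual_information(rho, dims):
    """I(A:B) = S(A) + S(B) - S(AB) for rho on H_A (x) H_B with dims (dA, dB)."""
    dA, dB = dims
    r = rho.reshape(dA, dB, dA, dB)           # indices (a, b, a', b')
    rho_A = np.einsum('abcb->ac', r)          # partial trace over B
    rho_B = np.einsum('abad->bd', r)          # partial trace over A
    return (von_neumann_entropy(rho_A)
            + von_neumann_entropy(rho_B)
            - von_neumann_entropy(rho))

# Maximally entangled qubits: Bell state (|00> + |11>)/sqrt(2)
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho_bell = np.outer(psi, psi)

# Perfectly correlated classical bits: (|00><00| + |11><11|)/2
rho_cl = np.zeros((4, 4))
rho_cl[0, 0] = rho_cl[3, 3] = 0.5

print(mutual_information(rho_bell, (2, 2)) / np.log(2))   # -> 2.0  (= 2 log 2)
print(mutual_information(rho_cl, (2, 2)) / np.log(2))     # -> 1.0  (= log 2)

# Correlator bound for the Bell state with O_L = O_R = Pauli Z
Z, I2 = np.diag([1.0, -1.0]), np.eye(2)
OL, OR = np.kron(Z, I2), np.kron(I2, Z)
ev = lambda op: np.trace(rho_bell @ op).real
connected = ev(OL @ OR) - ev(OL) * ev(OR)
rhs = connected**2 / (2 * ev(OL @ OL) * ev(OR @ OR))
print(rhs, "<=", mutual_information(rho_bell, (2, 2)))     # 0.5 <= 1.386...
```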