Kullback–Leibler (KL) divergence (also called relative entropy or I-divergence),
denoted D_{KL}(P || Q), is a type of statistical distance:
it measures how one probability distribution P differs from a second, reference probability distribution Q.

One way to define a notion of distance between probability distributions:

D_{KL}(P \parallel Q) = \sum_{x\in X} P(x)\log\left(\frac{P(x)}{Q(x)}\right) \tag{1}
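A minimal sketch of Equation (1) in Python with NumPy; the function name `kl_divergence` and the example distributions are illustrative, not from the original note.

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence D_KL(P || Q) as in Equation (1).

    p, q: probability vectors over the same finite sample space,
    each summing to 1, with q(x) > 0 wherever p(x) > 0.
    Uses the natural log (result in nats); base-2 would give bits.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Only terms with P(x) > 0 contribute; 0 * log(0 / q) is taken as 0.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: two distributions over a three-element sample space (made up for illustration).
P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]
print(kl_divergence(P, Q))  # D_KL(P || Q)
```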

  • it is not symmetric: in general D_{KL}(P||Q) ≠ D_{KL}(Q||P) (see the short check after this list)
  • it does not satisfy the triangle inequality
  • so it is not a true distance (metric) in the strict sense
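A quick check of the asymmetry using the `kl_divergence` sketch above: swapping the arguments generally yields a different value.

```python
# For the example P and Q above, the two directions disagree,
# which illustrates that KL divergence is not symmetric.
print(kl_divergence(P, Q))
print(kl_divergence(Q, P))
```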