1. According to the relative information formula I=log[P(ei|hj)/P(ei)] in classical information theory, if P(ei|hj) < P(ei), the information is negative.
2. A lie or wrong prediction is worse than a tautology or contradiction. For example, the master tells the cook "Three guests will come to have dinner", but actually no guest comes. The master's statement is worse than a tautology or contradiction, because it brings loss; whereas given a tautology or contradiction, the cook either does nothing or asks for a better prediction.
3. If we code E according to a wrong probability prediction or likelihood, such as P(E|hk is true) (where hypothesis hk is actually wrong), the average codeword length will be H(E|hk) = -sum_i P(ei|hj) log P(ei|hk is true) > H(E|hj) = -sum_i P(ei|hj) log P(ei|hj), which means that the saved average codeword length is negative.
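The two quantitative claims above can be checked numerically. The sketch below uses made-up distributions (the P and Q values are illustrative assumptions, not from the text): it shows that relative information goes negative when a prediction makes the observed event less likely than its prior, and that coding with a wrong distribution costs the cross-entropy, which by Gibbs' inequality is never less than the entropy under the true distribution.

```python
import math

# Illustrative prior P(ei): probability that i guests come, i = 0..3.
P = [0.4, 0.3, 0.2, 0.1]

# Point 1: I = log[P(ei|hj)/P(ei)] is negative whenever the prediction
# makes the observed event LESS likely than its prior. Hypothetical value:
# "three guests will come" makes "no guest comes" seem very unlikely.
P_e0_given_hj = 0.05
I = math.log2(P_e0_given_hj / P[0])
print(f"relative information I(e0;hj) = {I:.3f} bits")  # negative

# Point 3: coding E with a wrong likelihood Q = P(E | hk is true)
# costs the cross-entropy H(E|hk) = -sum_i P(ei|hj) log Q(ei),
# which is >= the entropy H(E|hj) (Gibbs' inequality).
Q = [0.1, 0.1, 0.2, 0.6]  # hypothetical wrong distribution
H_true = -sum(p * math.log2(p) for p in P)
H_cross = -sum(p * math.log2(q) for p, q in zip(P, Q))
print(f"H(E|hj) = {H_true:.3f} bits")
print(f"H(E|hk) = {H_cross:.3f} bits")
print(f"saved average codeword length = {H_true - H_cross:.3f} bits")  # negative
```

The gap H(E|hk) - H(E|hj) is exactly the Kullback-Leibler divergence D(P||Q), which is zero only when the prediction matches the true distribution.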
I modify the formula I=log[P(ei|hj)/P(ei)] into I(ei;hj)=log[P(ei|hj is true)/P(ei)]=log[T(hj|ei)/T(hj)],
where P is statistical probability, and T(hj|ei) is the true value of p ...