
2016-02-27
Information Conveyed by Lies or Wrong Predictions Should be Negative
My reasons:
1. Accoding to relative information formula I=log[P(ei|hj)/P(ei)] in classical information theory, if P(ei|hj)2. A lie or wrong prediction is worse than a tautology or contradiction. For example, after master tells kitchener  "Three guests will come to have dinner", actually no guest comes. The master's saying is worse than a tautology or contradiction, because it will bring loss. Yet, according a tautology or contradiction, the kitchener either does nothing or asks for better prediction. 
3. If we code E according to a wrong probability prediction or likelihood, such as P(E|hk is true) (where hypothesis hk is actually wrong), the average codeword length will be H(E|hk) = -sum_i P(ei|hj) log P(ei|hk is true) > H(E|hj) = -sum_i P(ei|hj) log P(ei|hj), which means that the saved average codeword length is negative.
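Point 3 above can be checked numerically. The sketch below uses made-up distributions (p_true, p_wrong are illustrative assumptions, not data from the discussion): coding E with the wrong model costs the cross-entropy H(E|hk), which exceeds the entropy H(E|hj), so the "saved" length is negative.

```python
import math

# Hypothetical distributions over events e1..e3, for illustration only.
p_true = [0.7, 0.2, 0.1]   # factual P(ei|hj)
p_wrong = [0.1, 0.2, 0.7]  # wrong prediction P(ei | hk is true)

# Cross-entropy: average codeword length when coding E with the wrong model.
cross = -sum(p * math.log2(q) for p, q in zip(p_true, p_wrong))

# Entropy: optimal average codeword length under the true model.
entropy = -sum(p * math.log2(p) for p in p_true)

saved = entropy - cross  # negative: the wrong model wastes bits
print(round(cross, 3), round(entropy, 3), round(saved, 3))
```

By Gibbs' inequality the cross-entropy is never smaller than the entropy, so `saved` is negative for any wrong prediction.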

I modify the formula I=log[P(ei|hj)/P(ei)] into I(ei;hj)=log[P(ei|hj is true)/P(ei)]=log[T(hj|ei)/T(hj)],
where P is statistical probability, T(hj|ei) is the truth value of proposition hj(ei), and T(hj)=sum_i P(ei)T(hj|ei) is the logical probability of hj.

So the information of a tautology is I=log(1/1)=0; the information of a contradiction is I=log(0/0)=log1=0 (taking 0/0 as 1). For a lie or a wrong prediction, T(hj|ei) is less than T(hj), so I<0.
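The modified formula can be sketched in a few lines. The truth values and prior below are illustrative assumptions (the contradiction case 0/0 is excluded, since it cannot be evaluated numerically):

```python
import math

# Hypothetical prior P(ei) over three events; made up for illustration.
p_e = [0.5, 0.3, 0.2]

def semantic_info(truth_values, i):
    """I(ei;hj) = log[T(hj|ei)/T(hj)], with T(hj) = sum_i P(ei) T(hj|ei)."""
    t_hj = sum(p * t for p, t in zip(p_e, truth_values))
    return math.log2(truth_values[i] / t_hj)

# Tautology: true for every ei, so T(hj|ei) = T(hj) = 1 and I = 0.
print(semantic_info([1.0, 1.0, 1.0], 0))  # 0.0

# A lie: hj is nearly false for the event e1 that actually occurs but
# largely true for the others, so T(hj|e1) < T(hj) and I < 0.
print(semantic_info([0.1, 0.9, 0.9], 0))
```

This reproduces the claim in the text: a tautology conveys zero information, while a lie or wrong prediction conveys negative information.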


2016-02-29
Reply to Chenguang Lu
I don't agree with mixing up physical or mathematical formulas with the content-dependent value of real Information.

Look at http://www.plbg.at to find out the latest meanings of the term Information.

2016-03-07
This is a philosophical website. Here "information" means semantic information or general information rather than Shannon information. The mathematical formulas can also be used for measuring semantic information. I know that Shannon information and Kullback-Leibler information must be positive. But the semantic information of a prediction or a hypothesis is different, because it is related to the truth or falsity of the hypothesis.

Shannon information or Kullback-Leibler information is objectively conveyed information, whereas semantic information is subjectively understood information. My conclusion is that Shannon information or Kullback-Leibler information is the upper limit of semantic information. Only when the predicted probability distribution P(E|hj is true) is the same as the factual probability distribution P(E|hj) is the semantic information I(E;hj) = sum_i P(ei|hj) log[P(ei|hj is true)/P(ei)] equal to the Kullback-Leibler information I(E;hj) = sum_i P(ei|hj) log[P(ei|hj)/P(ei)].
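The upper-limit claim can be verified with a small sketch. The prior and the factual distribution below are made-up numbers for illustration; by Gibbs' inequality, any prediction other than the factual distribution yields strictly less semantic information.

```python
import math

# Hypothetical distributions for illustration only: p_prior is P(ei),
# p_fact is the factual P(ei|hj).
p_prior = [0.25, 0.25, 0.25, 0.25]
p_fact = [0.7, 0.1, 0.1, 0.1]

def semantic_info(p_pred):
    """I(E;hj) = sum_i P(ei|hj) log[P(ei|hj is true)/P(ei)]."""
    return sum(f * math.log2(q / pr)
               for f, q, pr in zip(p_fact, p_pred, p_prior))

# Prediction equals the factual distribution: semantic information
# coincides with the Kullback-Leibler information.
kl_info = semantic_info(p_fact)

# A wrong prediction yields strictly less (here even negative) information.
wrong = semantic_info([0.1, 0.1, 0.1, 0.7])
print(round(kl_info, 3), round(wrong, 3))
```

This illustrates both claims in the paragraph: equality holds exactly when prediction and fact coincide, and the Kullback-Leibler information bounds the semantic information from above.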


2016-03-08
Reply to Chenguang Lu
Dear Chenguang Lu,

thanks for your response. In my information-scientific axioms I attach 3 proposals; if you are really interested in the term Information, read

http://www.plbg.at/Werke/english/Foundations%20of%20HO.pdf

On slide 5 I postulate "IP 3: Using the word Information needs more scientific differentiations to get valid"


That is to explain: the word Information has been in heavy use for 2000 years. In everyday language it makes no sense to differentiate it more than in the 3 axioms above.

But it is used so often that I postulate: if you use it in a scientific setting, that is not enough; you should differentiate it more explicitly. The result is normally a new term or an inclusion in legacy terms.

If you don't reach this new term, it's only "something new, not known, or interesting"!

2016-03-08
I can only agree with you in some aspects. My opinions are:
First, generally, information theories do not research the content of information; they only research the amount of information, which is related to probability and truth or falsity.
Second, interestingness is also related to information value. You are right that we should research information value, but that is another task. I also research portfolio theory, which is related to information value, and I have published a book on this. There is an abstract:
http://survivor99.com/lcg/english/portfolio/ENG_page.html
Third, animals also receive information!

2016-03-11
Reply to Chenguang Lu
Dear Chenguang LU,

thanks for your response.

I too can agree only with some details of your text.

a) The research of the amount of Information here concerns the content of digital messages, which in the Information Sciences are called "stored Information" in the form of digital Data. So SHANNON's Information Theory, and all subsequent mathematical formulas using probability, are enumerations of Data, not of Information as the word was used before him.

b) I agree with the necessity of researching Information Value. I call it "how does Information get precious?". See    http://www.plbg.at/Werke/english/Information%20how%20it%20gets%20precious(2014).pdf

Thanks for your link. I will give you a comment once I have read it.

2016-04-06
Yes, the information before Shannon is different from that defined by Shannon. Information as used by Popper, as a criterion to evaluate scientific hypotheses or theories, is related to truth or falsity, that is, to semantic meaning. Shannon's information is unrelated to semantic meaning. The distinction lies in semantic meaning instead of mathematics.

2016-09-13
Reply to Chenguang Lu
4. Popper uses an information criterion to test or falsify hypotheses. It is natural that positive information supports a hypothesis and negative information falsifies it. If semantic information were always positive, how could we use an information criterion to falsify a hypothesis?