Kolmogorov complexity and information theory. With an interpretation in terms of questions and answers

We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information and Kolmogorov ("algorithmic") mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon's rate distortion theory, which quantifies useful (in a certain sense) information. We use the communication of information as our guiding motif, and we explain how it relates to sequential question-answer sessions.
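The contrast between the two central notions can be sketched in a few lines of Python (this example is not from the paper): Shannon entropy is a property of a probability distribution, while Kolmogorov complexity is a property of an individual string. Since K(x) is uncomputable, the sketch uses the length of a real compressor's output (`zlib`) as a crude upper bound, a standard stand-in assumption.

```python
import math
import zlib

def shannon_entropy(probs):
    """Shannon entropy H(P) = -sum p * log2(p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def complexity_upper_bound(data: bytes) -> int:
    """A compressor-based upper bound (in bits) on the Kolmogorov
    complexity of an individual string; K(x) itself is uncomputable."""
    return 8 * len(zlib.compress(data, 9))

# A fair coin has one bit of entropy per flip:
print(shannon_entropy([0.5, 0.5]))  # 1.0

# The string "abab...ab" has maximal per-symbol frequency entropy over
# {a, b}, yet it is highly regular: a compressor exploits the pattern,
# so its complexity bound falls far below the raw length in bits.
regular = b"ab" * 1000
print(complexity_upper_bound(regular), "vs raw", 8 * len(regular))
```

The point of the example: the frequency-based (Shannon) view cannot distinguish `b"ab" * 1000` from a random string with the same symbol counts, whereas the algorithmic view assigns the regular string a much shorter description.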
Keywords: algorithmic information theory, data compression, Kolmogorov complexity, mutual information, prefix codes, rate distortion theory, Shannon information theory, universal codes
Reprint years: 2004
DOI: 10.1023/A:1025011119492
