A Semantic Information Formula Compatible with Shannon and Popper's Theories

Abstract

Semantic information conveyed by everyday language has been studied for many years; yet, we still lack a practical formula to measure the information of a simple sentence or prediction, such as "There will be heavy rain tomorrow". For practical purposes, this paper introduces a new formula, the Semantic Information Formula (SIF), which is based on L. A. Zadeh's fuzzy set theory and P. Z. Wang's random set falling shadow theory, and which carries forward the ideas of C. E. Shannon and K. Popper. The probability of a fuzzy set, as defined by Zadeh, is treated as the logical probability sought by Popper, and the membership grade is treated as the truth value of a proposition and also as the posterior logical probability. The classical relative information formula, Information = log(Posterior probability / Prior probability), is revised into the SIF by replacing the posterior probability with the membership grade and the prior probability with the fuzzy set's probability. The SIF can be interpreted as "Information = Testing severity - Relative square deviation" and hence can be used as Popper's information criterion to test scientific theories or propositions. The information measure defined by the SIF also denotes the saved codeword length, as the classical information measure does. This paper introduces the set-Bayes formula, which establishes the relationship between statistical probability and logical probability; derives the Fuzzy Information Criterion (FIC) for the optimization of the semantic channel; and discusses applications of the SIF and FIC in areas such as linguistic communication, prediction, estimation, testing, GPS, translation, and fuzzy reasoning. In particular, through a detailed example of reasoning, it is proved that we can improve a semantic channel with proper fuzziness to increase the average semantic information toward its upper limit: the Shannon mutual information.
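The core revision described in the abstract — taking the classical relative information formula and substituting the membership grade for the posterior probability and the fuzzy set's probability for the prior — can be sketched in a few lines. This is a minimal illustration, not the author's implementation; the function name and the numeric values for the "heavy rain" example are assumptions made for demonstration only.

```python
import math

def semantic_information(membership_grade, fuzzy_set_probability, base=2):
    """Semantic Information Formula (SIF) as described in the abstract:
    Information = log(Posterior / Prior), with the posterior probability
    replaced by the membership grade (truth value of the proposition)
    and the prior replaced by the fuzzy set's probability (Zadeh).
    Returned value is in bits when base=2."""
    return math.log(membership_grade / fuzzy_set_probability, base)

# Hypothetical numbers for "There will be heavy rain tomorrow":
truth_value = 0.9          # membership grade of the observed rainfall in "heavy rain"
logical_probability = 0.2  # prior (logical) probability of the fuzzy event "heavy rain"

info = semantic_information(truth_value, logical_probability)
print(f"{info:.3f} bits")  # about 2.170 bits
```

Note that when the membership grade equals the fuzzy set's probability the formula yields zero, consistent with the classical relative information measure: a prediction no truer than its prior conveys nothing.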


Similar books and articles

On Quantifying Semantic Information.Simon D'Alfonso - 2011 - Information 2 (1):61-101.
Logical pluralism and semantic information.Patrick Allo - 2007 - Journal of Philosophical Logic 36 (6):659-694.
Search for syllogistic structure of semantic information.Marcin J. Schroeder - 2012 - Journal of Applied Non-Classical Logics 22 (1-2):83-103.
An introduction to logical entropy and its relation to Shannon entropy.David Ellerman - 2013 - International Journal of Semantic Computing 7 (2):121-145.
A Critical Analysis of Floridi’s Theory of Semantic Information.Pieter Adriaans - 2010 - Knowledge, Technology & Policy 23 (1-2):41-56.
Probability as a Measure of Information Added.Peter Milne - 2012 - Journal of Logic, Language and Information 21 (2):163-188.
Pre-cognitive Semantic Information.Orlin Vakarelov - 2010 - Knowledge, Technology & Policy 23 (1-2):193-226.
The transmission sense of information.Carl T. Bergstrom & Martin Rosvall - 2011 - Biology and Philosophy 26 (2):159-176.
The Metaphilosophy of Information.Sebastian Sequoiah-Grayson - 2007 - Minds and Machines 17 (3):331-344.

Analytics

Added to PP
2016-02-18



