Semantic Information G Theory and Logical Bayesian Inference for Machine Learning

Information 10 (8):261 (2019)

Abstract

An important problem in machine learning is that, when the number of labels n > 2, it is very difficult to construct and optimize a group of learning functions, and we want the optimized learning functions to remain useful when the prior distribution P(x) (where x is an instance) changes. To resolve this problem, the semantic information G theory, Logical Bayesian Inference (LBI), and a group of Channel Matching (CM) algorithms together form a systematic solution. A semantic channel in the G theory consists of a group of truth functions or membership functions. Compared with the likelihood functions, Bayesian posteriors, and logistic functions used by popular methods, membership functions can be used more conveniently as learning functions without the above problem. In LBI, every label's learning is independent. For multilabel learning, we can directly obtain a group of optimized membership functions from a sufficiently large labeled sample, without preparing different samples for different labels. A group of CM algorithms is developed for machine learning. For the Maximum Mutual Information (MMI) classification of three classes with Gaussian distributions on a two-dimensional feature space, two to three iterations make the mutual information between the three classes and three labels surpass 99% of the MMI for most initial partitions. For mixture models, the Expectation-Maximization (EM) algorithm is improved into the CM-EM algorithm, which outperforms the EM algorithm when mixture ratios are imbalanced or the EM algorithm only converges locally. The CM iteration algorithm needs to be combined with neural networks for MMI classification on high-dimensional feature spaces. LBI needs further study toward the unification of statistics and logic.
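A central claim of the abstract is that each label's membership (truth) function can be learned independently from one labeled sample. The sketch below illustrates that idea in Python: it estimates a membership-function-like curve for each label as the empirical posterior P(y_j | x), rescaled so its maximum over x is 1, which is one plausible reading of the LBI optimization. The toy data, bin counts, and function names are illustrative assumptions, not the paper's reference implementation.

```python
# Minimal, hypothetical sketch: membership (truth) functions learned
# independently per label from one labeled sample. Assumes the normalization
# T(theta_j|x) = P(y_j|x) / max_x P(y_j|x); not the paper's reference code.
import numpy as np

rng = np.random.default_rng(0)

# Toy labeled sample: a 1-D feature x and labels y in {0, 1, 2} from a crude "teacher".
n_samples, n_bins, n_labels = 3000, 20, 3
centers = rng.choice([2.0, 5.0, 8.0], size=n_samples)
x = rng.normal(loc=centers, scale=1.0)
y = np.digitize(x, bins=[3.5, 6.5])          # labels 0, 1, 2 by thresholding

# Discretize the feature axis so P(y_j | x) can be estimated per bin.
edges = np.linspace(x.min(), x.max(), n_bins + 1)
x_bin = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)

def membership_function(x_bin, y, label, n_bins):
    """Empirical posterior P(y = label | x), rescaled so its maximum over x is 1."""
    post = np.zeros(n_bins)
    for b in range(n_bins):
        in_bin = x_bin == b
        if in_bin.any():
            post[b] = np.mean(y[in_bin] == label)
    return post / post.max() if post.max() > 0 else post

# Each label's membership function is learned independently from the same sample.
T = np.array([membership_function(x_bin, y, j, n_bins) for j in range(n_labels)])
print(np.round(T, 2))
```

Because each curve depends on x only through the ratio-like normalization, the same group T can in principle be reused after the prior P(x) changes, which is the portability the abstract attributes to membership functions.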

Links

PhilArchive

Similar books and articles

Does information inform confirmation? Colin Howson - 2016 - Synthese 193 (7):2307-2321.
Logical pluralism and semantic information. Patrick Allo - 2007 - Journal of Philosophical Logic 36 (6):659-694.
Logical ignorance and logical learning. Richard Pettigrew - 2021 - Synthese 198 (10):9991-10020.
The Problem of Measure Sensitivity Redux. Peter Brössel - 2013 - Philosophy of Science 80 (3):378-397.
Bayes in the Brain—On Bayesian Modelling in Neuroscience. Matteo Colombo & Peggy Seriès - 2012 - British Journal for the Philosophy of Science 63 (3):697-723.
Bayesian Informal Logic and Fallacy. Kevin Korb - 2004 - Informal Logic 24 (1):41-70.
Bayesian Confirmation or Ordinary Confirmation? Yongfeng Yuan - 2020 - Studia Logica 108 (3):425-449.
Confirmation of Scientific Hypotheses as Relations. Aysel Dogan - 2005 - Journal for General Philosophy of Science / Zeitschrift für Allgemeine Wissenschaftstheorie 36 (2):243-259.

Analytics

Added to PP
2020-03-25

Downloads
334 (#61,358)

6 months
88 (#54,401)


