Authors

Thimm, M.

Publication date

# of pages

352

Cover

Softcover

ISBN print

978-1-60750-960-8

Description

Reasoning with inaccurate information is a major topic within the fields of artificial intelligence in general and knowledge representation and reasoning in particular. This publication deals with information that can be incomplete, uncertain, and contradictory. It employs probabilistic conditional logic, which allows uncertain pieces of information to be represented as probabilistic conditionals, i.e. if-then rules. Uncertainty can be expressed by means of probabilities attached to those rules, and incompleteness can be handled in this framework by reasoning based on the principle of maximum entropy.
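To make these notions concrete, here is a minimal sketch, not taken from the book: it numerically computes the maximum-entropy distribution over the four possible worlds of two propositional atoms, subject to a single probabilistic conditional (fly | bird)[0.9]. The atoms, the probability 0.9, and the use of scipy.optimize are illustrative assumptions.

```python
# Minimal sketch of maximum-entropy reasoning with one probabilistic
# conditional (fly | bird)[0.9]; the atoms, the probability, and the
# solver choice are illustrative assumptions, not the book's method.
import numpy as np
from scipy.optimize import minimize

# Possible worlds over the atoms (bird, fly).
worlds = [(0, 0), (0, 1), (1, 0), (1, 1)]

def neg_entropy(p):
    # Negative Shannon entropy; minimizing it maximizes entropy.
    p = np.clip(p, 1e-12, 1.0)
    return float(np.sum(p * np.log(p)))

constraints = [
    # Probabilities of all worlds sum to one.
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},
    # Conditional (fly | bird)[0.9]:  P(bird and fly) = 0.9 * P(bird).
    {"type": "eq",
     "fun": lambda p: sum(p[i] for i, (b, f) in enumerate(worlds) if b and f)
                      - 0.9 * sum(p[i] for i, (b, _) in enumerate(worlds) if b)},
]

res = minimize(neg_entropy, x0=np.full(4, 0.25),
               bounds=[(0.0, 1.0)] * 4, constraints=constraints)

for (b, f), pr in zip(worlds, res.x):
    print(f"P(bird={b}, fly={f}) = {pr:.3f}")

# Further probabilities are inferred from the maximum-entropy
# distribution, e.g. the marginal probability of fly.
print(f"P(fly) = {res.x[1] + res.x[3]:.3f}")
```

The maximum-entropy distribution is the least biased distribution among all those satisfying the given conditionals, which is how incomplete information is completed in this framework.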

This book focuses on two major issues that arise when representing knowledge with probabilistic conditional logic. On the one hand, we look at the problem of contradictory information, which arises, for example, when multiple experts share their knowledge in order to come up with a common knowledge base consisting of probabilistic conditionals. As in classical logic, this is a severe problem because an inconsistent knowledge base rules out the application of model-based inductive inference approaches such as reasoning based on the principle of maximum entropy. On the other hand, we investigate an extension of the syntactic and semantic notions of probabilistic conditional logic to the relational case. We also extend reasoning based on the principle of maximum entropy to the framework of relational probabilistic conditional logic and investigate its properties.
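As a small illustration of both issues, using the commonly used notation (B | A)[p] for attaching probability p to the rule "if A then B" (the concrete conditionals below are assumptions chosen here, not examples from the book):

```latex
% Contradictory information: no probability distribution P can satisfy
% both P(flies | bird) = 0.9 and P(flies | bird) = 0.2, so this
% knowledge base is inconsistent.
\[
  \mathcal{R} = \{\, (\mathit{flies} \mid \mathit{bird})[0.9],\;
                     (\mathit{flies} \mid \mathit{bird})[0.2] \,\}
\]

% Relational case: conditionals may contain variables ranging over a
% set of constants, e.g. "birds fly with probability 0.9".
\[
  (\mathit{flies}(X) \mid \mathit{bird}(X))[0.9]
\]
```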

Abstracted / Indexed in