AI/ML

Iterated Belief Revision Under Resource Constraints: Logic as Geometry. (arXiv:1812.08313v1 [cs.AI])



We propose a variant of iterated belief revision designed for settings with limited computational resources, such as mobile autonomous robots. The proposed memory architecture, called the universal memory architecture (UMA), maintains an epistemic state in the form of a system of default rules similar to those studied by Pearl and by Goldszmidt and Pearl (systems $Z$ and $Z^+$). A duality between the category of UMA representations and the category of the corresponding model spaces, extending the Sageev-Roller duality between discrete poc sets and discrete median algebras, provides a two-way dictionary from inference to geometry. This dictionary leads to immense savings in computation, at a cost in the quality of representation that can be quantified in terms of topological invariants. Moreover, the same framework naturally enables comparisons between different model spaces, making it possible to analyze the deficiencies of one model space relative to others. This paper develops the formalism underlying UMA, analyzes the complexity of maintenance and inference operations in UMA, and presents learning guarantees for different UMA-based learners. Finally, we present simulation results illustrating the viability of the approach, and close with a discussion of the strengths, weaknesses, and potential development of UMA-based learners.

Source link
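
The epistemic state described in the abstract is a ranked system of default rules in the spirit of Pearl's system Z. As a rough illustration of what such a ranking looks like (this is not the paper's UMA construction; the atoms, the rule set, and all names below are invented for the example), the following Python sketch computes the standard Z-partition of a small default base by brute force, together with the world ranking it induces:

# Minimal sketch of Pearl's system-Z ranking over a small set of default
# rules ("typically, if antecedent then consequent").  This illustrates the
# kind of epistemic state the abstract refers to; it is NOT the paper's UMA
# construction, just a brute-force reference implementation over a handful
# of propositional atoms.

from itertools import product

ATOMS = ["bird", "penguin", "flies"]

# A rule is (name, antecedent, consequent); formulas are callables on a world.
RULES = [
    ("b->f",  lambda w: w["bird"],    lambda w: w["flies"]),
    ("p->b",  lambda w: w["penguin"], lambda w: w["bird"]),
    ("p->-f", lambda w: w["penguin"], lambda w: not w["flies"]),
]

def worlds():
    """Enumerate all truth assignments over ATOMS."""
    for values in product([False, True], repeat=len(ATOMS)):
        yield dict(zip(ATOMS, values))

def satisfies_material(world, rule):
    """World satisfies the material implication antecedent -> consequent."""
    _, ante, cons = rule
    return (not ante(world)) or cons(world)

def tolerated(rule, rule_set):
    """rule is tolerated by rule_set if some world verifies the rule
    (antecedent and consequent both true) while violating no rule in rule_set."""
    _, ante, cons = rule
    return any(
        ante(w) and cons(w) and all(satisfies_material(w, r) for r in rule_set)
        for w in worlds()
    )

def z_ranking(rules):
    """Partition rules into layers Z_0, Z_1, ... and return {rule_name: rank}."""
    ranks, remaining, level = {}, list(rules), 0
    while remaining:
        layer = [r for r in remaining if tolerated(r, remaining)]
        if not layer:
            raise ValueError("rule set is inconsistent (no tolerated rules)")
        for r in layer:
            ranks[r[0]] = level
        remaining = [r for r in remaining if r not in layer]
        level += 1
    return ranks

def kappa(world, rules, ranks):
    """Ranking of a world: 1 + highest rank among the rules it violates."""
    violated = [ranks[r[0]] for r in rules if not satisfies_material(world, r)]
    return 0 if not violated else 1 + max(violated)

if __name__ == "__main__":
    ranks = z_ranking(RULES)
    print("Z-ranks:", ranks)
    for w in worlds():
        print(w, "kappa =", kappa(w, RULES, ranks))

On the classic bird/penguin example above, the penguin-specific defaults land in a higher (more entrenched) layer than the generic bird-flies default, which is the behaviour that makes ranked default systems attractive as compact epistemic states for iterated revision.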



