Iterated Belief Revision Under Resource Constraints: Logic as Geometry. (arXiv:1812.08313v1 [cs.AI])

We propose a variant of iterated belief revision designed for settings with
limited computational resources, such as mobile autonomous robots. The proposed
memory architecture, called the universal memory architecture
(UMA), maintains an epistemic state in the form of a system of default rules
similar to those studied by Pearl and by Goldszmidt and Pearl (systems $Z$ and
$Z^+$). A duality between the category of UMA representations and the category
of the corresponding model spaces, extending the Sageev-Roller duality between
discrete poc sets and discrete median algebras, provides a two-way dictionary
from inference to geometry, leading to immense savings in computation, at a
cost in the quality of representation that can be quantified in terms of
topological invariants. Moreover, the same framework naturally enables
comparisons between different model spaces, making it possible to analyze the
deficiencies of one model space in comparison to others. This paper develops
the formalism underlying UMA, analyzes the complexity of maintenance and
inference operations in UMA, and presents some learning guarantees for
different UMA-based learners. Finally, we present simulation results to
illustrate the viability of the approach, and close with a discussion of the
strengths, weaknesses, and potential development of UMA-based learners.
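As background for the kind of epistemic state the abstract refers to, the following is a minimal sketch of Pearl's system-$Z$ ranking of a propositional default base, computed by brute force over truth assignments. The variable names and the classic birds/penguins defaults are illustrative assumptions, not taken from the paper; UMA's actual maintenance procedure is what the paper itself develops.

```python
from itertools import product

def z_partition(defaults, atoms):
    """Compute Pearl's system-Z ranking of a default base.

    Each default is a triple (name, antecedent, consequent), where the
    antecedent and consequent are predicates over a truth assignment dict.
    A default is *tolerated* by a set of defaults if some world satisfies
    its antecedent and consequent together with the material implications
    of every default in the set. Rank 0 holds the defaults tolerated by
    the whole base; higher ranks are assigned to the remainder iteratively.
    """
    # Enumerate all truth assignments over the given atoms (brute force,
    # viable only for small examples).
    worlds = [dict(zip(atoms, bits))
              for bits in product([False, True], repeat=len(atoms))]
    remaining = list(defaults)
    ranks, rank = {}, 0
    while remaining:
        tolerated = [
            (name, a, c) for name, a, c in remaining
            if any(a(w) and c(w)
                   and all((not a2(w)) or c2(w) for _, a2, c2 in remaining)
                   for w in worlds)
        ]
        if not tolerated:
            raise ValueError("no tolerated default: base is inconsistent")
        for name, _, _ in tolerated:
            ranks[name] = rank
        done = {name for name, _, _ in tolerated}
        remaining = [d for d in remaining if d[0] not in done]
        rank += 1
    return ranks

# Classic example: more specific defaults land at a higher rank.
atoms = ["b", "f", "p"]  # bird, flies, penguin
defaults = [
    ("birds fly",          lambda w: w["b"], lambda w: w["f"]),
    ("penguins are birds", lambda w: w["p"], lambda w: w["b"]),
    ("penguins don't fly", lambda w: w["p"], lambda w: not w["f"]),
]
print(z_partition(defaults, atoms))
# → {'birds fly': 0, 'penguins are birds': 1, 'penguins don't fly': 1}
```

The exponential enumeration of worlds is exactly the kind of cost the abstract's geometric dictionary is meant to avoid; the sketch only illustrates what a $Z$-style ranked default base looks like.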
