A self-help guide for autonomous systems

Authors
Michael Anderson
University of Arizona
Abstract
When things go badly, we notice that something is amiss, figure out what went wrong and why, and attempt to repair the problem. Artificial systems depend on their human designers to program in responses to every eventuality and therefore typically don’t even notice when things go wrong, following their programming over the proverbial, and in some cases literal, cliff. This article describes our work on the Meta-Cognitive Loop, a domain-general approach to giving artificial systems the ability to notice, assess, and repair problems. The goal is to make artificial systems more robust.
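To make the notice-assess-repair cycle concrete, here is a minimal illustrative sketch in Python. It is not the authors' Meta-Cognitive Loop implementation; the Expectation class, the repairs mapping, and the step method are assumptions introduced purely to show the shape of such a loop: a monitor that notices violated expectations, assesses which violations it knows how to handle, and invokes a repair.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# Hypothetical construct: a named expectation checked against the system's observed state.
@dataclass
class Expectation:
    name: str
    check: Callable[[dict], bool]   # returns True when the observation matches the expectation


class MetaCognitiveLoop:
    """Illustrative notice-assess-repair cycle (a sketch, not the authors' implementation)."""

    def __init__(self, expectations: List[Expectation],
                 repairs: Dict[str, Callable[[dict], None]]):
        self.expectations = expectations   # what the system expects to be true
        self.repairs = repairs             # expectation name -> repair action

    def notice(self, observation: dict) -> List[Expectation]:
        # Notice: flag every expectation the current observation violates.
        return [e for e in self.expectations if not e.check(observation)]

    def assess(self, violations: List[Expectation]) -> List[Expectation]:
        # Assess: keep only the violations for which a repair is known.
        return [v for v in violations if v.name in self.repairs]

    def repair(self, assessed: List[Expectation], observation: dict) -> None:
        # Repair: invoke the repair associated with each assessed violation.
        for v in assessed:
            self.repairs[v.name](observation)

    def step(self, observation: dict) -> None:
        # One pass of the loop: notice, then assess, then repair.
        self.repair(self.assess(self.notice(observation)), observation)


# Example use (hypothetical domain): a robot that expects its battery to stay above 20%.
battery_ok = Expectation("battery_ok", lambda obs: obs["battery"] > 0.2)
mcl = MetaCognitiveLoop(
    expectations=[battery_ok],
    repairs={"battery_ok": lambda obs: print("switching to low-power mode")},
)
mcl.step({"battery": 0.1})   # notices the violated expectation and triggers the repair
```

The point of the sketch is only that monitoring and repair sit in a separate, domain-general layer: the loop knows nothing about batteries, only about expectations and the repairs registered against them, which is the kind of separation the abstract describes.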