Author
Giancola, Michael
Other Contributors
Bringsjord, Selmer; Nirenburg, Sergei; Varela, Carlos; Bello, Paul; Sundar Govindarajulu, Naveen;
Date Issued
2023-05
Subject
Computer science
Degree
PhD;
Terms of Use
This electronic version is a licensed copy owned by Rensselaer Polytechnic Institute (RPI), Troy, NY. Copyright of original work retained by author.;
Abstract
Human beings routinely encounter situations containing informal, non-quantitative uncertainty. Consider, for example, the following scenario: Driving toward a four-way intersection, you stop at a red light. Eventually, the light turns green, but you perceive a driver approaching from your left, their light having turned red moments ago, and subsequently perceive their car accelerate. What can we say about this situation? It certainly seems likely that the driver will drive straight through the light. Of course, it's entirely possible that the driver will change their trajectory at the last second and slam on the brakes. How can we quantify this uncertainty (assuming we wished to)? We could compute a probability over all recorded instances of drivers accelerating toward red lights and either going through or stopping. But clearly humans don't engage in anything like this computation when they reason about other drivers on the road. We use likelihoods to express qualities (as opposed to quantities, e.g., probabilities) of the uncertainty of beliefs. In this way, one may reason that "I believe it's highly likely that the driver will drive through the red light" and subsequently conclude that, despite having the legal right-of-way, one should wait to avoid an accident. Autonomous agents, in order to interact effectively with humans who reason this way, will need to possess and exploit the ability to model reasoning with notions of qualitative uncertainty. The present dissertation introduces Cognitive Likelihood, a framework for reasoning with uncertain beliefs. The framework is implemented within a novel logic -- the Inductive Deontic Cognitive Event Calculus (IDCEC) -- whose formal grammar and semantics dictate how agents can reason within the framework. These formalisms are implemented in an automated reasoner called ShadowAdjudicator, enabling the automatic generation of IDCEC proofs. We present the novel algorithm underlying ShadowAdjudicator that enables this automated proof discovery. Finally, we demonstrate how these contributions can be used to solve autonomous-driving problems and to adjudicate arguments regarding a notorious probability puzzle, the Monty Hall Problem.;
Description
May 2023; School of Science
Department
Dept. of Computer Science;
Publisher
Rensselaer Polytechnic Institute, Troy, NY
Relationships
Rensselaer Theses and Dissertations Online Collection;
Access
Restricted to current Rensselaer faculty, staff, and students in accordance with the Rensselaer Standard license. Access inquiries may be directed to the Rensselaer Libraries.;