dc.date.accessioned | 2008-08-01T21:30:16Z | |
dc.date.accessioned | 2018-11-26T22:25:24Z | |
dc.date.available | 2008-08-01T21:30:16Z | |
dc.date.available | 2018-11-26T22:25:24Z | |
dc.date.issued | 2008-07-29 | en_US |
dc.identifier.uri | http://hdl.handle.net/1721.1/41920 | |
dc.identifier.uri | http://repository.aust.edu.ng/xmlui/handle/1721.1/41920 | |
dc.description.abstract | We describe a method to use structured representations of the environment's dynamics to constrain and speed up the planning process. Given a problem domain described in a probabilistic logical description language, we develop an anytime technique that incrementally improves on an initial, partial policy. This partial solution is found by first reducing the number of predicates needed to represent a relaxed version of the problem to a minimum, and then dynamically partitioning the action space into a set of equivalence classes with respect to this minimal representation. Our approach uses the envelope MDP framework, which creates a Markov decision process out of a subset of the full state space as determined by the initial partial solution. This strategy permits an agent to begin acting within a restricted part of the full state space and to expand its envelope judiciously as resources permit. | en_US |
dc.format.extent | 17 p. | en_US |
dc.relation | Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory | en_US |
dc.title | Adaptive Envelope MDPs for Relational Equivalence-based Planning | en_US |