
Planning and acting in dynamic environments: identifying and avoiding dangerous situations

Publication at Faculty of Mathematics and Physics |
2022

Abstract

In dynamic environments, external events may occur and modify the environment without the agents' consent. The agents' plans may thus be disrupted and, worse, the agents may end up in dead-end states from which they can no longer achieve their goals.

Hence, the agents should monitor the environment during plan execution and, if they encounter a dangerous situation, (reactively) act to escape from it. In this paper, we introduce the notion of dangerous states that an agent might encounter during plan execution in dynamic environments.

We present a method for computing a lower bound on the dangerousness of a state after applying a sequence of actions. This method is leveraged to identify situations in which the agent has to start acting to avoid danger.
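The monitoring idea can be illustrated with a minimal toy sketch (not the paper's formalism): states are integers on a number line, 0 is a dead-end, and adverse external events may decrement the agent's position by one each step. The names `apply_plan`, `dangerousness_lower_bound`, and `monitor`, and the pessimistic 0/1 bound, are illustrative assumptions, not the method introduced in the paper.

```python
# Toy sketch, assuming: integer states, dead-end at 0, and external events
# that can each push the state down by 1 per step. All names are illustrative.

def apply_plan(state, actions):
    """Apply a sequence of deterministic agent actions (here, +1/-1 moves)."""
    for a in actions:
        state += a
    return state

def dangerousness_lower_bound(state, horizon):
    """Pessimistic lower bound on dangerousness: 1.0 if adverse events alone
    can force the dead-end state 0 within `horizon` steps, else 0.0."""
    return 1.0 if state - horizon <= 0 else 0.0

def monitor(state, plan, horizon=3, threshold=0.5):
    """Check each plan prefix; return the step index at which the agent must
    start acting to avoid danger, or None if the plan stays safe."""
    for i, action in enumerate(plan):
        state = apply_plan(state, [action])
        if dangerousness_lower_bound(state, horizon) >= threshold:
            return i
    return None
```

For example, `monitor(5, [-1, -1, -1])` flags step 1, where the state first comes within the event horizon of the dead-end, while `monitor(10, [-1, -1, -1, -1])` reports the plan as safe.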

We present two types of such behaviour: purely reactive, and proactive (eliminating the source of danger). The introduced concepts for planning with dangerous states are implemented and tested in two scenarios: a simple RPG-like game called Dark Dungeon, and a platform game inspired by the Perestroika video game.
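The two behaviours the abstract names can be sketched on the same toy number line, assuming a single mobile threat: the reactive agent retreats from the threat, while the proactive agent spends actions to approach and remove it. Both functions and their logic are hypothetical illustrations, not the paper's algorithms.

```python
# Hedged sketch of the two avoidance behaviours on a 1-D toy world.
# Positions are integers; the threat is the source of danger.

def reactive_escape(agent_pos, threat_pos):
    """Purely reactive: step directly away from the threat."""
    return agent_pos + (1 if agent_pos >= threat_pos else -1)

def proactive_eliminate(agent_pos, threat_pos):
    """Proactive: move toward the threat; once adjacent, eliminate it.
    Returns (new_agent_pos, threat_eliminated)."""
    if abs(agent_pos - threat_pos) <= 1:
        return agent_pos, True          # adjacent: remove the danger source
    step = 1 if threat_pos > agent_pos else -1
    return agent_pos + step, False
```

The design trade-off mirrors the abstract: reacting is cheap per step but the danger persists, whereas proactive elimination costs actions up front but removes the threat for the rest of the plan.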

The results show that reasoning with dangerous states achieves a better success rate (in reaching the goals) than naive planning or rule-based techniques.