A predictive causal model for improving decision making by learning through intervention tactics (PhD)

Learning about the world is a critical step for any intelligent task-solving agent. Understanding the consequences of its actions allows the agent to improve its decisions by predicting the events that follow an observed cause. We propose a model that empirically learns causal relationships between events through purposeful intervention in a real-world setting. The agent first infers conditional dependencies between actions and events from observed probabilistic correlations. Algorithms then construct a sequence of tactics that probe the nature of each connection by forcing specific events to occur, establishing direct causal links between the performed actions and their effects on monitored variables. The result is a Causal Bayesian Network storing the probability distributions that describe the workings of the system under study. This knowledge is then used within a Partially Observable Markov Decision Process so that task solutions can be improved. Behavioural experiments with human participants and humanoid robotic platforms will be performed to obtain the data needed for tactic sampling and for observing the effects on the monitored variables. It is proposed that this knowledge will significantly improve the flexibility and speed with which task solutions are learned in an uncertain, complex environment.
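
As a rough illustration of the intended intervention-based learning loop, the Python sketch below uses a hypothetical toy environment (the button, light, and ambient factor, as well as all probabilities, are illustrative assumptions, not part of the project). It forces an action as a do-intervention, estimates P(event | do(action)) from the resulting samples, stores the result as a minimal Causal Bayesian Network (a single conditional probability table), and uses it for a myopic action choice; the full POMDP planning step is omitted.

```python
import random
from collections import defaultdict

# Hypothetical toy environment: pressing a button may switch a light on,
# but the light is also affected by a hidden ambient factor.
def environment_step(button_pressed: bool) -> bool:
    ambient = random.random() < 0.3               # hidden background influence
    if button_pressed:
        return random.random() < 0.9 or ambient   # the button strongly causes the light
    return ambient                                # otherwise only the ambient factor matters

# Interventional sampling: force each action (do(button = a)) and record the outcome.
# This estimates P(light | do(button)) rather than a merely observational correlation.
counts = defaultdict(lambda: [0, 0])              # action -> [light off, light on]
for _ in range(2000):
    action = random.random() < 0.5                # tactic: choose which intervention to force
    light_on = environment_step(action)
    counts[action][int(light_on)] += 1

# Store the learned conditional distributions as a minimal Causal Bayesian Network:
# here a single conditional probability table P(light | do(button)).
cpt = {action: on / (off + on) for action, (off, on) in counts.items()}

# Use the learned causal knowledge for a myopic decision: pick the action that
# maximises the probability of the desired event.
best_action = max(cpt, key=cpt.get)
for a, p in sorted(cpt.items()):
    print(f"Estimated P(light | do(button={a})) = {p:.2f}")
print(f"Action chosen under the learned CBN: press button = {best_action}")
```

In the project itself, the learned distributions would instead populate a multi-variable Causal Bayesian Network and feed the transition and observation models of a Partially Observable Markov Decision Process rather than a single greedy choice.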
