An Improved Admissible Heuristic for Learning Optimal Bayesian Networks

Changhe Yuan and Brandon Malone

Abstract

Recently, two search algorithms, A* and breadth-first branch and bound (BFBnB), were developed based on a simple admissible heuristic for learning Bayesian network structures that optimize a scoring function. The heuristic represents a relaxation of the learning problem in which each variable chooses its optimal parents independently. As a result, the relaxed solution may contain many directed cycles, so the bound can be loose. This paper introduces an improved admissible heuristic that tries to avoid directed cycles within small groups of variables. A sparse representation is also introduced to store only the unique optimal parent choices. Empirical results show that the new techniques significantly improved the efficiency and scalability of A* and BFBnB on most of the datasets tested in this paper.
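As a rough illustration of the two bounds described in the abstract, the Python sketch below contrasts the simple relaxation (every remaining variable independently picks optimal parents, possibly creating cycles) with a grouped relaxation that enforces acyclicity inside each small group of variables. The names and toy scores (local_score, toy_score, groups) are illustrative assumptions; they are not the paper's actual scoring function or its pattern-database implementation.

# A minimal sketch, assuming a decomposable scoring function exposed as a
# cost-to-minimize local_score(variable, parent_set). The names below
# (local_score, toy_score, groups) are illustrative, not the paper's code.

from itertools import chain, combinations

def powerset(items):
    """All subsets of a collection, as frozensets."""
    items = list(items)
    return (frozenset(c) for c in
            chain.from_iterable(combinations(items, r)
                                for r in range(len(items) + 1)))

def best_score(x, candidates, local_score):
    """Lowest local score of x over all parent sets drawn from candidates."""
    return min(local_score(x, parents) for parents in powerset(candidates))

def simple_heuristic(remaining, all_vars, local_score):
    """Simple relaxation: each remaining variable independently chooses
    optimal parents from all other variables, so the relaxed solution
    may contain directed cycles among the remaining variables."""
    return sum(best_score(x, set(all_vars) - {x}, local_score)
               for x in remaining)

def grouped_heuristic(remaining, all_vars, local_score, groups):
    """Tightened relaxation: acyclicity is enforced within each small group
    (solved exactly by dynamic programming over orderings of the group),
    while cycles across different groups are still ignored."""
    h = 0.0
    for group in groups:
        members = [x for x in group if x in remaining]
        if not members:
            continue
        outside = set(all_vars) - set(members)
        cost = {frozenset(): 0.0}   # cost of ordering a subset of the group
        for size in range(1, len(members) + 1):
            for subset in combinations(members, size):
                s = frozenset(subset)
                cost[s] = min(cost[s - {x}] +
                              best_score(x, outside | (s - {x}), local_score)
                              for x in s)
        h += cost[frozenset(members)]
    return h

if __name__ == "__main__":
    variables = ["A", "B", "C", "D"]

    def toy_score(x, parents):
        # Purely illustrative cost: each variable prefers one specific parent.
        preferred = {"A": "B", "B": "A", "C": "D", "D": "C"}
        return 1.0 if preferred[x] in parents else 2.0

    remaining = {"A", "B", "C", "D"}
    print(simple_heuristic(remaining, variables, toy_score))    # 4.0
    print(grouped_heuristic(remaining, variables, toy_score,
                            groups=[["A", "B"], ["C", "D"]]))   # 6.0

In this toy run the grouped bound (6.0) is tighter than the simple bound (4.0), because the mutually preferred parents A-B and C-D form directed cycles that the simple relaxation ignores but the within-group dynamic programming rules out.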

PDF

BibTeX

@INPROCEEDINGS{Yuan12improved,
author = {Changhe Yuan and Brandon Malone},
title = {An Improved Admissible Heuristic for Learning Optimal Bayesian Networks},
booktitle = {Proceedings of the 28th Conference on Uncertainty in Artificial Intelligence (UAI-12)},
address = {Catalina Island, CA},
year = {2012}
}