Finding the optimal Bayesian network given a constraint graph

Department of Computer Science, University of Washington, Seattle, Washington, United States
Department of Genome Sciences, University of Washington, Seattle, Washington, United States
DOI
10.7287/peerj.preprints.2872v1
Subject Areas
Artificial Intelligence, Data Mining and Machine Learning, Data Science, Distributed and Parallel Computing
Keywords
Bayesian Network, Structure Learning, Discrete Optimization, Parallel Processing, Big Data
Copyright
© 2017 Schreiber et al.
Licence
This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ Preprints) and either DOI or URL of the article must be cited.
Cite this article
Schreiber JM, Noble WS. 2017. Finding the optimal Bayesian network given a constraint graph. PeerJ Preprints 5:e2872v1

Abstract

Despite recent algorithmic improvements, learning the optimal structure of a Bayesian network from data is typically infeasible beyond a few dozen variables. Fortunately, domain knowledge can frequently be exploited to achieve dramatic computational savings, and in many cases can even make structure learning tractable. Several methods have previously been described for representing this type of structural prior knowledge, including global orderings, super-structures, and constraint rules. While super-structures and constraint rules are flexible in terms of what prior knowledge they can encode, they achieve savings in memory and computational time simply by avoiding the consideration of invalid graphs. We introduce the concept of a "constraint graph" as an intuitive method for incorporating rich prior knowledge into the structure learning task. We describe how this graph can be used to reduce the memory cost and computational time required to find the optimal graph subject to the encoded constraints, beyond merely eliminating invalid graphs. In particular, we show that a constraint graph can break the structure learning task into independent subproblems even in the presence of cyclic prior knowledge. These subproblems are well suited to being solved in parallel on a single machine or distributed across many machines without excessive communication cost.
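
As an informal illustration of the idea, the sketch below encodes a toy constraint graph as a directed graph over groups of variables, where an edge from one group to another means that variables in the first group may serve as parents of variables in the second. The group and variable names, and the use of the networkx library, are illustrative assumptions rather than details taken from the paper.

```python
import networkx as nx

# Hypothetical variable groups (names are illustrative only).
groups = {
    "demographics": ["age", "sex"],
    "genotype": ["snp1", "snp2", "snp3"],
    "phenotype": ["disease_status"],
}

# Constraint graph: an edge A -> B means variables in group A are
# allowed to be parents of variables in group B; any parent-child
# relationship not covered by an edge is forbidden.
cg = nx.DiGraph()
cg.add_nodes_from(groups)
cg.add_edge("demographics", "genotype")
cg.add_edge("demographics", "phenotype")
cg.add_edge("genotype", "phenotype")

# Candidate parent sets implied by the constraint graph.  When the
# constraint graph is acyclic, any valid choice of parents yields an
# acyclic network, so the search over each variable's parent set is an
# independent subproblem that could be dispatched to a separate worker.
for group, variables in groups.items():
    allowed_parents = [v for pred in cg.predecessors(group)
                       for v in groups[pred]]
    for variable in variables:
        print(f"{variable}: candidate parents = {allowed_parents}")
```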

Author Comment

This is a submission to PeerJ Computer Science for review.