Funding: the National Natural Science Foundation of China (Grant No. 60575023); the Natural Science Foundation of Anhui Province of China (Grant No. 070412064)
Funding: supported by the National Natural Science Foundation of China (61573285) and the Innovation Foundation for Doctor Dissertation of Northwestern Polytechnical University, China (CX201619)
Abstract: Bayesian networks (BNs) have become increasingly popular in recent years due to their wide-ranging applications in modeling uncertain knowledge. An essential problem in discrete BNs is learning the conditional probability table (CPT) parameters. When training data are sparse, purely data-driven methods often fail to learn accurate parameters, and expert judgments can be introduced to overcome this challenge. Parameter constraints derived from expert judgments force the parameter estimates to be consistent with domain knowledge. In addition, Dirichlet priors carry information that helps improve learning accuracy. This paper proposes a constrained Bayesian estimation approach that learns CPTs by incorporating both constraints and Dirichlet priors. First, a posterior distribution of the BN parameters is constructed over a restricted parameter space from the training data and the Dirichlet priors. Then, the expectation of this posterior distribution is taken as the parameter estimate. Because it is difficult to compute this expectation directly for a continuous distribution over an irregular feasible domain, the Monte Carlo method is applied to approximate it. In experiments on learning standard BNs, the proposed method outperforms competing methods, suggesting that it can facilitate solving real-world problems. Additionally, a case study on the Wine data set demonstrates that the proposed method achieves the highest classification accuracy among the compared methods.
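To make the estimation step concrete, the sketch below is a minimal illustration rather than the authors' implementation. It assumes a single CPT column with observed counts `counts`, Dirichlet hyperparameters `alpha`, and expert constraints expressed as predicates on the parameter vector, and it approximates the constrained posterior mean by rejection sampling from the unconstrained Dirichlet posterior, which is one simple way to realize the Monte Carlo approximation mentioned in the abstract. The function name `constrained_bayesian_cpt` and the example values are hypothetical.

```python
import numpy as np

def constrained_bayesian_cpt(counts, alpha, constraints, n_samples=10000, rng=None):
    """
    Monte Carlo approximation of a constrained Bayesian estimate for one CPT
    column (the distribution of a node given one parent configuration).

    counts      : observed counts for each state of the node (length k)
    alpha       : Dirichlet hyperparameters (length k)
    constraints : list of predicates theta -> bool encoding expert judgments,
                  e.g. lambda t: t[0] >= t[1]  ("state 0 at least as likely as state 1")
    """
    rng = np.random.default_rng() if rng is None else rng
    posterior_alpha = np.asarray(counts, dtype=float) + np.asarray(alpha, dtype=float)

    # Draw candidate parameter vectors from the unconstrained Dirichlet posterior.
    samples = rng.dirichlet(posterior_alpha, size=n_samples)

    # Keep only samples that fall inside the restricted (constraint-satisfying) region.
    mask = np.array([all(c(t) for c in constraints) for t in samples])
    accepted = samples[mask]
    if len(accepted) == 0:
        raise RuntimeError("No samples satisfied the constraints; increase n_samples.")

    # Approximate the posterior mean over the feasible region by the
    # average of the accepted samples.
    return accepted.mean(axis=0)

# Hypothetical example: 3-state node, sparse data, expert says P(state 0) >= P(state 1).
counts = [2, 1, 0]
alpha = [1.0, 1.0, 1.0]                      # uniform Dirichlet prior
constraints = [lambda t: t[0] >= t[1]]
print(constrained_bayesian_cpt(counts, alpha, constraints))
```

Rejection sampling of this kind is only practical when the feasible region keeps a reasonable share of the posterior mass; for tighter constraint sets, other Monte Carlo schemes over the restricted simplex (e.g., importance sampling) would be needed.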