A probabilistic boosting tree framework for computing two-class and
multi-class discriminative models is disclosed. In the learning stage,
the probabilistic boosting tree (PBT) automatically constructs a tree in
which each node combines a number of weak classifiers (e.g., evidence,
knowledge) into a strong classifier or conditional posterior probability.
The PBT approximates the target posterior distribution by data augmentation
(i.e., tree expansion) through a divide-and-conquer strategy. In the
testing stage, the conditional probability is computed at each tree node
based on the learned classifier, which guides the probability propagation
into its sub-trees. The top node of the tree thus outputs the overall
posterior probability by integrating the probabilities gathered from its
sub-trees. In the training stage, the tree is constructed recursively,
with each tree node being a strong classifier. The input training set is
divided into two new sets, a left and a right one, according to the
learned classifier; each set is then used to train the left and right
sub-trees recursively.
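The two stages described above (recursive training, in which each node's learned classifier splits the data for its sub-trees, and testing, in which probabilities propagate through the sub-trees and are integrated at the root) can be sketched as follows. This is a minimal two-class illustration, not the disclosed implementation: it assumes decision stumps as the weak classifiers, AdaBoost as the node-level strong classifier, a logistic map from the boosted score to a node posterior, and an empirical fallback probability where a sub-tree is not expanded; all names (`PBTNode`, `train_adaboost`, `branch_prior`) are hypothetical.

```python
import numpy as np

def train_adaboost(X, y, rounds=10):
    """Combine weak classifiers (decision stumps) into a strong one; y in {-1,+1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)            # example weights
    stumps = []                        # (alpha, feature, threshold, sign)
    for _ in range(rounds):
        best = None
        for j in range(d):             # exhaustive stump search
            for thr in np.unique(X[:, j]):
                for sign in (1.0, -1.0):
                    pred = sign * np.where(X[:, j] > thr, 1.0, -1.0)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        if err >= 0.5:                 # no stump beats chance; stop
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        pred = sign * np.where(X[:, j] > thr, 1.0, -1.0)
        w *= np.exp(-alpha * y * pred) # reweight toward the mistakes
        w /= w.sum()
        stumps.append((alpha, j, thr, sign))
    return stumps

def strong_posterior(stumps, x):
    """Node posterior q(+1|x) from the boosted score H(x) = sum alpha*h(x)."""
    H = sum(a * s * (1.0 if x[j] > thr else -1.0) for a, j, thr, s in stumps)
    return 1.0 / (1.0 + np.exp(-2.0 * np.clip(H, -30.0, 30.0)))

class PBTNode:
    """One node of a two-class probabilistic boosting tree (hypothetical sketch)."""
    def __init__(self, depth=2, min_samples=4):
        self.depth, self.min_samples = depth, min_samples
        self.stumps = None            # strong classifier at this node
        self.children = {}            # -1 -> left sub-tree, +1 -> right sub-tree
        self.branch_prior = {}        # empirical p(y=+1) fallback per branch
        self.prior = 0.5

    def fit(self, X, y):
        self.prior = float(np.mean(y == 1))
        # Stop expanding when the node is pure, data are scarce, or depth is out.
        if self.depth == 0 or len(y) < self.min_samples or self.prior in (0.0, 1.0):
            return self
        self.stumps = train_adaboost(X, y)
        if not self.stumps:
            return self
        q = np.array([strong_posterior(self.stumps, x) for x in X])
        pred = np.where(q > 0.5, 1, -1)
        # Divide and conquer: the learned classifier splits the training set
        # into left and right subsets, each training a sub-tree recursively.
        for side in (-1, 1):
            mask = pred == side
            self.branch_prior[side] = float(np.mean(y[mask] == 1)) if mask.any() else self.prior
            if mask.sum() >= self.min_samples and len(np.unique(y[mask])) > 1:
                self.children[side] = PBTNode(self.depth - 1, self.min_samples).fit(X[mask], y[mask])
        return self

    def predict_proba(self, x):
        """p(y=+1|x): the node's probabilities weight its sub-trees' posteriors."""
        if self.stumps is None:
            return self.prior
        q1 = strong_posterior(self.stumps, x)
        out = 0.0
        for side, qs in ((-1, 1.0 - q1), (1, q1)):
            child = self.children.get(side)
            out += qs * (child.predict_proba(x) if child else self.branch_prior[side])
        return out
```

At test time each node's strong classifier plays both roles named in the abstract: it produces the node's conditional probabilities and thereby routes probability mass into the corresponding sub-trees, and the root returns the integrated overall posterior.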