The time required to learn a model from training examples is often
unacceptable. For instance, training language-understanding models with
AdaBoost or SVMs can take weeks or longer, depending on the number of
training examples. Parallelization through the use of multiple processors
can improve learning speed. The invention describes effective methods for
distributing multiclass classification learning across several processors.
These methods are applicable to multiclass models whose training
can be split into the training of independent binary classifiers.
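A minimal sketch of this decomposition might look as follows: a one-vs-rest reduction in which each class gets its own binary classifier, and the independent binary training jobs are dispatched to a pool of workers. Simple perceptrons stand in here for the heavier binary learners named above (AdaBoost, SVMs); the function names, worker pool, and toy data are illustrative assumptions, not taken from the source.

```python
from concurrent.futures import ThreadPoolExecutor

def train_binary(X, y, target, epochs=50, lr=0.1):
    """Train one perceptron separating class `target` from all others.
    Each such binary task is independent of the others, which is what
    makes the multiclass training parallelizable."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            t = 1.0 if yi == target else -1.0   # one-vs-rest relabeling
            score = sum(wj * xj for wj, xj in zip(w, xi)) + b
            if t * score <= 0:                  # misclassified: perceptron update
                w = [wj + lr * t * xj for wj, xj in zip(w, xi)]
                b += lr * t
    return target, (w, b)

def train_one_vs_rest(X, y, workers=4):
    """Dispatch one independent binary training job per class to a pool.
    Threads stand in for the multiple processors of the text; a process
    pool or separate machines would apply in the same way."""
    classes = sorted(set(y))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        jobs = [pool.submit(train_binary, X, y, c) for c in classes]
        return dict(job.result() for job in jobs)

def predict(models, x):
    """Pick the class whose binary classifier scores x highest."""
    def score(c):
        w, b = models[c]
        return sum(wj * xj for wj, xj in zip(w, x)) + b
    return max(models, key=score)

# Toy three-class problem with well-separated clusters.
X = [[0.0, 0.0], [1.0, 0.0], [10.0, 0.0], [11.0, 0.0], [0.0, 10.0], [0.0, 11.0]]
y = [0, 0, 1, 1, 2, 2]
models = train_one_vs_rest(X, y)
print([predict(models, xi) for xi in X])  # → [0, 0, 1, 1, 2, 2]
```

Because no binary job reads another's output, the speedup from adding processors is limited mainly by the number of classes and by the cost of the largest single binary task.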