Techniques are disclosed that implement algorithms for rapidly finding the
leave-one-out (LOO) error for regularized least squares (RLS) problems
over a large number of values of the regularization parameter λ.
Algorithms implementing the techniques use approximately the same time
and space as training a single regularized least squares
classifier/regression algorithm. The techniques include a
classification/regression process suitable for moderate-sized datasets,
based on an eigendecomposition of the unregularized kernel matrix. This
process is applied to a number of benchmark datasets, to show empirically
that accurate classification/regression can be performed using a Gaussian
kernel with surprisingly large values of the bandwidth parameter σ.
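A minimal sketch of such an eigendecomposition-based sweep is given below. It is not taken from the disclosure itself; it assumes a Gaussian kernel, NumPy, and the standard RLS leave-one-out identity in which the LOO residual at point i equals c_i / [(K + λI)^{-1}]_{ii}, where c = (K + λI)^{-1} y. The function names gaussian_kernel and loo_sweep are illustrative only.

```python
import numpy as np

def gaussian_kernel(X, sigma):
    """Gaussian (RBF) kernel matrix for the rows of X with bandwidth sigma."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def loo_sweep(K, y, lambdas):
    """LOO residuals of kernel RLS for every value in `lambdas`.

    One eigendecomposition of the unregularized kernel matrix K is computed
    up front; each lambda then costs only matrix-vector work.
    """
    evals, Q = np.linalg.eigh(K)          # K = Q diag(evals) Q^T
    Qty = Q.T @ y
    loo = {}
    for lam in lambdas:
        inv_evals = 1.0 / (evals + lam)   # eigenvalues of (K + lam I)^{-1}
        c = Q @ (inv_evals * Qty)         # RLS coefficients (K + lam I)^{-1} y
        diag = np.einsum('ij,j,ij->i', Q, inv_evals, Q)  # diag of (K + lam I)^{-1}
        loo[lam] = c / diag               # LOO residual identity for RLS
    return loo
```

In this sketch the eigendecomposition is computed once in O(n^3) time, after which each value of λ costs only O(n^2), so sweeping over many values of λ adds little to the cost of training a single RLS classifier, consistent with the time and space claims above.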
It is further demonstrated how to exploit this large-σ regime to
obtain a linear-time algorithm, suitable for large datasets, that
computes LOO values and sweeps over λ.
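The construction of the linear-time method is not spelled out here. Purely as a hypothetical illustration of how a large bandwidth can be exploited, the sketch below replaces the Gaussian kernel by its first-order expansion 1 - ||x_i - x_j||^2/(2σ^2), which factors as Z M Z^T with an n × (d + 2) matrix Z and a small coupling matrix M; the Woodbury identity then yields the LOO residuals in time linear in n. The expansion is not guaranteed to be positive semidefinite and the disclosed technique may differ; the function loo_sweep_large_sigma is illustrative only.

```python
import numpy as np

def loo_sweep_large_sigma(X, y, sigma, lambdas):
    """Hypothetical linear-time LOO sweep for a large Gaussian bandwidth.

    For large sigma, exp(-||xi - xj||^2 / (2 sigma^2)) is close to
        1 - ||xi||^2/(2 sigma^2) - ||xj||^2/(2 sigma^2) + (xi . xj)/sigma^2,
    which equals Z M Z^T for the Z and M built below.  The Woodbury identity
    then gives (Z M Z^T + lam I)^{-1} in O(n d^2) time per lambda.
    """
    n, d = X.shape
    sqnorm = np.sum(X**2, axis=1)
    # Z columns: constant term, -||x||^2/(2 sigma^2), and x/sigma
    Z = np.column_stack([np.ones(n), -sqnorm / (2.0 * sigma**2), X / sigma])
    # M couples the first two columns and leaves the remaining d untouched
    M = np.eye(d + 2)
    M[:2, :2] = np.array([[1.0, 1.0], [1.0, 0.0]])
    Minv = np.linalg.inv(M)
    ZtZ = Z.T @ Z                          # (d+2) x (d+2), computed once
    Zty = Z.T @ y
    loo = {}
    for lam in lambdas:
        W = np.linalg.inv(Minv + ZtZ / lam)           # Woodbury core matrix
        # c = (K + lam I)^{-1} y under the low-rank approximation of K
        c = y / lam - (Z @ (W @ Zty)) / lam**2
        # diagonal of (K + lam I)^{-1} under the same approximation
        diag = 1.0 / lam - np.einsum('ij,jk,ik->i', Z, W, Z) / lam**2
        loo[lam] = c / diag                # LOO residual identity for RLS
    return loo
```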