A method for generating a net analyte signal calibration model for use in
detecting and/or quantifying the amount of an analyte in a test subject.
The net analyte signal can be generated by providing a set of in vivo
infrared spectra for a test subject during a period in which an analyte
concentration is essentially constant; calculating an optimal subspace of
spectra that at least substantially describes all non-analyte-dependent
spectral variance in the in vivo spectra; providing a pure component
infrared spectrum for the analyte; and calculating a net analyte signal
spectrum from a data set comprising the optimal subspace spectra and the
pure analyte spectrum. The net analyte signal calibration model can be
used, for example, in measuring the concentration of the analyte in a test
subject, and/or for evaluating the analytical significance of an in vivo
multivariate calibration model.
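
One common way to realize the calculation described above, offered here only as an illustrative sketch and not as the method of the source, is to span the non-analyte subspace with the leading singular vectors of the constant-concentration in vivo spectra and then project the pure component spectrum onto the orthogonal complement of that subspace. The function and argument names below, the use of a truncated SVD, and the choice of the number of retained components are all assumptions for illustration.

    import numpy as np

    def net_analyte_signal(baseline_spectra, pure_analyte_spectrum, n_components):
        """Illustrative net analyte signal (NAS) calculation (assumed approach).

        baseline_spectra      : (n_samples, n_wavelengths) in vivo spectra collected
                                while the analyte concentration is essentially
                                constant, so their variance is treated as
                                non-analyte-dependent.
        pure_analyte_spectrum : (n_wavelengths,) pure component spectrum of the analyte.
        n_components          : number of singular vectors retained to span the
                                non-analyte (interference) subspace; choosing this
                                value is left to the modeler.
        """
        # Mean-center so the SVD captures variance about the mean spectrum.
        X = baseline_spectra - baseline_spectra.mean(axis=0)

        # Right singular vectors span the spectral subspace of non-analyte variance.
        _, _, Vt = np.linalg.svd(X, full_matrices=False)
        B = Vt[:n_components].T              # (n_wavelengths, n_components), orthonormal columns

        # Project the pure analyte spectrum onto the orthogonal complement of the
        # non-analyte subspace: nas = (I - B B^T) k.
        k = np.asarray(pure_analyte_spectrum, dtype=float)
        nas = k - B @ (B.T @ k)
        return nas

Under these assumptions, the resulting NAS spectrum can serve as a regression vector: the portion of a new measured spectrum attributable to the analyte is obtained from its inner product with the NAS, and the magnitude of the NAS relative to measurement noise gives one way to judge the analytical significance of a multivariate calibration model.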