Multivariate prediction using softly shrunk reduced-rank regression

Publication details

Multivariate regression is considered with emphasis on prediction. Ordinary least squares tends to yield unstable estimates, and consequently uncertain predictions, when there are few observations in the training data relative to the number of parameters to be estimated. Thus, there is a need for methods that work well in such situations. This article presents a new alternative prediction method based on reduced-rank regression. Reduced-rank regression uses a certain decomposition of the ordinary least squares estimate of the matrix of regression coefficients and shrinks the last terms of this decomposition exactly to zero. I suggest a new method with soft shrinkage of the terms in the decomposition. Furthermore, both reduced-rank regression and the new softly shrunk reduced-rank regression are combined with principal components regression. The methods are modified to handle missing observations in the response variables. The various methods are compared through a simulation study. Softly shrunk reduced-rank regression, possibly combined with principal components, turns out to be the best overall method, and the improvement over ordinary least squares is particularly large in situations with few observations.
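
To make the idea of shrinking terms of the decomposition more concrete, the sketch below illustrates one common form of reduced-rank regression, in which the ordinary least squares coefficient matrix is decomposed along the right singular vectors of the fitted responses, and each rank-one term is multiplied by a weight. Hard truncation of the trailing terms gives classical reduced-rank regression; weights between 0 and 1 give a soft shrinkage of the same terms. The abstract does not specify the exact decomposition or the shrinkage weights used in the article, so the function `soft_rrr_sketch` and its `weights` argument are hypothetical names for an illustrative sketch, not the author's implementation.

```python
import numpy as np

def soft_rrr_sketch(X, Y, weights):
    """Illustrative sketch: reduced-rank regression with soft shrinkage.

    The OLS coefficient matrix is written as a sum of rank-one terms
    B v_k v_k', where the v_k are right singular vectors of the fitted
    responses.  Each term is multiplied by a weight in [0, 1]; weights of
    1 for the leading terms and 0 for the rest recover classical
    reduced-rank regression.  (Hypothetical sketch, not the paper's code.)
    """
    # Ordinary least squares fit (assumes X has full column rank)
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    Y_hat = X @ B_ols

    # Right singular vectors of the fitted values define the decomposition
    _, _, Vt = np.linalg.svd(Y_hat, full_matrices=False)

    # Softly shrunk estimate: weighted sum of the rank-one terms B v_k v_k'
    B_soft = sum(w * (B_ols @ np.outer(v, v)) for w, v in zip(weights, Vt))
    return B_soft

# Example: hard truncation after two terms gives a rank-2 reduced-rank fit,
# while fractional weights would shrink the trailing terms softly instead.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
Y = X @ rng.normal(size=(5, 4)) + 0.1 * rng.normal(size=(30, 4))
B_rank2 = soft_rrr_sketch(X, Y, weights=[1.0, 1.0, 0.0, 0.0])
```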